Configure the Event Streaming Framework

After installing Kafka, you must configure the Event Streaming Framework to send events to Kafka.

Procedure

  1. Configure the following event-streamer ConfigMap values in eventstreamer-kafka-cm.yaml.
    For more information about configuration, see MATRIXX Event Streaming.
    # Kafka config for event_streamer
    ---
    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: event-streamer-config
    data:
      mtx_event_streamer.yaml: |
        rsgateway=http://rsgateway:8080/rsgateway
        subdomains=prepaid
        [email protected]
        prepaid.host=publ-cls-s1e1
        prepaid.port=4100
        prepaid.timeout=60000
        prepaid.types=all
        streams=all
        [email protected]
        [email protected]
     
      testingKafkaJsonConnectorConfigK8s.json: |
        {
            "connector": "kafka-connector",
            "topic": "allevents_json",
            "format": "json",
            "enableCallback": "true",
            "settings": {
                "bootstrap.servers": "mtx-kafka.kafka.svc.cluster.local:9092",
                "acks": "all",
                "retries": 0,
                "value.serializer": "org.apache.kafka.common.serialization.StringSerializer",
                "key.serializer": "org.apache.kafka.common.serialization.StringSerializer"
            }
        }
  2. After modifying the values in the ConfigMap, restart the event-streamer pod so that it picks up the updated values.
    kubectl rollout restart deployment/event-streamer --namespace matrixx
    Note: This step is not required when MATRIXX is installed with Helm. Make sure you use the namespace where MATRIXX is installed, and be aware that the name of the event-streamer deployment can differ in a Helm-based installation (for example, it might include the suffix ag1).
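The connector definition above is plain JSON whose settings map is passed through to the Kafka producer. A quick sanity check before redeploying can catch malformed JSON or a missing producer setting early. The following is a minimal sketch; the validate_connector helper and the list of required settings are illustrative assumptions, not part of MATRIXX:

```python
import json

# Producer settings this check treats as mandatory (assumption: the connector
# cannot reach Kafka without a bootstrap server and serializers).
REQUIRED_SETTINGS = (
    "bootstrap.servers",
    "key.serializer",
    "value.serializer",
)

def validate_connector(text: str) -> dict:
    """Parse a connector definition and verify the required producer settings."""
    config = json.loads(text)  # raises ValueError on malformed JSON
    settings = config.get("settings", {})
    missing = [key for key in REQUIRED_SETTINGS if key not in settings]
    if missing:
        raise ValueError(f"missing producer settings: {missing}")
    return config

# The connector definition from the ConfigMap above.
connector_json = """
{
    "connector": "kafka-connector",
    "topic": "allevents_json",
    "format": "json",
    "enableCallback": "true",
    "settings": {
        "bootstrap.servers": "mtx-kafka.kafka.svc.cluster.local:9092",
        "acks": "all",
        "retries": 0,
        "value.serializer": "org.apache.kafka.common.serialization.StringSerializer",
        "key.serializer": "org.apache.kafka.common.serialization.StringSerializer"
    }
}
"""
config = validate_connector(connector_json)
print(config["topic"])  # allevents_json
```

Running a check like this in CI before applying the ConfigMap avoids a pod restart that fails only at runtime.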

Results

After adding the configuration and restarting the event-streamer pod, events generated by the Event Stream Server on the publishing pod are sent to Kafka.
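To confirm that events are arriving, you can consume a few records from the allevents_json topic with the console consumer shipped with Kafka. This is a sketch only; the broker pod name mtx-kafka-0 and the kafka namespace are assumptions, so adjust them to your installation:

```shell
# Consume a few messages from the allevents_json topic to confirm events arrive.
# Assumes the Kafka CLI tools are available inside the broker pod.
kubectl exec -n kafka mtx-kafka-0 -- \
  kafka-console-consumer.sh \
    --bootstrap-server mtx-kafka.kafka.svc.cluster.local:9092 \
    --topic allevents_json \
    --from-beginning \
    --max-messages 5
```

If no messages appear, check the event-streamer pod logs for connection errors to the bootstrap server.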