Kafka CDR Consumer Overview

Kafka CDR Consumer is a cloud native Apache Kafka consumer microservice that consumes charging data records (CDRs) injected into Kafka topics by external systems or taken from MATRIXX CDRs. The offline Kafka CDR Consumer retrieves these events from the Kafka topics and injects them into MATRIXX Engine for usage rating and charging.

Kafka CDR Consumer enables the flexible import of usage from a variety of access networks and other services, and consumes usage requests for any service from a Kafka topic. Records in Kafka input topics are typically stored in JSON or CSV format, and each record is translated into a MATRIXX Data Container (MDC) usage request in MATRIXX Charging Application for usage rating and charging. The records retrieved from Kafka topics are charged in MATRIXX Engine under the Immediate Event Charging (IEC) model. Only records that contain used units (reported usage) are processed; other record types are ignored. For information about IEC rating, see the information about rating one-time events in MATRIXX Pricing and Rating. Figure 1 shows the Kafka CDR Consumer processing flow.

Figure 1. Kafka CDR Consumer
Kafka CDR Consumer functionality and flow
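
Only records that report used units are rated; everything else is ignored. The following minimal sketch illustrates that kind of guard. The UsageRecord type and its field names are hypothetical and stand in for whatever typed record the configured extension deserializes; this is not the MATRIXX API.

    // Hypothetical record type and field names, for illustration only; the real
    // deserialized type depends on the configured extension.
    record UsageRecord(String subscriberId, String serviceType, long usedUnits) {}

    class UsageFilter {
        // Only records that report used units are rated under the IEC model;
        // all other record types are ignored by the consumer.
        static boolean hasReportedUsage(UsageRecord cdr) {
            return cdr.usedUnits() > 0;
        }
    }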

The records that are injected into MATRIXX Engine by the offline CDR consumer are processed as real-time messages. This means that even if the records were generated by an external system hours or days earlier, they are processed as real-time events at the time they are received.

The offline CDRs processed by this functionality are not limited to telco usage records; they can also report usage for other types of services, for example, cloud services, where consumption of cloud resources such as vCPU and storage can be reported and charged by MATRIXX.

The type of offline record determines the Kafka topics and offline CDR consumer instances that handle it. Each record type has one dedicated Kafka topic and one or more offline CDR consumer instances; other record types use separate Kafka topics and offline CDR consumer instances. Because Kafka CDR Consumer is a Kubernetes deployment, it can be scaled to multiple pods. When scaled, each pod establishes its own connections to the Kafka brokers and MATRIXX Engine.

Kafka CDR Consumer is a microservice that does not have to run constantly and is started on demand.

Each Kafka CDR Consumer instance also acts as a Kafka producer: input events that could not be processed or rated are published to a reject topic. This enables a customer to reconcile failures and calculate potential revenue losses.
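
The reject flow can be pictured with a minimal sketch using the standard Kafka producer client. The "cdr-reject" topic name and the RejectPublisher class are hypothetical and used only for illustration; the actual topic names and reject message format are deployment-specific.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class RejectPublisher {
        private final KafkaProducer<String, byte[]> producer;

        public RejectPublisher(String bootstrapServers) {
            Properties props = new Properties();
            props.put("bootstrap.servers", bootstrapServers);
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.ByteArraySerializer");
            this.producer = new KafkaProducer<>(props);
        }

        // Publish an event that could not be processed or rated to a reject
        // topic so that failures can be reconciled later. "cdr-reject" is a
        // hypothetical topic name.
        public void reject(String recordKey, byte[] originalPayload) {
            producer.send(new ProducerRecord<>("cdr-reject", recordKey, originalPayload));
        }
    }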

Kafka CDR Consumer can also be used to retrieve events generated by MATRIXX. If there is a total loss of the CHF and 5G Converged Charging System (CCS) servers, the network and mediation systems can populate network CDRs into the same Kafka topics, which Kafka CDR Consumer picks up once the system comes back online.

Kafka CDR Consumer supports a wide array of charging solutions, including:

  • Rating and charging for cloud services.
  • Rating and charging for legacy fixed-line networks.
  • Offline rerating of services if there is a system outage.
  • TAP-IN (TAP3) usage ingestion.
  • Inbound roaming.
  • Wholesale charging.
  • 5G charging.

Kafka CDR Consumer supports CDR ingestion for all communication service provider services (data, voice, and messaging) across all access technologies, as well as ingestion of cloud service usage.

To use Kafka CDR Consumer, you must have the following, depending on your required charging implementation:
  • CGF/mediation system to process, filter, and transform CDRs into a set of records to be rated by MATRIXX.
  • Kafka cluster to stream your CDRs for retrieval by Kafka CDR Consumer.
  • ETL (data pipeline) from the mediation system to Kafka that sends CDRs into one or more topics in the Kafka cluster. Ideally, the topics are arranged based on the service type (data, voice, or messaging).
  • If you must reprocess CHF CDRs with Rating Indicator = False, you must filter the records, possibly as part of the outage recovery process, and move them to a different Kafka topic that Kafka CDR Consumer can consume, as sketched after this list. A similar process is needed to handle a total loss of service.
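
The routing and filtering performed by the ETL step can be sketched as follows. The topic names, the ServiceType enum, and the ratingIndicator flag are hypothetical; the actual mediation output format and topic layout depend on your deployment.

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    // Minimal routing sketch for the ETL step described above.
    public class CdrRouter {
        enum ServiceType { DATA, VOICE, MESSAGING }

        private final KafkaProducer<String, byte[]> producer;

        public CdrRouter(KafkaProducer<String, byte[]> producer) {
            this.producer = producer;
        }

        public void route(String key, byte[] cdr, ServiceType serviceType,
                          boolean ratingIndicator) {
            if (!ratingIndicator) {
                // CHF CDRs with Rating Indicator = False are moved to a separate
                // topic for reprocessing by Kafka CDR Consumer.
                producer.send(new ProducerRecord<>("cdr-unrated", key, cdr));
                return;
            }
            // Otherwise, route by service type so each topic carries one record type.
            String topic = switch (serviceType) {
                case DATA      -> "cdr-data";
                case VOICE     -> "cdr-voice";
                case MESSAGING -> "cdr-messaging";
            };
            producer.send(new ProducerRecord<>(topic, key, cdr));
        }
    }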

To support consuming any input format, an extension (a deserializer plus a mapper) with custom code that implements the deserialization and mapping business logic is required. For more information about creating a custom extension, see the discussion about creating a custom Kafka CDR Consumer extension. Logic for other processing, such as engine validation, message consumption from a Kafka topic, and data transmission to MATRIXX Engine for rating, is provided out of the box.
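
The general shape of such an extension is sketched below. This is not the actual MATRIXX extension API: the class names, the CloudUsageCdr record, and the UsageRequest placeholder are hypothetical and stand in for the deserialization and mapping hooks a real extension provides.

    import com.fasterxml.jackson.databind.ObjectMapper;

    // Hypothetical CDR shape produced by the deserializer.
    record CloudUsageCdr(String subscriberId, String resource, long usedUnits) {}

    // Placeholder for the MDC-based usage request a real extension would build.
    record UsageRequest(String subscriberId, String serviceContext, long usedUnits) {}

    // Deserializer: turns the raw bytes read from the Kafka topic into a typed record.
    class CloudUsageDeserializer {
        private static final ObjectMapper JSON = new ObjectMapper();

        CloudUsageCdr deserialize(byte[] payload) throws java.io.IOException {
            return JSON.readValue(payload, CloudUsageCdr.class);
        }
    }

    // Mapper: applies the business logic that turns the typed record into the
    // usage request handed to MATRIXX Engine for rating.
    class CloudUsageMapper {
        UsageRequest map(CloudUsageCdr cdr) {
            return new UsageRequest(cdr.subscriberId(), cdr.resource(), cdr.usedUnits());
        }
    }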

The following out-of-the-box extensions are also provided with Kafka CDR Consumer:
  • Diameter Kafka CDR Consumer — Consumes plain MtxDiamRoMsg protocol buffer binary files and maps them to MtxDiamRoMsg charging messages for rating. For more information about this extension, see the discussion about Diameter Kafka CDR Consumer.
  • CHF Kafka CDR Consumer — Consumes plain Mtx5GMsg compact MDCs (CMDCs) and maps them to Mtx5GMsg charging messages for rating. For more information about this extension, see the discussion about CHF Kafka CDR Consumer. For more information about Mtx5GMsg charging messages, see the discussion about MATRIXX 5G message mapping in MATRIXX 5G Integration.
  • TAP3 JSON Kafka CDR Consumer — Consumes TAP3 JSON format messages and maps them to MtxDiamRoMsg charging messages for rating. For more information about this extension, see the discussion about TAP3 JSON Kafka CDR Consumer.
You can also create custom extensions for other charging implementations, such as wholesale charging. To do this, you must first add a custom MDC that extends MtxChrgMsg records with additional fields. For more information about adding a custom MDC, see the discussion about adding a custom MDC in MATRIXX Integration.
Note: For 5G charging, you must add a custom MDC to extend the Mtx5GMsg MDC records.
For more information about creating a custom extension, see the discussion about creating a custom Kafka CDR Consumer extension.

Kafka CDR Consumer deployments can be scaled up in a cloud native environment to increase consumption throughput. Kafka consumers with the same group ID are load balanced across the partitions of the consumed topic. For more information about Kafka CDR Consumer deployment, see the discussion about deploying Kafka CDR Consumer.
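
As a minimal sketch of how that load balancing works at the Kafka client level, the loop below subscribes with a shared group ID so Kafka assigns partitions across pods. The bootstrap address, topic name, and group ID are hypothetical; the real consumer configuration is managed by the microservice itself.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class CdrConsumerLoop {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka:9092");
            // Every pod in the scaled deployment uses the same group ID, so Kafka
            // balances the topic's partitions across the pods automatically.
            props.put("group.id", "cdr-consumer-data");
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.ByteArrayDeserializer");

            try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("cdr-data"));
                while (true) {
                    ConsumerRecords<String, byte[]> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, byte[]> record : records) {
                        // A real consumer would deserialize the record, map it to a
                        // charging message, and send it to MATRIXX Engine for rating.
                    }
                }
            }
        }
    }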