Logging
The CDR Aggregation Function (CAF) application uses the standard SLF4J and Log4j 2 frameworks to log messages related to application processing. The log4j2.xml file can be configured with standard Log4j 2 options to log either to a file or to standard output.
The following is an example of a CAF log4j2.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration monitorInterval="60">
    <Properties>
        <Property name="LOG_FILE_DIR">${sys:log.file.dir:-${env:log.file.dir:-logs}}</Property>
        <Property name="FILE_LOG_SIZE">${sys:FILE_LOG_SIZE:-${env:FILE_LOG_SIZE:-20 MB}}</Property>
        <Property name="FILE_LOG_MAX">${sys:FILE_LOG_MAX:-${env:FILE_LOG_MAX:-10}}</Property>
    </Properties>
    <Appenders>
        <Console name="STDOUT" target="SYSTEM_OUT">
            <PatternLayout>
                <Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} | %p | %equals{%X{requestId}}{}{%X{sessionId}} | %t | %c | %notEmpty{sessionId=%X{sessionId}} %notEmpty{subscriberId=%X{subscriberId}} %m%n%xEx</Pattern>
            </PatternLayout>
        </Console>
        <RollingFile name="File-Appender" fileName="${LOG_FILE_DIR}/cdr-aggregation-function.log" filePattern="${LOG_FILE_DIR}/archived/cdr-aggregation-function.log.%d{yyyy-MM-dd}.%i.log.gz" createOnDemand="true">
            <PatternLayout>
                <Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} | %p | %equals{%X{requestId}}{}{%X{sessionId}} | %t | %c | %notEmpty{sessionId=%X{sessionId}} %notEmpty{subscriberId=%X{subscriberId}} %m%n%xEx</Pattern>
            </PatternLayout>
            <Policies>
                <SizeBasedTriggeringPolicy size="${FILE_LOG_SIZE}"/>
            </Policies>
            <DefaultRolloverStrategy max="${FILE_LOG_MAX}"/>
        </RollingFile>
        <!-- Choose which Appender to use based on the MTX_LOG_MECHANISM environment variable (defaults to FILE) -->
        <Routing name="LogMechanismRouter">
            <Routes pattern="$${sys:MTX_LOG_MECHANISM:-${env:MTX_LOG_MECHANISM:-FILE}}">
                <Route ref="File-Appender" key="FILE"/>
                <Route ref="STDOUT" key="STDOUT"/>
            </Routes>
        </Routing>
    </Appenders>
    <Loggers>
        <Root level="error">
            <AppenderRef ref="LogMechanismRouter"/>
        </Root>
        <!-- General -->
        <Logger name="org" level="warn"/>
        <!-- Required to Show the Banner -->
        <Logger name="org.springframework.boot.SpringApplication" level="info"/>
        <!-- Matrixx Specific -->
        <Logger name="com.matrixx" level="${sys:MTX_CAF_LOG_LEVEL:-${env:MTX_CAF_LOG_LEVEL:-info}}"/>
        <!-- Spring -->
        <Logger name="org.springframework" level="info"/>
        <Logger name="org.springframework.kafka" level="${sys:MTX_SPRING_KAFKA_LOG_LEVEL:-${env:MTX_SPRING_KAFKA_LOG_LEVEL:-info}}"/>
        <!-- Kafka -->
        <Logger name="org.apache.kafka" level="${sys:MTX_KAFKA_LOG_LEVEL:-${env:MTX_KAFKA_LOG_LEVEL:-warn}}"/>
        <Logger name="org.apache.kafka.streams.processor" level="${sys:MTX_KAFKA_STREAM_PROC_LOG_LEVEL:-${env:MTX_KAFKA_STREAM_PROC_LOG_LEVEL:-warn}}"/>
        <Logger name="org.apache.kafka.streams.processor.internals.TaskManager" level="error"/>
    </Loggers>
</Configuration>
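The ${sys:NAME:-${env:NAME:-default}} lookups used throughout this file resolve a JVM system property first, fall back to an environment variable of the same name, and finally use the hard-coded default. Log4j 2 performs this resolution itself; the following Java sketch (the class name is illustrative) only mirrors that precedence for the MTX_LOG_MECHANISM setting that drives the Routing appender:

public class LogMechanismLookupExample {
    public static void main(String[] args) {
        // Mirror of ${sys:MTX_LOG_MECHANISM:-${env:MTX_LOG_MECHANISM:-FILE}}:
        // JVM system property first, then environment variable, then "FILE".
        String fromEnv = System.getenv("MTX_LOG_MECHANISM");
        String mechanism = System.getProperty("MTX_LOG_MECHANISM",
                fromEnv != null ? fromEnv : "FILE");
        // "FILE" routes log events to the RollingFile appender,
        // "STDOUT" routes them to the Console appender.
        System.out.println("Selected log mechanism: " + mechanism);
    }
}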
Enterprise-level log collection and alerting tools, such as Splunk, can be used to monitor the error and alarm log files. To support these tools, messages are logged in key-value pair format.
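The requestId, sessionId, and subscriberId fields referenced by the %X{...} specifiers in the pattern layouts above are read from the SLF4J Mapped Diagnostic Context (MDC). The following minimal sketch shows how application code could populate those keys before logging; the class name, key values, and message content are illustrative only and are not taken from the CAF source:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class CdrLoggingExample {

    private static final Logger log = LoggerFactory.getLogger(CdrLoggingExample.class);

    public static void main(String[] args) {
        // Populate the MDC keys consumed by %X{requestId}, %X{sessionId},
        // and %X{subscriberId} in the pattern layout (values are illustrative).
        MDC.put("requestId", "req-0001");
        MDC.put("sessionId", "sess-0001");
        MDC.put("subscriberId", "sub-0001");
        try {
            // The layout appends sessionId=... and subscriberId=... as
            // key-value pairs; the message itself can carry further pairs.
            log.info("event=cdrAggregated recordCount=42");
        } finally {
            // Clear the MDC so the keys do not leak into unrelated log lines.
            MDC.clear();
        }
    }
}

With either appender selected, each such call produces a single pipe-delimited line that carries the MDC values alongside the message, which log collection tools can parse as key-value pairs.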