MDC logging with Apache Karaf and Camel

MDC (Mapped Diagnostic Context) logging is a useful feature for adding context to your log messages.

It’s common to want to log contextual messages in your application. For instance, we may want to log the actions performed by a user (identified by a username or user ID). When you have a lot of simultaneous users on your application, contextual logging makes it much easier to “follow” the log.

MDC is supported by several logging frameworks, like log4j or slf4j, and so by Karaf (thanks to pax-logging) as well.
The approach is pretty simple:

  1. You define the context using a key ID and a value for the key:
    MDC.put("userid", "user1");
  2. You use the logger as usual, the log messages to this logger will be contextual to the context:
    logger.debug("my message");
  3. After that, we can change the context by overriding the key:
    MDC.put("userid", "user2");
    logger.debug("another message");

    Or you can remove the key, so to remove the context, and the log will be “global” (not local to a context):

    MDC.remove("userid"); // or MDC.clear() to remove all
    logger.debug("my global message");
  4. In the configuration, we can use a pattern with %X{key} to log the context. A pattern like %X{userid} - %m%n will result in a log file looking like:
    user1 - my message
    user2 - another message
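The per-thread behavior behind these four steps is easy to picture with a small standalone sketch. The MdcSketch class below is purely illustrative (it is not slf4j’s actual implementation): it mimics MDC with a ThreadLocal map, which is essentially why each thread keeps its own context, and formats messages with a %X{userid}-style pattern:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: a minimal MDC-like context backed by a ThreadLocal map,
// mirroring how a logging framework keeps one key/value map per thread.
public class MdcSketch {

    private static final ThreadLocal<Map<String, String>> CONTEXT =
            ThreadLocal.withInitial(HashMap::new);

    public static void put(String key, String value) { CONTEXT.get().put(key, value); }
    public static String get(String key) { return CONTEXT.get().get(key); }
    public static void remove(String key) { CONTEXT.get().remove(key); }
    public static void clear() { CONTEXT.get().clear(); }

    // Formats a message the way a "%X{userid} - %m" pattern would
    public static String format(String message) {
        String userid = get("userid");
        return (userid == null ? "" : userid) + " - " + message;
    }

    public static void main(String[] args) {
        put("userid", "user1");
        System.out.println(format("my message"));        // user1 - my message
        put("userid", "user2");
        System.out.println(format("another message"));   // user2 - another message
        remove("userid");
        System.out.println(format("my global message")); // no context anymore
    }
}
```

Because the map lives in a ThreadLocal, two threads calling put with the same key never clobber each other’s context, which is what makes MDC safe for many simultaneous users.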

In this blog post, we will see how to use MDC in different cases: directly in your bundle, using the generic Karaf OSGi support, and in Camel routes.

The source code of the blog post is available on my github: http://github.com/jbonofre/blog-mdc.

Using MDC in your application/bundle

The purpose here is to use slf4j MDC in our bundle and configure Karaf to create one log file per context.

To illustrate this, we will create multiple threads in the bundle, giving a different context key to each thread:

package net.nanthrax.blog.mdc;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class MdcExampleBean {

    private Logger logger = LoggerFactory.getLogger(MdcExampleBean.class);

    public void init() throws Exception {
        CycleThread thread1 = new CycleThread("thread1");
        CycleThread thread2 = new CycleThread("thread2");
        CycleThread thread3 = new CycleThread("thread3");
        thread1.start();
        thread2.start();
        thread3.start();
    }

    class CycleThread extends Thread {
        private String mdcContext;

        public CycleThread(String mdcContext) {
            this.mdcContext = mdcContext;
        }

        public void run() {
            MDC.put("threadId", mdcContext);
            for (int i = 0; i < 20; i++) {
                logger.info("Cycle {}", i);
            }
        }
    }
}

After deploying this bundle in Karaf 3.0.1, we can see the log messages:

karaf@root()> bundle:install mvn:net.nanthrax.blog/mdc-bundle/1.0-SNAPSHOT
karaf@root()> log:display
...
2014-08-30 09:44:25,594 | INFO  | Thread-15        | MdcExampleBean                   | 78 - net.nanthrax.blog.mdc-bundle - 1.0.0.SNAPSHOT | Cycle 17
2014-08-30 09:44:25,594 | INFO  | Thread-13        | MdcExampleBean                   | 78 - net.nanthrax.blog.mdc-bundle - 1.0.0.SNAPSHOT | Cycle 19
2014-08-30 09:44:25,594 | INFO  | Thread-15        | MdcExampleBean                   | 78 - net.nanthrax.blog.mdc-bundle - 1.0.0.SNAPSHOT | Cycle 18
2014-08-30 09:44:25,595 | INFO  | Thread-15        | MdcExampleBean                   | 78 - net.nanthrax.blog.mdc-bundle - 1.0.0.SNAPSHOT | Cycle 19

Now, we can set up the Karaf etc/org.ops4j.pax.logging.cfg file to use our MDC. For that, we add an MDCSiftingAppender, using threadId as the MDC key and displaying it in the log message pattern. This creates one log file per key (threadId in our case). Finally, we add this appender to the rootLogger:

...
log4j.rootLogger=INFO, out, mdc-bundle, osgi:*
...
# MDC Bundle appender
log4j.appender.mdc-bundle=org.apache.log4j.sift.MDCSiftingAppender
log4j.appender.mdc-bundle.key=threadId
log4j.appender.mdc-bundle.default=unknown
log4j.appender.mdc-bundle.appender=org.apache.log4j.FileAppender
log4j.appender.mdc-bundle.appender.layout=org.apache.log4j.PatternLayout
log4j.appender.mdc-bundle.appender.layout.ConversionPattern=%d | %-5.5p | %X{threadId} | %m%n
log4j.appender.mdc-bundle.appender.file=${karaf.data}/log/mdc-bundle-$\\{threadId\\}.log
log4j.appender.mdc-bundle.appender.append=true
...

Now, in the Karaf data/log folder, we can see:

mdc-bundle-thread1.log
mdc-bundle-thread2.log
mdc-bundle-thread3.log

each file containing the log messages contextual to the thread:

$ cat data/log/mdc-bundle-thread1.log
2014-08-30 09:54:48,287 | INFO  | thread1 | Cycle 0
2014-08-30 09:54:48,298 | INFO  | thread1 | Cycle 1
2014-08-30 09:54:48,298 | INFO  | thread1 | Cycle 2
2014-08-30 09:54:48,299 | INFO  | thread1 | Cycle 3
2014-08-30 09:54:48,299 | INFO  | thread1 | Cycle 4
...
$ cat data/log/mdc-bundle-thread2.log
2014-08-30 09:54:48,287 | INFO  | thread2 | Cycle 0
2014-08-30 09:54:48,298 | INFO  | thread2 | Cycle 1
2014-08-30 09:54:48,298 | INFO  | thread2 | Cycle 2
2014-08-30 09:54:48,299 | INFO  | thread2 | Cycle 3
2014-08-30 09:54:48,299 | INFO  | thread2 | Cycle 4
2014-08-30 09:54:48,299 | INFO  | thread2 | Cycle 5
...
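Conceptually, the MDCSiftingAppender looks up the configured MDC key for each log event and routes the event to a sink chosen by the key’s value, falling back to the configured default. The sketch below is illustrative only (in-memory StringBuilders stand in for the per-key log files, and the real appender does much more):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative only: the routing idea behind an MDC sifting appender —
// one sink per MDC value, with a fallback when no value is set.
public class SiftingSketch {

    private final Map<String, StringBuilder> sinks = new LinkedHashMap<>();
    private final String defaultKey;

    public SiftingSketch(String defaultKey) {
        this.defaultKey = defaultKey;
    }

    // Routes a message to the sink for the given MDC value
    public void append(String mdcValue, String message) {
        String key = (mdcValue != null) ? mdcValue : defaultKey; // "default" fallback
        sinks.computeIfAbsent(key, k -> new StringBuilder())
             .append(key).append(" | ").append(message).append('\n');
    }

    // Returns what would be the content of "mdc-bundle-<key>.log"
    public String fileFor(String key) {
        StringBuilder sb = sinks.get(key);
        return sb == null ? "" : sb.toString();
    }

    public static void main(String[] args) {
        SiftingSketch sift = new SiftingSketch("unknown");
        sift.append("thread1", "Cycle 0");
        sift.append("thread2", "Cycle 0");
        sift.append("thread1", "Cycle 1");
        System.out.print(sift.fileFor("thread1")); // only thread1's messages
    }
}
```

This is why each generated file only contains the messages logged while its key was in the MDC: the routing decision is made per event, not per logger.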

In addition, Karaf “natively” provides OSGi MDC data that we can use.

Using Karaf OSGi MDC

So, in Karaf, you can directly use some OSGi headers for MDC logging, especially the bundle name.

We can use this MDC key to create one log file per bundle.

Karaf already provides a pre-defined appender configuration in etc/org.ops4j.pax.logging.cfg:

...
# Sift appender
log4j.appender.sift=org.apache.log4j.sift.MDCSiftingAppender
log4j.appender.sift.key=bundle.name
log4j.appender.sift.default=karaf
log4j.appender.sift.appender=org.apache.log4j.FileAppender
log4j.appender.sift.appender.layout=org.apache.log4j.PatternLayout
log4j.appender.sift.appender.layout.ConversionPattern=%d{ISO8601} | %-5.5p | %-16.16t | %-32.32c{1} | %m%n
log4j.appender.sift.appender.file=${karaf.data}/log/$\\{bundle.name\\}.log
log4j.appender.sift.appender.append=true
...

The only thing that we have to do is to add this appender to the rootLogger:

log4j.rootLogger=INFO, out, sift, osgi:*

Now, in the Karaf data/log folder, we can see one file per bundle:

data/log$ ls -1
karaf.log
net.nanthrax.blog.mdc-bundle.log
org.apache.aries.blueprint.core.log
org.apache.aries.jmx.core.log
org.apache.felix.fileinstall.log
org.apache.karaf.features.core.log
org.apache.karaf.region.persist.log
org.apache.karaf.shell.console.log
org.apache.sshd.core.log

In particular, we can see the log file for our mdc-bundle, containing the log messages “local” to that bundle.

However, while this approach works well, it doesn’t always produce useful log files. For instance, when you use Camel, using OSGi headers for MDC logging will gather most of the log messages into the camel-core bundle log file, so the result is not really contextual to anything, nor easy to read and search.

The good news is that Camel also provides MDC logging support.

Using Camel MDC

Although Camel provides MDC logging support, it’s not enabled by default. It’s up to you to enable it on the Camel context.

Once enabled, Camel provides the following MDC logging properties:

  • camel.exchangeId providing the exchange ID
  • camel.messageId providing the message ID
  • camel.routeId providing the route ID
  • camel.contextId providing the Camel Context ID
  • camel.breadcrumbId providing a unique ID used for tracking messages across transports
  • camel.correlationId providing the correlation ID of the exchange (if it’s correlated, for instance by the Splitter EIP)
  • camel.transactionKey providing the ID of the transaction (for transacted exchanges).

To enable the MDC logging, you have to:

  • if you use the Blueprint or Spring XML DSL:
    <camelContext xmlns="http://camel.apache.org/schema/blueprint" useMDCLogging="true">
  • if you use the Java DSL:
    CamelContext context = ...
    context.setUseMDCLogging(true);
  • using the Talend ESB Studio, you have to use a cConfig component from the palette.

So, let’s say we create the following route using the Blueprint DSL:

<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
   <camelContext xmlns="http://camel.apache.org/schema/blueprint" useMDCLogging="true">
      <route id="my-route">
         <from uri="timer:fire?period=5000"/>
         <setBody>
            <constant>Hello Blog</constant>
         </setBody>
         <to uri="log:net.nanthrax.blog?level=INFO"/>
      </route>
   </camelContext>
</blueprint>

We want to create one log file per route (using the routeId). To do so, we update the Karaf etc/org.ops4j.pax.logging.cfg file to add an MDC sifting appender using the Camel MDC properties, and we add this appender to the rootLogger:

...
log4j.rootLogger=INFO, out, camel-mdc, osgi:*
...
# Camel MDC appender
log4j.appender.camel-mdc=org.apache.log4j.sift.MDCSiftingAppender
log4j.appender.camel-mdc.key=camel.routeId
log4j.appender.camel-mdc.default=unknown
log4j.appender.camel-mdc.appender=org.apache.log4j.FileAppender
log4j.appender.camel-mdc.appender.layout=org.apache.log4j.PatternLayout
log4j.appender.camel-mdc.appender.layout.ConversionPattern=%d{ISO8601} | %-5.5p | %-16.16t | %-32.32c{1} | %X{camel.exchangeId} | %m%n
log4j.appender.camel-mdc.appender.file=${karaf.data}/log/camel-$\\{camel.routeId\\}.log
log4j.appender.camel-mdc.appender.append=true
...

The camel-mdc appender will create one log file per route (named camel-(routeId).log), and the log messages will contain the exchange ID.
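To make the %X{key} resolution concrete, here is a minimal sketch (not log4j’s actual code; also note that in the file property, pax-logging uses the escaped $\{key\} syntax rather than %X{key}) that substitutes %X{...} tokens against a plain Map standing in for the MDC:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative only: resolving "%X{key}" tokens against an MDC-like map,
// the way a pattern layout fills in MDC values when formatting a log event.
public class PatternSketch {

    private static final Pattern MDC_TOKEN = Pattern.compile("%X\\{([^}]+)\\}");

    public static String resolve(String pattern, Map<String, String> mdc) {
        Matcher m = MDC_TOKEN.matcher(pattern);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            // Missing keys resolve to an empty string, as in log4j's %X
            String value = mdc.getOrDefault(m.group(1), "");
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> mdc = Map.of("camel.routeId", "my-route",
                                         "camel.exchangeId", "ID-1234");
        System.out.println(resolve("camel-%X{camel.routeId}.log", mdc));
        System.out.println(resolve("%X{camel.exchangeId} | %m", mdc));
    }
}
```

The "camel.routeId" and "ID-1234" values above are made up for the example; the real values come from the MDC keys Camel populates on each exchange.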

We start Karaf, and after the installation of the camel-blueprint feature, we can drop our route.xml directly in the deploy folder:

karaf@root()> feature:repo-add camel 2.12.1
Adding feature url mvn:org.apache.camel.karaf/apache-camel/2.12.1/xml/features
karaf@root()> feature:install camel-blueprint
cp route.xml apache-karaf-3.0.1/deploy/

Using the log:display command in Karaf, we can see the messages for our route:

karaf@root()> log:display

2014-08-31 08:58:24,176 | INFO  | 0 - timer://fire | blog | 85 - org.apache.camel.camel-core - 2.12.1 | Exchange[ExchangePattern: InOnly, BodyType: String, Body: Hello Blog]
2014-08-31 08:58:29,176 | INFO  | 0 - timer://fire | blog | 85 - org.apache.camel.camel-core - 2.12.1 | Exchange[ExchangePattern: InOnly, BodyType: String, Body: Hello Blog]

Now, if we go into the Karaf data/log folder, we can see the log file for our route:

$ ls -1 data/log
camel-my-route.log
...

If we take a look in the camel-my-route.log file, we can see the messages contextual to the route, including the exchange ID:

2014-08-31 08:58:19,196 | INFO  | 0 - timer://fire | blog                             | ID-latitude-57336-1409468297774-0-2 | Exchange[ExchangePattern: InOnly, BodyType: String, Body: Hello Blog]
2014-08-31 08:58:24,176 | INFO  | 0 - timer://fire | blog                             | ID-latitude-57336-1409468297774-0-4 | Exchange[ExchangePattern: InOnly, BodyType: String, Body: Hello Blog]
2014-08-31 08:58:29,176 | INFO  | 0 - timer://fire | blog                             | ID-latitude-57336-1409468297774-0-6 | Exchange[ExchangePattern: InOnly, BodyType: String, Body: Hello Blog]
2014-08-31 08:58:34,176 | INFO  | 0 - timer://fire | blog                             | ID-latitude-57336-1409468297774-0-8 | Exchange[ExchangePattern: InOnly, BodyType: String, Body: Hello Blog]
