Showing posts from 2014

Apache Karaf Christmas gifts: docker.io, profiles, and decanter

We are heading into Christmas time, and the Karaf team wanted to prepare some gifts for you 😉 Of course, we are working hard on the preparation of the new Karaf releases. A bunch of bug fixes and improvements will be available in the coming releases: Karaf 2.4.1, Karaf 3.0.3, and Karaf 4.0.0.M2. Some sub-project releases are also in preparation, especially Cellar. We completely refactored the Cellar internals to provide more reliable, predictable, and stable behavior. New sync policies are available, new properties, new commands, and also interesting new features like HTTP session replication and HTTP load balancing. I will prepare a blog post about this very soon. But we’re also preparing brand-new features. Docker.io I heard some people saying: “why do I need Karaf when I have docker.io?”. Honestly, I don’t understand this, as the purposes are not the same: actually, Karaf on docker.io provides great value. First, the docker.io concepts are not new. It’s more or less new...

Encrypt ConfigAdmin properties values in Apache Karaf

Apache Karaf loads all the configuration from etc/*.cfg files by default, using a mix of Felix FileInstall and Felix ConfigAdmin. These files are regular properties files looking like: key=value Some values may be critical, and so should not be stored in plain text. It could be critical business data (credit card numbers, etc.) or technical data (passwords to different systems, like a database for instance). We want to encrypt such data in the etc/*.cfg files, while still being able to use it regularly in the application. Karaf provides a nice feature for that: jasypt-encryption. It’s very easy to use, especially with Blueprint. The jasypt-encryption feature is an optional feature, which means that you have to install it first: karaf@root()> feature:install jasypt-encryption This feature provides: the jasypt bundle a namespace handler (enc:*) for Blueprint Now, we can create a cfg file containing encrypted values. An encrypted value is “wrapped” in an ENC() function. For instance, we can ...
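
As a complement to the excerpt, here is a minimal sketch (not from the original post) of how one could produce the encrypted value to paste into a cfg file, using jasypt's StandardPBEStringEncryptor. The master password and the property name are assumptions; the password must match whatever encryptor you configure on the Blueprint side.

```java
import org.jasypt.encryption.pbe.StandardPBEStringEncryptor;

// Sketch: generate a value to wrap in ENC() for an etc/*.cfg file.
public class EncryptValue {
    public static void main(String[] args) {
        StandardPBEStringEncryptor encryptor = new StandardPBEStringEncryptor();
        encryptor.setPassword("masterpassword"); // assumption: your own secret, matching the Blueprint encryptor
        String encrypted = encryptor.encrypt("my-database-password");
        // Paste the output into the cfg file, e.g. etc/my.cfg
        System.out.println("db.password=ENC(" + encrypted + ")");
    }
}
```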

MDC logging with Apache Karaf and Camel

MDC (Mapped Diagnostic Context) logging is an interesting feature to log contextual messages. It’s a classic need to log contextual messages in your application. For instance, we want to log the actions performed by a user (identified by a username or user ID). When you have a lot of simultaneous users on your application, it makes the log easier to “follow”. MDC is supported by several logging frameworks, like log4j or slf4j, and so by Karaf (thanks to pax-logging) as well. The approach is pretty simple: You define the context using a key ID and a value for the key: MDC.put("userid", "user1"); You use the logger as usual; the log messages to this logger will be contextual to the context: logger.debug("my message"); After that, we can change the context by overriding the key: MDC.put("userid", "user2"); logger.debug("another message"); Or you can remove the key, and so remove the context, and the log will be ...
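
Putting the snippets from the excerpt together, here is a minimal self-contained sketch of the MDC pattern; the "userid" key and user names are illustrative, not taken from the post.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

// Sketch of MDC usage with slf4j, as described above.
public class MdcExample {
    private static final Logger LOGGER = LoggerFactory.getLogger(MdcExample.class);

    public static void main(String[] args) {
        MDC.put("userid", "user1");       // set the context
        LOGGER.debug("my message");       // logged with userid=user1
        MDC.put("userid", "user2");       // override the context
        LOGGER.debug("another message");  // logged with userid=user2
        MDC.remove("userid");             // remove the context
        LOGGER.debug("no more context");
    }
}
```

To actually see the MDC value in the output, the logging pattern has to reference the key; with a log4j-style pattern (as used by pax-logging in Karaf), that is typically done with %X{userid}.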

Testing (utest and itest) Apache Camel Blueprint route

In any integration project, testing is vital for multiple reasons: to guarantee that the integration logic matches the expectations, to quickly identify regression issues, to test some special cases (like errors, for instance), and to validate the successful provisioning (deployment) on a runtime as close as possible to the target platform. We distinguish two kinds of tests: the unit tests (utest) aim to test the behavior of the integration logic and define the expectations that the logic has to match; the integration tests (itest) aim to provision the integration logic artifact to a runtime and check the behavior on the actual platform. Camel is THE framework to implement your integration logic (mediation). It provides the Camel Test Kit, based on JUnit, to implement utest. In combination with Karaf and Pax Exam, we can cover both utest and itest. In this blog, we will: create an OSGi service, create a Camel route using the Blueprint DSL using the previously created OSGi service, implement ...
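
To illustrate the utest side mentioned in the excerpt, here is a minimal sketch using camel-test-blueprint's CamelBlueprintTestSupport. The descriptor path and endpoint URIs are assumptions, matching a hypothetical Blueprint route from("direct:in").to("mock:out").

```java
import org.apache.camel.component.mock.MockEndpoint;
import org.apache.camel.test.blueprint.CamelBlueprintTestSupport;
import org.junit.Test;

// Sketch of a Blueprint route utest with the Camel Test Kit.
public class MyRouteTest extends CamelBlueprintTestSupport {

    @Override
    protected String getBlueprintDescriptor() {
        return "OSGI-INF/blueprint/route.xml"; // assumption: your descriptor location
    }

    @Test
    public void testRoute() throws Exception {
        MockEndpoint out = getMockEndpoint("mock:out");
        out.expectedMessageCount(1);
        out.expectedBodiesReceived("hello");

        template.sendBody("direct:in", "hello");

        assertMockEndpointsSatisfied();
    }
}
```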

Apache JMeter to test Apache ActiveMQ on CI with Maven/Jenkins

Apache JMeter is a great tool for testing, especially performance testing. It provides a lot of samplers that you can use to test your web services, web applications, etc. It also includes a couple of samplers for JMS that we can use with ActiveMQ. The source code of this blog post is at https://github.com/jbonofre/blog-jmeter . Preparing JMeter for ActiveMQ For this article, I downloaded JMeter 2.10 from http://jmeter.apache.org . We uncompress JMeter in a folder: $ tar zxvf apache-jmeter-2.10.tgz We are going to create a test plan for ActiveMQ. After downloading ActiveMQ 5.9.0 from http://activemq.apache.org , we install and start an ActiveMQ broker on the machine: $ tar zxvf apache-activemq-5.9.0-bin.tar.gz $ cd apache-activemq-5.9.0/bin $ ./activemq console ... INFO | Apache ActiveMQ 5.9.0 (localhost, ID:latitude-45782-1409139630277-0:1) started In order to use ActiveMQ with JMeter, we have to copy the activemq-all-5.9.0.jar file provided in the ActiveMQ distribution into the JMeter lib...
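
Before wiring up the JMeter JMS samplers, it can help to check that the broker started above is reachable. Here is a minimal sketch (not part of the original post) sending one test message with the plain JMS API and the activemq-all client; the queue name is an assumption.

```java
import javax.jms.Connection;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import org.apache.activemq.ActiveMQConnectionFactory;

// Sketch: send a single message to the local ActiveMQ broker.
public class SendTestMessage {
    public static void main(String[] args) throws Exception {
        ActiveMQConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616"); // default broker URL
        Connection connection = factory.createConnection();
        connection.start();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer =
                    session.createProducer(session.createQueue("TEST.QUEUE")); // assumption: queue name
            TextMessage message = session.createTextMessage("hello from the test plan setup");
            producer.send(message);
        } finally {
            connection.close();
        }
    }
}
```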

Webex on Ubuntu 14.04

Webex is a great tool, but unfortunately it doesn’t work “out of the box” on Ubuntu 14.04 (and also with previous Ubuntu releases). For instance, the Webex applet starts but doesn’t refresh correctly, or the sharing of a desktop/application doesn’t work. Actually, the issue is due to: some libraries required by Webex are missing from the Ubuntu installation; Webex expects to run on an i386 (not amd64) platform, so even if you have the libraries installed, you have to install the i386 versions. To find the required libraries, you have to go into the $HOME/.webex/1324 and $HOME/.webex/1424 folders and check the libraries with: ldd *.so | grep -i not For all missing libraries (not found), you have to find the package providing the library using: apt-file search xxxxxx.so Once you have found the package providing the library, you have to install the package for both amd64 (which should be the default if your machine is 64-bit) and i386. For instance: aptitude install libpangox-1.0-...

Apache Syncope backend with Apache Karaf

Apache Syncope is an identity manager (IdM). It comes with a web console where you can manage users, attributes, roles, etc. It also comes with a REST API allowing you to integrate it with other applications. By default, Syncope has its own database, but it can also “façade” another backend (LDAP, ActiveDirectory, JDBC) by using ConnId. In the next releases (4.0.0, 3.0.2, 2.4.0, and 2.3.7), Karaf provides (by default) a SyncopeLoginModule allowing you to use Syncope as the backend for users and roles. This blog introduces this new feature and explains how to configure and use it. Installing Apache Syncope The easiest way to start with Syncope is to use the Syncope standalone distribution. It comes with an Apache Tomcat instance already installed with the different Syncope modules. You can download the Syncope standalone distribution archive from http://www.apache.org/dyn/closer.cgi/syncope/1.1.8/syncope-standalone-1.1.8-distribution.zip . Uncompress the distribution in the directory of ...
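
Conceptually, the SyncopeLoginModule validates credentials with a basic-auth call to the Syncope REST API. Here is a minimal sketch (not the login module's actual code) of such a check; the URL is a placeholder that depends on your Syncope version's REST layout, and the user is an assumption.

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sketch: check credentials against the Syncope REST API with HTTP basic auth.
public class SyncopeCredentialCheck {
    public static void main(String[] args) throws Exception {
        String user = "rossini";      // assumption: a user defined in Syncope
        String password = "password"; // assumption
        // Placeholder URL: verify the REST endpoint of your Syncope version
        URL url = new URL("http://localhost:9080/syncope/rest/users/self");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));
        connection.setRequestProperty("Authorization", "Basic " + token);
        // 200 means Syncope accepted the credentials
        System.out.println("HTTP status: " + connection.getResponseCode());
    }
}
```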

Hadoop CDC and processes notification with Apache Falcon, Apache ActiveMQ, and Apache Camel

Some weeks (months? ;)) ago, I started to work on Apache Falcon. First of all, I would like to thank all the Falcon guys: they are really awesome and do a great job (special thanks to Srikanth, Venkatesh, Swetha). This blog post is a preparation for a set of “recipes documentation” that I will propose in Apache Falcon. Falcon is in incubation at Apache. The purpose is to provide a data processing and management solution for Hadoop designed for data motion, coordination of data pipelines, lifecycle management, and data discovery. Falcon enables end consumers to quickly onboard their data and its associated processing and management tasks on Hadoop clusters. An interesting feature provided by Falcon is notification of the activities in the Hadoop cluster “outside” of the cluster 😉 In this article, we will see how to get two kinds of notifications in Camel routes “outside” of the Hadoop cluster: a Camel route will be notified and triggered when a process is...
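
Falcon publishes its notifications as JMS messages to an ActiveMQ broker, so a Camel route can simply consume from the corresponding topic. Here is a minimal sketch (not the post's actual route); the broker URL and topic name are assumptions that depend on your Falcon broker.url and entity configuration.

```java
import org.apache.activemq.camel.component.ActiveMQComponent;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

// Sketch: a standalone Camel route consuming Falcon JMS notifications.
public class FalconNotificationRoute {
    public static void main(String[] args) throws Exception {
        Main main = new Main();
        // assumption: the broker Falcon is configured to publish to
        main.bind("activemq", ActiveMQComponent.activeMQComponent("tcp://localhost:61616"));
        main.addRouteBuilder(new RouteBuilder() {
            @Override
            public void configure() {
                from("activemq:topic:FALCON.my-process") // assumption: topic name
                        .log("Falcon notification: ${body}");
            }
        });
        main.run();
    }
}
```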