Complete metrics collection and analytics with Apache Karaf Decanter, Apache Kafka and Apache Druid
In this blog post, I will show how to extend Apache Karaf Decanter as a log and metrics collector, with storage and analytics powered by Apache Druid. The idea is to collect machine metrics (using the Decanter OSHI collector, for instance), send them to a Kafka broker, and aggregate and analyze the metrics in Druid.

Apache Kafka

Apache Druid can ingest data through several channels, in streaming mode or batch mode. For this blog post, I will use streaming mode with Apache Kafka.

For the purpose of the blog, I will simply start a ZooKeeper instance:

$ bin/zookeeper-server-start.sh config/zookeeper.properties

and a Kafka 2.6.1 broker:

$ bin/kafka-server-start.sh config/server.properties
...
[2021-01-19 14:57:26,528] INFO [KafkaServer id=0] started (kafka.server.KafkaServer)

I then create a decanter topic where we will send the metrics:

$ bin/kafka-topics.sh --bootstrap-server localhost:9092 --create --topic decanter --partitions 2
$ bin/kafka-topics.sh --bootstrap-server localhost:9092 -...
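On the Karaf side, the Decanter setup boils down to installing the OSHI collector and the Kafka appender, then pointing the appender at the broker and topic above. The sketch below is a minimal example under my assumptions: the feature names match those shipped with Decanter 2.x, and the appender reads standard Kafka client properties (`bootstrap.servers`, `topic`) from its configuration file; check the Decanter documentation for your version.

# In the Karaf console: install the collector and the appender (Decanter 2.x feature names assumed)
karaf@root()> feature:install decanter-collector-oshi
karaf@root()> feature:install decanter-appender-kafka

# etc/org.apache.karaf.decanter.appender.kafka.cfg (assumed property names)
bootstrap.servers=localhost:9092
topic=decanter

With this in place, every metrics event harvested by the OSHI collector is serialized and published to the decanter topic, ready to be picked up by Druid's Kafka ingestion.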