# Kafka

The Kafka input plugin allows subscribing to one or more Kafka topics to collect messages from an [Apache Kafka](https://kafka.apache.org/) service.
This plugin uses the official [librdkafka C library](https://github.com/edenhill/librdkafka) \(built-in dependency\).

## Configuration Parameters

| Key | Description | Default |
| :--- | :--- | :--- |
| brokers | Single entry or list of Kafka brokers separated by comma \(,\), e.g: 192.168.1.3:9092, 192.168.1.4:9092. | |
| topics | Single entry or list of topics separated by comma \(,\) that Fluent Bit will subscribe to. | |
| client\_id | Client id passed to librdkafka. | |
| group\_id | Group id passed to librdkafka. | fluent-bit |
| poll\_ms | Kafka brokers polling interval in milliseconds. | 500 |
| rdkafka.{property} | `{property}` can be any [librdkafka property](https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md); see the example below. | |

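
For example, settings from the librdkafka configuration reference can be forwarded by prefixing them with `rdkafka.`. The snippet below is a minimal sketch: `auto.offset.reset` and `fetch.wait.max.ms` are standard librdkafka properties chosen only for illustration, and the values shown are not plugin defaults.

```text
[INPUT]
    Name    kafka
    Brokers 192.168.1.3:9092
    Topics  some-topic
    # Forwarded as-is to librdkafka (illustrative properties and values)
    rdkafka.auto.offset.reset earliest
    rdkafka.fetch.wait.max.ms 200
```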

## Getting Started

To subscribe to and collect messages from Apache Kafka, you can run the plugin from the command line or through the configuration file:

### Command Line

The **kafka** plugin can read parameters through the **-p** argument \(property\), e.g:

```text
$ fluent-bit -i kafka -o stdout -p brokers=192.168.1.3:9092 -p topics=some-topic
```

### Configuration File

In your main configuration file append the following _Input_ & _Output_ sections:

```text
[INPUT]
    Name    kafka
    Brokers 192.168.1.3:9092
    Topics  some-topic
    poll_ms 100

[OUTPUT]
    Name    stdout
```
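
The optional `client_id`, `group_id` and `poll_ms` parameters from the table above go in the same `[INPUT]` section. A minimal sketch, where the consumer group name `fluent-bit-consumers` is an arbitrary placeholder rather than anything mandated by the plugin:

```text
[INPUT]
    Name      kafka
    Brokers   192.168.1.3:9092
    Topics    some-topic
    # Identifiers handed to librdkafka; the group name below is a placeholder
    client_id fluent-bit
    group_id  fluent-bit-consumers
    poll_ms   500

[OUTPUT]
    Name      stdout
```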

#### Example of using kafka input/output plugins

The Fluent Bit source repository contains a full example of using Fluent Bit to process Kafka records:

```text
[INPUT]
    Name    kafka
    brokers kafka-broker:9092
    topics  fb-source
    poll_ms 100

[FILTER]
    Name    lua
    Match   *
    script  kafka.lua
    call    modify_kafka_message

[OUTPUT]
    Name    kafka
    brokers kafka-broker:9092
    topics  fb-sink
```

The above will connect to the broker listening on `kafka-broker:9092` and subscribe to the `fb-source` topic, polling for new messages every 100 milliseconds.

Every message received is then processed with `kafka.lua` and sent back to the `fb-sink` topic of the same broker.

The example can be executed locally with `make start` in the `examples/kafka_filter` directory (Docker Compose is used).
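
Assuming a local checkout of the Fluent Bit source repository, that amounts to:

```text
$ cd examples/kafka_filter
$ make start
```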