
Commit 340d5c0

input: kafka: Add initial documentation (fluent#1091)
* input: kafka: Add initial documentation
* input: kafka: Address review suggestion
* input: kafka: Put each sentence in a separate line (review request).
* kafka: Fix some typos/errors found during in_kafka review

Signed-off-by: Thiago Padilha <[email protected]>
Co-authored-by: Pat <[email protected]>
1 parent 2d0a1ec commit 340d5c0

2 files changed: +73 -2 lines changed


pipeline/inputs/kafka.md

+71
@@ -0,0 +1,71 @@
# Kafka

The Kafka input plugin allows subscribing to one or more Kafka topics to collect messages from an [Apache Kafka](https://kafka.apache.org/) service.
This plugin uses the official [librdkafka C library](https://github.com/edenhill/librdkafka) \(built-in dependency\).

## Configuration Parameters

| Key | Description | default |
| :--- | :--- | :--- |
| brokers | Single or multiple list of Kafka Brokers, e.g: 192.168.1.3:9092, 192.168.1.4:9092. | |
| topics | Single entry or list of topics separated by comma \(,\) that Fluent Bit will subscribe to. | |
| client\_id | Client id passed to librdkafka. | |
| group\_id | Group id passed to librdkafka. | fluent-bit |
| poll\_ms | Kafka brokers polling interval in milliseconds. | 500 |
| rdkafka.{property} | `{property}` can be any [librdkafka properties](https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md) | |

## Getting Started

In order to subscribe/collect messages from Apache Kafka, you can run the plugin from the command line or through the configuration file:

### Command Line

The **kafka** plugin can read parameters through the **-p** argument \(property\), e.g:

```text
$ fluent-bit -i kafka -o stdout -p brokers=192.168.1.3:9092 -p topics=some-topic
```
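Other parameters from the table above, including pass-through `rdkafka.{property}` settings, can be supplied the same way. A minimal sketch, assuming an extra topic, a custom consumer group and one librdkafka override (all of these values are illustrative, not recommendations):

```text
$ fluent-bit -i kafka -o stdout \
    -p brokers=192.168.1.3:9092 \
    -p topics=some-topic,other-topic \
    -p group_id=my-group \
    -p poll_ms=100 \
    -p rdkafka.enable.auto.commit=false
```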
### Configuration File

In your main configuration file append the following _Input_ & _Output_ sections:

```text
[INPUT]
    Name kafka
    Brokers 192.168.1.3:9092
    Topics some-topic
    poll_ms 100

[OUTPUT]
    Name stdout
```
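The same applies here: `client_id`, `group_id` and any pass-through `rdkafka.{property}` key from the table above can be added to the `[INPUT]` section. A hedged sketch (the group/client names and the librdkafka property values are illustrative assumptions):

```text
[INPUT]
    Name kafka
    Brokers 192.168.1.3:9092
    Topics some-topic
    poll_ms 100
    group_id my-consumer-group
    client_id fluent-bit-example
    # Any librdkafka property can be passed through with the rdkafka. prefix
    rdkafka.enable.auto.commit false
    rdkafka.fetch.wait.max.ms 200

[OUTPUT]
    Name stdout
```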
#### Example of using kafka input/output plugins

The fluent-bit source repository contains a full example of using fluent-bit to process kafka records:

```text
[INPUT]
    Name kafka
    brokers kafka-broker:9092
    topics fb-source
    poll_ms 100

[FILTER]
    Name lua
    Match *
    script kafka.lua
    call modify_kafka_message

[OUTPUT]
    Name kafka
    brokers kafka-broker:9092
    topics fb-sink
```

The above will connect to the broker listening on `kafka-broker:9092` and subscribe to the `fb-source` topic, polling for new messages every 100 milliseconds.

Every message received is then processed with `kafka.lua` and sent back to the `fb-sink` topic of the same broker.

The example can be executed locally with `make start` in the `examples/kafka_filter` directory (docker/compose is used).

pipeline/outputs/kafka.md

+2 -2
@@ -11,7 +11,7 @@ Kafka output plugin allows to ingest your records into an [Apache Kafka](https:/
| message\_key\_field | If set, the value of Message\_Key\_Field in the record will indicate the message key. If not set nor found in the record, Message\_Key will be used \(if set\). | |
| timestamp\_key | Set the key to store the record timestamp | @timestamp |
| timestamp\_format | Specify timestamp format, should be 'double', '[iso8601](https://en.wikipedia.org/wiki/ISO_8601)' (seconds precision) or 'iso8601_ns' (fractional seconds precision) | double |
-| brokers | Single of multiple list of Kafka Brokers, e.g: 192.168.1.3:9092, 192.168.1.4:9092. | |
+| brokers | Single or multiple list of Kafka Brokers, e.g: 192.168.1.3:9092, 192.168.1.4:9092. | |
| topics | Single entry or list of topics separated by comma \(,\) that Fluent Bit will use to send messages to Kafka. If only one topic is set, that one will be used for all records. Instead if multiple topics exists, the one set in the record by Topic\_Key will be used. | fluent-bit |
| topic\_key | If multiple Topics exists, the value of Topic\_Key in the record will indicate the topic to use. E.g: if Topic\_Key is _router_ and the record is {"key1": 123, "router": "route\_2"}, Fluent Bit will use topic _route\_2_. Note that if the value of Topic\_Key is not present in Topics, then by default the first topic in the Topics list will indicate the topic to be used. | |
| dynamic\_topic | adds unknown topics \(found in Topic\_Key\) to Topics. So in Topics only a default topic needs to be configured | Off |
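Taken together, the `topics`, `topic_key` and `dynamic_topic` rows above describe how the destination topic is selected per record. A hedged sketch of an `[OUTPUT]` section combining them (the broker address and topic names are illustrative): records carrying a `router` key are sent to the topic named by that key, which `dynamic_topic` adds to the topic list on the fly, while records without it fall back to `default-topic`.

```text
[OUTPUT]
    Name kafka
    Match *
    brokers 192.168.1.3:9092
    topics default-topic
    topic_key router
    dynamic_topic On
```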
@@ -26,7 +26,7 @@ In order to insert records into Apache Kafka, you can run the plugin from the co
### Command Line

-The **kafka** plugin, can read the parameters from the command line in two ways, through the **-p** argument \(property\), e.g:
+The **kafka** plugin can read parameters through the **-p** argument \(property\), e.g:

```text
$ fluent-bit -i cpu -o kafka -p brokers=192.168.1.3:9092 -p topics=test
```
