How To Send Data To Kafka Using Node-kafka
Getting started with NodeJS and Kafka
In this tutorial, I'll talk about how I set up Kafka to provide website analytics and event tracking for a web application. For my use case, I wanted to store and process raw event data in real time. With Google Analytics, you can't do that. Enter Kafka.
What is Kafka?
Kafka is a streaming platform that allows applications to produce and consume messages. It is used for building real-time data pipelines and streaming apps. Some use cases of Kafka include:
- Messaging
- Website Activity Tracking
- Aggregating statistics from various providers
- Log aggregation
Here's a high-level architecture diagram of how Kafka works:
To understand Kafka, let's first list some of the basics:
- Applications that send data to Kafka are called producers
- Applications that read data from Kafka are called consumers
- Producers send records. Each record is associated with a topic. A topic is like a category. Each record consists of a key, value, and timestamp.
- Consumers can subscribe to a given topic and receive a stream of records, being alerted whenever a new record is sent.
- In the event that a consumer goes down, it is able to restart streaming from where it left off, by keeping track of the topic's offset.
- Kafka guarantees the order of messages within a given topic partition, regardless of the number of consumers or producers
Through Kafka's architecture, we are able to decouple the production of messages from the consumption of them.
Setting up Kafka locally
To set up Kafka locally, we're just going to follow the Quickstart Guide on the Kafka website. Follow steps 1 to 4 of the guide.
First, download Kafka. Then, un-tar it:
$ tar -xzf /path/to/kafka_2.11-1.0.0.tgz
$ cd kafka_2.11-1.0.0
Kafka uses ZooKeeper, so you need to first start a ZooKeeper server if you don't already have one. You can use the convenience script packaged with Kafka to get a quick-and-dirty single-node ZooKeeper instance.
$ bin/zookeeper-server-start.sh config/zookeeper.properties
Now start the Kafka server:
$ bin/kafka-server-start.sh config/server.properties
Let's create a topic named webevents.dev with a single partition and only one replica:
$ bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic webevents.dev
We can now see that topic if we run the list topic command:
$ bin/kafka-topics.sh --list --zookeeper localhost:2181
webevents.dev
Alternatively, instead of manually creating topics, you can also configure your brokers to auto-create topics when a non-existent topic is published to.
Now, ZooKeeper is up and running on localhost:2181, and the Kafka broker is listening on localhost:9092.
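To sanity-check the setup before writing any code, you can use the console producer and consumer scripts that ship with the Kafka distribution (this is step 4 and 5 of the quickstart; the test message shown here is just an example):

```shell
# Send a test message to the topic (type a line, press Enter, Ctrl-C to exit).
$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic webevents.dev
> hello kafka

# In another terminal, read the message back from the beginning of the topic.
$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic webevents.dev --from-beginning
hello kafka
```

If the consumer prints the message you typed, the broker, ZooKeeper, and the topic are all wired up correctly.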
Writing a Kafka Producer in JavaScript
We can write a Producer in JavaScript using the kafka-node npm module. Depending on your use case, you may choose to have the producer live on its own server, or integrate it with your existing web application.
Regardless of what you decide, you can create a service like the one below to encapsulate the logic associated with producing records to Kafka.
Basic Kafka Producer Service
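The original code embed isn't reproduced here, but a minimal sketch of such a service might look like the following. It assumes the kafka-node npm module and a broker on localhost:9092; the helper names buildPayload and withProducer are illustrative, not part of the original article.

```javascript
// kafkaService.js -- a sketch of a producer service built on kafka-node.
const TOPIC = 'webevents.dev';

// Build the payload array that kafka-node's Producer#send expects:
// one entry per topic, with the record JSON-stringified into `messages`.
function buildPayload(recordObj) {
  return [{ topic: TOPIC, messages: JSON.stringify(recordObj) }];
}

// Lazily create a connected producer. kafka-node is required here, at call
// time, so that merely loading this module does not open a broker connection.
function withProducer(callback) {
  const kafka = require('kafka-node');
  const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
  const producer = new kafka.Producer(client);
  producer.on('ready', () => callback(null, producer));
  producer.on('error', (err) => callback(err));
}

const KafkaService = {
  // Send a record object to the topic; the callback is optional.
  sendRecord(recordObj, callback = () => {}) {
    withProducer((err, producer) => {
      if (err) return callback(err);
      producer.send(buildPayload(recordObj), callback);
    });
  },
};

module.exports = { KafkaService, buildPayload };
```

A caller might then invoke it as `KafkaService.sendRecord({ userId, sessionId, data }, err => { /* handle error */ })`.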
The code above exports a KafkaService object with a single public method, sendRecord(recordObj, callback). It accepts a record object and an optional callback.
Importantly, all of these records are sent against the topic, webevents.dev. This will tell my consumer (which we will write next) what topic to listen to.
Depending on your use case, the data that you send will change. For my use case, I'm interested in website event tracking, so I send some anonymized user data such as userId and sessionId, as well as arbitrary JavaScript data that is stored in data. This is all JSON-stringified and sent to the associated topic.
Check out the Producer API for kafka-node to learn more about what you can do with Producers.
Writing a Kafka Consumer in JavaScript
A Kafka Consumer can also be written with the kafka-node npm module. For my use case, my consumer was a separate Express server which listened to events and stored them in a database.
However, you can do all sorts of interesting things with consumers, such as emailing, logging, performing real-time data analysis, and more.
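The consumer gist from the original article isn't shown here, but a consumer along those lines might be sketched as follows. It assumes kafka-node and a broker on localhost:9092; parseRecord, handleRecord, and startConsumer are illustrative names, and handleRecord is a hypothetical stand-in for whatever database write you use.

```javascript
// kafka-consumer.js -- a sketch of a consumer that listens to webevents.dev.
const TOPIC = 'webevents.dev';

// Parse a raw Kafka message into a JavaScript object. Producers in this
// tutorial JSON-stringify records, so the message value is a JSON string.
function parseRecord(message) {
  return JSON.parse(message.value);
}

// Hypothetical stand-in for database storage -- swap in your own logic.
function handleRecord(record) {
  console.log('storing record for user', record.userId);
}

// Connect to the broker and stream records from the topic. kafka-node is
// required here, at call time, so the file can be loaded without a broker.
function startConsumer() {
  const kafka = require('kafka-node');
  const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
  const consumer = new kafka.Consumer(client, [{ topic: TOPIC }], { autoCommit: true });
  consumer.on('message', (message) => handleRecord(parseRecord(message)));
  consumer.on('error', (err) => console.error(err));
}

// startConsumer(); // uncomment to begin listening
```

With autoCommit enabled, the consumer periodically records its offset, which is what lets it resume from where it left off after a restart.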
This consumer just listens to the webevents.dev topic, and whenever a new record comes in, it stores it in the database. A single consumer can also listen to multiple topics.
Check out the Consumer API for kafka-node to learn more about how consumers work.
Once you write a consumer such as this, you can start it by running:
node kafka-consumer.js
Now that we have all our services running (our Kafka instance, our producer, and our consumer), we can effectively start producing and consuming events.
I hope that helps you get a grasp on how to go started with Kafka and NodeJS. If you have any questions or comments, you can tweet them to me @tilomitra.
Source: https://tilomitra.medium.com/getting-started-with-nodejs-and-kafka-36b174853cf8