# cds-integration-sample

This project demonstrates the integration of Cloudant with Kafka and Whisk, as depicted in the diagram below. The entities below the dotted line are not yet implemented, but the integration concepts are apparent in the implemented portion of the flow.

*(Figure: integration flow diagram)*

## Setup

The integration sample depicted in this project was built using Python 2.7, targeting a Kafka server running locally. This section details the prerequisites and setup instructions needed for the integration sample flow to execute successfully.

### Prerequisites

- Python 2.7
- Preferably a clean Python virtual environment
- The `cloudant-python` client library (installed when `setup.py` is executed)
- The `kafka-python` client library (installed when `setup.py` is executed)
- Kafka and Zookeeper instances running locally

### Kafka Setup

Once Kafka and Zookeeper have been installed, perform the following steps to start the server instances and register a sample topic. If running locally, these steps will need to be repeated after every reboot of your machine.

1. Start Zookeeper:
   - From a terminal window, `cd` to the Kafka root directory (e.g. `/users/alfinkel/kafka/kafka_2.10-0.8.2.0`).
   - Execute `bin/zookeeper-server-start.sh config/zookeeper.properties`.
2. Start Kafka:
   - From a different terminal window, `cd` to the Kafka root directory.
   - Execute `bin/kafka-server-start.sh config/server.properties`.
3. Create the Kafka topic:
   - From a different terminal window, `cd` to the Kafka root directory.
   - Execute `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic sample-topic`.
     - This creates a topic named `sample-topic`.
     - You can verify that the topic exists by listing topics: `bin/kafka-topics.sh --list --zookeeper localhost:2181`.
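Once the broker is up, the same verification can also be done programmatically. The sketch below assumes the `kafka-python` library is installed and the broker is listening on `localhost:9092`; the `topic_exists` helper name is illustrative, not part of this project:

```python
def topic_exists(topic, bootstrap_servers="localhost:9092"):
    """Return True if `topic` is registered on the broker."""
    # Imported lazily so the helper can be defined without kafka-python installed.
    from kafka import KafkaConsumer
    consumer = KafkaConsumer(bootstrap_servers=bootstrap_servers)
    try:
        # KafkaConsumer.topics() returns the set of topics known to the cluster.
        return topic in consumer.topics()
    finally:
        consumer.close()

# With the broker from the steps above running:
# topic_exists("sample-topic")  -> True
```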

### Whisk Setup

One way to configure Whisk is with the Whisk CLI, which is the method documented below. After setting up Whisk, perform the following steps to register the appropriate action, trigger, and rule.

1. Create the Whisk action `update_jules`:
   - From a terminal window, `cd` to the Whisk directory that contains the `update_jules.js` file.
   - Execute `wsk action create update_jules update_jules.js`. You can verify that the action was created by issuing the `wsk action list` command.
2. Create the Whisk trigger `updateJules`:
   - From the same terminal window, execute `wsk trigger create updateJules`. You can verify that the trigger was created by issuing the `wsk trigger list` command.
3. Create the Whisk rule `updateJulesRule`, associating the `updateJules` trigger with the `update_jules` action:
   - From the same terminal window, execute `wsk rule create updateJulesRule updateJules update_jules`. You can verify that the rule was created by issuing the `wsk rule list` command.

## Producer and Consumer

The producer and consumer are both implemented in Python 2.7 using the `cloudant-python` and `kafka-python` client libraries.

### Python Project Setup

From a terminal, and preferably within a clean Python virtual environment, `cd` to the root of this Python project and execute `python setup.py install`. This loads all of the necessary dependencies for this project into the virtual environment. The project setup only needs to happen once.

### Configuration File Setup

TODO

### Running the Consumer

From a terminal, and from the root of this Python project, `cd` to `src/sample` and execute `python process_changes.py`. This starts a consumer that polls the Kafka server for messages sent to the topic `sample-topic`.
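The consumer's core loop can be sketched as follows. The `handle_change` helper and the message shape are illustrative assumptions, not this project's actual code; a real run requires `kafka-python` and a broker on `localhost`:

```python
import json

def handle_change(raw_message):
    # A Cloudant change arrives as a JSON document such as
    # {"seq": 1, "id": "doc1", "changes": [{"rev": "1-abc"}]}.
    change = json.loads(raw_message)
    return change["id"], change["changes"][0]["rev"]

def consume(topic="sample-topic", servers="localhost:9092"):
    from kafka import KafkaConsumer  # assumes kafka-python is installed
    # Poll the broker and print each decoded change as it arrives.
    for message in KafkaConsumer(topic, bootstrap_servers=servers):
        print(handle_change(message.value))
```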

### Running the Producer

From a terminal, and from the root of this Python project, `cd` to `src/sample` and execute `python produce_changes.py`. This starts a producer that creates a Cloudant database and watches that database's changes feed. It also creates 10 documents, which in turn cause the changes feed to produce output. Upon receiving a change, the producer sends it as a message to the `sample-topic` topic.
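At its core, the producer serializes each change from the database's `_changes` feed into a Kafka message payload. Below is a minimal sketch with a stubbed feed; the real script iterates the live feed via `cloudant-python` and sends payloads with `kafka-python`'s `KafkaProducer`, and the helper name is illustrative:

```python
import json

def changes_to_messages(changes_feed):
    # Serialize each change document into a UTF-8 JSON payload for Kafka.
    for change in changes_feed:
        yield json.dumps(change).encode("utf-8")

# Stubbed changes feed standing in for the live Cloudant _changes output:
feed = [{"seq": 1, "id": "doc1", "changes": [{"rev": "1-abc"}]}]
messages = list(changes_to_messages(feed))
# Each payload would then be sent with producer.send("sample-topic", payload).
```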