DZone
Camel Kafka Connector: No Code, No Hassle Integrations

In this article, we discuss how the combination of Camel components and Kafka makes integrations with Kafka easier, more stable, and more versatile.

By Chandra Shekhar Pandey · Jan. 16, 21 · Tutorial


Hi,

In this article, we are going to discuss Camel Kafka Connectors. Apache Camel has more than 300 components for integrating different endpoints and protocols. This combination of Camel components and Kafka makes integrations with Kafka easier, more stable, and more versatile. And not a single line of code is required.

We can find more details about Apache Camel Kafka Connectors in the community documentation. There are two types of connectors: source and sink. As the documentation puts it:

Camel-Kafka Source Connector is a pre-configured Camel consumer which will perform the same action on a fixed rate and send the exchanges to Kafka, while a Camel-Kafka Sink Connector is a pre-configured Camel producer which will perform the same operation on each message exported from Kafka.
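To make the distinction concrete, here is a sketch of what a source connector's configuration could look like, as the mirror image of the SSH sink we build below. This is purely illustrative: the camel-file connector class and the directory option are assumptions based on the 0.7.x camel-kafka-connector naming scheme, not part of this tutorial's setup.

```shell
# Hypothetical source-connector configuration for contrast: a camel-file
# source would poll a local directory and publish what it finds to Kafka,
# the opposite data direction of the SSH sink configured later in this
# article. (Class and option names here are assumptions.)
cat > CamelFileSourceConnector.properties <<'EOF'
name=CamelFileSourceConnector
connector.class=org.apache.camel.kafkaconnector.file.CamelFileSourceConnector
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
topics=testTopic
camel.source.path.directoryName=/tmp/camel-input
EOF
```

Either kind of connector is just a properties file handed to Kafka Connect; the `camel.source.*` vs. `camel.sink.*` prefixes mark the direction of data flow.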

In this article, we will set up a sink connector based on the Camel SSH component. This example is based on camel-kafka-connector-examples.

I have tested this on Fedora 33 with Apache Kafka 2.7.0 and Podman. We will use Podman to run the SSH server and the KafkaCat utility that sends messages to Kafka.

So let us get started.

1. Let us first download camel-ssh-kafka-connector. At the time of writing this article, the version I downloaded was camel-ssh-kafka-connector-0.7.0-package.zip.

2. I extracted it on my local disk.

```shell
[chandrashekhar@localhost camel-kafka-connectors]$ ls -ltrh |grep ssh
drwxr-xr-x. 2 chandrashekhar chandrashekhar 4.0K Dec 21 18:50 camel-ssh-kafka-connector
-rw-rw-r--. 1 chandrashekhar chandrashekhar  31M Dec 21 23:20 camel-ssh-kafka-connector-0.7.0-package.zip
[chandrashekhar@localhost camel-kafka-connectors]$
```

3. Download Apache Kafka. At the time of writing this article, the latest version was Kafka 2.7.0.

4. Start Zookeeper and Kafka, then create a topic named testTopic.

```shell
# Start Zookeeper.
[chandrashekhar@localhost kafka_2.13-2.7.0]$ ./bin/zookeeper-server-start.sh config/zookeeper.properties

# In a second terminal, start the Kafka broker.
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/kafka-server-start.sh config/server.properties 

# In a third terminal, create the topic.
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic testTopic
Created topic testTopic.
```

5. Start the SSH server using Podman.

```shell
[chandrashekhar@localhost strimzi-0.20.1]$ podman run -d -P --name test_sshd rastasheep/ubuntu-sshd:14.04

# check ports mapped
[chandrashekhar@localhost strimzi-0.20.1]$ podman port test_sshd
22/tcp -> 0.0.0.0:21947
[chandrashekhar@localhost strimzi-0.20.1]$
```

6. Set up the plugin path of camel-ssh-kafka-connector in Kafka.

```shell
# Note the path where the connector is extracted.
[chandrashekhar@localhost camel-kafka-connectors]$ pwd
/home/chandrashekhar/SSD/Development_SSD/Streams_RH/camel-kafka-connectors
[chandrashekhar@localhost camel-kafka-connectors]$ ls -ltr|grep ssh
drwxr-xr-x. 2 chandrashekhar chandrashekhar     4096 Dec 21 18:50 camel-ssh-kafka-connector
-rw-rw-r--. 1 chandrashekhar chandrashekhar 31543032 Dec 21 23:20 camel-ssh-kafka-connector-0.7.0-package.zip
[chandrashekhar@localhost camel-kafka-connectors]$

# In [KAFKA_HOME]/config/connect-standalone.properties, configure plugin.path
# with the path where the connector is extracted.
[chandrashekhar@localhost kafka_2.13-2.7.0]$ vi config/connect-standalone.properties
plugin.path=/home/chandrashekhar/SSD/Development_SSD/Streams_RH/camel-kafka-connectors
```

7. Set up the connector with a CamelSshSinkConnector.properties file, which holds the SSH sink configuration.

```shell
[chandrashekhar@localhost config]$ pwd
/home/chandrashekhar/SSD/Development_SSD/Streams_RH/camel-kafka-connectors/config
[chandrashekhar@localhost config]$ vi CamelSshSinkConnector.properties
name=CamelSshSinkConnector
connector.class=org.apache.camel.kafkaconnector.ssh.CamelSshSinkConnector
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
# kafka topic
topics=testTopic
camel.sink.path.host=localhost
# ssh port
camel.sink.path.port=21947
camel.sink.endpoint.username=root
camel.sink.endpoint.password=root
```

8. Run the connector in standalone mode. Since this is a POC on a single-node Kafka setup, we will use [KAFKA_HOME]/bin/connect-standalone.sh.

```shell
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/connect-standalone.sh config/connect-standalone.properties ../camel-kafka-connectors/config/CamelSshSinkConnector.properties
```

9. Create a file of Linux commands that will create a file on the SSH server and then append some records to it.

```shell
[chandrashekhar@localhost config]$ cat sshCommands.txt 
touch sshexample.txt
echo 'apple is fruit' >> sshexample.txt
echo 'rose is flower' >> sshexample.txt
[chandrashekhar@localhost config]$
```


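Each Kafka record's value will be executed as a command in the SSH session, so every line of this file becomes one remote command. The following local simulation (no SSH or Kafka involved, just plain sh) sketches the effect those three lines will have on the server:

```shell
# Recreate the commands file locally and run each line through sh,
# mimicking what the camel-ssh sink will do on the remote host.
cat > sshCommands.txt <<'EOF'
touch sshexample.txt
echo 'apple is fruit' >> sshexample.txt
echo 'rose is flower' >> sshexample.txt
EOF

while IFS= read -r cmd; do
  sh -c "$cmd"
done < sshCommands.txt

# The resulting file holds the two appended records.
cat sshexample.txt
```

This is exactly the state we will verify over SSH in step 11.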

10. Send the records within sshCommands.txt to Kafka using the KafkaCat utility. Here we are using Podman to run a KafkaCat Docker image to send the messages.

```shell
[chandrashekhar@localhost config]$ pwd
/home/chandrashekhar/SSD/Development_SSD/Streams_RH/camel-kafka-connectors/config
[chandrashekhar@localhost config]$ ls -ltr|grep ssh
-rwxrwxrwx. 1 chandrashekhar chandrashekhar 101 Jan 16 23:19 sshCommands.txt
[chandrashekhar@localhost config]$ sudo podman run -it --network=host --volume `pwd`/sshCommands.txt:/data/sshCommands.txt --security-opt label=disable edenhill/kafkacat:1.6.0 -P -b 0.0.0.0:9092 --name kafkacat -t testTopic -P -l /data/sshCommands.txt
[sudo] password for chandrashekhar: 
[chandrashekhar@localhost config]$
```



11. Now, with the messages sent to testTopic and the camel-ssh-kafka-connector sink already running, the SSH server we started earlier should have received and executed these commands. The username and password of this SSH server are both root.

```shell
[chandrashekhar@localhost config]$ ssh root@localhost -p 21947
root@localhost's password: 
Last login: Sat Jan 16 17:59:21 2021 from localhost
root@30aebbbcdb82:~# ls
sshexample.txt
root@30aebbbcdb82:~# cat sshexample.txt 
apple is fruit
rose is flower
root@30aebbbcdb82:~#
```

12. Once tested, we can stop the Podman container.

```shell
[chandrashekhar@localhost strimzi-0.20.1]$ podman ps -a|grep ssh
36ffa61633bd  docker.io/rastasheep/ubuntu-sshd:14.04           /usr/sbin/sshd -D     8 minutes ago  Up 8 minutes ago        0.0.0.0:26237->22/tcp   test_sshd
[chandrashekhar@localhost strimzi-0.20.1]$ 
[chandrashekhar@localhost strimzi-0.20.1]$ podman container rm -f test_sshd
36ffa61633bd9c5fef110e550a9eb789660d31d0b4949b0da919b1f00269efe8
[chandrashekhar@localhost strimzi-0.20.1]$ 
[chandrashekhar@localhost strimzi-0.20.1]$ podman ps -a|grep ssh
[chandrashekhar@localhost strimzi-0.20.1]$
```

13. Finally, we can stop Kafka and Zookeeper. The connector instance can be closed with Ctrl + C in its terminal, or by simply closing that terminal.

```shell
[chandrashekhar@localhost bin]$ pwd
/home/chandrashekhar/SSD/Development_SSD/Streams_RH/kafka_2.13-2.7.0/bin
[chandrashekhar@localhost bin]$ ./kafka-server-stop.sh 
[chandrashekhar@localhost bin]$ ./zookeeper-server-stop.sh
```

14. Another important point is to check the consumer group and offset details associated with testTopic.

```shell
# get the list of all consumer groups
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/kafka-consumer-groups.sh --list --bootstrap-server localhost:9092
connect-CamelSshSinkConnector
[chandrashekhar@localhost kafka_2.13-2.7.0]$ 
# check offset details of the connect-CamelSshSinkConnector group
[chandrashekhar@localhost kafka_2.13-2.7.0]$ bin/kafka-consumer-groups.sh --describe --group connect-CamelSshSinkConnector --bootstrap-server localhost:9092

Consumer group 'connect-CamelSshSinkConnector' has no active members.

GROUP                         TOPIC           PARTITION  CURRENT-OFFSET  LOG-END-OFFSET  LAG             CONSUMER-ID     HOST            CLIENT-ID
connect-CamelSshSinkConnector testTopic       0          12              12              0               -               -               -

[chandrashekhar@localhost kafka_2.13-2.7.0]$
```

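The LAG column is simply LOG-END-OFFSET minus CURRENT-OFFSET, so a lag of 0 confirms the sink has consumed every record. As a quick sanity check you can filter the --describe output for non-zero lag; here is a sketch run against the sample output above (the awk field positions assume the column layout shown):

```shell
# Feed the sample --describe output through awk and print any partition
# whose LAG column (field 6) is still non-zero; no output means the sink
# group is fully caught up.
describe_output='GROUP                         TOPIC           PARTITION  CURRENT-OFFSET  LOG-END-OFFSET  LAG             CONSUMER-ID     HOST            CLIENT-ID
connect-CamelSshSinkConnector testTopic       0          12              12              0               -               -               -'

echo "$describe_output" | awk 'NR > 1 && $6 + 0 > 0 { print "lagging: " $2 " partition " $3 }'
```

In a live setup you would pipe the real kafka-consumer-groups.sh output into the same awk filter instead of the saved sample.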


That's it! I hope you found this article interesting and informative.
