Most sensible video on this Kafka topic, thank you very much.
Dear Vishal,
Thank you for your effort & time to produce & upload these tutorials.
Thanks a million
Nice video bro. It worked for me. One quick question for you:
How to run query like this:
select id,CUST_LN_NBR FROM flex_activity limit 1;
In this query I am using the LIMIT option, and because of that it is failing. If I use a simple query without LIMIT, it works fine.
Error I am getting:
org.apache.kafka.connect.errors.ConnectException: java.sql.SQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'WHERE `ID` > -1 ORDER BY `ID` ASC' at line 1
This is not only with the LIMIT option; it happens whenever you have to apply any filter.
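For anyone hitting this: in query mode the JDBC source connector appends its own incremental clause (the WHERE `ID` > ? ORDER BY `ID` ASC in the error above) directly after your SQL, which is why LIMIT or a trailing filter breaks the generated statement. A common workaround is to wrap your filters in a subquery so the appended clause still lands in a valid spot. A minimal sketch, with the connection details assumed:

name=flex-activity-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:mysql://localhost:3306/mydb
mode=incrementing
incrementing.column.name=ID
topic.prefix=flex-activity-
# wrap filters in a subquery; the connector appends
# "WHERE `ID` > ? ORDER BY `ID` ASC" after this text
query=SELECT * FROM (SELECT ID, CUST_LN_NBR FROM flex_activity WHERE CUST_LN_NBR IS NOT NULL) t

Note that LIMIT itself still cannot be used, since the connector's appended clause must come last in the statement.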
Thanks for the great tutorial. Did you have to create the Avro schema, or does it get generated?
nvm, I found the answer in the video.
Very helpful video. I see the 'timestamp.column.name' should have been TXN_DATE (as flashed up on the video), otherwise updates won't work. It would have been nice to see this working at the end, in addition to adding new records.
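For reference, a minimal sketch of the relevant settings for picking up both updates and new rows (TXN_DATE is from the video; the incrementing column name is assumed):

mode=timestamp+incrementing
timestamp.column.name=TXN_DATE
incrementing.column.name=ID
# allow the connector to start even if the timestamp column is nullable
validate.non.null=false

With timestamp+incrementing mode the connector uses TXN_DATE to detect updated rows and ID to detect newly inserted ones.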
Awesome!! Can you also upload a video on the sink connector to an Oracle database or any RDBMS?
Hi Vishal,
I am facing the issue "Invalid connector configuration: There are 2 fields that require your attention"
while connecting to a MySQL DB. Does this require MySQL JDBC configuration as well? If yes, how can we do that?
Hi Vishal, could you please help me with pushing Confluent topic data into a ScyllaDB table?
Really cool bro!
I'm a beginner on Kafka and DB concepts. What exactly is a dialect in this case?
Hi Vishal, thank you very much for this! :)
I have one question: how did you install the connector plugin for the sink?
Hi, thank you very much for your positive feedback. I installed it using the confluent-hub command, which you can run from the location where the Confluent bundle was downloaded.
Please, what is the best method or connector available to pull or ingest data that normally gets updated or changed? For example, bank account statements.
Thanks.
It depends on where the data is present. Kafka provides a range of connectors to source data. In special cases we could implement our own connector too.
@@javatechlearning What if the data is present in a legacy application table that has no timestamp column? In this case, how can we ingest the records anytime there is an update to an already ingested record?
Thanks.
Hey, I tried to connect to PostgreSQL and finally landed on the below errors in the logs:
Caused by: org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: org.apache.kafka.connect.errors.DataException: Failed to serialize Avro data from topic
Any help?
It seems you're not using the right Avro schema to read events from the topic. Try comparing topic data samples with your schema.
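In case it helps: Avro (de)serialization in Connect is driven by the converter settings on the worker or connector. A minimal sketch, assuming Schema Registry is running locally on port 8081:

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081

If these don't match how the data was produced, serialization errors like the one above are the usual symptom.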
Hi Vishal, when I tried to upload the connector I faced the issue "invalid connector class". I need advice. Thanks.
I think you might not have installed the connector plugin before uploading.
@@javatechlearning Yes, correct, thank you Vishal. Now I have another problem, with the JDBC jar.
@@javatechlearning Hi Vishal, I am facing the issue "no suitable driver found" for JDBC MySQL. I have already put the jar file in /kafka/share.. Please, I need your advice. Thank you.
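A common fix, sketched under the assumption of a default Confluent Hub layout (adjust the paths to your install): the MySQL driver jar must sit inside the JDBC connector plugin's own lib directory, not a generic share folder, and the Connect worker must be restarted afterwards.

# hypothetical paths; <version> is whatever driver jar you downloaded
cp mysql-connector-java-<version>.jar \
  $CONFLUENT_HOME/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib/
# restart the Connect worker so the driver is picked up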
Hi Vishal, thanks for sharing the videos. Please let me know where I can check the errors/logs when connectors fail, and also the logs/errors for any issues in processing data. Please share the paths.
Where is your Confluent server running? Locally, or on Confluent Cloud?
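For a local Confluent Platform started with the CLI, two places usually worth checking (commands vary by CLI version, so treat this as a sketch):

# newer Confluent CLI: tail the Connect worker log directly
confluent local services connect log
# or print the runtime directory and inspect the worker logs under it
confluent local current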
Could you please do a video on loading tables into their respective topics, one topic per table, using SQL Server as a source? So that every time a table undergoes a DML action, the new event is sent to the Confluent platform.
wtf, you are good
Great video! Can you help with connecting to snowflake as source?
Hi Mirza, did you find the answer for "Snowflake as source connector"? If yes, please let me know.
Hi Vishal,
Could you please help me with the Snowflake source connector (via JDBC) to unload data to Confluent Kafka?
If there's any sample code, please share the GitHub details.
Can you please guide me on how to do this locally instead of on Confluent Cloud?
What I want to do is connect a database with Kafka Connect so that the data gets inserted into the Kafka cluster.
I did it locally. I'm not using Confluent Cloud.
@@javatechlearning I meant you used Control Center, right? What I am trying to do is run ZooKeeper and Kafka from the command line, and then I am stuck on how to connect the database to Kafka. I saw some tutorials but they are all confusing. Do I need Schema Registry or anything?
Sorry, I couldn't follow you much. The Control Center GUI is only for monitoring purposes. The main services are Connect, Schema Registry, Kafka, ZooKeeper, etc. They all should be started in order to connect the database.
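If you're using the Confluent CLI locally, a sketch of the startup (the single command below starts everything in dependency order; components can also be started one by one):

# starts zookeeper -> kafka -> schema-registry -> ... -> connect -> control-center
confluent local services start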
@@javatechlearning Okay, got it. Sorry, I got confused... I thought the GUI was part of the cloud. Thanks, man!
My Confluent installation doesn't have the JDBC source connector by default. Please help me with how to install it.
Hello, you need to install the JDBC Connect plugin using Confluent Hub.
You can find the command in my video.
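For reference, the install command is along these lines (run from the Confluent bundle's bin directory; pin a specific version if you prefer):

confluent-hub install confluentinc/kafka-connect-jdbc:latest
# then restart the Connect worker so the plugin is discovered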
@@javatechlearning Thanks for your time, I figured it out; my source connector is up now. Would it be possible to make a video on how to read from MySQL and write into MongoDB? Or could you share any known sources that cover it?
You first need to set up a JDBC source connector to read the MySQL data, and then a sink connector to sink the topic data to the NoSQL DB.
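A rough sketch of that pipeline as two connector configs (the class names are the standard ones for the Confluent JDBC and official MongoDB connectors; hosts, database, and table names here are made up):

# source: MySQL -> Kafka
name=mysql-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:mysql://localhost:3306/mydb
mode=incrementing
incrementing.column.name=ID
table.whitelist=mytable
topic.prefix=mysql-

# sink: Kafka -> MongoDB
name=mongo-sink
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
connection.uri=mongodb://localhost:27017
topics=mysql-mytable
database=mydb
collection=mytable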
@@javatechlearning I am not able to find the command; please let me know the steps to install the JDBC Connect plugin.
From Oracle, it is unable to fetch data. Is there any license issue?
Thank you for your efforts and time in making this wonderful video.
Thanks for your feedback 🙂
Sir, nice video, but at least you could upload the code link.
Sure, will provide it.
Please find code link - github.com/vishaluplanchwar/KafkaTutorials
Can you give us the link for the config file?
Thanks for the video, Vishal. I wanted to know: we can also source an Oracle DB using the JDBC source connector too, can't we? And where did you deploy the connector config file? In Confluent Cloud?
Yes, sourcing an Oracle DB into a Kafka topic is done by the JDBC source connector. Answer to the second question: I deploy using Control Center on my local machine itself. Watch my installation video.
@@javatechlearning Ok. Then when is the JDBC sink connector used?
To sink topic data from Kafka into any relational database.
@@javatechlearning Can you tell me whether the Confluent Platform, if installed on my local machine, is chargeable? And how do I get started with the Confluent Platform on my local machine, if you could instruct?
Confluent Platform for a single-node machine is free. You can download it and set it up on your machine.
How do I download the Kafka Connect JDBC connector?
Could you please provide me the link?
github.com/vishaluplanchwar/KafkaTutorials
Sir, can you provide the JDBC driver link?
Could anyone paste the config properties in a comment?
Thanks. I need your email, please.
hi
By using your code the connector got connected, but its status went to FAILED and no data is fetched from the DB. Can I know what the reason might be?
Did you check the connector logs? Is any error coming up there?
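One quick way to see why a connector task failed (standard Connect REST API; substitute your connector's name and worker host):

# shows the task state plus the failure stack trace, if any
curl -s http://localhost:8083/connectors/<connector-name>/status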