Kafka Tutorial Schema Evolution Part 1

  • Published 30 Jan 2025

COMMENTS • 41

  • @ScholarNest
    @ScholarNest  4 years ago +1

    Want to learn more Big Data technology? You can get lifetime access to our courses on the Udemy platform. Visit the link below for discounts and a coupon code.
    www.learningjournal.guru/courses/

  • @AmanGarg95
    @AmanGarg95 6 years ago +5

    I've been watching the playlist right from the start. The method of delivery is concise, succinct and clear. Way to go Sir. Thanks a lot.

  • @shafiahmed3382
    @shafiahmed3382 5 years ago

    Dear Sir - you have brilliant expertise in teaching the right content!

  • @111gangadhar
    @111gangadhar 4 years ago

    Excellent tutorials.. Sir. Clear and Concise...

  • @bsrameshonline
    @bsrameshonline 2 years ago

    Nicely explained

  • @vikashverma9
    @vikashverma9 4 years ago

    Excellent tutorials

  • @s_rr_g9577
    @s_rr_g9577 4 years ago

    awesome sir

  • @arunkumarramanujam
    @arunkumarramanujam 5 years ago

    Nice video

  • @s_rr_g9577
    @s_rr_g9577 4 years ago

    Great explanation. Thank you so much

  • @Nilayam-DD
    @Nilayam-DD 5 years ago

    very useful videos

  • @letme4u
    @letme4u 5 years ago

    Thanks a lot for the wonderful share.

  • @unmeshkadam4876
    @unmeshkadam4876 2 years ago

    Sir, how do you configure the schema registry?

  • @MohammadRasoolShaik
    @MohammadRasoolShaik 7 years ago +1

    Could you please explain how to set up the schema registry on Windows? I understand that the Confluent Schema Registry is for registering Avro schemas, but how does it differentiate between versions of one schema (based on the schema file name?) when we use lower and higher versions of the same schema? And apart from Avro schemas, is this registry useful for any other tools or frameworks, or is it specific to Avro? Ideally, a schema registry shouldn't be specific to Avro.

  • @arpit35007
    @arpit35007 7 years ago +2

    Please make a tutorial on Elasticsearch.

  • @nguyen4so9
    @nguyen4so9 7 years ago

    Excellent !

  • @AliKahoot
    @AliKahoot 7 years ago

    Thanks for the great tutorial, very well explained.

  • @pratiksarvan
    @pratiksarvan 7 years ago

    Excellent!!

  • @nishantagnihotri8028
    @nishantagnihotri8028 1 year ago

    Where is the link? I am not able to download it; it's showing "download from Maven Central".

  • @Harikrishna-ie4um
    @Harikrishna-ie4um 8 years ago

    awesome

  • @krishnam1260
    @krishnam1260 7 years ago

    Very well explained. Thanks :)

  • @杨正云
    @杨正云 8 years ago

    Great tutorial! I am looking forward to seeing how to make old and new producers/consumers work together, because right now I can't see how that could happen...

    • @杨正云
      @杨正云 8 years ago

      After watching it a second time, I got it :)

    • @____R__
      @____R__ 4 years ago

      I still didn't get that. Is it in any other video?

  • @chenhaukhoo
    @chenhaukhoo 7 years ago +1

    I'm curious: isn't it simpler if we always serialise our object to a string (using Gson) before sending it to Kafka? On the consumer side, once we receive the string, we can simply deserialise it back into the object.

    • @ScholarNest
      @ScholarNest  7 years ago

      +chen hau khoo Yes, we can do that easily. In fact, JSON is quite popular in simple scenarios, and JSON support is built into Kafka. However, when you have an evolving schema, Avro could be a better option. I have covered schema evolution in a video.
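The String/JSON approach discussed in this thread can be sketched as follows. The `JsonClickRecord` class, its field names, and the hand-rolled `toJson`/`fromJson` methods are illustrative stand-ins; in practice you would use Gson or Jackson together with Kafka's `StringSerializer`/`StringDeserializer`:

```java
// Sketch of producing/consuming a record as a plain JSON string.
// A real producer would set value.serializer to StringSerializer and
// call gson.toJson(rec) / gson.fromJson(s, JsonClickRecord.class);
// the manual methods below keep this sketch dependency-free.
public class JsonClickRecord {
    final String sessionId;
    final int clicks;

    JsonClickRecord(String sessionId, int clicks) {
        this.sessionId = sessionId;
        this.clicks = clicks;
    }

    // What a JSON library's toJson would produce for this object.
    String toJson() {
        return String.format("{\"sessionId\":\"%s\",\"clicks\":%d}", sessionId, clicks);
    }

    // Naive inverse, standing in for fromJson (assumes no commas/colons in values).
    static JsonClickRecord fromJson(String json) {
        String body = json.substring(1, json.length() - 1);
        String[] fields = body.split(",");
        String sessionId = fields[0].split(":")[1].replace("\"", "");
        int clicks = Integer.parseInt(fields[1].split(":")[1]);
        return new JsonClickRecord(sessionId, clicks);
    }

    public static void main(String[] args) {
        JsonClickRecord rec = new JsonClickRecord("s-42", 3);
        String wire = rec.toJson();                 // value sent as a String
        JsonClickRecord back = fromJson(wire);      // value read back on the consumer side
        System.out.println(wire);
        System.out.println(back.sessionId + " " + back.clicks);
    }
}
```

This works fine until the record's shape changes; nothing in a raw JSON string tells an old consumer what a new producer added or removed, which is the gap Avro plus a schema registry fills.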

  • @shristiraj9907
    @shristiraj9907 5 years ago

    What is the advantage of using AvroSerializer/AvroDeserializer over the following approach? I created a Google protobuf object, converted it into a ByteString, sent it as a message, and used org.apache.kafka.common.serialization.ByteArraySerializer and org.apache.kafka.common.serialization.ByteArrayDeserializer.

    • @ScholarNest
      @ScholarNest  5 years ago

      What if your producer changes and now adds one new field to the message record? Can you use the same consumer without changing it?
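For contrast, this is how Avro handles the scenario in the reply above. The record and field names here are illustrative, not from the video. A first version of the schema (Avro schemas are written in JSON):

```json
{
  "type": "record",
  "name": "ClickRecord",
  "fields": [
    {"name": "session_id", "type": "string"},
    {"name": "browser", "type": "string"}
  ]
}
```

The producer later adds a field, giving it a `default`:

```json
{
  "type": "record",
  "name": "ClickRecord",
  "fields": [
    {"name": "session_id", "type": "string"},
    {"name": "browser", "type": "string"},
    {"name": "referrer", "type": "string", "default": "none"}
  ]
}
```

Because the new field has a default value, Avro's schema-resolution rules let a consumer still using the old schema read records written with the new one (the extra field is ignored), and a consumer on the new schema can read old records (the default fills the gap). With raw protobuf bytes behind ByteArraySerializer, that compatibility check is entirely on you.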

  • @Modern_revolution
    @Modern_revolution 6 years ago

    Thank you so much

  • @kunalgupta6152
    @kunalgupta6152 8 years ago +1

    Hi, it's a nice tutorial on schemas in Kafka. Just one clarification: you said earlier that the schema is embedded in the data, and the deserializer extracts the schema and deserializes the data. So what is the need for a schema registry when the data already has the schema embedded in it?

    • @ScholarNest
      @ScholarNest  8 years ago

      Embedding the schema in each record would increase the size of every record and ultimately impact performance. So the schema is stored in the registry, and only an ID is embedded in the message record.

    • @sonunitjsr223
      @sonunitjsr223 6 years ago +1

      Same question here. But the ClickRecord.java class has the schema (variable name: SCHEMA$) along with the data, and while writing the producer/consumer code you are using ClickRecord.java, so you have the schema embedded in the Java file. Why do we need the schema registry?

    • @DagangWei
      @DagangWei 6 years ago

      @@sonunitjsr223 Same question: since ClickRecord.java is generated from the schema, the consumer side already knows how to deserialize the message, so why do we need the schema registry?

    • @jiger83
      @jiger83 5 years ago

      @@DagangWei This approach, as @Learning Journal mentioned above, can be costly in terms of network, storage, and other processing, so it's better to use a schema registry. You only supply the full schema if you don't use a schema registry; otherwise, only the schema ID is sent in the message.
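The "only an ID is sent" answer in this thread refers to the Confluent wire format: the serializer prepends a one-byte magic byte (0) and the 4-byte, big-endian registry schema ID to the Avro-encoded payload, and the deserializer reads the ID back and fetches the schema from the registry. A minimal sketch of just that framing (class and method names are illustrative; the real work is done inside KafkaAvroSerializer):

```java
import java.nio.ByteBuffer;

// Sketch of the Confluent wire format: [magic byte 0][4-byte schema id][Avro payload].
public class WireFormat {
    static final byte MAGIC_BYTE = 0x0;

    // Producer side: frame an already Avro-encoded payload with magic byte + schema id.
    static byte[] frame(int schemaId, byte[] avroPayload) {
        ByteBuffer buf = ByteBuffer.allocate(1 + 4 + avroPayload.length);
        buf.put(MAGIC_BYTE);
        buf.putInt(schemaId);   // ByteBuffer writes the int big-endian by default
        buf.put(avroPayload);
        return buf.array();
    }

    // Consumer side: read the id back, then look the schema up in the registry.
    static int schemaIdOf(byte[] framed) {
        ByteBuffer buf = ByteBuffer.wrap(framed);
        if (buf.get() != MAGIC_BYTE) {
            throw new IllegalArgumentException("Unknown magic byte");
        }
        return buf.getInt();
    }

    public static void main(String[] args) {
        byte[] framed = frame(7, new byte[] {1, 2, 3});
        System.out.println(framed.length);      // 5-byte header + 3-byte payload
        System.out.println(schemaIdOf(framed));
    }
}
```

This is why the registry matters even though the generated ClickRecord.java carries SCHEMA$: the ID in each message identifies the exact writer's schema version, so a consumer compiled against an older or newer ClickRecord can still resolve the record correctly, at a cost of 5 bytes per message instead of a full embedded schema.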

  • @yusufdm5472
    @yusufdm5472 8 years ago

    Nice tutorial. Is there a .NET/C# equivalent of the Java SDK for Kafka, including all the advanced topics you covered like custom partitioning, commits, schema evolution, etc.?

    • @ScholarNest
      @ScholarNest  8 years ago +1

      You can use the Kafka REST Proxy if you want to use it from C#.

  • @Anshmaster2016
    @Anshmaster2016 7 years ago

    Nicely done... Good job.
    Since Avro schemas are defined in JSON, is there any requirement that the data also be in JSON, Avro, or ORC format, or can a simple flat file or CSV also be processed with an Avro/JSON schema?

    • @ScholarNest
      @ScholarNest  7 years ago

      Avro itself is a data file format. If your data is in another format, your producer needs to encode it into an Avro object, as we have done in the example code.

  •  7 years ago

    Thank you for the tutorial, but I have a question about the schema registry: who sets it up, and where?

    • @ScholarNest
      @ScholarNest  7 years ago

      The schema registry is an optional component of Kafka. If you need it, the cluster admin should set it up on a dedicated host machine.

    • @9962366673
      @9962366673 5 years ago

      @@ScholarNest The "ClickRecord" class can serve as the schema for serializing and deserializing, right? Why does it require the schema registry when we pass ClickRecord as the value serializer? Please clarify this part. Thank you.
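On the setup question above ("who sets it up, and where?"): a minimal sketch of a Confluent Schema Registry configuration file (`schema-registry.properties`). The host name and port are placeholders for your own cluster, not values from the video:

```properties
# Where the registry itself listens for HTTP requests from clients
listeners=http://0.0.0.0:8081

# The Kafka cluster that backs the registry's storage
kafkastore.bootstrap.servers=PLAINTEXT://kafka-broker-1:9092

# Compacted topic where registered schemas are kept (default: _schemas)
kafkastore.topic=_schemas
```

The registry is started with `schema-registry-start schema-registry.properties`, and producers/consumers that use KafkaAvroSerializer point at it through the `schema.registry.url` client property.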