Advanced Concepts in Mule | Streams | Repeatable | Non Repeatable

  • Published 14 Dec 2024

COMMENTS • 39

  • @atulverma.515
    @atulverma.515 3 years ago +3

    I am new to the Mule arena, and the streaming concept you have presented is impeccable; it cannot be better than this. Keep it up.

  • @rohitKumar-hu2ez
    @rohitKumar-hu2ez 11 months ago

    Thanks Vishwas, crystal clear concept explanation.

  • @RamKrishna-vm3mp
    @RamKrishna-vm3mp 3 years ago

    The best explanation I have come across. Thanks Vishwas.

  • @mjpraveen34
    @mjpraveen34 3 years ago +1

    Your videos are awesome, and they help to get a deeper understanding of Mule. Appreciate your efforts. Thanks a lot.

  • @prasanthl8125
    @prasanthl8125 1 month ago

    Great video. 👍

  • @bibekbazaz
    @bibekbazaz 3 years ago +1

    Very helpful video. It sheds light on many questions, like why the entire data is loaded in memory even when streaming is enabled, and it also gives useful tips on how to avoid that. Appreciate the effort.

  • @debottamguha1344
    @debottamguha1344 4 years ago +1

    Things are explained quite nicely. Thanks.

  • @jacekbiaecki8076
    @jacekbiaecki8076 2 years ago +1

    Awesome video, as always! To be honest... when you were talking about repeatable in-memory streams, I thought you would add 600 extra rows to the SQL database to demonstrate the exception ;-) (the max in-memory instances is set to 500 in your example). It would be fun to see the exception :) But anyway, a very valuable video! Thank you!
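
A minimal sketch of the kind of configuration this comment refers to, assuming a Database connector select with Mule 4's in-memory iterable strategy (the config name and query are hypothetical). With 600 rows in the table, fetching past the 500-object cap would overflow the buffer and fail the flow, which is the exception the commenter wanted to see:

    <db:select config-ref="Database_Config">
        <!-- Buffers at most 500 objects (records) in heap; exceeding the
             cap raises MULE:STREAM_MAXIMUM_SIZE_EXCEEDED -->
        <repeatable-in-memory-iterable
            initialBufferSize="100"
            bufferSizeIncrement="100"
            maxBufferSize="500"/>
        <db:sql>SELECT * FROM customers</db:sql>
    </db:select>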

  • @rithulkumar1387
    @rithulkumar1387 3 years ago

    Excellent video, explained it well.

  • @letsshopeasy
    @letsshopeasy 4 years ago

    Clear explanation.. thanks!

  • @SimpletravelGirl
    @SimpletravelGirl 3 years ago

    Thanks, that's very beautifully explained.

  • @harshtamishra5473
    @harshtamishra5473 3 years ago +2

    Hi Vishwas, thanks for explaining so well. I have one question: when, after the transformation, we have to append the data to a file, it will again reach its original size of 1 GB and consume heap memory, right?
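
Not necessarily: DataWeave supports deferred (streamed) output, which pipes the transformed records to the downstream file writer instead of building the whole result in heap first. A minimal sketch, assuming a streaming-capable CSV payload and hypothetical field names:

    %dw 2.0
    output application/csv deferred=true
    ---
    payload map (row) -> {
        id:   row.id,
        name: upper(row.name)
    }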

  • @jerrytom4499
    @jerrytom4499 3 years ago

    Thanks for the explanation

  • @mohanbapuji
    @mohanbapuji 3 years ago

    Hi Vishwas, thanks for the valuable sessions.
    I have a doubt: why does the flow error out when an Iterable streaming object is returned? Please also clarify why you've used a Transform at the end of the flow.
    Thank you

  • @TheDatasmith
    @TheDatasmith 2 months ago

    Do you have a GitHub repo?

  • @kotteramanareddy4331
    @kotteramanareddy4331 1 year ago

    Good explanation.

  • @manikondapraveen2213
    @manikondapraveen2213 3 years ago

    I don't understand the use of these repeatable streams. Why are we writing into a file again with file-store streams? Isn't that duplicating the data and ending up reading the same data again? I worked with streams earlier in Java, where we read a chunk of data from the file and process it before reading the next chunk. That way I use nothing other than memory to process the entire file, and it works with an unlimited file size. I don't understand how the same can be achieved using streams in Mule 4. Is this achievable?
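
For reference, the forward-only chunked read described here maps to Mule 4's non-repeatable strategy; the file-store buffer exists only so the payload can be read more than once. A minimal sketch, assuming the File connector and a hypothetical path:

    <file:read path="input/big-file.csv" outputMimeType="application/csv; streaming=true">
        <!-- Forward-only cursor: nothing is duplicated to disk, but the
             payload can be consumed exactly once -->
        <non-repeatable-stream/>
    </file:read>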

  • @ashokvarma2203
    @ashokvarma2203 4 years ago +1

    In a repeatable file-store stream, where exactly will it store the file? Is it in vCore memory or outside of the app?

    • @Vishwasp13
      @Vishwasp13  4 years ago

      In the persistent storage of the CloudHub worker.

    • @ashokvarma2203
      @ashokvarma2203 4 years ago

      @@Vishwasp13 It means it will use vCore memory, right?

    • @Vishwasp13
      @Vishwasp13  4 years ago

      Memory usually refers to volatile memory; it will store the file in non-volatile memory, i.e. persistent disk storage.

    • @ashokvarma2203
      @ashokvarma2203 4 years ago

      @@Vishwasp13 Got it. Thank you.
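
A minimal sketch of the file-store strategy discussed in this thread, assuming the File connector (the path and sizes are hypothetical): only the configured in-memory portion stays in heap, and the remainder spills to the worker's disk.

    <file:read path="input/big-file.csv">
        <!-- Keeps up to 512 KB in heap; anything beyond that is buffered
             in a temporary file on the worker's persistent disk storage -->
        <repeatable-file-store-stream inMemorySize="512" bufferUnit="KB"/>
    </file:read>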

  • @sudheerraja3059
    @sudheerraja3059 4 years ago

    Great explanation. If you would show this practically with a DB or file, it would help a lot.

  • @sreenivasulu3623
    @sreenivasulu3623 2 years ago

    Hi, this is more conceptual. Can you explain the concept with an example, one real-time scenario? That would help more.

  • @mohan1vamsi
    @mohan1vamsi 4 years ago

    At 4:00 you said requests will be processed one after the other in streams. If I get 100 requests that demand the first row in parallel, do the requests process one after the other? If yes, then performance gets impacted, right? Only the first request will execute faster. Please correct my understanding.

    • @Vishwasp13
      @Vishwasp13  4 years ago +1

      Every request gets picked up by a separate thread, depending on the max concurrency of the flow, so each request gets its own stream instance. So if 2 requests are being processed in parallel, they will have 2 different stream instances running in parallel.

    • @mohan1vamsi
      @mohan1vamsi 4 years ago

      @@Vishwasp13 thanks
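
A minimal sketch of the per-request isolation described above, assuming an HTTP-triggered flow (the listener config, path, and query are hypothetical). Each event entering the flow gets its own stream instance, and maxConcurrency bounds how many run in parallel:

    <flow name="stream-demo-flow" maxConcurrency="4">
        <http:listener config-ref="HTTP_Listener_config" path="/data"/>
        <!-- Every request gets its own cursor over its own result set -->
        <db:select config-ref="Database_Config">
            <repeatable-in-memory-iterable maxBufferSize="500"/>
            <db:sql>SELECT * FROM customers</db:sql>
        </db:select>
    </flow>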

  • @manishjoshi8529
    @manishjoshi8529 3 years ago

    Thanks for explaining! A question: how do we process files with 1 GB of CSV records using a For-Each loop without parsing the whole content?

    • @bibekbazaz
      @bibekbazaz 3 years ago

      From the video, what I gather is this: we will use a File connector to read the file with a non-repeatable stream, then use a Choice to check that isEmpty(payload) is false, then, in a For Each loop with a suitable batch size, use a Transform Message to perform the transformations inside the loop. That way, as the stream is processed and data becomes available to For Each, the operations continue. Since we never used an operation that requires the entire payload at once, we are spared from loading the entire file into memory.
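
A minimal sketch of the pipeline this reply describes (the path, batch size, and transformation are hypothetical). One adjustment: a repeatable file-store strategy is used instead of a non-repeatable one, because evaluating isEmpty(payload) consumes a one-shot cursor, after which For Each would have nothing left to read:

    <file:read path="input/records.csv" outputMimeType="application/csv; streaming=true">
        <repeatable-file-store-stream inMemorySize="512" bufferUnit="KB"/>
    </file:read>
    <choice>
        <when expression="#[not isEmpty(payload)]">
            <foreach batchSize="500">
                <!-- Transform Message runs on each 500-record chunk -->
                <ee:transform>
                    <ee:message>
                        <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
payload]]></ee:set-payload>
                    </ee:message>
                </ee:transform>
            </foreach>
        </when>
    </choice>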

  • @michaelj1743
    @michaelj1743 4 years ago

    Wonderful video!! Can you do a video on one-way SSL and two-way SSL? Appreciated!!

    • @Vishwasp13
      @Vishwasp13  4 years ago

      Thanks, I'll try to make one.

  • @nagachittoory6632
    @nagachittoory6632 4 years ago

    Vishwas... you said a max of 500 objects can be stored in memory as per the config, and we have 6 records in the DB. Is each record considered one object, or is this set of 6 records considered one object? Please clarify. Great job!!

    • @Vishwasp13
      @Vishwasp13  3 years ago

      Each record is one single object.
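
Since each record is one object, the 500-object cap is hit by row count, not byte size. When result sets may exceed it, the file-store variant keeps a bounded number of records in heap and spills the rest to disk rather than failing; a minimal sketch (config name and query are hypothetical):

    <db:select config-ref="Database_Config">
        <!-- Keeps up to 500 records in heap; further records are buffered
             on disk instead of raising an error -->
        <repeatable-file-store-iterable inMemoryObjects="500"/>
        <db:sql>SELECT * FROM customers</db:sql>
    </db:select>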