Create event-based projects using S3, Lambda and SQS

  • Published 20 Dec 2024

COMMENTS • 26

  • @deepaksingh9318
@deepaksingh9318 1 year ago +2

    A perfect explanation with an appropriate example and use case.
    So anyone who is new to the concept can easily understand, after watching the video:
    1. What it is
    2. Why it is used and what the need for it is
    3. And how to do it end to end
    A perfect video, I would say, covering everything.

    • @KnowledgeAmplifier1
@KnowledgeAmplifier1  9 months ago

      Thank you so much for your positive feedback @deepaksingh9318! I'm glad to hear that the explanation resonated well with you, and you found it comprehensive and helpful.

  • @SK-gn3rs
@SK-gn3rs 1 year ago +1

    Thanks for the code; I was struggling to read the event to extract the bucket name and the object key... this made my life easy.
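
    For reference, here is a minimal sketch of that extraction -- assuming S3 event notifications are delivered to an SQS queue and the Lambda is triggered by that queue, so each SQS record's body carries the S3 event as JSON (all names here are illustrative):

    import json

    def lambda_handler(event, context):
        for record in event['Records']:  # one record per SQS message
            s3_event = json.loads(record['body'])  # the body holds the S3 notification JSON
            for s3_record in s3_event.get('Records', []):  # s3:TestEvent messages have no 'Records'
                bucket = s3_record['s3']['bucket']['name']
                key = s3_record['s3']['object']['key']
                print(f"bucket={bucket}, key={key}")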

  • @likitan4076
@likitan4076 8 months ago +2

    Also, without adding an SQS trigger to the Lambda, how did it detect the S3 file uploads from the SQS trigger, as seen in the CloudWatch logs?

    • @KnowledgeAmplifier1
@KnowledgeAmplifier1  8 months ago

      @likitan4076, I added the trigger at 9:27.

    • @likitan4076
@likitan4076 8 months ago +1

      @@KnowledgeAmplifier1 You added a Lambda trigger to the SQS queue... got it. I was thinking of adding an SQS trigger to the Lambda function.

  • @manubansal9197
@manubansal9197 6 months ago

    Can you tell me whether everything you performed and used is free to use? I mean, if I build the same integration of S3, SQS and Lambda as yours, AWS would not apply charges, right? And can you provide all the code and steps in a docx format?

  • @RajasthaniINAmerica
@RajasthaniINAmerica 1 year ago +1

    Simple & straightforward

  • @Polly10189
@Polly10189 9 months ago +1

    Is it possible to somehow get the uploaded file's content in the SQS message as well?

    • @KnowledgeAmplifier1
@KnowledgeAmplifier1  9 months ago +1

      SQS has a message size limit (256 KB), and it's recommended to keep messages as small as possible. Including the actual content of a large file in an SQS message could exceed that limit. Moreover, SQS is most efficient when used to transmit metadata or the information necessary to trigger subsequent actions.

    • @Polly10189
@Polly10189 9 months ago +1

      @@KnowledgeAmplifier1 Thanks for your reply. I need to get the actual data of the uploaded file; can we do this using any AWS service?

    • @KnowledgeAmplifier1
@KnowledgeAmplifier1  9 months ago +1

      @@Polly10189 From the code explained in the video, you can get the S3 bucket name and key name; you can then use a Python module like boto3 or s3fs to read the data from S3 and perform whatever computation you need.
      For example, if you want to read CSV data from S3, here is the code --
      import boto3
      import csv

      # inside Lambda you can drop the explicit keys and rely on the execution role
      s3 = boto3.client(
          's3',
          aws_access_key_id='XYZACCESSKEY',
          aws_secret_access_key='XYZSECRETKEY',
          region_name='us-east-1'
      )
      obj = s3.get_object(Bucket='bucket-name', Key='myreadcsvfile.csv')
      # read the whole object, decode it, and split into lines for the csv module
      data = obj['Body'].read().decode('utf-8').splitlines()
      records = csv.reader(data)
      headers = next(records)  # first row is the header
      print('headers: %s' % headers)
      for eachRecord in records:
          print(eachRecord)
      In the same way you can write code for other file formats and read them from S3 ...

    • @Polly10189
@Polly10189 9 months ago +1

      @@KnowledgeAmplifier1 I am reading the path of the file uploaded to S3. It's working, thanks!

    • @KnowledgeAmplifier1
@KnowledgeAmplifier1  9 months ago

      @@Polly10189 Glad to hear this! Happy Learning

  • @mandarkulkarni9525
@mandarkulkarni9525 1 year ago

    What is an efficient and cost-effective way of moving messages from an SQS queue to an S3 bucket?
    I have a Lambda function that processes messages from an SQS queue and deletes them once processing is done. I need to persist the SQS messages in S3 for compliance. Thank you.
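
    A minimal sketch of one approach, assuming the Lambda is triggered by the queue (the archive bucket name below is hypothetical): write each message body to S3 before processing, so the message survives even though Lambda removes it from the queue on success:

    import boto3

    s3 = boto3.client('s3')
    ARCHIVE_BUCKET = 'my-sqs-archive-bucket'  # hypothetical bucket name

    def lambda_handler(event, context):
        for record in event['Records']:
            # archive the raw message body, keyed by the SQS message id
            s3.put_object(
                Bucket=ARCHIVE_BUCKET,
                Key=f"sqs-archive/{record['messageId']}.json",
                Body=record['body'].encode('utf-8'),
            )
            # ... existing processing logic goes here ...

    At low volume the extra PUT requests are cheap; for high throughput, batching messages through Kinesis Data Firehose into S3 is a common alternative.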

  • @ravikreddy7470
@ravikreddy7470 2 years ago +1

    Quick question: don't we have to upload a deployment zip with the json package in it? How does Lambda install that library?

    • @KnowledgeAmplifier1
@KnowledgeAmplifier1  2 years ago +1

      Hello Ravi K Reddy, json is part of the Python standard library and is available by default in the AWS Lambda execution environment, so no deployment zip or Lambda layer is needed to use it. You can find the list of modules available in the Lambda execution environment for different Python versions here -- gist.github.com/gene1wood/4a052f39490fae00e0c3 Happy Learning
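
      For instance, a handler like this needs no deployment zip beyond the single .py file, since json ships with the runtime (a minimal sketch):

      import json  # standard library -- bundled with the Lambda Python runtime

      def lambda_handler(event, context):
          # pretty-print the incoming event to CloudWatch Logs
          print(json.dumps(event, indent=2))
          return {'statusCode': 200}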

  • @kspremkumar4869
@kspremkumar4869 2 years ago +1

    Hi, I have a few doubts about Kafka. Can you please explain?

    • @KnowledgeAmplifier1
@KnowledgeAmplifier1  1 year ago

      Hello KS Prem Kumar, please share your doubt here; if I know the topic, I will surely try to help as much as possible.

  • @diptarghyachatterjee6018
@diptarghyachatterjee6018 2 years ago

    Great explanation... 1. Is there any way to use AWS EventBridge instead of SQS to trigger the Lambda?
    2. Also, can you provide a Python or PySpark script to load the CSV file into a Snowflake DB?

  • @DineshKumar-bk5vv
@DineshKumar-bk5vv 1 year ago

    Hello sir, could you please make a video on integrating applications using Amazon SQS?

  • @DineshKumar-bk5vv
@DineshKumar-bk5vv 1 year ago

    How do I reach out for more information... can I get contact details, please?

  • @harrior1
@harrior1 1 year ago

    Thanks a lot!