
Create event-based projects using S3, Lambda and SQS 

Knowledge Amplifier

The main purpose of this video is to simulate a small portion of an event-based project, a pattern used very frequently by many companies.
Prerequisite:
--------------------
AWS SQS | AWS Simple Queue Service | How SQS Works | AWS Tutorial
Create and Use an Amazon SQS Queue | Practical
Fan out Architecture in AWS with SNS + SQS + Lambda + Python
Lambda Code:
------------------
import json

def lambda_handler(event, context):
    print(event)
    try:
        # Each SQS record delivers one message; its body is the S3 notification as a JSON string
        for i in event['Records']:
            s3_event = json.loads(i['body'])
            # S3 sends a one-time s3:TestEvent when the notification is first configured
            if 'Event' in s3_event and s3_event['Event'] == 's3:TestEvent':
                print("Test Event")
            else:
                for j in s3_event['Records']:
                    print("Bucket Name : {} ".format(j['s3']['bucket']['name']))
                    print("Object Name : {} ".format(j['s3']['object']['key']))
    except Exception as exception:
        print(exception)
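
A quick way to sanity-check the handler locally is to feed it a hand-built event. This is a minimal sketch: the bucket name and object key are made-up placeholders, and real S3 notifications carry many more fields than the ones the handler reads.

import json

# Placeholder S3 notification -- only the fields the handler reads are included
sample_s3_notification = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-demo-bucket"},   # placeholder bucket name
                "object": {"key": "incoming/data.csv"}  # placeholder object key
            }
        }
    ]
}

# SQS wraps each message, so the S3 notification arrives as a JSON string in 'body'
sample_sqs_event = {"Records": [{"body": json.dumps(sample_s3_notification)}]}

lambda_handler(sample_sqs_event, None)  # prints the bucket and object names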
SQS-s3 Access Policy:
----------------------------------
{
  "Version": "2012-10-17",
  "Id": "Policy1662050523224",
  "Statement": [
    {
      "Sid": "Stmt1662050521697",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "sqs:*",
      "Resource": "{SQS Queue ARN}",
      "Condition": {
        "ArnEquals": {
          "aws:SourceArn": "{s3 bucket ARN}"
        }
      }
    }
  ]
}
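
If you prefer to wire the S3-to-SQS notification from code instead of the console, here is a minimal boto3 sketch; the bucket name and queue ARN are placeholders you would replace with your own, and the queue's access policy above must already allow the bucket as the source.

import boto3

s3 = boto3.client('s3')

# Send all object-created events from the bucket to the SQS queue.
# Bucket name and queue ARN below are placeholders for illustration.
s3.put_bucket_notification_configuration(
    Bucket='my-demo-bucket',
    NotificationConfiguration={
        'QueueConfigurations': [
            {
                'QueueArn': 'arn:aws:sqs:us-east-1:123456789012:my-demo-queue',
                'Events': ['s3:ObjectCreated:*']
            }
        ]
    }
)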
Check this playlist for more Data Engineering related videos:
Snowflake Complete Course from scratch with End-to-End Project with in-depth explanation--
doc.clickup.com/37466271/d/h/...
🙏🙏🙏🙏🙏🙏🙏🙏
YOU JUST NEED TO DO 3 THINGS to support my channel:
LIKE, SHARE & SUBSCRIBE TO MY YOUTUBE CHANNEL

Category: Science

Published: 31 Aug 2022

Comments: 26
@deepaksingh9318 · 11 months ago
A perfect explanation with an appropriate example and use case, so anyone who is new to the concept can easily understand, after watching the video: 1. what it is, 2. why it is used and what the need for it is, 3. and how to do it end to end. A perfect video, I would say, covering everything.
@KnowledgeAmplifier1 · 4 months ago
Thank you so much for your positive feedback @deepaksingh9318! I'm glad to hear that the explanation resonated well with you and that you found it comprehensive and helpful.
@SK-gn3rs · 1 year ago
Thanks for the code; I was struggling to read the event to extract the bucket name and the object... this made my life easy.
@KnowledgeAmplifier1 · 1 year ago
Glad to hear the video is helpful to you, S K! Happy Learning
@RajasthaniINAmerica · 7 months ago
Simple & straightforward.
@mandarkulkarni9525 · 1 year ago
What is an efficient and cost-effective way of moving messages from an SQS queue to an S3 bucket? I have a Lambda function that processes messages from the SQS queue and deletes them once processing is done. I need to persist the SQS messages in S3 for compliance. Thank you.
@diptarghyachatterjee6018 · 1 year ago
Great explanation... 1. Is there any way that, instead of SQS, we can have AWS EventBridge trigger the Lambda? 2. Also, can you provide any Python or PySpark script to load the CSV file into Snowflake DB?
@likitan4076 · 3 months ago
Also, without adding an SQS trigger to the Lambda, how did it detect the S3 file uploads from the SQS trigger, as seen in the CloudWatch logs?
@KnowledgeAmplifier1 · 3 months ago
@likitan4076, I added the trigger at 9:27.
@likitan4076 · 3 months ago
@KnowledgeAmplifier1 You added a Lambda trigger to the SQS queue... got it. I was thinking of adding an SQS trigger to the Lambda function.
@harrior1 · 1 year ago
Thanks a lot!
@KnowledgeAmplifier1 · 1 year ago
You are welcome, Sergei Sizov! Happy Learning
@manubansal9197 · 1 month ago
Can you tell me whether everything you performed and used here is free to use? I mean, if I build the same integration of S3, SQS and Lambda, AWS would not charge me, right? And can you provide all the code and steps in docx format?
@ravikreddy7470 · 1 year ago
Quick question: don't we have to upload a deployment zip with the json package in it? How does Lambda install that library?
@KnowledgeAmplifier1 · 1 year ago
Hello Ravi K Reddy, json is available by default in the AWS Lambda execution environment, so there is no need for a deployment zip or a Lambda layer to use json. You can find the list of modules available in the Lambda execution environment for different Python versions here -- gist.github.com/gene1wood/4a052f39490fae00e0c3 Happy Learning
@DineshKumar-bk5vv · 1 year ago
Hello Sir, could you please make a video on integrating an application using Amazon SQS?
@kspremkumar4869 · 1 year ago
Hi, I have a few doubts about Kafka. Can you please explain?
@KnowledgeAmplifier1 · 1 year ago
Hello KS Prem Kumar, please share your doubt here; if I know that topic, I will surely try to help as much as possible.
@Polly10189 · 4 months ago
Is it possible to somehow get the uploaded file's content in the SQS message as well?
@KnowledgeAmplifier1 · 4 months ago
SQS has a message size limit (256 KB), and it's recommended to keep messages as small as possible. Including the actual content of a large file in an SQS message could exceed that limit. Moreover, SQS is more efficient when used to transmit metadata, or the information necessary to trigger subsequent actions.
@Polly10189 · 4 months ago
@KnowledgeAmplifier1 Thanks for your reply. I need to get the actual data of the uploaded file; can we do this using any AWS service?
@KnowledgeAmplifier1 · 4 months ago
@Polly10189 From the code explained in the video, you can get the S3 bucket name & key name; then you can use any Python module like boto3 or s3fs to read the data from S3 and perform various computations. For example, if you want to read CSV data from S3, here is the code --

import csv
import boto3

s3 = boto3.client(
    's3',
    aws_access_key_id='XYZACCESSKEY',
    aws_secret_access_key='XYZSECRETKEY',
    region_name='us-east-1'
)
obj = s3.get_object(Bucket='bucket-name', Key='myreadcsvfile.csv')
data = obj['Body'].read().decode('utf-8').splitlines()
records = csv.reader(data)
headers = next(records)
print('headers: %s' % (headers))
for eachRecord in records:
    print(eachRecord)

Like this, for different file formats, you can create the code and read from S3...
@Polly10189 · 4 months ago
@KnowledgeAmplifier1 I am reading the path of the file uploaded to S3. It's working, thanks!
@KnowledgeAmplifier1 · 4 months ago
@Polly10189 Glad to hear this! Happy Learning
@DineshKumar-bk5vv · 1 year ago
How can I reach out for more information... can I get contact details, please?