Thank you, but at the beginning of the video you talk about private subnets, yet we don't link any EC2 instances to them, so I'm a little bit lost. Why do we create the EC2 instance in a public subnet?
If you run into the error "Error code: InvalidParameterValueException. Error message: The provided execution role does not have permissions to call DeleteMessage on SQS", follow these instructions: 1. Go to IAM in the AWS Console. 2. Click on Roles. 3. Select your Lambda function's execution role, or create one if you don't already have one. 4. Add the AWS managed AWSLambdaSQSQueueExecutionRole policy to the role; it contains all the permissions needed to call the required SQS actions from Lambda. The ARN of the policy is arn:aws:iam::aws:policy/service-role/AWSLambdaSQSQueueExecutionRole. 5. Save the role, then try adding the trigger again. This time it will work fine.
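For reference, the managed policy attached in the steps above grants roughly the following (a sketch of the policy document; the exact managed policy may differ slightly):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "sqs:ReceiveMessage",
        "sqs:DeleteMessage",
        "sqs:GetQueueAttributes",
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```

The sqs:DeleteMessage action is the one named in the error, which is why attaching this policy fixes it.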
Great video! Can someone help me understand this statement in the Lambda function? domain = os.environ['NAMEOFDOMAIN'] host = "" + es.describe_elasticsearch_domain(DomainName=domain)['DomainStatus']['Endpoint'] How come 'DomainStatus' and 'Endpoint' are in quotes? Are they being queried by the os import library?
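Not the video author, but to the question above: those quoted names are plain Python dictionary keys, not anything from the os module (os.environ is only used for the NAMEOFDOMAIN variable). The boto3 call returns a nested dict, and ['DomainStatus']['Endpoint'] just indexes two levels into it. A minimal sketch, with a stand-in dict whose shape is assumed from the code shown:

```python
# Simplified stand-in for the dict that
# es.describe_elasticsearch_domain(...) returns (assumed shape):
response = {
    "DomainStatus": {
        "DomainName": "my-domain",
        "Endpoint": "search-my-domain-abc123.us-east-1.es.amazonaws.com",
    }
}

# ['DomainStatus']['Endpoint'] walks two levels into the nested dict:
host = "" + response["DomainStatus"]["Endpoint"]
print(host)  # search-my-domain-abc123.us-east-1.es.amazonaws.com
```

The quotes are just Python string-literal syntax for the key names.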
Hi there, I have actually run into a problem. When I try to add the Lambda trigger to SQS it throws the error message "Error code: InvalidParameterValueException. Error message: The provided execution role does not have permissions to call DeleteMessage on SQS", and I have used the same policy for SQS as shown in this video. Any idea how to solve this?
I have followed the same sequence as mentioned in your video but am still facing the error below: InvalidParameterValueException: Cannot access stream arn:aws:dynamodb:us-east-1:<stream-event>. Please ensure the role can perform the GetRecords, GetShardIterator, DescribeStream, and ListStreams Actions on your stream in IAM.
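For anyone hitting the stream error above: the execution role needs the four actions the message names. A sketch of an inline policy statement that grants them (account ID, region, and table name are placeholders; scope Resource to your own stream ARN):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetRecords",
        "dynamodb:GetShardIterator",
        "dynamodb:DescribeStream",
        "dynamodb:ListStreams"
      ],
      "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/my-table/stream/*"
    }
  ]
}
```

Alternatively, the AWS managed AWSLambdaDynamoDBExecutionRole policy bundles these same stream actions.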
Great work!! I was looking for a similar solution. Due to a specific requirement, I have Elasticsearch 7.10 running on the AWS OpenSearch service. I have tried the steps above but get a signature mismatch error when I try to access <api url>/<stage_name>/_plugin/kibana. Do we need to make some changes for this to be compatible with Elasticsearch?
Hi, first of all thank you for providing such helpful content. Everything is working correctly, but I want to know how to add HTTPS to the load balancer. Thanks @listentolearn2363
Hi, the ALB is not provisioning and the address is not visible. I will share the log: 5a2-d72b-40b3-b2c0-1889ff5131d7"} {"level":"error","ts":"2024-05-28T09:53:17Z","msg":"Reconciler error","controller":"ingress","object":{"name":"testapp-nginx-ingress","namespace":"default"},"namespace":"default","name":"testapp-nginx-ingress","reconcileID":"ef38c6d9-0165-4026-9539-cc558860cdbf","error":"WebIdentityErr: failed to retrieve credentials caused by: AccessDenied: Not authorized to perform sts:AssumeRoleWithWebIdentity \tstatus code: 403, request id: 1025e76d-ae85-43b6-9511-8a65fcb6b914"} [ec2-user@ip-10-0-8-252 ~]$ Kindly reply.
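Not the author, but that sts:AssumeRoleWithWebIdentity 403 usually means the IAM role's trust policy doesn't match the cluster's OIDC provider or the controller's service account. A sketch of what the trust policy typically looks like (the OIDC provider ID, account ID, region, and service account name below are all placeholders; they must match your cluster exactly):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "arn:aws:iam::111122223333:oidc-provider/oidc.eks.us-east-1.amazonaws.com/id/EXAMPLED539D4633E53DE1B71EXAMPLE"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "oidc.eks.us-east-1.amazonaws.com/id/EXAMPLED539D4633E53DE1B71EXAMPLE:sub": "system:serviceaccount:kube-system:aws-load-balancer-controller"
        }
      }
    }
  ]
}
```

A typo in the namespace or service account name in the :sub condition is a common cause of exactly this error.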
Do we need to create the index before uploading the file? I want to just insert the data dynamically; is this possible? For example, I have a sample JSON, and using that I want to create an index without creating it manually.
Thanks for the video. Would you be able to make a video on how to do this using a CloudFormation template, and on how to hit the OpenSearch domain endpoint to ingest and query data?
Hi, thanks for the video. Question: is it possible to trigger the same AWS Lambda function from a RabbitMQ instance that is running on EC2 (as opposed to a managed RabbitMQ instance)?
Quick update: the load balancer policy is not working. I had to update it to create the load balancer; otherwise the host address came back blank.
When I apply the deployment I get this error: error: error validating "deployment.yaml": error validating data: failed to download openapi: the server has asked for the client to provide credentials; if you choose to ignore these errors, turn validation off with --validate=false Any suggestions on how to fix it?
Hi, damn, you explained it so well! I have a question: if we follow the exact pipeline, and in the Lambda function write code to connect to an EC2 instance and use the newly added file in the S3 bucket as input for a Python calculation on that EC2 instance, it should work fine, right? For example: upload the file to S3 -> SNS topic -> SQS queue -> Lambda -> EC2 (connect and run) -> runs the Python file (command issued from the Lambda function). Correct?
Thank you for the tutorial. I am getting this error though: "errorMessage": "'NoneType' object has no attribute 'upper'", "errorType": "AttributeError", "stackTrace": [ " File \"/var/task/lambda_function.py\", line 95, in lambda_handler 'method': method.upper(), "
Hello, it looks like the event object is missing or is being passed as None. Please check your API Gateway setup and try triggering a test event from API Gateway.
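Building on the diagnosis above: the traceback says method was None when .upper() was called. A defensive sketch, assuming the handler reads the HTTP verb from an API Gateway proxy-integration event under the 'httpMethod' key (an assumption; adapt the key to your actual handler):

```python
def lambda_handler(event, context):
    # With API Gateway proxy integration the HTTP verb usually arrives
    # under 'httpMethod' (assumed here). A bare test invocation with an
    # empty or missing event would otherwise leave method as None and
    # reproduce the AttributeError from the comment above.
    method = (event or {}).get("httpMethod") or ""
    return {"method": method.upper()}

# A None/empty event no longer raises AttributeError:
print(lambda_handler({}, None))                      # {'method': ''}
print(lambda_handler({"httpMethod": "get"}, None))   # {'method': 'GET'}
```

Defaulting to "" before calling .upper() keeps test invocations from crashing while real requests still get the uppercased verb.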
Good general overview of CloudFormation, and the step-through of generalizing the templates is well done. The demo of creating a CloudFormation template is also very clear.
Easy to follow, but I find the token part at the end quite confusing. It would be good if you could make another video explaining how the tokens work.