AWS: Reading File content from S3 on Lambda Trigger

What is going on, everyone, and welcome to part 6 of the AWS Lambda tutorial with Python. In this tutorial I am going to show you how to get the file name and the content of a file from an S3 bucket whenever the Lambda gets triggered by a file drop into that bucket. If you remember, we created an s3-trigger Lambda function in one of the earlier tutorials on setting up an S3 bucket trigger on Lambda, so if you have not watched that video, please go ahead and watch it, or else you can create a Lambda function on the fly and get started from here.

So let's go to the s3-trigger Lambda function. We already set up an S3 bucket trigger on this Lambda function in the previous tutorial, on the aws-lambda-trigger bucket, so let's get started and navigate directly to the code window.

Here we will be using one library, and that is boto3. Fortunately, AWS Lambda provides that library as a built-in, so all we need to do is import it. Whenever a file gets dropped into the S3 bucket, all the information about it comes to this Lambda function in an event. Before going to the event, let's define the S3 client object, s3 = boto3.client("s3"). We will be using this in a later part of the code, so let's define it now.
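For reference, the starting point of the function looks roughly like this (a minimal sketch; the handler body gets filled in over the rest of this tutorial):

    import boto3  # ships with the Lambda Python runtime, no packaging needed

    s3 = boto3.client("s3")  # S3 client, defined once and reused below

    def lambda_handler(event, context):
        # event carries the S3 notification payload; handled in the next steps
        ...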
Only when we receive the event will we execute the rest of the code, so it all goes inside an if condition. Let's define a file object, file_obj = event["Records"][0]; why it is defined as event["Records"][0] I will show you in a moment. The event is basically a dictionary, and Records is one of the keys in that dictionary, so we are accessing the Records key and taking index zero of its value. To get the file name, let's define filename = str(file_obj["s3"]["object"]["key"]): s3 is again a key in the dictionary, then object, then key, and our file name resides at exactly that path. Let's also print out the file name. Don't worry, I will show you all the logs and try to explain how these things are defined.
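Transcribed from this walkthrough, those lines look roughly like the following (the key path matches the standard S3 event notification structure):

    # The notification is a dict; the uploaded object's key sits under
    # Records[0] -> s3 -> object -> key.
    file_obj = event["Records"][0]
    filename = str(file_obj["s3"]["object"]["key"])
    print(filename)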
Once we have the file name, we need to get the content, I mean get the object of that file, and here boto3 comes into play. We will define file = s3.get_object(). We need to pass two things to this method: which S3 bucket to read from and what file to retrieve from it. So within the parentheses we pass Bucket equal to our bucket name, that is aws-lambda-trigger, and the second parameter is Key, which is the filename. Here we get an object for whichever file gets dropped into that S3 bucket. Now, to get the content of the file: we already have the object, so we retrieve the content with file_content = file["Body"].read().decode("utf-8"), and let's print the file_content. Let's go ahead and save this.
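Putting the pieces together, the complete handler from this walkthrough looks roughly like this (a sketch reconstructed from the video, not a verbatim copy; the bucket name aws-lambda-trigger matches the trigger configured in the previous part):

    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        print(event)  # print the full notification so we can inspect its structure
        if event:
            # the first record of the notification holds the bucket/object details
            file_obj = event["Records"][0]
            filename = str(file_obj["s3"]["object"]["key"])
            print(filename)

            # fetch the object itself from the bucket that fired the trigger
            file = s3.get_object(Bucket="aws-lambda-trigger", Key=filename)

            # "Body" is a streaming response; read the bytes and decode them
            file_content = file["Body"].read().decode("utf-8")
            print(file_content)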
Now let's navigate to the S3 bucket, that is aws-lambda-trigger. I have an iris.csv file handy, so let's go ahead and upload it. Once you have uploaded the file, navigate back to the s3-trigger Lambda function in the console. As you can see, the Lambda has been triggered, so let's go ahead and click on the logs.

As you can see here, we have printed out the event, and this is how the data arrives in the event. On line number seven we defined event["Records"][0]: we are grabbing the Records key, whose value is a list, and we take index zero to get this first record. Then, as you can see, the file name is iris.csv, and iris.csv resides under the keys s3, then object, then key, which is exactly why we defined file_obj["s3"]["object"]["key"]. That is how we get the name of the file that was dropped into the S3 bucket. There are certain other things to see here as well, such as the size and the bucket name and various other fields, which you can explore on your own.
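For reference, a trimmed S3 put-event record has roughly this shape (illustrative only; the real payload carries more metadata such as timestamps, requester details, and ARNs):

    # Trimmed illustration of the event printed in the logs:
    event = {
        "Records": [
            {
                "eventSource": "aws:s3",
                "eventName": "ObjectCreated:Put",
                "s3": {
                    "bucket": {"name": "aws-lambda-trigger"},
                    "object": {"key": "iris.csv", "size": 4551},  # size in bytes, illustrative
                },
            }
        ]
    }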
Now, we got the file name on line number nine, and on line number ten we defined file_obj to get the object of that file from the desired bucket. To read the content of the file we are using this file object, but we have not printed the object itself yet, so let's go ahead and print it and run it again, so that you can see why we refer to the Body key. Once again the file is successfully uploaded; let's check for the new logs, and after a refresh, here it is. This is the file object that we just printed, and as you can see, here we have a Body key defined. It is the botocore streaming body response, and we can read this response by simply applying the read method. So on line number 12 we are grabbing Body from the file object, appending the .read() method, and then decoding it as UTF-8, which creates the file content.
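One detail worth noting: since Body is a botocore StreamingBody, it behaves like a file handle and can be consumed only once; a second .read() returns an empty byte string. The read-and-decode step from the walkthrough:

    # "Body" is a botocore StreamingBody: file-like and readable only once,
    # so keep the decoded result if you need the content more than one time.
    file_content = file["Body"].read().decode("utf-8")
    print(file_content)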
And here is the file content: sepal length, sepal width, and around 140 rows. So yeah, this is how you can grab the file name from the S3 bucket and read the content of the file on trigger, I mean on the drop of any file into the S3 bucket. That was all. Do let me know in case of any queries, please comment below. Thanks for watching, and stay tuned for more.


49 thoughts on “AWS: Reading File content from S3 on Lambda Trigger”

  1. Durga S says:

How to push this file name and the content of the file into another file (another bucket)?

  2. Remus K says:

Hello, can we do this event trigger for an EC2 instance? Whenever there is a new EC2 instance spinning up, can a Lambda function do the event triggering to get certain information from the instance, for example instance ID, instance name, key pairs, tags? Thanks

  3. RbN says:

    Great tutorial! many thanks

  4. diptiranjan pradhan says:

Hi, I subscribed to your channel. Can you give an example of regex replace in S3 using Lambda?

I have files containing lots of special characters that I want to replace using Lambda; I am getting 10k files daily.

  5. Satya Narayana says:

Hello, great tutorial, thanks. Can you show how to send the trigger output to the Elasticsearch service in AWS?

Basically, daily Apache logs will be uploaded to an S3 bucket; as soon as the logs upload, the Lambda function should trigger and send the output to the Elasticsearch service.

  6. shristika yadav says:

Hey, great video. Can you tell me how we can read this file content column-wise?

  7. insanystalker26 says:

Hi, thanks for the tutorial. I'm really new to the Python field, but it didn't take me too much time to understand the code you used for this Lambda function. But when I upload a new file to the bucket, the function only works when the file name is a plain string; it does not work when the file name contains spaces. Can anybody help me with this, please?

  8. Suniti Jaiswal says:

Hello, when we upload a file, what do we do if we want to use that file's URI in the code? How do we get the uploaded file's URI?

  9. kavya sudeep says:

Hi, great video!!!!!!!! Can you create a video showing how an object added to bucket1 can be copied to bucket2 using a Lambda function? The event will be adding content to bucket1.

  10. Idrees Dar says:

How would we send either an SNS or an SES notification for all the objects created on the current date? It's supposed to be a kind of report of all the objects uploaded in a day.

  11. ADM says:

How can I do it in Java? Please help, sir.

  12. Vijay Sadhu says:

    Great Stuff. Keep it going!

  13. Manas Kumar says:

Hi,

Please suggest how to do this with Node.js. Any reference will be appreciated.

  14. aswathi nambiar says:

Awesome tutorial! Any idea how to index a file, say a PDF file, and then load it into Elasticsearch?

  15. Shekhar S says:

Hey Srce Cde, thanks for the videos.
Do you have any Python learning video series as well? It would really help. Thanks again.

  16. Jabran Khan says:

    What a helpful tutorial, many thanks!

  17. Divya Kumar says:

I am doing some file reading operations in Node within Lambda. Just wondering if it is possible to install other npm packages like 'event-streams' etc.

  18. Manjunath S B says:

Hi, thanks for your video. After uploading the CSV file I am getting the below error; can you please help with this? An error occurred (AccessDenied) when calling the GetObject operation: Access Denied: ClientError
    Traceback (most recent call last):
    File "/var/task/lambda_function.py", line 11, in lambda_handler
    fileObj=s3.get_object(Bucket="aws-lamdatrigger",Key=filename)
    File "/var/runtime/botocore/client.py", line 314, in _api_call
    return self._make_api_call(operation_name, kwargs)
    File "/var/runtime/botocore/client.py", line 612, in _make_api_call
    raise error_class(parsed_response, operation_name)
    botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied

  19. Ashish Niranjan says:

Very nice explanation! Could you please tell how to check for missing AWS resource tags through a Lambda function?

  20. Nikhil Gupta says:

Can you share the link of the previous video where you made the Lambda function that is triggered on file upload to the S3 bucket?

  21. Mxvii_The Great says:

Why did you print the event at the beginning?

  22. surabi manideep says:

Hi,
This code is working fine if we upload one file at a time.
What if I dump a batch of files at once and expect it to write all the file names?
It's not giving all the file names as expected. Please try to figure it out.

  23. Manjunath S B says:

Thank you so much for this configuration, it saved my time.

  24. KHAN JUAID ALI says:

Can you please share a Python program for the word count of a file which I upload to the S3 bucket?

  25. Himanshu Negi says:

Bro, this is not working!! I get an error at file_obj = event['Records'][0]. Please help me if you can.

  26. Raghwendra Singh says:

Hi,

How can I push data from S3 to any prod or dev server from an AWS Lambda function? I just want to create a process where, on any change in the S3 bucket, the Lambda function pushes the content to the respective prod or dev server. Thanks

  27. Srce Cde says:

    Hello guys,
    I'm planning to do a Python tutorial series in parallel to AWS tutorials. Is this something you will be interested in? Please let me know in comments.

  28. Adeel Ahmed Syed says:

Hi, can you write code for gzipping the files in an S3 bucket?

  29. Eduardo terradez says:

    Thx for the tutorial !!

  30. Priyasha Agarwalla says:

Hey! Instead of doing the inline coding, if I want to upload lambda_function.py as a deployment package, what will my code look like? I tried the same and it's not working.

  31. RUDRA BISWAS says:

I want to add a Spark job in Python (PySpark) in a similar way. My requirement is to have the Spark job added to the steps in EMR. It should look at the data in a file in S3, run some query, and write the result to another bucket in S3.
Can you help me with that?

  32. Santosh Reddy says:

    START RequestId: f017289d5d9bcb3d Version: $LATEST
    [ERROR] KeyError: 'Records'
    Traceback (most recent call last):
      File "/var/task/lambda_function.py", line 12, in handler
        file_obj = event['Records'][0]
    END RequestId: f017289d5d9bcb3d

How can I resolve this error?

  33. sefirosto says:

    Thanks!

  34. sitanshu bhunia says:

Great video. Could you create a video on a Lambda triggering a mail when any data changes in a particular table's column in an RDS MySQL database?
Thanks in advance.

  35. Krishna Teja says:

Hi, I'm using the same code triggered on an Excel file upload to S3, and the read function throws the error "'utf-8' codec can't decode byte 0xb6 in position 14: invalid start byte: UnicodeDecodeError". How do I go about fixing this error?

  36. Manoharsinh Rana says:

    I am getting an error.

    'Records': KeyError
    Traceback (most recent call last):
    File "/var/task/sample.py", line 11, in handler
    file_obj = event["Records"][0]
    KeyError: 'Records'

  37. Batta Vandana says:

Is the code applicable for a file in PDF format also?

  38. Batta Vandana says:

Can you please tell me the code to read a PDF file from an S3 bucket?

  39. sudip das says:

How can I manipulate Excel files in S3 using Lambda? I mean operations like reading the Excel file, doing some manipulations, and then converting it to CSV.

  40. ashleyadrias says:

Very helpful, thank you! Also for including the source code.

  41. Vivek Parmar says:

Hi,
I need to copy a file to another S3 bucket when it gets uploaded to an S3 bucket.
Kindly suggest how to read the latest uploaded file and move it to another S3 bucket.

Thanks

  42. Vinaya Kharade says:

Hi, can you tell me about that .csv file, where did you get it, or did you create it beforehand?

  43. Shatrughna Jha says:

Hi, can you share code to read a file from S3 using Lambda and Python without using a trigger?
Thanks in advance. 🙂

  44. Mohanakrishnan Ramalingam says:

    Simply Perfect. 🙂

  45. junaid akhtar says:

Hey,
Can you please help me out with getting the path of an object in an S3 bucket and using the object in my code? Please reply ASAP.

  46. SAHIL` BHATIA says:

Hi, can you please tell me how to read an image file from an S3 bucket using a Lambda trigger through API Gateway?

  47. Neeraj Kashyap says:

Please make tutorials on AWS Rekognition using Lambda. You are doing a great job.

  48. jai prakash dadoliya says:

I have created the trigger function and am able to get the image name from S3. Now I want to convert that image into text using detect_text. Can I write the detect_text code in the same trigger function, or do I need to write detect_text in another Lambda function?

Please suggest the right way to implement this.

Thanks 🙂

  49. m krishna says:

How can I read a file from an S3 bucket using access keys and an endpoint URL? Could you please suggest the code?
