
amazon s3 - AWS Lambda - Is there a way to pass parameters to a lambda function when an event occurs

Question:

I have a DynamoDB table, and whenever a new record is added I want to archive the old data to S3, so I thought I could use AWS Lambda. The Lambda function will receive the record that was newly added or modified, but I also want to pass it an additional parameter: the S3 path to which the record should be uploaded.

One way is to keep whatever I want to pass to the Lambda function in another table or in S3. But this parameter changes as each record is inserted into the main table, so I can't reliably read it from my Lambda function. (By the time the Lambda function executes for the first inserted record, a few more records would already have been inserted.)

Is there a way to pass parameters to the Lambda function?

P.S.: I want to invoke the Lambda asynchronously.

Thanks...

Answer:

Why not add this parameter (the S3 path) as an attribute on your DynamoDB table itself? That is, put it on the same table the Lambda is listening to, where the new row is added, rather than in a separate table.
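A minimal sketch of that suggestion: if the destination path is stored on the item itself (the attribute name `s3_path` below is an assumption for illustration), the Lambda can read it straight from the stream record's `NewImage`.

```python
def s3_path_from_record(record):
    """Return the destination path carried by the inserted item itself.

    Assumes the item was written with a string attribute named "s3_path"
    (a hypothetical name); DynamoDB stream records wrap each attribute
    in a type descriptor, e.g. {"S": "..."} for strings.
    """
    new_image = record["dynamodb"]["NewImage"]
    return new_image["s3_path"]["S"]
```

Because the path travels with the item, there is no race against later inserts: each stream record carries the value that was current when that particular row was written.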

Answer:

You can now accomplish this by:

  1. Attaching a DynamoDB Stream to your DynamoDB table with a view type of NEW_AND_OLD_IMAGES
  2. Creating an event source mapping on your Lambda function to read the DynamoDB stream
  3. Adding an environment variable to your Lambda function to indicate where in S3 to write the data

You'll still have to derive the details of where to store the record from the record itself, but you can indicate the bucket or table name in the environment variable.
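The three steps above can be sketched as a Python Lambda handler. The environment variable name `ARCHIVE_BUCKET` and the `archive/` key prefix are assumptions for illustration; the archiving logic is kept separate from the boto3 client so it can be exercised without AWS credentials.

```python
import json
import os


def object_key_for(record):
    """Derive an S3 object key from the record itself, as the answer
    suggests: here, from the item's primary-key values (the "archive/"
    prefix is a hypothetical convention)."""
    keys = record["dynamodb"]["Keys"]
    return "archive/" + "-".join(
        str(next(iter(attr.values()))) for attr in keys.values()
    )


def archive_old_images(event, bucket, s3_client):
    """Write each stream record's OldImage to S3; return the keys written."""
    written = []
    for record in event["Records"]:
        # Only MODIFY/REMOVE events carry an old image to archive;
        # plain INSERTs are skipped.
        old_image = record["dynamodb"].get("OldImage")
        if old_image is None:
            continue
        key = object_key_for(record)
        s3_client.put_object(Bucket=bucket, Key=key, Body=json.dumps(old_image))
        written.append(key)
    return written


def handler(event, context):
    # boto3 is provided by the Lambda runtime; imported here so the
    # logic above stays importable and testable outside AWS.
    import boto3

    bucket = os.environ["ARCHIVE_BUCKET"]  # set in the function's configuration
    return archive_old_images(event, bucket, boto3.client("s3"))
```

With the stream's view type set to NEW_AND_OLD_IMAGES, each record delivers both images, so the function never has to query the table and is unaffected by rows inserted after the triggering write.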
