Allow buckets to trigger functions #269
Replies: 8 comments
-
@angelhodar we haven't developed this feature yet, but it's on our radar. I know this is a really common workflow; it'd be good to know more about your specific use case to help us flesh out this feature. It'd also be great to get your thoughts on what it might look like. Initially we've been thinking of an interface like this:

```ts
import { bucket } from '@nitric/sdk';

// create a bucket called files
const files = bucket("files");

// using pattern of "<event_type>:<file_pattern>",
// subscribe to write events for all files on the bucket
files.on('write:*', async (ctx) => {
  // do something with the event
});
```

Let us know what you think.
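As a side note, the `'write:*'` string above combines an event type with a glob-style file pattern. A rough sketch of how such a subscription string could be parsed and matched against incoming events (the helper names here are illustrative, not part of the nitric SDK):

```ts
type BucketEventType = "write" | "delete";

interface Subscription {
  eventType: BucketEventType;
  filePattern: string; // glob, e.g. "*" or "models/*.zip"
}

// Split e.g. "write:models/*.zip" into its event type and file pattern.
export function parseSubscription(spec: string): Subscription {
  const sep = spec.indexOf(":");
  if (sep < 0) throw new Error(`invalid subscription: ${spec}`);
  return {
    eventType: spec.slice(0, sep) as BucketEventType,
    filePattern: spec.slice(sep + 1),
  };
}

// Decide whether an event (type + object key) matches a subscription.
// The glob is reduced to a regex where "*" matches any run of characters.
export function matches(spec: string, eventType: string, key: string): boolean {
  const sub = parseSubscription(spec);
  if (sub.eventType !== eventType) return false;
  const escaped = sub.filePattern.replace(/[.+?^${}()|[\]\\]/g, "\\$&");
  const re = new RegExp(`^${escaped.replace(/\*/g, ".*")}$`);
  return re.test(key);
}
```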
-
That syntax was exactly what I was expecting when I was looking at the documentation! What I want to do is deploy a Docker container that optimizes 3D models in a Lambda function. That function would be triggered when a 3D model packed as a zip is uploaded to an S3 bucket, and the Lambda would then upload the optimized model to another bucket. Pretty simple, but I'm new to this type of automation, so I was wondering what level of permission granularity nitric can handle. For example, S3 permissions are pretty complex. Thanks for your work creating this amazing library, btw!
-
Awesome, we'll look to implement that as soon as we can. Really cool use case! I'd be interested to know what software/packages you were planning to use and whether or not you would need a custom container to handle this. We've done a few things with simple image optimisation, and most npm packages we've used handle native software installation as a pre/post-install step. If you need to define a custom image, we've been working on some proposals for defining custom runtimes for nitric here: nitrictech/cli#203. If you have any thoughts on requirements, let us know there 😃.

Currently nitric handles high-level permissions at the bucket level for each deployed function, so it's possible to ensure a function can only read/write/delete for a bucket. We don't apply permissions at a file/pattern level at the moment, so if more granular security is required it would need to be handled in the application layer for now. It would be good to create a new issue for any specific requirements if you have some.

Also, keep your questions coming! We've been setting up a Discord server for community discussion which we're planning to make public soon, so if you'd like some more help with your project implementation I'd be happy to send you an invite.
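To illustrate the "handle it in the application layer" suggestion: a minimal sketch of a prefix-based guard a function could run before touching an object. The roles and prefixes here are made-up examples for the 3D-model pipeline, not a nitric feature:

```ts
// Hypothetical role -> allowed key-prefix mapping, enforced in app code
// because bucket permissions are currently all-or-nothing per function.
const allowedPrefixes: Record<string, string[]> = {
  uploader: ["incoming/"],
  optimizer: ["incoming/", "optimized/"],
};

// Returns true when the given role may touch the given object key.
export function mayAccess(role: string, key: string): boolean {
  return (allowedPrefixes[role] ?? []).some((prefix) => key.startsWith(prefix));
}
```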
-
Currently, our event context object is quite rigid in terms of the event source (just topics). Perhaps this is a good time to review the CloudEvents spec again and adopt the standard while adding this new event source?
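For reference, a bucket notification wrapped in a CloudEvents 1.0 envelope might look like the sketch below. The field names come from the CloudEvents spec, but the `source` and `type` values are purely illustrative, not anything nitric has settled on:

```ts
import { randomUUID } from "node:crypto";

// Minimal CloudEvents 1.0 envelope (required + common optional fields).
interface CloudEvent<T> {
  specversion: "1.0";
  id: string;
  source: string;   // the context that produced the event
  type: string;     // reverse-DNS-style event type
  subject?: string; // e.g. the object key within the bucket
  time?: string;    // RFC 3339 timestamp
  data?: T;
}

interface BucketEventData {
  bucket: string;
  key: string;
}

// Build an illustrative "object created" event for a bucket.
export function makeBucketEvent(bucket: string, key: string): CloudEvent<BucketEventData> {
  return {
    specversion: "1.0",
    id: randomUUID(),
    source: `/buckets/${bucket}`,            // illustrative source URI
    type: "io.nitric.bucket.object.created", // illustrative type name
    subject: key,
    time: new Date().toISOString(),
    data: { bucket, key },
  };
}
```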
-
Have created and linked a new issue describing the requirements:
-
@tjholm Just in case it's helpful, here is the implementation, using Pulumi, of the example I mentioned:

```ts
import * as aws from "@pulumi/aws";
import * as awsx from "@pulumi/awsx";

// Bucket for the uploaded models and a container image for the optimizer
const bucket = new aws.s3.Bucket("3d-model-optimizer");
const image = awsx.ecr.buildAndPushImage("model-optimizer", {
  context: "..",
});

// Execution role the Lambda assumes
const role = new aws.iam.Role("modelOptimizerRole", {
  assumeRolePolicy: aws.iam.assumeRolePolicyForPrincipal({
    Service: "lambda.amazonaws.com",
  }),
});
const attachment = new aws.iam.RolePolicyAttachment("lambdaFullAccess", {
  role: role.name,
  policyArn: aws.iam.ManagedPolicy.AWSLambdaExecute,
});

// Container-based Lambda that runs the optimizer
const optimizer = new aws.lambda.Function("optimizer", {
  packageType: "Image",
  imageUri: image.imageValue,
  role: role.arn,
  timeout: 300,
  memorySize: 1536,
}, { dependsOn: [attachment] });

// Fire the Lambda whenever a .zip object is created in the bucket
bucket.onObjectCreated("onNewModel", optimizer, { filterSuffix: ".zip" });
```
-
We have opened this PR to address this feature. It will use the syntax that was discussed above:

```ts
import { bucket } from '@nitric/sdk';

// create a bucket called files
const files = bucket("files");

// using pattern of "<event_type>:<file_pattern>",
// subscribe to created events for all files on the bucket
files.on('created:*', async (ctx) => {
  // do something with the event
});
```

Hope it helps 😄
-
This has been released in the nitric node-sdk and all nitric providers (starting with v0.27.0).
-
Hey! I would like to know how to create the typical workflow of an S3 upload triggering an AWS Lambda, because I can't see any events in the bucket class to hook into. Thanks in advance.