Commit 0ade265

Version 2.2.1
Included support for CloudWatch Events & AWS Lambda based trigger file creation (keeping the old Python version for backward compatibility). Updated dependency versions: async 0.9.0 -> 1.5.2, node-uuid 1.4.2 -> 1.4.7, pg 4.3.0 -> 4.4.3.
1 parent c67249a commit 0ade265

File tree

8 files changed: 36 additions & 14 deletions

.pydevproject

Lines changed: 5 additions & 0 deletions

```diff
@@ -0,0 +1,5 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<?eclipse-pydev version="1.0"?><pydev_project>
+<pydev_property name="org.python.pydev.PYTHON_PROJECT_INTERPRETER">Default</pydev_property>
+<pydev_property name="org.python.pydev.PYTHON_PROJECT_VERSION">python 2.7</pydev_property>
+</pydev_project>
```

README.md

Lines changed: 17 additions & 2 deletions

```diff
@@ -347,9 +347,24 @@ that files are loaded every N minutes, use the following process to force period
 
 When you create the configuration, add a filenameFilterRegex such as '.*\.csv', which
 only loads CSV files that are put into the specified S3 prefix. Then every N minutes,
-schedule the included dummy file generator through a CRON Job.
+schedule one of the included trigger file generators to run:
 
-```./path/to/function/dir/generate-dummy-file.py <region> <input bucket> <input prefix> <local working directory>```
+### Using Scheduled Lambda Functions
+
+You can use an included Lambda function to generate trigger files into all configured prefixes that have a regular expression filter, by completing the following:
+
+* Create a new AWS Lambda function and deploy the same zip file from the `dist` folder as you did for the AWS Lambda Redshift Loader. However, when you configure the handler name, use `createS3TriggerFile.handler`, and configure the function with the required timeout and RAM.
+* In the AWS Web Console, select Services/CloudWatch, and in the left-hand navigation select 'Events/Rules'.
+* Choose Event Source = 'Schedule' and specify the interval at which your trigger files should be generated.
+* Add the Lambda function you previously configured as the target.
+
+Once done, you will see CloudWatch Logs being created on the configured schedule, and trigger files arriving in the specified prefixes.
+
+### Through a CRON Job
+
+You can use a Python-based script to generate trigger files for specific input buckets and prefixes, using the following utility:
+
+```./path/to/function/dir/generate-trigger-file.py <region> <input bucket> <input prefix> <local working directory>```
 
 * region - the region in which the input bucket for loads resides
 * input bucket - the bucket which is configured as an input location
```
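As a concrete illustration of the CRON approach, a crontab entry along the following lines would regenerate trigger files every five minutes. The schedule, region, bucket, prefix, and working directory shown are placeholder values for illustration, not defaults shipped with the project:

```shell
# Hypothetical crontab entry: the schedule, region, bucket, prefix and
# working directory are illustrative placeholders -- substitute your own.
*/5 * * * * /path/to/function/dir/generate-trigger-file.py eu-west-1 my-input-bucket input/csv /tmp/trigger-work
```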

build.sh

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,4 +2,4 @@
 ver=`cat package.json | grep version | cut -d: -f2 | sed -e "s/\"//g" | sed -e "s/ //g" | sed -e "s/\,//g"`
 
-zip -r AWSLambdaRedshiftLoader-$ver.zip index.js common.js constants.js kmsCrypto.js upgrades.js *.txt package.json node_modules/ && mv AWSLambdaRedshiftLoader-$ver.zip dist
+zip -r AWSLambdaRedshiftLoader-$ver.zip index.js common.js createS3TriggerFile.js constants.js kmsCrypto.js upgrades.js *.txt package.json node_modules/ && mv AWSLambdaRedshiftLoader-$ver.zip dist
```

createS3TriggerFile.js

Lines changed: 6 additions & 4 deletions

```diff
@@ -29,7 +29,7 @@ exports.handler = function(event, context) {
 		} else {
 			if (!data.Items) {
 				console.log("Looks like you don't have any configured Prefix entries!");
-				context.success();
+				context.succeed();
 			} else {
 				// create a trigger file entry for each prefix
 				async.each(data.Items, function(configItem, callback) {
@@ -42,13 +42,15 @@ exports.handler = function(event, context) {
 						var fileKey = configItem.s3Prefix.S.replace(bucketName + "\/", "");
 
 						// create a trigger file on S3
-						createTriggerFile(bucketName, fileKey, callback);
+						exports.createTriggerFile(bucketName, fileKey, callback);
+					} else {
+						callback();
 					}
 				}, function(err) {
 					if (err) {
 						context.fail(err);
 					} else {
-						context.success();
+						context.succeed();
 					}
 				});
 			}
@@ -57,7 +59,7 @@ exports.handler = function(event, context) {
 
 /** function which will create a trigger file in the specified path */
 exports.createTriggerFile = function(bucketName, fileKey, callback) {
-	var prefix = fileKey + "/lambda-redshift-trigger-file.trigger";
+	var prefix = fileKey + "/lambda-redshift-trigger-file.dummy";
 
 	var createParams = {
 		Bucket : bucketName,
```
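To make the helper's role concrete, here is a minimal sketch (an assumed shape, not the project's exact code) of what a trigger-file writer does: it puts an empty marker object under the configured prefix so scheduled runs still satisfy the filenameFilterRegex. The S3 client is injected so the sketch runs without AWS credentials:

```javascript
// Hypothetical sketch of a trigger-file helper. The injected `s3` object is
// a stand-in for the AWS SDK's S3 client; only its putObject call is used.
function createTriggerFile(s3, bucketName, fileKey, callback) {
    // this commit renames the marker suffix from ".trigger" to ".dummy"
    var key = fileKey + "/lambda-redshift-trigger-file.dummy";
    s3.putObject({ Bucket: bucketName, Key: key, Body: "" }, callback);
}

// Exercise it with a stub client that records each putObject request.
var calls = [];
var stubS3 = {
    putObject: function (params, cb) {
        calls.push(params);
        cb(null, {});
    }
};
createTriggerFile(stubS3, "my-bucket", "input/csv", function (err) {
    if (err) throw err;
});
```

Injecting the client this way also explains the change from `createTriggerFile(...)` to `exports.createTriggerFile(...)` in the diff above: calling through `exports` keeps the helper addressable (and stubbable) from outside the module.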
One binary file changed (-299 KB); contents not shown.

index.js

Lines changed: 1 addition & 1 deletion

```diff
@@ -560,7 +560,7 @@ exports.handler = function(event, context) {
 			doProcessBatch = true;
 		}
 
-		if (config.batchTimeoutSecs && config.batchTimeoutSecs.N) {
+		if (config.batchTimeoutSecs && config.batchTimeoutSecs.N && config.batchSize.N > 0) {
 			if (common.now() - lastUpdateTime > parseInt(config.batchTimeoutSecs.N) && pendingEntries.length > 0) {
 				console.log("Batch Size " + config.batchSize.N + " not reached but reached Age " + config.batchTimeoutSecs.N + " seconds");
 				doProcessBatch = true;
```
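The one-line change above can be restated as a small decision function. This is a sketch using plain values rather than the DynamoDB attribute objects (`config.batchSize.N`, etc.) that index.js actually reads: the age-based flush now also requires a positive configured batch size.

```javascript
// Sketch of the batch-flush decision this commit tightens. A pending batch
// is processed when the configured size is reached, or when it has aged
// past batchTimeoutSecs -- but the timeout path now only applies to
// configurations with a positive batch size.
function shouldProcessBatch(batchSize, batchTimeoutSecs, batchAgeSecs, pendingCount) {
    if (batchSize && pendingCount >= batchSize) {
        return true; // configured size reached
    }
    if (batchTimeoutSecs && batchSize > 0 &&
        batchAgeSecs > batchTimeoutSecs && pendingCount > 0) {
        return true; // batch aged past the timeout with entries still pending
    }
    return false;
}
```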

package.json

Lines changed: 6 additions & 6 deletions

```diff
@@ -1,16 +1,16 @@
 {
 	"name": "aws-lambda-redshift-loader",
 	"description": "An Amazon Redshift Database Loader written for AWS Lambda",
-	"version": "2.2.0",
+	"version": "2.2.1",
 	"homepage": "http://github.com/awslabs/aws-lambda-redshift-loader",
 	"bugs": {
 		"url": "http://github.com/awslabs/aws-lambda-redshift-loader/issues",
 		"email": "[email protected]"
 	},
 	"dependencies": {
-		"async": "0.9.0",
-		"node-uuid": "1.4.2",
-		"pg":"4.3.0"
+		"async": "1.5.2",
+		"node-uuid": "1.4.7",
+		"pg":"4.4.3"
 	},
 	"keywords": [
 		"amazon",
@@ -28,6 +28,7 @@
 		"addAdditionalClusterEndpoint.js",
 		"common.js",
 		"constants.js",
+		"createS3TriggerFile.js",
 		"describeBatch.js",
 		"encryptValue.js",
 		"generate-trigger-file.py",
@@ -48,5 +49,4 @@
 		"type": "git",
 		"url": "http://github.com/awslabs/aws-lambda-redshift-loader"
 	}
-}
-
+}
```

0 commit comments