Edit: It's even worse: the whole stack now cannot be destroyed with the Terraform CLI, so I have to go in and delete the resources manually. 👎
The examples are outdated and don't work. I tried the eks_argo TF example, and it would bork out with
│ Warning: Argument is deprecated
│
│ with module.metaflow-datastore.aws_s3_bucket.this,
│ on .terraform/modules/metaflow-datastore/modules/datastore/s3.tf line 1, in resource "aws_s3_bucket" "this":
│ 1: resource "aws_s3_bucket" "this" {
│
│ Use the aws_s3_bucket_server_side_encryption_configuration resource instead
│
│ (and one more similar warning elsewhere)
╵
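The deprecation warning above points at the fix itself: in AWS provider v4+, server-side encryption moved out of `aws_s3_bucket` into its own resource. A minimal sketch of what the module's `s3.tf` would need instead (the `aws:kms` algorithm is an assumption; the module may use a different one):

```hcl
resource "aws_s3_bucket" "this" {
  bucket = "my-metaflow-datastore" # placeholder name
}

# Encryption is now configured via a separate resource
# instead of an inline block on aws_s3_bucket
resource "aws_s3_bucket_server_side_encryption_configuration" "this" {
  bucket = aws_s3_bucket.this.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "aws:kms" # assumption; could also be "AES256"
    }
  }
}
```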
╷
│ Error: creating Lambda Function (metaflowdb_migrateir9nhhph): operation error Lambda: CreateFunction, https response error StatusCode: 400, RequestID: XXX, InvalidParameterValueException: The runtime parameter of python3.7 is no longer supported for creating or updating AWS Lambda functions. We recommend you use the new runtime (python3.12) while creating or updating functions.
│
│ with module.metaflow-metadata-service.aws_lambda_function.db_migrate_lambda,
│ on .terraform/modules/metaflow-metadata-service/modules/metadata-service/lambda.tf line 115, in resource "aws_lambda_function" "db_migrate_lambda":
│ 115: resource "aws_lambda_function" "db_migrate_lambda" {
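The error message itself names the fix: bump the Lambda runtime. A sketch of the change in the module's `lambda.tf`, with all other arguments left as placeholders since I haven't seen the full resource:

```hcl
resource "aws_lambda_function" "db_migrate_lambda" {
  function_name = "metaflowdb_migrate" # placeholder
  role          = "arn:aws:iam::123456789012:role/example" # placeholder
  handler       = "index.handler"      # placeholder

  # python3.7 is no longer accepted by CreateFunction;
  # python3.12 is what the error message recommends
  runtime = "python3.12"
}
```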
Also, the EKS version in the examples is outdated and has been unsupported since March 2024. I understand these are just first steps, but it's still a bit disappointing. We built our own Metaflow deployment with AWS CDK, but CDK has its own issues and AWS Step Functions is excruciatingly slow, so I was really hoping for some speed improvements using Argo + k8s + TF, both for deploying the infrastructure and for deploying workflows.
Steps to reproduce:
1. `cd terraform-aws-metaflow/examples/minimal`
2. Set `locals.resource_prefix = "test-metaflow"` in `minimal_example.tf`
3. `terraform apply` and wait until it finishes
4. `aws apigateway get-api-key --api-key <api-key> --include-value | grep value` and paste the result into the `metaflow_profile.json` file
5. `metaflow configure import metaflow_profile.json`
6. `python mftest.py run`

mftest.py
The running task never finishes; the created AWS Batch job in the Batch job queue stays in status `RUNNABLE`.
Also tried with `outerbounds/metaflow/aws` version = 0.10.1 and `terraform-aws-modules/vpc/aws` version = 5.1.2.

Generated metaflow_profile.json
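For anyone hitting the same thing: a Batch job stuck in `RUNNABLE` usually means the compute environment can't place it (no capacity, subnet/role misconfiguration, or an `INVALID` compute environment). A minimal sketch for inspecting this with the AWS CLI, assuming valid credentials; `<job-id>` is a placeholder:

```shell
# Show the job's status and reason (statusReason is often empty
# while RUNNABLE, so also check the compute environment below)
aws batch describe-jobs --jobs <job-id> \
  --query 'jobs[0].{status:status,reason:statusReason}'

# List compute environments with their status/state;
# an INVALID status here is a common cause of stuck RUNNABLE jobs
aws batch describe-compute-environments \
  --query 'computeEnvironments[].{name:computeEnvironmentName,status:status,state:state}'
```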