Hi, I am trying to write some E2E tests for my PR, which adds DynamoDB as a reader/writer. I'm getting a FileNotFoundException, which in my experience means the files do not exist on HDFS. Looking at the e2e/elasticsearch example, I don't see how the example files are made accessible to the Docker containers, or how the tests use the updated JAR with the latest code. So I have the following questions:
How do I make my newly added example files available to the spark-submit image?
How do I run the tests with the updated metorikku.jar file?
I am trying to use dynamodb-local for the tests, but that requires additional configuration options to point the endpoint URL at the local instance. What is the recommended way to add test-specific configs to the examples?
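For context, here is roughly what I expected the wiring to look like — a minimal docker-compose sketch covering all three questions. The service names, image name, JAR path, and environment variable are my assumptions for illustration, not Metorikku's actual e2e setup:

```yaml
# Hypothetical sketch — names and paths are assumptions, not the real e2e config.
version: "3"
services:
  dynamodb:
    image: amazon/dynamodb-local      # local DynamoDB emulator, listens on 8000
    ports:
      - "8000:8000"
  spark-submit:
    image: metorikku/spark-submit     # assumed image name
    volumes:
      # Q1: bind-mount the example files so the container can read them
      - ./examples:/examples
      # Q2: mount the freshly built JAR over the one baked into the image (assumed path)
      - ./target/scala-2.12/metorikku.jar:/opt/metorikku/metorikku.jar
    environment:
      # Q3: test-specific endpoint override, resolved via the compose network
      - DYNAMODB_ENDPOINT=http://dynamodb:8000
    depends_on:
      - dynamodb
```

Is this volume-mount approach how the existing e2e suites do it, or is there a different mechanism for injecting the updated JAR and test configs?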
Error snippet:
dynamodb-spark-submit-1 | Exception in thread "main" java.io.FileNotFoundException: File examples/dynamodb/movies.yaml does not exist