Commit 8ac1375: new failure mode
ErinWeisbart authored Oct 19, 2023
1 parent ade2c86
Showing 1 changed file with 1 addition and 0 deletions.
1 change: 1 addition & 0 deletions documentation/DCP-documentation/troubleshooting_runs.md

@@ -22,6 +22,7 @@
| | "SSL: certificate subject name (*.s3.amazonaws.com) does not match target host name 'xxx.yyy.s3.amazonaws.com'" | Cannot be accessed by fleet. | | S3FS fails to mount if your bucket name has a dot (.) in it. | You can bypass S3FS usage by setting DOWNLOAD_FILES = TRUE in your config.py. Note that, depending upon your job and machine setup, you may need to increase the size of your EBS volume to account for the files being downloaded. Alternatively, you can build your own DCP Docker image and edit run-worker.sh to add the `use_path_request_style` flag. If your region is not us-east-1, you also need to specify `endpoint`. See the S3FS documentation for more information. |
| | Your logs show that files are downloading, but the run never moves beyond that point. | | | If you have set DOWNLOAD_FILES = TRUE in your config, your files are failing to download completely because the instance is running out of space and the failure is silent. | Place larger volumes on your instances by increasing EBS_VOL_SIZE in your config.py. |
| | "ValueError: The Mito image is missing from the pipeline." | | | The CellProfiler pipeline is referencing a channel (in this example, "Mito") that is not being loaded in the pipeline. | Check that your load_data csv contains the FileNames and PathNames for all your images. This can sometimes happen when the load_data csv is being automatically generated or edited as part of a workflow. |
| | "Failed to prepare run for module LoadData", "ValueError: zero-size array to reduction operation maximum which has no identity" | | | CellProfiler cannot read any information from your load_data.csv. | Check that your load_data.csv contains data beyond the header. This can sometimes happen when the load_data csv is being automatically generated or edited as part of a workflow. |
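The config.py settings named in the fixes above (DOWNLOAD_FILES, EBS_VOL_SIZE) can be sketched as a minimal fragment. The values below are illustrative assumptions, and whether DOWNLOAD_FILES takes a bare boolean or a string may vary by Distributed-CellProfiler version, so check your own config.py:

```python
# Illustrative fragment of a Distributed-CellProfiler config.py.
# Bypass S3FS mounting (e.g. when the bucket name contains a dot);
# workers then download input files directly instead of reading a mount.
DOWNLOAD_FILES = 'True'  # exact form (string vs. boolean) depends on your DCP version

# Downloaded files live on the instance's EBS volume, so it must be large
# enough to hold them; size is in GB (the value here is an assumption).
EBS_VOL_SIZE = 60
```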

Further hints:
- The SSH_KEY_NAME in the config.py file contains the name of the key pair used to access AWS.
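As a minimal illustration of that hint (the key-pair name below is a placeholder, not a real value):

```python
# config.py fragment: the EC2 key pair used to SSH into fleet instances.
# 'my-dcp-key' is a placeholder; it must match a key pair registered
# in your AWS account.
SSH_KEY_NAME = 'my-dcp-key'
```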