Update grand_challenge_forge/partials/example-algorithm/inference.py.j2
Co-authored-by: Anne Mickan <[email protected]>
chrisvanrun and amickan authored Jan 10, 2025
1 parent aca20a7 commit dea6d81
Showing 1 changed file with 3 additions and 3 deletions.
grand_challenge_forge/partials/example-algorithm/inference.py.j2

@@ -59,14 +59,14 @@ def run():
     # Process the inputs: any way you'd like
     _show_torch_cuda_info()
 
-    # Some additional resources might be required, place these in one of two locations.
+    # Some additional resources might be required, include these in one of two ways.
 
-    # First location: part of the Docker-container image: resources/
+    # Option 1: part of the Docker-container image: resources/
     resource_dir = Path("/opt/app/resources")
     with open(resource_dir / "some_resource.txt", "r") as f:
         print(f.read())
 
-    # Second location: part of the model tarball
+    # Option 2: upload them as a separate tarball to Grand Challenge (go to your Algorithm > Models). The resources in the tarball will be extracted to `model_dir` at runtime.
     model_dir = Path("/opt/ml/model")
     with open(model_dir / "a_tarball_subdirectory" / "some_tarball_resource.txt", "r") as f:
         print(f.read())
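Taken together, the two options in the template mean an algorithm can read resources either from the container image itself (/opt/app/resources) or from the model tarball uploaded to Grand Challenge (/opt/ml/model). Below is a minimal, hypothetical sketch of how an inference script could check both locations; the helper name load_resource and the file names are illustrative assumptions, not part of the template.

from pathlib import Path

# Locations used by the template (see the diff above).
RESOURCE_DIR = Path("/opt/app/resources")  # Option 1: baked into the container image
MODEL_DIR = Path("/opt/ml/model")          # Option 2: extracted from the model tarball at runtime


def load_resource(relative_path: str) -> str:
    # Hypothetical helper: prefer the tarball copy so resources can be
    # updated on Grand Challenge without rebuilding the container image.
    for base in (MODEL_DIR, RESOURCE_DIR):
        candidate = base / relative_path
        if candidate.exists():
            return candidate.read_text()
    raise FileNotFoundError(
        f"{relative_path} not found in {MODEL_DIR} or {RESOURCE_DIR}"
    )


if __name__ == "__main__":
    # Illustrative file name only.
    print(load_resource("some_resource.txt"))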
