Remaining issues (helm chart):
add note about credentials to the service initialization
update notes about deletion
set minimal requirements (memory, CPU count, CPU clock speed, disk space, bandwidth, maybe more)
Remaining issues (init.sh and extend_modelbuilder_notebook.py):
in init.sh: after copying the ipynb, add a cell that calls a to-be-created upload_model_to_s3(dir_model) function, supplying the correct folder name/dir_model to that function (see the sketch after this list)
ipynb/py files are downloaded twice. It seems the init script is executed twice, which might be a bug in EDITO
clear notebook output with jupyter nbconvert --clear-output --inplace my_notebook.ipynb
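As a starting point for that cell, a minimal sketch of what upload_model_to_s3(dir_model) could look like, assuming boto3 is available and the s3 credentials, endpoint and bucket come from environment variables (S3_ENDPOINT and S3_BUCKET are placeholder names, the DFM_model prefix is only an example):

```python
import os
from pathlib import Path

import boto3  # assumes credentials are provided via the usual AWS_* environment variables


def upload_model_to_s3(dir_model, bucket=None, prefix="DFM_model"):
    """Upload every file below dir_model to the s3 bucket under the given prefix."""
    bucket = bucket or os.environ["S3_BUCKET"]  # placeholder variable name
    s3 = boto3.client("s3", endpoint_url=os.environ.get("S3_ENDPOINT"))
    dir_model = Path(dir_model)
    for path in dir_model.rglob("*"):
        if path.is_file():
            key = f"{prefix}/{path.relative_to(dir_model).as_posix()}"
            s3.upload_file(str(path), bucket, key)  # uploads directly, no temp directory needed
            print(f"uploaded {path} -> s3://{bucket}/{key}")
```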
Remaining issues (modelbuilder, notebook and run_docker.sh):
create_model_exec_files is not used yet since multicore dimr cannot be parsed and sequential dimr returns nproc=0 instead of 1: dimr_config for parallel run is incorrect (Deltares/HYDROLIB-core#562) >> require hydrolib-core 0.8.0 (Deltares/dfm_tools#890) >> require minimal dfmt version and simplify the upload script
add temperature=1 and salinity=1 (including heatfluxmodel comment), check runtime >> not required since we now use a 3D run
prevent the need for copy-pasting the password when initializing the service
maybe merge partitions (D-EcoImpact needs this) with delft3dfm mapmerge in run_docker.sh. Can the D-EcoImpact service find both the sequential/merged nc files since they have different names? >> UPDATE 29-01-2024: D-EcoImpact now accepts partitioned input, this is also released in the docker for EDITO (https://issuetracker.deltares.nl/browse/DEI-254); see the partitioned-output sketch after this list
the parallel run fails; are we allowed to use multiple cores on EDITO? See also the comment in the current dockerfile about how to run it (e.g. ulimit and shm-size); maybe we also have to request n cores here
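For reference, a minimal sketch of reading partitioned map output as a single dataset with dfm_tools (the path pattern is only an example and assumes dfm_tools is installed); this is an in-memory alternative to merging the partitions on disk with mapmerge:

```python
import dfm_tools as dfmt

# example pattern; D-Flow FM writes one map file per partition, e.g. model_0000_map.nc, model_0001_map.nc
file_nc = "DFM_OUTPUT/*_map.nc"

# merges all partitions into a single xugrid dataset in memory
uds = dfmt.open_partitioned_dataset(file_nc)
print(uds)
```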
Remaining issues (s3 upload):
simplify the s3 upload commands by avoiding the need for a temp directory
the s3 token is not permanently valid, not even for two days; increase this to much more
run_docker.sh: in this script, first solve dimr_config for parallel run is incorrect (Deltares/HYDROLIB-core#562)
Remaining issues (running delft3dfm):
no access rights for JV
currently fails since job.yaml expects the folder Vietnam_model to exist. upload_model.py now always renames this to DFM_model, which might not be the best name (delft3dfm_model would be a better alternative)
rename DFM_OUTPUT_Vietnam to DFM_OUTPUT
the service also crashes if the model run is successful. This happens for both 2D and 3D. Furthermore, the mapfile is missing from the s3 bucket, which might be the cause of the failed process.
crash/show a message if the run crashes, instead of just running the service with no output (add process text to the screen like dfm-modelbuilder; see the sketch after this list)
add post-processing notebook/service?
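One possible way to surface that process text and a clear failure, sketched here with a hypothetical run_dimr.sh as the run command (the actual command in run_docker.sh may differ):

```python
import subprocess

# stream the model run output to the screen so progress and crashes are visible
proc = subprocess.Popen(
    ["bash", "run_dimr.sh"],  # placeholder for the actual run command
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)
for line in proc.stdout:
    print(line, end="", flush=True)
returncode = proc.wait()
if returncode != 0:
    raise RuntimeError(f"model run failed with exit code {returncode}")
```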
Initialize service:
Service Catalog >> Ocean modelling >> Delft3dfm-modelbuilder >> Launch
Run the modelbuilder:
open modelbuilder_example_edito.ipynb and adjust dxy in block [2] and min_edge_size in block [5], for instance as in the sketch below
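The values below are purely illustrative; pick a resolution and refinement that suit your own domain:

```python
# block [2]: base grid resolution; illustrative value only
dxy = 0.05

# block [5]: minimum edge size (meters) used for grid refinement; illustrative value only
min_edge_size = 300
```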
Run the D-FlowFM model with another service:
Process Catalog >> Ocean modelling >> Delft3dfm-run-docker >> Launch
the results appear in the DFM_OUTPUT folder in your s3 bucket
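To quickly check that the output arrived, something like the following could be used (bucket and endpoint variable names are placeholders, same assumptions as the upload sketch above):

```python
import os

import boto3

# list the run results under the DFM_OUTPUT prefix in the s3 bucket
s3 = boto3.client("s3", endpoint_url=os.environ.get("S3_ENDPOINT"))
response = s3.list_objects_v2(Bucket=os.environ["S3_BUCKET"], Prefix="DFM_OUTPUT/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```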