ufs-coastal DATM+WW3 DuckNC #128

Open
Tracked by #95
janahaddad opened this issue Sep 10, 2024 · 23 comments
@janahaddad
Collaborator

No description provided.

@yunfangsun
Collaborator

yunfangsun commented Nov 25, 2024

By using commit c6492b9, the Duck case gives the following error message. The same error appears in both the ufs-coastal WW3 and ATM+WW3 configurations and seems to be related to the initial configuration.

+ srun --label -n 300 ./fv3.exe
118: Abort(52) on node 118 (rank 118 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 118
191: Abort(52) on node 191 (rank 191 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 191
127: Abort(52) on node 127 (rank 127 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 127
242: Abort(52) on node 242 (rank 242 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 242
202: Abort(52) on node 202 (rank 202 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 202
 35: Abort(52) on node 35 (rank 35 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 35
135: Abort(52) on node 135 (rank 135 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 135
 45: Abort(52) on node 45 (rank 45 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 45
270: Abort(52) on node 270 (rank 270 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 270
136: Abort(52) on node 136 (rank 136 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 136
274: Abort(52) on node 274 (rank 274 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 274
224: Abort(52) on node 224 (rank 224 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 224
 37: Abort(52) on node 37 (rank 37 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 37
 95: Abort(52) on node 95 (rank 95 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 95
277: Abort(52) on node 277 (rank 277 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 277
204: Abort(52) on node 204 (rank 204 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 204
219: Abort(52) on node 219 (rank 219 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 219
103: Abort(52) on node 103 (rank 103 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 103
221: Abort(52) on node 221 (rank 221 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 221
 40: Abort(52) on node 40 (rank 40 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 40
249: Abort(52) on node 249 (rank 249 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 249
 67: Abort(52) on node 67 (rank 67 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 67
122: Abort(52) on node 122 (rank 122 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 122
144: Abort(52) on node 144 (rank 144 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 144
 25: Abort(52) on node 25 (rank 25 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 52) - process 25

@yunfangsun
Collaborator

Then I compiled stand-alone WW3 with the same configuration as UFS-coastal WW3 (812b8f7), regenerated mod_def.ww3 and nest.ww3, and ran the same case again. It gives the following error:

IMPTOTAL is selected
But PDLIB is not
IMPTOTAL is selected
But PDLIB is not
IMPTOTAL is selected
But PDLIB is not
IMPTOTAL is selected
But PDLIB is not
IMPTOTAL is selected
But PDLIB is not
IMPTOTAL is selected
But PDLIB is not
IMPTOTAL is selected
But PDLIB is not
IMPTOTAL is selected
But PDLIB is not

Comparing the successful and failed stand-alone WW3 cases, I found that the switch had been modified.

The successful one uses: NCO PDLIB SCOTCH SCRIP SCRIPNC NOGRB DIST MPI PR3 UQ FLX0 SEED ST4 STAB0 NL1 BT1 DB1 MLIM FLD2 TR0 BS0 RWND WNX1 WNT1 CRX1 CRT1 O0 O1 O2 O3 O4 O5 O6 O7 O14 O15 IC0 IS0 REF0

The failed one uses: NCO NOGRB DIST MPI OMPG OMPH SCRIP SCRIPNC WRST PR3 UQ FLX0 SEED ST4 STAB0 NL1 BT1 DB1 MLIM FLD2 TR0 BS0 RWND WNX1 WNT1 CRX1 CRT1 O0 O1 O2 O3 O4 O5 O6 O7 O14 O15 IC0 IS0 REF0

The key difference is that the failed switch drops PDLIB and SCOTCH and instead adds OMPG, OMPH and WRST.
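For context, the "IMPTOTAL is selected / But PDLIB is not" message is a runtime consistency check: mod_def.ww3 was generated from a grid input that enables the fully implicit unstructured scheme (IMPTOTAL), which requires an executable built with the PDLIB switch (plus a domain-decomposition library such as SCOTCH). A rough sketch of the relevant UNST namelist in the ww3_grid input is below (parameter names as I understand the WW3 unstructured options; values illustrative, not taken from this case):

&UNST
  UGOBCAUTO = T
  EXPFSN    = F        ! explicit unstructured scheme disabled
  IMPTOTAL  = T        ! fully implicit scheme; needs PDLIB (+ SCOTCH/METIS) in the compile switch
/

If IMPTOTAL is set in mod_def.ww3, the executable must be built with PDLIB, which is exactly what the failed switch is missing.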

@yunfangsun
Collaborator

Then I modified UFS-coastal to compile the WW3 and ATM+WW3 cases with the switch NCO PDLIB SCOTCH SCRIP SCRIPNC NOGRB DIST MPI PR3 UQ FLX0 SEED ST4 STAB0 NL1 BT1 DB1 MLIM FLD2 TR0 BS0 RWND WNX1 WNT1 CRX1 CRT1 O0 O1 O2 O3 O4 O5 O6 O7 O14 O15 IC0 IS0 REF0.

The Abort(52) error no longer occurs.

However, the ufs-coastal WW3 case shows the following errors:

141:  *** WAVEWATCH III ERROR IN W3FLDO :
141:      ERROR IN OPENING WND FILE, IOSTAT =    29
141:
 80:
 80:  *** WAVEWATCH III ERROR IN W3FLDO :
 80:      ERROR IN OPENING WND FILE, IOSTAT =    29

And the ufs-coastal ATM+WW3 case shows the following errors:

0: Abort(1) on node 0 (rank 0 in comm 496): application called MPI_Abort(comm=0x84000003, 1) - process 0
 4:  ESMF_Finalize: Error closing trace stream
 4: Abort(1) on node 4 (rank 4 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 4
 6:  ESMF_Finalize: Error closing trace stream
 6: Abort(1) on node 6 (rank 6 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 6
10:  ESMF_Finalize: Error closing trace stream
10: Abort(1) on node 10 (rank 10 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 10
 2:  ESMF_Finalize: Error closing trace stream
 2: Abort(1) on node 2 (rank 2 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 2
 8:  ESMF_Finalize: Error closing trace stream
 8: Abort(1) on node 8 (rank 8 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 8
 3:  ESMF_Finalize: Error closing trace stream

@yunfangsun
Collaborator

By adding a wind.ww3 file, the UFS-coastal WW3 case runs without any error messages. It seems that wind.ww3 is required no matter whether INPUT%FORCING%WINDS = 'H' or 'T', which should not be the case for 'H' (homogeneous forcing from the namelist). The log files also show that the wind data are read from the namelist:

  0:  --------------------------------------------------
  0:        water levels   YES/--  (homogeneous field)
  0:        currents       ---/NO
  0:        winds          YES/--  (homogeneous field)
  0:        ice fields     ---/NO
  0:        momentum       ---/NO
  0:        air density    ---/NO
  0:        mean param.    ---/NO
  0:        1D spectra     ---/NO
  0:        2D spectra     ---/NO
  0:
  0:             Fields   : Wind speed
  0:                        Water level
  0:                        Wave height
  0:                        Mean wave length
  0:                        Mean wave period(+2)
  0:                        Mean wave period(+1)
  0:                        Peak frequency
  0:                        Mean wave dir. a1b1
  0:                        Mean dir. spr. a1b1
  0:                        Peak direction
  0:                        Peak prd. (from fp)
  0:                        Part. peak period
  0:                        Part. mean direction
  0:             Point  1 :   -75.74   36.19  p1
  0:                    2 :   -75.74   36.19  p2
  0:                    3 :   -75.71   36.20  p6
  0:                    4 :   -75.75   36.19  p7
  0:                    5 :   -75.59   36.25  p8
  0:                    6 :   -75.75   36.19  p9
  0:                    7 :   -75.75   36.19  p11
  0:             Fields   : no fields defined
  0:        Homogeneous field data (and moving grid) ...
  0:           114  water levels
  0:                1   20121027 000000  0.4370E+00
  0:                2   20121027 003000  0.3290E+00
  0:                3   20121027 010000  0.1600E+00
  0:                4   20121027 013000  0.3400E-01
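For reference, a minimal sketch of how these homogeneous inputs would be supplied in ww3_shel.nml (group and variable names as I understand the ww3_shel.nml format; values taken from the first two records echoed above):

&INPUT_NML
  INPUT%FORCING%WATER_LEVELS = 'H'   ! homogeneous field from the namelist
  INPUT%FORCING%WINDS        = 'H'
/
&HOMOG_COUNT_NML
  HOMOG_COUNT%N_LEV = 114
/
&HOMOG_INPUT_NML
  HOMOG_INPUT(1)%NAME   = 'LEV'
  HOMOG_INPUT(1)%DATE   = '20121027 000000'
  HOMOG_INPUT(1)%VALUE1 = 0.437
  HOMOG_INPUT(2)%NAME   = 'LEV'
  HOMOG_INPUT(2)%DATE   = '20121027 003000'
  HOMOG_INPUT(2)%VALUE1 = 0.329
/

With 'H' selected for winds, no wind.ww3 file should be required, which is why the behaviour above looks wrong.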

@yunfangsun
Collaborator

Although there are no errors about the water level forcing, and the log shows that the water level information is read from the namelist, the water level output contains only zeros:

data:

 WLV =
  0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
    0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,

I then added a level.ww3 file as external forcing and tried both INPUT%FORCING%WATER_LEVELS = 'H' and 'T'. There are error messages about the water levels, and in either case the water level output is always 0, which differs from the stand-alone WW3 run outside ufs-coastal.
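For the 'T' (external file) path, level.ww3 is normally produced from the netCDF water-level data with ww3_prnc; a rough sketch of the corresponding ww3_prnc.nml is below (namelist layout as I understand the ww3_prnc.nml format; file and variable names are hypothetical placeholders):

&FORCING_NML
  FORCING%TIMESTART          = '20121027 000000'
  FORCING%TIMESTOP           = '20121030 000000'
  FORCING%FIELD%WATER_LEVELS = T
  FORCING%GRID%LATLON        = T
/
&FILE_NML
  FILE%FILENAME  = 'duck_water_levels.nc'   ! hypothetical input file
  FILE%LONGITUDE = 'longitude'
  FILE%LATITUDE  = 'latitude'
  FILE%VAR(1)    = 'wlv'                    ! hypothetical variable name
/

Either way, the WLV field in the gridded output should track the input values rather than staying at zero.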

@yunfangsun
Collaborator

And for the ATM+WW3 duck case, there are errors from the mediator:

20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_ice count =      1
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_ice itemNameList = cpl_scalars
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_ice Field = cpl_scalars is not connected.
20241202 084542.460 INFO             PET495 (med_fldList_Realize) done
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):Fr_rof count =      1
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):Fr_rof itemNameList = cpl_scalars
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):Fr_rof Field = cpl_scalars is not connected.
20241202 084542.460 INFO             PET495 (med_fldList_Realize) done
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_rof count =      1
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_rof itemNameList = cpl_scalars
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_rof Field = cpl_scalars is not connected.
20241202 084542.460 INFO             PET495 (med_fldList_Realize) done
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):Fr_wav count =      1
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):Fr_wav itemNameList = cpl_scalars
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):Fr_wav Field = cpl_scalars is connected on root pe
20241202 084542.460 INFO             PET495 (med_fldList_Realize) done
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_wav count =      3
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_wav itemNameList = Sa_u10m
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_wav itemNameList = Sa_v10m
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_wav itemNameList = cpl_scalars
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_wav Field = cpl_scalars is not connected.
20241202 084542.460 INFO             PET495 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):To_wav Field = Sa_u10m is connected, grid/mesh TBD

@yunfangsun
Collaborator

For this ATM+WW3 case, the WW3 switch used is ufs-weather-model/WW3/model/bin/switch_meshcap_pdlib, whose content is: NCO PDLIB SCOTCH NOGRB DIST MPI PR3 UQ FLX0 SEED ST4 STAB0 NL1 BT1 DB1 MLIM FLD2 TR0 BS0 RWND WNX1 WNT1 CRX1 CRT1 O0 O1 O2 O3 O4 O5 O6 O7 O14 O15 IC0 IS0 REF0

@janahaddad
Collaborator Author

Hey @SmithJos13, @yunfangsun will need a DOCN component for this wave case, with a sea surface level field as input. Would it be possible to share the path to your DOCN+CICE case and give Yunfang read permissions?

@SmithJos13

SmithJos13 commented Dec 23, 2024

Do you want access to my version of the UFS-coastal directory that I have been working on? If so, here you go. I think the copy-all data mode has been modified to accept some more fields, and you can look at the CMEPS coastal coupling mode to see how the exchange of information is handled in the mediator.

/work2/noaa/vdatum/jsmith/dev/ufs-coastal-cice-dev/

This is the working directory on Hercules. Let me know if the permissions are lacking and I can fix it.

@yunfangsun
Collaborator

Hi @SmithJos13,

I can access the folder /work2/noaa/vdatum/jsmith/dev/ufs-coastal-cice-dev/; however, since I am not in the vdatum group, I can't read most of the files in it, for example /work2/noaa/vdatum/jsmith/dev/ufs-coastal-cice-dev/tests and /work2/noaa/vdatum/jsmith/dev/stmp/jsmith/FV3_RT/rt_182747.

Could you please add read permission for other users?

Thank you!

Best,

Yunfang

@SmithJos13

@yunfangsun does this mean I need to give you access to the vdatum folder? Or maybe the issue is that I gave you read privileges on all the folders but not the files! I'll look into this and let you know when I've fixed it.

Best wishes,
Joey

@yunfangsun
Collaborator

Hi @SmithJos13 ,

I am not in the vdatum group, so you would need to change the permissions of the files from 750 to 755 for me to be able to read the files in your folder.

Best,

Yunfang

@SmithJos13

@yunfangsun ahhh gotcha! I’ll fix that!

@SmithJos13

@yunfangsun okay, let me know if it's working now! You should have 755 for all directories in the folder and 644 for all the files. I think that should be good enough?

@yunfangsun
Collaborator

Hi @SmithJos13 ,

Thank you, I can now see the files in /work2/noaa/vdatum/jsmith/dev/ufs-coastal-cice-dev/. However, I still can't see the run folder /work2/noaa/vdatum/jsmith/dev/stmp/jsmith/FV3_RT/rt_182747. Could you please also change the permissions of the run directory?

Thank you!

@SmithJos13

@yunfangsun

Okay, I think it should be good now! Let me know if it is working.

@yunfangsun
Collaborator

Hi @uturuncoglu @janahaddad ,

The original CDEPS ocean component only provides T, U, and V variables, so I am using @SmithJos13's docn_datamode_cice_roms_output_mod.F90, config_component.xml and stream_definition_docn.xml to compile the ATM+WW3 case.

I added the following docn_in:

"docn_in" 12L, 351B                                                                                                                                           6,17          All
&docn_nml
  datamode = "sstdata"
  model_maskfile = "INPUT/era5_data_19941012_19941014_rot_fix_SCRIP_ESMF.nc"
  model_meshfile = "INPUT/era5_data_19941012_19941014_rot_fix_SCRIP_ESMF.nc"
  nx_global = 1440
  ny_global = 721
  restfilm = "null"
  sst_constant_value = -1.0
  skip_restart_read = true
  import_data_fields = "none"
  export_all = true
/

and docn.streams:

stream_info:               era5.01
taxmode01:                 cycle
mapalgo01:                 redist
tInterpAlgo01:             linear
readMode01:                single
dtlimit01:                 1.5
stream_offset01:           0
yearFirst01:               2018
yearLast01:                2018
yearAlign01:               2018
stream_vectors01:          null
stream_mesh_file01:        "INPUT/era5_data_19941012_19941014_rot_fix_SCRIP_ESMF.nc"
stream_lev_dimname01:      null
stream_data_files01:       "INPUT/era5_data_30min_obs_wind_rot_fix_filled_wlv.nc"
stream_data_variables01:   "msl So_h"

ufs.configure:


# ESMF #
logKindFlag:            ESMF_LOGKIND_MULTI
globalResourceControl:  true

# EARTH #
EARTH_component_list: ATM OCN WAV MED
EARTH_attributes::
  Verbosity = 0
::

# MED #
MED_model:                      cmeps
MED_petlist_bounds:             0 499
MED_omp_num_threads:            1
MED_attributes::
  ATM_model = datm
  WAV_model = ww3
  history_n = 1
  history_option = nhours
  history_ymd = -999
  coupling_mode = coastal
::

# ATM #
ATM_model:                      datm
ATM_petlist_bounds:             0 5
ATM_omp_num_threads:            1
ATM_attributes::
  Verbosity = 0
  DumpFields = false
  ProfileMemory = false
  OverwriteSlice = true
::
# OCN #
OCN_model:                      docn
OCN_petlist_bounds:             6 11
OCN_omp_num_threads:            1
OCN_attributes::
  Verbosity = 0
  DumpFields = false
  ProfileMemory = false
  OverwriteSlice = true
  meshloc = element
  CouplingConfig = none
::


# WAV #
WAV_model:                      ww3
WAV_petlist_bounds:             12 499
WAV_omp_num_threads:            1
WAV_attributes::
  Verbosity = 0
  DumpFields = false
  ProfileMemory = false
  merge_import = .false.
  mesh_wav = duck_ESMFmesh.nc
  multigrid = false
  gridded_netcdfout = true
  diro = "."
  logfile = wav.log
::
# Run Sequence #
runSeq::
@3600
  MED med_phases_prep_atm
  MED med_phases_prep_ocn_accum
  MED med_phases_prep_ocn_avg
  MED med_phases_prep_wav_accum
  MED med_phases_prep_wav_avg
  MED -> ATM :remapMethod=redist
  MED -> OCN :remapMethod=redist
  MED -> WAV :remapMethod=redist
  ATM
  OCN
  WAV
  ATM -> MED :remapMethod=redist
  OCN -> MED :remapMethod=redist
  WAV -> MED :remapMethod=redist
  MED med_phases_post_atm
  MED med_phases_post_wav
  MED med_phases_restart_write
  MED med_phases_history_write
@
::

ALLCOMP_attributes::
  ScalarFieldCount = 3
  ScalarFieldIdxGridNX = 1
  ScalarFieldIdxGridNY = 2
  ScalarFieldIdxNextSwCday = 3
  ScalarFieldName = cpl_scalars
  start_type = startup
  restart_dir = RESTART/
  case_name = ufs.cpld
  restart_n = 24
  restart_option = nhours
  restart_ymd = -999
  orb_eccen = 1.e36
  orb_iyear = 2012
  orb_iyear_align = 2012
  orb_mode = fixed_year
  orb_mvelp = 1.e36
  orb_obliq = 1.e36
  stop_n = 72
  stop_option = nhours
  stop_ymd = -999
::

I then try to run the ATM+WW3 case with:

&INPUT_NML
INPUT%FORCING%WATER_LEVELS    = 'C'
INPUT%FORCING%WINDS           = 'C'

It gives me the following error messages:


+ srun --label -n 500 ./fv3.exe
  0: Abort with message No such file or directory in file /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/cache/build_stage/spack-stage-parallelio-2.5.10-rdwrsedxim2wqcpndlxgk7wdzc3cdtra/spack-src/src/clib/pioc_support.c at line 2832
  1: Abort with message No such file or directory in file /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/cache/build_stage/spack-stage-parallelio-2.5.10-rdwrsedxim2wqcpndlxgk7wdzc3cdtra/spack-src/src/clib/pioc_support.c at line 2832
  4: Abort with message No such file or directory in file /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/cache/build_stage/spack-stage-parallelio-2.5.10-rdwrsedxim2wqcpndlxgk7wdzc3cdtra/spack-src/src/clib/pioc_support.c at line 2832
  5: Abort with message No such file or directory in file /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/cache/build_stage/spack-stage-parallelio-2.5.10-rdwrsedxim2wqcpndlxgk7wdzc3cdtra/spack-src/src/clib/pioc_support.c at line 2832
  0: Obtained 10 stack frames.
  0: /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/intel/2021.9.0/parallelio-2.5.10-rdwrsed/lib/libpioc.so(print_trace+0x29) [0x14c00cdffba9]
  0: /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/intel/2021.9.0/parallelio-2.5.10-rdwrsed/lib/libpioc.so(piodie+0x40) [0x14c00cdfd870]
  0: /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/intel/2021.9.0/parallelio-2.5.10-rdwrsed/lib/libpioc.so(check_netcdf2+0x1ac) [0x14c00cdfd7fc]
  0: /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/intel/2021.9.0/parallelio-2.5.10-rdwrsed/lib/libpioc.so(PIOc_openfile_retry+0x882) [0x14c00cdfe212]
  0: /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/intel/2021.9.0/parallelio-2.5.10-rdwrsed/lib/libpioc.so(PIOc_openfile+0x16) [0x14c00cdf8e46]
  0: /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/intel/2021.9.0/parallelio-2.5.10-rdwrsed/lib/libpiof.so(piolib_mod_mp_pio_openfile_+0x2b3) [0x14c00cd8e733]
  0: /work2/noaa/nosofs/yunfangs/stmp/yunfangs/FV3_RT/rt_2692414_cdep_12302024/coastal_duck_atm2ww3_intel/./fv3.exe() [0x20f1a67]
  0: /work2/noaa/nosofs/yunfangs/stmp/yunfangs/FV3_RT/rt_2692414_cdep_12302024/coastal_duck_atm2ww3_intel/./fv3.exe() [0x20f4ab1]
  0: /work2/noaa/nosofs/yunfangs/stmp/yunfangs/FV3_RT/rt_2692414_cdep_12302024/coastal_duck_atm2ww3_intel/./fv3.exe() [0x20e7c16]
  0: /work2/noaa/nosofs/yunfangs/stmp/yunfangs/FV3_RT/rt_2692414_cdep_12302024/coastal_duck_atm2ww3_intel/./fv3.exe() [0x207dbbb]
  0: Abort(-1) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
  4: Obtained 10 stack frames.
  4: /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/intel/2021.9.0/parallelio-2.5.10-rdwrsed/lib/libpioc.so(print_trace+0x29) [0x146951114ba9]
  4: /work/noaa/epic/role-epic/spack-stack/hercules/spack-stack-1.6.0/envs/unified-env/install/intel/2021.9.0/parallelio-2.5.10-rdwrsed/lib/libpioc.so(piodie+0x40) [0x146951112870]

@uturuncoglu, do you have any suggestions? The case is located at /work2/noaa/nosofs/yunfangs/stmp/yunfangs/FV3_RT/rt_2692414_cdep_12302024/coastal_duck_atm2ww3_intel
The source code is located at /work2/noaa/nosofs/yunfangs/ufs-weather-model_cdep_docn_12302024/

@janahaddad
Collaborator Author

@yunfangsun, from our meeting today:

  • modify CMEPS to include So_h
  • fix dates in docn.streams
  • check the PET00 log file after the run; it should show So_h being exchanged
  • So_h should also appear in the mediator history file

@yunfangsun
Collaborator

Hi @uturuncoglu ,

I would like to clarify that, for this option in docn_in:

&docn_nml
  datamode = "sstdata"

it tries to find the data mode directly in the folder CDEPS-interface/CDEPS/datm (screenshot attached).

Do I need further modifications to point it to the right data mode when I compile it?

Thank you!

@uturuncoglu
Collaborator

@yunfangsun I think the data mode is fine, but for docn you need to look at the CDEPS-interface/CDEPS/docn folder.

@yunfangsun
Collaborator

Hi @uturuncoglu ,

Could you tell me which file I should look at in the folder /work2/noaa/nosofs/yunfangs/ufs-weather-model_cdep_docn_12302024/CDEPS-interface/CDEPS/docn?

Thank you!

@uturuncoglu
Collaborator

@yunfangsun I think copy all uses docn_datamode_copyall_mod.F90. You could search in ocn_comp_nuopc.F90 to find the others.

@yunfangsun
Collaborator

Using the CDEPS-interface/CDEPS/docn/docn_datamode_copyall_mod.F90 from Joey, adding call dshr_fldList_add(fldsExport, 'So_h' ) to it, and editing the mediator CMEPS-interface/CMEPS/mediator/esmFldsExchange_coastal_mod.F90 as follows:

if (coastal_attr%ocn_present .and. coastal_attr%wav_present) then
      allocate(S_flds(3))
      S_flds = (/'So_u', & ! ocn_current_zonal
                 'So_v', & ! ocn_current_merid
                 'So_h' /) ! sea surface height
      do n = 1,size(S_flds)
         fldname = trim(S_flds(n))
         call addfld_from(compocn, trim(fldname))
         call addfld_to(compwav, trim(fldname))
      end do
      deallocate(S_flds)
    end if

And for WW3, uncommenting the call fldlist_add(fldsToWav_num, fldsToWav, 'So_h' ) in WW3/model/src/wav_import_export.F90.
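For clarity, the two one-line additions referred to above look like this in place (a sketch; surrounding code omitted):

! CDEPS-interface/CDEPS/docn/docn_datamode_copyall_mod.F90 — export sea surface height
call dshr_fldList_add(fldsExport, 'So_h')

! WW3/model/src/wav_import_export.F90 — import sea surface height (previously commented out)
call fldlist_add(fldsToWav_num, fldsToWav, 'So_h')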

The code is now in /work2/noaa/nosofs/yunfangs/ufs-3d/ufs-weather-model_cdep_01052025/
The case now runs with water level exchange in /work2/noaa/nosofs/yunfangs/stmp/yunfangs/FV3_RT/rt_1185130_3d_cdep_cmep_01052025/coastal_duck_atm2ww3_intel_1
