From acce91b4a9af247032bbb928930899e6922f1676 Mon Sep 17 00:00:00 2001 From: Gillian Petro Date: Fri, 14 Jun 2024 14:11:33 -0400 Subject: [PATCH 01/49] update intro w/code additions since v1.2.0 --- doc/source/BackgroundInfo/Introduction.rst | 24 ++++++++-------------- 1 file changed, 9 insertions(+), 15 deletions(-) diff --git a/doc/source/BackgroundInfo/Introduction.rst b/doc/source/BackgroundInfo/Introduction.rst index dd457d03..0f8dfe47 100644 --- a/doc/source/BackgroundInfo/Introduction.rst +++ b/doc/source/BackgroundInfo/Introduction.rst @@ -9,21 +9,15 @@ This User's Guide provides guidance for running the Unified Forecast System the Joint Effort for Data assimilation Integration (:term:`JEDI`) software. The offline UFS Land DA System currently only works with snow data. Thus, this User's Guide focuses primarily on the snow DA process. -Since the last release, developers have added a variety of features: - -* Integration of the UFS Noah-MP land component into the Land DA System for use as an alternative to the Common Community Physics Package (:term:`CCPP`) Noah-MP LSM land driver -* Model forcing options for use with the UFS land component: - - * Provided a new analysis option in the cubed-sphere native grid using :term:`GSWP3` forcing - * Established global land grid-point consistency with the head of the UFS WM baseline test cases (New global land grid point is changed from 18360 to 18322.) - * Added a new sample configuration file (``settings_DA_cycle_gswp3``) - * Included a new ECMWF :term:`ERA5` reanalysis forcing option in the existing vector-to-tile conversion analysis process - -* CTest suite upgrades --- the ERA5 CTests now test the operability of seven major components of Land DA: vector2tile, create_ens, letkfoi_snowda, apply_jediincr, tile2vector, land_driver, and UFS datm_land. 
-* Upgrade of JEDI :term:`DA ` framework to use JEDI Skylab v4.0 (`PR #28 `__)
-* Updates to sample datasets for the release (see the `Land DA data bucket `__)
-* Singularity/Apptainer container (``ubuntu20.04-intel-landda-release-public-v1.2.0``) updates to support the changes described above
-* Documentation updates to reflect the changes above
+Since the |latestr| release, the following capabilities have been added to the Land DA System:
+
+* Added cycled run capability (:land-wflow-repo:`PR #101 `)
+* Provided an automated run option using cron (:land-wflow-repo:`PR #110 `)
+* Added analysis plotting task (:land-wflow-repo:`PR #107 `)
+* Upgraded to JEDI Skylab v7.0 (:land-wflow-repo:`PR #92 `)
+* Upgraded to spack-stack v1.6.0 (:land-wflow-repo:`PR #102 `)
+* Extended container support (:land-wflow-repo:`PR #85 `)
+* Updated directory structure for NCO compliance (:land-wflow-repo:`PR #75 `)
 
 The Land DA System citation is as follows and should be used when presenting results based on research conducted with the Land DA System:

From 56b4483500bc072480dc6e47f9c24ae9d712d7be Mon Sep 17 00:00:00 2001
From: Gillian Petro
Date: Tue, 2 Jul 2024 16:09:45 -0400
Subject: [PATCH 02/49] update tech overview dir structure

---
 .../BackgroundInfo/TechnicalOverview.rst      | 44 ++++++++++---------
 1 file changed, 24 insertions(+), 20 deletions(-)

diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst
index b4d774ab..bddcd19a 100644
--- a/doc/source/BackgroundInfo/TechnicalOverview.rst
+++ b/doc/source/BackgroundInfo/TechnicalOverview.rst
@@ -29,8 +29,8 @@ The Land DA System requires:
 
    * Python
    * :term:`NetCDF`
    * Lmod
-   * `spack-stack `__
-   * `jedi-bundle `__ (Skylab v4.0)
+   * `spack-stack `__ (v1.6.0)
+   * `jedi-bundle `__ (Skylab v7.0)
 
 These software prerequisites are pre-installed in the Land DA :term:`container` and on other Level 1 systems (see :ref:`below ` for details). However, users on non-Level 1 systems will need to install them.
 
@@ -71,6 +71,8 @@ Preconfigured (Level 1) systems for Land DA already have the required external l
 | | intel-oneapi-mpi/2021.8.0 | /opt/jedi-bundle (inside the container) |
 +-----------+-----------------------------------+-----------------------------------------------------------------+
 
+.. COMMENT: Update paths!
+
 Level 2-4 Systems
 ===================
 
@@ -99,12 +101,8 @@ This :term:`umbrella repository` uses Git submodules and an ``app_build.sh`` fil
      - Repository Name
      - Repository Description
      - Authoritative Repository URL
-   * - DA_update
-     - land-DA
-     - Contains scripts and components for performing data assimilation (DA) procedures.
- - https://github.com/ufs-community/land-DA/ - * - *-- add_jedi_incr* - - *-- land-apply_jedi_incr* + * - apply_incr.fd + - land-apply_jedi_incr - Contains code that applies the JEDI-generated DA increment to UFS ``sfc_data`` restart - https://github.com/NOAA-PSL/land-apply_jedi_incr * - ufsLand.fd @@ -143,6 +141,7 @@ The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operatio land-offline_workflow ├── doc ├── (exec) + ├── fix ├── jobs ├── (lib*) ├── modulefiles @@ -150,30 +149,33 @@ The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operatio │ ├── check_release_outputs.sh │ ├── land_analysis__.yaml │ └── run_without_rocoto.sh + ├── scripts ├── sorc + | ├── apply_incr.fd + | | ├── apply_incr_noahmp_snow.f90 + | | └── NoahMPdisag_module.f90 │ ├── (build) │ ├── cmake - │ │ ├── compiler_flags_* - │ │ └── landda_compiler_flags.cmake + │ │ └── compiler_flags_*.cmake │ ├── (conda) - │ ├── DA_update - │ │ ├── add_jedi_incr - │ │ ├── jedi/fv3-jedi - │ │ └── do_LandDA.sh │ ├── test │ ├── tile2tile_converter.fd - │ │ ├── cmake - │ │ └── config │ ├── ufsLand.fd - │ │ └── ccpp-physics + │ │ ├── ccpp-physics + │ │ └── driver │ ├── ufs_model.fd │ ├── vector2tile_converter.fd - │ │ ├── cmake - │ │ └── config │ ├── CMakeLists.txt │ └── app_build.sh + ├── ush + | ├── hofx_analysis_stats.py + | └── letkf_create_ens.py + ├── versions ├── LICENSE - ├── README.md + └── README.md + + +.. COMMENT: Remove to other sections ├── datm_cdeps_lnd_gswp3_rst ├── do_submit_cycle.sh ├── fv3_run @@ -185,6 +187,8 @@ The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operatio ├── settings_DA_* └── submit_cycle.sh +.. COMMENT: Update dir structure! + :numref:`Table %s ` describes the contents of the most important Land DA subdirectories. :numref:`Section %s ` describes the Land DA System components. Users can reference the :nco:`NCO Implementation Standards ` (p. 19) for additional details on repository structure in NCO-compliant repositories. .. _Subdirectories: From c12eb3b9789b0b28382540c5f8c9f1b136af34e1 Mon Sep 17 00:00:00 2001 From: Gillian Petro Date: Tue, 2 Jul 2024 16:10:25 -0400 Subject: [PATCH 03/49] add info on manual vs auto submission --- .../BuildingRunningTesting/BuildRunLandDA.rst | 52 +++++++++++++++---- 1 file changed, 43 insertions(+), 9 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 091f4d6e..15a5173c 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -6,8 +6,10 @@ Land DA Workflow (Hera & Orion) This chapter provides instructions for building and running basic Land DA cases for the Unified Forecast System (:term:`UFS`) Land DA System. Users can choose between two options: - * A Dec. 21, 2019 00z sample case using ERA5 data with the UFS Land Driver (``land_analysis_era5_``) - * A Jan. 3, 2000 00z sample case using GSWP3 data with the UFS Noah-MP land component (``land_analysis_gswp3_``). + * A Dec. 21-22, 2019 00z sample case using ERA5 data with the UFS Land Driver (``land_analysis_era5_``) + * A Jan. 3-4, 2000 00z sample case using GSWP3 data with the UFS Noah-MP land component (``land_analysis_gswp3_``). + +Land DA now includes cycling/restart capabilities to run a multi-day experiment. .. attention:: @@ -207,12 +209,32 @@ Run With Rocoto Users who do not have Rocoto installed on their system can view :numref:`Section %s: Run Without Rocoto `. 
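Before launching the workflow, it can help to confirm that Rocoto is actually available in the current environment. A minimal check on an Lmod-based Level 1 system might look like the following (the module name is an assumption and can vary by site):

.. code-block:: console

   module spider rocoto   # list the Rocoto versions known to Lmod
   module load rocoto     # load the default Rocoto module
   which rocotorun        # confirm the executable is on the PATH

If ``rocotorun`` is not found, use the instructions for running without Rocoto instead.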
+To run the experiment, users can automate job submission via crontab or submit tasks manually via ``rocotorun``. + +Automated Run +--------------- + +To automate task submission, users must be on a system where cron is available. + +.. code-block:: console + + cd parm + conda deactivate # optional + ./launch_rocoto_wflow.sh add + + + +Manual Submission +------------------- + To run the experiment, issue a ``rocotorun`` command from the ``parm`` directory: .. code-block:: console rocotorun -w land_analysis.xml -d land_analysis.db +Users will need to issue the ``rocotorun`` command multiple times. The tasks must run in order, and ``rocotorun`` initiates the next task once its dependencies have completed successfully. Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command. Details on checking experiment status are provided in the :ref:`next section `. + .. _VerifySuccess: Track Experiment Status @@ -228,18 +250,30 @@ If ``rocotorun`` was successful, the ``rocotostat`` command will print a status .. code-block:: console - CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION - ====================================================================================================== - 200001030000 prepexp druby://hfe08:41879 SUBMITTING - 2 0.0 - 200001030000 prepobs - - - - - - 200001030000 prepbmat - - - - - - 200001030000 runana - - - - - - 200001030000 runfcst - - - - - + CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION + ========================================================================================================= + 200001030000 prep_obs 61746064 QUEUED - 1 0.0 + 200001030000 pre_anal druby://10.184.3.62:41973 SUBMITTING - 1 0.0 + 200001030000 analysis - - - - - + 200001030000 post_anal - - - - - + 200001030000 plot_stats - - - - - + 200001030000 forecast - - - - - + ================================================================================================================================ + 200001040000 prep_obs druby://10.184.3.62:41973 SUBMITTING - 1 0.0 + 200001040000 pre_anal - - - - - + 200001040000 analysis - - - - - + 200001040000 post_anal - - - - - + 200001040000 plot_stats - - - - - + 200001040000 forecast - - - - - + +.. COMMENT: Add plotting task info! Users will need to issue the ``rocotorun`` command multiple times. The tasks must run in order, and ``rocotorun`` initiates the next task once its dependencies have completed successfully. Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command. For each task, a log file is generated. These files are stored in ``$LANDDAROOT/com/output/logs/run_``, where ```` is either ``gswp3`` or ``era5``. The experiment has successfully completed when all tasks say SUCCEEDED under STATE. Other potential statuses are: QUEUED, SUBMITTING, RUNNING, and DEAD. Users may view the log files to determine why a task may have failed. +.. COMMENT: Where are log files now?! + .. 
_run-batch-script: Run Without Rocoto From f2639fd5f0cc67fb33b0a403d62ae7cf183fe328 Mon Sep 17 00:00:00 2001 From: Gillian Petro Date: Fri, 5 Jul 2024 16:47:45 -0400 Subject: [PATCH 04/49] several misc changes --- doc/source/BackgroundInfo/Introduction.rst | 3 +- .../BackgroundInfo/TechnicalOverview.rst | 50 +++++++++++-------- .../BuildingRunningTesting/BuildRunLandDA.rst | 8 ++- .../CustomizingTheWorkflow/DASystem.rst | 2 +- doc/source/conf.py | 1 + 5 files changed, 41 insertions(+), 23 deletions(-) diff --git a/doc/source/BackgroundInfo/Introduction.rst b/doc/source/BackgroundInfo/Introduction.rst index 0f8dfe47..e81ea197 100644 --- a/doc/source/BackgroundInfo/Introduction.rst +++ b/doc/source/BackgroundInfo/Introduction.rst @@ -6,7 +6,7 @@ Introduction This User's Guide provides guidance for running the Unified Forecast System (:term:`UFS`) offline Land Data Assimilation (DA) System. Land DA is an offline version of the Noah Multi-Physics (Noah-MP) land surface model (LSM) used in the `UFS Weather Model `__ (WM). Its data assimilation framework uses -the Joint Effort for Data assimilation Integration (:term:`JEDI`) software. The offline UFS Land DA System currently only works with snow data. +the Joint Effort for Data assimilation Integration (:term:`JEDI`) software. Currently, the offline UFS Land DA System only works with snow data. Thus, this User's Guide focuses primarily on the snow DA process. Since the |latestr| release, the following capabilities have been added to the Land DA System: @@ -18,6 +18,7 @@ Since the |latestr| release, the following capabilities have been added to the L * Upgraded to spack-stack v1.6.0 (:land-wflow-repo:`PR #102 `) * Extended container support (:land-wflow-repo:`PR #85 `) * Updated directory structure for NCO compliance (:land-wflow-repo:`PR #75 `) +* Removed land driver from CTest (:land-wflow-repo:`PR #123 `) The Land DA System citation is as follows and should be used when presenting results based on research conducted with the Land DA System: diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst index bddcd19a..345b17bf 100644 --- a/doc/source/BackgroundInfo/TechnicalOverview.rst +++ b/doc/source/BackgroundInfo/TechnicalOverview.rst @@ -1,3 +1,6 @@ +.. role:: raw-html(raw) + :format: html + .. _TechOverview: ********************* @@ -30,7 +33,7 @@ The Land DA System requires: * :term:`NetCDF` * Lmod * `spack-stack `__ (v1.6.0) - * `jedi-bundle `__ (Skylab v7.0) + * `jedi-bundle `__ (|skylabv|) These software prerequisites are pre-installed in the Land DA :term:`container` and on other Level 1 systems (see :ref:`below ` for details). However, users on non-Level 1 systems will need to install them. @@ -51,25 +54,32 @@ Four levels of support have been defined for :term:`UFS` applications, and the L Level 1 Systems ================== -Preconfigured (Level 1) systems for Land DA already have the required external libraries available in a central location via :term:`spack-stack` and the ``jedi-bundle`` (Skylab v4.0). Land DA is expected to build and run out-of-the-box on these systems, and users can download the Land DA code without first installing prerequisite software. With the exception of the Land DA container, users must have access to these Level 1 systems in order to use them. - -.. 
COMMENT: Update spack-stack to 1.5.1 - -+-----------+-----------------------------------+-----------------------------------------------------------------+ -| Platform | Compiler/MPI | spack-stack & jedi-bundle Installations | -+===========+===================================+=================================================================+ -| Hera | intel/2022.1.2 / | /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.3.0 | -| | | | -| | impi/2022.1.2 | /scratch2/NAGAPE/epic/UFS_Land-DA/jedi/jedi-bundle | -+-----------+-----------------------------------+-----------------------------------------------------------------+ -| Orion | intel/2022.1.2 / | /work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.3.0 | -| | | | -| | impi/2022.1.2 | /work/noaa/epic/UFS_Land-DA/jedi/jedi-bundle | -+-----------+-----------------------------------+-----------------------------------------------------------------+ -| Container | intel-oneapi-compilers/2021.8.0 / | /opt/spack-stack/ (inside the container) | -| | | | -| | intel-oneapi-mpi/2021.8.0 | /opt/jedi-bundle (inside the container) | -+-----------+-----------------------------------+-----------------------------------------------------------------+ +Preconfigured (Level 1) systems for Land DA already have the required external libraries available in a central location via :term:`spack-stack` and the ``jedi-bundle`` (|skylabv|). Land DA is expected to build and run out-of-the-box on these systems, and users can download the Land DA code without first installing prerequisite software. With the exception of the Land DA container, users must have access to these Level 1 systems in order to use them. For the most updated information on stack locations, compilers, and MPI, users can check the :land-wflow-repo:`build and run version files ` for their machine of choice. Similarly, users can check the :land-wflow-repo:`build__intel ` file for the machine of their choice. + +.. _stack-compiler-locations: + +.. list-table:: *Software Prerequisites & Locations* + :header-rows: 1 + :widths: 10 20 70 + + * - Platform + - Compiler/MPI + - spack-stack & jedi-bundle Installations + * - Hera + - - intel/2021.5.0 / + - impi/2021.5.1 + - - /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/unified-env-rocky8/install/modulefiles/Core + - /scratch2/NAGAPE/epic/UFS_Land-DA_Dev/jedi_v7 + * - Orion + - - intel/2021.9.0 / + - impi/2021.9.0 + - - /work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/unified-env-rocky9/install/modulefiles/Core + - /work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6 + * - Container + - - intel-oneapi-compilers/2021.8.0 / + - intel-oneapi-mpi/2021.8.0 + - - /opt/spack-stack/ (inside the container) + - /opt/jedi-bundle (inside the container) .. COMMENT: Update paths! diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 15a5173c..1d1ef11c 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -11,6 +11,8 @@ This chapter provides instructions for building and running basic Land DA cases Land DA now includes cycling/restart capabilities to run a multi-day experiment. +.. COMMENT: Remove land driver info? + .. attention:: These steps are designed for use on :ref:`Level 1 ` systems (i.e., Hera and Orion) and may require significant changes on other systems. It is recommended that users on other systems run the containerized version of Land DA. 
Users may reference :numref:`Chapter %s: Containerized Land DA Workflow ` for instructions. @@ -214,7 +216,7 @@ To run the experiment, users can automate job submission via crontab or submit t Automated Run --------------- -To automate task submission, users must be on a system where cron is available. +To automate task submission, users must be on a system where cron is available. On Orion, .. code-block:: console @@ -319,3 +321,7 @@ Check for the output files for each cycle in the experiment directory: ls -l $LANDDAROOT/ptmp/test/com/landda/v1.2.1/landda.YYYYMMDD where ``YYYYMMDD`` is the cycle date. The experiment should generate several restart files. + +Additionally, in the ``plot`` subdirectory, users will find images depicting the results of the ``analysis`` task for each cycle as a scatter plot (``hofx_oma_YYYMMDD_scatter.png``) and as a histogram (``hofx_oma_YYYYMMDD_histogram.png``). The scatter plot depicts a map of snow depth results, where red points indicate _____ and blue points indicate _____. The histogram shows ______. + +.. COMMENT: What do the red/blue points indicate? Fill in above for map & histogram diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst index a7d4053c..f94697dd 100644 --- a/doc/source/CustomizingTheWorkflow/DASystem.rst +++ b/doc/source/CustomizingTheWorkflow/DASystem.rst @@ -4,7 +4,7 @@ Land Data Assimilation System *************************************************** -This chapter describes the configuration of the offline Land :term:`Data Assimilation` (DA) System, which utilizes the UFS Noah-MP component together with the ``jedi-bundle`` (Skylab v4.0) to enable cycled model forecasts. The data assimilation framework applies the Local Ensemble Transform Kalman Filter-Optimal Interpolation (LETKF-OI) algorithm to combine the state-dependent background error derived from an ensemble forecast with the observations and their corresponding uncertainties to produce an analysis ensemble (:cite:t:`HuntEtAl2007`, 2007). +This chapter describes the configuration of the offline Land :term:`Data Assimilation` (DA) System, which utilizes the UFS Noah-MP component together with the ``jedi-bundle`` (|skylabv|) to enable cycled model forecasts. The data assimilation framework applies the Local Ensemble Transform Kalman Filter-Optimal Interpolation (LETKF-OI) algorithm to combine the state-dependent background error derived from an ensemble forecast with the observations and their corresponding uncertainties to produce an analysis ensemble (:cite:t:`HuntEtAl2007`, 2007). Joint Effort for Data Assimilation Integration (JEDI) ******************************************************** diff --git a/doc/source/conf.py b/doc/source/conf.py index d10d8877..69bb3f5f 100644 --- a/doc/source/conf.py +++ b/doc/source/conf.py @@ -50,6 +50,7 @@ .. |latestr| replace:: v1.2.0 .. |tag| replace:: ``ufs-land-da-v1.2.0`` .. |branch| replace:: ``release/public-v1.2.0`` +.. 
|skylabv| replace:: Skylab v7.0 """ # -- Linkcheck options ------------------------------------------------- From 4db3bc4f448082bece120e431c4d46342171a3a8 Mon Sep 17 00:00:00 2001 From: Gillian Petro Date: Mon, 8 Jul 2024 11:50:22 -0400 Subject: [PATCH 05/49] add plotting info --- .../BuildingRunningTesting/BuildRunLandDA.rst | 14 +++++--------- 1 file changed, 5 insertions(+), 9 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 1d1ef11c..73872bbd 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -211,12 +211,12 @@ Run With Rocoto Users who do not have Rocoto installed on their system can view :numref:`Section %s: Run Without Rocoto `. -To run the experiment, users can automate job submission via crontab or submit tasks manually via ``rocotorun``. +To run the experiment, users can automate job submission via :term:`crontab` or submit tasks manually via ``rocotorun``. Automated Run --------------- -To automate task submission, users must be on a system where cron is available. On Orion, +To automate task submission, users must be on a system where :term:`cron` is available. On Orion, cron is only available on the orion-login-1 node, so users will need to work on that node when running cron jobs on Orion. .. code-block:: console @@ -224,7 +224,7 @@ To automate task submission, users must be on a system where cron is available. conda deactivate # optional ./launch_rocoto_wflow.sh add - +To check the status of the experiment, see :numref:`Section %s ` on tracking experiment progress. Manual Submission ------------------- @@ -235,7 +235,7 @@ To run the experiment, issue a ``rocotorun`` command from the ``parm`` directory rocotorun -w land_analysis.xml -d land_analysis.db -Users will need to issue the ``rocotorun`` command multiple times. The tasks must run in order, and ``rocotorun`` initiates the next task once its dependencies have completed successfully. Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command. Details on checking experiment status are provided in the :ref:`next section `. +Users will need to issue the ``rocotorun`` command multiple times. The tasks must be run in order, and ``rocotorun`` initiates the next task once its dependencies have completed successfully. Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command. Details on checking experiment status are provided in the :ref:`next section `. .. _VerifySuccess: @@ -268,8 +268,6 @@ If ``rocotorun`` was successful, the ``rocotostat`` command will print a status 200001040000 plot_stats - - - - - 200001040000 forecast - - - - - -.. COMMENT: Add plotting task info! - Users will need to issue the ``rocotorun`` command multiple times. The tasks must run in order, and ``rocotorun`` initiates the next task once its dependencies have completed successfully. Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command. For each task, a log file is generated. These files are stored in ``$LANDDAROOT/com/output/logs/run_``, where ```` is either ``gswp3`` or ``era5``. The experiment has successfully completed when all tasks say SUCCEEDED under STATE. Other potential statuses are: QUEUED, SUBMITTING, RUNNING, and DEAD. Users may view the log files to determine why a task may have failed. 
@@ -322,6 +320,4 @@ Check for the output files for each cycle in the experiment directory: where ``YYYYMMDD`` is the cycle date. The experiment should generate several restart files. -Additionally, in the ``plot`` subdirectory, users will find images depicting the results of the ``analysis`` task for each cycle as a scatter plot (``hofx_oma_YYYMMDD_scatter.png``) and as a histogram (``hofx_oma_YYYYMMDD_histogram.png``). The scatter plot depicts a map of snow depth results, where red points indicate _____ and blue points indicate _____. The histogram shows ______. - -.. COMMENT: What do the red/blue points indicate? Fill in above for map & histogram +Additionally, in the ``plot`` subdirectory, users will find images depicting the results of the ``analysis`` task for each cycle as a scatter plot (``hofx_oma_YYYMMDD_scatter.png``) and as a histogram (``hofx_oma_YYYYMMDD_histogram.png``). The scatter plot is named OBS-ANA (i.e., Observation Minus Analysis [OMA]), and it depicts a map of snow depth results. Blue points indicate locations where the observed values are less than the analysis values, and red points indicate locations where the observed values are greater than the analysis values. The histogram plots *observation - analysis* values, with the difference in snow depth on the y-axis. From bf343fa5a45864c125063b055a94ae19b442a5e6 Mon Sep 17 00:00:00 2001 From: Gillian Petro Date: Mon, 8 Jul 2024 15:32:23 -0400 Subject: [PATCH 06/49] update plotting info --- .../BuildingRunningTesting/BuildRunLandDA.rst | 29 ++++++++++--------- 1 file changed, 15 insertions(+), 14 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 73872bbd..25103443 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -4,19 +4,19 @@ Land DA Workflow (Hera & Orion) ************************************ -This chapter provides instructions for building and running basic Land DA cases for the Unified Forecast System (:term:`UFS`) Land DA System. Users can choose between two options: +This chapter provides instructions for building and running basic Land DA cases for the Unified Forecast System (:term:`UFS`) Land DA System. Users can choose between two supported options: - * A Dec. 21-22, 2019 00z sample case using ERA5 data with the UFS Land Driver (``land_analysis_era5_``) - * A Jan. 3-4, 2000 00z sample case using GSWP3 data with the UFS Noah-MP land component (``land_analysis_gswp3_``). + * A Jan. 3-4, 2000 00z sample case using GSWP3 data with the UFS Noah-MP land component + * A Dec. 21-22, 2019 00z sample case using ERA5 data with the UFS Land Driver -Land DA now includes cycling/restart capabilities to run a multi-day experiment. -.. COMMENT: Remove land driver info? .. attention:: These steps are designed for use on :ref:`Level 1 ` systems (i.e., Hera and Orion) and may require significant changes on other systems. It is recommended that users on other systems run the containerized version of Land DA. Users may reference :numref:`Chapter %s: Containerized Land DA Workflow ` for instructions. +.. _create-dir: + Create a Working Directory ***************************** @@ -101,13 +101,15 @@ To load the workflow environment, run: where ```` is ``hera`` or ``orion``. +.. _configure-expt: + Modify the Workflow Configuration YAML ======================================== The ``develop`` branch includes two default experiments: - * A Dec. 
21, 2019 00z sample case using the UFS Land Driver. * A Jan. 3, 2000 00z sample case using the UFS Noah-MP land component. + * A Dec. 21, 2019 00z sample case using the UFS Land Driver. Copy the experiment settings into ``land_analysis.yaml``: @@ -120,10 +122,8 @@ where ```` is ``hera`` or ``orion``. Users will need to configure certain elements of their experiment in ``land_analysis.yaml``: - * ``MACHINE:`` A valid machine name (i.e., ``hera`` or ``orion``) * ``ACCOUNT:`` A valid account name. Hera, Orion, and most NOAA RDHPCS systems require a valid account name; other systems may not * ``EXP_BASEDIR:`` The full path to the directory where land-DA_workflow was cloned (i.e., ``$LANDDAROOT``) - * ``JEDI_INSTALL:`` The full path to the system's ``jedi-bundle`` installation * ``FORCING:`` Forcing options; ``gswp3`` or ``era5`` * ``cycledef/spec:`` Cycle specification @@ -138,7 +138,7 @@ Users may configure other elements of an experiment in ``land_analysis.yaml`` if Data ------ -:numref:`Table %s ` shows the locations of pre-staged data on NOAA :term:`RDHPCS` (i.e., Hera and Orion). +:numref:`Table %s ` shows the locations of pre-staged data on NOAA :term:`RDHPCS` (i.e., Hera and Orion). These data locations are already included in the ``land_analysis_*`` files but are provided here for informational purposes. .. _Level1Data: @@ -152,7 +152,7 @@ Data | Orion | /work/noaa/epic/UFS_Land-DA_Dev/inputs | +-----------+--------------------------------------------------+ -Users who have difficulty accessing the data on Hera or Orion may download it according to the instructions in :numref:`Section %s `. Its sub-directories are soft-linked to the ``fix`` directory of the land-DA workflow by the build script ``sorc/app_build.sh``. +Users who have difficulty accessing the data on Hera or Orion may download it according to the instructions in :numref:`Section %s `. Its subdirectories are soft-linked to the ``fix`` directory of the land-DA workflow by the build script ``sorc/app_build.sh``. .. _generate-wflow: @@ -268,12 +268,10 @@ If ``rocotorun`` was successful, the ``rocotostat`` command will print a status 200001040000 plot_stats - - - - - 200001040000 forecast - - - - - -Users will need to issue the ``rocotorun`` command multiple times. The tasks must run in order, and ``rocotorun`` initiates the next task once its dependencies have completed successfully. Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command. For each task, a log file is generated. These files are stored in ``$LANDDAROOT/com/output/logs/run_``, where ```` is either ``gswp3`` or ``era5``. +Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command (whether issued manually or via cron automation). For each task, a log file is generated. These files are stored in ``$LANDDAROOT/ptmp/test/com/output/logs/run_``, where ```` is either ``gswp3`` or ``era5``. The experiment has successfully completed when all tasks say SUCCEEDED under STATE. Other potential statuses are: QUEUED, SUBMITTING, RUNNING, and DEAD. Users may view the log files to determine why a task may have failed. -.. COMMENT: Where are log files now?! - .. _run-batch-script: Run Without Rocoto @@ -320,4 +318,7 @@ Check for the output files for each cycle in the experiment directory: where ``YYYYMMDD`` is the cycle date. The experiment should generate several restart files. 
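For a quick look at what a completed cycle produced, users can list the contents of a cycle directory. For example, for the 2000-01-04 cycle of the GSWP3 case (the date below is illustrative; exact file names depend on the configured cycles and forcing option):

.. code-block:: console

   cd $LANDDAROOT/ptmp/test/com/landda/v1.2.1/landda.20000104
   ls -lh *.nc    # restart and analysis netCDF output for the cycle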
-Additionally, in the ``plot`` subdirectory, users will find images depicting the results of the ``analysis`` task for each cycle as a scatter plot (``hofx_oma_YYYMMDD_scatter.png``) and as a histogram (``hofx_oma_YYYYMMDD_histogram.png``). The scatter plot is named OBS-ANA (i.e., Observation Minus Analysis [OMA]), and it depicts a map of snow depth results. Blue points indicate locations where the observed values are less than the analysis values, and red points indicate locations where the observed values are greater than the analysis values. The histogram plots *observation - analysis* values, with the difference in snow depth on the y-axis. +Plotting Results +----------------- + +Additionally, in the ``plot`` subdirectory, users will find images depicting the results of the ``analysis`` task for each cycle as a scatter plot (``hofx_oma_YYYMMDD_scatter.png``) and as a histogram (``hofx_oma_YYYYMMDD_histogram.png``). The scatter plot is named OBS-ANA (i.e., Observation Minus Analysis [OMA]), and it depicts a map of snow depth results. Blue points indicate locations where the observed values are less than the analysis values, and red points indicate locations where the observed values are greater than the analysis values. The title lists the mean and standard deviation of the absolute value of the OMA values. The histogram plots OMA values on the x-axis and frequency density values on the y-axis. The title of the histogram lists the mean and standard deviation of the real value of the OMA values. From 9ea7ce7f5e77500968adb7b60c481c32d28e91a2 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Mon, 8 Jul 2024 22:01:37 -0400 Subject: [PATCH 07/49] add plot images, add extlinks, misc other --- README.md | 2 +- .../BuildingRunningTesting/BuildRunLandDA.rst | 13 ++++++++++++ .../CustomizingTheWorkflow/DASystem.rst | 20 +++++++++---------- doc/source/CustomizingTheWorkflow/Model.rst | 12 +++++------ doc/source/conf.py | 1 + 5 files changed, 31 insertions(+), 17 deletions(-) diff --git a/README.md b/README.md index 220aeba7..08af7365 100644 --- a/README.md +++ b/README.md @@ -1,6 +1,6 @@ # UFS Offline Land Data Assimilation System -The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA's operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader Weather Enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. +The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA's operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader Weather Enterprise. For more information about the UFS, visit the UFS Portal at https://ufs.epic.noaa.gov/. The UFS includes [multiple applications](https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This repository hosts the source code for the UFS Land Data Assimilation (DA) System. Land DA is an offline version of the Noah Multi-Physics (Noah-MP) land surface model (LSM) used in the UFS Weather Model (WM). 
Its data assimilation framework uses the Joint Effort for Data assimilation Integration (JEDI) software stack, which includes the Object-Oriented Prediction System (OOPS) for the data assimilation algorithm, the Interface for Observation Data Access (IODA) for observation formatting and processing, and the Unified Forward Operator (UFO) for comparing model forecasts and observations.

diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst
index 25103443..7fc5bc3a 100644
--- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst
+++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst
@@ -322,3 +322,16 @@ Plotting Results
 -----------------
 
 Additionally, in the ``plot`` subdirectory, users will find images depicting the results of the ``analysis`` task for each cycle as a scatter plot (``hofx_oma_YYYYMMDD_scatter.png``) and as a histogram (``hofx_oma_YYYYMMDD_histogram.png``). The scatter plot is named OBS-ANA (i.e., Observation Minus Analysis [OMA]), and it depicts a map of snow depth results. Blue points indicate locations where the observed values are less than the analysis values, and red points indicate locations where the observed values are greater than the analysis values. The title lists the mean and standard deviation of the absolute value of the OMA values. The histogram plots OMA values on the x-axis and frequency density values on the y-axis. The title of the histogram lists the mean and standard deviation of the real value of the OMA values.
+
+.. |logo1| image:: https://raw.githubusercontent.com/wiki/ufs-community/land-DA_workflow/images/LandDAScatterPlot.png
+   :alt: Map of snow depth in millimeters (observation minus analysis)
+
+.. |logo2| image:: https://raw.githubusercontent.com/wiki/ufs-community/land-DA_workflow/images/LandDAHistogram.png
+   :alt: Histogram of snow depth in millimeters (observation minus analysis) on the x-axis and frequency density on the y-axis
+
+.. _sample-plots:
+
+.. list-table:: Snow Depth Plots for 2000-01-04
+
+   * - |logo1|
+     - |logo2|
\ No newline at end of file
diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst
index f94697dd..7c7a592d 100644
--- a/doc/source/CustomizingTheWorkflow/DASystem.rst
+++ b/doc/source/CustomizingTheWorkflow/DASystem.rst
@@ -11,15 +11,15 @@ Joint Effort for Data Assimilation Integration (JEDI)
 
 .. attention::
 
-   Users are encouraged to visit the `JEDI Documentation `__. Much of the information in this chapter is drawn directly from there with modifications to clarify JEDI's use specifically in the context of the Land DA System.
+   Users are encouraged to visit the :jedi:`JEDI Documentation `. Much of the information in this chapter is drawn directly from there with modifications to clarify JEDI's use specifically in the context of the Land DA System.
 
 The Joint Effort for Data assimilation Integration (:term:`JEDI`) is a unified and versatile :term:`data assimilation` (DA) system for Earth System Prediction that can be run on a variety of platforms. JEDI is developed by the Joint Center for Satellite Data Assimilation (`JCSDA `__) and partner agencies, including NOAA. The core feature of JEDI is separation of concerns. The data assimilation update, observation selection and processing, and observation operators are all coded with no knowledge of or dependency on each other or on the forecast model.
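One practical consequence of this separation shows up in the YAML configurations discussed later in this chapter: the observation side of an experiment is declared without reference to any model internals. A minimal sketch follows (the names are illustrative of the snow DA case described below, not a complete configuration):

.. code-block:: yaml

   observations:
     observers:
     - obs space:
         name: SnowDepthGHCN              # label for this observation stream
         simulated variables: [totalSnowDepth]
       obs operator:
         name: Identity                   # maps the model state directly to observation space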
The NOAH-MP offline Land DA System uses three JEDI components: * The Object-Oriented Prediction System (:ref:`OOPS `) for the data assimilation algorithm - * The Interface for Observation Data Access (`IODA `__) for the observation formatting and processing - * The Unified Forward Operator (`UFO `__) for comparing model forecasts and observations + * The Interface for Observation Data Access (:jedi:`IODA `) for the observation formatting and processing + * The Unified Forward Operator (:jedi:`UFO `) for comparing model forecasts and observations JEDI's Unified Forward Operator (UFO) links observation operators with the Object Oriented Prediction System (OOPS) to compute a simulated observation given a known model state. It does not restrict observation operators based on model-specific code structures or requirements. The UFO code structure provides generic classes for observation bias correction and quality control. Within this system, IODA converts the observation data into model-specific formats to be ingested by each model's data assimilation system. This involves model-specific data conversion efforts. @@ -31,7 +31,7 @@ A data assimilation experiment requires a ``.yaml`` configuration file that spec JEDI Configuration Files & Parameters ---------------------------------------- -To create the DA experiment, the user should create or modify an experiment-specific configuration ``.yaml`` file. This ``.yaml`` file should contain certain fundamental components: geometry, window begin, window length, background, driver, local ensemble DA, output increment, and observations. These components can be implemented differently for different models and observation types, so they frequently contain distinct parameters and variable names depending on the use case. Therefore, this section of the User's Guide focuses on assisting users with understanding and customizing these top-level configuration items in order to run Land DA experiments. Users may also reference the `JEDI Documentation `__ for additional information. +To create the DA experiment, the user should create or modify an experiment-specific configuration ``.yaml`` file. This ``.yaml`` file should contain certain fundamental components: geometry, window begin, window length, background, driver, local ensemble DA, output increment, and observations. These components can be implemented differently for different models and observation types, so they frequently contain distinct parameters and variable names depending on the use case. Therefore, this section of the User's Guide focuses on assisting users with understanding and customizing these top-level configuration items in order to run Land DA experiments. Users may also reference the :jedi:`JEDI Documentation ` for additional information. Users may find the following example ``GHCN.yaml`` configuration file to be a helpful starting point. A similar file (with user-appropriate modifications) is required by JEDI for snow data assimilation. The following subsections will explain the variables within each top-level item of the ``.yaml`` file. The ``GHCN.yaml`` file for the |latestr| release can be found within the cloned repository at ``DA_update/jedi/fv3-jedi/yaml_files/psl_develop/GHCN.yaml``. @@ -430,7 +430,7 @@ The ``obs error:`` section explains how to calculate the observation error covar Observation filters are used to define Quality Control (QC) filters. 
They have access to observation values and metadata, model values at observation locations, simulated observation value, and their own private data. See :ref:`Observation Filters ` in the JEDI Documentation for more detail. The ``obs filters:`` section contains the following fields: ``filter`` - Describes the parameters of a given QC filter. Valid values include: ``Bounds Check`` | ``Background Check`` | ``Domain Check`` | ``RejectList``. See descriptions in the JEDI's `Generic QC Filters `__ Documentation for more. + Describes the parameters of a given QC filter. Valid values include: ``Bounds Check`` | ``Background Check`` | ``Domain Check`` | ``RejectList``. See descriptions in the JEDI's :jedi:`Generic QC Filters ` Documentation for more. +--------------------+--------------------------------------------------+ | Filter Name | Description | @@ -475,7 +475,7 @@ Observation filters are used to define Quality Control (QC) filters. They have a Maximum value for variables in the filter. ``threshold`` - This variable may function differently depending on the filter it is used in. In the `Background Check Filter `__, an observation is rejected when the difference between the observation value (*y*) and model simulated value (*H(x)*) is larger than the ``threshold`` * *observation error*. + This variable may function differently depending on the filter it is used in. In the :jedi:`Background Check Filter `, an observation is rejected when the difference between the observation value (*y*) and model simulated value (*H(x)*) is larger than the ``threshold`` * *observation error*. ``action`` Indicates which action to take once an observation has been flagged by a filter. See :ref:`Filter Actions ` in the JEDI documentation for a full explanation and list of valid values. @@ -539,7 +539,7 @@ The grid description files appear in :numref:`Section %s ` and ar Observation Data ==================== -Observation data from 2000 and 2019 are provided in NetCDF format for the |latestr| release. Instructions for downloading the data are provided in :numref:`Section %s `, and instructions for accessing the data on :ref:`Level 1 Systems ` are provided in :numref:`Section %s `. Currently, data is taken from the `Global Historical Climatology Network `__ (GHCN), but eventually, data from the U.S. National Ice Center (USNIC) Interactive Multisensor Snow and Ice Mapping System (`IMS `__) will also be available for use. +Observation data from 2000 and 2019 are provided in NetCDF format for the |latestr| release. Instructions for downloading the data are provided in :numref:`Section %s `, and instructions for accessing the data on :ref:`Level 1 Systems ` are provided in :numref:`Section %s `. Currently, data is taken from the `Global Historical Climatology Network `_ (GHCN), but eventually, data from the U.S. National Ice Center (USNIC) Interactive Multisensor Snow and Ice Mapping System (`IMS `_) will also be available for use. Observation Types -------------------- @@ -547,7 +547,7 @@ Observation Types GHCN Snow Depth Files ^^^^^^^^^^^^^^^^^^^^^^^^ -Snow depth observations are taken from the `Global Historical Climatology Network `__, which provides daily climate summaries sourced from a global network of 100,000 stations. NOAA's `NCEI `__ provides access to these snow depth and snowfall measurements through daily-generated individual station ASCII files or GZipped tar files of full-network observations on the NCEI server or Climate Data Online. 
Alternatively, users may acquire yearly tarballs via ``wget``: +Snow depth observations are taken from the `Global Historical Climatology Network `_, which provides daily climate summaries sourced from a global network of 100,000 stations. NOAA's `NCEI `_ provides access to these snow depth and snowfall measurements through daily-generated individual station ASCII files or GZipped tar files of full-network observations on the NCEI server or Climate Data Online. Alternatively, users may acquire yearly tarballs via ``wget``: .. code-block:: console @@ -626,11 +626,11 @@ Observation Location and Processing GHCN ^^^^^^ -GHCN files for 2000 and 2019 are already provided in IODA format for the |latestr| release. :numref:`Table %s ` indicates where users can find data on NOAA :term:`RDHPCS` platforms. Tar files containing the 2000 and 2019 data are located in the publicly-available `Land DA Data Bucket `__. Once untarred, the snow depth files are located in ``/inputs/DA/snow_depth/GHCN/data_proc/{YEAR}``. The 2019 GHCN IODA files were provided by Clara Draper (NOAA PSL). Each file follows the naming convention of ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc``, where ``${YYYY}`` is the four-digit cycle year, ``${MM}`` is the two-digit cycle month, and ``${DD}`` is the two-digit cycle day. +GHCN files for 2000 and 2019 are already provided in IODA format for the |latestr| release. :numref:`Table %s ` indicates where users can find data on NOAA :term:`RDHPCS` platforms. Tar files containing the 2000 and 2019 data are located in the publicly-available `Land DA Data Bucket `_. Once untarred, the snow depth files are located in ``/inputs/DA/snow_depth/GHCN/data_proc/{YEAR}``. The 2019 GHCN IODA files were provided by Clara Draper (NOAA PSL). Each file follows the naming convention of ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc``, where ``${YYYY}`` is the four-digit cycle year, ``${MM}`` is the two-digit cycle month, and ``${DD}`` is the two-digit cycle day. In each experiment, the ``DA_config`` file sets the name of the experiment configuration file. This configuration file is typically named ``settings_DA_test``. Before assimilation, if "GHCN" was specified as the observation type in the ``DA_config`` file, the ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc`` file corresponding to the specified cycle date is soft-linked to the JEDI working directory (``${JEDIWORKDIR}``) with a naming-convention change (i.e., ``GHCN_${YYYY}${MM}${DD}${HH}.nc``). Here, the GHCN IODA file is appended with the cycle hour, ``${HH}`` which is extracted from the ``${STARTDATE}`` variable defined in the relevant ``DA_config`` file. -Prior to ingesting the GHCN IODA files via the LETKF at the DA analysis time, the observations are further quality controlled and checked using ``letkf_land.yaml`` (itself a concatenation of ``GHCN.yaml`` and ``letkfoi_snow.yaml``; see the `GitHub yaml files `__ for more detail). The GHCN-specific observation filters, domain checks, and quality control parameters from ``GHCN.yaml`` ensure that only snow depth observations which meet specific criteria are assimilated (the rest are rejected). The contents of ``GHCN.yaml`` are listed below: +Prior to ingesting the GHCN IODA files via the LETKF at the DA analysis time, the observations are further quality controlled and checked using ``letkf_land.yaml`` (itself a concatenation of ``GHCN.yaml`` and ``letkfoi_snow.yaml``; see the `GitHub yaml files `_ for more detail). 
The GHCN-specific observation filters, domain checks, and quality control parameters from ``GHCN.yaml`` ensure that only snow depth observations which meet specific criteria are assimilated (the rest are rejected). The contents of ``GHCN.yaml`` are listed below: .. code-block:: yaml diff --git a/doc/source/CustomizingTheWorkflow/Model.rst b/doc/source/CustomizingTheWorkflow/Model.rst index 48989911..d9ac52d1 100644 --- a/doc/source/CustomizingTheWorkflow/Model.rst +++ b/doc/source/CustomizingTheWorkflow/Model.rst @@ -15,11 +15,11 @@ Input Files The UFS land model requires multiple input files to run, including static datasets (fix files containing climatological information, terrain, and land use data), initial conditions files, and forcing files. Users may reference the `Community Noah-MP User's -Guide `__ +Guide `_ for a detailed technical description of certain elements of the Noah-MP model. In both the land component and land driver implementations of Noah-MP, static file(s) and initial conditions file(s) specify model parameters. -These files are publicly available via the `Land DA data bucket `__. +These files are publicly available via the `Land DA data bucket `_. Users can download the data and untar the file via the command line: .. _TarFile: @@ -64,7 +64,7 @@ Input Files for the ``DATM`` + ``LND`` Configuration with GSWP3 data With the integration of the UFS Noah-MP land component into the Land DA System in the v1.2.0 release, model forcing options have been enhanced so that users can run the UFS land component (:term:`LND`) with the data atmosphere component (:term:`DATM`). Updates provide a new analysis option on the cubed-sphere native grid using :term:`GSWP3` forcing data to run a single-day experiment for 2000-01-03. An artificial GHCN snow depth observation is provided for data assimilation (see :numref:`Section %s ` for more on GHCN files). The GHCN observations will be extended in the near future. A new configuration setting file is also provided (``settings_DA_cycle_gswp3``). -On Level 1 platforms, the requisite data is pre-staged at the locations listed in :numref:`Section %s `. The data are also publicly available via the `Land DA Data Bucket `__. +On Level 1 platforms, the requisite data is pre-staged at the locations listed in :numref:`Section %s `. The data are also publicly available via the `Land DA Data Bucket `_. .. attention:: @@ -73,7 +73,7 @@ On Level 1 platforms, the requisite data is pre-staged at the locations listed i Forcing Files --------------- -:term:`Forcing files` for the land component configuration come from the Global Soil Wetness Project Phase 3 (`GSWP3 `__) dataset. They are located in the ``inputs/UFS_WM/DATM_GSWP3_input_data`` directory (downloaded :ref:`above `). +:term:`Forcing files` for the land component configuration come from the Global Soil Wetness Project Phase 3 (`GSWP3 `_) dataset. They are located in the ``inputs/UFS_WM/DATM_GSWP3_input_data`` directory (downloaded :ref:`above `). .. code-block:: console @@ -96,7 +96,7 @@ Noah-MP Initial Conditions The offline Land DA System currently only supports snow DA. The initial conditions files include the initial state variables that are required for the UFS land snow DA to begin a cycling run. The data must be provided in :term:`netCDF` format. -By default, on Level 1 systems and in the Land DA data bucket, the initial conditions files are located at ``inputs/UFS_WM/NOAHMP_IC`` (downloaded :ref:`above `). Each file corresponds to one of the six tiles of the `global FV3 grid `__. 
+By default, on Level 1 systems and in the Land DA data bucket, the initial conditions files are located at ``inputs/UFS_WM/NOAHMP_IC`` (downloaded :ref:`above `). Each file corresponds to one of the six tiles of the `global FV3 grid `_. .. code-block:: console @@ -184,7 +184,7 @@ the static file (``ufs-land_C96_static_fields.nc``), the initial conditions file (``ufs-land_C96_init_*.nc``), and the model configuration file (``ufs-land.namelist.noahmp``). These files and their parameters are described in the following subsections. -They are publicly available via the `Land DA Data Bucket `__. +They are publicly available via the `Land DA Data Bucket `_. Static File (``ufs-land_C96_static_fields.nc``) ------------------------------------------------- diff --git a/doc/source/conf.py b/doc/source/conf.py index 69bb3f5f..6b21a2a6 100644 --- a/doc/source/conf.py +++ b/doc/source/conf.py @@ -121,6 +121,7 @@ def setup(app): extlinks_detect_hardcoded_links = True extlinks = {'github-docs': ('https://docs.github.com/en/%s', '%s'), + 'jedi': ('https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/1.7.0/%s', '%s'), 'nco': ('https://www.nco.ncep.noaa.gov/idsb/implementation_standards/%s', '%s'), 'rst': ('https://www.sphinx-doc.org/en/master/usage/restructuredtext/%s', '%s'), 'rtd': ('https://readthedocs.org/projects/land-da-workflow/%s', '%s'), From e45555aec1c006c4a82484de2bc7804a1dd7c23c Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Tue, 9 Jul 2024 23:22:41 -0400 Subject: [PATCH 08/49] add Rocoto chapter, misc edits --- doc/source/BackgroundInfo/Introduction.rst | 2 +- .../BackgroundInfo/TechnicalOverview.rst | 25 +-- .../BuildingRunningTesting/BuildRunLandDA.rst | 2 + .../BuildingRunningTesting/TestingLandDA.rst | 4 +- doc/source/Reference/Rocoto.rst | 209 ++++++++++++++++++ doc/source/Reference/index.rst | 1 + doc/source/conf.py | 2 +- 7 files changed, 220 insertions(+), 25 deletions(-) create mode 100644 doc/source/Reference/Rocoto.rst diff --git a/doc/source/BackgroundInfo/Introduction.rst b/doc/source/BackgroundInfo/Introduction.rst index e81ea197..4e09e26f 100644 --- a/doc/source/BackgroundInfo/Introduction.rst +++ b/doc/source/BackgroundInfo/Introduction.rst @@ -5,7 +5,7 @@ Introduction **************** This User's Guide provides guidance for running the Unified Forecast System -(:term:`UFS`) offline Land Data Assimilation (DA) System. Land DA is an offline version of the Noah Multi-Physics (Noah-MP) land surface model (LSM) used in the `UFS Weather Model `__ (WM). Its data assimilation framework uses +(:term:`UFS`) offline Land Data Assimilation (DA) System. Land DA is an offline version of the Noah Multi-Physics (Noah-MP) land surface model (LSM) used in the `UFS Weather Model `_ (WM). Its data assimilation framework uses the Joint Effort for Data assimilation Integration (:term:`JEDI`) software. Currently, the offline UFS Land DA System only works with snow data. Thus, this User's Guide focuses primarily on the snow DA process. 
diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst index 345b17bf..6271650e 100644 --- a/doc/source/BackgroundInfo/TechnicalOverview.rst +++ b/doc/source/BackgroundInfo/TechnicalOverview.rst @@ -32,12 +32,12 @@ The Land DA System requires: * Python * :term:`NetCDF` * Lmod - * `spack-stack `__ (v1.6.0) - * `jedi-bundle `__ (|skylabv|) + * `spack-stack `_ (v1.6.0) + * `jedi-bundle `_ (|skylabv|) These software prerequisites are pre-installed in the Land DA :term:`container` and on other Level 1 systems (see :ref:`below ` for details). However, users on non-Level 1 systems will need to install them. -Before using the Land DA container, users will need to install `Singularity/Apptainer `__ and an **Intel** MPI (available `free here `__). +Before using the Land DA container, users will need to install `Singularity/Apptainer `_ and an **Intel** MPI (available `free here `_). .. _LevelsOfSupport: @@ -81,8 +81,6 @@ Preconfigured (Level 1) systems for Land DA already have the required external l - - /opt/spack-stack/ (inside the container) - /opt/jedi-bundle (inside the container) -.. COMMENT: Update paths! - Level 2-4 Systems =================== @@ -184,21 +182,6 @@ The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operatio ├── LICENSE └── README.md - -.. COMMENT: Remove to other sections - ├── datm_cdeps_lnd_gswp3_rst - ├── do_submit_cycle.sh - ├── fv3_run - ├── incdate.sh - ├── land_mods - ├── module_check.sh - ├── release.environment - ├── run_container_executable.sh - ├── settings_DA_* - └── submit_cycle.sh - -.. COMMENT: Update dir structure! - :numref:`Table %s ` describes the contents of the most important Land DA subdirectories. :numref:`Section %s ` describes the Land DA System components. Users can reference the :nco:`NCO Implementation Standards ` (p. 19) for additional details on repository structure in NCO-compliant repositories. .. _Subdirectories: @@ -213,6 +196,8 @@ The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operatio - Repository documentation * - exec - Binary executables + * - fix + - Location of fix/static files * - jobs - :term:`J-job ` scripts launched by Rocoto * - lib diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 7fc5bc3a..130450aa 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -182,6 +182,8 @@ Workflow Overview Each Land DA experiment includes multiple tasks that must be run in order to satisfy the dependencies of later tasks. These tasks are housed in the :term:`J-job ` scripts contained in the ``jobs`` directory. +.. _WorkflowTasksTable: + .. list-table:: *J-job Tasks in the Land DA Workflow* :header-rows: 1 diff --git a/doc/source/BuildingRunningTesting/TestingLandDA.rst b/doc/source/BuildingRunningTesting/TestingLandDA.rst index ba7d3cd6..434c4185 100644 --- a/doc/source/BuildingRunningTesting/TestingLandDA.rst +++ b/doc/source/BuildingRunningTesting/TestingLandDA.rst @@ -31,7 +31,7 @@ This will submit an interactive job, load the appropriate modulefiles, and run t Tests ******* -The ERA5 CTests test the operability of seven major elements of the Land DA System: ``vector2tile``, ``create_ens``, ``letkfoi_snowda``, ``apply_jediincr``, ``tile2vector``, ``land_driver``, and ``ufs_datm_land``. The tests and their dependencies are listed in the ``land-DA_workflow/test/CMakeLists.txt`` file. 
Currently, the CTests are only run on Hera and Orion; they cannot yet be run via container.
 
 .. list-table:: *Land DA CTests*
    :widths: 20 50
@@ -49,7 +49,5 @@ The ERA5 CTests test the operability of seven major elements of the Land DA Syst
      - Tests the ability to add a JEDI increment.
    * - ``test_tile2vector``
      - Tests the tile-to-vector function for use in ``ufs-land-driver``
-   * - ``test_land_driver``
-     - Tests proper functioning of ``ufs-land-driver``
    * - ``test_ufs_datm_land``
      - Tests proper functioning of the UFS land model (``ufs-datm-lnd``)
diff --git a/doc/source/Reference/Rocoto.rst b/doc/source/Reference/Rocoto.rst
new file mode 100644
index 00000000..8bcb1b43
--- /dev/null
+++ b/doc/source/Reference/Rocoto.rst
@@ -0,0 +1,209 @@
+.. _RocotoInfo:
+
+==================================
+Rocoto Introductory Information
+==================================
+The tasks in the Land DA System are typically run using the Rocoto Workflow Manager (see :numref:`Table %s ` for default tasks). Rocoto is a Ruby program that communicates with the batch system on an :term:`HPC` system to run and manage dependencies between the tasks. Rocoto submits jobs to the HPC batch system as the task dependencies allow and runs one instance of the workflow for a set of user-defined :term:`cycles `. More information about Rocoto can be found on the `Rocoto Wiki `_.
+
+The Land DA workflow is defined in a Jinja-enabled Rocoto XML template called ``land_analysis.xml``, which is generated using the contents of ``land_analysis.yaml`` as input to the Unified Workflow's :uw:`Rocoto tool `. Both files reside in the ``parm`` directory. The completed XML file contains the workflow task names, parameters needed by the job scheduler, and task interdependencies.
+
+There are a number of Rocoto commands available to run and monitor the workflow; users can find more information in the complete `Rocoto documentation `_. Descriptions and examples of commonly used commands are discussed below.
+
+.. _RocotoRunCmd:
+
+rocotorun
+==========
+
+The ``rocotorun`` command is used to run the workflow by submitting tasks to the batch system. It will automatically resubmit failed tasks and can recover from system outages without user intervention. The command takes the following format:
+
+.. code-block:: console
+
+   rocotorun -w /path/to/workflow/xml/file -d /path/to/workflow/database/file -v 10
+
+where
+
+* ``-w`` specifies the name of the workflow definition file. This must be an XML file.
+* ``-d`` specifies the name of the database file that stores the state of the workflow. The database file is a binary file created and used only by Rocoto. It does not need to exist when the command is initially run.
+* ``-v`` (optional) specifies the level of verbosity. If no level is specified, a level of 1 is used.
+
+From the ``parm`` directory, the ``rocotorun`` command for the workflow would be:
+
+.. code-block:: console
+
+   rocotorun -w land_analysis.xml -d land_analysis.db
+
+Users will need to include the absolute or relative path to these files when running the command from another directory.
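+
+For example, a minimal sketch (assuming a hypothetical clone location of ``/path/to/land-DA_workflow``; substitute the actual path on the user's system) would be:
+
+.. code-block:: console
+
+   rocotorun -w /path/to/land-DA_workflow/parm/land_analysis.xml -d /path/to/land-DA_workflow/parm/land_analysis.db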
+
+It is important to note that the ``rocotorun`` process is iterative; the command must be executed many times before the entire workflow is completed, usually every 1-10 minutes. This command can be placed in the user's :term:`crontab`, and cron will call it with the specified frequency. More information on this command can be found in the `Rocoto documentation `_.
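+
+A sample crontab entry (a sketch only; the five-minute interval and the ``/path/to/land-DA_workflow`` location are illustrative assumptions) might look like:
+
+.. code-block:: console
+
+   */5 * * * * cd /path/to/land-DA_workflow/parm && rocotorun -w land_analysis.xml -d land_analysis.db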
+
+The first time the ``rocotorun`` command is executed for a workflow, the files ``land_analysis.db`` and ``land_analysis_lock.db`` are created. There is usually no need for the user to modify these files. Each time the ``rocotorun`` command is executed, the last known state of the workflow is read from the ``land_analysis.db`` file, the batch system is queried, jobs are submitted for tasks whose dependencies have been satisfied, and the current state of the workflow is saved in ``land_analysis.db``. If there is a need to relaunch
+the workflow from scratch, both database files can be deleted, and the workflow can be run by executing the ``rocotorun`` command
+or the launch script (``launch_rocoto_wflow.sh``) multiple times.
+
+.. _RocotoStatCmd:
+
+rocotostat
+===========
+
+``rocotostat`` is a tool for querying the status of tasks in an active Rocoto workflow. Once the workflow has been started with the ``rocotorun`` command, users can check the status of the workflow using the ``rocotostat`` command:
+
+.. code-block:: console
+
+   rocotostat -w /path/to/workflow/xml/file -d /path/to/workflow/database/file
+
+Concretely, this will look like:
+
+.. code-block:: console
+
+   rocotostat -w land_analysis.xml -d land_analysis.db
+
+Executing this command will generate a workflow status table similar to the following:
+
+.. code-block:: console
+
+          CYCLE         TASK                       JOBID        STATE   EXIT STATUS   TRIES   DURATION
+   =========================================================================================================
+   200001030000     prep_obs                    61746064       QUEUED             -       1        0.0
+   200001030000     pre_anal   druby://10.184.3.62:41973   SUBMITTING             -       1        0.0
+   200001030000     analysis                           -            -             -       -          -
+   200001030000    post_anal                           -            -             -       -          -
+   200001030000   plot_stats                           -            -             -       -          -
+   200001030000     forecast                           -            -             -       -          -
+   ================================================================================================================================
+   200001040000     prep_obs   druby://10.184.3.62:41973   SUBMITTING             -       1        0.0
+   200001040000     pre_anal                           -            -             -       -          -
+   200001040000     analysis                           -            -             -       -          -
+   200001040000    post_anal                           -            -             -       -          -
+   200001040000   plot_stats                           -            -             -       -          -
+   200001040000     forecast                           -            -             -       -          -
+
+This table indicates that the ``prep_obs`` task for cycle 200001030000 was sent to the batch system and is now queued, while the ``pre_anal`` task for cycle 200001030000 and the ``prep_obs`` task for cycle 200001040000 are currently being submitted to the batch system.
+
+Note that issuing a ``rocotostat`` command without an intervening ``rocotorun`` command will not result in an updated workflow status table; it will print out the same table. It is the ``rocotorun`` command that updates the workflow database file (in this case ``land_analysis.db``, located in ``parm``). The ``rocotostat`` command reads the database file and prints the table to the screen. To see an updated table, the ``rocotorun`` command must be executed first, followed by the ``rocotostat`` command.
+
+After issuing the ``rocotorun`` command several times (over the course of several minutes or longer, depending on the grid size and computational resources available), the output of the ``rocotostat`` command should look like this:
+
+.. code-block:: console
+
+          CYCLE         TASK      JOBID       STATE   EXIT STATUS   TRIES   DURATION
+   ============================================================================================
+   200001030000     prep_obs   18347451   SUCCEEDED             0       1        3.0
+   200001030000     pre_anal   18347452   SUCCEEDED             0       1        5.0
+   200001030000     analysis   18347525   SUCCEEDED             0       1       65.0
+   200001030000    post_anal   18347558   SUCCEEDED             0       1       10.0
+   200001030000   plot_stats   18347559   SUCCEEDED             0       1       73.0
+   200001030000     forecast   18347562   SUCCEEDED             0       1      103.0
+   ==========================================================================================
+   200001040000     prep_obs   18347453   SUCCEEDED             0       1        3.0
+   200001040000     pre_anal   18347568   SUCCEEDED             0       1        4.0
+   200001040000     analysis   18347584   SUCCEEDED             0       1       70.0
+   200001040000    post_anal   18347591   SUCCEEDED             0       1        4.0
+   200001040000   plot_stats   18347592   SUCCEEDED             0       1       48.0
+   200001040000     forecast   18347593     RUNNING             -       1        0.0
+
+When the workflow runs to completion, all tasks will be marked as SUCCEEDED. The log file for each task is located in ``$LANDDAROOT/ptmp/test/com/output/logs/run_``, where ```` is either ``gswp3`` or ``era5``. If any task fails, the corresponding log file can be checked for error messages. Optional arguments for the ``rocotostat`` command can be found in the `Rocoto documentation `_.
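+
+For instance, to scan a failed task's log for error messages (a sketch; the ``gswp3`` forcing and the log file name are illustrative assumptions based on the examples in this chapter), users might run:
+
+.. code-block:: console
+
+   grep -i error $LANDDAROOT/ptmp/test/com/output/logs/run_gswp3/analysis_2000010400.log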
+
+.. _rocotocheck:
+
+rocotocheck
+============
+Sometimes, issuing a ``rocotorun`` command will not cause the next task to launch. ``rocotocheck`` is a tool that can be used to query detailed information about a task or cycle in the Rocoto workflow. To determine why a particular task has not been submitted, the ``rocotocheck`` command can be used from the ``parm`` directory as follows:
+
+.. code-block:: console
+
+   rocotocheck -w land_analysis.xml -d land_analysis.db -c -t
+
+where
+
+* ``-c`` is the cycle to query in YYYYMMDDHHmm format.
+* ``-t`` is the task name (e.g., ``prep_obs``, ``analysis``, ``forecast``).
+
+The cycle and task names appear in the first and second columns of the table output by ``rocotostat``. Users will need to include the absolute or relative path to the workflow XML and database files when running the command from another directory.
+
+A specific example is:
+
+.. code-block:: console
+
+   rocotocheck -w /Users/John.Doe/landda/land-DA_workflow/parm/land_analysis.xml -d /Users/John.Doe/landda/land-DA_workflow/parm/land_analysis.db -v 10 -c 200001040000 -t analysis
+
+Running ``rocotocheck`` will result in output similar to the following:
+
+.. code-block:: console
+   :emphasize-lines: 9,34,35,47
+
+   Task: analysis
+     account: epic
+     command: /work/noaa/epic/$USER/landda/land-DA_workflow/parm/task_load_modules_run_jjob.sh "analysis" "/work/noaa/epic/$USER/landda/land-DA_workflow" "orion"
+     cores: 6
+     cycledefs: cycled
+     final: false
+     jobname: analysis
+     join: /work/noaa/epic/$USER/landda/ptmp/test/com/output/logs/run_gswp3/analysis_2000010400.log
+     maxtries: 2
+     name: analysis
+     queue: batch
+     throttle: 9999999
+     walltime: 00:15:00
+     environment
+       ACCOUNT ==> epic
+       ATMOS_FORC ==> gswp3
+       COMROOT ==> /work/noaa/epic/$USER/landda/ptmp/test/com
+       DATAROOT ==> /work/noaa/epic/$USER/landda/ptmp/test/tmp
+       DAtype ==> letkfoi_snow
+       EXP_NAME ==> LETKF
+       HOMElandda ==> /work/noaa/epic/$USER/landda/land-DA_workflow
+       JEDI_INSTALL ==> /work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6
+       KEEPDATA ==> YES
+       MACHINE ==> orion
+       NPROCS_ANALYSIS ==> 6
+       OBS_TYPES ==> GHCN
+       PDY ==> 20000104
+       RES ==> 96
+       SCHED ==> slurm
+       SNOWDEPTHVAR ==> snwdph
+       TSTUB ==> oro_C96.mx100
+       cyc ==> 00
+       model_ver ==> v1.2.1
+     dependencies
+       pre_anal of cycle 200001040000 is SUCCEEDED
+
+   Cycle: 200001040000
+     Valid for this task: YES
+     State: active
+     Activated: 2024-07-05 17:44:40 UTC
+     Completed: -
+     Expired: -
+
+   Job: 18347584
+     State: DEAD (FAILED)
+     Exit Status: 1
+     Tries: 2
+     Unknown count: 0
+     Duration: 70.0
+
+This output shows that although all dependencies for this task are satisfied (see the dependencies section, highlighted above), it cannot run because its ``maxtries`` value (highlighted) is 2. Rocoto will attempt to launch it at most 2 times, and it has already been tried 2 times (note the ``Tries`` value, also highlighted).
+
+The output of the ``rocotocheck`` command is often useful in determining whether the dependencies for a given task have been met. If not, the dependencies section in the output of ``rocotocheck`` will indicate this by stating that a dependency "is NOT satisfied".
+
+rocotorewind
+=============
+``rocotorewind`` is a tool that attempts to undo the effects of running a task. It is commonly used to rerun part of a workflow that has failed. If a task fails to run (its STATE is DEAD) and needs to be restarted, the ``rocotorewind`` command can be used to resubmit it. The command line options are the same as those described for ``rocotocheck`` (in :numref:`Section %s `), and the general usage statement looks like this:
+
+.. code-block:: console
+
+   rocotorewind -w /path/to/workflow/xml/file -d /path/to/workflow/database/file -c -t
+
+Running this command will edit the Rocoto database file ``land_analysis.db`` to remove evidence that the job has been run. ``rocotorewind`` is recommended over ``rocotoboot`` for restarting a task, since ``rocotoboot`` will force a specific task to run, ignoring all dependencies and throttle limits. The throttle limit, denoted by the variable ``cyclethrottle`` in the ``land_analysis.xml`` file, limits how many cycles can be active at one time. An example of how to use the ``rocotorewind`` command to rerun the forecast task from ``parm`` is:
+
+.. code-block:: console
+
+   rocotorewind -w land_analysis.xml -d land_analysis.db -v 10 -c 200001040000 -t forecast
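+
+After rewinding, users can confirm the task's new state by issuing ``rocotorun`` again followed by ``rocotostat`` (a sketch; run from the ``parm`` directory):
+
+.. code-block:: console
+
+   rocotorun -w land_analysis.xml -d land_analysis.db
+   rocotostat -w land_analysis.xml -d land_analysis.db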
+
+rocotoboot
+===========
+``rocotoboot`` will force a specific task of a cycle in a Rocoto workflow to run. All dependencies and throttle limits are ignored, and it is generally recommended to use ``rocotorewind`` instead. An example of how to use this command to rerun the ``prep_obs`` task from ``parm`` is:
+
+.. code-block:: console
+
+   rocotoboot -w land_analysis.xml -d land_analysis.db -v 10 -c 200001040000 -t prep_obs
+
diff --git a/doc/source/Reference/index.rst b/doc/source/Reference/index.rst
index 82b5f7b5..44baaef0 100644
--- a/doc/source/Reference/index.rst
+++ b/doc/source/Reference/index.rst
@@ -6,5 +6,6 @@ Reference
 .. toctree::
    :maxdepth: 3
 
+   Rocoto
    FAQ
    Glossary
diff --git a/doc/source/conf.py b/doc/source/conf.py
index 6b21a2a6..7f74ca28 100644
--- a/doc/source/conf.py
+++ b/doc/source/conf.py
@@ -113,7 +113,6 @@ def setup(app):
 intersphinx_mapping = {
     'jedi': ('https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/1.7.0', None),
     'spack-stack': ('https://spack-stack.readthedocs.io/en/1.3.0/', None),
-    'ufs-wm': ('https://ufs-weather-model.readthedocs.io/en/develop/', None),
     'gswp3': ('https://hydro.iis.u-tokyo.ac.jp/GSWP3/', None),
 }
@@ -127,5 +126,6 @@
     'rtd': ('https://readthedocs.org/projects/land-da-workflow/%s', '%s'),
     'land-wflow-repo': ('https://github.com/ufs-community/land-DA_workflow/%s', '%s'),
     'land-wflow-wiki': ('https://github.com/ufs-community/land-DA_workflow/wiki/%s','%s'),
+    'ufs-wm': ('https://ufs-weather-model.readthedocs.io/en/develop/%s', '%s'),
     'uw': ('https://uwtools.readthedocs.io/en/main/%s', '%s'),
 }

From 02f3246fdc6d231eb4aa8be03863fb11ffb49efc Mon Sep 17 00:00:00 2001
From: gspetro-NOAA
Date: Wed, 10 Jul 2024 22:30:59 -0400
Subject: [PATCH 09/49] rm workflow section in DA chapter

---
 .../CustomizingTheWorkflow/DASystem.rst       | 223 ------------------
 1 file changed, 223 deletions(-)

diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst
index 7c7a592d..ac742e93 100644
--- a/doc/source/CustomizingTheWorkflow/DASystem.rst
+++ b/doc/source/CustomizingTheWorkflow/DASystem.rst
@@ -893,226 +893,3 @@ Example of ``${FILEDATE}.coupler.res``:
      2019 12 22 0 0 0 Model start time: year, month, day, hour, minute, second
      2019 12 22 0 0 0 Current model time: year, month, day, hour, minute, second
 
-DA Workflow Overview
-************************
-
-The cycling Noah-MP offline DA run is initiated using ``do_submit_cycle.sh`` to call the ``submit_cycle.sh`` script. ``submit_cycle.sh`` calls a third script (``do_landDA.sh``) if DA has been activated in the experiment.
-
-.. note::
-
-   The offline Noah-MP model runs in vector space, whereas a cycling Noah-MP offline DA job uses JEDI's tiled cubed-sphere (:term:`FV3`) format. :numref:`Section %s ` describes the Vector-to-Tile Converter that maps between these two formats.
-
-``do_submit_cycle.sh``
-=========================
-
-The ``do_submit_cycle.sh`` script sets up the cycling job based on the user's input settings. :numref:`Figure %s ` illustrates the steps in this process.
-
-.. _DoSubmitCyclePng:
-
-.. figure:: https://github.com/ufs-community/land-DA_workflow/wiki/do_submit_cycle.png
-   :alt: The do submit cycle shell script starts by loading configuration files and modules. Then it proceeds to set executables, read in dates, compute forecast run variables, and setup work and output directories for the model. If a restart directory is available in the model output directory, it creates the dates file and submits the submit cycle shell script. If there is no output file, the script will try to copy restart files from an initial conditions directory before creating the dates file and submitting the submit cycle shell script.
- - *Flowchart of 'do_submit_cycle.sh'* - -First, ``do_submit_cycle.sh`` reads in a configuration file for the cycle settings. This file contains the information required to run the cycle: the experiment name, start date, end date, the paths of the working directory (i.e., ``workdir``) and output directories, the length of each forecast, atmospheric forcing data, the Finite-Volume Cubed-Sphere Dynamical Core (:term:`FV3`) resolution and its paths, the number of cycles per job, the directory with initial conditions, a namelist file for running Land DA, and different DA options. Then, the required modules are loaded, and some executables are set for running the cycle. The restart frequency and running day/hours are computed from the inputs, and directories are created for running DA and saving the DA outputs. If restart files are not in the experiment output directory, the script will try to copy the restart files from the ``ICSDIR`` directory, which should contain initial conditions files if restart files are not available. Finally, the script creates the dates file (``analdates.sh``) and submits the ``submit_cycle.sh`` script, which is described in detail in the next section. - - -``submit_cycle.sh`` -====================== - -The ``submit_cycle.sh`` script first exports the required paths and loads the required modules. Then, it reads the dates file and runs through all the steps for submitting a cycle if the count of dates is less than the number of cycles per job (see :numref:`Figure %s `). - -.. _SubmitCyclePng: - -.. figure:: https://github.com/ufs-community/land-DA_workflow/wiki/submit_cycle.png - :alt: The submit cycle shell script starts by exporting paths and loading required modules. Then it starts a loop for the cycle. For each cycle in the experiment, it gets the data assimilation settings and date/time info; computes the restart frequency, run days, and run hours; and copies the restarts into the work directory. If the user is running a DA experiment, the script updates and submits the vector to tile namelists and submits the snow data assimilation. Then it submits the tile to vector namelists and saves the analysis restart. Regardless of whether DA is being used, the script runs the forecast model, updates the model namelist, and submits the land surface model. It checks existing model output, and then the loop ends. If there are more cycles to run, the script will run through this loop until they are complete. - - *Flowchart of 'submit_cycle.sh'* - -As the script loops through the steps in the process for each cycle, it reads in the DA settings and selects a run type --- either DA or ``openloop`` (which skips DA). Required DA settings include: DA algorithm choice, directory paths for JEDI, Land_DA (where the ``do_landDA.sh`` script is located), JEDI's input observation options, DA window length, options for constructing ``.yaml`` files, etc. - -Next, the system designates work and output directories and copies restart files into the working directory. If the DA option is selected, the script calls the ``vector2tile`` function and tries to convert the format of the Noah-MP model in vector space to the JEDI tile format used in :term:`FV3` cubed-sphere space. After the ``vector2tile`` is done, the script calls the data assimilation job script (``do_landDA.sh``) and runs this script. Then, the ``tile2vector`` function is called and converts the JEDI output tiles back to vector format. The converted vector outputs are saved, and the forecast model is run. 
Then, the script checks the existing model outputs. Finally, if the current date is less than the end date, this same procedure will be processed for the next cycle. - -.. note:: - - The |latestr| release of Land DA does not support ensemble runs. Thus, the first ensemble member (``mem000``) is the only ensemble member. - -Here is an example of a configuration settings file, ``settings_DA_cycle_era5``, for the ``submit_cycle.sh`` script: - -.. code-block:: console - - # experiment name - export exp_name=DA_ERA5_test - #export BASELINE=hera.internal - - STARTDATE=2019122100 - ENDDATE=2019122200 - - # Get commmon variables - source ./release.environment - ############################ - - #forcing options: gswp3, era5 - export atmos_forc=era5 - - # for LETKF, this is size of ensemble. - # for LETKF-OI pseudo ensemble, or non-ensemble runs use 1 - export ensemble_size=1 - - # length of each forecast - export FCSTHR=24 - - #FV3 resolution - export RES=96 - if [[ $BASELINE =~ 'hera.internal' ]]; then - export TPATH=/scratch2/NCEPDEV/land/data/fix/C96.mx100_frac/ - else - export TPATH="$LANDDA_INPUTS/forcing/${atmos_forc}/orog_files/" - fi - export TSTUB="oro_C96.mx100" # file stub for orography files in $TPATH - # oro_C${RES} for atm only, oro_C${RES}.mx100 for atm/ocean. - - # number of cycles to submit in a single job - export cycles_per_job=1 - - # directory with initial conditions - # can find some here:/scratch2/BMC/gsienkf/Clara.Draper/DA_test_cases/land-offline_workflow/offline_ICS/single - export ICSDIR=$LANDDAROOT/inputs/forcing/${atmos_forc}/orog_files/ - - # namelist for do_landDA.sh - # set to "openloop" to not call do_landDA.sh - export DA_config="settings_DA_test" - - # if want different DA at different times, list here. - export DA_config00=${DA_config} - export DA_config06=${DA_config} - export DA_config12=${DA_config} - export DA_config18=${DA_config} - - -Parameters for ``submit_cycle.sh`` -------------------------------------- - -``exp_name`` - Specifies the name of experiment. - -``STARTDATE`` - Specifies the experiment start date. The form is YYYYMMDDHH, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour. - -``ENDDATE`` - Specifies the experiment end date. The form is YYYYMMDDHH, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour. - -``WORKDIR`` - Specifies the path to a temporary directory from which the experiment is run. - -``OUTDIR`` - Specifies the path to a directory where experiment output is saved. - -``ensemble_size`` - Specifies the size of the ensemble (i.e., number of ensemble members). Use ``1`` for non-ensemble runs. - -``FCSTHR`` - Specifies the length of each forecast in hours. - -``atmos_forc`` - Specifies the name of the atmospheric forcing data. Valid values include: ``gdas`` | ``era5`` - -``RES`` - Specifies the resolution of FV3. Valid values: ``C96`` - - .. note:: - - Other resolutions are possible but not supported for this release. - -``TPATH`` - Specifies the path to the directory containing the orography files. - -``TSTUB`` - Specifies the file stub/name for orography files in ``TPATH``. This file stub is named ``oro_C${RES}`` for atmosphere-only orography files and ``oro_C{RES}.mx100`` for atmosphere and ocean orography files. - -``cycles_per_job`` - Specifies the number of cycles to submit in a single job. - -``ICSDIR`` - Specifies the path to a directory containing initial conditions data. 
- -``DA_config`` - Configuration setting file for ``do_landDA.sh``. Set ``DA_config`` to ``openloop`` to skip data assimilation (and prevent a call ``do_landDA.sh``). - -``DA_configXX`` - Configuration setting file for ``do_landDA.sh`` at ``XX`` hr. If users want to perform DA experiment at different times, list these in the configuration setting file. Set to ``openloop`` to skip data assimilation (and prevent a call ``do_landDA.sh``). - -``do_landDA.sh`` -=================== - -The ``do_landDA.sh`` runs the data assimilation job inside the ``sumbit_cycle.sh`` script. Currently, the only DA option is the snow Local Ensemble Transform Kalman Filter-Optimal Interpolation (LETKF-OI, :cite:t:`FrolovEtAl2022`, 2022). :numref:`Figure %s ` describes the workflow of this script. - -.. _DoLandDAPng: - -.. figure:: https://github.com/ufs-community/land-DA_workflow/wiki/do_landDA.png - :alt: The do land da shell script starts by reading in the configuration file and setting up directories. Then it formats date strings, stages restart files, and prepares the observation files. It constructs yaml files based on requested JEDI type and then proceeds to create the background ensembles using LETKF-OI. Next, the script runs JEDI and applies the increment to use restarts. Lastly, it performs clean-up operations. - - *Flowchart of 'do_landDA.sh'* - -First, to run the DA job, ``do_landDA.sh`` reads in the configuration file and sets up the directories. The date strings are formatted for the current date and previous date. For each tile, restarts are staged to apply the JEDI update. In this stage, all files will get directly updated. Then, the observation files are read and prepared for this job. Once the JEDI type is determined, ``.yaml`` files are constructed. Note that if the user specifies a ``.yaml`` file, the script uses that one. Otherwise, the script builds the ``.yaml`` files. For LETKF-OI, a pseudo-ensemble is created by running the python script (``letkf_create_ens.py``). Once the ensemble is created, the script runs JEDI and applies increment to UFS restarts. - -Below, users can find an excerpt of a configuration settings file, ``settings_DA_template``, for the ``do_landDA.sh`` script: - -.. code-block:: console - - LANDDADIR=${CYCLEDIR}/DA_update/ - - ############################ - # DA options - - OBS_TYPES=("GHCN") - JEDI_TYPES=("DA") - - WINLEN=24 - - # YAMLS - YAML_DA=construct - YAML_HOFX=construct - - # JEDI DIRECTORIES - #JEDI_EXECDIR= # JEDI FV3 build directory - #IODA_BUILD_DIR= # JEDI IODA-converter source directory - #fv3bundle_vn= # date for JEDI fv3 bundle checkout (used to select correct yaml) - -``LANDDADIR`` - Specifies the path to the ``do_landDA.sh`` script. - -``OBS_TYPES`` - Specifies the observation type. Format is "Obs1" "Obs2". Currently, only GHCN observation data is available. - -``JEDI_TYPES`` - Specifies the JEDI call type for each observation type above. Format is "type1" "type2". Valid values: ``DA`` | ``HOFX`` - - +--------+--------------------------------------------------------+ - | Value | Description | - +========+========================================================+ - | DA | Data assimilation | - +--------+--------------------------------------------------------+ - | HOFX | A generic application for running the model forecast | - | | (or reading forecasts from file) and computing H(x) | - +--------+--------------------------------------------------------+ - -``WINLEN`` - Specifies the DA window length. It is generally the same as the ``FCSTLEN``. 
- -``YAML_DA`` - Specifies whether to construct the ``.yaml`` name based on requested observation types and their availabilities. Valid values: ``construct`` | *desired YAML name* - - +--------------------+--------------------------------------------------------+ - | Value | Description | - +====================+========================================================+ - | construct | Enable constructing the YAML | - +--------------------+--------------------------------------------------------+ - | desired YAML name | Will not test for availability of observations | - +--------------------+--------------------------------------------------------+ - -``JEDI_EXECDIR`` - Specifies the JEDI FV3 build directory. If using different JEDI version, users will need to edit the ``.yaml`` files with the desired directory path. - -``fv3bundle_vn`` - Specifies the date for JEDI ``fv3-bundle`` checkout (used to select correct ``.yaml``). From 5c8a82dcf4f8909e85ed6ed88f00dc6cc6ad2d49 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 10 Jul 2024 22:31:45 -0400 Subject: [PATCH 10/49] add misc --- .../BuildingRunningTesting/BuildRunLandDA.rst | 8 ++--- .../BuildingRunningTesting/Container.rst | 32 ++++++++++++------- 2 files changed, 25 insertions(+), 15 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 130450aa..3a45316e 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -6,10 +6,8 @@ Land DA Workflow (Hera & Orion) This chapter provides instructions for building and running basic Land DA cases for the Unified Forecast System (:term:`UFS`) Land DA System. Users can choose between two supported options: - * A Jan. 3-4, 2000 00z sample case using GSWP3 data with the UFS Noah-MP land component - * A Dec. 21-22, 2019 00z sample case using ERA5 data with the UFS Land Driver - - + * A Jan. 3-4, 2000 00z sample case using :term:`GSWP3` data with the UFS Noah-MP land component + * A Dec. 21-22, 2019 00z sample case using :term:`ERA5` data with the UFS Land Driver .. attention:: @@ -291,6 +289,8 @@ Check Experiment Output As the experiment progresses, it will generate a number of directories to hold intermediate and output files. The directory structure for those files and directories appears below: +.. _land-da-dir-structure: + .. code-block:: console $LANDDAROOT: Base directory diff --git a/doc/source/BuildingRunningTesting/Container.rst b/doc/source/BuildingRunningTesting/Container.rst index 968ef26d..ad2332f2 100644 --- a/doc/source/BuildingRunningTesting/Container.rst +++ b/doc/source/BuildingRunningTesting/Container.rst @@ -4,12 +4,12 @@ Containerized Land DA Workflow ********************************** -These instructions will help users build and run a basic case for the Unified Forecast System (:term:`UFS`) Land Data Assimilation (DA) System using a `Singularity/Apptainer `__ container. The Land DA :term:`container` packages together the Land DA System with its dependencies (e.g., :term:`spack-stack`, :term:`JEDI`) and provides a uniform environment in which to build and run the Land DA System. Normally, the details of building and running Earth systems models will vary based on the computing platform because there are many possible combinations of operating systems, compilers, :term:`MPIs `, and package versions available. 
Installation via Singularity/Apptainer container reduces this variability and allows for a smoother experience building and running Land DA. This approach is recommended for users not running Land DA on a supported :ref:`Level 1 ` system (i.e., Hera, Orion). +These instructions will help users build and run a basic case for the Unified Forecast System (:term:`UFS`) Land Data Assimilation (DA) System using a `Singularity/Apptainer `_ container. The Land DA :term:`container` packages together the Land DA System with its dependencies (e.g., :term:`spack-stack`, :term:`JEDI`) and provides a uniform environment in which to build and run the Land DA System. Normally, the details of building and running Earth systems models will vary based on the computing platform because there are many possible combinations of operating systems, compilers, :term:`MPIs `, and package versions available. Installation via Singularity/Apptainer container reduces this variability and allows for a smoother experience building and running Land DA. This approach is recommended for users not running Land DA on a supported :ref:`Level 1 ` system (i.e., Hera, Orion). This chapter provides instructions for building and running basic Land DA cases for the Unified Forecast System (:term:`UFS`) Land DA System. Users can choose between two options: - * A Dec. 21, 2019 00z sample case using :term:`ERA5` data with the UFS Land Driver (``settings_DA_cycle_era5``) - * A Jan. 3, 2000 00z sample case using :term:`GSWP3` data with the UFS Noah-MP land component (``settings_DA_cycle_gswp3``). + * A Jan. 3-4, 2000 00z sample case using :term:`GSWP3` data with the UFS Noah-MP land component + * A Dec. 21-22, 2019 00z sample case using :term:`ERA5` data with the UFS Land Driver .. attention:: @@ -23,8 +23,8 @@ Prerequisites The containerized version of Land DA requires: * `Installation of Apptainer `__ - * At least 6 CPU cores - * An **Intel** compiler and :term:`MPI` (available for free `here `__) + * At least 7 CPU cores + * An **Intel** compiler and :term:`MPI` (available for free `here `_) Install Singularity/Apptainer @@ -32,9 +32,9 @@ Install Singularity/Apptainer .. note:: - As of November 2021, the Linux-supported version of Singularity has been `renamed `__ to *Apptainer*. Apptainer has maintained compatibility with Singularity, so ``singularity`` commands should work with either Singularity or Apptainer (see compatibility details `here `__.) + As of November 2021, the Linux-supported version of Singularity has been `renamed `_ to *Apptainer*. Apptainer has maintained compatibility with Singularity, so ``singularity`` commands should work with either Singularity or Apptainer (see compatibility details `here `_.) -To build and run Land DA using a Singularity/Apptainer container, first install the software according to the `Apptainer Installation Guide `__. This will include the installation of all dependencies. +To build and run Land DA using a Singularity/Apptainer container, first install the software according to the `Apptainer Installation Guide `_. This will include the installation of all dependencies. .. attention:: Docker containers can only be run with root privileges, and users generally do not have root privileges on :term:`HPCs `. However, a Singularity image may be built directly from a Docker image for use on the system. 
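+
+For example, building a Singularity image from a Docker image generally takes the following form (a sketch only; the ``landda.img`` output name and the Docker reference are hypothetical placeholders, not the project's published image):
+
+.. code-block:: console
+
+   singularity build landda.img docker://some-account/some-image:some-tag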
@@ -58,7 +58,7 @@ For users working on systems with limited disk space in their ``/home`` director where ``/absolute/path/to/writable/directory/`` refers to a writable directory (usually a project or user directory within ``/lustre``, ``/work``, ``/scratch``, or ``/glade`` on NOAA :term:`RDHPCS` systems). If the ``cache`` and ``tmp`` directories do not exist already, they must be created with a ``mkdir`` command. -On NOAA Cloud systems, the ``sudo su`` command may also be required: +On NOAA Cloud systems, the ``sudo su`` command may also be required. For example, users would run: .. code-block:: @@ -70,7 +70,7 @@ On NOAA Cloud systems, the ``sudo su`` command may also be required: exit .. note:: - ``/lustre`` is a fast but non-persistent file system used on NOAA Cloud systems. To retain work completed in this directory, `tar the files `__ and move them to the ``/contrib`` directory, which is much slower but persistent. + ``/lustre`` is a fast but non-persistent file system used on NOAA Cloud systems. To retain work completed in this directory, `tar the files `_ and move them to the ``/contrib`` directory, which is much slower but persistent. .. _ContainerBuild: @@ -93,7 +93,9 @@ where ``/path/to/landda`` is the path to this top-level directory (e.g., ``/User NOAA RDHPCS Systems ---------------------- -On many NOAA :term:`RDHPCS` systems, a container named ``ubuntu20.04-intel-landda-release-public-v1.2.0.img`` has already been built, and users may access the container at the locations in :numref:`Table %s `. +On many NOAA :term:`RDHPCS`, a container named ``ubuntu20.04-intel-landda-release-public-v1.2.0.img`` has already been built, and users may access the container at the locations in :numref:`Table %s `. + +.. COMMENT: Is there a develop container now? .. _PreBuiltContainers: @@ -115,22 +117,30 @@ On many NOAA :term:`RDHPCS` systems, a container named ``ubuntu20.04-intel-landd | Orion/Hercules | /work/noaa/epic/role-epic/contrib/containers | +-----------------+--------------------------------------------------------+ +.. COMMENT: Check container locations. + Users can simply set an environment variable to point to the container: .. code-block:: console export img=path/to/ubuntu20.04-intel-landda-release-public-v1.2.0.img +.. COMMENT: Check container path! + If users prefer, they may copy the container to their local working directory. For example, on Jet: .. code-block:: console cp /mnt/lfs4/HFIP/hfv3gfs/role.epic/containers/ubuntu20.04-intel-landda-release-public-v1.2.0.img . +.. COMMENT: Check container path! + Other Systems ---------------- -On other systems, users can build the Singularity container from a public Docker :term:`container` image or download the ``ubuntu20.04-intel-landda-release-public-v1.2.0.img`` container from the `Land DA Data Bucket `__. Downloading may be faster depending on the download speed on the user's system. However, the container in the data bucket is the ``release/v1.2.0`` container rather than the updated ``develop`` branch container. +On other systems, users can build the Singularity container from a public Docker :term:`container` image or download the ``ubuntu20.04-intel-landda-release-public-v1.2.0.img`` container from the `Land DA Data Bucket `_. Downloading may be faster depending on the download speed on the user's system. However, the container in the data bucket is the ``release/v1.2.0`` container rather than the updated ``develop`` branch container. + +.. COMMENT: Check container name! 
To download from the data bucket, users can run: From 323e7950c45be2bd7ce09b5019a3eaf9a3790876 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 11 Jul 2024 14:02:50 -0400 Subject: [PATCH 11/49] minor conf.py updates --- doc/source/conf.py | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/doc/source/conf.py b/doc/source/conf.py index 7f74ca28..f6004f53 100644 --- a/doc/source/conf.py +++ b/doc/source/conf.py @@ -119,9 +119,11 @@ def setup(app): # -- Options for extlinks extension --------------------------------------- extlinks_detect_hardcoded_links = True -extlinks = {'github-docs': ('https://docs.github.com/en/%s', '%s'), +extlinks = {'github': ('https://github.com/ufs-community/land-DA_workflow/%s', '%s'), + 'github-docs': ('https://docs.github.com/en/%s', '%s'), 'jedi': ('https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/1.7.0/%s', '%s'), 'nco': ('https://www.nco.ncep.noaa.gov/idsb/implementation_standards/%s', '%s'), + 'rocoto': ('https://christopherwharrop.github.io/rocoto/%s', '%s'), 'rst': ('https://www.sphinx-doc.org/en/master/usage/restructuredtext/%s', '%s'), 'rtd': ('https://readthedocs.org/projects/land-da-workflow/%s', '%s'), 'land-wflow-repo': ('https://github.com/ufs-community/land-DA_workflow/%s', '%s'), From a48cfead19fe97b71dd83f78490ac6ffb4f1c359 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 11 Jul 2024 14:03:28 -0400 Subject: [PATCH 12/49] add ConfigWorkflow chapter --- .../CustomizingTheWorkflow/ConfigWorkflow.rst | 480 ++++++++++++++++++ 1 file changed, 480 insertions(+) create mode 100644 doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst diff --git a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst new file mode 100644 index 00000000..c7bebf8b --- /dev/null +++ b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst @@ -0,0 +1,480 @@ +.. _ConfigWorkflow: + +*************************************************** +Available Workflow Configuration Parameters +*************************************************** + +To run the Land DA System, users must create an experiment configuration file (named ``land_analysis.yaml`` by default). This file contains experiment-specific information, such as forecast/cycle dates, grid and physics suite choices, data directories, and other relevant settings. To help the user, two sample ``land_analysis_.yaml`` configuration files have been included in the ``parm`` directory for use on Hera and Orion. They contain reasonable experiment default values that work on those machines. The content of these files can be copied into ``land_analysis.yaml`` and used as the starting point from which to generate a variety of experiment configurations for Land DA. + +The following is a list of the parameters in the ``land_analysis_.yaml`` files. For each parameter, the default value and a brief description are provided. + +Workflow Attributes (``attrs:``) +================================= + +Attributes pertaining to the overall workflow are defined in the ``attrs:`` section under ``workflow:``. For example: + +.. code-block:: console + + workflow: + attrs: + realtime: false + scheduler: slurm + cyclethrottle: 24 + taskthrottle: 24 + +``realtime:`` (Default: false) + Indicates whether it is a realtime run (true) or not (false). Valid values: ``true`` | ``false`` + +``scheduler:`` (Default: slurm) + The job scheduler to use on the specified machine. 
Valid values: ``"slurm"`` | ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"``
+
+.. COMMENT: Check valid values!
+
+``cyclethrottle:`` (Default: 24)
+   The number of cycles that can be active at one time. Valid values: Integer values >= 0.
+
+``taskthrottle:`` (Default: 24)
+   The number of tasks that can be active at one time. Valid values: Integer values >= 0.
+
+
+Workflow Cycle Definition (``cycledef``)
+==========================================
+
+Cycling information is defined in the ``cycledef:`` section under ``workflow:``. Each cycle definition starts with a ``-`` and has information on cycle attributes (``attrs:``) and a cycle specification (``spec:``). For example:
+
+.. code-block:: console
+
+   workflow:
+     cycledef:
+       - attrs:
+           group: cycled
+         spec: 201912210000 201912220000 24:00:00
+
+``attrs:``
+   Attributes of ``cycledef``. Includes ``group:`` but may also include ``activation_offset:``.
+
+   ``group:``
+      The group attribute allows users to assign a set of cycles to a particular group. The group tag can later be used to control which tasks are run for which cycles. See the :rocoto:`Rocoto Documentation <>` for more information.
+
+``spec:``
+   The cycle is defined using the "start stop step" method, with the cycle start date listed first in YYYYMMDDHHmm format, followed by the end date and then the step in HH:mm:SS format (e.g., ``201912210000 201912220000 24:00:00``).
+
+Workflow Entities
+===================
+
+Entities are constants that can be referred to throughout the workflow using the ``&`` prefix and ``;`` suffix (e.g., ``&MACHINE;``) to avoid defining the same constants repetitively in each workflow task. For example, in ``land_analysis_orion.yaml``, the following entities are defined:
+
+.. code-block:: console
+
+   entities:
+     MACHINE: "orion"
+     SCHED: "slurm"
+     ACCOUNT: "epic"
+     EXP_NAME: "LETKF"
+     EXP_BASEDIR: "/work/noaa/epic/{USER}/landda_test"
+     JEDI_INSTALL: "/work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6"
+     WARMSTART_DIR: "/work/noaa/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART"
+     FORCING: "gswp3"
+     RES: "96"
+     FCSTHR: "24"
+     NPROCS_ANALYSIS: "6"
+     NPROCS_FORECAST: "7"
+     OBSDIR: ""
+     OBSDIR_SUBDIR: ""
+     OBS_TYPES: "GHCN"
+     DAtype: "letkfoi_snow"
+     SNOWDEPTHVAR: "snwdph"
+     TSTUB: "oro_C96.mx100"
+     NET: "landda"
+     envir: "test"
+     model_ver: "v1.2.1"
+     RUN: "landda"
+     HOMElandda: "&EXP_BASEDIR;/land-DA_workflow"
+     PTMP: "&EXP_BASEDIR;/ptmp"
+     COMROOT: "&PTMP;/&envir;/com"
+     DATAROOT: "&PTMP;/&envir;/tmp"
+     KEEPDATA: "YES"
+     LOGDIR: "&COMROOT;/output/logs/run_&FORCING;"
+     LOGFN_SUFFIX: "_@Y@m@d@H.log"
+     PATHRT: "&EXP_BASEDIR;"
+     PDY: "@Y@m@d"
+     cyc: "@H"
+     DATADEP_FILE1: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc"
+     DATADEP_FILE2: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.nc"
+     DATADEP_FILE3: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc"
+     DATADEP_FILE4: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc"
+
+.. note::
+
+   When two defaults are listed, one is the default on Hera, and one is the default on Orion, depending on the ``land_analysis_.yaml`` file used. The default on Hera is listed first, followed by the default on Orion.
+
+``MACHINE:`` (Default: "hera" or "orion")
+   The machine (a.k.a. platform or system) on which the workflow will run. Currently supported platforms are listed in :numref:`Section %s `. Valid values: ``"hera"`` | ``"orion"`` | ``"singularity"``
+
+.. COMMENT: Check Singularity or NOAA Cloud or anything?
+ +``SCHED:`` (Default: "slurm") + The job scheduler to use (e.g., Slurm) on the specified ``MACHINE``. Valid values: ``"slurm"`` | ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"`` + +.. COMMENT: Check valid values! Also, isn't this a duplicate of "scheduler:"? + +``ACCOUNT:`` (Default: "epic") + The account under which users submit jobs to the queue on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field on a system with a Slurm job scheduler, users may run the ``saccount_params`` command to display account details. On other systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. + +``EXP_NAME:`` (Default: "LETKF") + Placeholder --- currently not used in workflow. + +``EXP_BASEDIR:`` (Default: "/scratch2/NAGAPE/epic/{USER}/landda_test" or "/work/noaa/epic/{USER}/landda_test") + The full path to the directory that ``land-DA_workflow`` was cloned into (i.e., ``$LANDDAROOT`` in the documentation). + +``JEDI_INSTALL:`` (Default: "/scratch2/NAGAPE/epic/UFS_Land-DA_Dev/jedi_v7" or "/work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6") + The path to the JEDI |skylabv| installation. + +``WARMSTART_DIR:`` (Default: "/scratch2/NAGAPE/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART" or "/work/noaa/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART") + The path to restart files for a warmstart experiment. + +``FORCING:`` (Default: "gswp3") + Type of atmospheric forcing data used. Valid values: "gswp3" or "era5" + +``RES:`` (Default: "96") + Resolution of FV3 grid. Currently, only C96 resolution is supported. + +``FCSTHR:`` (Default: "24") + Specifies the length of each forecast in hours. + +``NPROCS_ANALYSIS:`` (Default: "6") + Number of processors for the analysis task. + +.. COMMENT: Check this! + +``NPROCS_FORECAST:`` (Default: "7") + Number of processors for the forecast task. + +.. COMMENT: Check this! + +``OBSDIR:`` (Default: "") + The path to the directory where ______??? + .. COMMENT: Add definition here! + +``OBSDIR_SUBDIR:`` (Default: "") +.. COMMENT: Add definition! + +``OBS_TYPES:`` (Default: "GHCN") + Specifies the observation type. Format is "Obs1" "Obs2". Currently, only GHCN observation data is available. + +``DAtype:`` (Default: "letkfoi_snow") +.. COMMENT: Add definition! + +``SNOWDEPTHVAR:`` (Default: "snwdph") +.. COMMENT: Add definition! + +``TSTUB:`` (Default: "oro_C96.mx100") + Specifies the file stub/name for orography files in TPATH. This file stub is named oro_C${RES} for atmosphere-only orography files and oro_C{RES}.mx100 for atmosphere and ocean orography files. + +NCO Directory Structure Entities +---------------------------------- + +Standard environment variables are defined in the NCEP Central Operations :nco:`WCOSS Implementation Standards ` document. These variables are used in forming the path to various directories containing input, output, and workflow files. For a visual aid, see the :ref:`Land DA Directory Structure Diagram `. The variables are defined in the WCOSS Implementation Standards document (pp. 4-5) as follows: + +``HOMElandda:`` (Default: "&EXP_BASEDIR;/land-DA_workflow") + The location of the :github:`land-DA_workflow` clone. + +``PTMP:`` (Default: "&EXP_BASEDIR;/ptmp") + User-defined path to the ``com``-type directories. + +``envir:`` (Default: "test") + The run environment. 
Set to “test” during the initial testing phase, “para” when running in parallel (on a schedule), and “prod” in production. + +``COMROOT:`` (Default: "&PTMP;/&envir;/com") + ``com`` root directory, which contains input/output data on current system. + +``NET:`` (Default: "landda") + Model name (first level of ``com`` directory structure) + +``model_ver:`` (Default: "v1.2.1") + Version number of package in three digits (second level of ``com`` directory) + +``RUN:`` (Default: "landda") + Name of model run (third level of com directory structure). In general, same as ${NET}. + +``DATAROOT:`` (Default: "&PTMP;/&envir;/tmp") + +.. COMMENT: Add definition! + + +``KEEPDATA:`` (Default: "YES") + Flag to keep data ("YES") or not ("NO"). + + .. COMMENT: Check definition! + +``LOGDIR:`` (Default: "&COMROOT;/output/logs/run_&FORCING;") + Path to the log file directory. + +``LOGFN_SUFFIX:`` (Default: "_@Y@m@d@H.log") +.. COMMENT: Add definition! + +``PATHRT:`` (Default: "&EXP_BASEDIR;") +.. COMMENT: Add definition! + +``PDY:`` (Default: "@Y@m@d") + Date in YYYYMMDD format. + +``cyc:`` (Default: "@H") + Cycle time in GMT hours, formatted HH. + +``DATADEP_FILE1:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc") +``DATADEP_FILE2:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.nc") +``DATADEP_FILE3:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc") +``DATADEP_FILE4:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc") + +.. COMMENT: Add definitions! + +Workflow Log +============== + log: "&LOGDIR;/workflow.log" + +Workflow Tasks +================ + + tasks: + +Observation Preparation Task (``task_prep_obs``) +-------------------------------------------------- + +Parameters for the observation preparation task are set in the ``task_prep_obs:`` section of the ``land_analysis_.yaml`` file. + + task_prep_obs: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + OBSDIR: "&OBSDIR;" + OBSDIR_SUBDIR: "&OBSDIR_SUBDIR;" + OBS_TYPES: "&OBS_TYPES;" + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + ATMOS_FORC: "&FORCING;" + model_ver: "&model_ver;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + PDY: "&PDY;" + cyc: "&cyc;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "prep_obs" "&HOMElandda;" "&MACHINE;"' + jobname: prep_obs + cores: 1 + walltime: 00:02:00 + queue: batch + join: "&LOGDIR;/prep_obs&LOGFN_SUFFIX;" + +Pre-Analysis Task (``task_pre_anal``) +--------------------------------------- + +Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section of the ``land_analysis_.yaml`` file. 
+ + task_pre_anal: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + ATMOS_FORC: "&FORCING;" + RES: "&RES;" + TSTUB: "&TSTUB;" + WARMSTART_DIR: "&WARMSTART_DIR;" + model_ver: "&model_ver;" + RUN: "&RUN;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + PDY: "&PDY;" + cyc: "&cyc;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "pre_anal" "&HOMElandda;" "&MACHINE;"' + jobname: pre_anal + cores: 1 + walltime: 00:05:00 + queue: batch + join: "&LOGDIR;/pre_anal&LOGFN_SUFFIX;" + dependency: + or: + datadep_file1: + attrs: + age: 5 + value: "&DATADEP_FILE1;" + datadep_file2: + attrs: + age: 5 + value: "&DATADEP_FILE2;" + datadep_file3: + attrs: + age: 5 + value: "&DATADEP_FILE3;" + datadep_file4: + attrs: + age: 5 + value: "&DATADEP_FILE4;" + +Analysis Task (``task_analysis``) +----------------------------------- + +Parameters for the analysis task are set in the ``task_analysis:`` section of the ``land_analysis_.yaml`` file. + + task_analysis: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + OBS_TYPES: "&OBS_TYPES;" + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + ATMOS_FORC: "&FORCING;" + RES: "&RES;" + TSTUB: "&TSTUB;" + model_ver: "&model_ver;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + PDY: "&PDY;" + cyc: "&cyc;" + DAtype: "&DAtype;" + SNOWDEPTHVAR: "&SNOWDEPTHVAR;" + NPROCS_ANALYSIS: "&NPROCS_ANALYSIS;" + JEDI_INSTALL: "&JEDI_INSTALL;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "analysis" "&HOMElandda;" "&MACHINE;"' + jobname: analysis + nodes: "1:ppn=&NPROCS_ANALYSIS;" + walltime: 00:15:00 + queue: batch + join: "&LOGDIR;/analysis&LOGFN_SUFFIX;" + dependency: + taskdep: + attrs: + task: pre_anal + +Post-Analysis Task (``task_post_anal``) +----------------------------------------- + +Parameters for the post analysis task are set in the ``task_post_anal:`` section of the ``land_analysis_.yaml`` file. + + task_post_anal: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + ATMOS_FORC: "&FORCING;" + RES: "&RES;" + TSTUB: "&TSTUB;" + model_ver: "&model_ver;" + RUN: "&RUN;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + PDY: "&PDY;" + cyc: "&cyc;" + FCSTHR: "&FCSTHR;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "post_anal" "&HOMElandda;" "&MACHINE;"' + jobname: post_anal + cores: 1 + walltime: 00:05:00 + queue: batch + join: "&LOGDIR;/post_anal&LOGFN_SUFFIX;" + dependency: + taskdep: + attrs: + task: analysis + +Plotting Task (``task_plot_stats``) +------------------------------------- + +Parameters for the plotting task are set in the ``task_plot_stats:`` section of the ``land_analysis_.yaml`` file. 
+ + task_plot_stats: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + model_ver: "&model_ver;" + RUN: "&RUN;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + PDY: "&PDY;" + cyc: "&cyc;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "plot_stats" "&HOMElandda;" "&MACHINE;"' + jobname: plot_stats + cores: 1 + walltime: 00:10:00 + queue: batch + join: "&LOGDIR;/plot_stats&LOGFN_SUFFIX;" + dependency: + taskdep: + attrs: + task: analysis + +Forecast Task (``task_forecast``) +---------------------------------- + +Parameters for the forecast task are set in the ``task_forecast:`` section of the ``land_analysis_.yaml`` file. + + task_forecast: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + OBS_TYPES: "&OBS_TYPES;" + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + ATMOS_FORC: "&FORCING;" + RES: "&RES;" + WARMSTART_DIR: "&WARMSTART_DIR;" + model_ver: "&model_ver;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + LOGDIR: "&LOGDIR;" + PDY: "&PDY;" + cyc: "&cyc;" + DAtype: "&DAtype;" + FCSTHR: "&FCSTHR;" + NPROCS_FORECAST: "&NPROCS_FORECAST;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "forecast" "&HOMElandda;" "&MACHINE;"' + jobname: forecast + nodes: "1:ppn=&NPROCS_FORECAST;" + walltime: 01:00:00 + queue: batch + join: "&LOGDIR;/forecast&LOGFN_SUFFIX;" + dependency: + taskdep: + attrs: + task: post_anal + + From 071cad1f4d3ca4144e780afa963b002b6f1ae389 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Fri, 12 Jul 2024 19:28:00 -0400 Subject: [PATCH 13/49] explain task parameters & configuration --- .../CustomizingTheWorkflow/ConfigWorkflow.rst | 776 +++++++++++------- 1 file changed, 492 insertions(+), 284 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst index c7bebf8b..730d71d0 100644 --- a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst +++ b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst @@ -8,6 +8,8 @@ To run the Land DA System, users must create an experiment configuration file (n The following is a list of the parameters in the ``land_analysis_.yaml`` files. For each parameter, the default value and a brief description are provided. +.. _wf-attributes: + Workflow Attributes (``attrs:``) ================================= @@ -23,19 +25,18 @@ Attributes pertaining to the overall workflow are defined in the ``attrs:`` sect taskthrottle: 24 ``realtime:`` (Default: false) - Indicates whether it is a realtime run (true) or not (false). Valid values: ``true`` | ``false`` + Indicates whether it is a realtime (true) or retrospective run (false). Valid values: ``true`` | ``false`` ``scheduler:`` (Default: slurm) - The job scheduler to use on the specified machine. Valid values: ``"slurm"`` | ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"`` - -.. COMMENT: Check valid values! + The job scheduler to use on the specified machine. Valid values: ``"slurm"``. Other options may work with a container but have not been tested: ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"`` ``cyclethrottle:`` (Default: 24) - The number of cycles that can be active at one time. Valid values: Integer values >= 0. + The number of cycles that can be active at one time. Valid values: Integers >= 0. 
``taskthrottle:`` (Default: 24)
-   The number of tasks that can be active at one time. Valid values: Integer values >= 0.
+   The number of tasks that can be active at one time. Valid values: Integers >= 0.
 
+.. _wf-cycledef:
 
 Workflow Cycle Definition (``cycledef``)
 ==========================================
@@ -59,6 +60,9 @@ Cycling information is defined in the ``cycledef:`` section under ``workflow:``.
 
 ``spec:``
    The cycle is defined using the "start stop step" method, with the cycle start date listed first in YYYYMMDDHHmm format, followed by the end date and then the step in HH:mm:SS format (e.g., ``201912210000 201912220000 24:00:00``).
 
+
+.. _wf-entities:
+
 Workflow Entities
 ===================
 
@@ -66,43 +70,44 @@ Entities are constants that can be referred to throughout the workflow using the
 
 .. code-block:: console
 
-   entities:
-     MACHINE: "orion"
-     SCHED: "slurm"
-     ACCOUNT: "epic"
-     EXP_NAME: "LETKF"
-     EXP_BASEDIR: "/work/noaa/epic/{USER}/landda_test"
-     JEDI_INSTALL: "/work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6"
-     WARMSTART_DIR: "/work/noaa/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART"
-     FORCING: "gswp3"
-     RES: "96"
-     FCSTHR: "24"
-     NPROCS_ANALYSIS: "6"
-     NPROCS_FORECAST: "7"
-     OBSDIR: ""
-     OBSDIR_SUBDIR: ""
-     OBS_TYPES: "GHCN"
-     DAtype: "letkfoi_snow"
-     SNOWDEPTHVAR: "snwdph"
-     TSTUB: "oro_C96.mx100"
-     NET: "landda"
-     envir: "test"
-     model_ver: "v1.2.1"
-     RUN: "landda"
-     HOMElandda: "&EXP_BASEDIR;/land-DA_workflow"
-     PTMP: "&EXP_BASEDIR;/ptmp"
-     COMROOT: "&PTMP;/&envir;/com"
-     DATAROOT: "&PTMP;/&envir;/tmp"
-     KEEPDATA: "YES"
-     LOGDIR: "&COMROOT;/output/logs/run_&FORCING;"
-     LOGFN_SUFFIX: "_@Y@m@d@H.log"
-     PATHRT: "&EXP_BASEDIR;"
-     PDY: "@Y@m@d"
-     cyc: "@H"
-     DATADEP_FILE1: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc"
-     DATADEP_FILE2: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.nc"
-     DATADEP_FILE3: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc"
-     DATADEP_FILE4: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc"
+   workflow:
+     entities:
+       MACHINE: "orion"
+       SCHED: "slurm"
+       ACCOUNT: "epic"
+       EXP_NAME: "LETKF"
+       EXP_BASEDIR: "/work/noaa/epic/{USER}/landda_test"
+       JEDI_INSTALL: "/work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6"
+       WARMSTART_DIR: "/work/noaa/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART"
+       FORCING: "gswp3"
+       RES: "96"
+       FCSTHR: "24"
+       NPROCS_ANALYSIS: "6"
+       NPROCS_FORECAST: "7"
+       OBSDIR: ""
+       OBSDIR_SUBDIR: ""
+       OBS_TYPES: "GHCN"
+       DAtype: "letkfoi_snow"
+       SNOWDEPTHVAR: "snwdph"
+       TSTUB: "oro_C96.mx100"
+       NET: "landda"
+       envir: "test"
+       model_ver: "v1.2.1"
+       RUN: "landda"
+       HOMElandda: "&EXP_BASEDIR;/land-DA_workflow"
+       PTMP: "&EXP_BASEDIR;/ptmp"
+       COMROOT: "&PTMP;/&envir;/com"
+       DATAROOT: "&PTMP;/&envir;/tmp"
+       KEEPDATA: "YES"
+       LOGDIR: "&COMROOT;/output/logs/run_&FORCING;"
+       LOGFN_SUFFIX: "_@Y@m@d@H.log"
+       PATHRT: "&EXP_BASEDIR;"
+       PDY: "@Y@m@d"
+       cyc: "@H"
+       DATADEP_FILE1: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc"
+       DATADEP_FILE2: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.nc"
+       DATADEP_FILE3: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc"
+       DATADEP_FILE4: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc"
 
 .. note::
 
@@ -114,9 +119,7 @@ Entities are constants that can be referred to throughout the workflow using the
 
 .. COMMENT: Check Singularity or NOAA Cloud or anything?
 
 ``SCHED:`` (Default: "slurm")
-   The job scheduler to use (e.g., Slurm) on the specified ``MACHINE``.
Valid values: ``"slurm"`` | ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"``
-
-.. COMMENT: Check valid values! Also, isn't this a duplicate of "scheduler:"?
+   The job scheduler to use (e.g., Slurm) on the specified ``MACHINE``. Valid values: ``"slurm"``. Other options may work with a container but have not been tested: ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"``
 
 ``ACCOUNT:`` (Default: "epic")
    The account under which users submit jobs to the queue on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field on a system with a Slurm job scheduler, users may run the ``saccount_params`` command to display account details. On other systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names.
 
@@ -134,42 +137,44 @@ Entities are constants that can be referred to throughout the workflow using the
    The path to restart files for a warmstart experiment.
 
 ``FORCING:`` (Default: "gswp3")
-   Type of atmospheric forcing data used. Valid values: "gswp3" or "era5"
+   Type of atmospheric forcing data used. Valid values: ``"gswp3"`` | ``"era5"``
 
 ``RES:`` (Default: "96")
    Resolution of FV3 grid. Currently, only C96 resolution is supported.
 
 ``FCSTHR:`` (Default: "24")
-   Specifies the length of each forecast in hours.
+   Specifies the length of each forecast in hours. Valid values: Integers >= 0.
 
 ``NPROCS_ANALYSIS:`` (Default: "6")
    Number of processors for the analysis task.
 
-.. COMMENT: Check this!
-
 ``NPROCS_FORECAST:`` (Default: "7")
    Number of processors for the forecast task.
 
-.. COMMENT: Check this!
-
 ``OBSDIR:`` (Default: "")
-   The path to the directory where ______???
-   .. COMMENT: Add definition here!
+   The path to the directory where DA fix files are located. In ``scripts/exlandda_prep_obs.sh``, this value is set to ``${FIXlandda}/DA`` unless the user specifies a different path in ``land_analysis.yaml``.
 
 ``OBSDIR_SUBDIR:`` (Default: "")
-.. COMMENT: Add definition!
+   The path to the directories where different types of fix data (e.g., ERA5, GSWP3, GTS, NOAH-MP) are located. In ``scripts/exlandda_prep_obs.sh``, this value is set based on the type(s) of data requested. The user may choose to set a different value.
 
 ``OBS_TYPES:`` (Default: "GHCN")
    Specifies the observation type. Format is "Obs1" "Obs2". Currently, only GHCN observation data is available.
 
 ``DAtype:`` (Default: "letkfoi_snow")
-.. COMMENT: Add definition!
+   Type of data assimilation. Valid values: ``letkfoi_snow``. Currently, Land DA only performs snow DA using the LETKF-OI algorithm. As the application expands, more options may be added.
 
 ``SNOWDEPTHVAR:`` (Default: "snwdph")
-.. COMMENT: Add definition!
+   Placeholder --- currently not used in workflow. This value is hard-coded into ``scripts/exlandda_analysis.sh``.
 
 ``TSTUB:`` (Default: "oro_C96.mx100")
-   Specifies the file stub/name for orography files in TPATH. This file stub is named oro_C${RES} for atmosphere-only orography files and oro_C{RES}.mx100 for atmosphere and ocean orography files.
+   Specifies the file stub/name for orography files in ``TPATH``. This file stub is named ``oro_C${RES}`` for atmosphere-only orography files and ``oro_C${RES}.mx100`` for atmosphere and ocean orography files.
When Land DA is compiled with ``sorc/app_build.sh``, the subdirectories of the fix files should be linked into the ``fix`` directory, and orography files can be found in ``fix/FV3_fix_tiled/C96``. + +``DATADEP_FILE1:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc") +``DATADEP_FILE2:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.nc") +``DATADEP_FILE3:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc") +``DATADEP_FILE4:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc") + File names for the dependency check for the task ``pre_anal``. This means that ``pre_anal`` is triggered only when one of them exists. Otherwise, the task will not be submitted. + NCO Directory Structure Entities ---------------------------------- @@ -198,23 +203,19 @@ Standard environment variables are defined in the NCEP Central Operations :nco:` Name of model run (third level of com directory structure). In general, same as ${NET}. ``DATAROOT:`` (Default: "&PTMP;/&envir;/tmp") - -.. COMMENT: Add definition! - + Directory location for the temporary working directories for running jobs. By default, this is a sibling to the ``$COMROOT`` directory and is located at ``ptmp/test/tmp``. ``KEEPDATA:`` (Default: "YES") - Flag to keep data ("YES") or not ("NO"). - - .. COMMENT: Check definition! + Flag to keep data ("YES") or not ("NO") that is copied to the ``$DATAROOT`` directory during the forecast experiment. ``LOGDIR:`` (Default: "&COMROOT;/output/logs/run_&FORCING;") - Path to the log file directory. + Path to the directory containing log files for each workflow task. ``LOGFN_SUFFIX:`` (Default: "_@Y@m@d@H.log") -.. COMMENT: Add definition! + The cycle suffix appended to each task's log file. It will be rendered in the form ``_YYYYMMDDHH.log``. For example, the ``prep_obs`` task log would become: ``prep_obs_2000010400.log``. ``PATHRT:`` (Default: "&EXP_BASEDIR;") -.. COMMENT: Add definition! + The path to the ``EXP_BASEDIR`` for regression tests (RTs). ``PDY:`` (Default: "@Y@m@d") Date in YYYYMMDD format. @@ -222,259 +223,466 @@ Standard environment variables are defined in the NCEP Central Operations :nco:` ``cyc:`` (Default: "@H") Cycle time in GMT hours, formatted HH. -``DATADEP_FILE1:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc") -``DATADEP_FILE2:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.nc") -``DATADEP_FILE3:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc") -``DATADEP_FILE4:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc") +.. _wf-log: -.. COMMENT: Add definitions! - Workflow Log ============== - log: "&LOGDIR;/workflow.log" + +Information related to workflow progress is defined in the ``log:`` section under ``workflow:``: + +.. code-block:: console + + workflow: + log: "&LOGDIR;/workflow.log" + +``log:`` (Default: "&LOGDIR;/workflow.log") + Path and name of Rocoto log file(s). + +.. _wf-tasks: Workflow Tasks ================ - tasks: +The workflow is divided into discrete tasks, and details of each task are defined within the ``tasks:`` section under ``workflow:``. -Observation Preparation Task (``task_prep_obs``) --------------------------------------------------- +.. 
code-block:: console
+
+   workflow:
+     tasks:
+       task_prep_obs:
+       task_pre_anal:
+       task_analysis:
+       task_post_anal:
+       task_plot_stats:
+       task_forecast:
 
-Parameters for the observation preparation task are set in the ``task_prep_obs:`` section of the ``land_analysis_.yaml`` file.
+Each task may contain attributes (``attrs:``), just as in the overarching ``workflow:`` section. Instead of entities, each task contains an ``envars:`` section to define environment variables that must be passed to the task when it is executed. Any task dependencies are listed under the ``dependency:`` section. Additional details, such as ``jobname:``, ``walltime:``, and ``queue:``, may also be set within a specific task.
 
- task_prep_obs:
-   attrs:
-     cycledefs: cycled
-     maxtries: 2
+The following subsections explain any variables that have not already been explained/defined above.
+
+.. _sample_task:
+
+Sample Task: Analysis Task (``task_analysis``)
+------------------------------------------------
+
+This section walks users through the structure of the analysis task (``task_analysis``) to explain how configuration information is provided in the ``land_analysis_.yaml`` file for each task. Since each task has a similar structure, common information is explained in this section. Variables unique to a particular task are defined in their respective ``task_`` sections below.
+
+Parameters for a particular task are set in the ``workflow.tasks.task_:`` section of the ``land_analysis_.yaml`` file. For example, settings for the analysis task are provided in the ``task_analysis:`` section of ``land_analysis_.yaml``. The following is an excerpt of the ``task_analysis:`` section of ``land_analysis_.yaml``:
+
+.. code-block:: console
+
+   workflow:
+     tasks:
+       task_analysis:
+         attrs:
+           cycledefs: cycled
+           maxtries: 2
+         envars:
+           OBS_TYPES: "&OBS_TYPES;"
+           MACHINE: "&MACHINE;"
+           SCHED: "&SCHED;"
+           ACCOUNT: "&ACCOUNT;"
+           EXP_NAME: "&EXP_NAME;"
+           ATMOS_FORC: "&FORCING;"
+           RES: "&RES;"
+           TSTUB: "&TSTUB;"
+           model_ver: "&model_ver;"
+           HOMElandda: "&HOMElandda;"
+           COMROOT: "&COMROOT;"
+           DATAROOT: "&DATAROOT;"
+           KEEPDATA: "&KEEPDATA;"
+           PDY: "&PDY;"
+           cyc: "&cyc;"
+           DAtype: "&DAtype;"
+           SNOWDEPTHVAR: "&SNOWDEPTHVAR;"
+           NPROCS_ANALYSIS: "&NPROCS_ANALYSIS;"
+           JEDI_INSTALL: "&JEDI_INSTALL;"
+         account: "&ACCOUNT;"
+         command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "analysis" "&HOMElandda;" "&MACHINE;"'
+         jobname: analysis
+         nodes: "1:ppn=&NPROCS_ANALYSIS;"
+         walltime: 00:15:00
+         queue: batch
+         join: "&LOGDIR;/analysis&LOGFN_SUFFIX;"
+         dependency:
+           taskdep:
+             attrs:
+               task: pre_anal
+
+.. _task-attributes:
+
+Task Attributes (``attrs:``)
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+The ``attrs:`` section for each task includes the ``cycledefs:`` attribute and the ``maxtries:`` attribute.
+
+``cycledefs:`` (Default: cycled)
+   A comma-separated list of ``cycledef:`` group names. The task is submitted only for cycles that belong to one of the listed ``cycledef:`` groups (see the sketch below).
+
+``maxtries:`` (Default: 2)
+   The maximum number of times Rocoto can resubmit a failed task.
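+
+For example, given a ``cycledef:`` group named ``cycled`` like the one sketched below (a minimal sketch; the dates are illustrative), a task that sets ``cycledefs: cycled`` runs only for the cycles that this group generates:
+
+.. code-block:: console
+
+   workflow:
+     cycledef:
+       - attrs:
+           group: cycled
+         spec: 200001030000 200001040000 24:00:00
+     tasks:
+       task_analysis:
+         attrs:
+           cycledefs: cycled
+
+.. _task-envars:
+
+Task Environment Variables (``envars``)
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+The ``envars:`` section for each task reuses many of the same variables and values defined as ``entities:`` for the overall workflow. These values are needed for each task, but setting them individually is error-prone. Instead, a specific workflow task can reference workflow entities using the ``&VAR;`` syntax.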
For example, to set the ``ATMOS_FORC:`` value in ``task_analysis:`` to the value of the workflow ``FORCING`` entity, the following statement can be added to the task's ``envars:`` section:
+
+.. code-block:: console
+
+   task_analysis:
     envars:
-      OBSDIR: "&OBSDIR;"
-      OBSDIR_SUBDIR: "&OBSDIR_SUBDIR;"
-      OBS_TYPES: "&OBS_TYPES;"
-      MACHINE: "&MACHINE;"
-      SCHED: "&SCHED;"
-      ACCOUNT: "&ACCOUNT;"
-      EXP_NAME: "&EXP_NAME;"
       ATMOS_FORC: "&FORCING;"
-      model_ver: "&model_ver;"
-      HOMElandda: "&HOMElandda;"
-      COMROOT: "&COMROOT;"
-      DATAROOT: "&DATAROOT;"
-      KEEPDATA: "&KEEPDATA;"
-      PDY: "&PDY;"
-      cyc: "&cyc;"
-      account: "&ACCOUNT;"
-      command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "prep_obs" "&HOMElandda;" "&MACHINE;"'
-      jobname: prep_obs
-      cores: 1
-      walltime: 00:02:00
-      queue: batch
-      join: "&LOGDIR;/prep_obs&LOGFN_SUFFIX;"
+
+For most workflow tasks, whatever value is set in the ``workflow.entities:`` section should be reused/referenced in other tasks. For example, the ``MACHINE`` variable must be defined for each task, and users cannot switch machines mid-workflow. Therefore, users should set the ``MACHINE`` variable in the ``workflow.entities:`` section and reference that definition in each workflow task. For example:
+
+.. code-block:: console
+
+   workflow:
+     entities:
+       MACHINE: "orion"
+     tasks:
+       task_prep_obs:
+         envars:
+           MACHINE: "&MACHINE;"
+       task_pre_anal:
+         envars:
+           MACHINE: "&MACHINE;"
+       task_analysis:
+         envars:
+           MACHINE: "&MACHINE;"
+       ...
+       task_forecast:
+         envars:
+           MACHINE: "&MACHINE;"
+
+.. _misc-tasks:
+
+Miscellaneous Task Values
+^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+The authoritative :rocoto:`Rocoto documentation` discusses a number of miscellaneous task attributes in detail. A brief overview is provided in this section.
+
+.. code-block:: console
+
+   workflow:
+     tasks:
+       task_analysis:
+         account: "&ACCOUNT;"
+         command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "analysis" "&HOMElandda;" "&MACHINE;"'
+         jobname: analysis
+         nodes: "1:ppn=&NPROCS_ANALYSIS;"
+         walltime: 00:15:00
+         queue: batch
+         join: "&LOGDIR;/analysis&LOGFN_SUFFIX;"
+
+``account:`` (Default: "&ACCOUNT;")
+   The account under which users submit jobs to the queue on the specified ``MACHINE``. This value is typically the same for each task, so the default reuses the value set in the :ref:`Workflow Entities ` section.
+
+``command:`` (Default: ``'&HOMElandda;/parm/task_load_modules_run_jjob.sh "analysis" "&HOMElandda;" "&MACHINE;"'``)
+   The command that Rocoto will submit to the batch system to carry out the task's work.
+
+``jobname:`` (Default: analysis)
+   Name of the task/job (default will vary based on the task).
+
+``nodes:`` (Default: "1:ppn=&NPROCS_ANALYSIS;")
+   Number of nodes required for the task (default will vary based on the task).
+
+``walltime:`` (Default: 00:15:00)
+   Time allotted for the task (default will vary based on the task).
+
+``queue:`` (Default: batch)
+   The batch system queue or "quality of service" (QOS) that Rocoto will submit the task to for execution.
+
+``join:`` (Default: "&LOGDIR;/analysis&LOGFN_SUFFIX;")
+   The full path to the task's log file, which records output from ``stdout`` and ``stderr``.
+
+Some tasks include a ``cores:`` value instead of a ``nodes:`` value. For example:
+
+``cores:`` (Default: 1)
+   The number of cores required for the task.
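+
+To make the entity references concrete: with the default entity values shown in the :ref:`Workflow Entities ` section, the ``command:`` above would expand to roughly the following at run time (a sketch; the actual path depends on the value of ``EXP_BASEDIR``):
+
+.. code-block:: console
+
+   /work/noaa/epic/{USER}/landda_test/land-DA_workflow/parm/task_load_modules_run_jjob.sh "analysis" "/work/noaa/epic/{USER}/landda_test/land-DA_workflow" "orion"
+
+.. _task-dependencies:
+
+Dependencies
+^^^^^^^^^^^^^^
+
+The ``dependency:`` section of a task defines what prerequisites must be met for the task to run.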
In the case of ``task_analysis:``, it must be run after the ``pre_anal`` task. Therefore, the ``dependency:`` section lists a task dependency (``taskdep:``).
+
+.. code-block:: console
+
+   workflow:
+     tasks:
+       task_analysis:
+         dependency:
+           taskdep:
+             attrs:
+               task: pre_anal
+
+Other tasks may list data or time dependencies. For example, the pre-analysis task (``task_pre_anal:``) requires at least one of four possible data files to be available before it can run.
+
+.. code-block:: console
+
+   workflow:
+     tasks:
+       task_pre_anal:
+         dependency:
+           or:
+             datadep_file1:
+               attrs:
+                 age: 5
+               value: "&DATADEP_FILE1;"
+             datadep_file2:
+               attrs:
+                 age: 5
+               value: "&DATADEP_FILE2;"
+             datadep_file3:
+               attrs:
+                 age: 5
+               value: "&DATADEP_FILE3;"
+             datadep_file4:
+               attrs:
+                 age: 5
+               value: "&DATADEP_FILE4;"
+
+For details on the dependency tags (e.g., ``attrs:``, ``age:``, ``value:``), view the authoritative :rocoto:`Rocoto documentation`.
+
+.. _prep-obs:
+
 Observation Preparation Task (``task_prep_obs``)
 --------------------------------------------------
 
+Parameters for the observation preparation task are set in the ``task_prep_obs:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
+
+.. code-block:: console
+
+   workflow:
+     tasks:
+       task_prep_obs:
+         attrs:
+           cycledefs: cycled
+           maxtries: 2
+         envars:
+           OBSDIR: "&OBSDIR;"
+           OBSDIR_SUBDIR: "&OBSDIR_SUBDIR;"
+           OBS_TYPES: "&OBS_TYPES;"
+           MACHINE: "&MACHINE;"
+           SCHED: "&SCHED;"
+           ACCOUNT: "&ACCOUNT;"
+           EXP_NAME: "&EXP_NAME;"
+           ATMOS_FORC: "&FORCING;"
+           model_ver: "&model_ver;"
+           HOMElandda: "&HOMElandda;"
+           COMROOT: "&COMROOT;"
+           DATAROOT: "&DATAROOT;"
+           KEEPDATA: "&KEEPDATA;"
+           PDY: "&PDY;"
+           cyc: "&cyc;"
+         account: "&ACCOUNT;"
+         command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "prep_obs" "&HOMElandda;" "&MACHINE;"'
+         jobname: prep_obs
+         cores: 1
+         walltime: 00:02:00
+         queue: batch
+         join: "&LOGDIR;/prep_obs&LOGFN_SUFFIX;"
+
+.. _pre-anal:
 
 Pre-Analysis Task (``task_pre_anal``)
 ---------------------------------------
 
-Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section of the ``land_analysis_.yaml`` file.
+Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section.
- task_pre_anal: - attrs: - cycledefs: cycled - maxtries: 2 - envars: - MACHINE: "&MACHINE;" - SCHED: "&SCHED;" - ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" - ATMOS_FORC: "&FORCING;" - RES: "&RES;" - TSTUB: "&TSTUB;" - WARMSTART_DIR: "&WARMSTART_DIR;" - model_ver: "&model_ver;" - RUN: "&RUN;" - HOMElandda: "&HOMElandda;" - COMROOT: "&COMROOT;" - DATAROOT: "&DATAROOT;" - KEEPDATA: "&KEEPDATA;" - PDY: "&PDY;" - cyc: "&cyc;" - account: "&ACCOUNT;" - command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "pre_anal" "&HOMElandda;" "&MACHINE;"' - jobname: pre_anal - cores: 1 - walltime: 00:05:00 - queue: batch - join: "&LOGDIR;/pre_anal&LOGFN_SUFFIX;" - dependency: - or: - datadep_file1: - attrs: - age: 5 - value: "&DATADEP_FILE1;" - datadep_file2: - attrs: - age: 5 - value: "&DATADEP_FILE2;" - datadep_file3: - attrs: - age: 5 - value: "&DATADEP_FILE3;" - datadep_file4: - attrs: - age: 5 - value: "&DATADEP_FILE4;" +.. code-block:: console + + workflow: + tasks: + task_pre_anal: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + ATMOS_FORC: "&FORCING;" + RES: "&RES;" + TSTUB: "&TSTUB;" + WARMSTART_DIR: "&WARMSTART_DIR;" + model_ver: "&model_ver;" + RUN: "&RUN;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + PDY: "&PDY;" + cyc: "&cyc;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "pre_anal" "&HOMElandda;" "&MACHINE;"' + jobname: pre_anal + cores: 1 + walltime: 00:05:00 + queue: batch + join: "&LOGDIR;/pre_anal&LOGFN_SUFFIX;" + dependency: + or: + datadep_file1: + attrs: + age: 5 + value: "&DATADEP_FILE1;" + datadep_file2: + attrs: + age: 5 + value: "&DATADEP_FILE2;" + datadep_file3: + attrs: + age: 5 + value: "&DATADEP_FILE3;" + datadep_file4: + attrs: + age: 5 + value: "&DATADEP_FILE4;" + + +.. _analysis: Analysis Task (``task_analysis``) ----------------------------------- -Parameters for the analysis task are set in the ``task_analysis:`` section of the ``land_analysis_.yaml`` file. +Parameters for the analysis task are set in the ``task_analysis:`` section of the ``land_analysis_.yaml`` file. Most are the same as the defaults set in the :ref:`Workflow Entities ` section. The ``task_analysis:`` task is explained fully in the :ref:`Sample Task ` section, although the default values may differ. - task_analysis: - attrs: - cycledefs: cycled - maxtries: 2 - envars: - OBS_TYPES: "&OBS_TYPES;" - MACHINE: "&MACHINE;" - SCHED: "&SCHED;" - ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" - ATMOS_FORC: "&FORCING;" - RES: "&RES;" - TSTUB: "&TSTUB;" - model_ver: "&model_ver;" - HOMElandda: "&HOMElandda;" - COMROOT: "&COMROOT;" - DATAROOT: "&DATAROOT;" - KEEPDATA: "&KEEPDATA;" - PDY: "&PDY;" - cyc: "&cyc;" - DAtype: "&DAtype;" - SNOWDEPTHVAR: "&SNOWDEPTHVAR;" - NPROCS_ANALYSIS: "&NPROCS_ANALYSIS;" - JEDI_INSTALL: "&JEDI_INSTALL;" - account: "&ACCOUNT;" - command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "analysis" "&HOMElandda;" "&MACHINE;"' - jobname: analysis - nodes: "1:ppn=&NPROCS_ANALYSIS;" - walltime: 00:15:00 - queue: batch - join: "&LOGDIR;/analysis&LOGFN_SUFFIX;" - dependency: - taskdep: - attrs: - task: pre_anal +.. _post-analysis: Post-Analysis Task (``task_post_anal``) ----------------------------------------- -Parameters for the post analysis task are set in the ``task_post_anal:`` section of the ``land_analysis_.yaml`` file. 
+Parameters for the post analysis task are set in the ``task_post_anal:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. - task_post_anal: - attrs: - cycledefs: cycled - maxtries: 2 - envars: - MACHINE: "&MACHINE;" - SCHED: "&SCHED;" - ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" - ATMOS_FORC: "&FORCING;" - RES: "&RES;" - TSTUB: "&TSTUB;" - model_ver: "&model_ver;" - RUN: "&RUN;" - HOMElandda: "&HOMElandda;" - COMROOT: "&COMROOT;" - DATAROOT: "&DATAROOT;" - KEEPDATA: "&KEEPDATA;" - PDY: "&PDY;" - cyc: "&cyc;" - FCSTHR: "&FCSTHR;" - account: "&ACCOUNT;" - command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "post_anal" "&HOMElandda;" "&MACHINE;"' - jobname: post_anal - cores: 1 - walltime: 00:05:00 - queue: batch - join: "&LOGDIR;/post_anal&LOGFN_SUFFIX;" - dependency: - taskdep: - attrs: - task: analysis +.. code-block:: console + + workflow: + tasks: + task_post_anal: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + ATMOS_FORC: "&FORCING;" + RES: "&RES;" + TSTUB: "&TSTUB;" + model_ver: "&model_ver;" + RUN: "&RUN;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + PDY: "&PDY;" + cyc: "&cyc;" + FCSTHR: "&FCSTHR;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "post_anal" "&HOMElandda;" "&MACHINE;"' + jobname: post_anal + cores: 1 + walltime: 00:05:00 + queue: batch + join: "&LOGDIR;/post_anal&LOGFN_SUFFIX;" + dependency: + taskdep: + attrs: + task: analysis + +.. _plot-stats: Plotting Task (``task_plot_stats``) ------------------------------------- -Parameters for the plotting task are set in the ``task_plot_stats:`` section of the ``land_analysis_.yaml`` file. +Parameters for the plotting task are set in the ``task_plot_stats:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. - task_plot_stats: - attrs: - cycledefs: cycled - maxtries: 2 - envars: - MACHINE: "&MACHINE;" - SCHED: "&SCHED;" - ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" - model_ver: "&model_ver;" - RUN: "&RUN;" - HOMElandda: "&HOMElandda;" - COMROOT: "&COMROOT;" - DATAROOT: "&DATAROOT;" - KEEPDATA: "&KEEPDATA;" - PDY: "&PDY;" - cyc: "&cyc;" - account: "&ACCOUNT;" - command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "plot_stats" "&HOMElandda;" "&MACHINE;"' - jobname: plot_stats - cores: 1 - walltime: 00:10:00 - queue: batch - join: "&LOGDIR;/plot_stats&LOGFN_SUFFIX;" - dependency: - taskdep: - attrs: - task: analysis +.. 
code-block:: console + + workflow: + tasks: + task_plot_stats: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + model_ver: "&model_ver;" + RUN: "&RUN;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + PDY: "&PDY;" + cyc: "&cyc;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "plot_stats" "&HOMElandda;" "&MACHINE;"' + jobname: plot_stats + cores: 1 + walltime: 00:10:00 + queue: batch + join: "&LOGDIR;/plot_stats&LOGFN_SUFFIX;" + dependency: + taskdep: + attrs: + task: analysis + +.. _forecast: Forecast Task (``task_forecast``) ---------------------------------- -Parameters for the forecast task are set in the ``task_forecast:`` section of the ``land_analysis_.yaml`` file. +Parameters for the forecast task are set in the ``task_forecast:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. - task_forecast: - attrs: - cycledefs: cycled - maxtries: 2 - envars: - OBS_TYPES: "&OBS_TYPES;" - MACHINE: "&MACHINE;" - SCHED: "&SCHED;" - ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" - ATMOS_FORC: "&FORCING;" - RES: "&RES;" - WARMSTART_DIR: "&WARMSTART_DIR;" - model_ver: "&model_ver;" - HOMElandda: "&HOMElandda;" - COMROOT: "&COMROOT;" - DATAROOT: "&DATAROOT;" - KEEPDATA: "&KEEPDATA;" - LOGDIR: "&LOGDIR;" - PDY: "&PDY;" - cyc: "&cyc;" - DAtype: "&DAtype;" - FCSTHR: "&FCSTHR;" - NPROCS_FORECAST: "&NPROCS_FORECAST;" - account: "&ACCOUNT;" - command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "forecast" "&HOMElandda;" "&MACHINE;"' - jobname: forecast - nodes: "1:ppn=&NPROCS_FORECAST;" - walltime: 01:00:00 - queue: batch - join: "&LOGDIR;/forecast&LOGFN_SUFFIX;" - dependency: - taskdep: - attrs: - task: post_anal +.. 
code-block:: console + workflow: + tasks: + task_forecast: + attrs: + cycledefs: cycled + maxtries: 2 + envars: + OBS_TYPES: "&OBS_TYPES;" + MACHINE: "&MACHINE;" + SCHED: "&SCHED;" + ACCOUNT: "&ACCOUNT;" + EXP_NAME: "&EXP_NAME;" + ATMOS_FORC: "&FORCING;" + RES: "&RES;" + WARMSTART_DIR: "&WARMSTART_DIR;" + model_ver: "&model_ver;" + HOMElandda: "&HOMElandda;" + COMROOT: "&COMROOT;" + DATAROOT: "&DATAROOT;" + KEEPDATA: "&KEEPDATA;" + LOGDIR: "&LOGDIR;" + PDY: "&PDY;" + cyc: "&cyc;" + DAtype: "&DAtype;" + FCSTHR: "&FCSTHR;" + NPROCS_FORECAST: "&NPROCS_FORECAST;" + account: "&ACCOUNT;" + command: '&HOMElandda;/parm/task_load_modules_run_jjob.sh "forecast" "&HOMElandda;" "&MACHINE;"' + jobname: forecast + nodes: "1:ppn=&NPROCS_FORECAST;" + walltime: 01:00:00 + queue: batch + join: "&LOGDIR;/forecast&LOGFN_SUFFIX;" + dependency: + taskdep: + attrs: + task: post_anal From ddccaf9d906d4a7c65dab2d0dce822b4b1f0926e Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Tue, 16 Jul 2024 13:49:17 -0400 Subject: [PATCH 14/49] update testing ch --- .../BuildingRunningTesting/TestingLandDA.rst | 26 +++++++++++++++++-- 1 file changed, 24 insertions(+), 2 deletions(-) diff --git a/doc/source/BuildingRunningTesting/TestingLandDA.rst b/doc/source/BuildingRunningTesting/TestingLandDA.rst index 434c4185..eda3b143 100644 --- a/doc/source/BuildingRunningTesting/TestingLandDA.rst +++ b/doc/source/BuildingRunningTesting/TestingLandDA.rst @@ -17,17 +17,39 @@ From the working directory (``$LANDDAROOT``), navigate to ``build``. Then run: .. code-block:: console - salloc --ntasks 8 --exclusive --qos=debug --partition=debug --time=00:30:00 --account= + salloc --ntasks 8 --exclusive --qos=debug --partition= --time=00:30:00 --account= cd land-DA_workflow/sorc/build source ../../versions/build.ver_ module use ../../modulefiles module load build__intel ctest -where ```` corresponds to the user's actual account name and ```` is ``hera`` or ``orion``. +where ```` corresponds to the user's actual account name, ```` is a valid partition on the platform of choice (e.g., ``debug`` or ``orion``), and ```` is ``hera`` or ``orion``. This will submit an interactive job, load the appropriate modulefiles, and run the CTests. +If the tests are successful, a message will be printed to the console. For example: + +.. code-block:: console + + Test project /work/noaa/epic/${USER}/landda/land-DA_workflow/sorc/build + Start 1: test_vector2tile + 1/6 Test #1: test_vector2tile ................. Passed 12.01 sec + Start 2: test_create_ens + 2/6 Test #2: test_create_ens .................. Passed 13.91 sec + Start 3: test_letkfoi_snowda + 3/6 Test #3: test_letkfoi_snowda .............. Passed 67.94 sec + Start 4: test_apply_jediincr + 4/6 Test #4: test_apply_jediincr .............. Passed 6.88 sec + Start 5: test_tile2vector + 5/6 Test #5: test_tile2vector ................. Passed 15.36 sec + Start 6: test_ufs_datm_land + 6/6 Test #6: test_ufs_datm_land ............... 
Passed   98.56 sec
+
+      100% tests passed, 0 tests failed out of 6
+
+      Total Test time (real) = 217.06 sec
+
 Tests
 *******
 
From 76f39680fafeef79d82c27a1de0416df009740fc Mon Sep 17 00:00:00 2001
From: gspetro-NOAA 
Date: Tue, 16 Jul 2024 15:18:13 -0400
Subject: [PATCH 15/49] update glossary

---
 doc/source/Reference/Glossary.rst | 66 ++++++++++++++++++++++++-------
 doc/source/conf.py | 6 +--
 2 files changed, 54 insertions(+), 18 deletions(-)

diff --git a/doc/source/Reference/Glossary.rst b/doc/source/Reference/Glossary.rst
index a8d1f111..99ed7909 100644
--- a/doc/source/Reference/Glossary.rst
+++ b/doc/source/Reference/Glossary.rst
@@ -6,11 +6,29 @@ Glossary
 
 .. glossary::
 
+   ATM
+      The Weather Model configuration that runs only the standalone atmospheric model.
+
    CCPP
       The `Common Community Physics Package `_ is a forecast-model agnostic, vetted collection of code containing atmospheric physical parameterizations and suites of parameterizations for use in Numerical Weather Prediction (NWP) along with a framework that connects the physics to the host forecast model.
 
+   CDEPS
+      The `Community Data Models for Earth Predictive Systems `_ repository (CDEPS) contains a set of :term:`NUOPC`-compliant data components and :term:`ESMF`-based "stream" code that selectively removes feedback in coupled model systems. In essence, CDEPS handles the static Data Atmosphere (:term:`DATM`) integration with dynamic coupled model components (e.g., :term:`MOM6`). The CDEPS data models perform the basic function of reading external data files, modifying those data, and then sending the data back to the :term:`CMEPS` mediator. The fields sent to the :term:`mediator` are the same as those that would be sent by an active component. This takes advantage of the fact that the mediator and other CMEPS-compliant model components have no fundamental knowledge of whether another component is fully active or just a data component. More information about DATM is available in the CDEPS `Documentation `_.
+
+   CESM
+      The `Community Earth System Model `_ (CESM) is a fully-coupled global climate model developed at the National Center for Atmospheric Research (:term:`NCAR`) in collaboration with colleagues in the research community.
+
+   CMEPS
+      The `Community Mediator for Earth Prediction Systems `_ (CMEPS) is a :term:`NUOPC`-compliant :term:`mediator` used for coupling Earth system model components. It is currently being used in NCAR's Community Earth System Model (:term:`CESM`) and NOAA's subseasonal-to-seasonal (S2S) coupled system. More information is available in the `CMEPS Documentation `_.
+
    container
-      `Docker `__ describes a container as "a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another."
+      `Docker `_ describes a container as "a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another."
+
+   cron
+   cron job
+   crontab
+   cron table
+      Cron is a job scheduler accessed through the command-line on UNIX-like operating systems. It is useful for automating tasks such as regression testing. Cron periodically checks a cron table (aka crontab) to see if any tasks are ready to execute. If so, it runs them.
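+      As a quick sketch (the path and script name here are hypothetical placeholders), a crontab entry that checks every five minutes whether a workflow can be advanced might look like:
+
+      .. code-block:: console
+
+         */5 * * * * cd /path/to/expt_dir && ./launch_workflow.sh
 
   cycle
      An hour of the day on which a forecast is started. In the Land DA System, it usually follows YYYYMMDD-HHmmss format.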
@@ -18,17 +36,20 @@ Glossary data assimilation One of the major sources of error in weather and climate forecasts is uncertainty related to the initial conditions that are used to generate future predictions. Even the most precise instruments have a small range of unavoidable measurement error, which means that tiny measurement errors (e.g., related to atmospheric conditions and instrument location) can compound over time. These small differences result in very similar forecasts in the short term (i.e., minutes, hours), but they cause widely divergent forecasts in the long term. Errors in weather and climate forecasts can also arise because models are imperfect representations of reality. Data assimilation systems seek to mitigate these problems by combining the most timely observational data with a "first guess" of the atmospheric state (usually a previous forecast) and other sources of data to provide a "best guess" analysis of the atmospheric state to start a weather or climate simulation. When combined with an "ensemble" of model runs (many forecasts with slightly different conditions), data assimilation helps predict a range of possible atmospheric states, giving an overall measure of uncertainty in a given forecast. + DATM + DATM is the *Data Atmosphere* component of :term:`CDEPS`. It uses static atmospheric forcing files (derived from observations or previous atmospheric model runs) instead of output from an active atmospheric model. This reduces the complexity and computational cost associated with coupling to an active atmospheric model. The *Data Atmosphere* component is particularly useful when employing computationally intensive Data Assimilation (DA) techniques to update ocean and/or sea ice fields in a coupled model. In general, use of DATM in place of :term:`ATM` can be appropriate when users are running a coupled model and only want certain components of the model to be active. More information about DATM is available in the `CDEPS Documentation `_. + ERA5 - The ECMWF Reanalysis v5 (`ERA5 `__) dataset "is the fifth generation ECMWF atmospheric reanalysis of the global climate covering the period from January 1940 to present." It "provides hourly estimates of a large number of atmospheric, land and oceanic climate variables." + The ECMWF Reanalysis v5 (`ERA5 `_) dataset "is the fifth generation ECMWF atmospheric reanalysis of the global climate covering the period from January 1940 to present." It "provides hourly estimates of a large number of atmospheric, land and oceanic climate variables." ESMF - `Earth System Modeling Framework `__. The ESMF defines itself as "a suite of software tools for developing high-performance, multi-component Earth science modeling applications." It is a community-developed software infrastructure for building and coupling models. + `Earth System Modeling Framework `_. The ESMF defines itself as "a suite of software tools for developing high-performance, multi-component Earth science modeling applications." It is a community-developed software infrastructure for building and coupling models. ex-scripts Scripting layer (contained in ``land-DA_workflow/jobs/``) that should be called by a :term:`J-job ` for each workflow component to run a specific task or sub-task in the workflow. The different scripting layers are described in detail in the :nco:`NCO Implementation Standards document `. 
FMS - The Flexible Modeling System (`FMS `__) is a software framework for supporting the efficient + The Flexible Modeling System (`FMS `_) is a software framework for supporting the efficient development, construction, execution, and scientific interpretation of atmospheric, oceanic, and climate system models. @@ -39,41 +60,56 @@ Glossary The Finite-Volume Cubed-Sphere dynamical core (dycore). Developed at NOAA’s `Geophysical Fluid Dynamics Laboratory `__ (GFDL), it is a scalable and flexible dycore capable of both hydrostatic and non-hydrostatic atmospheric simulations. It is the dycore used in the UFS Weather Model. GSWP3 - The Global Soil Wetness Project Phase 3 (`GSWP3 `__) dataset is a century-long comprehensive set of data documenting several variables for hydro-energy-eco systems. + The Global Soil Wetness Project Phase 3 dataset is a century-long comprehensive set of data documenting several variables for hydro-energy-eco systems. + + HPC + High-Performance Computing. J-jobs Scripting layer (contained in ``land-DA_workflow/jobs/``) that should be directly called for each workflow component (either on the command line or by the workflow manager) to run a specific task in the workflow. The different scripting layers are described in detail in the :nco:`NCO Implementation Standards document `. JEDI - The Joint Effort for Data assimilation Integration (`JEDI `__) is a unified and versatile data assimilation (DA) system for Earth System Prediction. It aims to enable efficient research and accelerated transition from research to operations by providing a framework that takes into account all components of the Earth system in a consistent manner. The JEDI software package can run on a variety of platforms and for a variety of purposes, and it is designed to readily accommodate new atmospheric and oceanic models and new observation systems. The `JEDI User's Guide `__ contains extensive information on the software. + The Joint Effort for Data assimilation Integration (`JEDI `_) is a unified and versatile data assimilation (DA) system for Earth System Prediction. It aims to enable efficient research and accelerated transition from research to operations by providing a framework that takes into account all components of the Earth system in a consistent manner. The JEDI software package can run on a variety of platforms and for a variety of purposes, and it is designed to readily accommodate new atmospheric and oceanic models and new observation systems. The `JEDI User's Guide `_ contains extensive information on the software. - JEDI is developed and distributed by the `Joint Center for Satellite Data Assimilation `__, a multi-agency research center hosted by the University Corporation for Atmospheric Research (`UCAR `__). JCSDA is dedicated to improving and accelerating the quantitative use of research and operational satellite data in weather, ocean, climate, and environmental analysis and prediction systems. + JEDI is developed and distributed by the `Joint Center for Satellite Data Assimilation `_, a multi-agency research center hosted by the University Corporation for Atmospheric Research (`UCAR `_). JCSDA is dedicated to improving and accelerating the quantitative use of research and operational satellite data in weather, ocean, climate, and environmental analysis and prediction systems. - HPC - High-Performance Computing. 
+ LND + land component + The Noah Multi-Physics (Noah-MP) land surface model (LSM) is an open-source, community-developed LSM that has been incorporated into the UFS Weather Model (WM). It is the UFS WM's land component. LETKF-OI Local Ensemble Transform Kalman Filter-Optimal Interpolation (see :cite:t:`HuntEtAl2007`, 2007). + Mediator + A mediator, sometimes called a coupler, is a software component that includes code for representing component interactions. Typical operations include merging data fields, ensuring consistent treatment of coastlines, computing fluxes, and temporal averaging. + + MOM + MOM6 + Modular Ocean Model + MOM6 is the latest generation of the Modular Ocean Model. It is numerical model code for simulating the ocean general circulation. MOM6 was originally developed by the `Geophysical Fluid Dynamics Laboratory `__. Currently, `MOM6 code `_ and an `extensive suite of test cases `_ are available under an open-development software framework. Although there are many public forks of MOM6, the `NOAA EMC fork `_ is used in the UFS Weather Model. + MPI MPI stands for Message Passing Interface. An MPI is a standardized communication system used in parallel programming. It establishes portable and efficient syntax for the exchange of messages and data between multiple processors that are used by a single computer program. An MPI is required for high-performance computing (HPC) systems. + NCAR + The `National Center for Atmospheric Research `_. + netCDF - NetCDF (`Network Common Data Form `__) is a file format and community standard for storing multidimensional scientific data. It includes a set of software libraries and machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. + NetCDF (`Network Common Data Form `_) is a file format and community standard for storing multidimensional scientific data. It includes a set of software libraries and machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. NCEP National Centers for Environmental Prediction (NCEP) is an arm of the National Weather Service consisting of nine centers. More information can be found at https://www.ncep.noaa.gov. NCO - :term:`NCEP` Central Operations. Visit the `NCO website `__ for more information. + :term:`NCEP` Central Operations. Visit the `NCO website `_ for more information. NUOPC National Unified Operational Prediction Capability - The `National Unified Operational Prediction Capability `__ is a consortium of Navy, NOAA, and Air Force modelers and their research partners. It aims to advance the weather modeling systems used by meteorologists, mission planners, and decision makers. NUOPC partners are working toward a common model architecture --- a standard way of building models --- in order to make it easier to collaboratively build modeling systems. + The `National Unified Operational Prediction Capability `_ is a consortium of Navy, NOAA, and Air Force modelers and their research partners. It aims to advance the weather modeling systems used by meteorologists, mission planners, and decision makers. NUOPC partners are working toward a common model architecture --- a standard way of building models --- in order to make it easier to collaboratively build modeling systems. NUOPC Layer The :term:`NUOPC` Layer "defines conventions and a set of generic components for building coupled models using the Earth System Modeling Framework (:term:`ESMF`)." 
- NUOPC applications are built on four generic components: driver, model, mediator, and connector. For more information, visit the `NUOPC website `__. + NUOPC applications are built on four generic components: driver, model, mediator, and connector. For more information, visit the `NUOPC website `_. NUOPC Cap NUOPC Model Cap @@ -86,10 +122,10 @@ Glossary Research and Development High-Performance Computing Systems. Spack - `Spack `__ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures. + `Spack `_ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures. spack-stack - The `spack-stack `__ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack. + The `spack-stack `_ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `_ and the `Joint Effort for Data assimilation Integration (JEDI) `_ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack. UFS The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system consisting of several applications (apps). These apps span regional to global domains and sub-hourly to seasonal time scales. The UFS is designed to support the :term:`Weather Enterprise` and to be the source system for NOAA's operational numerical weather prediction applications. For more information, visit https://ufscommunity.org/. 
diff --git a/doc/source/conf.py b/doc/source/conf.py index f6004f53..710bbcbc 100644 --- a/doc/source/conf.py +++ b/doc/source/conf.py @@ -11,9 +11,9 @@ author = ' ' # The short X.Y version -version = 'v1.2' +version = 'develop' # The full version, including alpha/beta/rc tags -release = 'v1.2.0' +release = 'develop' numfig = True @@ -113,7 +113,6 @@ def setup(app): intersphinx_mapping = { 'jedi': ('https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/1.7.0', None), 'spack-stack': ('https://spack-stack.readthedocs.io/en/1.3.0/', None), - 'gswp3': ('https://hydro.iis.u-tokyo.ac.jp/GSWP3/', None), } # -- Options for extlinks extension --------------------------------------- @@ -121,6 +120,7 @@ def setup(app): extlinks_detect_hardcoded_links = True extlinks = {'github': ('https://github.com/ufs-community/land-DA_workflow/%s', '%s'), 'github-docs': ('https://docs.github.com/en/%s', '%s'), + 'gswp3': ('https://hydro.iis.u-tokyo.ac.jp/GSWP3/%s', '%s'), 'jedi': ('https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/1.7.0/%s', '%s'), 'nco': ('https://www.nco.ncep.noaa.gov/idsb/implementation_standards/%s', '%s'), 'rocoto': ('https://christopherwharrop.github.io/rocoto/%s', '%s'), From 786d6fd99780758e93b7063685d386a119a27610 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Tue, 16 Jul 2024 15:19:09 -0400 Subject: [PATCH 16/49] add configworkflow chapter --- .../CustomizingTheWorkflow/ConfigWorkflow.rst | 26 ++++++++++++------- doc/source/CustomizingTheWorkflow/index.rst | 1 + 2 files changed, 17 insertions(+), 10 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst index 730d71d0..38ed0b77 100644 --- a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst +++ b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst @@ -170,10 +170,16 @@ Entities are constants that can be referred to throughout the workflow using the Specifies the file stub/name for orography files in ``TPATH``. This file stub is named ``oro_C${RES}`` for atmosphere-only orography files and ``oro_C{RES}.mx100`` for atmosphere and ocean orography files. When Land DA is compiled with ``sorc/app_build.sh``, the subdirectories of the fix files should be linked into the ``fix`` directory, and orography files can be found in ``fix/FV3_fix_tiled/C96``. ``DATADEP_FILE1:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc") + File name for the dependency check for the task ``pre_anal``. The ``pre_anal`` task is triggered only when one or more of the ``DATADEP_FILE#`` files exists. Otherwise, the task will not be submitted. + ``DATADEP_FILE2:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.nc") + File name for the dependency check for the task ``pre_anal``. The ``pre_anal`` task is triggered only when one or more of the ``DATADEP_FILE#`` files exists. Otherwise, the task will not be submitted. + ``DATADEP_FILE3:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc") + File name for the dependency check for the task ``pre_anal``. The ``pre_anal`` task is triggered only when one or more of the ``DATADEP_FILE#`` files exists. Otherwise, the task will not be submitted. + ``DATADEP_FILE4:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc") - File names for the dependency check for the task ``pre_anal``. This means that ``pre_anal`` is triggered only when one of them exists. 
Otherwise, the task will not be submitted.
+   File name for the dependency check for the task ``pre_anal``. The ``pre_anal`` task is triggered only when one or more of the ``DATADEP_FILE#`` files exists. Otherwise, the task will not be submitted.
+
+``DATADEP_FILE2:`` (Default: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.nc")
+   File name for the dependency check for the task ``pre_anal``. The ``pre_anal`` task is triggered only when one or more of the ``DATADEP_FILE#`` files exists. Otherwise, the task will not be submitted.
+
+``DATADEP_FILE3:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc")
+   File name for the dependency check for the task ``pre_anal``. The ``pre_anal`` task is triggered only when one or more of the ``DATADEP_FILE#`` files exists. Otherwise, the task will not be submitted.
+
+``DATADEP_FILE4:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc")
+   File name for the dependency check for the task ``pre_anal``. The ``pre_anal`` task is triggered only when one or more of the ``DATADEP_FILE#`` files exists. Otherwise, the task will not be submitted.
 
 
 NCO Directory Structure Entities
 ----------------------------------
@@ -182,7 +188,7 @@ NCO Directory Structure Entities
 Standard environment variables are defined in the NCEP Central Operations :nco:`WCOSS Implementation Standards ` document. These variables are used in forming the path to various directories containing input, output, and workflow files. For a visual aid, see the :ref:`Land DA Directory Structure Diagram `. The variables are defined in the WCOSS Implementation Standards document (pp. 4-5) as follows:
 
 ``HOMElandda:`` (Default: "&EXP_BASEDIR;/land-DA_workflow")
-   The location of the :github:`land-DA_workflow` clone.
+   The location of the :github:`land-DA_workflow <>` clone.
 
 ``PTMP:`` (Default: "&EXP_BASEDIR;/ptmp")
    User-defined path to the ``com``-type directories.
 
@@ -260,7 +266,7 @@ Each task may contain attributes (``attrs:``), just as in the overarching ``work
 
 The following subsections explain any variables that have not already been explained/defined above.
 
-.. _sample_task:
+.. _sample-task:
 
 Sample Task: Analysis Task (``task_analysis``)
 ------------------------------------------------
@@ -364,7 +370,7 @@ Miscellaneous Task Values
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-The authoritative :rocoto:`Rocoto documentation` discusses a number of miscellaneous task attributes in detail. A brief overview is provided in this section.
+The authoritative :rocoto:`Rocoto documentation <>` discusses a number of miscellaneous task attributes in detail. A brief overview is provided in this section.
 
 .. code-block:: console
 
@@ -448,14 +454,14 @@ Other tasks may list data or time dependencies. For example, the pre-analysis ta
              age: 5
              value: "&DATADEP_FILE4;"
 
-For details on the dependency tags (e.g., ``attrs:``, ``age:``, ``value:``), view the authoritative :rocoto:`Rocoto documentation`.
+For details on the dependency tags (e.g., ``attrs:``, ``age:``, ``value:``), view the authoritative :rocoto:`Rocoto documentation <>`.
 
 .. _prep-obs:
 
 Observation Preparation Task (``task_prep_obs``)
 --------------------------------------------------
 
-Parameters for the observation preparation task are set in the ``task_prep_obs:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
+Parameters for the observation preparation task are set in the ``task_prep_obs:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ.
 
 .. _pre-anal:
 
 Pre-Analysis Task (``task_pre_anal``)
 ---------------------------------------
 
-Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section.
Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. +Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. .. code-block:: console @@ -560,7 +566,7 @@ Parameters for the analysis task are set in the ``task_analysis:`` section of th Post-Analysis Task (``task_post_anal``) ----------------------------------------- -Parameters for the post analysis task are set in the ``task_post_anal:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. +Parameters for the post analysis task are set in the ``task_post_anal:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. .. code-block:: console @@ -604,7 +610,7 @@ Parameters for the post analysis task are set in the ``task_post_anal:`` section Plotting Task (``task_plot_stats``) ------------------------------------- -Parameters for the plotting task are set in the ``task_plot_stats:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. +Parameters for the plotting task are set in the ``task_plot_stats:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. .. code-block:: console @@ -644,7 +650,7 @@ Parameters for the plotting task are set in the ``task_plot_stats:`` section of Forecast Task (``task_forecast``) ---------------------------------- -Parameters for the forecast task are set in the ``task_forecast:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. +Parameters for the forecast task are set in the ``task_forecast:`` section of the ``land_analysis_.yaml`` file. Most task variables are the same as the defaults set and defined in the :ref:`Workflow Entities ` section. Variables common to all tasks are discussed in more detail in the :ref:`Sample Task ` section, although the default values may differ. .. code-block:: console diff --git a/doc/source/CustomizingTheWorkflow/index.rst b/doc/source/CustomizingTheWorkflow/index.rst index d172b27d..87607bf4 100644 --- a/doc/source/CustomizingTheWorkflow/index.rst +++ b/doc/source/CustomizingTheWorkflow/index.rst @@ -6,5 +6,6 @@ Customizing the Workflow .. 
toctree:: :maxdepth: 3 + ConfigWorkflow Model DASystem From 69ee11b34336fa8f6d9add06b239e2d173769631 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Tue, 16 Jul 2024 19:03:50 -0400 Subject: [PATCH 17/49] update file paths & misc in Model ch --- doc/source/CustomizingTheWorkflow/Model.rst | 124 +++++++++----------- 1 file changed, 53 insertions(+), 71 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/Model.rst b/doc/source/CustomizingTheWorkflow/Model.rst index d9ac52d1..3bc3a70c 100644 --- a/doc/source/CustomizingTheWorkflow/Model.rst +++ b/doc/source/CustomizingTheWorkflow/Model.rst @@ -1,8 +1,8 @@ .. _Model: -******************************** -Noah-MP Land Surface Model -******************************** +*********************************** +Input/Output Files - Noah-MP Model +*********************************** This chapter provides practical information on input files and parameters for the Noah-MP Land Surface Model (LSM) and its Vector-to-Tile Converter component. For background information on the Noah-MP LSM, see :numref:`Section %s ` of the Introduction. @@ -12,18 +12,22 @@ For background information on the Noah-MP LSM, see :numref:`Section %s ` Input Files ************** -The UFS land model requires multiple input files to run, including static datasets -(fix files containing climatological information, terrain, and land use -data), initial conditions files, and forcing files. Users may reference the `Community Noah-MP User's -Guide `_ -for a detailed technical description of certain elements of the Noah-MP model. +The UFS land model requires multiple input files to run, including static datasets (fix files containing climatological information, terrain, and land use data), initial conditions files, and forcing files. +Users may reference the `Community Noah-MP Land Surface Modeling System Technical Description Version 5.0 `_ (2023) and the `Community Noah-MP User's Guide `_ (2011) for a detailed technical description of certain elements of the Noah-MP model. In both the land component and land driver implementations of Noah-MP, static file(s) and initial conditions file(s) specify model parameters. -These files are publicly available via the `Land DA data bucket `_. +These files are publicly available in the `Land DA data bucket `_. Users can download the data and untar the file via the command line: .. _TarFile: +.. code-block:: console + + wget https://noaa-ufs-land-da-pds.s3.amazonaws.com/develop-20240501/Landda_develop_data.tar.gz + tar xvfz Landda_develop_data.tar.gz + +For data specific to the latest release (|latestr|), users can run: + .. code-block:: console wget https://noaa-ufs-land-da-pds.s3.amazonaws.com/current_land_da_release_data/v1.2.0/Landdav1.2.0_input_data.tar.gz @@ -33,38 +37,41 @@ These files and their parameters are described in the following subsections. .. note:: - * Users who wish to use the UFS land component with GSWP3 data can proceed to the :numref:`Section %s `. - * Users who wish to run the land driver implementation of Land DA with ERA5 data should proceed to :numref:`Section %s `. + * Users who wish to use the UFS land component with :term:`GSWP3` data can proceed to the :numref:`Section %s `. + * Users who wish to run the land driver implementation of Land DA with :term:`ERA5` data should proceed to :numref:`Section %s `. .. _view-netcdf-files: Viewing netCDF Files ====================== -Users can view file information and notes for NetCDF files using the ``ncdump`` module. 
First, load a compiler, MPI, and NetCDF modules:
+Users can view file information, variables, and notes for NetCDF files using the ``ncdump`` utility. On Level 1 platforms, users can load the Land DA environment from ``land-DA_workflow`` as described in :numref:`Section %s `.
+
+Then, users can run ``ncdump -h path/to/filename.nc``, where ``path/to/filename.nc`` is replaced with the path to the file. For example, on Orion, users might run:
 
 .. code-block:: console
 
-   module load intel/2022.1.2 impi/2022.1.2 netcdf/4.7.4
+   module load netcdf-c/4.9.2
+   ncdump -h /work/noaa/epic/UFS_Land-DA_Dev/inputs/NOAHMP_IC/ufs-land_C96_init_fields.tile1.nc
 
-To view information on the variables contained in a :term:`netCDF` file, users can run ``ncdump -h filename.nc``. Users will need to replace ``filename.nc`` with the actual name of the file they want to view. For example:
-.. code-block:: console
+On other systems, users can load a compiler, MPI, and NetCDF modules before running the ``ncdump`` command above. For example:
 
-   ncdump -h /path/to/ufs-land_C96_init_fields.tile1.nc
+.. code-block:: console
 
-where ``/path/to/`` is replaced by the actual path to the file. Users may also need to modify the module load command to reflect modules that are available on their system.
+   module load intel/2022.1.2 impi/2022.1.2 netcdf-c/4.9.2
+   ncdump -h /path/to/inputs/NOAHMP_IC/ufs-land_C96_init_fields.tile1.nc
 
-Alternatively, users on Level 1 platforms can load the Land DA environment, which contains the NetCDF module, from ``land-DA_workflow`` as described in :numref:`Section %s `.
+Users may need to modify the ``module load`` command to reflect modules that are available on their system.
 
 .. _datm-lnd-input-files:
 
 Input Files for the ``DATM`` + ``LND`` Configuration with GSWP3 data
 ======================================================================
 
-With the integration of the UFS Noah-MP land component into the Land DA System in the v1.2.0 release, model forcing options have been enhanced so that users can run the UFS land component (:term:`LND`) with the data atmosphere component (:term:`DATM`). Updates provide a new analysis option on the cubed-sphere native grid using :term:`GSWP3` forcing data to run a single-day experiment for 2000-01-03. An artificial GHCN snow depth observation is provided for data assimilation (see :numref:`Section %s ` for more on GHCN files). The GHCN observations will be extended in the near future. A new configuration setting file is also provided (``settings_DA_cycle_gswp3``).
+With the integration of the UFS Noah-MP land component into the Land DA System in the v1.2.0 release, model forcing options have been enhanced so that users can run the UFS land component (:term:`LND`) with the data atmosphere component (:term:`DATM`). Updates provide a new analysis option on the cubed-sphere native grid using :term:`GSWP3` forcing data to run a cycled experiment for 2000-01-03 to 2000-01-04. An artificial GHCN snow depth observation is provided for data assimilation (see :numref:`Section %s ` for more on GHCN files). The GHCN observations will be extended in the near future.
 
-On Level 1 platforms, the requisite data is pre-staged at the locations listed in :numref:`Section %s `. The data are also publicly available via the `Land DA Data Bucket `_.
+On Level 1 platforms, the requisite data are pre-staged at the locations listed in :numref:`Section %s `. The data are also publicly available via the `Land DA Data Bucket `_.
 
.. 
attention:: @@ -73,7 +80,7 @@ On Level 1 platforms, the requisite data is pre-staged at the locations listed i Forcing Files --------------- -:term:`Forcing files` for the land component configuration come from the Global Soil Wetness Project Phase 3 (`GSWP3 `_) dataset. They are located in the ``inputs/UFS_WM/DATM_GSWP3_input_data`` directory (downloaded :ref:`above `). +:term:`Forcing files` for the land component configuration come from the Global Soil Wetness Project Phase 3 dataset. They are located in the ``inputs/DATM_input_data/gswp3`` directory (downloaded :ref:`above `). .. code-block:: console @@ -96,15 +103,11 @@ Noah-MP Initial Conditions The offline Land DA System currently only supports snow DA. The initial conditions files include the initial state variables that are required for the UFS land snow DA to begin a cycling run. The data must be provided in :term:`netCDF` format. -By default, on Level 1 systems and in the Land DA data bucket, the initial conditions files are located at ``inputs/UFS_WM/NOAHMP_IC`` (downloaded :ref:`above `). Each file corresponds to one of the six tiles of the `global FV3 grid `_. - -.. code-block:: console - - ufs-land_C96_init_fields.tile*.nc +By default, on Level 1 systems and in the Land DA data bucket, the initial conditions files are located at ``inputs/NOAHMP_IC/ufs-land_C96_init_fields.tile*.nc`` (downloaded :ref:`above `). Each file corresponds to one of the six tiles of the `global FV3 grid `_. The files contain the following data: -.. list-table:: *Variables specified in the initial conditions file ``ufs-land_C96_init_fields.tile*.nc``* +.. list-table:: *Variables specified in the initial conditions file ufs-land_C96_init_fields.tile*.nc* :header-rows: 1 * - Variables @@ -145,29 +148,23 @@ The files contain the following data: FV3_fix_tiled Files --------------------- -The UFS land component also requires a series of tiled static (fix) files that will be used by the component model. These files contain information on maximum snow albedo, slope type, soil color and type, substrate temperature, vegetation greenness and type, and orography (grid and land mask information). These files are located in the ``inputs/UFS_WM/FV3_fix_tiled/C96/`` directory (downloaded :ref:`above `). +The UFS land component also requires a series of tiled static (fix) files that will be used by the component model. These files contain information on maximum snow albedo, slope type, soil color and type, substrate temperature, vegetation greenness and type, and orography (grid and land mask information). These files are located in the ``inputs/FV3_fix_tiled/C96`` directory (downloaded :ref:`above `). .. code-block:: console + C96.facsf.tile*.nc + C96_grid.tile*.nc C96.maximum_snow_albedo.tile*.nc C96.slope_type.tile*.nc + C96.snowfree_albedo.tile*.nc C96.soil_type.tile*.nc C96.soil_color.tile*.nc C96.substrate_temperature.tile*.nc C96.vegetation_greenness.tile*.nc C96.vegetation_type.tile*.nc + grid_spec.nc oro_C96.mx100.tile*.nc -FV3_input_data ----------------- - -The ``FV3_input_data`` directory contains grid information used by the model. This grid information is located in ``inputs/UFS_WM/FV3_input_data/INPUT`` (downloaded :ref:`above `). - -.. code-block:: console - - C96_grid.tile*.nc - grid_spec.nc # aka C96.mosaic.nc - The ``C96_grid.tile*.nc`` files contain grid information for tiles 1-6 at C96 grid resolution. The ``grid_spec.nc`` file contains information on the mosaic grid. .. 
note:: @@ -195,7 +192,7 @@ The static file is available in the ``inputs`` data directory (downloaded :ref:` .. code-block:: - inputs/forcing/era5/static/ufs-land_C96_static_fields.nc + inputs/static/ufs-land_C96_static_fields.nc .. table:: *Configuration variables specified in the static file* (ufs-land_C96_static_fields.nc) @@ -303,11 +300,11 @@ The UFS land model uses a series of template files combined with user-selected settings to create required namelists and parameter files needed by the UFS Land DA workflow. This section describes the options in the ``ufs-land.namelist.noahmp`` file, which is generated -from the ``template.ufs-noahMP.namelist.*`` file. +from the ``template.ufs-noahMP.namelist.era5`` file. .. note:: - Any default values indicated are the defaults set in the ``template.ufs-noahMP.namelist.*`` files. + Any default values indicated are the defaults set in the ``template.ufs-noahMP.namelist.era5`` files. Run Setup Parameters ^^^^^^^^^^^^^^^^^^^^^^ @@ -336,6 +333,9 @@ Run Setup Parameters ``output_dir`` Specifies the output directory where output files will be saved. If ``separate_output=.true.``, but no ``output_dir`` is specified, it will default to the directory where the executable is run. +``output_frequency_s`` + Specifies the output frequency (in seconds) for the UFS land model. + ``restart_frequency_s`` Specifies the restart frequency (in seconds) for the UFS land model. @@ -386,6 +386,12 @@ Run Setup Parameters ``run_timesteps`` Specifies the number of timesteps to run. +``location_start`` +.. COMMENT: Add definition! + +``location_end`` +.. COMMENT: Add definition! + Land Model Options ^^^^^^^^^^^^^^^^^^^^^ @@ -448,15 +454,6 @@ Noah-MP Options | 10 | crop model on (use maximum vegetation fraction) | +-------+------------------------------------------------------------+ -``LAI`` - Routines for handling Leaf/Stem area index data products - -``FVEG`` - Green vegetation fraction [0.0-1.0] - -``SHDFAC`` - Greenness vegetation (shaded) fraction - ``canopy_stomatal_resistance_option``: (Default: ``2``) Specifies the canopy stomatal resistance option. Valid values: ``1`` | ``2`` @@ -573,12 +570,6 @@ Noah-MP Options | 4 | Use WRF microphysics output | +--------+-----------------------------+ -``SFCTMP`` - Surface air temperature - -``TFRZ`` - Freezing/melting point (K) - ``soil_temp_lower_bdy_option``: (Default: ``2``) Specifies the lower boundary condition of soil temperature option. Valid values: ``1`` | ``2`` @@ -590,12 +581,6 @@ Noah-MP Options | 2 | TBOT at ZBOT (8m) read from a file (original Noah) | +--------+---------------------------------------------------------+ -``TBOT`` - Lower boundary soil temperature [K] - -``ZBOT`` - Depth[m] of lower boundary soil temperature (TBOT) - ``soil_temp_time_scheme_option``: (Default: ``3``) Specifies the snow and soil temperature time scheme. Valid values: ``1`` | ``2`` | ``3`` @@ -609,12 +594,6 @@ Noah-MP Options | 3 | same as 1, but FSNO for TS calculation (generally improves snow; v3.7) | +--------+------------------------------------------------------------------------+ -``FSNO`` - Fraction of surface covered with snow - -``TS`` - Surface temperature - ``thermal_roughness_scheme_option``: (Default: ``2``) Specifies the method/scheme used to calculate the thermal roughness length. 
Valid values: ``1`` | ``2`` | ``3`` | ``4`` @@ -645,9 +624,6 @@ Noah-MP Options | 4 | option 1 for non-snow; rsurf = rsurf_snow for snow | +----------------+-----------------------------------------------------+ -``rsurf`` - Ground surface resistance (s/m) - ``glacier_option``: (Default: ``1``) Specifies the glacier model option. Valid values: ``1`` | ``2`` @@ -665,6 +641,12 @@ Forcing Parameters ``forcing_timestep_seconds``: (Default: ``3600``) Specifies the forcing timestep in seconds. +``forcing_regrid:`` (Default: "none") +.. COMMENT: Add definition! + +``forcing_regrid_weights_filename:`` (Default: "") +.. COMMENT: Add definition! + ``forcing_type`` Specifies the forcing type option, which describes the frequency and length of forcing in each forcing file. Valid values: ``single-point`` | ``gswp3`` | ``gdas`` From 12ccb8b109dce398578ac409ad0bb5bc095ce679 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Tue, 16 Jul 2024 19:12:38 -0400 Subject: [PATCH 18/49] misc minor update in Model ch --- doc/source/CustomizingTheWorkflow/Model.rst | 9 ++++++--- 1 file changed, 6 insertions(+), 3 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/Model.rst b/doc/source/CustomizingTheWorkflow/Model.rst index 3bc3a70c..310db279 100644 --- a/doc/source/CustomizingTheWorkflow/Model.rst +++ b/doc/source/CustomizingTheWorkflow/Model.rst @@ -838,6 +838,7 @@ The input files containing grid information are listed in :numref:`Table %s Date: Wed, 17 Jul 2024 12:26:32 -0400 Subject: [PATCH 19/49] Update DA ch --- .../CustomizingTheWorkflow/DASystem.rst | 41 +++++++++---------- 1 file changed, 19 insertions(+), 22 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst index ac742e93..7605244d 100644 --- a/doc/source/CustomizingTheWorkflow/DASystem.rst +++ b/doc/source/CustomizingTheWorkflow/DASystem.rst @@ -13,11 +13,11 @@ Joint Effort for Data Assimilation Integration (JEDI) Users are encouraged to visit the :jedi:`JEDI Documentation `. Much of the information in this chapter is drawn directly from there with modifications to clarify JEDI's use specifically in the context of the Land DA System. -The Joint Effort for Data assimilation Integration (:term:`JEDI`) is a unified and versatile :term:`data assimilation` (DA) system for Earth System Prediction that can be run on a variety of platforms. JEDI is developed by the Joint Center for Satellite Data Assimilation (`JCSDA `__) and partner agencies, including NOAA. The core feature of JEDI is separation of concerns. The data assimilation update, observation selection and processing, and observation operators are all coded with no knowledge of or dependency on each other or on the forecast model. +The Joint Effort for Data assimilation Integration (:term:`JEDI`) is a unified and versatile :term:`data assimilation` (DA) system for Earth system prediction that can be run on a variety of platforms. JEDI is developed by the Joint Center for Satellite Data Assimilation (`JCSDA `_) and partner agencies, including NOAA. The core feature of JEDI is separation of concerns. The data assimilation update, observation selection and processing, and observation operators are all coded with no knowledge of or dependency on each other or on the forecast model. 
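+This separation of concerns carries over into the configuration: each of the three components listed below is driven by its own section of the assembled JEDI YAML rather than by model code. As a rough orientation sketch (the key names follow the example configuration shown later in this chapter; the grouping comments are illustrative annotations, not part of any shipped file):
+
+.. code-block:: yaml
+
+   # Illustrative grouping only -- see the full example later in this chapter.
+   local ensemble DA:    # OOPS: the DA algorithm (LETKF solver and inflation)
+   observations:
+     observers:
+     - obs space:        # IODA: observation data handling (input/output files)
+       obs operator:     # UFO: computes H(x) from the model state
+       obs filters:      # UFO: quality control filters
+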
The Noah-MP offline Land DA System uses three JEDI components:
 
-   * The Object-Oriented Prediction System (:ref:`OOPS `) for the data assimilation algorithm
+   * The Object-Oriented Prediction System (:jedi:`OOPS `) for the data assimilation algorithm
    * The Interface for Observation Data Access (:jedi:`IODA `) for the observation formatting and processing
    * The Unified Forward Operator (:jedi:`UFO `) for comparing model forecasts and observations
 
@@ -26,14 +26,14 @@ JEDI's Unified Forward Operator (UFO) links observation operators with the Objec
 Object-Oriented Prediction System (OOPS)
 ===========================================
 
-A data assimilation experiment requires a ``.yaml`` configuration file that specifies the details of the data assimilation and observation processing. OOPS provides the core set of data assimilation algorithms in JEDI by combining the generic building blocks required for the algorithms. The OOPS system does not require knowledge of any specific application model implementation structure or observation data information. In the Noah-MP offline Land DA System, OOPS reads the model forecast states from the restart files generated by the Noah-MP model. JEDI UFO contains generic quality control options and filters that can be applied to each observation system, without coding at certain model application levels. More information on the key concepts of the JEDI software design can be found in :cite:t:`Tremolet&Auligne2020` (2020), :cite:t:`HoldawayEtAl2020` (2020), and :cite:t:`HoneyagerEtAl2020` (2020).
+A data assimilation experiment requires a YAML configuration file that specifies the details of the data assimilation and observation processing. OOPS provides the core set of data assimilation algorithms in JEDI by combining the generic building blocks required for the algorithms. The OOPS system does not require knowledge of any specific application model implementation structure or observation data information. In the Noah-MP offline Land DA System, OOPS reads the model forecast states from the restart files generated by the Noah-MP model. JEDI UFO contains generic quality control options and filters that can be applied to each observation system, without coding at certain model application levels. More information on the key concepts of the JEDI software design can be found in :cite:t:`Tremolet&Auligne2020` (2020), :cite:t:`HoldawayEtAl2020` (2020), and :cite:t:`HoneyagerEtAl2020` (2020).
 
 JEDI Configuration Files & Parameters
 ----------------------------------------
 
-To create the DA experiment, the user should create or modify an experiment-specific configuration ``.yaml`` file. This ``.yaml`` file should contain certain fundamental components: geometry, window begin, window length, background, driver, local ensemble DA, output increment, and observations. These components can be implemented differently for different models and observation types, so they frequently contain distinct parameters and variable names depending on the use case. Therefore, this section of the User's Guide focuses on assisting users with understanding and customizing these top-level configuration items in order to run Land DA experiments. Users may also reference the :jedi:`JEDI Documentation ` for additional information.
+The DA experiment integrates information from several YAML configuration files, which contain certain fundamental components such as geometry, time window, background, driver, local ensemble DA, output increment, and observations. 
These components can be implemented differently for different models and observation types, so they frequently contain distinct parameters and variable names depending on the use case. Therefore, this section of the User's Guide focuses on assisting users with understanding and customizing these top-level configuration items in order to run Land DA experiments. Users may also reference the :jedi:`JEDI Documentation ` for additional information. -Users may find the following example ``GHCN.yaml`` configuration file to be a helpful starting point. A similar file (with user-appropriate modifications) is required by JEDI for snow data assimilation. The following subsections will explain the variables within each top-level item of the ``.yaml`` file. The ``GHCN.yaml`` file for the |latestr| release can be found within the cloned repository at ``DA_update/jedi/fv3-jedi/yaml_files/psl_develop/GHCN.yaml``. +In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information on geometry, time window, background, driver, local ensemble DA, and output increment, while ``GHCN.yaml`` contains detailed information to configure observations. In the ``develop`` branch, :github:`these files ` reside in the ``land-DA_workflow/parm/jedi`` directory. Some of the variables in these files are templated, so they bring in information from other files, such as the workflow configuration file (``land_analysis.yaml``) and the actual netCDF observation file (e.g., ``ghcn_snwd_ioda_20000103.nc``). In the ``analysis`` task, this information is assembled into one ``hofx_land.yaml`` file that is used to perform the snow data assimilation analysis. The example below shows what the complete ``hofx_land.yaml`` file might look like. The following subsections explain the variables used within this YAML file. .. code-block:: yaml @@ -42,18 +42,19 @@ Users may find the following example ``GHCN.yaml`` configuration file to be a he namelist filename: Data/fv3files/fmsmpp.nml field table filename: Data/fv3files/field_table akbk: Data/fv3files/akbk64.nc4 - npx: 49 - npy: 49 + npx: 97 # $RES + 1 + npy: 97 # $RES + 1 npz: 64 - field metadata override: Data/fieldmetadata/gfs-land.yaml + field metadata override: gfs-land.yaml time invariant fields: state fields: datetime: 2019-12-21T00:00:00Z filetype: fms restart skip coupler file: true state variables: [orog_filt] - datapath: /scratch2/NAGAPE/epic/UFS_Land-DA/inputs/forcing/era5/orog_files - filename_orog: oro_C96.mx100.nc + datapath: /scratch2/NAGAPE/epic/UFS_Land-DA_Dev/inputs/FV3_fix_tiled/C96 + filename_orog: oro_C96.mx100 + derived fields: [nominal_surface_pressure] window begin: 2019-12-21T00:00:00Z window length: PT24H @@ -123,23 +124,19 @@ Users may find the following example ``GHCN.yaml`` configuration file to be a he filter variables: - name: totalSnowDepth minvalue: 0.0 + maxvalue: 10000.0 - filter: Domain Check # missing station elevation (-999.9) where: - variable: - name: MetaData/height + name: MetaData/stationElevation minvalue: -999.0 + maxvalue: 10000.0 - filter: Domain Check # land only where: - variable: name: GeoVaLs/slmsk minvalue: 0.5 maxvalue: 1.5 - # GFSv17 only. - #- filter: Domain Check # no sea ice - # where: - # - variable: - # name: fraction_of_ice@GeoVaLs - # maxvalue: 0.0 - filter: RejectList # no land-ice where: - variable: @@ -155,7 +152,7 @@ Users may find the following example ``GHCN.yaml`` configuration file to be a he .. 
note:: - Any default values indicated in the sections below are the defaults set in ``letkfoi_snow.yaml`` or ``GHCN.yaml`` (found within the ``DA_update/jedi/fv3-jedi/yaml_files/psl_develop`` directory). + Any default values indicated in the sections below are the defaults set in ``letkfoi_snow.yaml``, ``GHCN.yaml``, or ``land_analysis.yaml``. Geometry ^^^^^^^^^^^ @@ -175,10 +172,10 @@ The ``geometry:`` section is used in JEDI configuration files to specify the mod Specifies the path to a file containing the coefficients that define the hybrid sigma-pressure vertical coordinate used in FV3. Files are provided with the repository containing ``ak`` and ``bk`` for some common choices of vertical resolution for GEOS and GFS. ``npx`` - Specifies the number of grid cells in the east-west direction. + Specifies the number of grid points in the east-west direction. ``npy`` - Specifies the number of grid cells in the north-south direction. + Specifies the number of grid points in the north-south direction. ``npz`` Specifies the number of vertical layers. @@ -342,7 +339,7 @@ The ``observations:`` item describes one or more types of observations, each of ``obs space:`` ```````````````` -The ``obs space:`` section of the ``.yaml`` comes under the ``observations.observers:`` section and describes the configuration of the observation space. An observation space handles observation data for a single observation type. +The ``obs space:`` section of the YAML comes under the ``observations.observers:`` section and describes the configuration of the observation space. An observation space handles observation data for a single observation type. ``name`` Specifies the name of observation space. The Land DA System uses ``Simulate`` for the default case. @@ -521,7 +518,7 @@ IODA provides a unified, model-agnostic method of sharing observation data and e The IODA file format represents observational field variables (e.g., temperature, salinity, humidity) and locations in two-dimensional tables, where the variables are represented by columns and the locations by rows. Metadata tables are associated with each axis of these data tables, and the location metadata hold the values describing each location (e.g., latitude, longitude). Actual data values are contained in a third dimension of the IODA data table; for instance: observation values, observation error, quality control flags, and simulated observation (H(x)) values. -Since the raw observational data come in various formats, a diverse set of "IODA converters" exists to transform the raw observation data files into IODA format. While many of these Python-based IODA converters have been developed to handle marine-based observations, users can utilize the "IODA converter engine" components to develop and implement their own IODA converters to prepare arbitrary observation types for data assimilation within JEDI. (See https://github.com/NOAA-PSL/land-DA_update/blob/develop/jedi/ioda/imsfv3_scf2ioda_obs40.py for the Land DA IMS IODA converter.) +Since the raw observational data come in various formats, a diverse set of "IODA converters" exists to transform the raw observation data files into IODA format. While many of these Python-based IODA converters have been developed to handle marine-based observations, users can utilize the "IODA converter engine" components to develop and implement their own IODA converters to prepare arbitrary observation types for data assimilation within JEDI. 
(See https://github.com/NOAA-PSL/land-DA_update/blob/develop/jedi/ioda/imsfv3_scf2iodaTemp.py for the Land DA IMS IODA converter.) Input Files From 21ae278ffd264c2deb7efc0961827faaa020840e Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 17 Jul 2024 12:45:48 -0400 Subject: [PATCH 20/49] Update letkf_land.yaml info --- .../CustomizingTheWorkflow/DASystem.rst | 31 ++++++++++--------- 1 file changed, 16 insertions(+), 15 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst index 7605244d..45a28183 100644 --- a/doc/source/CustomizingTheWorkflow/DASystem.rst +++ b/doc/source/CustomizingTheWorkflow/DASystem.rst @@ -33,7 +33,7 @@ JEDI Configuration Files & Parameters The DA experiment integrates information from several YAML configuration files, which contain certain fundamental components such as geometry, time window, background, driver, local ensemble DA, output increment, and observations. These components can be implemented differently for different models and observation types, so they frequently contain distinct parameters and variable names depending on the use case. Therefore, this section of the User's Guide focuses on assisting users with understanding and customizing these top-level configuration items in order to run Land DA experiments. Users may also reference the :jedi:`JEDI Documentation ` for additional information. -In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information on geometry, time window, background, driver, local ensemble DA, and output increment, while ``GHCN.yaml`` contains detailed information to configure observations. In the ``develop`` branch, :github:`these files ` reside in the ``land-DA_workflow/parm/jedi`` directory. Some of the variables in these files are templated, so they bring in information from other files, such as the workflow configuration file (``land_analysis.yaml``) and the actual netCDF observation file (e.g., ``ghcn_snwd_ioda_20000103.nc``). In the ``analysis`` task, this information is assembled into one ``hofx_land.yaml`` file that is used to perform the snow data assimilation analysis. The example below shows what the complete ``hofx_land.yaml`` file might look like. The following subsections explain the variables used within this YAML file. +In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information on geometry, time window, background, driver, local ensemble DA, and output increment, while ``GHCN.yaml`` contains detailed information to configure observations. In the ``develop`` branch, :github:`these files ` reside in the ``land-DA_workflow/parm/jedi`` directory. Some of the variables in these files are templated, so they bring in information from other files, such as the workflow configuration file (``land_analysis.yaml``) and the actual netCDF observation file (e.g., ``ghcn_snwd_ioda_20000103.nc``). In the ``analysis`` task, this information is assembled into one ``letkf_land.yaml`` file that is used to perform the snow data assimilation. This file resides in the ``ptmp/test/tmp/analysis.${PDY}${cyc}.${jobid}/`` directory, where ``${PDY}${cyc}`` is in YYYYMMDDHH format (see :numref:`Section %s ` for more on these variables), and the ``${jobid}`` is assigned by the system. The example below shows what the complete ``letkf_land.yaml`` file might look like for the 2000-01-03 00Z cycle. The following subsections explain the variables used within this YAML file. .. 
code-block:: yaml @@ -48,32 +48,33 @@ In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information field metadata override: gfs-land.yaml time invariant fields: state fields: - datetime: 2019-12-21T00:00:00Z + datetime: 2000-01-02T00:00:00Z filetype: fms restart skip coupler file: true state variables: [orog_filt] - datapath: /scratch2/NAGAPE/epic/UFS_Land-DA_Dev/inputs/FV3_fix_tiled/C96 - filename_orog: oro_C96.mx100 + datapath: /scratch2/NAGAPE/epic/User.Name/landda/land-DA_workflow/fix/FV3_fix_tiled/C96 + filename_orog: oro_C96.mx100.nc derived fields: [nominal_surface_pressure] - window begin: 2019-12-21T00:00:00Z - window length: PT24H + time window: + begin: 2000-01-02T00:00:00Z + length: PT24H background: - date: &date 2019-12-21T00:00:00Z + date: &date 2000-01-03T00:00:00Z members: - - datetime: 2019-12-21T00:00:00Z + - datetime: 2000-01-03T00:00:00Z filetype: fms restart state variables: [snwdph,vtype,slmsk] datapath: mem_pos/ - filename_sfcd: 20191221.000000.sfc_data.nc - filename_cplr: 20191221.000000.coupler.res - - datetime: 2019-12-21T00:00:00Z + filename_sfcd: 20000103.000000.sfc_data.nc + filename_cplr: 20000103.000000.coupler.res + - datetime: 2000-01-03T00:00:00Z filetype: fms restart state variables: [snwdph,vtype,slmsk] datapath: mem_neg/ - filename_sfcd: 20191221.000000.sfc_data.nc - filename_cplr: 20191221.000000.coupler.res + filename_sfcd: 20000103.000000.sfc_data.nc + filename_cplr: 20000103.000000.coupler.res driver: save posterior mean: false @@ -103,11 +104,11 @@ In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information obsdatain: engine: type: H5File - obsfile: GHCN_2019122100.nc + obsfile: GHCN_2000010300.nc obsdataout: engine: type: H5File - obsfile: output/DA/hofx/letkf_hofx_ghcn_2019122100.nc + obsfile: output/DA/hofx/letkf_hofx_ghcn_2000010300.nc obs operator: name: Identity obs error: From 03692d283001ceecd86bb1e94bf9076953fdb71d Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 17 Jul 2024 12:48:46 -0400 Subject: [PATCH 21/49] fix broken intersphinx link --- doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst | 1 + doc/source/CustomizingTheWorkflow/DASystem.rst | 2 +- 2 files changed, 2 insertions(+), 1 deletion(-) diff --git a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst index 38ed0b77..48ceafc7 100644 --- a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst +++ b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst @@ -181,6 +181,7 @@ Entities are constants that can be referred to throughout the workflow using the ``DATADEP_FILE4:`` (Default: "&DATAROOT;/DATA_SHARE/RESTART/ufs_land_restart.@Y-@m-@d_@H-00-00.nc") File name for the dependency check for the task ``pre_anal``. The ``pre_anal`` task is triggered only when one or more of the ``DATADEP_FILE#`` files exists. Otherwise, the task will not be submitted. +.. _nco-dir-entities: NCO Directory Structure Entities ---------------------------------- diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst index 45a28183..44c1b085 100644 --- a/doc/source/CustomizingTheWorkflow/DASystem.rst +++ b/doc/source/CustomizingTheWorkflow/DASystem.rst @@ -33,7 +33,7 @@ JEDI Configuration Files & Parameters The DA experiment integrates information from several YAML configuration files, which contain certain fundamental components such as geometry, time window, background, driver, local ensemble DA, output increment, and observations. 
These components can be implemented differently for different models and observation types, so they frequently contain distinct parameters and variable names depending on the use case. Therefore, this section of the User's Guide focuses on assisting users with understanding and customizing these top-level configuration items in order to run Land DA experiments. Users may also reference the :jedi:`JEDI Documentation ` for additional information. -In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information on geometry, time window, background, driver, local ensemble DA, and output increment, while ``GHCN.yaml`` contains detailed information to configure observations. In the ``develop`` branch, :github:`these files ` reside in the ``land-DA_workflow/parm/jedi`` directory. Some of the variables in these files are templated, so they bring in information from other files, such as the workflow configuration file (``land_analysis.yaml``) and the actual netCDF observation file (e.g., ``ghcn_snwd_ioda_20000103.nc``). In the ``analysis`` task, this information is assembled into one ``letkf_land.yaml`` file that is used to perform the snow data assimilation. This file resides in the ``ptmp/test/tmp/analysis.${PDY}${cyc}.${jobid}/`` directory, where ``${PDY}${cyc}`` is in YYYYMMDDHH format (see :numref:`Section %s ` for more on these variables), and the ``${jobid}`` is assigned by the system. The example below shows what the complete ``letkf_land.yaml`` file might look like for the 2000-01-03 00Z cycle. The following subsections explain the variables used within this YAML file. +In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information on geometry, time window, background, driver, local ensemble DA, and output increment, while ``GHCN.yaml`` contains detailed information to configure observations. In the ``develop`` branch, :github:`these files ` reside in the ``land-DA_workflow/parm/jedi`` directory. Some of the variables in these files are templated, so they bring in information from other files, such as the workflow configuration file (``land_analysis.yaml``) and the actual netCDF observation file (e.g., ``ghcn_snwd_ioda_20000103.nc``). In the ``analysis`` task, this information is assembled into one ``letkf_land.yaml`` file that is used to perform the snow data assimilation. This file resides in the ``ptmp/test/tmp/analysis.${PDY}${cyc}.${jobid}/`` directory, where ``${PDY}${cyc}`` is in YYYYMMDDHH format (see :numref:`Section %s ` for more on these variables), and the ``${jobid}`` is the job ID assigned by the system. The example below shows what the complete ``letkf_land.yaml`` file might look like for the 2000-01-03 00Z cycle. The following subsections explain the variables used within this YAML file. .. 
code-block:: yaml From 3df81b18b5bba1fa2f22b63a112cf3aa3ffcfa04 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 17 Jul 2024 13:00:50 -0400 Subject: [PATCH 22/49] update jedi doc version --- doc/source/conf.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/source/conf.py b/doc/source/conf.py index 710bbcbc..da3736bc 100644 --- a/doc/source/conf.py +++ b/doc/source/conf.py @@ -121,7 +121,7 @@ def setup(app): extlinks = {'github': ('https://github.com/ufs-community/land-DA_workflow/%s', '%s'), 'github-docs': ('https://docs.github.com/en/%s', '%s'), 'gswp3': ('https://hydro.iis.u-tokyo.ac.jp/GSWP3/%s', '%s'), - 'jedi': ('https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/1.7.0/%s', '%s'), + 'jedi': ('https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/7.0.0/%s', '%s'), 'nco': ('https://www.nco.ncep.noaa.gov/idsb/implementation_standards/%s', '%s'), 'rocoto': ('https://christopherwharrop.github.io/rocoto/%s', '%s'), 'rst': ('https://www.sphinx-doc.org/en/master/usage/restructuredtext/%s', '%s'), From ed9465acbaa304a2c36b46cfb853be6b2b07db2b Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 17 Jul 2024 15:38:24 -0400 Subject: [PATCH 23/49] DA ch updates --- .../CustomizingTheWorkflow/DASystem.rst | 326 ++++++++---------- doc/source/conf.py | 2 +- 2 files changed, 137 insertions(+), 191 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst index 44c1b085..2359a143 100644 --- a/doc/source/CustomizingTheWorkflow/DASystem.rst +++ b/doc/source/CustomizingTheWorkflow/DASystem.rst @@ -28,12 +28,14 @@ Object-Oriented Prediction System (OOPS) A data assimilation experiment requires a YAML configuration file that specifies the details of the data assimilation and observation processing. OOPS provides the core set of data assimilation algorithms in JEDI by combining the generic building blocks required for the algorithms. The OOPS system does not require knowledge of any specific application model implementation structure or observation data information. In the Noah-MP offline Land DA System, OOPS reads the model forecast states from the restart files generated by the Noah-MP model. JEDI UFO contains generic quality control options and filters that can be applied to each observation system, without coding at certain model application levels. More information on the key concepts of the JEDI software design can be found in :cite:t:`Tremolet&Auligne2020` (2020), :cite:t:`HoldawayEtAl2020` (2020), and :cite:t:`HoneyagerEtAl2020` (2020). +.. _jedi-config-and-params: + JEDI Configuration Files & Parameters ---------------------------------------- The DA experiment integrates information from several YAML configuration files, which contain certain fundamental components such as geometry, time window, background, driver, local ensemble DA, output increment, and observations. These components can be implemented differently for different models and observation types, so they frequently contain distinct parameters and variable names depending on the use case. Therefore, this section of the User's Guide focuses on assisting users with understanding and customizing these top-level configuration items in order to run Land DA experiments. Users may also reference the :jedi:`JEDI Documentation ` for additional information. 
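+Before turning to the individual files, it may help to see the overall shape of the assembled configuration. The skeletal sketch below lists only the top-level keys named above, with all values elided; the full worked example appears just below.
+
+.. code-block:: yaml
+
+   # Skeleton of the assembled JEDI configuration (values elided).
+   geometry:            # model grid and static fields
+   time window:         # assimilation window start and length
+   background:          # prior state(s) from the previous cycle
+   driver:              # optional LETKF driver behavior
+   local ensemble DA:   # solver and inflation settings
+   output increment:    # analysis increment output file
+   observations:        # obs spaces, operators, errors, and QC filters
+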
-In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information on geometry, time window, background, driver, local ensemble DA, and output increment, while ``GHCN.yaml`` contains detailed information to configure observations. In the ``develop`` branch, :github:`these files ` reside in the ``land-DA_workflow/parm/jedi`` directory. Some of the variables in these files are templated, so they bring in information from other files, such as the workflow configuration file (``land_analysis.yaml``) and the actual netCDF observation file (e.g., ``ghcn_snwd_ioda_20000103.nc``). In the ``analysis`` task, this information is assembled into one ``letkf_land.yaml`` file that is used to perform the snow data assimilation. This file resides in the ``ptmp/test/tmp/analysis.${PDY}${cyc}.${jobid}/`` directory, where ``${PDY}${cyc}`` is in YYYYMMDDHH format (see :numref:`Section %s ` for more on these variables), and the ``${jobid}`` is the job ID assigned by the system. The example below shows what the complete ``letkf_land.yaml`` file might look like for the 2000-01-03 00Z cycle. The following subsections explain the variables used within this YAML file. +In the Land DA workflow, ``letkfoi_snow.yaml`` contains most of the information on geometry, time window, background, driver, local ensemble DA, and output increment, while ``GHCN.yaml`` contains detailed information to configure observations. In the ``develop`` branch, :github:`these files ` reside in the ``land-DA_workflow/parm/jedi`` directory. Some of the variables in these files are templated, so they bring in information from other files, such as the workflow configuration file (``land_analysis.yaml``) and the actual netCDF observation file (e.g., ``ghcn_snwd_ioda_20000103.nc``). In the ``analysis`` task, this information is assembled into one ``letkf_land.yaml`` file that is used to perform the snow data assimilation. This file resides in the ``ptmp/test/tmp/analysis.${PDY}${cyc}.${jobid}/`` directory, where ``${PDY}${cyc}`` is in YYYYMMDDHH format (see :numref:`Section %s ` for more on these variables), and the ``${jobid}`` is the job ID assigned by the system. The example below shows what the complete ``letkf_land.yaml`` file might look like for the 2000-01-03 00Z cycle. The following subsections explain the variables used within this YAML file. .. code-block:: yaml @@ -161,104 +163,114 @@ Geometry The ``geometry:`` section is used in JEDI configuration files to specify the model grid's parallelization across compute nodes (horizontal and vertical). ``fms initialization`` - This section contains two parameters, ``namelist filename`` and ``field table filename``. + This section contains two parameters, ``namelist filename`` and ``field table filename``, which are required for :term:`FMS` initialization. - ``namelist filename`` - Specifies the path for the namelist filename. + ``namelist filename`` (Default: Data/fv3files/fmsmpp.nml) + Specifies the path to the namelist filename. - ``field table filename`` - Specifies the path for the field table filename. + ``field table filename`` (Default: Data/fv3files/field_table) + Specifies the path to the field table filename. - ``akbk`` + ``akbk`` (Default: Data/fv3files/akbk64.nc4) Specifies the path to a file containing the coefficients that define the hybrid sigma-pressure vertical coordinate used in FV3. Files are provided with the repository containing ``ak`` and ``bk`` for some common choices of vertical resolution for GEOS and GFS. 
- ``npx`` + ``npx`` (Default: 97) Specifies the number of grid points in the east-west direction. - ``npy`` + ``npy`` (Default: 97) Specifies the number of grid points in the north-south direction. - ``npz`` + ``npz`` (Default: 64) Specifies the number of vertical layers. - ``field metadata override`` - Specifies the path for file metadata. + ``field metadata override`` (Default: gfs-land.yaml) + Specifies the path to field metadata file. - ``time invariant state fields`` - This parameter contains several subparameters listed below. + ``time invariant fields`` + This YAML section contains state fields and derived fields. + ``state fields:`` + This parameter contains several subparameters listed below. - ``datetime`` + ``datetime`` (Default: XXYYYP-XXMP-XXDPTXXHP:00:00Z) Specifies the time in YYYY-MM-DDTHH:00:00Z format, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour. - ``filetype`` + ``filetype`` (Default: fms restart) Specifies the type of file. Valid values include: ``fms restart`` - ``skip coupler file`` + ``skip coupler file`` (Default: true) Specifies whether to enable skipping coupler file. Valid values are: ``true`` | ``false`` - +--------+-----------------+ - | Value | Description | - +========+=================+ - | true | enable | - +--------+-----------------+ - | false | do not enable | - +--------+-----------------+ + +--------+-----------------+ + | Value | Description | + +========+=================+ + | true | enable | + +--------+-----------------+ + | false | do not enable | + +--------+-----------------+ - ``state variables`` + ``state variables`` (Default: [orog_filt]) Specifies the list of state variables. Valid values include: ``[orog_filt]`` - ``datapath`` + ``datapath`` (Default: $LANDDAROOT/land-DA_workflow/fix/FV3_fix_tiled/C96) Specifies the path for state variables data. - ``filename_orog`` + ``filename_orog`` (Default: oro_C96.mx100.nc) Specifies the name of orographic data file. + ``derived fields:`` (Default: [nominal_surface_pressure]) + .. COMMENT: Add definition! + + + Window begin, Window length ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ These two items define the assimilation window for many applications, including Land DA. -``window begin:`` +``time window:`` + Contains information related to the start, end, and length of the experiment. + +``begin:`` (Default: XXYYYP-XXMP-XXDPTXXHP:00:00Z) Specifies the beginning time window. The format is YYYY-MM-DDTHH:00:00Z, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour. -``window length:`` +``length:`` (Default: PT24H) Specifies the time window length. The form is PTXXH, where XX is a 1- or 2-digit hour. For example: ``PT6H`` Background ^^^^^^^^^^^^^^ The ``background:`` section includes information on the analysis file(s) (also known as "members") generated by the previous cycle. - ``date`` - Specifies the background date. The format is ``&date YYYY-MM-DDTHH:00:00Z``, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour. For example: ``&date 2019-12-21T00:00:00Z`` + ``date:`` (Default: &date XXYYYY-XXMM-XXDDTXXHH:00:00Z) + Specifies the background date. The format is ``&date YYYY-MM-DDTHH:00:00Z``, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour. 
For example: ``&date 2000-01-03T00:00:00Z``
 
-   ``members``
-      Specifies information on analysis file(s) generated by a previous cycle.
+   ``members:``
+      Specifies information on analysis file(s) generated using information from a previous cycle.
 
-      ``datetime``
+      ``datetime:`` (Default: XXYYYY-XXMM-XXDDTXXHH:00:00Z)
         Specifies the date and time. The format is YYYY-MM-DDTHH:00:00Z, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour.
 
-      ``filetype``
+      ``filetype:`` (Default: fms restart)
         Specifies the type of file. Valid values include: ``fms restart``
 
-      ``state variables``
+      ``state variables:`` (Default: [snwdph,vtype,slmsk])
         Specifies a list of state variables. Valid values: ``[snwdph,vtype,slmsk]``
 
-      ``datapath``
-         Specifies the path for state variables data. Valid values: ``mem_pos/`` | ``mem_neg/``. (With default experiment values, the full path will be ``workdir/mem000/jedi/$datapath``.)
+      ``datapath:``
+         Specifies the path for state variable data. Valid values: ``mem_pos/`` | ``mem_neg/``. (With default experiment values, the full path will be ``ptmp/test/tmp/analysis.${PDY}${cyc}.${jobid}``.)
 
-      ``filename_sfcd``
-         Specifies the name of the surface data file. This usually takes the form ``YYYYMMDD.HHmmss.sfc_data.nc``, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour, mm is a valid 2-digit minute and ss is a valid 2-digit second. For example: ``20191221.000000.sfc_data.nc``
+      ``filename_sfcd:`` (Default: XXYYYYXXMMXXDD.XXHH0000.sfc_data.nc)
+         Specifies the name of the surface data file. This usually takes the form ``YYYYMMDD.HHmmss.sfc_data.nc``, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour, mm is a valid 2-digit minute and ss is a valid 2-digit second. For example: ``20000103.000000.sfc_data.nc``
 
-      ``filename_cprl``
-         Specifies the name of file that contains metadata for the restart. This usually takes the form ``YYYYMMDD.HHmmss.coupler.res``, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour, mm is a valid 2-digit minute and ss is a valid 2-digit second. For example: ``20191221.000000.coupler.res``
+      ``filename_cplr:`` (Default: XXYYYYXXMMXXDD.XXHH0000.coupler.res)
+         Specifies the name of the file that contains metadata for the restart. This usually takes the form ``YYYYMMDD.HHmmss.coupler.res``, where YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, and HH is a valid 2-digit hour, mm is a valid 2-digit minute and ss is a valid 2-digit second. For example: ``20000103.000000.coupler.res``
 
 Driver
 ^^^^^^^^^
 
-The ``driver:`` section describes optional modifications to the behavior of the LocalEnsembleDA driver. For details, refer to :ref:`Local Ensemble Data Assimilation in OOPS ` in the JEDI Documentation.
+The ``driver:`` section describes optional modifications to the behavior of the LocalEnsembleDA driver. For details, refer to :jedi:`Local Ensemble Data Assimilation in OOPS ` in the JEDI Documentation.
 
-   ``save posterior mean``
+   ``save posterior mean:`` (Default: false)
      Specifies whether to save the posterior mean. 
Valid values: ``true`` | ``false``
 
      +--------+-----------------+
      | Value  | Description     |
      +========+=================+
      | true   | save            |
      +--------+-----------------+
      | false  | do not save     |
      +--------+-----------------+
 
-   ``save posterior mean increment``
+   ``save posterior mean increment:`` (Default: true)
      Specifies whether to save the posterior mean increment. Valid values: ``true`` | ``false``
 
      +--------+-----------------+
      | Value  | Description     |
      +========+=================+
      | true   | enable          |
      +--------+-----------------+
      | false  | do not enable   |
      +--------+-----------------+
 
-   ``save posterior ensemble``
+   ``save posterior ensemble:`` (Default: false)
      Specifies whether to save the posterior ensemble. Valid values: ``true`` | ``false``
 
      +--------+-----------------+
      | Value  | Description     |
      +========+=================+
      | true   | enable          |
      +--------+-----------------+
      | false  | do not enable   |
      +--------+-----------------+
 
-   ``run as observer only``
+   ``run as observer only:`` (Default: false)
      Specifies whether to run as observer only. Valid values: ``true`` | ``false``
 
      +--------+-----------------+
      | Value  | Description     |
      +========+=================+
      | true   | enable          |
      +--------+-----------------+
      | false  | do not enable   |
      +--------+-----------------+
 
 Local Ensemble DA
 ^^^^^^^^^^^^^^^^^^^
 
 The ``local ensemble DA:`` section configures the local ensemble DA solver package.
 
-   ``solver``
+   ``solver:`` (Default: LETKF)
      Specifies the type of solver. Currently, ``LETKF`` is the only available option. See :cite:t:`HuntEtAl2007` (2007).
 
-   ``inflation``
+   ``inflation:``
      Describes ensemble inflation methods.
 
-      ``rtps``: (Default: ``0.0``)
+      ``rtps:`` (Default: ``0.0``)
         Relaxation to prior spread (:cite:t:`Whitaker&Hamill2012`, 2012).
 
-      ``rtpp``: (Default: ``0.0``)
+      ``rtpp:`` (Default: ``0.0``)
         Relaxation to prior perturbation (:cite:t:`ZhangEtAl2004`, 2004).
 
-      ``mult``: (Default: ``1.0``)
+      ``mult:`` (Default: ``1.0``)
         Parameter of multiplicative inflation.
 
 Output Increment
 ^^^^^^^^^^^^^^^^^^^
 
-``output increment:``
-   ``filetype``
+``output increment:``
+   ``filetype:`` (Default: fms restart)
       Type of file provided for the output increment. Valid values include: ``fms restart``
 
-   ``filename_sfcd``
+   ``filename_sfcd:`` (Default: xainc.sfc_data.nc)
       Name of the file provided for the output increment. For example: ``xainc.sfc_data.nc``
 
 Observations
 ^^^^^^^^^^^^^^^
 
-The ``observations:`` item describes one or more types of observations, each of which is a multi-level YAML/JSON object in and of itself. Each of these observation types is read into JEDI as an ``eckit::Configuration`` object (see :ref:`JEDI Documentation ` for more details).
+The ``observations:`` item describes one or more types of observations, each of which is a multi-level YAML/JSON object in and of itself. Each of these observation types is read into JEDI as an ``eckit::Configuration`` object (see :jedi:`JEDI Documentation ` for more details).
 
 ``obs space:``
 ````````````````
 
-The ``obs space:`` section of the ``.yaml`` comes under the ``observations.observers:`` section and describes the configuration of the observation space. An observation space handles observation data for a single observation type.
+The ``obs space:`` section of the YAML comes under the ``observations.observers:`` section and describes the configuration of the observation space. An observation space handles observation data for a single observation type.
 
-   ``name``
-      Specifies the name of observation space. The Land DA System uses ``Simulate`` for the default case.
+   ``name:`` (Default: SnowDepthGHCN)
+      Specifies the name of the observation space. The Land DA System uses ``SnowDepthGHCN`` for the default case.
 
   ``distribution:``
 
-      ``name``
-         Specifies the name of distribution. Valid values include: ``Halo``
+      ``name:``
+         Specifies the name of the distribution. Valid values include: ``Halo``
 
-      ``halo size``
-         Specifies the size of the halo distribution. 
Format is e-notation. For example: ``250e3`` + ``halo size:`` + Specifies the size of the distribution. Format is e-notation. For example: ``250e3`` - ``simulated variables`` - Specifies the list of variables that need to be simulated by observation operator. Valid values: ``[totalSnowDepth]`` + ``simulated variables:`` + Specifies the list of variables that need to be simulated by the observation operator. Valid values: ``[totalSnowDepth]`` - ``obsdatain`` + ``obsdatain:`` This section specifies information about the observation input data. - ``engine`` + ``engine:`` This section specifies parameters required for the file matching engine. - ``type`` + ``type:`` (Default: H5File) Specifies the type of input observation data. Valid values: ``H5File`` | ``OBS`` - ``obsfile`` + ``obsfile:`` (Default: GHCN_XXYYYYXXMMXXDDXXHH.nc) Specifies the input filename. - ``obsdataout`` + ``obsdataout:`` This section contains information about the observation output data. - ``engine`` + ``engine:`` This section specifies parameters required for the file matching engine. - ``type`` + ``type:`` (Default: H5File) Specifies the type of output observation data. Valid values: ``H5File`` - ``obsfile`` + ``obsfile:`` (Default: output/DA/hofx/letkf_hofx_ghcn_XXYYYYXXMMXXDDXXHH.nc) Specifies the output file path. ``obs operator:`` @@ -384,23 +396,23 @@ The ``obs space:`` section of the YAML comes under the ``observations.observers: The ``obs operator:`` section describes the observation operator and its options. An observation operator is used for computing H(x). - ``name`` - Specifies the name in the ``ObsOperator`` and ``LinearObsOperator`` factory, defined in the C++ code. Valid values include: ``Identity``. See :ref:`JEDI Documentation ` for more options. + ``name:`` (Default: Identity) + Specifies the name in the ``ObsOperator`` and ``LinearObsOperator`` factory, defined in the C++ code. Valid values include: ``Identity``. See :jedi:`JEDI Documentation ` for more options. ``obs error:`` `````````````````` The ``obs error:`` section explains how to calculate the observation error covariance matrix and gives instructions (required for DA applications). The key covariance model, which describes how observation error covariances are created, is frequently the first item in this section. For diagonal observation error covariances, only the diagonal option is currently supported. - ``covariance model`` + ``covariance model:`` Specifies the covariance model. Valid values include: ``diagonal`` ``obs localizations:`` ```````````````````````` ``obs localizations:`` - ``localization method`` - Specifies the observation localization method. Valid values include: ``Horizontal SOAR`` + ``localization method:`` + Specifies the observation localization method. Valid values include: ``Horizontal SOAR`` | ``Vertical Brasnett`` +--------------------+--------------------------------------------------+ | Value | Description | @@ -413,21 +425,21 @@ The ``obs error:`` section explains how to calculate the observation error covar | | and used in the snow DA. | +--------------------+--------------------------------------------------+ - ``lengthscale`` + ``lengthscale:`` Radius of influence (i.e., maximum distance of observations from the location being updated) in meters. Format is e-notation. For example: ``250e3`` - ``soar horizontal decay`` + ``soar horizontal decay:`` Decay scale of SOAR localization function. Recommended value: ``0.000021``. Users may adjust based on need/preference. 
- ``max nobs`` + ``max nobs:`` Maximum number of observations used to update each location. ``obs filters:`` `````````````````` -Observation filters are used to define Quality Control (QC) filters. They have access to observation values and metadata, model values at observation locations, simulated observation value, and their own private data. See :ref:`Observation Filters ` in the JEDI Documentation for more detail. The ``obs filters:`` section contains the following fields: +Observation filters are used to define Quality Control (QC) filters. They have access to observation values and metadata, model values at observation locations, simulated observation value, and their own private data. See :jedi:`Observation Filters ` in the JEDI Documentation for more detail. The ``obs filters:`` section contains the following fields: - ``filter`` + ``filter:`` Describes the parameters of a given QC filter. Valid values include: ``Bounds Check`` | ``Background Check`` | ``Domain Check`` | ``RejectList``. See descriptions in the JEDI's :jedi:`Generic QC Filters ` Documentation for more. +--------------------+--------------------------------------------------+ @@ -454,11 +466,11 @@ Observation filters are used to define Quality Control (QC) filters. They have a | | Check filter. | +--------------------+--------------------------------------------------+ - ``filter variables`` + ``filter variables:`` Limit the action of a QC filter to a subset of variables or to specific channels. - ``name`` - Name of the filter variable. Users may indicate additional filter variables using the ``name`` field on consecutive lines (see code snippet below). Valid values include: ``totalSnowDepth`` + ``name:`` + Name of the filter variable. Users may indicate additional filter variables using the ``name:`` field on consecutive lines (see code snippet below). Valid values include: ``totalSnowDepth`` .. code-block:: yaml @@ -466,28 +478,28 @@ Observation filters are used to define Quality Control (QC) filters. They have a - name: variable_1 - name: variable_2 - ``minvalue`` + ``minvalue:`` Minimum value for variables in the filter. - ``maxvalue`` + ``maxvalue:`` Maximum value for variables in the filter. - ``threshold`` + ``threshold:`` This variable may function differently depending on the filter it is used in. In the :jedi:`Background Check Filter `, an observation is rejected when the difference between the observation value (*y*) and model simulated value (*H(x)*) is larger than the ``threshold`` * *observation error*. - ``action`` - Indicates which action to take once an observation has been flagged by a filter. See :ref:`Filter Actions ` in the JEDI documentation for a full explanation and list of valid values. + ``action:`` + Indicates which action to take once an observation has been flagged by a filter. See :jedi:`Filter Actions ` in the JEDI documentation for a full explanation and list of valid values. - ``name`` + ``name:`` The name of the desired action. Valid values include: ``accept`` | ``reject`` - ``where`` - By default, filters are applied to all filter variables listed. The ``where`` keyword applies a filter only to observations meeting certain conditions. See the :ref:`Where Statement ` section of the JEDI Documentation for a complete description of valid ``where`` conditions. + ``where:`` + By default, filters are applied to all filter variables listed. The ``where`` keyword applies a filter only to observations meeting certain conditions. 
See the :jedi:`Where Statement ` section of the JEDI Documentation for a complete description of valid ``where`` conditions. - ``variable`` + ``variable:`` A list of variables to check using the ``where`` statement. - ``name`` + ``name:`` Name of a variable to check using the ``where`` statement. Multiple variable names may be listed under ``variable``. The conditions in the where statement will be applied to all of them. For example: .. code-block:: yaml @@ -500,10 +512,10 @@ Observation filters are used to define Quality Control (QC) filters. They have a minvalue: 0.5 maxvalue: 1.5 - ``minvalue`` + ``minvalue:`` Minimum value for variables in the ``where`` statement. - ``maxvalue`` + ``maxvalue:`` Maximum value for variables in the ``where`` statement. .. _IODA: @@ -539,6 +551,16 @@ Observation Data Observation data from 2000 and 2019 are provided in NetCDF format for the |latestr| release. Instructions for downloading the data are provided in :numref:`Section %s `, and instructions for accessing the data on :ref:`Level 1 Systems ` are provided in :numref:`Section %s `. Currently, data is taken from the `Global Historical Climatology Network `_ (GHCN), but eventually, data from the U.S. National Ice Center (USNIC) Interactive Multisensor Snow and Ice Mapping System (`IMS `_) will also be available for use. +Users can view file header information and notes for NetCDF formatted files using the instructions in :numref:`Section %s `. For example, on Orion, users can run: + +.. code-block:: console + + # Load modules: + module load netcdf-c/4.9.2 + ncdump -h /work/noaa/epic/UFS_Land-DA_Dev/inputs/DA/snow_depth/GHCN/data_proc/v3/2019/ghcn_snwd_ioda_20191221.nc + +to see the header contents of the 2019-12-21 GHCN snow depth file. Users may need to modify the module load command and the file path to reflect module versions/file paths that are available on their system. + Observation Types -------------------- @@ -551,43 +573,42 @@ Snow depth observations are taken from the `Global Historical Climatology Networ wget https://www1.ncdc.noaa.gov/pub/data/ghcn/daily/by_year/{YYYY}.csv.gz -where ``${YYYY}`` should be replaced with the year of interest. Note that these yearly tarballs contain all measurement types from the daily GHCN output, and thus, snow depth must be manually extracted from this broader data set. +where ``${YYYY}`` is replaced with the year of interest. Note that these yearly tarballs contain all measurement types from the daily GHCN output, and thus, snow depth must be manually extracted from this broader data set. These raw snow depth observations need to be converted into IODA-formatted netCDF files for ingestion into the JEDI LETKF system. However, this process was preemptively handled outside of the Land DA workflow, and the 2019 GHCN IODA files were provided by NOAA PSL (Clara Draper). -The IODA-formatted GHCN files are available in the ``inputs/DA/snow_depth/GHCN/data_proc/v3/`` directory and are structured as follows (using 20191221 as an example): +The IODA-formatted GHCN files are available in the ``inputs/DA/snow_depth/GHCN/data_proc/v3/${YEAR}`` directory and are structured as follows (using 20000103 as an example): .. 
code-block:: console netcdf ghcn_snwd_ioda_20191221 { dimensions: - Location = 9379 ; + Location = UNLIMITED ; // (10466 currently) ; variables: - int Location(Location) ; - Location:suggested_chunk_dim = 9379LL ; + int64 Location(Location) ; + Location:suggested_chunk_dim = 10000LL ; // global attributes: string :_ioda_layout = "ObsGroup" ; :_ioda_layout_version = 0 ; - string :converter = "ghcn_snod2ioda_newV2.py" ; - string :date_time_string = "2019-12-21T18:00:00Z" ; - :nlocs = 9379 ; - :history = "Fri Aug 12 20:27:37 2022: ncrename -O -v altitude,height ./data_proc_test/nc4_ghcn_snwd_ioda_20191221.nc ./data_proc_Update/ghcn_snwd_ioda_20191221.nc" ; - :NCO = "netCDF Operators version 4.9.1 (Homepage = http://nco.sf.net, Code = http://github.com/nco/nco)" ; + string :converter = "ghcn_snod2ioda.py" ; + string :date_time_string = "2000-01-01T18:00:00Z" ; + :nlocs = 10466 ; group: MetaData { variables: int64 dateTime(Location) ; - dateTime:_FillValue = -2208988800LL ; + dateTime:_FillValue = -9223372036854775806LL ; string dateTime:units = "seconds since 1970-01-01T00:00:00Z" ; - float height(Location) ; - height:_FillValue = 9.96921e+36f ; float latitude(Location) ; latitude:_FillValue = 9.96921e+36f ; string latitude:units = "degrees_north" ; float longitude(Location) ; longitude:_FillValue = 9.96921e+36f ; string longitude:units = "degrees_east" ; + float stationElevation(Location) ; + stationElevation:_FillValue = 9.96921e+36f ; + string stationElevation:units = "m" ; string stationIdentification(Location) ; string stationIdentification:_FillValue = "" ; } // group MetaData @@ -616,7 +637,7 @@ The IODA-formatted GHCN files are available in the ``inputs/DA/snow_depth/GHCN/d } // group PreQC } -The primary observation variable is ``totalSnowDepth``, which, along with the metadata fields of ``datetime``, ``latitude``, ``longitude``, and ``height`` is defined along the ``nlocs`` dimension. Also present are ``ObsError`` and ``PreQC`` values corresponding to each ``totalSnowDepth`` measurement on ``nlocs``. These values were attributed during the IODA conversion step (not supported for this release). The magnitude of ``nlocs`` varies between files; this is due to the fact that the number of stations reporting snow depth observations for a given day can vary in the GHCN. +The primary observation variable is ``totalSnowDepth``, which, along with the metadata fields of ``datetime``, ``latitude``, ``longitude``, and ``stationElevation`` is defined along the ``nlocs`` dimension. Also present are ``ObsError`` and ``PreQC`` values corresponding to each ``totalSnowDepth`` measurement on ``nlocs``. These values were attributed during the IODA conversion step (not supported for this release). The magnitude of ``nlocs`` varies between files; this is due to the fact that the number of stations reporting snow depth observations for a given day can vary in the GHCN. Observation Location and Processing -------------------------------------- @@ -624,91 +645,16 @@ Observation Location and Processing GHCN ^^^^^^ -GHCN files for 2000 and 2019 are already provided in IODA format for the |latestr| release. :numref:`Table %s ` indicates where users can find data on NOAA :term:`RDHPCS` platforms. Tar files containing the 2000 and 2019 data are located in the publicly-available `Land DA Data Bucket `_. Once untarred, the snow depth files are located in ``/inputs/DA/snow_depth/GHCN/data_proc/{YEAR}``. The 2019 GHCN IODA files were provided by Clara Draper (NOAA PSL). 
Each file follows the naming convention of ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc``, where ``${YYYY}`` is the four-digit cycle year, ``${MM}`` is the two-digit cycle month, and ``${DD}`` is the two-digit cycle day. - -In each experiment, the ``DA_config`` file sets the name of the experiment configuration file. This configuration file is typically named ``settings_DA_test``. Before assimilation, if "GHCN" was specified as the observation type in the ``DA_config`` file, the ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc`` file corresponding to the specified cycle date is soft-linked to the JEDI working directory (``${JEDIWORKDIR}``) with a naming-convention change (i.e., ``GHCN_${YYYY}${MM}${DD}${HH}.nc``). Here, the GHCN IODA file is appended with the cycle hour, ``${HH}`` which is extracted from the ``${STARTDATE}`` variable defined in the relevant ``DA_config`` file. - -Prior to ingesting the GHCN IODA files via the LETKF at the DA analysis time, the observations are further quality controlled and checked using ``letkf_land.yaml`` (itself a concatenation of ``GHCN.yaml`` and ``letkfoi_snow.yaml``; see the `GitHub yaml files `_ for more detail). The GHCN-specific observation filters, domain checks, and quality control parameters from ``GHCN.yaml`` ensure that only snow depth observations which meet specific criteria are assimilated (the rest are rejected). The contents of ``GHCN.yaml`` are listed below: - -.. code-block:: yaml +GHCN files for 2000 and 2019 are already provided in IODA format for the |latestr| release. :numref:`Table %s ` indicates where users can find data on NOAA :term:`RDHPCS` platforms. Tar files containing the 2000 and 2019 data are located in the publicly-available `Land DA Data Bucket `_. Once untarred, the snow depth files are located in ``/inputs/DA/snow_depth/GHCN/data_proc/${YEAR}``. The 2019 GHCN IODA files were provided by Clara Draper (NOAA PSL). Each file follows the naming convention of ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc``, where ``${YYYY}`` is the four-digit cycle year, ``${MM}`` is the two-digit cycle month, and ``${DD}`` is the two-digit cycle day. - - obs space: - name: SnowDepthGHCN - distribution: - name: Halo - halo size: 250e3 - simulated variables: [totalSnowDepth] - obsdatain: - engine: - type: H5File - obsfile: GHCN_XXYYYYXXMMXXDDXXHH.nc - obsdataout: - engine: - type: H5File - obsfile: output/DA/hofx/letkf_hofx_ghcn_XXYYYYXXMMXXDDXXHH.nc - obs operator: - name: Identity - obs error: - covariance model: diagonal - obs localizations: - - localization method: Horizontal SOAR - lengthscale: 250e3 - soar horizontal decay: 0.000021 - max nobs: 50 - - localization method: Vertical Brasnett - vertical lengthscale: 700 - obs filters: - - filter: Bounds Check # negative / missing snow - filter variables: - - name: totalSnowDepth - minvalue: 0.0 - - filter: Domain Check # missing station elevation (-999.9) - where: - - variable: - name: MetaData/height - minvalue: -999.0 - - filter: Domain Check # land only - where: - - variable: - name: GeoVaLs/slmsk - minvalue: 0.5 - maxvalue: 1.5 - # GFSv17 only. 
-   #- filter: Domain Check # no sea ice
-   #  where:
-   #  - variable:
-   #      name: fraction_of_ice@GeoVaLs
-   #    maxvalue: 0.0
-   - filter: RejectList # no land-ice
-     where:
-     - variable:
-         name: GeoVaLs/vtype
-       minvalue: 14.5
-       maxvalue: 15.5
-   - filter: Background Check # gross error check
-     filter variables:
-     - name: totalSnowDepth
-     threshold: 6.25
-     action:
-       name: reject
-
-Viewing NetCDF Files
-----------------------
-
-Users can view file information and notes for NetCDF files using the instructions in :numref:`Section %s `. For example, on Orion:
-
-.. code-block:: console
-
-   # Load modules:
-   module load intel/2022.1.2 impi/2022.1.2 netcdf/4.7.4
-   ncdump -h /work/noaa/epic/UFS_Land-DA/inputs/DA/snow_depth/GHCN/data_proc/v3/2019/ghcn_snwd_ioda_20191221.nc
+In each experiment, the ``land_analysis_*.yaml`` file sets the type of observation file (e.g., ``OBS_TYPES: "GHCN"``). Before assimilation, if "GHCN" was specified as the observation type, the ``ghcn_snwd_ioda_${YYYY}${MM}${DD}.nc`` file corresponding to the specified cycle date is copied to the run directory (usually ``$LANDDAROOT/ptmp/test/com/landda/$model_ver/landda.$PDY$cyc/obs`` by default --- see :numref:`Section %s ` for more on these variables) with a naming-convention change (i.e., ``GHCN_${YYYY}${MM}${DD}${HH}.nc``).
 
-to see the contents of the 2019-12-21 GHCN file on Hera. Users may need to modify the module load command and the file path to reflect module versions/file paths that are available on their system.
+Prior to ingesting the GHCN IODA files via the LETKF at the DA analysis time, the observations are combined into a single ``letkf_land.yaml`` file, which is a concatenation of ``letkfoi_snow.yaml`` and ``GHCN.yaml`` (see :numref:`Section %s ` for further explanation). The GHCN-specific observation filters, domain checks, and quality control parameters from ``GHCN.yaml`` ensure that only snow depth observations which meet specific criteria are assimilated (the rest are rejected). The contents of ``GHCN.yaml`` are available :github:`on GitHub `.
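+For orientation, a minimal sketch of one such filter entry follows, abridged from the snow depth bounds check in ``GHCN.yaml``; the full file on GitHub remains the authoritative version:
+
+.. code-block:: yaml
+
+   obs filters:
+   - filter: Bounds Check # negative / missing snow
+     filter variables:
+     - name: totalSnowDepth
+     minvalue: 0.0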
 
 Restart Files
 ================
 
-To restart the UFS land driver successfully after land model execution, all parameters, states, and fluxes used for a subsequent time iteration are stored in a restart file. This restart file is named ``ufs_land_restart.{FILEDATE}.nc`` where ``FILEDATE`` is in YYYY-MM-DD_HH-mm-SS format (e.g., ``ufs_land_restart.2019-12-21_00-00-00.nc``). The restart file contains all the model fields and their values at a specific point in time; this information can be used to restart the model immediately to run the next cycle. The Land DA System reads the states from the restart file and replaces them after the DA step with the updated analysis. :numref:`Table %s ` lists the fields in the Land DA restart file. Within the UFS land driver (submodule ``ufs-land-driver-emc-dev``), read/write of the restart file is performed in ``ufsLandNoahMPRestartModule.f90``.
+To restart the UFS land driver successfully after land model execution, all parameters, states, and fluxes used for a subsequent time iteration are stored in a restart file. This restart file is named ``ufs_land_restart.${FILEDATE}.nc`` where ``FILEDATE`` is in YYYY-MM-DD_HH-mm-SS format (e.g., ``ufs_land_restart.2019-12-21_00-00-00.nc``). The restart file contains all the model fields and their values at a specific point in time; this information can be used to restart the model immediately to run the next cycle. The Land DA System reads the states from the restart file and replaces them after the DA step with the updated analysis. :numref:`Table %s ` lists the fields in the Land DA restart file. Within the UFS land driver (submodule ``ufs-land-driver-emc-dev``), read/write of the restart file is performed in ``ufsLandNoahMPRestartModule.f90``.
 
 .. _RestartFiles:
 
diff --git a/doc/source/conf.py b/doc/source/conf.py
index da3736bc..9e8159c0 100644
--- a/doc/source/conf.py
+++ b/doc/source/conf.py
@@ -95,7 +95,7 @@
 # further.
 html_theme_options = {
     "body_max_width": "none",
-    'navigation_depth': 6,
+    'navigation_depth': 8,
 }
 
 # Add any paths that contain custom static files (such as style sheets) here,

From 065c15e0e04bbada720ef3774f83ba38cc6f129c Mon Sep 17 00:00:00 2001
From: gspetro-NOAA
Date: Wed, 17 Jul 2024 15:49:08 -0400
Subject: [PATCH 24/49] misc minor edits

---
 doc/source/BackgroundInfo/TechnicalOverview.rst | 2 +-
 doc/source/BuildingRunningTesting/Container.rst | 4 ++--
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst
index 6271650e..6e2abd2f 100644
--- a/doc/source/BackgroundInfo/TechnicalOverview.rst
+++ b/doc/source/BackgroundInfo/TechnicalOverview.rst
@@ -123,7 +123,7 @@ This :term:`umbrella repository` uses Git submodules and an ``app_build.sh`` fil
      - https://github.com/ufs-community/ccpp-physics/
    * - ufs_model.fd
      - ufs-weather-model
-     - Repository for the UFS Weather Model (WM). This repository contains a number of subrepositories, which are documented :doc:`in the WM User's `.
+     - Repository for the UFS Weather Model (WM). This repository contains a number of subrepositories, which are documented :ufs-wm:`in the WM User's `.
      - https://github.com/ufs-community/ufs-weather-model/
    * - vector2tile_converter.fd
      - land-vector2tile
diff --git a/doc/source/BuildingRunningTesting/Container.rst b/doc/source/BuildingRunningTesting/Container.rst
index ad2332f2..160c2d47 100644
--- a/doc/source/BuildingRunningTesting/Container.rst
+++ b/doc/source/BuildingRunningTesting/Container.rst
@@ -24,7 +24,7 @@ The containerized version of Land DA requires:
 
    * `Installation of Apptainer `__
    * At least 7 CPU cores
-   * An **Intel** compiler and :term:`MPI` (available for free `here `_)
+   * An **Intel** compiler and :term:`MPI` (available for `free here `_)
 
 
 Install Singularity/Apptainer
==============================
 
 .. note::
 
-   As of November 2021, the Linux-supported version of Singularity has been `renamed `_ to *Apptainer*. Apptainer has maintained compatibility with Singularity, so ``singularity`` commands should work with either Singularity or Apptainer (see compatibility details `here `_.)
+   As of November 2021, the Linux-supported version of Singularity has been `renamed `_ to *Apptainer*. Apptainer has maintained compatibility with Singularity, so ``singularity`` commands should work with either Singularity or Apptainer (see `compatibility details here `_.)
 
 To build and run Land DA using a Singularity/Apptainer container, first install the software according to the `Apptainer Installation Guide `_. This will include the installation of all dependencies.
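+A quick way to confirm that the installation succeeded (a sketch; either command name may be present, depending on how the software was installed) is to check the version:
+
+.. code-block:: console
+
+   apptainer --version
+   # or, on systems that still use the Singularity name:
+   singularity --version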
From e783edf373b3eb86ae3ad564dd05a69a59622872 Mon Sep 17 00:00:00 2001
From: gspetro-NOAA
Date: Fri, 19 Jul 2024 13:27:24 -0400
Subject: [PATCH 25/49] update wflow dir structure diagram

---
 .../BuildingRunningTesting/BuildRunLandDA.rst | 28 +++++++++----------
 1 file changed, 14 insertions(+), 14 deletions(-)

diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst
index 3a45316e..444c91c8 100644
--- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst
+++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst
@@ -295,20 +295,20 @@ As the experiment progresses, it will generate a number of directories to hold i
 
    $LANDDAROOT: Base directory
    ├── land-DA_workflow(): Home directory of the land DA workflow
-   ├── ptmp ()
-   │   └── test ()
-   │       └── com
-   │           ├── landda ()
-   │           │   └── vX.Y.Z ()
-   │           │       └── landda.YYYYMMDD (.): Directory containing the output files
-   │           └── output
-   │               └── logs
-   │                   └── run_ (): Directory containing the log file of the Rocoto workflow
-   └── tmp ()
-       ├── (): Working directory
-       └── DATA_SHARE
-           ├── YYYYMMDD (): Directory containing the intermediate or temporary files
-           └── DATA_RESTART: Directory containing the soft-links to the restart files for the next cycles
+   └── ptmp ()
+       └── test ()
+           └── com
+           │   ├── landda ()
+           │   │   └── vX.Y.Z ()
+           │   │       └── landda.YYYYMMDD (.): Directory containing the output files
+           │   └── output
+           │       └── logs
+           │           └── run_ (): Directory containing the log file of the Rocoto workflow
+           └── tmp ()
+               ├── (): Working directory
+               └── DATA_SHARE
+                   ├── YYYYMMDD (): Directory containing the intermediate or temporary files
+                   └── DATA_RESTART: Directory containing the soft-links to the restart files for the next cycles
 
 ```` refers to the type of forcing data used (``gswp3`` or ``era5``). Each variable in parentheses and angle brackets (e.g., ``()``) is the name for the directory defined in the file ``land_analysis.yaml``. In the future, this directory structure will be further modified to meet the :nco:`NCO Implementation Standards<>`.
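+To orient yourself while an experiment is running, a quick listing of the output and log trees can help; this is a sketch that assumes the default directory names shown in the diagram above:
+
+.. code-block:: console
+
+   ls $LANDDAROOT/ptmp/test/com/landda
+   ls $LANDDAROOT/ptmp/test/com/output/logs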
From fcdbc3635288f41f2586cab5393e0bd2aff2a5e6 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Fri, 19 Jul 2024 13:32:52 -0400 Subject: [PATCH 26/49] update wflow dir structure diagram --- doc/source/BuildingRunningTesting/BuildRunLandDA.rst | 11 +++++++---- 1 file changed, 7 insertions(+), 4 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 444c91c8..8f11ed49 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -296,19 +296,22 @@ As the experiment progresses, it will generate a number of directories to hold i $LANDDAROOT: Base directory ├── land-DA_workflow(): Home directory of the land DA workflow └── ptmp () - └── test () - └── com + └── () + └── com () │ ├── landda () │ │ └── vX.Y.Z () │ │ └── landda.YYYYMMDD (.): Directory containing the output files + │ │ ├── hofx + │ │ └── plot │ └── output │ └── logs - │ └── run_ (): Directory containing the log file of the Rocoto workflow + │ └── run_ (): Directory containing the log files for the Rocoto workflow └── tmp () ├── (): Working directory └── DATA_SHARE ├── YYYYMMDD (): Directory containing the intermediate or temporary files - └── DATA_RESTART: Directory containing the soft-links to the restart files for the next cycles + ├── hofx: Directory containing the soft links to the results of the analysis task for plotting + └── DATA_RESTART: Directory containing the soft links to the restart files for the next cycles ```` refers to the type of forcing data used (``gswp3`` or ``era5``). Each variable in parentheses and angle brackets (e.g., ``()``) is the name for the directory defined in the file ``land_analysis.yaml``. In the future, this directory structure will be further modified to meet the :nco:`NCO Implementation Standards<>`. From 9e2e8912b52dfa8dea3c41dde0ecaa4b60f874a4 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 24 Jul 2024 16:52:16 -0400 Subject: [PATCH 27/49] add new chapters to introduction --- doc/source/BackgroundInfo/Introduction.rst | 7 +++++-- doc/source/Reference/FAQ.rst | 22 +++++++++++----------- 2 files changed, 16 insertions(+), 13 deletions(-) diff --git a/doc/source/BackgroundInfo/Introduction.rst b/doc/source/BackgroundInfo/Introduction.rst index 4e09e26f..ebdd5fea 100644 --- a/doc/source/BackgroundInfo/Introduction.rst +++ b/doc/source/BackgroundInfo/Introduction.rst @@ -32,8 +32,8 @@ This User's Guide is organized into four sections: (1) *Background Information*; Background Information ======================== - * This chapter (Introduction) provides background information on the Unified Forecast System (:term:`UFS`) and the NoahMP model. - * :numref:`Chapter %s ` (Technical Overview) outlines prerequisites, user support levels, and directory structure. + * This chapter (Introduction) provides user support information and background information on the Unified Forecast System (:term:`UFS`) and the Noah-MP model. + * :numref:`Chapter %s ` (Technical Overview) outlines prerequisites, supported systems, and directory structure. Building, Running, and Testing the Land DA System =================================================== @@ -45,12 +45,15 @@ Building, Running, and Testing the Land DA System Customizing the Workflow ========================= + * :numref:`Chapter %s: Available Workflow Configuration Parameters ` explains all of the user-configurable options currently available in the workflow configuration file (``land_analysis*.yaml``). 
* :numref:`Chapter %s: Model ` provides information on input data and configuration parameters in the Noah-MP LSM and its Vector-to-Tile Converter. * :numref:`Chapter %s: DA Framework ` provides information on the DA system, required data, and configuration parameters. Reference =========== + * :numref:`Chapter %s: Rocoto ` provides background information on the Rocoto workflow manager as used in Land DA. + * :numref:`Chapter %s: FAQ ` addresses frequently asked questions. * :numref:`Chapter %s: Glossary ` lists important terms. User Support and Documentation diff --git a/doc/source/Reference/FAQ.rst b/doc/source/Reference/FAQ.rst index 6b6d28fa..89434c08 100644 --- a/doc/source/Reference/FAQ.rst +++ b/doc/source/Reference/FAQ.rst @@ -17,23 +17,23 @@ On platforms that utilize Rocoto workflow software (including Hera and Orion), i .. code-block:: console - rocotostat -w land_analysis.xml -d land_analysis.db - CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION - ============================================================================= - 200001030000 prepexp 16779414 SUCCEEDED 0 1 11.0 - 200001030000 prepobs 16779415 SUCCEEDED 0 1 0.0 - 200001030000 prepbmat 16779416 SUCCEEDED 0 1 9.0 - 200001030000 runana 16779434 SUCCEEDED 0 1 68.0 - 200001030000 runfcst - DEAD 256 1 2186.0 + $ rocotostat -w land_analysis.xml -d land_analysis.db + CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION + ======================================================================================= + 200001030000 prep_obs 61746034 SUCCEEDED 0 1 11.0 + 200001030000 pre_anal 61746035 SUCCEEDED 0 1 13.0 + 200001030000 analysis 61746081 SUCCEEDED 0 1 76.0 + 200001030000 post_anal 61746109 SUCCEEDED 0 1 4.0 + 200001030000 plot_stats 61746110 SUCCEEDED 0 1 70.0 + 200001030000 forecast 61746128 DEAD 256 1 - -This means that the dead task has not completed successfully, so the workflow has stopped. Once the issue has been identified and fixed (by referencing the log files), users can re-run the failed task using the ``rocotorewind`` command: -.. COMMENT: Where are the log files actually? +This means that the dead task has not completed successfully, so the workflow has stopped. Once the issue has been identified and fixed (by referencing the log files in ``$LANDDAROOT/ptmp/test/com/output/logs/run_``), users can re-run the failed task using the ``rocotorewind`` command: .. code-block:: console - rocotorewind -w land_analysis.xml -d land_analysis.db -v 10 -c 200001030000 -t runfcst + rocotorewind -w land_analysis.xml -d land_analysis.db -v 10 -c 200001030000 -t forecast where ``-c`` specifies the cycle date (first column of ``rocotostat`` output) and ``-t`` represents the task name (second column of ``rocotostat`` output). After using ``rocotorewind``, the next time ``rocotorun`` is used to From 0fa38328bd64c112708aab441b0696929a6efb7a Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 24 Jul 2024 17:27:09 -0400 Subject: [PATCH 28/49] misc minor Intro edits --- doc/source/BackgroundInfo/Introduction.rst | 7 ++++--- 1 file changed, 4 insertions(+), 3 deletions(-) diff --git a/doc/source/BackgroundInfo/Introduction.rst b/doc/source/BackgroundInfo/Introduction.rst index ebdd5fea..f061d8cd 100644 --- a/doc/source/BackgroundInfo/Introduction.rst +++ b/doc/source/BackgroundInfo/Introduction.rst @@ -52,7 +52,7 @@ Customizing the Workflow Reference =========== - * :numref:`Chapter %s: Rocoto ` provides background information on the Rocoto workflow manager as used in Land DA. 
+ * :numref:`Chapter %s: Rocoto ` provides background information on the Rocoto workflow manager as used in Land DA. * :numref:`Chapter %s: FAQ ` addresses frequently asked questions. * :numref:`Chapter %s: Glossary ` lists important terms. @@ -83,10 +83,11 @@ If users (especially new users) believe they have identified a bug in the system Feature Requests and Enhancements ================================== -Users who want to request a feature enhancement or the addition of a new feature have two options: +Users who want to request a feature enhancement or the addition of a new feature have a few options: #. File a `GitHub Issue `__ and add (or request that a code manager add) the ``EPIC Support Requested`` label. #. Post a request for a feature or enhancement in the `Enhancements `__ category of GitHub Discussions. These feature requests will be forwarded to the Earth Prediction Innovation Center (`EPIC `__) management team for prioritization and eventual addition to the Land DA System. + #. Email the request to support.epic@noaa.gov. .. _Background: @@ -129,7 +130,7 @@ The Noah-MP LSM has evolved through community efforts to pursue and refine a mod Noah-MP has been implemented in the UFS via the :term:`CCPP` physics package and is currently being tested for operational use in GFSv17 and RRFS v2. Additionally, the UFS Weather Model now contains a Noah-MP land component. Noah-MP has -also been used operationally in the NOAA National Water Model (NWM) since 2016. Details about the model's physical parameterizations can be found in :cite:t:`NiuEtAl2011` (2011). +also been used operationally in the NOAA National Water Model (NWM) since 2016. Details about the model's physical parameterizations can be found in :cite:t:`NiuEtAl2011` (2011), and a full description of the model is available in the `Community Noah-MP Land Surface Modeling System Technical Description Version 5.0 `_. Disclaimer ************* From 422fbe70debcf213bccdc0fd7e6db3e21b313af9 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 24 Jul 2024 22:44:56 -0400 Subject: [PATCH 29/49] misc tech overview edits --- doc/source/BackgroundInfo/TechnicalOverview.rst | 10 +++++----- doc/source/Reference/Glossary.rst | 5 +++++ doc/source/conf.py | 2 ++ 3 files changed, 12 insertions(+), 5 deletions(-) diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst index 6e2abd2f..967b04e7 100644 --- a/doc/source/BackgroundInfo/TechnicalOverview.rst +++ b/doc/source/BackgroundInfo/TechnicalOverview.rst @@ -20,7 +20,7 @@ Minimum System Requirements Additionally, users will need: * Disk space: ~23GB (11GB for Land DA System [or 6.5GB for Land DA container], 11GB for Land DA data, and ~1GB for staging and output) - * 6 CPU cores (or option to run with "oversubscribe") + * 7 CPU cores (or option to run with "oversubscribe") Software Prerequisites ======================== @@ -32,7 +32,7 @@ The Land DA System requires: * Python * :term:`NetCDF` * Lmod - * `spack-stack `_ (v1.6.0) + * `spack-stack `_ (|spack-stack-ver|) * `jedi-bundle `_ (|skylabv|) These software prerequisites are pre-installed in the Land DA :term:`container` and on other Level 1 systems (see :ref:`below ` for details). However, users on non-Level 1 systems will need to install them. 
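+On systems that use Lmod, a quick check of whether these prerequisites are already visible is sketched below; module names and versions vary from system to system, so the specific names here are only illustrative:
+
+.. code-block:: console
+
+   module --version
+   module spider cmake
+   module spider netcdf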
@@ -54,7 +54,7 @@ Four levels of support have been defined for :term:`UFS` applications, and the L
 Level 1 Systems
 ==================
 
-Preconfigured (Level 1) systems for Land DA already have the required external libraries available in a central location via :term:`spack-stack` and the ``jedi-bundle`` (|skylabv|). Land DA is expected to build and run out-of-the-box on these systems, and users can download the Land DA code without first installing prerequisite software. With the exception of the Land DA container, users must have access to these Level 1 systems in order to use them. For the most updated information on stack locations, compilers, and MPI, users can check the :land-wflow-repo:`build and run version files ` for their machine of choice. Similarly, users can check the :land-wflow-repo:`build__intel ` file for the machine of their choice.
+Preconfigured (Level 1) systems for Land DA already have the required external libraries available in a central location via :term:`spack-stack` and the :term:`jedi-bundle` (|skylabv|). Land DA is expected to build and run out-of-the-box on these systems, and users can download the Land DA code without first installing prerequisite software. With the exception of the Land DA container, users must have access to these Level 1 systems in order to use them. For the most updated information on stack locations, compilers, and MPI, users can check the :land-wflow-repo:`build and run version files ` for their machine of choice.
 
 .. _stack-compiler-locations:
 
@@ -84,7 +84,7 @@ Preconfigured (Level 1) systems for Land DA already have the required external l
 Level 2-4 Systems
 ===================
 
-On non-Level 1 platforms, the Land DA System can be run within a container that includes the prerequisite software; otherwise, the required libraries will need to be installed as part of the Land DA build process. Once these prerequisite libraries are installed, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems.
+On non-Level 1 platforms, the Land DA System can be run within a container that includes the prerequisite software; otherwise, the required libraries will need to be installed as part of the Land DA build process. Once these prerequisite libraries are installed, Land DA should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems.
 
 .. _repos-dir-structure:
 
@@ -135,7 +135,7 @@ This :term:`umbrella repository` uses Git submodules and an ``app_build.sh`` fil
      - https://github.com/ufs-community/uwtools
 
 .. note::
-   The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS Land DA System repository. The `spack-stack `__ repository assembles these prerequisite libraries. Spack-stack has already been built on `preconfigured (Level 1) platforms `__. However, it must be built on other systems. See the :doc:`spack-stack Documentation ` for details on installing spack-stack.
+   The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS Land DA System repository. The `spack-stack `__ repository assembles these prerequisite libraries. Spack-stack has already been built on :ref:`preconfigured (Level 1) platforms `. However, it must be built on other systems.
See the :spack-stack:`spack-stack Documentation <>` for details on installing spack-stack. .. _file-dir-structure: diff --git a/doc/source/Reference/Glossary.rst b/doc/source/Reference/Glossary.rst index 99ed7909..264090b3 100644 --- a/doc/source/Reference/Glossary.rst +++ b/doc/source/Reference/Glossary.rst @@ -73,6 +73,11 @@ Glossary JEDI is developed and distributed by the `Joint Center for Satellite Data Assimilation `_, a multi-agency research center hosted by the University Corporation for Atmospheric Research (`UCAR `_). JCSDA is dedicated to improving and accelerating the quantitative use of research and operational satellite data in weather, ocean, climate, and environmental analysis and prediction systems. + jedi-bundle + Skylab + `JEDI Skylab `_ is the name for roll-up releases of :term:`JCSDA`'s `jedi-bundle `_ repository. + This software provides an integrated Earth System Data Assimilation capability. JCSDA has tested Skylab capabilities internally via the SkyLab testbed for the following components: atmosphere, land/snow, ocean, sea-ice, aerosols, and atmospheric composition. + LND land component The Noah Multi-Physics (Noah-MP) land surface model (LSM) is an open-source, community-developed LSM that has been incorporated into the UFS Weather Model (WM). It is the UFS WM's land component. diff --git a/doc/source/conf.py b/doc/source/conf.py index 9e8159c0..294e48cc 100644 --- a/doc/source/conf.py +++ b/doc/source/conf.py @@ -51,6 +51,7 @@ .. |tag| replace:: ``ufs-land-da-v1.2.0`` .. |branch| replace:: ``release/public-v1.2.0`` .. |skylabv| replace:: Skylab v7.0 +.. |spack-stack-ver| replace:: v1.6.0 """ # -- Linkcheck options ------------------------------------------------- @@ -128,6 +129,7 @@ def setup(app): 'rtd': ('https://readthedocs.org/projects/land-da-workflow/%s', '%s'), 'land-wflow-repo': ('https://github.com/ufs-community/land-DA_workflow/%s', '%s'), 'land-wflow-wiki': ('https://github.com/ufs-community/land-DA_workflow/wiki/%s','%s'), + 'spack-stack': ('https://spack-stack.readthedocs.io/en/1.6.0/%s', '%s'), 'ufs-wm': ('https://ufs-weather-model.readthedocs.io/en/develop/%s', '%s'), 'uw': ('https://uwtools.readthedocs.io/en/main/%s', '%s'), } From 38e4273d1800be7b44ecd09db1d6b6c407f5aa35 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 24 Jul 2024 22:59:50 -0400 Subject: [PATCH 30/49] misc tech overview updates --- doc/source/BackgroundInfo/TechnicalOverview.rst | 6 ++++-- doc/source/Reference/Glossary.rst | 2 +- 2 files changed, 5 insertions(+), 3 deletions(-) diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst index 967b04e7..1b7dff4e 100644 --- a/doc/source/BackgroundInfo/TechnicalOverview.rst +++ b/doc/source/BackgroundInfo/TechnicalOverview.rst @@ -210,8 +210,10 @@ The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operatio - Scripts launched by the :term:`J-jobs` * - sorc - External source code used to build the Land DA System + * - ush + - Utility scripts * - versions - - Contain run.ver and build.ver, which are files that get automatically sourced in order to track package versions at run and compile time respectively. + - Contains ``build.ver_*`` and ``run.ver_*``, which are files that get automatically sourced in order to track package versions at compile and run time respectively. .. 
_land-component: @@ -226,4 +228,4 @@ Unlike the standalone Noah-MP land driver, the Noah-MP :term:`NUOPC cap` is able Unified Workflow (UW) Tools ============================ -The Unified Workflow (UW) is a set of tools intended to unify the workflow for various UFS applications under one framework. The UW toolkit currently includes rocoto, template, and configuration (config) tools, which are being incorporated into the Land DA workflow. Additional tools are under development. More details about UW tools can be found in the `uwtools ` GitHub repository and in the :uw:`UW Documentation <>`. \ No newline at end of file +The Unified Workflow (UW) is a set of tools intended to unify the workflow for various UFS applications under one framework. The UW toolkit currently includes rocoto, template, and configuration (config) tools, which are being incorporated into the Land DA workflow. Additional tools are under development. More details about UW tools can be found in the `uwtools `_ GitHub repository and in the :uw:`UW Documentation <>`. \ No newline at end of file diff --git a/doc/source/Reference/Glossary.rst b/doc/source/Reference/Glossary.rst index 264090b3..10d82578 100644 --- a/doc/source/Reference/Glossary.rst +++ b/doc/source/Reference/Glossary.rst @@ -130,7 +130,7 @@ Glossary `Spack `_ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures. spack-stack - The `spack-stack `_ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `_ and the `Joint Effort for Data assimilation Integration (JEDI) `_ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack. + The `spack-stack `_ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `_ and the :jedi:`Joint Effort for Data assimilation Integration (JEDI) <>` framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. 
Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack. UFS The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system consisting of several applications (apps). These apps span regional to global domains and sub-hourly to seasonal time scales. The UFS is designed to support the :term:`Weather Enterprise` and to be the source system for NOAA's operational numerical weather prediction applications. For more information, visit https://ufscommunity.org/. From a9597564f2ffc88c0bf584d107be27d96b9a49b0 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 24 Jul 2024 23:04:25 -0400 Subject: [PATCH 31/49] rm uw from hierarchical repo structure? --- doc/source/BackgroundInfo/TechnicalOverview.rst | 6 +----- 1 file changed, 1 insertion(+), 5 deletions(-) diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst index 1b7dff4e..1c36e783 100644 --- a/doc/source/BackgroundInfo/TechnicalOverview.rst +++ b/doc/source/BackgroundInfo/TechnicalOverview.rst @@ -129,13 +129,9 @@ This :term:`umbrella repository` uses Git submodules and an ``app_build.sh`` fil - land-vector2tile - Contains code to map between the vector format used by the Noah-MP offline driver, and the tile format used by the UFS atmospheric model. - https://github.com/NOAA-PSL/land-vector2tile - * - N/A - - uwtools - - Repository for the Unified Workflow (UW) Toolkit. This repository is not a Git submodule, but the build script installs UW tools, if desired, as part of the build. - - https://github.com/ufs-community/uwtools .. note:: - The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS Land DA System repository. The `spack-stack `__ repository assembles these prerequisite libraries. Spack-stack has already been built on :ref:`preconfigured (Level 1) platforms `. However, it must be built on other systems. See the :spack-stack:`spack-stack Documentation <>` for details on installing spack-stack. + The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS Land DA System repository. The `spack-stack `_ repository assembles these prerequisite libraries. Spack-stack has already been built on :ref:`preconfigured (Level 1) platforms `. However, it must be built on other systems. See the :spack-stack:`spack-stack Documentation <>` for details on installing spack-stack. .. _file-dir-structure: From 818bbb6e70ede6920a5c55b0444cfb40000627a1 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Wed, 24 Jul 2024 23:09:55 -0400 Subject: [PATCH 32/49] update table 1.1 --- .../BackgroundInfo/TechnicalOverview.rst | 32 ++++++++++--------- 1 file changed, 17 insertions(+), 15 deletions(-) diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst index 1c36e783..8ca3c399 100644 --- a/doc/source/BackgroundInfo/TechnicalOverview.rst +++ b/doc/source/BackgroundInfo/TechnicalOverview.rst @@ -60,26 +60,28 @@ Preconfigured (Level 1) systems for Land DA already have the required external l .. 
list-table:: *Software Prerequisites & Locations* :header-rows: 1 - :widths: 10 20 70 + :widths: 10 20 20 100 70 * - Platform - - Compiler/MPI - - spack-stack & jedi-bundle Installations + - Compiler + - MPI + - ``spack-stack`` Installation + - ``jedi-bundle`` Installation * - Hera - - - intel/2021.5.0 / - - impi/2021.5.1 - - - /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/unified-env-rocky8/install/modulefiles/Core - - /scratch2/NAGAPE/epic/UFS_Land-DA_Dev/jedi_v7 + - intel/2021.5.0 / + - impi/2021.5.1 + - /scratch1/NCEPDEV/nems/role.epic/spack-stack/spack-stack-1.6.0/envs/unified-env-rocky8/install/modulefiles/Core + - /scratch2/NAGAPE/epic/UFS_Land-DA_Dev/jedi_v7 * - Orion - - - intel/2021.9.0 / - - impi/2021.9.0 - - - /work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/unified-env-rocky9/install/modulefiles/Core - - /work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6 + - intel/2021.9.0 / + - impi/2021.9.0 + - /work/noaa/epic/role-epic/spack-stack/orion/spack-stack-1.6.0/envs/unified-env-rocky9/install/modulefiles/Core + - /work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6 * - Container - - - intel-oneapi-compilers/2021.8.0 / - - intel-oneapi-mpi/2021.8.0 - - - /opt/spack-stack/ (inside the container) - - /opt/jedi-bundle (inside the container) + - intel-oneapi-compilers/2021.8.0 / + - intel-oneapi-mpi/2021.8.0 + - /opt/spack-stack/ (inside the container) + - /opt/jedi-bundle (inside the container) Level 2-4 Systems =================== From 8f7b1e515c41f765a8abb878381f92fbdd821f61 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 25 Jul 2024 15:51:18 -0400 Subject: [PATCH 33/49] update L1 Build/Run ch --- .../BackgroundInfo/TechnicalOverview.rst | 4 +- .../BuildingRunningTesting/BuildRunLandDA.rst | 43 +++++++++++-------- 2 files changed, 28 insertions(+), 19 deletions(-) diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst index 8ca3c399..16a5ddeb 100644 --- a/doc/source/BackgroundInfo/TechnicalOverview.rst +++ b/doc/source/BackgroundInfo/TechnicalOverview.rst @@ -65,8 +65,8 @@ Preconfigured (Level 1) systems for Land DA already have the required external l * - Platform - Compiler - MPI - - ``spack-stack`` Installation - - ``jedi-bundle`` Installation + - *spack-stack* Installation + - *jedi-bundle* Installation * - Hera - intel/2021.5.0 / - impi/2021.5.1 diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 8f11ed49..774f6f9e 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -26,14 +26,14 @@ Create a directory for the Land DA experiment (``$LANDDAROOT``): cd /path/to/landda export LANDDAROOT=`pwd` -where ``/path/to/landda`` is the path to the directory where the user plans to run Land DA experiments. +where ``/path/to/landda`` is the path to the directory where the user plans to run Land DA experiments. In the experiment configuration file, ``$LANDDAROOT`` is referred to as ``$EXP_BASEDIR``, and refers to the Land DA workflow's parent directory. .. _GetCode: Get Code *********** -Clone the Land DA repository. To clone the ``develop`` branch, run: +Clone the Land DA workflow repository. To clone the ``develop`` branch, run: .. code-block:: console @@ -97,7 +97,7 @@ To load the workflow environment, run: module load wflow_ conda activate land_da -where ```` is ``hera`` or ``orion``. +where ```` is ``hera`` or ``orion``. 
This activates the land_da conda environment, and the user typically sees (land_da) in front of the Terminal prompt at this point. .. _configure-expt: @@ -120,7 +120,7 @@ where ```` is ``hera`` or ``orion``. Users will need to configure certain elements of their experiment in ``land_analysis.yaml``: - * ``ACCOUNT:`` A valid account name. Hera, Orion, and most NOAA RDHPCS systems require a valid account name; other systems may not + * ``ACCOUNT:`` A valid account name. Hera, Orion, and most NOAA RDHPCS systems require a valid account name; other systems may not (in which case, any value will do). * ``EXP_BASEDIR:`` The full path to the directory where land-DA_workflow was cloned (i.e., ``$LANDDAROOT``) * ``FORCING:`` Forcing options; ``gswp3`` or ``era5`` * ``cycledef/spec:`` Cycle specification @@ -129,14 +129,14 @@ Users will need to configure certain elements of their experiment in ``land_anal To determine an appropriate ``ACCOUNT`` field for Level 1 systems that use the Slurm job scheduler, run ``saccount_params``. On other systems, running ``groups`` will return a list of projects that the user has permissions for. Not all listed projects/groups have an HPC allocation, but those that do are potentially valid account names. -Users may configure other elements of an experiment in ``land_analysis.yaml`` if desired. The ``land_analysis_*`` files contain reasonable default values for running a Land DA experiment. Users who wish to run a more complex experiment may change the values in these files and the files they reference using information in Sections :numref:`%s ` & :numref:`%s `. +Users may configure other elements of an experiment in ``land_analysis.yaml`` if desired. The ``land_analysis_*.yaml`` files contain reasonable default values for running a Land DA experiment. Users who wish to run a more complex experiment may change the values in these files and the files they reference using information in Sections :numref:`%s ` & :numref:`%s `. .. _GetData: Data ------ -:numref:`Table %s ` shows the locations of pre-staged data on NOAA :term:`RDHPCS` (i.e., Hera and Orion). These data locations are already included in the ``land_analysis_*`` files but are provided here for informational purposes. +:numref:`Table %s ` shows the locations of pre-staged data on NOAA :term:`RDHPCS` (i.e., Hera and Orion). These data locations are already included in the ``land_analysis_*.yaml`` files but are provided here for informational purposes. .. _Level1Data: @@ -150,14 +150,14 @@ Data | Orion | /work/noaa/epic/UFS_Land-DA_Dev/inputs | +-----------+--------------------------------------------------+ -Users who have difficulty accessing the data on Hera or Orion may download it according to the instructions in :numref:`Section %s `. Its subdirectories are soft-linked to the ``fix`` directory of the land-DA workflow by the build script ``sorc/app_build.sh``. +Users who have difficulty accessing the data on Hera or Orion may download it according to the instructions in :numref:`Section %s `. Its subdirectories are soft-linked to the ``fix`` directory of ``land-DA_workflow`` by the build script ``sorc/app_build.sh``. .. _generate-wflow: Generate the Rocoto XML File ============================== -Generate the workflow with ``uwtools`` by running: +Generate the workflow XML file with ``uwtools`` by running: .. 
code-block:: console @@ -170,6 +170,8 @@ If the command runs without problems, ``uwtools`` will output a "0 errors found" [2024-03-01T20:36:03] INFO 0 UW schema-validation errors found [2024-03-01T20:36:03] INFO 0 Rocoto validation errors found +The generated workflow XML file (``land_analysis.xml``) will be used by the Rocoto workflow manager to determine which tasks (or "jobs") to submit to the batch system and when to submit them (e.g., as soon as task dependencies are satisfied). + Run the Experiment ******************** @@ -221,11 +223,14 @@ To automate task submission, users must be on a system where :term:`cron` is ava .. code-block:: console cd parm - conda deactivate # optional ./launch_rocoto_wflow.sh add To check the status of the experiment, see :numref:`Section %s ` on tracking experiment progress. +.. note:: + + If users run into issues with the launch script, they can run ``conda deactivate`` before running the launch script. + Manual Submission ------------------- @@ -235,7 +240,7 @@ To run the experiment, issue a ``rocotorun`` command from the ``parm`` directory rocotorun -w land_analysis.xml -d land_analysis.db -Users will need to issue the ``rocotorun`` command multiple times. The tasks must be run in order, and ``rocotorun`` initiates the next task once its dependencies have completed successfully. Note that the status table printed by ``rocotostat`` only updates after each ``rocotorun`` command. Details on checking experiment status are provided in the :ref:`next section `. +Users will need to issue the ``rocotorun`` command multiple times. The tasks must be run in order, and ``rocotorun`` initiates the next task once its dependencies have completed successfully. Details on checking experiment status are provided in the :ref:`next section `. .. _VerifySuccess: @@ -260,7 +265,7 @@ If ``rocotorun`` was successful, the ``rocotostat`` command will print a status 200001030000 post_anal - - - - - 200001030000 plot_stats - - - - - 200001030000 forecast - - - - - - ================================================================================================================================ + ========================================================================================================= 200001040000 prep_obs druby://10.184.3.62:41973 SUBMITTING - 1 0.0 200001040000 pre_anal - - - - - 200001040000 analysis - - - - - @@ -277,7 +282,7 @@ The experiment has successfully completed when all tasks say SUCCEEDED under STA Run Without Rocoto -------------------- -Users may choose not to run the workflow with uwtools and Rocoto for a non-cycled run. To run the :term:`J-jobs` scripts in the ``jobs`` directory, navigate to the ``parm`` directory and edit ``run_without_rocoto.sh`` (e.g., using vim or preferred command line editor). Users will likely need to change the ``MACHINE``, ``ACCOUNT``, and ``EXP_BASEDIR`` variables to match their system. Then, run ``run_without_rocoto.sh``: +Users may choose *not* to run the workflow with *uwtools* and Rocoto for a non-cycled run. To run the :term:`J-job ` scripts in the ``jobs`` directory, navigate to the ``parm`` directory and edit ``run_without_rocoto.sh`` (e.g., using vim or preferred command line editor). Users will likely need to change the ``MACHINE``, ``ACCOUNT``, and ``EXP_BASEDIR`` variables to match their system. Then, run the script: .. 
code-block:: console @@ -287,7 +292,7 @@ Users may choose not to run the workflow with uwtools and Rocoto for a non-cycle Check Experiment Output ========================= -As the experiment progresses, it will generate a number of directories to hold intermediate and output files. The directory structure for those files and directories appears below: +As the experiment progresses, it will generate a number of directories to hold intermediate and output files. The structure of those files and directories appears below: .. _land-da-dir-structure: @@ -296,7 +301,7 @@ As the experiment progresses, it will generate a number of directories to hold i $LANDDAROOT: Base directory ├── land-DA_workflow(): Home directory of the land DA workflow └── ptmp () - └── () + └── test ( or ) └── com () │ ├── landda () │ │ └── vX.Y.Z () @@ -319,14 +324,18 @@ Check for the output files for each cycle in the experiment directory: .. code-block:: console - ls -l $LANDDAROOT/ptmp/test/com/landda/v1.2.1/landda.YYYYMMDD + ls -l $LANDDAROOT/ptmp/test/com/landda//landda.YYYYMMDD -where ``YYYYMMDD`` is the cycle date. The experiment should generate several restart files. +where ``YYYYMMDD`` is the cycle date, and ```` is the model version (currently ``v1.2.1`` in the ``develop`` branch). The experiment should generate several restart files. Plotting Results ----------------- -Additionally, in the ``plot`` subdirectory, users will find images depicting the results of the ``analysis`` task for each cycle as a scatter plot (``hofx_oma_YYYMMDD_scatter.png``) and as a histogram (``hofx_oma_YYYYMMDD_histogram.png``). The scatter plot is named OBS-ANA (i.e., Observation Minus Analysis [OMA]), and it depicts a map of snow depth results. Blue points indicate locations where the observed values are less than the analysis values, and red points indicate locations where the observed values are greater than the analysis values. The title lists the mean and standard deviation of the absolute value of the OMA values. The histogram plots OMA values on the x-axis and frequency density values on the y-axis. The title of the histogram lists the mean and standard deviation of the real value of the OMA values. +Additionally, in the ``plot`` subdirectory, users will find images depicting the results of the ``analysis`` task for each cycle as a scatter plot (``hofx_oma_YYYMMDD_scatter.png``) and as a histogram (``hofx_oma_YYYYMMDD_histogram.png``). + +The scatter plot is named OBS-ANA (i.e., Observation Minus Analysis [OMA]), and it depicts a map of snow depth results. Blue points indicate locations where the observed values are less than the analysis values, and red points indicate locations where the observed values are greater than the analysis values. The title lists the mean and standard deviation of the absolute value of the OMA values. + +The histogram plots OMA values on the x-axis and frequency density values on the y-axis. The title of the histogram lists the mean and standard deviation of the real value of the OMA values. .. 
|logo1| image:: https://raw.githubusercontent.com/wiki/ufs-community/land-DA_workflow/images/LandDAScatterPlot.png :alt: Map of snow depth in millimeters (observation minus analysis) From bc19c487673d423b4c604586d241555eafd35949 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 25 Jul 2024 16:07:03 -0400 Subject: [PATCH 34/49] L1 Build/Run ch minor/misc updates --- doc/source/BuildingRunningTesting/BuildRunLandDA.rst | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 774f6f9e..456a7419 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -26,7 +26,7 @@ Create a directory for the Land DA experiment (``$LANDDAROOT``): cd /path/to/landda export LANDDAROOT=`pwd` -where ``/path/to/landda`` is the path to the directory where the user plans to run Land DA experiments. In the experiment configuration file, ``$LANDDAROOT`` is referred to as ``$EXP_BASEDIR``, and refers to the Land DA workflow's parent directory. +where ``/path/to/landda`` is the path to the directory where the user plans to run Land DA experiments. In the experiment configuration file, ``$LANDDAROOT`` is referred to as ``$EXP_BASEDIR``. .. _GetCode: @@ -97,7 +97,7 @@ To load the workflow environment, run: module load wflow_ conda activate land_da -where ```` is ``hera`` or ``orion``. This activates the land_da conda environment, and the user typically sees (land_da) in front of the Terminal prompt at this point. +where ```` is ``hera`` or ``orion``. This activates the ``land_da`` conda environment, and the user typically sees ``(land_da)`` in front of the Terminal prompt at this point. .. _configure-expt: @@ -129,7 +129,7 @@ Users will need to configure certain elements of their experiment in ``land_anal To determine an appropriate ``ACCOUNT`` field for Level 1 systems that use the Slurm job scheduler, run ``saccount_params``. On other systems, running ``groups`` will return a list of projects that the user has permissions for. Not all listed projects/groups have an HPC allocation, but those that do are potentially valid account names. -Users may configure other elements of an experiment in ``land_analysis.yaml`` if desired. The ``land_analysis_*.yaml`` files contain reasonable default values for running a Land DA experiment. Users who wish to run a more complex experiment may change the values in these files and the files they reference using information in Sections :numref:`%s ` & :numref:`%s `. +Users may configure other elements of an experiment in ``land_analysis.yaml`` if desired. The ``land_analysis_*.yaml`` files contain reasonable default values for running a Land DA experiment. Users who wish to run a more complex experiment may change the values in these files and the files they reference using information in Sections :numref:`%s `, :numref:`%s `, and :numref:`%s `. .. _GetData: @@ -170,7 +170,7 @@ If the command runs without problems, ``uwtools`` will output a "0 errors found" [2024-03-01T20:36:03] INFO 0 UW schema-validation errors found [2024-03-01T20:36:03] INFO 0 Rocoto validation errors found -The generated workflow XML file (``land_analysis.xml``) will be used by the Rocoto workflow manager to determine which tasks (or "jobs") to submit to the batch system and when to submit them (e.g., as soon as task dependencies are satisfied). 
+The generated workflow XML file (``land_analysis.xml``) will be used by the Rocoto workflow manager to determine which tasks (or "jobs") to submit to the batch system and when to submit them (e.g., when task dependencies are satisfied). Run the Experiment ******************** @@ -282,7 +282,7 @@ The experiment has successfully completed when all tasks say SUCCEEDED under STA Run Without Rocoto -------------------- -Users may choose *not* to run the workflow with *uwtools* and Rocoto for a non-cycled run. To run the :term:`J-job ` scripts in the ``jobs`` directory, navigate to the ``parm`` directory and edit ``run_without_rocoto.sh`` (e.g., using vim or preferred command line editor). Users will likely need to change the ``MACHINE``, ``ACCOUNT``, and ``EXP_BASEDIR`` variables to match their system. Then, run the script: +Users may choose to run the workflow *without* ``uwtools`` and Rocoto for a non-cycled run. To run the :term:`J-job ` scripts in the ``jobs`` directory, navigate to the ``parm`` directory and edit ``run_without_rocoto.sh`` (e.g., using vim or preferred command line editor). Users will likely need to change the ``MACHINE``, ``ACCOUNT``, and ``EXP_BASEDIR`` variables to match their system. Then, run the script: .. code-block:: console From fa4342bfebe7632f359d21d4ffe507403d1b52f7 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 25 Jul 2024 19:01:01 -0400 Subject: [PATCH 35/49] update container chapter --- .../BuildingRunningTesting/Container.rst | 113 ++++++++++++++++-- .../BuildingRunningTesting/TestingLandDA.rst | 6 +- 2 files changed, 105 insertions(+), 14 deletions(-) diff --git a/doc/source/BuildingRunningTesting/Container.rst b/doc/source/BuildingRunningTesting/Container.rst index 160c2d47..a7428073 100644 --- a/doc/source/BuildingRunningTesting/Container.rst +++ b/doc/source/BuildingRunningTesting/Container.rst @@ -4,9 +4,9 @@ Containerized Land DA Workflow ********************************** -These instructions will help users build and run a basic case for the Unified Forecast System (:term:`UFS`) Land Data Assimilation (DA) System using a `Singularity/Apptainer `_ container. The Land DA :term:`container` packages together the Land DA System with its dependencies (e.g., :term:`spack-stack`, :term:`JEDI`) and provides a uniform environment in which to build and run the Land DA System. Normally, the details of building and running Earth systems models will vary based on the computing platform because there are many possible combinations of operating systems, compilers, :term:`MPIs `, and package versions available. Installation via Singularity/Apptainer container reduces this variability and allows for a smoother experience building and running Land DA. This approach is recommended for users not running Land DA on a supported :ref:`Level 1 ` system (i.e., Hera, Orion). +These instructions will help users build and run a basic case for the Unified Forecast System (:term:`UFS`) Land Data Assimilation (DA) System using a `Singularity/Apptainer `_ container. The Land DA :term:`container` packages together the Land DA System with its dependencies (e.g., :term:`spack-stack`, :term:`JEDI`) and provides a uniform environment in which to build and run the Land DA System. Normally, the details of building and running Earth systems models will vary based on the computing platform because there are many possible combinations of operating systems, compilers, :term:`MPIs `, and package versions available. 
Installation via Singularity/Apptainer container reduces this variability and allows for a smoother experience building and running Land DA. This approach is recommended for users not running Land DA on a supported :ref:`Level 1 ` system (i.e., Hera, Orion). -This chapter provides instructions for building and running basic Land DA cases for the Unified Forecast System (:term:`UFS`) Land DA System. Users can choose between two options: +This chapter provides instructions for building and running basic Land DA cases in a container. Users can choose between two options: * A Jan. 3-4, 2000 00z sample case using :term:`GSWP3` data with the UFS Noah-MP land component * A Dec. 21-22, 2019 00z sample case using :term:`ERA5` data with the UFS Land Driver @@ -22,8 +22,8 @@ Prerequisites The containerized version of Land DA requires: - * `Installation of Apptainer `__ - * At least 7 CPU cores + * `Installation of Apptainer `_ + * At least 6 CPU cores * An **Intel** compiler and :term:`MPI` (available for `free here `_) @@ -219,7 +219,7 @@ Users may convert a container ``.img`` file to a writable sandbox. This step is singularity build --sandbox ubuntu20.04-intel-landda-release-public-v1.2.0 $img -When making a writable sandbox on NOAA RDHPCS systems, the following warnings commonly appear and can be ignored: +When making a writable sandbox on NOAA :term:`RDHPCS`, the following warnings commonly appear and can be ignored: .. code-block:: console @@ -240,7 +240,7 @@ There should now be a ``land-DA_workflow`` directory in the ``$LANDDAROOT`` dire singularity exec -B /:/ $img cp -r /opt/land-DA_workflow . -where ```` and ```` are replaced with a top-level directory on the local system and in the container, respectively. Additional directories can be bound by adding another ``-B /:/`` argument before the container location (``$img``). +where ```` and ```` are replaced with a top-level directory on the local system and in the container, respectively. Additional directories can be bound by adding another ``-B /:/`` argument before the container location (``$img``). Note that if previous steps included a ``sudo`` command, ``sudo`` may be required in front of this command. .. attention:: @@ -288,11 +288,13 @@ The remaining Level 1 systems that do not have Intel MPI available will need to | Hercules | module load intel-oneapi-compilers/2022.2.1 intel-oneapi-mpi/2021.7.1 | +-----------------+-------------------------------------------------------------------------+ -For Derecho and Gaea, an additional script is needed to help set up the land-DA workflow scripts so that the container can run there. +For Derecho and Gaea, an additional script is needed to help set up the ``land-DA_workflow`` scripts so that the container can run there. .. code-block:: console - ./setup_container.sh -p= + ./setup_container.sh -p= + +where ```` is ``derecho`` or ``gaea``. .. _ConfigureExptC: @@ -323,7 +325,7 @@ The Land DA System uses a script-based workflow that is launched using the ``do_ .. attention:: - Note that the GSWP3 option will only run as-is on Hera and Orion. Users on other systems may need to make significant changes to configuration files, which is not a supported option for the |latestr| release. It is recommended that users on these systems use the UFS land driver ERA5 sample experiment set in ``settings_DA_cycle_era5``. + Note that the GSWP3 option will only run as-is on Hera and Orion. 
Users on other systems may need to make significant changes to configuration files, which is not a supported option for the |latestr| release. It is recommended that users on other systems use the UFS land driver ERA5 sample experiment set in ``settings_DA_cycle_era5``. First, update the ``$BASELINE`` environment variable in the selected ``settings_DA_*`` file to say ``singularity.internal`` instead of ``hera.internal``: @@ -344,7 +346,49 @@ To start the experiment, run: ./do_submit_cycle.sh settings_DA_cycle_era5 -The ``do_submit_cycle.sh`` script will read the ``settings_DA_cycle_*`` file and the ``release.environment`` file, which contain sensible experiment default values to simplify the process of running the workflow for the first time. Advanced users will wish to modify the parameters in ``do_submit_cycle.sh`` to fit their particular needs. After reading the defaults and other variables from the settings files, ``do_submit_cycle.sh`` creates a working directory (named ``workdir`` by default) and an output directory called ``landda_expts`` in the parent directory of ``land-DA_workflow`` and then submits a job (``submit_cycle.sh``) to the queue that will run through the workflow. If all succeeds, users will see ``log`` and ``err`` files created in ``land-DA_workflow`` along with a ``cycle.log`` file, which will show where the cycle has ended. The ``landda_expts`` directory will also be populated with data in the following directories: +The ``do_submit_cycle.sh`` script will read the ``settings_DA_cycle_*`` file and the ``release.environment`` file, which contain sensible experiment default values to simplify the process of running the workflow for the first time. Advanced users will wish to modify the parameters in ``do_submit_cycle.sh`` to fit their particular needs. After reading the defaults and other variables from the settings files, ``do_submit_cycle.sh`` creates a working directory (named ``workdir`` by default) and an output directory called ``landda_expts`` in the parent directory of ``land-DA_workflow`` and then submits a job (``submit_cycle.sh``) to the queue that will run through the workflow. If all succeeds, users will see ``log`` and ``err`` files created in ``land-DA_workflow`` along with a ``cycle.log`` file, which will show where the cycle has ended. + + + +Check Progress +---------------- + +To check on the experiment status, users on a system with a Slurm job scheduler may run: + +.. code-block:: console + + squeue -u $USER + +To view progress, users can open the ``log*`` and ``err*`` files once they have been generated: + +.. code-block:: console + + tail -f log* err* + +Users will need to type ``Ctrl+C`` to exit the files. For examples of what the log and error files should look like in a successful experiment, reference :ref:`ERA5 Experiment Logs ` or :ref:`GSWP3 Experiment Logs ` below. + +.. attention:: + + If the log file contains a NetCDF error (e.g., ``ModuleNotFoundError: No module named 'netCDF4'``), run: + + .. code-block:: console + + python -m pip install netCDF4 + + Then, resubmit the job (``sbatch submit_cycle.sh``). + +Next, check for the background and analysis files in the test directory. + +.. code-block:: console + + ls -l ../landda_expts/DA__test/mem000/restarts/`` + +where: + + * ```` is either ``era5`` or ``gswp3``, and + * ```` is either ``vector`` or ``tile`` depending on whether ERA5 or GSWP3 forcing data was used, respectively. + +The experiment should populate the ``landda_expts`` directory with data in the following locations: .. 
code-block:: console @@ -354,4 +398,51 @@ The ``do_submit_cycle.sh`` script will read the ``settings_DA_cycle_*`` file and Depending on the experiment, either the ``vector`` or the ``tile`` directory will have data, but not both. -Users can check experiment progress/success according to the instructions in :numref:`Section %s `, which apply to both containerized and non-containerized versions of the Land DA System. + +.. _era5-log-output: + +ERA5 Experiment Logs +===================== + +For the ERA5 experiment, the ``log*`` file for a successful experiment will a message like: + +.. code-block:: console + + Creating: .//ufs_land_restart.2019-12-22_00-00-00.nc + Searching for forcing at time: 2019-12-22 01:00:00 + +The ``err*`` file for a successful experiment will end with something similar to: + +.. code-block:: console + + + THISDATE=2019122200 + + date_count=1 + + '[' 1 -lt 1 ']' + + '[' 2019122200 -lt 2019122200 ']' + +.. _gswp3-log-output: + +GSWP3 Experiment Logs +======================= + +For the GSWP3 experiment, the ``log*`` file for a successful experiment will end with a list of resource statistics. For example: + +.. code-block:: console + + Number of times filesystem performed OUTPUT = 250544 + Number of Voluntary Context Switches = 3252 + Number of InVoluntary Context Switches = 183 + *****************END OF RESOURCE STATISTICS************************* + +The ``err*`` file for a successful experiment will end with something similar to: + +.. code-block:: console + + + echo 'do_landDA: calling apply snow increment' + + [[ '' =~ hera\.internal ]] + + /apps/intel-2022.1.2/intel-2022.1.2/mpi/2021.5.1/bin/mpiexec -n 6 /path/to/land-DA_workflow/build/bin/apply_incr.exe /path/to/landda_expts/DA_GSWP3_test/DA/logs//apply_incr.log + + [[ 0 != 0 ]] + + '[' YES == YES ']' + + '[' YES == YES ']' + + cp /path/to/workdir/mem000/jedi/20000103.000000.xainc.sfc_data.tile1.nc /path/to/workdir/mem000/jedi/20000103.000000.xainc.sfc_data.tile2.nc /path/to/workdir/mem000/jedi/20000103.000000.xainc.sfc_data.tile3.nc /path/to/workdir/mem000/jedi/20000103.000000.xainc.sfc_data.tile4.nc /path/to/workdir/mem000/jedi/20000103.000000.xainc.sfc_data.tile5.nc /path/to/workdir/mem000/jedi/20000103.000000.xainc.sfc_data.tile6.nc /path/to/landda_expts/DA_GSWP3_test/DA/jedi_incr/ + + [[ YES == \N\O ]] diff --git a/doc/source/BuildingRunningTesting/TestingLandDA.rst b/doc/source/BuildingRunningTesting/TestingLandDA.rst index eda3b143..d1df8fc0 100644 --- a/doc/source/BuildingRunningTesting/TestingLandDA.rst +++ b/doc/source/BuildingRunningTesting/TestingLandDA.rst @@ -53,7 +53,7 @@ If the tests are successful, a message will be printed to the console. For examp Tests ******* -The ERA5 CTests test the operability of six major elements of the Land DA System: ``vector2tile``, ``create_ens``, ``letkfoi_snowda``, ``apply_jediincr``, ``tile2vector``, and ``ufs_datm_land``. The tests and their dependencies are listed in the ``land-DA_workflow/test/CMakeLists.txt`` file. Currently, the CTests are only run on Hera and Orion; they cannot yet be run via container. +The CTests test the operability of six major elements of the Land DA System: ``vector2tile``, ``create_ens``, ``letkfoi_snowda``, ``apply_jediincr``, ``tile2vector``, and ``ufs_datm_land``. The tests and their dependencies are listed in the ``land-DA_workflow/test/CMakeLists.txt`` file. Currently, the CTests are only run on Hera and Orion; they cannot yet be run via container. .. 
list-table:: *Land DA CTests* :widths: 20 50 @@ -62,11 +62,11 @@ The ERA5 CTests test the operability of six major elements of the Land DA System * - Test - Description * - ``test_vector2tile`` - - Tests the vector-to-tile function for use in JEDI + - Tests the vector-to-tile function for use in JEDI. * - ``test_create_ens`` - Tests creation of a pseudo-ensemble for use in LETKF-OI. * - ``test_letkfoi_snowda`` - - Tests the use of LETKF-OI to assimilate snow DA. + - Tests the use of LETKF-OI to assimilate snow data. * - ``test_apply_jediincr`` - Tests the ability to add a JEDI increment. * - ``test_tile2vector`` From 054482661d1a88e0ec74ccc939711d35ef405350 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 25 Jul 2024 19:10:44 -0400 Subject: [PATCH 36/49] misc minor edits --- doc/source/BuildingRunningTesting/Container.rst | 10 ++++++---- 1 file changed, 6 insertions(+), 4 deletions(-) diff --git a/doc/source/BuildingRunningTesting/Container.rst b/doc/source/BuildingRunningTesting/Container.rst index a7428073..03a0b48a 100644 --- a/doc/source/BuildingRunningTesting/Container.rst +++ b/doc/source/BuildingRunningTesting/Container.rst @@ -348,7 +348,7 @@ To start the experiment, run: The ``do_submit_cycle.sh`` script will read the ``settings_DA_cycle_*`` file and the ``release.environment`` file, which contain sensible experiment default values to simplify the process of running the workflow for the first time. Advanced users will wish to modify the parameters in ``do_submit_cycle.sh`` to fit their particular needs. After reading the defaults and other variables from the settings files, ``do_submit_cycle.sh`` creates a working directory (named ``workdir`` by default) and an output directory called ``landda_expts`` in the parent directory of ``land-DA_workflow`` and then submits a job (``submit_cycle.sh``) to the queue that will run through the workflow. If all succeeds, users will see ``log`` and ``err`` files created in ``land-DA_workflow`` along with a ``cycle.log`` file, which will show where the cycle has ended. - +.. _CheckProgress: Check Progress ---------------- @@ -381,19 +381,21 @@ Next, check for the background and analysis files in the test directory. .. code-block:: console - ls -l ../landda_expts/DA__test/mem000/restarts/`` + ls -l ../landda_expts/DA__test/mem000/restarts/`` where: * ```` is either ``era5`` or ``gswp3``, and - * ```` is either ``vector`` or ``tile`` depending on whether ERA5 or GSWP3 forcing data was used, respectively. + * ```` is either ``vector`` or ``tile`` depending on whether ERA5 or GSWP3 forcing data were used, respectively. The experiment should populate the ``landda_expts`` directory with data in the following locations: .. code-block:: console landda_expts/DA_GHCN_test/DA/ + # AND landda_expts/DA_GHCN_test/mem000/restarts/vector/ + # OR landda_expts/DA_GHCN_test/mem000/restarts/tile/ Depending on the experiment, either the ``vector`` or the ``tile`` directory will have data, but not both. @@ -404,7 +406,7 @@ Depending on the experiment, either the ``vector`` or the ``tile`` directory wil ERA5 Experiment Logs ===================== -For the ERA5 experiment, the ``log*`` file for a successful experiment will a message like: +For the ERA5 experiment, the ``log*`` file for a successful experiment will contain a message like: .. 
code-block:: console From f885141249083cc348ea89fe8c74f5ae737c47f9 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Mon, 29 Jul 2024 10:22:36 -0400 Subject: [PATCH 37/49] ConfigWorkflow updates --- .../CustomizingTheWorkflow/ConfigWorkflow.rst | 42 +++++++++---------- 1 file changed, 20 insertions(+), 22 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst index 48ceafc7..42aaee2b 100644 --- a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst +++ b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst @@ -25,23 +25,23 @@ Attributes pertaining to the overall workflow are defined in the ``attrs:`` sect taskthrottle: 24 ``realtime:`` (Default: false) - Indicates whether it is a realtime (true) or retrospective run (false). Valid values: ``true`` | ``false`` + Indicates whether it is a realtime run (true) or a retrospective run (false). Valid values: ``true`` | ``false`` ``scheduler:`` (Default: slurm) The job scheduler to use on the specified machine. Valid values: ``"slurm"``. Other options may work with a container but have not been tested: ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"`` ``cyclethrottle:`` (Default: 24) - The number of cycles that can be active at one time. Valid values: Integers >= 0. + The number of cycles that can be active at one time. Valid values: Integers > 0. ``taskthrottle:`` (Default: 24) - The number of tasks that can be active at one time. Valid values: Integers >= 0. + The number of tasks that can be active at one time. Valid values: Integers > 0. .. _wf-cycledef: Workflow Cycle Definition (``cycledef``) ========================================== -Cycling information is defined in the ``cycledef:`` section under ``workflow:``. Each cycle definition starts with a ``-`` and has information on cycle attributes (``attrs:``) and a cycle specification (``spec:``). For example: +Cycling information is defined in the ``cycledef:`` section under ``workflow:``. Each cycle definition starts with a hyphen (``-``) and has information on cycle attributes (``attrs:``) and a cycle specification (``spec:``). For example: .. code-block:: console @@ -52,7 +52,7 @@ Cycling information is defined in the ``cycledef:`` section under ``workflow:``. spec: 201912210000 201912220000 24:00:00 ``attrs:`` - Attributes of ``cycledef``. Includes ``group:`` but may also include ``activation_offset:``. + Attributes of ``cycledef``. Includes ``group:`` but may also include ``activation_offset:``. See the :rocoto:`Rocoto Documentation <>` for more information. ``group:`` The group attribute allows users to assign a set of cycles to a particular group. The group tag can later be used to control which tasks are run for which cycles. See the :rocoto:`Rocoto Documentation <>` for more information. @@ -66,7 +66,7 @@ Cycling information is defined in the ``cycledef:`` section under ``workflow:``. Workflow Entities =================== -Entities are constants that can be referred to throughout the workflow using the ``&`` prefix and ``;`` suffix (e.g., ``&MACHINE;``) to avoid defining the same constants repetitively in each workflow task. For example, in ``land_analysis_orion.yaml``, the following entities are defined: +Entities are constants that can be referred to throughout the workflow using the ampersand (``&``) prefix and semicolon (``;``) suffix (e.g., ``&MACHINE;``) to avoid defining the same constants repetitively in each workflow task. 
For example, in ``land_analysis_orion.yaml``, the following entities are defined: .. code-block:: console @@ -111,24 +111,22 @@ Entities are constants that can be referred to throughout the workflow using the .. note:: - When two defaults are listed, one is the default on Hera, and one is the default on Orion, depending on ``land_analysis_.yaml`` file used. The default on Hera is listed first, followed by the default on Orion. + When two defaults are listed, one is the default on Hera, and one is the default on Orion, depending on the ``land_analysis_.yaml`` file used. The default on Hera is listed first, followed by the default on Orion. ``MACHINE:`` (Default: "hera" or "orion") - The machine (a.k.a. platform or system) on which the workflow will run. Currently supported platforms are listed in :numref:`Section %s `. Valid values: ``"hera"`` | ``"orion"`` | ``"singularity"`` - -.. COMMENT: Check Singularity or NOAA Cloud or anything? + The machine (a.k.a. platform or system) on which the workflow will run. Currently supported platforms are listed in :numref:`Section %s `. Valid values: ``"hera"`` | ``"orion"`` ``SCHED:`` (Default: "slurm") The job scheduler to use (e.g., Slurm) on the specified ``MACHINE``. Valid values: ``"slurm"``. Other options may work with a container but have not been tested: ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"`` ``ACCOUNT:`` (Default: "epic") - The account under which users submit jobs to the queue on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field on a system with a Slurm job scheduler, users may run the ``saccount_params`` command to display account details. On other systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. + An account where users can charge their compute resources on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field on a system with a Slurm job scheduler, users may run the ``saccount_params`` command to display account details. On other systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names. ``EXP_NAME:`` (Default: "LETKF") Placeholder --- currently not used in workflow. ``EXP_BASEDIR:`` (Default: "/scratch2/NAGAPE/epic/{USER}/landda_test" or "/work/noaa/epic/{USER}/landda_test") - The full path to the directory that ``land-DA_workflow`` was cloned into (i.e., ``$LANDDAROOT`` in the documentation). + The full path to the parent directory of ``land-DA_workflow`` (i.e., ``$LANDDAROOT`` in the documentation). ``JEDI_INSTALL:`` (Default: "/scratch2/NAGAPE/epic/UFS_Land-DA_Dev/jedi_v7" or "/work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6") The path to the JEDI |skylabv| installation. @@ -143,7 +141,7 @@ Entities are constants that can be referred to throughout the workflow using the Resolution of FV3 grid. Currently, only C96 resolution is supported. ``FCSTHR:`` (Default: "24") - Specifies the length of each forecast in hours. Valid values: Integers >= 0. + Specifies the length of each forecast in hours. Valid values: Integers > 0. ``NPROCS_ANALYSIS:`` (Default: "6") Number of processors for the analysis task. 
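As a rough illustration of how these constants behave once the workflow is rendered, an entity defined in the YAML is emitted as an XML entity declaration and can then be referenced anywhere in the generated ``land_analysis.xml``. A simplified sketch (the exact markup produced by ``uwtools`` may differ) might look like:

.. code-block:: xml

   <!-- Declared once (inside the XML DOCTYPE block), generated from the YAML "entities" section -->
   <!ENTITY MACHINE "orion">

   <!-- Referenced later by any task via the ampersand-prefix/semicolon-suffix syntax -->
   <envar>
     <name>MACHINE</name>
     <value>&MACHINE;</value>
   </envar>

Defining the value once and referencing it as ``&MACHINE;`` keeps every task consistent if the value ever changes.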
@@ -186,16 +184,16 @@ Entities are constants that can be referred to throughout the workflow using the
 NCO Directory Structure Entities
 ----------------------------------

-Standard environment variables are defined in the NCEP Central Operations :nco:`WCOSS Implementation Standards ` document. These variables are used in forming the path to various directories containing input, output, and workflow files. For a visual aid, see the :ref:`Land DA Directory Structure Diagram `. The variables are defined in the WCOSS Implementation Standards document (pp. 4-5) as follows:
+Standard environment variables are defined in the NCEP Central Operations :nco:`WCOSS Implementation Standards ` document (pp. 4-5). These variables are used in forming the path to various directories containing input, output, and workflow files. For a visual aid, see the :ref:`Land DA Directory Structure Diagram `.

 ``HOMElandda:`` (Default: "&EXP_BASEDIR;/land-DA_workflow")
 The location of the :github:`land-DA_workflow <>` clone.

 ``PTMP:`` (Default: "&EXP_BASEDIR;/ptmp")
-    User-defined path to the ``com``-type directories.
+    Product temporary (PTMP) experiment output space. This directory is used to mimic the operational file structure and contains all of the files and subdirectories used by or generated by the experiment. By default, it is a sibling to the ``land-DA_workflow`` directory.

 ``envir:`` (Default: "test")
-    The run environment. Set to “test” during the initial testing phase, “para” when running in parallel (on a schedule), and “prod” in production.
+    The run environment. Set to “test” during the initial testing phase, “para” when running in parallel (on a schedule), and “prod” in production. In operations, this level of the directory structure corresponds to the operations root directory (aka ``$OPSROOT``).

 ``COMROOT:`` (Default: "&PTMP;/&envir;/com")
 ``com`` root directory, which contains input/output data on the current system.

 ``NET:`` (Default: "landda")
 Model name (first level of ``com`` directory structure).

 ``model_ver:`` (Default: "v1.2.1")
-    Version number of package in three digits (second level of ``com`` directory)
+    Version number of package in three digits (e.g., v#.#.#); second level of the ``com`` directory structure.

 ``RUN:`` (Default: "landda")
-    Name of model run (third level of com directory structure). In general, same as ${NET}.
+    Name of model run (third level of ``com`` directory structure). In general, same as ``${NET}``.

 ``DATAROOT:`` (Default: "&PTMP;/&envir;/tmp")
 Directory location for the temporary working directories for running jobs. By default, this is a sibling to the ``$COMROOT`` directory and is located at ``ptmp/test/tmp``.

 ``LOGDIR:`` (Default: "&COMROOT;/output/logs/run_&FORCING;")
 Path to the directory containing log files for each workflow task.

 ``LOGFN_SUFFIX:`` (Default: "_@Y@m@d@H.log")
-    The cycle suffix appended to each task's log file. It will be rendered in the form ``_YYYYMMDDHH.log``. For example, the ``prep_obs`` task log would become: ``prep_obs_2000010400.log``.
+    The cycle suffix appended to each task's log file. It will be rendered in the form ``_YYYYMMDDHH.log``. For example, the ``prep_obs`` task log file for the Jan. 4, 2000 00z cycle would be named: ``prep_obs_2000010400.log``.

 ``PATHRT:`` (Default: "&EXP_BASEDIR;")
 The path to the ``EXP_BASEDIR`` for regression tests (RTs).
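As a worked example of how these variables compose, using the defaults above (``envir: test``, ``NET: landda``) with ``EXP_BASEDIR`` set to a hypothetical ``/path/to/landda`` and GSWP3 forcing, the ``prep_obs`` log for the Jan. 4, 2000 00z cycle would land at a path like:

.. code-block:: console

   /path/to/landda/ptmp/test/com/output/logs/run_gswp3/prep_obs_2000010400.log

Here, ``&PTMP;`` expands to ``/path/to/landda/ptmp``, ``&COMROOT;`` expands to ``/path/to/landda/ptmp/test/com``, and ``&LOGDIR;`` expands to the ``run_gswp3`` log directory beneath it.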
@@ -235,7 +233,7 @@ Standard environment variables are defined in the NCEP Central Operations :nco:` Workflow Log ============== -Information related to workflow progress is defined in the ``log:`` section under ``workflow:``: +Information related to overall workflow progress is defined in the ``log:`` section under ``workflow:``: .. code-block:: console @@ -387,7 +385,7 @@ The authoritative :rocoto:`Rocoto documentation <>` discusses a number of miscel join: "&LOGDIR;/analysis&LOGFN_SUFFIX;" ``ACCOUNT:`` (Default: "&ACCOUNT;") - The account under which users submit jobs to the queue on the specified ``MACHINE``. This value is typically the same for each task, so the default reuses the value set in the :ref:`Workflow Entities ` section. + An account where users can charge their compute resources on the specified ``MACHINE``. This value is typically the same for each task, so the default is to reuse the value set in the :ref:`Workflow Entities ` section. ``command:`` (Default: ``'&HOMElandda;/parm/task_load_modules_run_jjob.sh "analysis" "&HOMElandda;" "&MACHINE;"'``) The command that Rocoto will submit to the batch system to carry out the task's work. @@ -560,7 +558,7 @@ Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section o Analysis Task (``task_analysis``) ----------------------------------- -Parameters for the analysis task are set in the ``task_analysis:`` section of the ``land_analysis_.yaml`` file. Most are the same as the defaults set in the :ref:`Workflow Entities ` section. The ``task_analysis:`` task is explained fully in the :ref:`Sample Task ` section, although the default values may differ. +Parameters for the analysis task are set in the ``task_analysis:`` section of the ``land_analysis_.yaml`` file. Most are the same as the defaults set in the :ref:`Workflow Entities ` section. The ``task_analysis:`` task is explained fully in the :ref:`Sample Task ` section. .. _post-analysis: From b3f19c0dfedd8dd934c228967efef2ad7f78a28a Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Mon, 29 Jul 2024 17:03:41 -0400 Subject: [PATCH 38/49] minor updates to reference chs --- doc/source/CustomizingTheWorkflow/DASystem.rst | 6 +++--- doc/source/CustomizingTheWorkflow/Model.rst | 6 +++--- doc/source/Reference/Glossary.rst | 2 +- doc/source/Reference/Rocoto.rst | 8 ++++---- 4 files changed, 11 insertions(+), 11 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst index 2359a143..c14eb3c9 100644 --- a/doc/source/CustomizingTheWorkflow/DASystem.rst +++ b/doc/source/CustomizingTheWorkflow/DASystem.rst @@ -1,8 +1,8 @@ .. _DASystem: -*************************************************** -Land Data Assimilation System -*************************************************** +****************************************** +Input/Output Files & the JEDI DA System +****************************************** This chapter describes the configuration of the offline Land :term:`Data Assimilation` (DA) System, which utilizes the UFS Noah-MP component together with the ``jedi-bundle`` (|skylabv|) to enable cycled model forecasts. The data assimilation framework applies the Local Ensemble Transform Kalman Filter-Optimal Interpolation (LETKF-OI) algorithm to combine the state-dependent background error derived from an ensemble forecast with the observations and their corresponding uncertainties to produce an analysis ensemble (:cite:t:`HuntEtAl2007`, 2007). 
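Schematically, filters in this family produce the analysis by updating the background state with observation information. A generic form of the update (shown here for orientation; it is not the exact LETKF-OI formulation) is:

.. math::

   \mathbf{x}^a = \mathbf{x}^b + \mathbf{K}\left[\mathbf{y} - H(\mathbf{x}^b)\right], \qquad
   \mathbf{K} = \mathbf{B}\mathbf{H}^T\left(\mathbf{H}\mathbf{B}\mathbf{H}^T + \mathbf{R}\right)^{-1}

where :math:`\mathbf{x}^b` is the background state, :math:`\mathbf{y}` is the observation vector, :math:`H` is the observation operator (with linearization :math:`\mathbf{H}`), :math:`\mathbf{B}` is the ensemble-derived background error covariance, and :math:`\mathbf{R}` is the observation error covariance.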
diff --git a/doc/source/CustomizingTheWorkflow/Model.rst b/doc/source/CustomizingTheWorkflow/Model.rst index 310db279..da685e23 100644 --- a/doc/source/CustomizingTheWorkflow/Model.rst +++ b/doc/source/CustomizingTheWorkflow/Model.rst @@ -1,8 +1,8 @@ .. _Model: -*********************************** -Input/Output Files - Noah-MP Model -*********************************** +**************************************** +Input/Output Files & the Noah-MP Model +**************************************** This chapter provides practical information on input files and parameters for the Noah-MP Land Surface Model (LSM) and its Vector-to-Tile Converter component. For background information on the Noah-MP LSM, see :numref:`Section %s ` of the Introduction. diff --git a/doc/source/Reference/Glossary.rst b/doc/source/Reference/Glossary.rst index 10d82578..bd3c1fcd 100644 --- a/doc/source/Reference/Glossary.rst +++ b/doc/source/Reference/Glossary.rst @@ -66,7 +66,7 @@ Glossary High-Performance Computing. J-jobs - Scripting layer (contained in ``land-DA_workflow/jobs/``) that should be directly called for each workflow component (either on the command line or by the workflow manager) to run a specific task in the workflow. The different scripting layers are described in detail in the :nco:`NCO Implementation Standards document `. + Scripts (contained in ``land-DA_workflow/jobs/``) that should be directly called for each workflow component (either on the command line or by the workflow manager) to run a specific task in the workflow. The different scripting layers are described in detail in the :nco:`NCO Implementation Standards document `. JEDI The Joint Effort for Data assimilation Integration (`JEDI `_) is a unified and versatile data assimilation (DA) system for Earth System Prediction. It aims to enable efficient research and accelerated transition from research to operations by providing a framework that takes into account all components of the Earth system in a consistent manner. The JEDI software package can run on a variety of platforms and for a variety of purposes, and it is designed to readily accommodate new atmospheric and oceanic models and new observation systems. The `JEDI User's Guide `_ contains extensive information on the software. diff --git a/doc/source/Reference/Rocoto.rst b/doc/source/Reference/Rocoto.rst index 8bcb1b43..b5d39763 100644 --- a/doc/source/Reference/Rocoto.rst +++ b/doc/source/Reference/Rocoto.rst @@ -5,7 +5,7 @@ Rocoto Introductory Information ================================== The tasks in the Land DA System are typically run using the Rocoto Workflow Manager (see :numref:`Table %s ` for default tasks). Rocoto is a Ruby program that communicates with the batch system on an :term:`HPC` system to run and manage dependencies between the tasks. Rocoto submits jobs to the HPC batch system as the task dependencies allow and runs one instance of the workflow for a set of user-defined :term:`cycles `. More information about Rocoto can be found on the `Rocoto Wiki `_. -The Land DA workflow is defined in a Jinja-enabled Rocoto XML template called ``land_analysis.xml``, which is generated using the contents of ``land_analysis.yaml`` as input to the Unified Workflow's :uw:`Rocoto tool `. Both files reside in the ``parm`` directory. The completed XML file contains the workflow task names, parameters needed by the job scheduler, and task interdependencies. 
+The Land DA workflow is defined in a Jinja-enabled Rocoto XML template called ``land_analysis.xml``, which is generated using the contents of ``land_analysis.yaml`` as input to the Unified Workflow's :uw:`Rocoto tool `. Both files reside in the ``land-DA_workflow/parm`` directory. The completed XML file contains the workflow task names, parameters needed by the job scheduler, and task interdependencies.

 There are a number of Rocoto commands available to run and monitor the workflow; users can find more information in the complete `Rocoto documentation `_. Descriptions and examples of commonly used commands are discussed below.

@@ -22,8 +22,8 @@ The ``rocotorun`` command is used to run the workflow by submitting tasks to the

 where

-* ``-w`` specifies the name of the workflow definition file. This must be an XML file.
-* ``-d`` specifies the name of the database file that stores the state of the workflow. The database file is a binary file created and used only by Rocoto. It does not need to exist when the command is initially run.
+* ``-w`` specifies the name of the workflow definition file. This must be an XML file (e.g., ``land_analysis.xml``).
+* ``-d`` specifies the name of the database file that stores the state of the workflow (e.g., ``land_analysis.db``). The database file is a binary file created and used only by Rocoto. It does not need to exist when the command is initially run.
 * ``-v`` (optional) specifies the level of verbosity. If no level is specified, a level of 1 is used.

 From the ``parm`` directory, the ``rocotorun`` command for the workflow would be:

@@ -34,7 +34,7 @@ From the ``parm`` directory, the ``rocotorun`` command for the workflow would be

 Users will need to include the absolute or relative path to these files when running the command from another directory.

-It is important to note that the ``rocotorun`` process is iterative; the command must be executed many times before the entire workflow is completed, usually every 1-10 minutes. This command can be placed in the user's :term:`crontab`, and cron will call it with the specified frequency. More information on this command can be found in the `Rocoto documentation `_.
+It is important to note that the ``rocotorun`` process is iterative; the command must be executed many times before the entire workflow is completed, usually every 1-10 minutes. More information on this command can be found in the `Rocoto documentation `_.

 The first time the ``rocotorun`` command is executed for a workflow, the files ``land_analysis.db`` and ``land_analysis_lock.db`` are created. There is usually no need for the user to modify these files. Each time the ``rocotorun`` command is executed, the last known state of the workflow is read from the ``land_analysis.db`` file, the batch system is queried, jobs are submitted for tasks whose dependencies have been satisfied, and the current state of the workflow is saved in ``land_analysis.db``.
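In practice, users often alternate the two commands, advancing the workflow and then inspecting its state, for example:

.. code-block:: console

   rocotorun -w land_analysis.xml -d land_analysis.db
   rocotostat -w land_analysis.xml -d land_analysis.db

Repeating this pair every few minutes until ``rocotostat`` reports that all tasks have succeeded is the typical manual-run pattern.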
If there is a need to relaunch the workflow from scratch, both database files can be deleted, and the workflow can be run by executing the ``rocotorun`` command From 0ebe51b4884b8578288b37b3939928d94b75dfa0 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Mon, 29 Jul 2024 17:17:30 -0400 Subject: [PATCH 39/49] minor misc fixes --- doc/source/BuildingRunningTesting/BuildRunLandDA.rst | 2 ++ doc/source/BuildingRunningTesting/Container.rst | 10 ---------- 2 files changed, 2 insertions(+), 10 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 456a7419..21e4bc73 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -328,6 +328,8 @@ Check for the output files for each cycle in the experiment directory: where ``YYYYMMDD`` is the cycle date, and ```` is the model version (currently ``v1.2.1`` in the ``develop`` branch). The experiment should generate several restart files. +.. _plotting: + Plotting Results ----------------- diff --git a/doc/source/BuildingRunningTesting/Container.rst b/doc/source/BuildingRunningTesting/Container.rst index 03a0b48a..6949e8d0 100644 --- a/doc/source/BuildingRunningTesting/Container.rst +++ b/doc/source/BuildingRunningTesting/Container.rst @@ -95,8 +95,6 @@ NOAA RDHPCS Systems On many NOAA :term:`RDHPCS`, a container named ``ubuntu20.04-intel-landda-release-public-v1.2.0.img`` has already been built, and users may access the container at the locations in :numref:`Table %s `. -.. COMMENT: Is there a develop container now? - .. _PreBuiltContainers: .. table:: Locations of Pre-Built Containers @@ -117,31 +115,23 @@ On many NOAA :term:`RDHPCS`, a container named ``ubuntu20.04-intel-landda-releas | Orion/Hercules | /work/noaa/epic/role-epic/contrib/containers | +-----------------+--------------------------------------------------------+ -.. COMMENT: Check container locations. - Users can simply set an environment variable to point to the container: .. code-block:: console export img=path/to/ubuntu20.04-intel-landda-release-public-v1.2.0.img -.. COMMENT: Check container path! - If users prefer, they may copy the container to their local working directory. For example, on Jet: .. code-block:: console cp /mnt/lfs4/HFIP/hfv3gfs/role.epic/containers/ubuntu20.04-intel-landda-release-public-v1.2.0.img . -.. COMMENT: Check container path! - Other Systems ---------------- On other systems, users can build the Singularity container from a public Docker :term:`container` image or download the ``ubuntu20.04-intel-landda-release-public-v1.2.0.img`` container from the `Land DA Data Bucket `_. Downloading may be faster depending on the download speed on the user's system. However, the container in the data bucket is the ``release/v1.2.0`` container rather than the updated ``develop`` branch container. -.. COMMENT: Check container name! - To download from the data bucket, users can run: .. 
code-block:: console From fa0b8b093794a18faf5fe73d79746670444ee7c9 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Tue, 30 Jul 2024 11:08:06 -0400 Subject: [PATCH 40/49] minor misc updates --- doc/source/CustomizingTheWorkflow/DASystem.rst | 6 +++--- doc/source/CustomizingTheWorkflow/Model.rst | 6 +++--- 2 files changed, 6 insertions(+), 6 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst index c14eb3c9..02c8b237 100644 --- a/doc/source/CustomizingTheWorkflow/DASystem.rst +++ b/doc/source/CustomizingTheWorkflow/DASystem.rst @@ -1,7 +1,7 @@ .. _DASystem: ****************************************** -Input/Output Files & the JEDI DA System +Input/Output Files for the JEDI DA System ****************************************** This chapter describes the configuration of the offline Land :term:`Data Assimilation` (DA) System, which utilizes the UFS Noah-MP component together with the ``jedi-bundle`` (|skylabv|) to enable cycled model forecasts. The data assimilation framework applies the Local Ensemble Transform Kalman Filter-Optimal Interpolation (LETKF-OI) algorithm to combine the state-dependent background error derived from an ensemble forecast with the observations and their corresponding uncertainties to produce an analysis ensemble (:cite:t:`HuntEtAl2007`, 2007). @@ -166,10 +166,10 @@ The ``geometry:`` section is used in JEDI configuration files to specify the mod This section contains two parameters, ``namelist filename`` and ``field table filename``, which are required for :term:`FMS` initialization. ``namelist filename`` (Default: Data/fv3files/fmsmpp.nml) - Specifies the path to the namelist filename. + Specifies the path to the namelist file. ``field table filename`` (Default: Data/fv3files/field_table) - Specifies the path to the field table filename. + Specifies the path to the field table file. ``akbk`` (Default: Data/fv3files/akbk64.nc4) Specifies the path to a file containing the coefficients that define the hybrid sigma-pressure vertical coordinate used in FV3. Files are provided with the repository containing ``ak`` and ``bk`` for some common choices of vertical resolution for GEOS and GFS. diff --git a/doc/source/CustomizingTheWorkflow/Model.rst b/doc/source/CustomizingTheWorkflow/Model.rst index da685e23..b156887d 100644 --- a/doc/source/CustomizingTheWorkflow/Model.rst +++ b/doc/source/CustomizingTheWorkflow/Model.rst @@ -1,8 +1,8 @@ .. _Model: -**************************************** -Input/Output Files & the Noah-MP Model -**************************************** +***************************************** +Input/Output Files for the Noah-MP Model +***************************************** This chapter provides practical information on input files and parameters for the Noah-MP Land Surface Model (LSM) and its Vector-to-Tile Converter component. For background information on the Noah-MP LSM, see :numref:`Section %s ` of the Introduction. 
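Because the model's input and output files are NetCDF files, a quick, read-only way to see what a given file contains is the standard ``ncdump`` utility (the restart file name below is illustrative):

.. code-block:: console

   ncdump -h ufs_land_restart.2000-01-04_00-00-00.tile1.nc

The ``-h`` flag prints only the header (dimensions, variable names, and attributes) without dumping the data values themselves.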
From f56d91c48ce8dda6b4e8df6a4701577dff74a96a Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Mon, 5 Aug 2024 11:19:38 -0400 Subject: [PATCH 41/49] update docs w/changes from PR #129 --- doc/source/BackgroundInfo/Introduction.rst | 3 +- .../BackgroundInfo/TechnicalOverview.rst | 16 --------- .../BuildingRunningTesting/BuildRunLandDA.rst | 3 +- .../BuildingRunningTesting/TestingLandDA.rst | 34 +++++++------------ .../CustomizingTheWorkflow/ConfigWorkflow.rst | 34 +++++-------------- 5 files changed, 25 insertions(+), 65 deletions(-) diff --git a/doc/source/BackgroundInfo/Introduction.rst b/doc/source/BackgroundInfo/Introduction.rst index f061d8cd..6bfe1c90 100644 --- a/doc/source/BackgroundInfo/Introduction.rst +++ b/doc/source/BackgroundInfo/Introduction.rst @@ -19,6 +19,7 @@ Since the |latestr| release, the following capabilities have been added to the L * Extended container support (:land-wflow-repo:`PR #85 `) * Updated directory structure for NCO compliance (:land-wflow-repo:`PR #75 `) * Removed land driver from CTest (:land-wflow-repo:`PR #123 `) +* Removed land-driver and vector2tile (:land-wflow-repo:`PR #129 `) The Land DA System citation is as follows and should be used when presenting results based on research conducted with the Land DA System: @@ -46,7 +47,7 @@ Customizing the Workflow ========================= * :numref:`Chapter %s: Available Workflow Configuration Parameters ` explains all of the user-configurable options currently available in the workflow configuration file (``land_analysis*.yaml``). - * :numref:`Chapter %s: Model ` provides information on input data and configuration parameters in the Noah-MP LSM and its Vector-to-Tile Converter. + * :numref:`Chapter %s: Model ` provides information on input data and configuration parameters in the Noah-MP LSM. * :numref:`Chapter %s: DA Framework ` provides information on the DA system, required data, and configuration parameters. Reference diff --git a/doc/source/BackgroundInfo/TechnicalOverview.rst b/doc/source/BackgroundInfo/TechnicalOverview.rst index 16a5ddeb..76d1c811 100644 --- a/doc/source/BackgroundInfo/TechnicalOverview.rst +++ b/doc/source/BackgroundInfo/TechnicalOverview.rst @@ -115,22 +115,10 @@ This :term:`umbrella repository` uses Git submodules and an ``app_build.sh`` fil - land-apply_jedi_incr - Contains code that applies the JEDI-generated DA increment to UFS ``sfc_data`` restart - https://github.com/NOAA-PSL/land-apply_jedi_incr - * - ufsLand.fd - - ufs-land-driver-emc-dev - - Repository for the UFS Land Driver - - https://github.com/NOAA-EPIC/ufs-land-driver-emc-dev - * - *-- ccpp-physics* - - *-- ccpp-physics* - - Repository for the Common Community Physics Package (CCPP) - - https://github.com/ufs-community/ccpp-physics/ * - ufs_model.fd - ufs-weather-model - Repository for the UFS Weather Model (WM). This repository contains a number of subrepositories, which are documented :ufs-wm:`in the WM User's `. - https://github.com/ufs-community/ufs-weather-model/ - * - vector2tile_converter.fd - - land-vector2tile - - Contains code to map between the vector format used by the Noah-MP offline driver, and the tile format used by the UFS atmospheric model. - - https://github.com/NOAA-PSL/land-vector2tile .. note:: The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS Land DA System repository. The `spack-stack `_ repository assembles these prerequisite libraries. Spack-stack has already been built on :ref:`preconfigured (Level 1) platforms `. 
However, it must be built on other systems. See the :spack-stack:`spack-stack Documentation <>` for details on installing spack-stack. @@ -166,11 +154,7 @@ The ``land-DA_workflow`` is evolving to follow the :term:`NCEP` Central Operatio │ ├── (conda) │ ├── test │ ├── tile2tile_converter.fd - │ ├── ufsLand.fd - │ │ ├── ccpp-physics - │ │ └── driver │ ├── ufs_model.fd - │ ├── vector2tile_converter.fd │ ├── CMakeLists.txt │ └── app_build.sh ├── ush diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 21e4bc73..afd12b56 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -309,8 +309,7 @@ As the experiment progresses, it will generate a number of directories to hold i │ │ ├── hofx │ │ └── plot │ └── output - │ └── logs - │ └── run_ (): Directory containing the log files for the Rocoto workflow + │ └── logs (): Directory containing the log files for the Rocoto workflow └── tmp () ├── (): Working directory └── DATA_SHARE diff --git a/doc/source/BuildingRunningTesting/TestingLandDA.rst b/doc/source/BuildingRunningTesting/TestingLandDA.rst index d1df8fc0..4bb5d849 100644 --- a/doc/source/BuildingRunningTesting/TestingLandDA.rst +++ b/doc/source/BuildingRunningTesting/TestingLandDA.rst @@ -33,27 +33,23 @@ If the tests are successful, a message will be printed to the console. For examp .. code-block:: console Test project /work/noaa/epic/${USER}/landda/land-DA_workflow/sorc/build - Start 1: test_vector2tile - 1/6 Test #1: test_vector2tile ................. Passed 12.01 sec - Start 2: test_create_ens - 2/6 Test #2: test_create_ens .................. Passed 13.91 sec - Start 3: test_letkfoi_snowda - 3/6 Test #3: test_letkfoi_snowda .............. Passed 67.94 sec - Start 4: test_apply_jediincr - 4/6 Test #4: test_apply_jediincr .............. Passed 6.88 sec - Start 5: test_tile2vector - 5/6 Test #5: test_tile2vector ................. Passed 15.36 sec - Start 6: test_ufs_datm_land - 6/6 Test #6: test_ufs_datm_land ............... Passed 98.56 sec - - 100% tests passed, 0 tests failed out of 6 - - Total Test time (real) = 217.06 sec + Start 1: test_create_ens + 1/4 Test #1: test_create_ens .................. Passed 13.91 sec + Start 2: test_letkfoi_snowda + 2/4 Test #2: test_letkfoi_snowda .............. Passed 67.94 sec + Start 3: test_apply_jediincr + 3/4 Test #3: test_apply_jediincr .............. Passed 6.88 sec + Start 4: test_ufs_datm_land + 4/4 Test #4: test_ufs_datm_land ............... Passed 98.56 sec + + 100% tests passed, 0 tests failed out of 4 + + Total Test time (real) = 187.29 sec Tests ******* -The CTests test the operability of six major elements of the Land DA System: ``vector2tile``, ``create_ens``, ``letkfoi_snowda``, ``apply_jediincr``, ``tile2vector``, and ``ufs_datm_land``. The tests and their dependencies are listed in the ``land-DA_workflow/test/CMakeLists.txt`` file. Currently, the CTests are only run on Hera and Orion; they cannot yet be run via container. +The CTests test the operability of four major elements of the Land DA System: ``create_ens``, ``letkfoi_snowda``, ``apply_jediincr``, and ``ufs_datm_land``. The tests and their dependencies are listed in the ``land-DA_workflow/test/CMakeLists.txt`` file. Currently, the CTests are only run on Hera and Orion; they cannot yet be run via container. .. 
list-table:: *Land DA CTests*
   :widths: 20 50
@@ -61,15 +57,11 @@

    * - Test
      - Description
-   * - ``test_vector2tile``
-     - Tests the vector-to-tile function for use in JEDI.
    * - ``test_create_ens``
      - Tests creation of a pseudo-ensemble for use in LETKF-OI.
    * - ``test_letkfoi_snowda``
      - Tests the use of LETKF-OI to assimilate snow data.
    * - ``test_apply_jediincr``
      - Tests the ability to add a JEDI increment.
-   * - ``test_tile2vector``
-     - Tests the tile-to-vector function for use in ``ufs-land-driver``
    * - ``test_ufs_datm_land``
      - Tests proper functioning of the UFS land model (``ufs-datm-lnd``)

diff --git a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst
index 42aaee2b..8ce0284c 100644
--- a/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst
+++ b/doc/source/CustomizingTheWorkflow/ConfigWorkflow.rst
@@ -75,11 +75,10 @@ Entities are constants that can be referred to throughout the workflow using the
       MACHINE: "orion"
       SCHED: "slurm"
       ACCOUNT: "epic"
-      EXP_NAME: "LETKF"
       EXP_BASEDIR: "/work/noaa/epic/{USER}/landda_test"
       JEDI_INSTALL: "/work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6"
       WARMSTART_DIR: "/work/noaa/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART"
-      FORCING: "gswp3"
+      ATMOS_FORC: "gswp3"
       RES: "96"
       FCSTHR: "24"
       NPROCS_ANALYSIS: "6"
@@ -99,9 +98,8 @@ Entities are constants that can be referred to throughout the workflow using the
       COMROOT: "&PTMP;/&envir;/com"
       DATAROOT: "&PTMP;/&envir;/tmp"
       KEEPDATA: "YES"
-      LOGDIR: "&COMROOT;/output/logs/run_&FORCING;"
+      LOGDIR: "&COMROOT;/output/logs"
       LOGFN_SUFFIX: "_@Y@m@d@H.log"
-      PATHRT: "&EXP_BASEDIR;"
       PDY: "@Y@m@d"
       cyc: "@H"
       DATADEP_FILE1: "&WARMSTART_DIR;/ufs_land_restart.@Y-@m-@d_@H-00-00.tile1.nc"
@@ -122,9 +120,6 @@ Entities are constants that can be referred to throughout the workflow using the
 ``ACCOUNT:`` (Default: "epic")
 An account where users can charge their compute resources on the specified ``MACHINE``. To determine an appropriate ``ACCOUNT`` field on a system with a Slurm job scheduler, users may run the ``saccount_params`` command to display account details. On other systems, users may run the ``groups`` command, which will return a list of projects that the user has permissions for. Not all of the listed projects/groups have an HPC allocation, but those that do are potentially valid account names.

-``EXP_NAME:`` (Default: "LETKF")
- Placeholder --- currently not used in workflow.
-
 ``EXP_BASEDIR:`` (Default: "/scratch2/NAGAPE/epic/{USER}/landda_test" or "/work/noaa/epic/{USER}/landda_test")
 The full path to the parent directory of ``land-DA_workflow`` (i.e., ``$LANDDAROOT`` in the documentation).

@@ -134,8 +129,8 @@ Entities are constants that can be referred to throughout the workflow using the
 ``WARMSTART_DIR:`` (Default: "/scratch2/NAGAPE/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART" or "/work/noaa/epic/UFS_Land-DA_Dev/inputs/DATA_RESTART")
 The path to restart files for a warmstart experiment.

-``FORCING:`` (Default: "gswp3")
- Type of atmospheric forcing data used. Valid values: ``"gswp3"`` | ``"era5"``
+``ATMOS_FORC:`` (Default: "gswp3")
+ Type of atmospheric forcing data used. Valid values: ``"gswp3"``

 ``RES:`` (Default: "96")
 Resolution of FV3 grid. Currently, only C96 resolution is supported.
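In practice, only a handful of these entities typically need to be edited for a new experiment. A minimal sketch of the edits a user might make to the ``entities:`` section (all values below are illustrative placeholders) is:

.. code-block:: console

   MACHINE: "hera"
   ACCOUNT: "my_hpc_account"
   EXP_BASEDIR: "/path/to/landda_test"
   JEDI_INSTALL: "/path/to/jedi_v7"
   ATMOS_FORC: "gswp3"

The remaining entities can usually be left at their defaults for a first experiment.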
@@ -213,15 +208,12 @@ Standard environment variables are defined in the NCEP Central Operations :nco:` ``KEEPDATA:`` (Default: "YES") Flag to keep data ("YES") or not ("NO") that is copied to the ``$DATAROOT`` directory during the forecast experiment. -``LOGDIR:`` (Default: "&COMROOT;/output/logs/run_&FORCING;") +``LOGDIR:`` (Default: "&COMROOT;/output/logs") Path to the directory containing log files for each workflow task. ``LOGFN_SUFFIX:`` (Default: "_@Y@m@d@H.log") The cycle suffix appended to each task's log file. It will be rendered in the form ``_YYYYMMDDHH.log``. For example, the ``prep_obs`` task log file for the Jan. 4, 2000 00z cycle would be named: ``prep_obs_2000010400.log``. -``PATHRT:`` (Default: "&EXP_BASEDIR;") - The path to the ``EXP_BASEDIR`` for regression tests (RTs). - ``PDY:`` (Default: "@Y@m@d") Date in YYYYMMDD format. @@ -288,7 +280,6 @@ Parameters for a particular task are set in the ``workflow.tasks.task_:`` SCHED: "&SCHED;" ACCOUNT: "&ACCOUNT;" EXP_NAME: "&EXP_NAME;" - ATMOS_FORC: "&FORCING;" RES: "&RES;" TSTUB: "&TSTUB;" model_ver: "&model_ver;" @@ -334,13 +325,13 @@ The ``attrs:`` section for each task includes the ``cycledefs:`` attribute and t Task Environment Variables (``envars``) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The ``envars:`` section for each task reuses many of the same variables and values defined as ``entities:`` for the overall workflow. These values are needed for each task, but setting them individually is error-prone. Instead, a specific workflow task can reference workflow entities using the ``&VAR;`` syntax. For example, to set the ``ATMOS_FORC:`` value in ``task_analysis:`` to the value of the workflow ``FORCING`` entity, the following statement can be added to the task's ``envars:`` section: +The ``envars:`` section for each task reuses many of the same variables and values defined as ``entities:`` for the overall workflow. These values are needed for each task, but setting them individually is error-prone. Instead, a specific workflow task can reference workflow entities using the ``&VAR;`` syntax. For example, to set the ``ACCOUNT:`` value in ``task_analysis:`` to the value of the workflow ``ACCOUNT:`` entity, the following statement can be added to the task's ``envars:`` section: .. code-block:: console task_analysis: envars: - ATMOS_FORC: "&FORCING;" + ACCOUNT: "&ACCOUNT;" For most workflow tasks, whatever value is set in the ``workflow.entities:`` section should be reused/referenced in other tasks. For example, the ``MACHINE`` variable must be defined for each task, and users cannot switch machines mid-workflow. Therefore, users should set the ``MACHINE`` variable in the ``workflow.entities:`` section and reference that definition in each workflow task.
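A minimal sketch of that pattern, assembled from the default values shown earlier (the choice of ``task_forecast`` here is illustrative; any task's ``envars:`` section can reference the entity the same way):

.. code-block:: console

   workflow:
     entities:
       MACHINE: "orion"
     tasks:
       task_forecast:
         envars:
           MACHINE: "&MACHINE;"   # resolves to "orion"; every task reuses the single definition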
For example: @@ -477,8 +468,7 @@ Parameters for the observation preparation task are set in the ``task_prep_obs:` MACHINE: "&MACHINE;" SCHED: "&SCHED;" ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" - ATMOS_FORC: "&FORCING;" + ATMOS_FORC: "&ATMOS_FORC;" model_ver: "&model_ver;" HOMElandda: "&HOMElandda;" COMROOT: "&COMROOT;" @@ -513,8 +503,6 @@ Parameters for the pre-analysis task are set in the ``task_pre_anal:`` section o MACHINE: "&MACHINE;" SCHED: "&SCHED;" ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" - ATMOS_FORC: "&FORCING;" RES: "&RES;" TSTUB: "&TSTUB;" WARMSTART_DIR: "&WARMSTART_DIR;" @@ -579,8 +567,6 @@ Parameters for the post analysis task are set in the ``task_post_anal:`` section MACHINE: "&MACHINE;" SCHED: "&SCHED;" ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" - ATMOS_FORC: "&FORCING;" RES: "&RES;" TSTUB: "&TSTUB;" model_ver: "&model_ver;" @@ -623,7 +609,6 @@ Parameters for the plotting task are set in the ``task_plot_stats:`` section of MACHINE: "&MACHINE;" SCHED: "&SCHED;" ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" model_ver: "&model_ver;" RUN: "&RUN;" HOMElandda: "&HOMElandda;" @@ -664,8 +649,7 @@ Parameters for the forecast task are set in the ``task_forecast:`` section of th MACHINE: "&MACHINE;" SCHED: "&SCHED;" ACCOUNT: "&ACCOUNT;" - EXP_NAME: "&EXP_NAME;" - ATMOS_FORC: "&FORCING;" + ATMOS_FORC: "&ATMOS_FORC;" RES: "&RES;" WARMSTART_DIR: "&WARMSTART_DIR;" model_ver: "&model_ver;" From a9b0bac5d1c377b878bb90f96e718f836401f65a Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Mon, 5 Aug 2024 11:23:55 -0400 Subject: [PATCH 42/49] update docs w/changes from PR #129 --- doc/source/Reference/FAQ.rst | 2 +- doc/source/Reference/Rocoto.rst | 3 +-- 2 files changed, 2 insertions(+), 3 deletions(-) diff --git a/doc/source/Reference/FAQ.rst b/doc/source/Reference/FAQ.rst index 89434c08..3af435b2 100644 --- a/doc/source/Reference/FAQ.rst +++ b/doc/source/Reference/FAQ.rst @@ -29,7 +29,7 @@ On platforms that utilize Rocoto workflow software (including Hera and Orion), i 200001030000 forecast 61746128 DEAD 256 1 - -This means that the dead task has not completed successfully, so the workflow has stopped. Once the issue has been identified and fixed (by referencing the log files in ``$LANDDAROOT/ptmp/test/com/output/logs/run_<forcing>``), users can re-run the failed task using the ``rocotorewind`` command: +This means that the dead task has not completed successfully, so the workflow has stopped. Once the issue has been identified and fixed (by referencing the log files in ``$LANDDAROOT/ptmp/test/com/output/logs``), users can re-run the failed task using the ``rocotorewind`` command: .. code-block:: console diff --git a/doc/source/Reference/Rocoto.rst b/doc/source/Reference/Rocoto.rst index b5d39763..24357472 100644 --- a/doc/source/Reference/Rocoto.rst +++ b/doc/source/Reference/Rocoto.rst @@ -101,7 +101,7 @@ After issuing the ``rocotorun`` command several times (over the course of severa 200001040000 plot_stats 18347592 SUCCEEDED 0 1 48.0 200001040000 forecast 18347593 RUNNING - 1 0.0 -When the workflow runs to completion, all tasks will be marked as SUCCEEDED. The log file for each task is located in ``$LANDDAROOT/ptmp/test/com/output/logs/run_<forcing>``, where ``<forcing>`` is either ``gswp3`` or ``era5``. If any task fails, the corresponding log file can be checked for error messages. Optional arguments for the ``rocotostat`` command can be found in the `Rocoto documentation `_. +When the workflow runs to completion, all tasks will be marked as SUCCEEDED.
The log file for each task is located in ``$LANDDAROOT/ptmp/test/com/output/logs``. If any task fails, the corresponding log file can be checked for error messages. Optional arguments for the ``rocotostat`` command can be found in the `Rocoto documentation `_. .. _rocotocheck: @@ -150,7 +150,6 @@ Running ``rocotocheck`` will result in output similar to the following: COMROOT ==> /work/noaa/epic/$USER/landda/ptmp/test/com DATAROOT ==> /work/noaa/epic/$USER/landda/ptmp/test/tmp DAtype ==> letkfoi_snow - EXP_NAME ==> LETKF HOMElandda ==> /work/noaa/epic/$USER/landda/land-DA_workflow JEDI_INSTALL ==> /work/noaa/epic/UFS_Land-DA_Dev/jedi_v7_stack1.6 KEEPDATA ==> YES From 5671e7b8ed404592698e3df54cb3aab76acc5f53 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Mon, 5 Aug 2024 13:30:04 -0400 Subject: [PATCH 43/49] update docs w/changes from PR #129 --- .../BuildingRunningTesting/BuildRunLandDA.rst | 17 ++++++++++------- doc/source/CustomizingTheWorkflow/DASystem.rst | 2 +- 2 files changed, 11 insertions(+), 8 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index afd12b56..ee31daa5 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -4,10 +4,7 @@ Land DA Workflow (Hera & Orion) ************************************ -This chapter provides instructions for building and running basic Land DA cases for the Unified Forecast System (:term:`UFS`) Land DA System. Users can choose between two supported options: - - * A Jan. 3-4, 2000 00z sample case using :term:`GSWP3` data with the UFS Noah-MP land component - * A Dec. 21-22, 2019 00z sample case using :term:`ERA5` data with the UFS Land Driver +This chapter provides instructions for building and running a basic Land DA case for the Unified Forecast System (:term:`UFS`) Land DA System: a Jan. 3-4, 2000 00z sample case that uses :term:`GSWP3` data with the UFS Noah-MP land component. .. attention:: @@ -18,15 +15,22 @@ This chapter provides instructions for building and running basic Land DA cases Create a Working Directory ***************************** -Create a directory for the Land DA experiment (``$LANDDAROOT``): +Create a base directory for the Land DA experiment and navigate into it: .. code-block:: console mkdir /path/to/landda cd /path/to/landda + +where ``/path/to/landda`` is the path to the directory where the user plans to run Land DA experiments. In the experiment configuration file, this directory is referred to as ``$EXP_BASEDIR``. + +Optionally, users can save this directory path in an environment variable (e.g., ``$LANDDAROOT``) to avoid typing out full path names later. + +.. code-block:: console + export LANDDAROOT=`pwd` -where ``/path/to/landda`` is the path to the directory where the user plans to run Land DA experiments. In the experiment configuration file, ``$LANDDAROOT`` is referred to as ``$EXP_BASEDIR``. +In this documentation, ``$LANDDAROOT`` is used, but users are welcome to choose another name for this variable if they prefer. .. _GetCode: @@ -122,7 +126,6 @@ Users will need to configure certain elements of their experiment in ``land_anal * ``ACCOUNT:`` A valid account name. Hera, Orion, and most NOAA RDHPCS systems require a valid account name; other systems may not (in which case, any value will do).
* ``EXP_BASEDIR:`` The full path to the directory where land-DA_workflow was cloned (i.e., ``$LANDDAROOT``) - * ``FORCING:`` Forcing options; ``gswp3`` or ``era5`` * ``cycledef/spec:`` Cycle specification .. note:: diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst index 02c8b237..dd4d6a85 100644 --- a/doc/source/CustomizingTheWorkflow/DASystem.rst +++ b/doc/source/CustomizingTheWorkflow/DASystem.rst @@ -654,7 +654,7 @@ Prior to ingesting the GHCN IODA files via the LETKF at the DA analysis time, th Restart Files ================ -To restart the UFS land driver successfully after land model execution, all parameters, states, and fluxes used for a subsequent time iteration are stored in a restart file. This restart file is named ``ufs_land_restart.${FILEDATE}.nc`` where ``FILEDATE`` is in YYYY-MM-DD_HH-mm-SS format (e.g., ``ufs_land_restart.2019-12-21_00-00-00.nc``). The restart file contains all the model fields and their values at a specific point in time; this information can be used to restart the model immediately to run the next cycle. The Land DA System reads the states from the restart file and replaces them after the DA step with the updated analysis. :numref:`Table %s ` lists the fields in the Land DA restart file. Within the UFS land driver (submodule ``ufs-land-driver-emc-dev``), read/write of the restart file is performed in ``ufsLandNoahMPRestartModule.f90``. +To restart the Land DA System successfully after land model execution, all parameters, states, and fluxes used for a subsequent time iteration are stored in a restart file. This restart file is named ``ufs_land_restart.${FILEDATE}.tile#.nc`` where ``FILEDATE`` is in YYYY-MM-DD_HH-mm-SS format and ``#`` is 1-6 (e.g., ``ufs_land_restart.2000-01-05_00-00-00.tile1.nc``). The restart file contains all the model fields and their values at a specific point in time; this information can be used to restart the model immediately to run the next cycle. The Land DA System reads the states from the restart file and replaces them after the DA step with the updated analysis. :numref:`Table %s ` lists the fields in the Land DA restart file. .. _RestartFiles: From ac1804e49f368a7301dcc493eb7736f7169b0615 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Mon, 5 Aug 2024 15:02:03 -0400 Subject: [PATCH 44/49] rm mention of 2019 case --- doc/source/BuildingRunningTesting/BuildRunLandDA.rst | 5 ----- 1 file changed, 5 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index ee31daa5..cd02195c 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -108,11 +108,6 @@ where ```` is ``hera`` or ``orion``. This activates the ``land_da`` co Modify the Workflow Configuration YAML ======================================== -The ``develop`` branch includes two default experiments: - - * A Jan. 3, 2000 00z sample case using the UFS Noah-MP land component. - * A Dec. 21, 2019 00z sample case using the UFS Land Driver. - Copy the experiment settings into ``land_analysis.yaml``: .. 
code-block:: console From 6c5d384489eee6cb728487e2789349242978005c Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 8 Aug 2024 11:27:59 -0400 Subject: [PATCH 45/49] remove ERA5 model input file section --- doc/source/CustomizingTheWorkflow/Model.rst | 649 -------------------- 1 file changed, 649 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/Model.rst b/doc/source/CustomizingTheWorkflow/Model.rst index b156887d..851daabc 100644 --- a/doc/source/CustomizingTheWorkflow/Model.rst +++ b/doc/source/CustomizingTheWorkflow/Model.rst @@ -35,10 +35,6 @@ For data specific to the latest release (|latestr|), users can run: These files and their parameters are described in the following subsections. -.. note:: - - * Users who wish to use the UFS land component with :term:`GSWP3` data can proceed to the :numref:`Section %s `. - * Users who wish to run the land driver implementation of Land DA with :term:`ERA5` data should proceed to :numref:`Section %s `. .. _view-netcdf-files: @@ -171,651 +167,6 @@ The ``C96_grid.tile*.nc`` files contain grid information for tiles 1-6 at C96 gr ``grid_spec.nc`` and ``C96.mosaic.nc`` are the same file under different names and may be used interchangeably. -.. _land-driver-input-files: - -Input Files for the Land Driver Configuration with ERA5 Data -============================================================== - -There are several important files used to specify model parameters in the land driver-based offline Land DA System: -the static file (``ufs-land_C96_static_fields.nc``), -the initial conditions file (``ufs-land_C96_init_*.nc``), -and the model configuration file (``ufs-land.namelist.noahmp``). -These files and their parameters are described in the following subsections. -They are publicly available via the `Land DA Data Bucket `_. - -Static File (``ufs-land_C96_static_fields.nc``) -------------------------------------------------- - -The static file includes specific information on location, time, soil layers, and fixed (invariant) experiment parameters that are required for Noah-MP to run. The data must be provided in :term:`netCDF` format. - -The static file is available in the ``inputs`` data directory (downloaded :ref:`above `) at the following path: - -.. code-block:: - - inputs/static/ufs-land_C96_static_fields.nc - -.. 
table:: *Configuration variables specified in the static file* (ufs-land_C96_static_fields.nc) - - +---------------------------+------------------------------------------+ - | Configuration Variables | Description | - +===========================+==========================================+ - | land_mask | land-sea mask (0-ocean, 1-land) | - +---------------------------+------------------------------------------+ - | vegetation_category | vegetation type | - +---------------------------+------------------------------------------+ - | soil_category | soil type | - +---------------------------+------------------------------------------+ - | slope_category | slope type | - +---------------------------+------------------------------------------+ - | albedo_monthly | monthly albedo | - +---------------------------+------------------------------------------+ - | lai_monthly (leaf area | monthly leaf area index | - | index_monthly) | | - +---------------------------+------------------------------------------+ - | emissivity | emissivity | - +---------------------------+------------------------------------------+ - | z0_monthly | monthly ground roughness length | - +---------------------------+------------------------------------------+ - | cube_tile | FV3 tile where the grid is located | - +---------------------------+------------------------------------------+ - | cube_i | i-location in the FV3 tile where the | - | | grid is located | - +---------------------------+------------------------------------------+ - | cube_j | j-location in the FV3 tile where the | - | | grid is located | - +---------------------------+------------------------------------------+ - | latitude | latitude | - +---------------------------+------------------------------------------+ - | longitude | longitude | - +---------------------------+------------------------------------------+ - | elevation | elevation | - +---------------------------+------------------------------------------+ - | deep_soil_temperature | lower boundary soil temperature | - +---------------------------+------------------------------------------+ - | max_snow_albedo | maximum snow albedo | - +---------------------------+------------------------------------------+ - | gvf_monthly | monthly green vegetation fraction (gvf) | - +---------------------------+------------------------------------------+ - | visible_black_sky_albedo | visible black sky albedo | - +---------------------------+------------------------------------------+ - | visible_white_sky_albedo | visible white sky albedo | - +---------------------------+------------------------------------------+ - | near_IR_black_sky_albedo | near infrared black sky albedo | - +---------------------------+------------------------------------------+ - | near_IR_white_sky_albedo | near infrared white sky albedo | - +---------------------------+------------------------------------------+ - | soil_level_nodes | soil level nodes | - +---------------------------+------------------------------------------+ - | soil_level_thickness | soil level thickness | - +---------------------------+------------------------------------------+ - -Initial Conditions File (``ufs-land_C96_init_*.nc``) ------------------------------------------------------- - -The offline Land DA System currently only supports snow DA. -The initial conditions file includes the initial state variables that are required for the UFS land snow DA to begin a cycling run. The data must be provided in :term:`netCDF` format. 
- -The initial conditions file is available in the ``inputs`` data directory (downloaded :ref:`above `) at the following path: - -.. code-block:: - - inputs/forcing/era5/init/ufs-land_C96_init_2010-12-31_23-00-00.nc - -.. table:: Configuration variables specified in the initial forcing file (ufs-land_C96_init_fields_1hr.nc) - - +-----------------------------+----------------------------------------+ - | Configuration Variables | Units | - +=============================+========================================+ - | time | seconds since 1970-01-01 00:00:00 | - +-----------------------------+----------------------------------------+ - | date (date length) | UTC date | - +-----------------------------+----------------------------------------+ - | latitude | degrees north-south | - +-----------------------------+----------------------------------------+ - | longitude | degrees east-west | - +-----------------------------+----------------------------------------+ - | snow_water_equivalent | mm | - +-----------------------------+----------------------------------------+ - | snow_depth | m | - +-----------------------------+----------------------------------------+ - | canopy_water | mm | - +-----------------------------+----------------------------------------+ - | skin_temperature | K | - +-----------------------------+----------------------------------------+ - | soil_temperature | mm | - +-----------------------------+----------------------------------------+ - | soil_moisture | m\ :sup:`3`/m\ :sup:`3` | - +-----------------------------+----------------------------------------+ - | soil_liquid | m\ :sup:`3`/m\ :sup:`3` | - +-----------------------------+----------------------------------------+ - | soil_level_thickness | m | - +-----------------------------+----------------------------------------+ - | soil_level_nodes | m | - +-----------------------------+----------------------------------------+ - -Model Configuration File (``ufs-land.namelist.noahmp``) ----------------------------------------------------------- - -The UFS land model uses a series of template files combined with -user-selected settings to create required namelists and parameter -files needed by the UFS Land DA workflow. This section describes the -options in the ``ufs-land.namelist.noahmp`` file, which is generated -from the ``template.ufs-noahMP.namelist.era5`` file. - -.. note:: - - Any default values indicated are the defaults set in the ``template.ufs-noahMP.namelist.era5`` files. - -Run Setup Parameters -^^^^^^^^^^^^^^^^^^^^^^ - -``static_file`` - Specifies the path to the UFS land static file. - -``init_file`` - Specifies the path to the UFS land initial condition file. - -``forcing_dir`` - Specifies the path to the UFS land forcing directory where atmospheric forcing files are located. - -``separate_output`` - Specifies whether to enable separate output files for each output time. Valid values: ``.false.`` | ``.true.`` - - +----------+---------------------------------------+ - | Value | Description | - +==========+=======================================+ - | .false. | do not enable (should only be used | - | | for single point or short simulations)| - +----------+---------------------------------------+ - | .true. | enable | - +----------+---------------------------------------+ - -``output_dir`` - Specifies the output directory where output files will be saved. If ``separate_output=.true.``, but no ``output_dir`` is specified, it will default to the directory where the executable is run. 
- -``output_frequency_s`` - Specifies the output frequency (in seconds) for the UFS land model. - -``restart_frequency_s`` - Specifies the restart frequency (in seconds) for the UFS land model. - -``restart_simulation`` - Specifies whether to enable the restart simulation. Valid values: ``.false.`` | ``.true.`` - - +----------+----------------+ - | Value | Description | - +==========+================+ - | .false. | do not enable | - +----------+----------------+ - | .true. | enable | - +----------+----------------+ - -``restart_date`` - Specifies the restart date. The form is ``YYYY-MM-DD HH:MM:SS``, where - YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, - HH is a valid 2-digit hour, MM is a valid 2-digit minute, and SS is a valid 2-digit second. - -``restart_dir`` - Specifies the restart directory. - -``timestep_seconds`` - Specifies the land model timestep in seconds. - -``simulation_start`` - Specifies the simulation start time. The form is ``YYYY-MM-DD HH:MM:SS``, where - YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, - HH is a valid 2-digit hour, MM is a valid 2-digit minute, and SS is a valid 2-digit second. - -``simulation_end`` - Specifies the simulation end time. The form is ``YYYY-MM-DD HH:MM:SS``, where - YYYY is a 4-digit year, MM is a valid 2-digit month, DD is a valid 2-digit day, - HH is a valid 2-digit hour, MM is a valid 2-digit minute, and SS is a valid 2-digit second. - -``run_days`` - Specifies the number of days to run. - -``run_hours`` - Specifies the number of hours to run. - -``run_minutes`` - Specifies the number of minutes to run. - -``run_seconds`` - Specifies the number of seconds to run. - -``run_timesteps`` - Specifies the number of timesteps to run. - -``location_start`` -.. COMMENT: Add definition! - -``location_end`` -.. COMMENT: Add definition! - -Land Model Options -^^^^^^^^^^^^^^^^^^^^^ - -``land_model`` - Specifies which land surface model to use. Valid values: ``1`` | ``2`` - - +--------+-------------+ - | Value | Description | - +========+=============+ - | 1 | Noah | - +--------+-------------+ - | 2 | Noah-MP | - +--------+-------------+ - -Structure-Related Parameters -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -``num_soil_levels`` - Specifies the number of soil levels. - -``forcing_height`` - Specifies the forcing height in meters. - -Soil Setup Parameters -^^^^^^^^^^^^^^^^^^^^^^^ - -``soil_level_thickness`` - Specifies the thickness (in meters) of each of the soil layers (top layer to bottom layer). - -``soil_level_nodes`` - Specifies the soil level centroids from the surface (in meters). - -Noah-MP Options -^^^^^^^^^^^^^^^^^^ - -``dynamic_vegetation_option``: (Default: ``4``) - Specifies the dynamic vegetation model option. 
Valid values: ``1`` | ``2`` | ``3`` | ``4`` | ``5`` | ``6`` | ``7`` | ``8`` | ``9`` | ``10`` - - +-------+------------------------------------------------------------+ - | Value | Description | - +=======+============================================================+ - | 1 | off (use table LAI; use FVEG=SHDFAC from input) | - +-------+------------------------------------------------------------+ - | 2 | on (dynamic vegetation; must use Ball-Berry canopy option) | - +-------+------------------------------------------------------------+ - | 3 | off (use table LAI; calculate FVEG) | - +-------+------------------------------------------------------------+ - | 4 | off (use table LAI; use maximum vegetation fraction) | - +-------+------------------------------------------------------------+ - | 5 | on (use maximum vegetation fraction) | - +-------+------------------------------------------------------------+ - | 6 | on (use FVEG = SHDFAC from input) | - +-------+------------------------------------------------------------+ - | 7 | off (use input LAI; use FVEG = SHDFAC from input) | - +-------+------------------------------------------------------------+ - | 8 | off (use input LAI; calculate FVEG) | - +-------+------------------------------------------------------------+ - | 9 | off (use input LAI; use maximum vegetation fraction) | - +-------+------------------------------------------------------------+ - | 10 | crop model on (use maximum vegetation fraction) | - +-------+------------------------------------------------------------+ - -``canopy_stomatal_resistance_option``: (Default: ``2``) - Specifies the canopy stomatal resistance option. Valid values: ``1`` | ``2`` - - +--------+--------------+ - | Value | Description | - +========+==============+ - | 1 | Ball-Berry | - +--------+--------------+ - | 2 | Jarvis | - +--------+--------------+ - -``soil_wetness_option``: (Default: ``1``) - Specifies the soil moisture factor for the stomatal resistance option. Valid values: ``1`` | ``2`` | ``3`` - - +--------+-------------------------+ - | Value | Description | - +========+=========================+ - | 1 | Noah (soil moisture) | - +--------+-------------------------+ - | 2 | CLM (matric potential) | - +--------+-------------------------+ - | 3 | SSiB (matric potential) | - +--------+-------------------------+ - -``runoff_option``: (Default: ``1``) - Specifies the runoff option. 
Valid values: ``1`` | ``2`` | ``3`` | ``4`` | ``5`` - - +--------+-----------------------------------------------------------------------+ - | Value | Description | - +========+=======================================================================+ - | 1 | SIMGM: TOPMODEL with groundwater (:cite:t:`NiuEtAl2007`) | - +--------+-----------------------------------------------------------------------+ - | 2 | SIMTOP: TOPMODEL with an equilibrium water table | - | | (:cite:t:`NiuEtAl2005`) | - +--------+-----------------------------------------------------------------------+ - | 3 | Noah original surface and subsurface runoff (free drainage) | - | | (:cite:t:`SchaakeEtAl1996`) | - +--------+-----------------------------------------------------------------------+ - | 4 | BATS surface and subsurface runoff (free drainage) | - +--------+-----------------------------------------------------------------------+ - | 5 | Miguez-Macho & Fan groundwater scheme (:cite:t:`Miguez-MachoEtAl2007`;| - | | :cite:t:`FanEtAl2007`) | - +--------+-----------------------------------------------------------------------+ - -``surface_exchange_option``: (Default: ``3``) - Specifies the surface layer drag coefficient option. Valid values: ``1`` | ``2`` - - +--------+---------------------------+ - | Value | Description | - +========+===========================+ - | 1 | Monin-Obukhov | - +--------+---------------------------+ - | 2 | original Noah (Chen 1997) | - +--------+---------------------------+ - -``supercooled_soilwater_option``: (Default: ``1``) - Specifies the supercooled liquid water option. Valid values: ``1`` | ``2`` - - +--------+---------------------------------------------+ - | Value | Description | - +========+=============================================+ - | 1 | no iteration (:cite:t:`Niu&Yang2006`) | - +--------+---------------------------------------------+ - | 2 | Koren's iteration (:cite:t:`KorenEtAl1999`) | - +--------+---------------------------------------------+ - -``frozen_soil_adjust_option``: (Default: ``1``) - Specifies the frozen soil permeability option. Valid values: ``1`` | ``2`` - - +--------+-------------------------------------------------------------+ - | Value | Description | - +========+=============================================================+ - | 1 | linear effects, more permeable (:cite:t:`Niu&Yang2006`) | - +--------+-------------------------------------------------------------+ - | 2 | nonlinear effects, less permeable (:cite:t:`KorenEtAl1999`) | - +--------+-------------------------------------------------------------+ - -``radiative_transfer_option``: (Default: ``3``) - Specifies the radiation transfer option. Valid values: ``1`` | ``2`` | ``3`` - - +--------+--------------------------------------------------------------------+ - | Value | Description | - +========+====================================================================+ - | 1 | modified two-stream (gap = F(solar angle, 3D structure...)<1-FVEG) | - +--------+--------------------------------------------------------------------+ - | 2 | two-stream applied to grid-cell (gap = 0) | - +--------+--------------------------------------------------------------------+ - | 3 | two-stream applied to a vegetated fraction (gap=1-FVEG) | - +--------+--------------------------------------------------------------------+ - -``snow_albedo_option``: (Default: ``2``) - Specifies the snow surface albedo option. 
Valid values: ``1`` | ``2`` - - +--------+--------------+ - | Value | Description | - +========+==============+ - | 1 | BATS | - +--------+--------------+ - | 2 | CLASS | - +--------+--------------+ - -``precip_partition_option``: (Default: ``1``) - Specifies the option for partitioning precipitation into rainfall and snowfall. Valid values: ``1`` | ``2`` | ``3`` | ``4`` - - +--------+-----------------------------+ - | Value | Description | - +========+=============================+ - | 1 | :cite:t:`Jordan1991` (1991) | - +--------+-----------------------------+ - | 2 | BATS: when SFCTMP < TFRZ+2.2 | From: gspetro-NOAA Date: Thu, 8 Aug 2024 11:42:23 -0400 Subject: [PATCH 46/49] rm vector2tile info --- doc/source/CustomizingTheWorkflow/Model.rst | 169 -------------------- 1 file changed, 169 deletions(-) diff --git a/doc/source/CustomizingTheWorkflow/Model.rst b/doc/source/CustomizingTheWorkflow/Model.rst index 851daabc..4d27ad01 100644 --- a/doc/source/CustomizingTheWorkflow/Model.rst +++ b/doc/source/CustomizingTheWorkflow/Model.rst @@ -167,172 +167,3 @@ The ``C96_grid.tile*.nc`` files contain grid information for tiles 1-6 at C96 gr ``grid_spec.nc`` and ``C96.mosaic.nc`` are the same file under different names and may be used interchangeably. - -.. _VectorTileConverter: - -Vector-to-Tile Converter -*************************** - -The Vector-to-Tile Converter is used for mapping between the vector format -used by the Noah-MP offline driver and the tile format used by the UFS -atmospheric model. This converter is currently used to prepare input tile files -for JEDI. Note that these files include only those fields required by -JEDI, rather than the full restart. - -.. _V2TInputFiles: - -Input File ============= - -The input files containing grid information are listed in :numref:`Table %s `: - -.. _GridInputFiles: - -.. list-table:: Input Files Containing Grid Information - :widths: 30 70 - :header-rows: 1 - - * - Filename - - Description - * - Cxx_grid.tile[1-6].nc - - Cxx grid information for tiles 1-6, where ``xx`` is the grid resolution. - * - Cxx_oro_data.tile[1-6].nc - - oro_Cxx.mx100.tile[1-6].nc - - - Orography files that contain grid and land mask information. - Cxx refers to the atmospheric resolution, and mx100 refers to the ocean - resolution (100=1º). Both file names refer to the same file; there are symbolic links between them. - -Configuration File ====================== - -This section describes the options in the ``namelist.vector2tile`` file (derived from ``parm/template.vector2tile`` files. ) - -Run Setup Parameters ---------------------- - -``direction`` - Specifies the conversion option. 
Valid values: ``vector2tile`` | ``tile2vector`` | ``lndp2tile`` | ``lndp2vector`` - - +--------------+---------------------------------------------+ - | Value | Description | - +==============+=============================================+ - | vector2tile | vector-to-tile conversion for restart file | - +--------------+---------------------------------------------+ - | tile2vector | tile-to-vector conversion for restart file | - +--------------+---------------------------------------------+ - | lndp2tile | land perturbation to tile | - +--------------+---------------------------------------------+ - | lndp2vector | land perturbation to vector | - +--------------+---------------------------------------------+ - -FV3 Tile-Related Parameters for Restart/Perturbation Conversion ---------------------------------------------------------------- - -Parameters in this section include the FV3 resolution and path to orographic files -for restart/perturbation conversion. - -``tile_size`` - Specifies the size (horizontal resolution) of the FV3 tile. Valid values: ``96``. - - .. note:: - - * The ``C96`` grid files correspond to approximately 1º latitude/longitude. - * Additional resolutions (e.g., ``192``, ``384``, ``768``) are under development. - -``tile_path`` - Specifies the path to the orographic tile files. - -``tile_fstub`` - Specifies the name (file stub) of orographic tile files. The file stub will be named ``oro_C${RES}`` for atmosphere-only and ``oro_C{RES}.mx100`` for atmosphere and ocean. - -Parameters for Restart Conversion ------------------------------------- - -These parameters apply *only* to restart conversion. - -``restart_date`` - Specifies the time stamp for restart conversion in "YYYY-MM-DD HH:00:00" format. - -``static_filename`` - Specifies the path for static file. - -``vector_restart_path`` - Specifies the location of vector restart file, vector-to-tile direction. - -``tile_restart_path`` - Specifies the location of tile restart file, tile-to-vector direction. - -``output_path`` - Specifies the path for converted files. If this is same as tile/vector path, the files may be overwritten. - -Perturbation Mapping Parameters ----------------------------------- - -These parameters are *only* relevant for perturbation mapping in ensembles. -Support for ensembles is *not* provided for the Land DA v1.0.0 release. - -``lndp_layout`` - Specifies the layout options. Valid values: ``1x4`` | ``4x1`` | ``2x2`` - -``lndp_input_file`` - Specifies the path for the input file. - -``output files`` - Specifies the path for the output file. - -``lndp_var_list`` - Specifies the land perturbation variable options. Valid values: ``vgf`` | ``smc`` - - +-------+------------------------------------------+ - | Value | Description | - +=======+==========================================+ - | vgf | Perturbs the vegetation green fraction | - +-------+------------------------------------------+ - | smc | Perturbs the soil moisture | - +-------+------------------------------------------+ - -Example of a ``namelist.vector2tile`` Entry ----------------------------------------------- - -.. code-block:: console - - &run_setup - - direction = "vector2tile" - - &FV3 resolution and path to oro files for restart/perturbation - conversion - - tile_size = 96 - tile_path ="/ /" - tile_fstub = "oro_C96.mx100" - - !------------------- only restart conversion ------------------- - - ! Time stamp for conversion for restart conversion - restart_date = "2019-09-30 23:00:00" - - ! 
Path for static file - static_filename="/*/filename.nc " - - ! Location of vector restart file (vector2tile direction) - vector_restart_path ="/ /" - - ! Location of tile restart files (tile2vector direction) - tile_restart_path ="/ /" - - output_path ="/ /" - - !------------------- only perturbation mapping ------------------- - lndp_layout = "1x4" - - ! input files - lndp_input_file ="/*/filename.nc " - - ! output files - lndp_output_file = "./output.nc" - - ! land perturbation variable list - lndp_var_list='vgf','smc' From 818be7538125e72c4ea7b31a4ec98a274391dde2 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 8 Aug 2024 11:43:34 -0400 Subject: [PATCH 47/49] add info on grid description files to DA chapter/IODA section --- .../CustomizingTheWorkflow/DASystem.rst | 21 ++++++++++++++++++- 1 file changed, 20 insertions(+), 1 deletion(-) diff --git a/doc/source/CustomizingTheWorkflow/DASystem.rst b/doc/source/CustomizingTheWorkflow/DASystem.rst index dd4d6a85..fa47c5f8 100644 --- a/doc/source/CustomizingTheWorkflow/DASystem.rst +++ b/doc/source/CustomizingTheWorkflow/DASystem.rst @@ -542,7 +542,26 @@ The Land DA System requires grid description files, observation files, and resta Grid Description Files ========================= -The grid description files appear in :numref:`Section %s ` and are also used as input files to the Vector-to-Tile Converter and the UFS land component. See :numref:`Table %s ` for a description of these files. +The grid description files appear in :numref:`Table %s ` below: + +.. _GridInputFiles: + +.. list-table:: Input Files Containing Grid Information + :widths: 30 70 + :header-rows: 1 + + * - Filename + - Description + * - Cxx_grid.tile[1-6].nc + - Cxx grid information for tiles 1-6, where ``xx`` is the grid resolution. + * - Cxx_oro_data.tile[1-6].nc + + oro_Cxx.mx100.tile[1-6].nc + + - Orography files that contain grid and land mask information. + Cxx refers to the atmospheric resolution, and mx100 refers to the ocean + resolution (100=1º). Both file names refer to the same file; there are symbolic links between them. + .. _observation-data: From e98cb6ee1fcc4099b0ae2ea768514d378c7f5f36 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 8 Aug 2024 12:27:23 -0400 Subject: [PATCH 48/49] rm run without rocoto section --- .../BuildingRunningTesting/BuildRunLandDA.rst | 16 +--------------- 1 file changed, 1 insertion(+), 15 deletions(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index cd02195c..621209f9 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -207,10 +207,6 @@ Users may run these tasks :ref:`using the Rocoto workflow manager Run With Rocoto ================= -.. note:: - - Users who do not have Rocoto installed on their system can view :numref:`Section %s: Run Without Rocoto `. - To run the experiment, users can automate job submission via :term:`crontab` or submit tasks manually via ``rocotorun``. Automated Run @@ -275,17 +271,7 @@ Note that the status table printed by ``rocotostat`` only updates after each ``r The experiment has successfully completed when all tasks say SUCCEEDED under STATE. Other potential statuses are: QUEUED, SUBMITTING, RUNNING, and DEAD. Users may view the log files to determine why a task may have failed. -.. _run-batch-script: - -Run Without Rocoto --------------------- - -Users may choose to run the workflow *without* ``uwtools`` and Rocoto for a non-cycled run. 
To run the :term:`J-job ` scripts in the ``jobs`` directory, navigate to the ``parm`` directory and edit ``run_without_rocoto.sh`` (e.g., using vim or preferred command line editor). Users will likely need to change the ``MACHINE``, ``ACCOUNT``, and ``EXP_BASEDIR`` variables to match their system. Then, run the script: - -.. code-block:: console - - cd $LANDDAROOT/land-DA_workflow/parm - sbatch run_without_rocoto.sh +.. _check-output: Check Experiment Output ========================= From a492e38fce31f6eec0062ccf1f9c0d9aa7661035 Mon Sep 17 00:00:00 2001 From: gspetro-NOAA Date: Thu, 8 Aug 2024 12:42:11 -0400 Subject: [PATCH 49/49] update wording re: landdaroot --- doc/source/BuildingRunningTesting/BuildRunLandDA.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst index 621209f9..629d166b 100644 --- a/doc/source/BuildingRunningTesting/BuildRunLandDA.rst +++ b/doc/source/BuildingRunningTesting/BuildRunLandDA.rst @@ -15,7 +15,7 @@ This chapter provides instructions for building and running basic Land DA cases Create a Working Directory ***************************** -Create a base directory for the Land DA experiment and navigate into it: +Users can either create a new directory for their Land DA work or choose an existing directory, depending on preference. Then, users should navigate to this directory. For example, to create a new directory and navigate to it, run: .. code-block:: console