Existing VIC RASM workflow
The workflow in the VIC implementation within RASM differs from the classic VIC workflow. The main reasons for this are:

- The model needs to run in space-first mode (VIC image), which means that all grid cells are processed for a single time step before time is advanced.
- The model needs to run on multiple processors using MPI.
- The model needs to communicate with the CESM coupler (CPL7).
- Since CPL7 (and all the other RASM component models) are written in Fortran, a Fortran-C interface must be used to invoke VIC functions.
In addition, there are a number of changes with respect to the stand-alone version of VIC. Most of these changes should make their way into the original VIC source code:

- NetCDF support for writing output files and reading and writing model state files. However, model parameter
- No support for running the MTCLIM code as part of VIC. Meteorological forcings are provided via the coupler and are all generated independently of VIC.
- Support for temporal averaging of output variables within the model.
- Support for internal model time steps of less than one hour.
Note that the VIC code that is currently in RASM is old (VIC 4.0.4 or thereabouts) and is already a modification that was used in earlier coupling attempts.
In the existing implementation, the routines that manage the communication between CLM and CPL7 have been reused to manage the communication between VIC and CPL7. The result is that there is a lot of extraneous code that is never used and which makes debugging and code maintenance more confusing than it needs to be.
One potential problem is that the MPI code used to distribute VIC across multiple cores appears to overallocate memory. That is, VIC seems to allocate memory for the entire domain on each core rather than only for the model grid cells managed by that core. This is potentially one of the reasons why VIC does not scale well beyond a certain number of nodes, although the main reason may simply be that the computational requirements of the current version of VIC are low. The overallocation has not been a problem so far because VIC's memory requirements are modest, but the new VIC version is more memory intensive, and simulations with many more grid cells in their domain are likely to run into problems.
The following tracks the communication between CPL7 and VIC via the CLM-interface and the main VIC workflow. Not all function calls are mentioned, only the top level calls. This is based on RASM tag 22.
All the VIC code is within the `racm/rasm/trunk/models/lnd/vic` subtree in the RASM svn repository:
|-vic
|---bld # build scripts and namelist template
|---src # source code tree
|-----biogeochem # CLM biogeochemistry code - not used
|-----biogeophys # CLM biogeophysics code - not used
|-----main # CLM main routines - some of these are used to communicate between CPL7 and VIC
|-----riverroute # CLM river routing code - not used (rvic code is used instead)
|-----vic # VIC source code
Although some of these directories contain code that is not used at all, they are currently referenced during the build phase and through `use` statements in the Fortran code that is needed. Further cleanup of the code is required before the extra code can be safely removed.
In the coupled environment, the coupler makes all the calls to VIC using the calls in `main/lnd_comp_mct.F90`:

- `main/lnd_comp_mct.F90:lnd_init_mct()`: called at the start of the run.
- `main/lnd_comp_mct.F90:lnd_run_mct()`: called every time step.
- `main/lnd_comp_mct.F90:lnd_final_mct()`: called at the end of the run.