# Table of contents
1. [What is WRF](#what-is-wrf)
1. [Quick start](#quick-start)
1. [Basic usage](#basic-usage)
    * [Organization of the source code](#organization-of-the-source-code)
    * [Compiling the model](#compiling-the-model)
    * [Before running the model](#)
        * [Defining the vertical grid](#)
        * [Defining a new geographical database](#)
        * [Using ECMWF data as IC/BC](#)
        * [Spinning up soil fields](#)
    * [After running the model](#)
        * [Interpolating model output to a new grid](#)
# Quick start
Compiling WRF for an idealized simulation (LES):
```
./configure
./compile em_les > compile.log 2>&1 &
```
Running WRF for an idealized simulation (LES):
```
cd ./test/em_les
./ideal.exe
./wrf.exe
```
For other test cases, compilation might create a `run_me_first.csh` script in the same directory as the executables. If there is one, run it once, before any other program: it links the lookup tables needed by the simulation (land use, parameterizations, etc.).
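For instance, a minimal sketch (the test-case name `em_somecase` is a placeholder for any case that ships such a script):
```
cd ./test/em_somecase    # hypothetical test case providing run_me_first.csh
./run_me_first.csh       # run once; links the required lookup tables into this directory
./ideal.exe
./wrf.exe
```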
Compiling WRF for a real-case simulation:
```
./configure
./compile em_real > compile.log 2>&1 &
```
Running WRF for a real-case simulation:
```
cd test/em_real
ln -s $WPS_PATH/met_em* .
./real.exe
./wrf.exe
```
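If WRF was compiled with a dmpar configuration (see the compilation notes below), the real-case executables can also be launched in parallel; a sketch, assuming an MPI launcher like the `mpirun` used elsewhere on this page and an arbitrary core count:
```
mpirun -np 32 ./real.exe
mpirun -np 32 ./wrf.exe
```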
To run the WRF pre-processing for a real-case simulation with initial and boundary conditions from ECMWF-IFS data on model levels, you could use a script such as the following. It depends, however, on namelists, variable tables and other settings files being correctly specified; see below for details.
```
#!/bin/bash
set -eu
# Set paths
date=20190726.0000
gribdir=/users/staff/serafin/data/GRIB_IC_for_LAM/ECMWF/TEAMx_convection/
# Run WPS
./geogrid.exe
./link_grib.csh ${gribdir}/${date}/*
./ungrib.exe
./calc_ecmwf_p.exe
./avg_tsfc.exe
mpirun -np 32 ./metgrid.exe
# Archive results and clean up
archive=./archive/TEAMxConv_${date}
mkdir -p ${archive}
mv geo_em.d0?.nc met_em*nc ${archive}
cp namelist.wps geogrid/GEOGRID.TBL.HIRES ${archive}
rm -fr FILE* PRES* TAVGSFC GRIBFILE* metgrid.log.*
```
# Basic usage
Load modules with `module load LIST-OF-MODULE-NAMES`, unload them one by one with `module unload LIST-OF-MODULE-NAMES`, unload all of them at the same time with `module purge`, and get information about a specific module with `module show MODULE_NAME`. Modules may depend on each other. If the system is set up properly, a request to load one module will automatically load any other prerequisite ones.
After loading the modules, it is also recommended to set the `NETCDF` environment variable to the root directory of the netCDF installation. Use `module show` to see which directory is correct. For instance:
```
(skylake) [serafins@l46 TEAMx_real]$ module list
Currently Loaded Modulefiles:
1) pkgconf/1.8.0-intel-2021.5.0-bkuyrr7 4) zlib/1.2.12-intel-2021.5.0-pctnhmb 7) netcdf-fortran/4.6.0-intel-2021.5.0-pnaropy
2) intel-oneapi-compilers/2022.1.0-gcc-8.5.0-kiyqwf7 5) hdf5/1.12.2-intel-2021.5.0-loke5pd
3) intel-oneapi-mpi/2021.6.0-intel-2021.5.0-wpt4y32 6) netcdf-c/4.8.1-intel-2021.5.0-hmrqrz2
(skylake) [serafins@l46 TEAMx_real]$ module show netcdf-fortran/4.6.0-intel-2021.5.0-pnaropy
-------------------------------------------------------------------
/opt/sw/spack-0.19.0/var/spack/environments/skylake/modules/linux-almalinux8-skylake/netcdf-fortran/4.6.0-intel-2021.5.0-pnaropy:
module-whatis {NetCDF (network Common Data Form) is a set of software libraries and machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. This is the Fortran distribution.}
prepend-path PATH /gpfs/opt/sw/spack-0.19.0/opt/spack/linux-almalinux8-skylake/intel-2021.5.0/netcdf-fortran-4.6.0-pnaropyoft7hicu7bfsugqa2aqcsggxj/bin
prepend-path LIBRARY_PATH /gpfs/opt/sw/spack-0.19.0/opt/spack/linux-almalinux8-skylake/intel-2021.5.0/netcdf-fortran-4.6.0-pnaropyoft7hicu7bfsugqa2aqcsggxj/lib
prepend-path LD_LIBRARY_PATH /gpfs/opt/sw/spack-0.19.0/opt/spack/linux-almalinux8-skylake/intel-2021.5.0/netcdf-fortran-4.6.0-pnaropyoft7hicu7bfsugqa2aqcsggxj/lib
prepend-path CPATH /gpfs/opt/sw/spack-0.19.0/opt/spack/linux-almalinux8-skylake/intel-2021.5.0/netcdf-fortran-4.6.0-pnaropyoft7hicu7bfsugqa2aqcsggxj/include
prepend-path MANPATH /gpfs/opt/sw/spack-0.19.0/opt/spack/linux-almalinux8-skylake/intel-2021.5.0/netcdf-fortran-4.6.0-pnaropyoft7hicu7bfsugqa2aqcsggxj/share/man
prepend-path PKG_CONFIG_PATH /gpfs/opt/sw/spack-0.19.0/opt/spack/linux-almalinux8-skylake/intel-2021.5.0/netcdf-fortran-4.6.0-pnaropyoft7hicu7bfsugqa2aqcsggxj/lib/pkgconfig
prepend-path CMAKE_PREFIX_PATH /gpfs/opt/sw/spack-0.19.0/opt/spack/linux-almalinux8-skylake/intel-2021.5.0/netcdf-fortran-4.6.0-pnaropyoft7hicu7bfsugqa2aqcsggxj/
-------------------------------------------------------------------
(skylake) [serafins@l46 TEAMx_real]$ export NETCDF=/gpfs/opt/sw/spack-0.19.0/opt/spack/linux-almalinux8-skylake/intel-2021.5.0/netcdf-fortran-4.6.0-pnaropyoft7hicu7bfsugqa2aqcsggxj
(skylake) [serafins@l46 TEAMx_real]$ env|grep NETCDF
NETCDF=/gpfs/opt/sw/spack-0.19.0/opt/spack/linux-almalinux8-skylake/intel-2021.5.0/netcdf-fortran-4.6.0-pnaropyoft7hicu7bfsugqa2aqcsggxj
```
The environment must be consistent between compilation and runtime. If you compile WRF with a set of modules loaded, you must run it with the same set of modules.
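For instance, a batch job that runs WRF could re-create the compile-time environment before launching the executable. A minimal sketch, assuming the SLURM scheduler and the module names shown above (adapt both to your system):
```
#!/bin/bash
#SBATCH --job-name=wrf
#SBATCH --ntasks=32
# Load exactly the same modules that were loaded when WRF was compiled
module purge
module load intel-oneapi-compilers/2022.1.0-gcc-8.5.0-kiyqwf7 intel-oneapi-mpi/2021.6.0-intel-2021.5.0-wpt4y32 \
            hdf5/1.12.2-intel-2021.5.0-loke5pd netcdf-c/4.8.1-intel-2021.5.0-hmrqrz2 \
            netcdf-fortran/4.6.0-intel-2021.5.0-pnaropy
# Launch WRF with MPI
mpirun -np 32 ./wrf.exe
```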
* *Configure WRF for compilation.* This will test the system to check that all libraries can be properly linked. Type `./configure`, pick a generic dmpar INTEL (ifort/icc) configuration (usually 15), answer 1 when asked if you want to compile for nesting, then hit enter. "dmpar" means "distributed memory parallelization" and enables running WRF in parallel computing mode. For test compilations or for a toy setup, you might also choose a "serial" configuration.
### Using ECMWF data as IC/BC
In short: link the grib1 files and process them with `ungrib.exe` using `Vtable.ECMWF_sigma`.
In more detail: for several years now, ECMWF has been distributing a mixture of grib1 and grib2 files, namely:
* grib1 files for surface fields and soil levels.
* grib2 files for atmospheric model levels.
The WPS has a predefined Vtable for grib1 files from ECMWF, so the easiest way to process ECMWF data is to:
1. convert model-level grib2 files to grib1
2. if necessary, for every time stamp, concatenate the model-level and surface grib1 files into a single file. This is only necessary if the grib1 and grib2 data were downloaded as separate sets of GRIB files.
3. process the resulting files with ungrib after linking `ungrib/Variable_Tables/Vtable.ECMWF_sigma` as `Vtable`
In detail:
1. Conversion to grib1 (needs the `grib_set` utility from ecCodes):

   ```
   # Convert each model-level grib2 file to grib1; the output file gets a .grib1 extension
   for i in det.CROSSINN.mlv.20190913.0000.f*.grib2; do j=`basename $i .grib2`.grib1; grib_set -s deletePV=1,edition=1 ${i} ${j}; done
   ```

2. Concatenation of grib files (the two sets of files, `*mlv*` and `*sfc*`, with names ending in "grib1", yield a new set of files with names ending in "grib"; all of them are grib1):

   ```
   # Append the surface fields to the model-level fields, time stamp by time stamp
   for i in det.CROSSINN.mlv.20190913.0000.f*.grib1; do j=`echo $i|sed 's/.mlv./.sfc./'`; k=`echo $i|sed 's/.mlv././'|sed 's/.grib1/.grib/'`; cat $i $j > $k; done
   ```

3. In the WPS main directory:

   ```
   ./link_grib.csh /data/GRIB_IC_for_LAM/ECMWF/20190913_CROSSINN_IOP8/det.CROSSINN.20190913.0000.f*.grib
   ln -s ungrib/Variable_Tables/Vtable.ECMWF_sigma Vtable
   ./ungrib.exe
   ```
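After `ungrib.exe` completes, intermediate files with the prefix defined in the `&ungrib` namelist record (`FILE` by default) should be present; a quick check could be:
```
ls -l FILE:2019-09-13_*
```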
An alternative procedure is to convert everything to grib2 instead of grib1. One then has to use a Vtable with grib2 information for the surface fields, for instance the one included below. However, data from the bottom soil level will not be read correctly with this Vtable, because the Level2 value for the bottom level is actually missing in the grib2 files (at the time of writing, 6 May 2022; this may be fixed in the future). A quick way to inspect the level encoding with ecCodes is sketched after the table.
```
GRIB1| Level| From | To | metgrid | metgrid | metgrid |GRIB2|GRIB2|GRIB2|GRIB2|
Param| Type |Level1|Level2| Name | Units | Description |Discp|Catgy|Param|Level|
-----+------+------+------+----------+----------+------------------------------------------+-----------------------+
130 | 109 | * | | TT | K | Temperature | 0 | 0 | 0 | 105 |
131 | 109 | * | | UU | m s-1 | U | 0 | 2 | 2 | 105 |
132 | 109 | * | | VV | m s-1 | V | 0 | 2 | 3 | 105 |
133 | 109 | * | | SPECHUMD | kg kg-1 | Specific humidity | 0 | 1 | 0 | 105 |
152 | 109 | * | | LOGSFP | Pa | Log surface pressure | 0 | 3 | 25 | 105 |
129 | 109 | * | | SOILGEO | m | Surface geopotential | 0 | 3 | 4 | 1 |
 | 109 | * | | SOILHGT | m | Terrain field of source analysis | 0 | 3 | 5 | 1 |
134 | 109 | 1 | | PSFCH | Pa | | 0 | 3 | 0 | 1 |
157 | 109 | * | | RH | % | Relative Humidity | 0 | 1 | 1 | 105 |
165 | 1 | 0 | | UU | m s-1 | U | 0 | 2 | 2 | 103 |
166 | 1 | 0 | | VV | m s-1 | V | 0 | 2 | 3 | 103 |
167 | 1 | 0 | | TT | K | Temperature | 0 | 0 | 0 | 103 |
168 | 1 | 0 | | DEWPT | K | | 0 | 0 | 6 | 103 |
172 | 1 | 0 | | LANDSEA | 0/1 Flag | Land/Sea flag | 2 | 0 | 0 | 1 |
151 | 1 | 0 | | PMSL | Pa | Sea-level Pressure | 0 | 3 | 0 | 101 |
235 | 1 | 0 | | SKINTEMP | K | Sea-Surface Temperature | 0 | 0 | 17 | 1 |
34 | 1 | 0 | | SST | K | Sea-Surface Temperature | 10 | 3 | 0 | 1 |
139 | 112 | 0| 700| ST000007 | K | T of 0-7 cm ground layer | 192 | 128 | 139 | 106 |
170 | 112 | 700| 2800| ST007028 | K | T of 7-28 cm ground layer | 192 | 128 | 170 | 106 |
183 | 112 | 2800| 10000| ST028100 | K | T of 28-100 cm ground layer | 192 | 128 | 183 | 106 |
236 | 112 | 10000| 0| ST100289 | K | T of 100-289 cm ground layer | 192 | 128 | 236 | 106 |
39 | 112 | 0| 700| SM000007 | fraction | Soil moisture of 0-7 cm ground layer | 192 | 128 | 39 | 106 |
40 | 112 | 700| 2800| SM007028 | fraction | Soil moisture of 7-28 cm ground layer | 192 | 128 | 40 | 106 |
41 | 112 | 2800| 10000| SM028100 | fraction | Soil moisture of 28-100 cm ground layer | 192 | 128 | 41 | 106 |
42 | 112 | 10000| 0| SM100289 | fraction | Soil moisture of 100-289 cm ground layer | 192 | 128 | 42 | 106 |
-----+------+------+------+----------+----------+------------------------------------------+-----------------------+
```
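To inspect how the soil layers are encoded in a given grib2 file, the ecCodes command-line tools can be used, e.g. (a sketch; the file name is just an example):
```
# List the level encoding of each field; bottomLevel is the value that may be missing
grib_ls -p shortName,typeOfLevel,topLevel,bottomLevel converted_soil_fields.grib2
```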
### Spinning up soil fields
## After running the model
### Converting model output to CF-compliant NetCDF
1. To convert WRF output to CF-compliant NetCDF, use `wrfout_to_cf.ncl` (from <https://sundowner.colorado.edu/wrfout_to_cf/overview.html>):
   ```
   ncl 'file_in="wrfinput_d01"' 'file_out="wrfpost.nc"' wrfout_to_cf.ncl
   ```
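To quickly check the result, the header of the converted file can be inspected, e.g. with `ncdump` (assuming the output file name used above):
```
ncdump -h wrfpost.nc | less
```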
### Interpolating model output to a new grid
1. First convert to CF-compliant NetCDF (see above)
1. Then use cdo to interpolate the CF-compliant WRF output:

   ```
   cdo -remapnn,gridfile.lonlat.txt wrfpost.nc wrfpost_interpolated.nc
   ```

1. In the command above, `-remapnn` selects the interpolation method, in this case nearest-neighbour. See alternatives here: <https://code.mpimet.mpg.de/projects/cdo/wiki/Tutorial#Horizontal-fields> (an example with a different operator follows the grid file below).
1. The file `gridfile.lonlat.txt` contains the target grid specification, e.g.:

   ```
   gridtype  = lonlat
   gridsize  = 721801
   xsize     = 1201
   ysize     = 601
   xname     = lon
   xlongname = "longitude"
   xunits    = "degrees_east"
   yname     = lat
   ylongname = "latitude"
   yunits    = "degrees_north"
   xfirst    = 5.00
   xinc      = 0.01
   yfirst    = 43.00
   yinc      = 0.01
   ```
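As an example of an alternative operator, bilinear remapping onto the same grid would read (a sketch; `remapbil` is another standard cdo remapping operator):
```
cdo -remapbil,gridfile.lonlat.txt wrfpost.nc wrfpost_interpolated.nc
```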
### Subsetting model output
### Further compression of model output (data packing)
### 3D visualization
# Useful tools