%% Cell type:markdown id:09371891-3ca3-404e-aeb3-7b58b900f563 tags:
# Tutorial 1: The assimilation step
DART-WRF is a Python package that automates the data assimilation workflow: configuration, archiving of settings and output, handling of computing resources, and more.
The data for this experiment is accessible to students on the server srvx1.
%% Cell type:markdown id:93d59d4d-c514-414e-81fa-4ff390290811 tags:
### Configuring the experiment
First, you need to configure the experiment in `config/cfg.py`.
Let's go through the most important settings:
- `expname` should be a unique identifier; it is used as the folder name of the experiment
- `model_dx` is the model resolution in meters
- `n_ens` is the ensemble size
- `update_vars` are the WRF variables which shall be updated by the assimilation
- `filter_kind` is 1 for the EAKF (see the DART documentation for other filters)
- `prior_inflation` and `post_inflation` define which inflation scheme to use (see the DART documentation)
- `sec` enables the statistical sampling error correction from Anderson (2012)
```python
exp = utils.Experiment()
exp.expname = "test_newcode"
exp.model_dx = 2000
exp.n_ens = 10
exp.update_vars = ['U', 'V', 'W', 'THM', 'PH', 'MU', 'QVAPOR', 'QCLOUD', 'QICE', 'PSFC']
exp.filter_kind = 1
exp.prior_inflation = 0
exp.post_inflation = 4
exp.sec = True
```
In case you want to generate new observations, e.g. for an observing system simulation experiment (OSSE), set
```python
exp.use_existing_obsseq = False
```
`exp.nature` defines which WRF files will be used to draw observations from, e.g.:
```python
exp.nature = '/users/students/lehre/advDA_s2023/data/sample_nature/'
```
`exp.input_profile` is used if you create initial conditions from a so-called `wrf_profile` (see the WRF documentation).
```python
exp.input_profile = '/doesnt_exist/initial_profiles/wrf/ens/raso.fc.<iens>.wrfprof'
```
For a vertical localization half-width of 3 km and a horizontal one of 20 km, set
```python
exp.cov_loc_vert_km_horiz_km = (3, 20)
```
You can also set it to `False` for no vertical localization.
#### Single observation
Set your desired observations like this:
```python
t = dict(plotname='Temperature', plotunits='[K]',
         kind='RADIOSONDE_TEMPERATURE',
         n_obs=1,                    # number of observations
         obs_locations=[(45., 0.)],  # location of observations
         error_generate=0.2,         # observation error used to generate observations
         error_assimilate=0.2,       # observation error used for assimilation
         heights=[1000,],            # for radiosondes, use range(1000, 17001, 2000)
         cov_loc_radius_km=50)       # horizontal localization half-width

exp.observations = [t,]  # select observations for assimilation
```
#### Multiple observations
To generate a grid of observations, use
```python
vis = dict(plotname='VIS 0.6µm', plotunits='[1]',
           kind='MSG_4_SEVIRI_BDRF', sat_channel=1,
           n_obs=961, obs_locations='square_array_evenly_on_grid',
           error_generate=0.03, error_assimilate=0.03,
           cov_loc_radius_km=20)
```
Caution: since the observations are placed in a square array, `n_obs` should only be one of the following perfect squares (e.g. 961 = 31 × 31):
- 22500 for 2 km observation density/resolution
- 5776 for 4 km
- 961 for 10 km
- 256 for 20 km
- 121 for 30 km

For vertically resolved data, such as radar, `n_obs` is the number of observations at each observation height level.
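To assimilate more than one observation type at once, put the corresponding dictionaries into the same list. A minimal sketch, reusing the `t` and `vis` dictionaries defined above:
```python
# assimilate the radiosonde temperature and the satellite reflectance together
exp.observations = [t, vis]
```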
%% Cell type:markdown id:16bd3521-f98f-4c4f-8019-31029fd678ae tags:
### Configuring the hardware
If you use a cluster that is not yet supported, configure the paths inside `config/clusters.py`.
### Assimilate observations
We start by importing some modules:
```python
import datetime as dt
from dartwrf.workflows import WorkFlows
```
To assimilate observations at a `dt.datetime` `time`, we set the directory paths and times of the prior ensemble forecasts:
```python
prior_path_exp = '/users/students/lehre/advDA_s2023/data/sample_ensemble/'
prior_init_time = dt.datetime(2008,7,30,12)
prior_valid_time = dt.datetime(2008,7,30,12,30)
assim_time = prior_valid_time
```
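Before submitting anything, it can be useful to check that the prior ensemble directory actually exists. This is a minimal sketch, not part of the DART-WRF API:
```python
import os

# fail early if the prior ensemble is not where we expect it
assert os.path.exists(prior_path_exp), prior_path_exp + ' does not exist'
```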
Finally, we run the data assimilation by calling
```python
w = WorkFlows(exp_config='cfg.py', server_config='srvx1.py')
w.assimilate(assim_time, prior_init_time, prior_valid_time, prior_path_exp)
```
Congratulations! You're done!
%% Cell type:markdown id:02ac2a1f-272e-4f23-960d-e7f95d683b53 tags:
# Tutorial 2: Forecast after DA
**Goal**: To run an ensemble forecast after data assimilation.

[`cycled_exp.py`](https://github.com/lkugler/DART-WRF/blob/master/generate_free.py) contains an example, which is explained here.
Now there are two options:
1) To start a forecast from an existing forecast, i.e. from WRF restart files
2) To start a forecast from defined thermodynamic profiles, i.e. from a `wrf_profile`
### Restart a forecast
To run a forecast from the initial conditions of a previous forecast, we import these modules:
```python
import datetime as dt
from dartwrf.workflows import WorkFlows
```
Let's say you want to run a forecast from 9 UTC until 12 UTC.
Initial conditions shall be taken from a previous experiment in `/user/test/data/sim_archive/exp_abc`, which was initialized at 6 UTC and provides WRF restart files for 9 UTC.
Then the code would be
```python
prior_path_exp = '/user/test/data/sim_archive/exp_abc'
prior_init_time = dt.datetime(2008,7,30,6)
prior_valid_time = dt.datetime(2008,7,30,9)

w = WorkFlows(exp_config='cfg.py', server_config='srvx1.py')

begin = dt.datetime(2008, 7, 30, 9)
end = dt.datetime(2008, 7, 30, 12)

w.prepare_WRFrundir(begin)
w.prepare_IC_from_prior(prior_path_exp, prior_init_time, prior_valid_time)
w.run_ENS(begin=begin,                   # start integration from here
          end=end,                       # integrate until here
          output_restart_interval=9999,  # do not write WRF restart files
          )
```
Note that we use predefined workflow functions like `run_ENS`.
### Forecast run after Data Assimilation

To continue after assimilation, you need the posterior, i.e. posterior = prior (step 1) + assimilation increments (step 2). Note that each workflow call returns a job id; passing it on via `depends_on` chains the jobs, so that every step waits for the previous one.

1) Set posterior = prior
```python
id = w.prepare_IC_from_prior(prior_path_exp, prior_init_time, prior_valid_time, depends_on=id)
```
2) Update the posterior with increments from assimilation

After this step, the WRF restart files (`wrfrst`) are updated with the assimilation increments from the DART output and copied to the WRF run directories, so that the forecast ensemble can be continued.
```python
id = w.update_IC_from_DA(time, depends_on=id)
```
3) Define how long you want to run the forecast and when WRF should write restart files. Since restart files take a lot of disk space, we want as few as possible.
```python
timedelta_integrate = dt.timedelta(hours=5)
output_restart_interval = 9999 # any value larger than the forecast duration
```
If you run DA in cycles of 15 minutes, this becomes
```python
timedelta_integrate = dt.timedelta(hours=5)
timedelta_btw_assim = dt.timedelta(minutes=15)
output_restart_interval = timedelta_btw_assim.total_seconds()/60  # in minutes
```
4) Run the WRF ensemble
```python
id = w.run_ENS(begin=time,                     # start integration from here
               end=time + timedelta_integrate, # integrate until here
               output_restart_interval=output_restart_interval,
               depends_on=id)
```
%% Cell type:markdown id:8ced2b8b-6829-4b16-acb7-03fd1b8b0ff8 tags:
# Tutorial 3: Cycle forecast and assimilation
**Goal**: To run a cycled data assimilation experiment.
[`cycled_exp.py`](https://github.com/lkugler/DART-WRF/blob/master/generate_free.py) contains an example, which is explained here.

For example, your experiment can look like this:
```python
prior_path_exp = '/jetfs/home/lkugler/data/sim_archive/exp_v1.19_P2_noDA'

init_time = dt.datetime(2008, 7, 30, 13)
time = dt.datetime(2008, 7, 30, 14)
last_assim_time = dt.datetime(2008, 7, 30, 14)
forecast_until = dt.datetime(2008, 7, 30, 14, 15)
timedelta_btw_assim = dt.timedelta(minutes=15)  # cycle interval, as in Tutorial 2

w.prepare_WRFrundir(init_time)
id = None  # the first job has no dependency
id = w.run_ideal(depends_on=id)

prior_init_time = init_time
prior_valid_time = time

while time <= last_assim_time:
    # usually we take the prior from the current time,
    # but one could use a prior from a different time of another run,
    # i.e. 13z as a prior to assimilate 12z observations
    prior_valid_time = time
    id = w.assimilate(time, prior_init_time, prior_valid_time, prior_path_exp, depends_on=id)

    # 1) Set posterior = prior
    id = w.prepare_IC_from_prior(prior_path_exp, prior_init_time, prior_valid_time, depends_on=id)

    # 2) Update posterior += increments from assimilation
    id = w.update_IC_from_DA(time, depends_on=id)

    # How long shall we integrate?
    timedelta_integrate = timedelta_btw_assim
    output_restart_interval = timedelta_btw_assim.total_seconds()/60  # in minutes
    if time == last_assim_time:
        # run a longer forecast after the last assimilation
        timedelta_integrate = forecast_until - last_assim_time
        output_restart_interval = 9999  # no restart file after the last assimilation

    # 3) Run the WRF ensemble
    id = w.run_ENS(begin=time,                     # start integration from here
                   end=time + timedelta_integrate, # integrate until here
                   output_restart_interval=output_restart_interval,
                   depends_on=id)

    # as we now have WRF output, we can use our own experiment path as prior
    prior_path_exp = cluster.archivedir

    id_sat = w.create_satimages(time, depends_on=id)

    # increment time
    time += timedelta_btw_assim

    # update time variables
    prior_init_time = time - timedelta_btw_assim

w.verify_sat(id_sat)
w.verify_wrf(id)
w.verify_fast(id)
```
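Assuming you saved this script as `cycled_exp.py` (the file linked above), you would launch the whole experiment with:
```bash
$ python cycled_exp.py
```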
#### Job scheduling status
If you work on a server with a queueing system, the script submits the jobs into the SLURM queue with dependencies, so that SLURM starts each job as soon as its dependencies are fulfilled and resources are available. Most jobs need only a few cores, but the model integration runs across many nodes. You can inspect the status with
```bash
$ squeue -u `whoami` --sort=i
JOBID PARTITION NAME USER ST TIME NODES NODELIST(REASON)
1710274 mem_0384 prepwrfr lkugler PD 0:00 1 (Priority)
1710275 mem_0384 IC-prior lkugler PD 0:00 1 (Dependency)
1710276 mem_0384 Assim-42 lkugler PD 0:00 1 (Dependency)
1710277 mem_0384 IC-prior lkugler PD 0:00 1 (Dependency)
1710278 mem_0384 IC-updat lkugler PD 0:00 1 (Dependency)
1710279 mem_0384 preWRF2- lkugler PD 0:00 1 (Dependency)
1710280_[1-10] mem_0384 runWRF2- lkugler PD 0:00 1 (Dependency)
1710281 mem_0384 pRTTOV-6 lkugler PD 0:00 1 (Dependency)
1710282 mem_0384 Assim-3a lkugler PD 0:00 1 (Dependency)
1710283 mem_0384 IC-prior lkugler PD 0:00 1 (Dependency)
1710284 mem_0384 IC-updat lkugler PD 0:00 1 (Dependency)
1710285 mem_0384 preWRF2- lkugler PD 0:00 1 (Dependency)
1710286_[1-10] mem_0384 runWRF2- lkugler PD 0:00 1 (Dependency)
1710287 mem_0384 pRTTOV-7 lkugler PD 0:00 1 (Dependency)
```
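To abort the experiment, you can cancel all of your queued and running jobs with standard SLURM commands, e.g.:
```bash
$ scancel -u `whoami`
```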