diff --git a/docs/source/notebooks/tutorial1.ipynb b/docs/source/notebooks/tutorial1.ipynb
index 798d6b941dc14d3bca583e5c14f92de2b8c18fdb..192bddad849b33102efdbd6aa8bcee30fe836b61 100644
--- a/docs/source/notebooks/tutorial1.ipynb
+++ b/docs/source/notebooks/tutorial1.ipynb
@@ -126,10 +126,15 @@
     "assim_time = prior_valid_time\n",
     "```\n",
     "\n",
-    "Finally, we run the data assimilation by calling\n",
+    "To set up the experiment, call\n",
     "```python\n",
     "w = WorkFlows(exp_config='cfg.py', server_config='srvx1.py')\n",
+    "```\n",
+    "This call also creates the output folders and backs up the configuration files and scripts.\n",
+    "\n",
     "\n",
+    "Finally, we run the data assimilation by calling\n",
+    "```python\n",
     "w.assimilate(assim_time, prior_init_time, prior_valid_time, prior_path_exp)\n",
     "```\n",
     "\n",
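Taken together, the tutorial-1 cell above first constructs the workflow object and then triggers the assimilation. Below is a minimal sketch of that sequence; the `dartwrf.workflows` import path and the prior path/times are assumptions (neither appears in this diff), so adjust them to your setup:

```python
import datetime as dt

# Import path assumed from the DART-WRF package layout; the notebook's own
# import cell is not part of this diff.
from dartwrf.workflows import WorkFlows

# Placeholder prior forecast: path and times are illustrative only.
prior_path_exp = '/path/to/sim_archive/my_prior_experiment'
prior_init_time = dt.datetime(2008, 7, 30, 12)
prior_valid_time = dt.datetime(2008, 7, 30, 13)
assim_time = prior_valid_time  # assimilate observations valid at the prior's valid time

# Set up the experiment: this also creates the output folders and backs up
# the configuration files and scripts.
w = WorkFlows(exp_config='cfg.py', server_config='srvx1.py')

# Run the data assimilation for this single analysis time.
w.assimilate(assim_time, prior_init_time, prior_valid_time, prior_path_exp)
```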
diff --git a/docs/source/notebooks/tutorial2.ipynb b/docs/source/notebooks/tutorial2.ipynb
index df58b78b35dd5099e742d2e50933d4395150e716..b620b3f598ce62205dcd27826b07ce6bcebf75a0 100644
--- a/docs/source/notebooks/tutorial2.ipynb
+++ b/docs/source/notebooks/tutorial2.ipynb
@@ -10,15 +10,20 @@
     "# Tutorial 2: Forecast after DA\n",
     "\n",
     "\n",
-    "**Goal**: To run a cycled data assimilation experiment.\n",
-    "[`cycled_exp.py`](https://github.com/lkugler/DART-WRF/blob/master/generate_free.py) contains an example which will be explained here:\n",
+    "**Goal**: To run an ensemble of forecasts.\n",
+    "[free_forecast.py](https://github.com/lkugler/DART-WRF/blob/master/free_forecast.py) contains examples.\n",
     "\n",
-    "Now there are two options:\n",
-    "1) To start a forecast from an existing forecast, i.e. from WRF restart files\n",
-    "2) To start a forecast from defined thermodynamic profiles, i.e. from a `wrf_profile`\n",
+    "Initialize the forecast with either (1) or (2), then run it with (3):\n",
+    "1) Initialize a forecast from defined profiles of temperature, humidity and wind, i.e. from a `wrf_profile` (see the WRF guide)\n",
+    "2) Initialize a forecast from an existing forecast, i.e. from WRF restart files, optionally updated by data assimilation.\n",
+    "3) Run the forecast.\n",
     "\n",
     "\n",
-    "### Restart a forecast\n",
+    "### 1) Initialize from sounding profiles\n",
+    "\n",
+    "This part of the tutorial is not yet available.\n",
+    "\n",
+    "### 2) Initialize a forecast from a previous forecast\n",
     "To run a forecast from initial conditions of a previous forecasts, we import these modules\n",
     "```python\n",
     "import datetime as dt\n",
@@ -42,50 +47,47 @@
     "w.prepare_WRFrundir(begin)\n",
     "\n",
     "w.prepare_IC_from_prior(prior_path_exp, prior_init_time, prior_valid_time)\n",
-    "\n",
-    "w.run_ENS(begin=begin, # start integration from here\n",
-    "          end=end, # integrate until here\n",
-    "          output_restart_interval=9999, # do not write WRF restart files\n",
-    "          )\n",
     "```\n",
-    "Note that we use predefined workflow functions like `run_ENS`.\n",
     "\n",
+    "#### 2b) Optional: Update posterior with increments from assimilation\n",
     "\n",
-    "### Forecast run after Data Assimilation\n",
-    "In order to continue after assimilation you need the posterior = prior (1) + increments (2)\n",
+    "To continue a forecast after assimilation, you need the posterior = prior (1) + increments (2):\n",
     "\n",
-    "1. Set posterior = prior\n",
+    "1. Prepare initial conditions from a prior forecast:\n",
     "```python\n",
     "id = w.prepare_IC_from_prior(prior_path_exp, prior_init_time, prior_valid_time, depends_on=id)\n",
     "```\n",
     "\n",
-    "2) Update posterior with increments from assimilation\n",
-    "After this, the wrfrst files are updated with assimilation increments from DART output and copied to the WRF's run directories so you can continue to run the forecast ensemble.\n",
+    "\n",
+    "2. Update the initial conditions from data assimilation:\n",
     "```python\n",
     "id = w.update_IC_from_DA(time, depends_on=id)\n",
     "```\n",
     "\n",
-    "3) Define how long you want to run the forecast and when you want WRF restart files. Since they take a lot of space, we want as few as possible.\n",
+    "After this, the wrfrst files are updated with the assimilation increments from the DART output and copied to the WRF run directories, so you can continue to run the forecast ensemble.\n",
+    "\n",
+    "### 3) Run the forecast\n",
+    "Define how long you want to run the forecast and when you want WRF restart files. Since they take a lot of disk space, we want as few as possible.\n",
     "\n",
     "```python\n",
     "timedelta_integrate = dt.timedelta(hours=5)\n",
-    "output_restart_interval = 9999 # any value larger than the forecast duration\n",
+    "\n",
+    "w.run_ENS(begin=begin, # start integration from here\n",
+    "          end=time + timedelta_integrate, # integrate until here\n",
+    "          output_restart_interval=9999, # do not write WRF restart files\n",
+    "          )\n",
     "```\n",
     "\n",
-    "If you run DA in cycles of 15 minutes, it will be\n",
+    "If you want to assimilate again 15 minutes later, use\n",
     "```python\n",
     "timedelta_integrate = dt.timedelta(hours=5)\n",
     "timedelta_btw_assim = dt.timedelta(minutes=15)\n",
     "output_restart_interval = timedelta_btw_assim.total_seconds()/60\n",
-    "```\n",
-    "\n",
     "\n",
-    "3) Run WRF ensemble\n",
-    "```python\n",
     "id = w.run_ENS(begin=time, # start integration from here\n",
     "               end=time + timedelta_integrate, # integrate until here\n",
-    "               output_restart_interval=output_restart_interval,\n",
-    "               depends_on=id)\n",
+    "               output_restart_interval=output_restart_interval\n",
+    "               )\n",
     "```\n"
    ]
   },
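Read end to end, tutorial 2 now amounts to: stage initial conditions from a prior forecast, optionally add the assimilation increments, then integrate the ensemble. A condensed sketch of that sequence follows, under the same assumptions as above (the import path, the prior path and the times are placeholders, not values prescribed by the tutorial):

```python
import datetime as dt

from dartwrf.workflows import WorkFlows  # import path assumed, as above

w = WorkFlows(exp_config='cfg.py', server_config='srvx1.py')

# Placeholder prior forecast and timing; adjust to your experiment.
prior_path_exp = '/path/to/sim_archive/my_prior_experiment'
prior_init_time = dt.datetime(2008, 7, 30, 12)
prior_valid_time = dt.datetime(2008, 7, 30, 13)
time = prior_valid_time    # assimilation time
begin = time               # forecast start

timedelta_integrate = dt.timedelta(hours=5)
timedelta_btw_assim = dt.timedelta(minutes=15)

w.prepare_WRFrundir(begin)

# (2) posterior = prior: stage initial conditions from the prior forecast ...
id = w.prepare_IC_from_prior(prior_path_exp, prior_init_time, prior_valid_time)

# (2b) ... and, optionally, add the assimilation increments from the DART output.
id = w.update_IC_from_DA(time, depends_on=id)

# (3) Integrate the ensemble, writing a restart file at the next assimilation time.
id = w.run_ENS(begin=begin,
               end=time + timedelta_integrate,
               output_restart_interval=timedelta_btw_assim.total_seconds()/60,
               depends_on=id)
```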
diff --git a/docs/source/notebooks/tutorial3.ipynb b/docs/source/notebooks/tutorial3.ipynb
index d3018ab95b378770ca3b05b1e2fc9adb547a188e..b86eb16dfeccbd0cee2ef32f2d46fd3c4a408f83 100644
--- a/docs/source/notebooks/tutorial3.ipynb
+++ b/docs/source/notebooks/tutorial3.ipynb
@@ -11,11 +11,23 @@
     "\n",
     "\n",
     "**Goal**: To run a cycled data assimilation experiment.\n",
-    "[`cycled_exp.py`](https://github.com/lkugler/DART-WRF/blob/master/generate_free.py) contains an example which will be explained here:\n",
     "\n",
-    "For example, your experiment can look like this\n",
+    "\n",
+    "The script [cycled_exp.py](https://github.com/lkugler/DART-WRF/blob/master/cycled_exp.py) contains an example which will be explained here.\n",
+    "\n",
+    "\n",
+    "In this example, we assume that our computing jobs run on a cluster with a \"job scheduler\", so that jobs do not run immediately but only when resources become free. This means that we tell each job to wait for another job's completion by passing the keyword argument `depends_on=id`.\n",
+    "\n",
+    "\n",
     "\n",
     "```python\n",
+    "\n",
+    "w = WorkFlows(exp_config='cfg.py', server_config='jet.py')\n",
+    "\n",
+    "timedelta_integrate = dt.timedelta(minutes=15)\n",
+    "timedelta_btw_assim = dt.timedelta(minutes=15)\n",
+    "\n",
+    "\n",
     "prior_path_exp = '/jetfs/home/lkugler/data/sim_archive/exp_v1.19_P2_noDA'\n",
     "\n",
     "init_time = dt.datetime(2008, 7, 30, 13)\n",
@@ -24,7 +36,7 @@
     "forecast_until = dt.datetime(2008, 7, 30, 14, 15)\n",
     "\n",
     "w.prepare_WRFrundir(init_time)\n",
-    "id = w.run_ideal(depends_on=id)\n",
+    "id = w.run_ideal()\n",
     "\n",
     "prior_init_time = init_time\n",
     "prior_valid_time = time\n",
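The tutorial-3 excerpt stops before the cycling itself. For orientation, here is a hedged sketch of how the calls above could chain into an assimilation-forecast cycle, using only the functions and the `depends_on=id` convention shown in the diff; the first analysis time, the loop bounds, and whether `assimilate` also accepts `depends_on` are assumptions, so treat `cycled_exp.py` as the authoritative version:

```python
import datetime as dt

from dartwrf.workflows import WorkFlows  # import path assumed, as above

w = WorkFlows(exp_config='cfg.py', server_config='jet.py')

timedelta_integrate = dt.timedelta(minutes=15)
timedelta_btw_assim = dt.timedelta(minutes=15)

prior_path_exp = '/jetfs/home/lkugler/data/sim_archive/exp_v1.19_P2_noDA'
init_time = dt.datetime(2008, 7, 30, 13)
time = dt.datetime(2008, 7, 30, 14)                 # first analysis time (assumed)
forecast_until = dt.datetime(2008, 7, 30, 14, 15)

w.prepare_WRFrundir(init_time)
id = w.run_ideal()

prior_init_time = init_time
prior_valid_time = time

while time < forecast_until:
    # Assimilate observations valid at `time`; assuming assimilate, like the other
    # workflow calls, accepts depends_on and returns a job id.
    id = w.assimilate(time, prior_init_time, prior_valid_time, prior_path_exp,
                      depends_on=id)

    # posterior = prior + increments
    id = w.prepare_IC_from_prior(prior_path_exp, prior_init_time, prior_valid_time,
                                 depends_on=id)
    id = w.update_IC_from_DA(time, depends_on=id)

    # Short forecast to the next analysis time, writing a restart file there.
    id = w.run_ENS(begin=time,
                   end=time + timedelta_integrate,
                   output_restart_interval=timedelta_btw_assim.total_seconds()/60,
                   depends_on=id)

    # The forecast just launched becomes the prior of the next cycle; in a real
    # setup, prior_path_exp would point to this experiment's own archive directory.
    prior_init_time = time
    prior_valid_time = time + timedelta_btw_assim
    time += timedelta_btw_assim
```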