diff --git a/LEM.png b/LEM.png
new file mode 100644
index 0000000000000000000000000000000000000000..b8dff74ced2a27125756ced01db30ddcddd30c14
Binary files /dev/null and b/LEM.png differ
diff --git a/README.md b/README.md
index d66898c2ab723279a9f5f32840c073938c2c3e3b..899dd650792d74332be2e8ce14291d9166f14767 100644
--- a/README.md
+++ b/README.md
@@ -1,17 +1,20 @@
-**Code repository for the publication "Uncertainties in cloud-radiative heating within an idealized extratropical cyclone".
+**Code repository for the publication "Uncertainties in cloud-radiative heating within an idealized extratropical cyclone"**
 
-Author: Behrooz Keshtgar, IMK, Karlsruhe Institute of Technology, behrooz.keshtgar@kit.edu
+**Author:** Behrooz Keshtgar, IMK, Karlsruhe Institute of Technology, behrooz.keshtgar@kit.edu
+
+<a name="icon-lem"/>![ICON-LEM simulation](./LEM.png)
 
 The repository contains:
 
-* sims: ICON simulation runscripts
-  - Baroclinic life cycle simulation (ICON-NWP)
+* **sims:** ICON model setup
+  - Baroclinic Life Cycle Simulation (ICON-NWP)
   - Large Eddy Model Simulations (ICON-LEM)
 
-* offlineRT: scripts for running LibRadTran
+* **offlineRT:** Procedures for offline radiative transfer calculations using LibRadTran
   - Scripts for post-processing LEM simulation output for use in offline radiative transfer calculations
-  - Python scripts for post-processing LibRadTran results
+  - Python scripts for post-processing of LibRadTran results
+  - Bash scripts to run LibRadTran
 
-* plots4paper: jupyter notebooks for figures used in papers, also figure pdfs
+* **plots4paper:** Jupyter notebooks for the figures used in the paper, along with the figure PDFs
 
 The post-processed data used in the analysis and a copy of the code repository are available at ...
diff --git a/offlineRT/README.md b/offlineRT/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..84c50773a0dde913dec30364d69be2d1ea7dd382
--- /dev/null
+++ b/offlineRT/README.md
@@ -0,0 +1,14 @@
+This directory contains the pre- and post-processing scripts for the input and output data of LibRadTran, as well as bash scripts to run the offline radiative transfer calculations.
+
+* **input_for_libradtran.ipynb:** This Jupyter notebook generates the input files for LibRadTran from ICON-LEM output files.
+
+* Subdirectories **c_cluster_solar/thermal_...** contain the different radiative transfer calculations:
+  - **c_cluster_solar/thermal_ipa3d:** 1D radiative transfer calculations with the Delta-Eddington two-stream solver and the ice-optical parameterizations of Fu and Baum
+  - **c_cluster_solar/thermal_mystic:** 3D and 1D radiative transfer calculations with the MYSTIC solver
+  - **c_cluster_solar/thermal_ipa3d_cg/dl:** 1D radiative transfer calculations with the Delta-Eddington two-stream solver for NWP homogeneous grid-box clouds and homogeneous clouds with cloud fraction at a resolution of 2.5 km
+  - **solar/thermal_clear_sky:** clear-sky radiative transfer calculations with the Delta-Eddington two-stream and MYSTIC solvers
+
+* To run the offline radiative transfer calculations, run the bash script 'step1_makeInpFiles.sh' in the ccSolar (or corresponding thermal) run subdirectory of the desired radiative transfer calculation. This automatically creates the input files for all subdomains and time steps used by the 'uvspec' program of LibRadTran. Finally, run 'submit_runs.sh' to distribute the runs across nodes. The outputs are radiative heating rates written as ASCII files in the respective subdirectories.
+
+* The Python scripts **convert_libradtran_data_to_netcdf*.py** process the all-sky and clear-sky radiative heating rate outputs from each radiative transfer calculation, merge the outputs from all subdomains to obtain the heating rates over the entire LEM domain, and save the result as a NetCDF file.
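+
+For reference, the generated `libsetup.inp` files follow the pattern below. This is a condensed sketch of the heredoc template in `step1_makeInpFiles.sh`; the angle-bracket placeholders stand for the paths, domain/time labels, and solar zenith angle that the script fills in per subdomain and time step:
+
+```
+atmosphere_file <path>/atmosphere_mean_<dom>_<time>.dat
+albedo 0.07
+source solar
+sza <value from VARIABLES>
+mol_abs_param Fu
+rte_solver rodents
+ipa_3d
+heating_rate layer_fd
+wc_properties hu interpolate
+ic_properties <fu or baum_v36> interpolate
+wc_file 3D <path>/wc3d_<dom>_<time>.dat
+ic_file 3D <path>/ic3d_<dom>_<time>.dat
+quiet
+```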
diff --git a/offlineRT/c_cluster_solar_ipa3d/add_to_submit_file.txt b/offlineRT/c_cluster_solar_ipa3d/add_to_submit_file.txt
new file mode 100644
index 0000000000000000000000000000000000000000..19439a7678e164092aacfea016d57e4018e80baf
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d/add_to_submit_file.txt
@@ -0,0 +1,12 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#---------------------------------
+
+
+wait
diff --git a/offlineRT/c_cluster_solar_ipa3d/ccSolar/VARIABLES b/offlineRT/c_cluster_solar_ipa3d/ccSolar/VARIABLES
new file mode 100644
index 0000000000000000000000000000000000000000..25735e7b4562e303bdec8fbb4ab9169a768f0d3e
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d/ccSolar/VARIABLES
@@ -0,0 +1,32 @@
+export solver=("mystic" "twostr")
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export dom1=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12")
+
+export dom2=("13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24")
+
+export dom3=("25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export sza_05T1000=()
+
+export sza_05T1030=()
+
+export sza_05T1100=()
+
+export sza_05T1130=()
+
+export sza_05T1200=()
+
+export sza_05T1230=()
+
+export sza_05T1300=()
+
+export sza_05T1330=()
+
+export sza_05T1400=()
+
+export isim_array=("01" "02" "03" "04")
+#export isim_array=("01" "02" "03" "04" "05" "06" "07" "08" "09" "10")
diff --git a/offlineRT/c_cluster_solar_ipa3d/ccSolar/s_runs1.sh b/offlineRT/c_cluster_solar_ipa3d/ccSolar/s_runs1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..2a329fd1ae5822f0617062c7407b7708de198219
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d/ccSolar/s_runs1.sh
@@ -0,0 +1,26 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=512000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='solar'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+for dm in "${dom1[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}/${ffname}_${timee}_${dm}_isim${num}/libsetup.inp &
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_solar_ipa3d/ccSolar/s_runs2.sh b/offlineRT/c_cluster_solar_ipa3d/ccSolar/s_runs2.sh
new file mode 100755
index 0000000000000000000000000000000000000000..fb4af3c037a4686e1fa461f586a431290ff2c5ce
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d/ccSolar/s_runs2.sh
@@ -0,0 +1,26 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=512000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='solar'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+for dm in "${dom2[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}/${ffname}_${timee}_${dm}_isim${num}/libsetup.inp &
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_solar_ipa3d/ccSolar/s_runs3.sh b/offlineRT/c_cluster_solar_ipa3d/ccSolar/s_runs3.sh
new file mode 100755
index 0000000000000000000000000000000000000000..e5d53c3f611f803cf6b9b35cb75085964f2ede7d
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d/ccSolar/s_runs3.sh
@@ -0,0 +1,26 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=512000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='solar'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+for dm in "${dom3[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}/${ffname}_${timee}_${dm}_isim${num}/libsetup.inp &
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_solar_ipa3d/ccSolar/step1_makeInpFiles.sh b/offlineRT/c_cluster_solar_ipa3d/ccSolar/step1_makeInpFiles.sh
new file mode 100755
index 0000000000000000000000000000000000000000..1a8253c2fd57d4c83c8dd201d6244c51b610b145
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d/ccSolar/step1_makeInpFiles.sh
@@ -0,0 +1,196 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+pathh=$(cd ../../../ && pwd)
+ffname='ipa3d'
+
+echo
+echo "Shell script successfully running!"
+echo
+
+echo
+echo "BUILD FOLDER NAMES AND INPUT FILES"
+echo
+
+#First go out of run folder:
+cd ..
+
+SCRIPTDIR=$(pwd)
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+i=0
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+# Look up the solar zenith angle for this domain and time step via
+# indirect expansion of the sza_<time> arrays defined in VARIABLES.
+sza_var="sza_${it}[$i]"
+sza="${!sza_var}"
+echo 'sza = ' "$sza"
+i=$((i+1))
+
+
+    CLOUD_WC_FILE_INP=''$pathh'/wc3d_'$dm'_202201'$it'33Z.dat'
+    CLOUD_IC_FILE_INP=''$pathh'/ic3d_'$dm'_202201'$it'33Z.dat'
+    ZVALUES="$(head -n2 ${CLOUD_WC_FILE_INP} | tail -n1 | cut -d " " -f 3-)"
+
+      for isim in "${isim_array[@]}"
+      do
+      echo
+      echo "isim = " $isim
+      if [ $isim == 01 ]
+      then
+         ice_p='fu'
+         ice_h='#ic_habit rough-aggregate'
+      elif [ $isim == 02 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit ghm'
+      elif [ $isim == 03 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit solid-column'
+      else
+         ice_p='baum_v36'
+         ice_h='ic_habit rough-aggregate'
+      fi 	 
+      # Create libradtran output file name:
+      OUT_FILE=$ffname'_'$it'_'$dm'_isim'$isim'.out'
+
+      echo 'OUT_FILE = ' $OUT_FILE
+
+      FOLDER_NAME=$ffname'_'$it'_'$dm'_isim'$isim
+
+      mkdir $FOLDER_NAME
+
+      cd $FOLDER_NAME
+
+      echo 'Entering folder name = ' $FOLDER_NAME
+      # Create output folder for results:
+      mkdir output
+
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+
+
+albedo 0.07
+source solar
+sza ${sza}
+#phi ${phi}
+mol_abs_param Fu
+wavelength_index 1  7
+output_process sum
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver rodents
+ipa_3d
+
+heating_rate layer_fd
+mc_backward_output heat K_per_day
+
+mc_sample_grid 420 343 0.227 0.394
+
+mc_basename $SCRIPTDIR/$FOLDER_NAME/output/$OUT_FILE
+
+wc_properties hu interpolate
+ic_properties $ice_p interpolate
+$ice_h
+
+wc_file 3D $CLOUD_WC_FILE_INP
+ic_file 3D $CLOUD_IC_FILE_INP
+
+quiet
+EOF
+### END OF INPUT FILE ###########
+#################################
+cat >ther_HR.sh<<EOF
+#! /bin/bash
+
+#SBATCH --account=bb1135
+#SBATCH --job-name=mystic_dom01.run
+#SBATCH --partition=shared
+#SBATCH --nodes=1
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#SBATCH --exclusive
+
+cd $(pwd)
+echo "Job started @ \$(date)"
+$LIBRAD/bin/uvspec < $INP_FILE_NAME
+echo "Job finished @ \$(date)"
+EOF
+
+        # Leave the simulation folder:
+        cd ..
+
+        done #over isim
+
+done # over dom
+done # over itime
+
+echo
+echo 
+echo 'The END'
diff --git a/offlineRT/c_cluster_solar_ipa3d/ccSolar/submit_runs.sh b/offlineRT/c_cluster_solar_ipa3d/ccSolar/submit_runs.sh
new file mode 100755
index 0000000000000000000000000000000000000000..77a75ae5ddaa96c9139bf50ee6e6586c6dbc0a8a
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d/ccSolar/submit_runs.sh
@@ -0,0 +1,17 @@
+#!/usr/bin/env bash
+
+#-------------------------------------------
+. VARIABLES
+
+for tt in "${time_array[@]}"
+do
+for ss in "${isim_array[@]}"
+do
+   echo "submitting runs for ${tt}_${ss}"
+
+   sbatch s_runs1.sh $tt $ss
+   sbatch s_runs2.sh $tt $ss
+   sbatch s_runs3.sh $tt $ss
+
+done 
+done
diff --git a/offlineRT/c_cluster_solar_ipa3d/find_missing_simulations.sh b/offlineRT/c_cluster_solar_ipa3d/find_missing_simulations.sh
new file mode 100755
index 0000000000000000000000000000000000000000..e0ab46047a25e8b61045c504db6dfd7c74e94807
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d/find_missing_simulations.sh
@@ -0,0 +1,39 @@
+#!/usr/bin/env bash
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01" "02" "03" "04")
+
+pathh=$(pwd)
+#----------------------------------------------------------------------------------------
+ffname1='ipa3d'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A ${ffname1}_${it}_${dm}_isim${isim}/output 2>/dev/null)" ]
+then
+  echo "file exists (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+else
+  echo "file does not exist (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+  # collect the missing runs in a script for resubmission
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname1}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_ipa3d.sh
+
+fi
+
+done
+done
+done
+
+#-----------------------
diff --git a/offlineRT/c_cluster_solar_ipa3d_cg/add_to_submit_file.txt b/offlineRT/c_cluster_solar_ipa3d_cg/add_to_submit_file.txt
new file mode 100644
index 0000000000000000000000000000000000000000..19439a7678e164092aacfea016d57e4018e80baf
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_cg/add_to_submit_file.txt
@@ -0,0 +1,12 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#---------------------------------
+
+
+wait
diff --git a/offlineRT/c_cluster_solar_ipa3d_cg/ccSolar/s_runs1.sh b/offlineRT/c_cluster_solar_ipa3d_cg/ccSolar/s_runs1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..0ea4ef71e1133385902366944ae6f0c1548ae4bf
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_cg/ccSolar/s_runs1.sh
@@ -0,0 +1,31 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+###SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='solar'
+#timee="${time_array[0]}"
+#timee=$1
+num='01'
+#num=$2
+
+echo "Job started @ $(date)"
+for tt in "${time_array[@]}"
+do
+for dm in "${dom[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}_cg/${ffname}_${tt}_${dm}_isim${num}/libsetup.inp &
+done
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_solar_ipa3d_cg/ccSolar/step1_makeInpFiles.sh b/offlineRT/c_cluster_solar_ipa3d_cg/ccSolar/step1_makeInpFiles.sh
new file mode 100755
index 0000000000000000000000000000000000000000..5f12c2edd0453782a633c26f13c237ae76f4e516
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_cg/ccSolar/step1_makeInpFiles.sh
@@ -0,0 +1,194 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+pathh=$(cd ../../../ && pwd)
+ffname='ipa3d'
+
+echo
+echo "Shell script successfully running!"
+echo
+
+echo
+echo "BUILD FOLDER NAMES AND INPUT FILES"
+echo
+
+#First go out of run folder:
+cd ..
+
+SCRIPTDIR=$(pwd)
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+i=0
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+# Look up the solar zenith angle for this domain and time step via
+# indirect expansion of the sza_<time> arrays defined in VARIABLES.
+sza_var="sza_${it}[$i]"
+sza="${!sza_var}"
+echo 'sza = ' "$sza"
+i=$((i+1))
+
+    CLOUD_WC_FILE_INP=''$pathh'/wc3d_cg_'$dm'_202201'$it'33Z.dat'
+    CLOUD_IC_FILE_INP=''$pathh'/ic3d_cg_'$dm'_202201'$it'33Z.dat'
+    ZVALUES="$(head -n2 ${CLOUD_WC_FILE_INP} | tail -n1 | cut -d " " -f 3-)"
+
+      for isim in "${isim_array[@]}"
+      do
+      echo
+      echo "isim = " $isim
+      if [ $isim == 01 ]
+      then
+         ice_p='fu'
+         ice_h='#ic_habit rough-aggregate'
+      elif [ $isim == 02 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit ghm'
+      elif [ $isim == 03 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit solid-column'
+      else
+         ice_p='baum_v36'
+         ice_h='ic_habit rough-aggregate'
+      fi 	 
+      # Create libradtran output file name:
+      OUT_FILE=$ffname'_'$it'_'$dm'_isim'$isim'.out'
+
+      echo 'OUT_FILE = ' $OUT_FILE
+
+      FOLDER_NAME=$ffname'_'$it'_'$dm'_isim'$isim
+
+      mkdir $FOLDER_NAME
+
+      cd $FOLDER_NAME
+
+      echo 'Entering folder name = ' $FOLDER_NAME
+      # Create output folder for results:
+      mkdir output
+
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+
+albedo 0.07
+source solar
+sza ${sza}
+#phi ${phi}
+mol_abs_param Fu
+wavelength_index 1  7
+output_process sum
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver twomaxrnd
+ipa_3d
+
+heating_rate layer_fd
+mc_backward_output heat K_per_day
+
+mc_sample_grid 41 33 1.9 3.3
+
+mc_basename $SCRIPTDIR/$FOLDER_NAME/output/$OUT_FILE
+
+wc_properties hu interpolate
+ic_properties $ice_p interpolate
+$ice_h
+
+wc_file 3D $CLOUD_WC_FILE_INP
+ic_file 3D $CLOUD_IC_FILE_INP
+
+quiet
+EOF
+### END OF INPUT FILE ###########
+#################################
+cat >ther_HR.sh<<EOF
+#! /bin/bash
+
+#SBATCH --account=bb1135
+#SBATCH --job-name=mystic_dom01.run
+#SBATCH --partition=shared
+#SBATCH --nodes=1
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#SBATCH --exclusive
+
+cd $(pwd)
+echo "Job started @ \$(date)"
+$LIBRAD/bin/uvspec < $INP_FILE_NAME
+echo "Job finished @ \$(date)"
+EOF
+
+        # Leave the simulation folder:
+        cd ..
+
+        done #over isim
+
+done # over dom
+done # over itime
+
+echo
+echo 
+echo 'The END'
diff --git a/offlineRT/c_cluster_solar_ipa3d_cg/ccSolar/submit_runs.sh b/offlineRT/c_cluster_solar_ipa3d_cg/ccSolar/submit_runs.sh
new file mode 100755
index 0000000000000000000000000000000000000000..21637528ea5ef932d16c174eb55cf819186a3041
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_cg/ccSolar/submit_runs.sh
@@ -0,0 +1,15 @@
+#!/usr/bin/env bash
+
+#-------------------------------------------
+. VARIABLES
+
+for tt in "${time_array[@]}"
+do
+for ss in "${isim_array[@]}"
+do
+   echo "submitting runs for ${tt}_${ss}"
+
+   sbatch s_runs1.sh $tt $ss
+
+done 
+done
diff --git a/offlineRT/c_cluster_solar_ipa3d_cg/find_missing_simulations.sh b/offlineRT/c_cluster_solar_ipa3d_cg/find_missing_simulations.sh
new file mode 100755
index 0000000000000000000000000000000000000000..086f35bc2fe65d973dc41811af4484c941f260d9
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_cg/find_missing_simulations.sh
@@ -0,0 +1,39 @@
+#!/usr/bin/env bash
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01")
+
+pathh=$(pwd)
+#----------------------------------------------------------------------------------------
+ffname1='ipa3d'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A ${ffname1}_${it}_${dm}_isim${isim}/output 2>/dev/null)" ]
+then
+  echo "file exists (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+else
+  echo "file does not exist (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+  # collect the missing runs in a script for resubmission
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname1}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_ipa3d.sh
+
+fi
+
+done
+done
+done
+
+#-----------------------
diff --git a/offlineRT/c_cluster_solar_ipa3d_dl/add_to_submit_file.txt b/offlineRT/c_cluster_solar_ipa3d_dl/add_to_submit_file.txt
new file mode 100644
index 0000000000000000000000000000000000000000..19439a7678e164092aacfea016d57e4018e80baf
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_dl/add_to_submit_file.txt
@@ -0,0 +1,12 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#---------------------------------
+
+
+wait
diff --git a/offlineRT/c_cluster_solar_ipa3d_dl/ccSolar/s_runs1.sh b/offlineRT/c_cluster_solar_ipa3d_dl/ccSolar/s_runs1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..db6146871e5cb6b3d9632ed2ee04c499e5878dfe
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_dl/ccSolar/s_runs1.sh
@@ -0,0 +1,31 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+###SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='solar'
+#timee="${time_array[0]}"
+#timee=$1
+num='01'
+#num=$2
+
+echo "Job started @ $(date)"
+for tt in "${time_array[@]}"
+do
+for dm in "${dom[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}_dl/${ffname}_${tt}_${dm}_isim${num}/libsetup.inp &
+done
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_solar_ipa3d_dl/ccSolar/step1_makeInpFiles.sh b/offlineRT/c_cluster_solar_ipa3d_dl/ccSolar/step1_makeInpFiles.sh
new file mode 100755
index 0000000000000000000000000000000000000000..2d32130b7bb99e2b9878714d0b612fa97204685e
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_dl/ccSolar/step1_makeInpFiles.sh
@@ -0,0 +1,194 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+pathh=$(cd ../../../ && pwd)
+ffname='ipa3d'
+
+echo
+echo "Shell script successfully running!"
+echo
+
+echo
+echo "BUILD FOLDER NAMES AND INPUT FILES"
+echo
+
+#First go out of run folder:
+cd ..
+
+SCRIPTDIR=$(pwd)
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+i=0
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+# Look up the solar zenith angle for this domain and time step via
+# indirect expansion of the sza_<time> arrays defined in VARIABLES.
+sza_var="sza_${it}[$i]"
+sza="${!sza_var}"
+echo 'sza = ' "$sza"
+i=$((i+1))
+
+    CLOUD_WC_FILE_INP=''$pathh'/wc3d_dl_'$dm'_202201'$it'33Z.dat'
+    CLOUD_IC_FILE_INP=''$pathh'/ic3d_dl_'$dm'_202201'$it'33Z.dat'
+    ZVALUES="$(head -n2 ${CLOUD_WC_FILE_INP} | tail -n1 | cut -d " " -f 3-)"
+
+      for isim in "${isim_array[@]}"
+      do
+      echo
+      echo "isim = " $isim
+      if [ $isim == 01 ]
+      then
+         ice_p='fu'
+         ice_h='#ic_habit rough-aggregate'
+      elif [ $isim == 02 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit ghm'
+      elif [ $isim == 03 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit solid-column'
+      else
+         ice_p='baum_v36'
+         ice_h='ic_habit rough-aggregate'
+      fi 	 
+      # Create libradtran output file name:
+      OUT_FILE=$ffname'_'$it'_'$dm'_isim'$isim'.out'
+
+      echo 'OUT_FILE = ' $OUT_FILE
+
+      FOLDER_NAME=$ffname'_'$it'_'$dm'_isim'$isim
+
+      mkdir $FOLDER_NAME
+
+      cd $FOLDER_NAME
+
+      echo 'Entering folder name = ' $FOLDER_NAME
+      # Create output folder for results:
+      mkdir output
+
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+
+albedo 0.07
+source solar
+sza ${sza}
+#phi ${phi}
+mol_abs_param Fu
+wavelength_index 1  7
+output_process sum
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver rodents
+ipa_3d
+
+heating_rate layer_fd
+mc_backward_output heat K_per_day
+
+mc_sample_grid 41 33 1.9 3.3
+
+mc_basename $SCRIPTDIR/$FOLDER_NAME/output/$OUT_FILE
+
+wc_properties hu interpolate
+ic_properties $ice_p interpolate
+$ice_h
+
+wc_file 3D $CLOUD_WC_FILE_INP
+ic_file 3D $CLOUD_IC_FILE_INP
+
+quiet
+EOF
+### END OF INPUT FILE ###########
+#################################
+cat >ther_HR.sh<<EOF
+#! /bin/bash
+
+#SBATCH --account=bb1135
+#SBATCH --job-name=mystic_dom01.run
+#SBATCH --partition=shared
+#SBATCH --nodes=1
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#SBATCH --exclusive
+
+cd $(pwd)
+echo "Job started @ \$(date)"
+$LIBRAD/bin/uvspec < $INP_FILE_NAME
+echo "Job finished @ \$(date)"
+EOF
+
+        # Leave the simulation folder:
+        cd ..
+
+        done #over isim
+
+done # over dom
+done # over itime
+
+echo
+echo 
+echo 'The END'
diff --git a/offlineRT/c_cluster_solar_ipa3d_dl/ccSolar/submit_runs.sh b/offlineRT/c_cluster_solar_ipa3d_dl/ccSolar/submit_runs.sh
new file mode 100755
index 0000000000000000000000000000000000000000..21637528ea5ef932d16c174eb55cf819186a3041
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_dl/ccSolar/submit_runs.sh
@@ -0,0 +1,15 @@
+#!/usr/bin/env bash
+
+#-------------------------------------------
+. VARIABLES
+
+for tt in "${time_array[@]}"
+do
+for ss in "${isim_array[@]}"
+do
+   echo "submitting runs for ${tt}_${ss}"
+
+   sbatch s_runs1.sh $tt $ss
+
+done 
+done
diff --git a/offlineRT/c_cluster_solar_ipa3d_dl/find_missing_simulations.sh b/offlineRT/c_cluster_solar_ipa3d_dl/find_missing_simulations.sh
new file mode 100755
index 0000000000000000000000000000000000000000..086f35bc2fe65d973dc41811af4484c941f260d9
--- /dev/null
+++ b/offlineRT/c_cluster_solar_ipa3d_dl/find_missing_simulations.sh
@@ -0,0 +1,39 @@
+#!/usr/bin/env bash
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01")
+
+pathh=$(pwd)
+#----------------------------------------------------------------------------------------
+ffname1='ipa3d'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A ${ffname1}_${it}_${dm}_isim${isim}/output 2>/dev/null)" ]
+then
+  echo "file exists (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+else
+  echo "file does not exist (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+  # collect the missing runs in a script for resubmission
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname1}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_ipa3d.sh
+
+fi
+
+done
+done
+done
+
+#-----------------------
diff --git a/offlineRT/c_cluster_solar_mystic/add_to_submit_file.txt b/offlineRT/c_cluster_solar_mystic/add_to_submit_file.txt
new file mode 100644
index 0000000000000000000000000000000000000000..6aa3cba2250c5ec60bfca8db6b889e3d1db067fb
--- /dev/null
+++ b/offlineRT/c_cluster_solar_mystic/add_to_submit_file.txt
@@ -0,0 +1,12 @@
+#!/bin/bash
+#SBATCH --partition=interactive
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=256000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#---------------------------------
+
+
+wait
diff --git a/offlineRT/c_cluster_solar_mystic/ccSolar/s_runs1.sh b/offlineRT/c_cluster_solar_mystic/ccSolar/s_runs1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..3a23be714f31e03bf04d3a9aa7ae3451f32c73e4
--- /dev/null
+++ b/offlineRT/c_cluster_solar_mystic/ccSolar/s_runs1.sh
@@ -0,0 +1,31 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+###SBATCH --mem=256000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname1='mystic'
+ffname2='mysti'
+ssource='solar'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+
+for dm in "${dom[@]}"
+do
+
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname1}/${ffname2}_${timee}_${dm}_isim${num}/libsetup.inp &
+
+done
+wait
+
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_solar_mystic/ccSolar/s_runs2.sh b/offlineRT/c_cluster_solar_mystic/ccSolar/s_runs2.sh
new file mode 100755
index 0000000000000000000000000000000000000000..0811e955f1043f51013010cd4c44bb35adf4e994
--- /dev/null
+++ b/offlineRT/c_cluster_solar_mystic/ccSolar/s_runs2.sh
@@ -0,0 +1,31 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+###SBATCH --mem=256000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname1='mystic'
+ffname2='mcipa'
+ssource='solar'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+
+for dm in "${dom[@]}"
+do
+
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname1}/${ffname2}_${timee}_${dm}_isim${num}/libsetup.inp &
+
+done
+wait
+
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_solar_mystic/ccSolar/step1_makeInpFiles.sh b/offlineRT/c_cluster_solar_mystic/ccSolar/step1_makeInpFiles.sh
new file mode 100755
index 0000000000000000000000000000000000000000..e3c1ee8f576cd71b315bec777b9dbc57cf6f34c9
--- /dev/null
+++ b/offlineRT/c_cluster_solar_mystic/ccSolar/step1_makeInpFiles.sh
@@ -0,0 +1,200 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+IDENT=$1
+
+pathh=$(cd ../../../ && pwd)
+ffname='mysti'
+
+if [ "$IDENT" == "mcipa" ]
+then
+    MCIPA="mc_ipa"
+    ffname='mcipa'
+fi
+
+echo
+echo "Shell script sucessfully running!"
+echo
+
+echo
+echo "BUILD FOLDER NAMES AND INPUT FILES"
+echo
+
+#First go out of run folder:
+cd ..
+
+SCRIPTDIR=$(pwd)
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+i=0
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+if [ "$it" == "05T1000" ]
+then
+    echo 'sza = ' "${sza_05T1000[$i]}"
+    sza="${sza_05T1000[$i]}"
+elif [ "$it" == "05T1030" ]
+then
+    echo 'sza = ' "${sza_05T1030[$i]}"
+    sza="${sza_05T1030[$i]}"
+elif [ "$it" == "05T1100" ]
+then
+    echo 'sza = ' "${sza_05T1100[$i]}"
+    sza="${sza_05T1100[$i]}"
+elif [ "$it" == "05T1130" ]
+then
+    echo 'sza = ' "${sza_05T1130[$i]}"
+    sza="${sza_05T1130[$i]}"
+elif [ "$it" == "05T1200" ]
+then
+    echo 'sza = ' "${sza_05T1200[$i]}"
+    sza="${sza_05T1200[$i]}"
+elif [ "$it" == "05T1230" ]
+then
+    echo 'sza = ' "${sza_05T1230[$i]}"
+    sza="${sza_05T1230[$i]}"
+elif [ "$it" == "05T1300" ]
+then
+    echo 'sza = ' "${sza_05T1300[$i]}"
+    sza="${sza_05T1300[$i]}"
+elif [ "$it" == "05T1330" ]
+then
+    echo 'sza = ' "${sza_05T1330[$i]}"
+    sza="${sza_05T1330[$i]}"
+else
+    echo 'sza = ' "${sza_05T1400[$i]}"
+    sza="${sza_05T1400[$i]}"
+fi
+i=$((i + 1))
+
+    CLOUD_WC_FILE_INP="${pathh}/wc3d_${dm}_202201${it}33Z.dat"
+    CLOUD_IC_FILE_INP="${pathh}/ic3d_${dm}_202201${it}33Z.dat"
+    ZVALUES="$(head -n2 ${CLOUD_WC_FILE_INP} | tail -n1 | cut -d " " -f 3-)"
+
+      for isim in "${isim_array[@]}"
+      do
+      echo
+      echo "isim = " $isim
+      #if [ $isim == 01 ]
+      #then
+      #   ice_p='fu'
+      #   ice_h='#ic_habit rough-aggregate'
+      #elif [ $isim == 02 ]
+      #then
+      #   ice_p='baum_v36'
+      #   ice_h='ic_habit ghm'
+      #elif [ $isim == 03 ]
+      #then
+      #   ice_p='baum_v36'
+      #   ice_h='ic_habit solid-column'
+      #else
+      #   ice_p='baum_v36'
+      #   ice_h='ic_habit rough-aggregate'
+      #fi 	 
+      # Create libradtran output file name:
+      OUT_FILE="${ffname}_${it}_${dm}_isim${isim}.out"
+
+      echo 'OUT_FILE = ' $OUT_FILE
+
+      FOLDER_NAME="${ffname}_${it}_${dm}_isim${isim}"
+
+      mkdir $FOLDER_NAME
+
+      cd $FOLDER_NAME
+
+      echo 'Entering folder name = ' $FOLDER_NAME
+      # Create output folder for results:
+      mkdir output
+
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+
+albedo 0.07
+source solar
+sza ${sza}
+#phi ${phi}
+mol_abs_param Fu
+wavelength_index 1  7
+output_process sum
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver mystic
+$MCIPA
+
+mc_forward_output heating K_per_day
+mc_photons 72500000
+
+mc_sample_grid 420 343 0.227 0.394
+
+mc_basename $SCRIPTDIR/$FOLDER_NAME/output/$OUT_FILE
+
+wc_properties hu interpolate
+ic_properties fu interpolate
+
+wc_file 3D $CLOUD_WC_FILE_INP
+ic_file 3D $CLOUD_IC_FILE_INP
+
+EOF
+### END OF INPUT FILE ###########
+#################################
+cat >ther_HR.sh<<EOF
+#! /bin/bash
+
+#SBATCH --account=bb1135
+#SBATCH --job-name=mystic_dom01.run
+#SBATCH --partition=shared
+#SBATCH --nodes=1
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#SBATCH --exclusive
+
+cd $(pwd)
+echo "Job started @ \$(date)"
+$LIBRAD/bin/uvspec < $INP_FILE_NAME
+echo "Job finished @ \$(date)"
+EOF
+
+        #Leave mysti/mcipa folder:
+        cd ..
+
+        done #over isim
+
+done # over dom
+done # over itime
+
+echo
+echo 
+echo 'The END'
diff --git a/offlineRT/c_cluster_solar_mystic/ccSolar/submit_runs.sh b/offlineRT/c_cluster_solar_mystic/ccSolar/submit_runs.sh
new file mode 100755
index 0000000000000000000000000000000000000000..6452e686c08d695e0bfeddcb9098a2ac42e840a3
--- /dev/null
+++ b/offlineRT/c_cluster_solar_mystic/ccSolar/submit_runs.sh
@@ -0,0 +1,16 @@
+#!/usr/bin/env bash
+
+#-------------------------------------------
+. VARIABLES
+
+for tt in "${time_array[@]}"
+do
+for ss in "${isim_array[@]}"
+do
+   echo "submitting runs for = " $tt'_'$ss
+
+   sbatch s_runs1.sh $tt $ss
+   sbatch s_runs2.sh $tt $ss
+
+done 
+done
diff --git a/offlineRT/c_cluster_solar_mystic/find_missing_simulations.sh b/offlineRT/c_cluster_solar_mystic/find_missing_simulations.sh
new file mode 100755
index 0000000000000000000000000000000000000000..2623b6927aa33b794041f597355896a6bd42db07
--- /dev/null
+++ b/offlineRT/c_cluster_solar_mystic/find_missing_simulations.sh
@@ -0,0 +1,68 @@
+#!/usr/bin/env bash
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01" "02" "03" "04" "05" "06" "07" "08" "09" "10")
+
+pathh=$(pwd)
+#----------------------------------------------------------------------------------------
+ffname1='mysti'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A $ffname1'_'$it'_'$dm'_isim'$isim/output 2>/dev/null)" ]
+then
+  echo "file exist (${ffname1}_'${it}'_'${dm}'_isim'${isim}'.out.abs.spc')"
+else
+  echo "file does not exist (${ffname1}_'${it}'_'${dm}'_isim'${isim}'.out.abs.spc')"
+  #count=(($i + 1))
+  # let's put these into a file for rerun
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname1}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_mysti.sh
+
+fi
+
+done
+done
+done
+
+#---------------------------- mcipa
+ffname2='mcipa'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A $ffname2'_'$it'_'$dm'_isim'$isim/output 2>/dev/null)" ]
+then
+  echo "file exist (${ffname2}_'${it}'_'${dm}'_isim'${isim}'.out.abs.spc')"
+else
+  echo "file does not exist (${ffname2}_'${it}'_'${dm}'_isim'${isim}'.out.abs.spc')"
+  #count=(($i + 1))
+  # let's put these into a file for rerun
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname2}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_mcipa.sh
+
+fi
+
+done
+done
+done
+
+#-----------------------
diff --git a/offlineRT/c_cluster_thermal_ipa3d/add_to_submit_file.txt b/offlineRT/c_cluster_thermal_ipa3d/add_to_submit_file.txt
new file mode 100644
index 0000000000000000000000000000000000000000..19439a7678e164092aacfea016d57e4018e80baf
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d/add_to_submit_file.txt
@@ -0,0 +1,12 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#---------------------------------
+
+
+wait
diff --git a/offlineRT/c_cluster_thermal_ipa3d/ccThermal/VARIABLES b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/VARIABLES
new file mode 100644
index 0000000000000000000000000000000000000000..1373301af56f6a455508ba8eb1ba849c90b31f87
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/VARIABLES
@@ -0,0 +1,11 @@
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400") 
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export dom1=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12")
+
+export dom2=("13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24")
+
+export dom3=("25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01" "02" "03" "04")
diff --git a/offlineRT/c_cluster_thermal_ipa3d/ccThermal/s_runs1.sh b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/s_runs1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..09ff6d6846ada44dd8f3dc04dd98b80ca5ec9482
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/s_runs1.sh
@@ -0,0 +1,26 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='thermal'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+for dm in "${dom1[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}/${ffname}_${timee}_${dm}_isim${num}/libsetup.inp &
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_thermal_ipa3d/ccThermal/s_runs2.sh b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/s_runs2.sh
new file mode 100755
index 0000000000000000000000000000000000000000..03c7b0c058cb3d9d46cb53e427a97ce91a95a70d
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/s_runs2.sh
@@ -0,0 +1,26 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='thermal'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+for dm in "${dom2[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}/${ffname}_${timee}_${dm}_isim${num}/libsetup.inp &
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_thermal_ipa3d/ccThermal/s_runs3.sh b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/s_runs3.sh
new file mode 100755
index 0000000000000000000000000000000000000000..626a33f16354dd0ca2add666274f0fdeba508391
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/s_runs3.sh
@@ -0,0 +1,26 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='thermal'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+for dm in "${dom3[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}/${ffname}_${timee}_${dm}_isim${num}/libsetup.inp &
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_thermal_ipa3d/ccThermal/step1_makeInpFiles.sh b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/step1_makeInpFiles.sh
new file mode 100755
index 0000000000000000000000000000000000000000..c19cfeb2720d44da60d7579e2fb604a7f955d8c8
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/step1_makeInpFiles.sh
@@ -0,0 +1,154 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+pathh=$(cd ../../../ && pwd)
+ffname='ipa3d'
+
+echo
+echo "Shell script sucessfully running!"
+echo
+
+echo
+echo "BUILD FOLDER NAMES AND INPUT FILES"
+echo
+
+#First go out of run folder:
+cd ..
+
+SCRIPTDIR=$(pwd)
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+    CLOUD_WC_FILE_INP="${pathh}/wc3d_${dm}_202201${it}33Z.dat"
+    CLOUD_IC_FILE_INP="${pathh}/ic3d_${dm}_202201${it}33Z.dat"
+    ZVALUES="$(head -n2 ${CLOUD_WC_FILE_INP} | tail -n1 | cut -d " " -f 3-)"
+
+      for isim in "${isim_array[@]}"
+      do
+      echo
+      echo "isim = " $isim
+      if [ $isim == 01 ]
+      then
+         ice_p='fu'
+         ice_h='#ic_habit rough-aggregate'
+      elif [ $isim == 02 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit ghm'
+      elif [ $isim == 03 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit solid-column'
+      else
+         ice_p='baum_v36'
+         ice_h='ic_habit rough-aggregate'
+      fi 	 
+      # Create libradtran output file name:
+      OUT_FILE="${ffname}_${it}_${dm}_isim${isim}.out"
+
+      echo 'OUT_FILE = ' $OUT_FILE
+
+      FOLDER_NAME="${ffname}_${it}_${dm}_isim${isim}"
+
+      mkdir $FOLDER_NAME
+
+      cd $FOLDER_NAME
+
+      echo 'Entering folder name = ' $FOLDER_NAME
+      # Create output folder for results:
+      mkdir output
+
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+
+albedo 0.009000000000000008
+source thermal  
+mol_abs_param Fu   
+wavelength_index 7  18   
+output_process sum   
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver rodents
+ipa_3d
+
+heating_rate layer_fd
+mc_backward_output heat K_per_day
+
+mc_sample_grid 420 343 0.227 0.394
+
+mc_basename $SCRIPTDIR/$FOLDER_NAME/output/$OUT_FILE
+
+wc_properties hu interpolate
+ic_properties $ice_p interpolate
+$ice_h
+
+wc_file 3D $CLOUD_WC_FILE_INP
+ic_file 3D $CLOUD_IC_FILE_INP
+
+quiet
+EOF
+### END OF INPUT FILE ###########
+#################################
+cat >ther_HR.sh<<EOF
+#! /bin/bash
+
+#SBATCH --account=bb1135
+#SBATCH --job-name=mystic_dom01.run
+#SBATCH --partition=shared
+#SBATCH --nodes=1
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#SBATCH --exclusive
+
+cd $(pwd)
+echo "Job started @ \$(date)"
+$LIBRAD/bin/uvspec < $INP_FILE_NAME
+echo "Job finished @ \$(date)"
+EOF
+
+        #Leave ipa3d run folder:
+        cd ..
+
+        done #over isim
+
+done # over dom
+done # over itime
+
+echo
+echo 
+echo 'The END'
diff --git a/offlineRT/c_cluster_thermal_ipa3d/ccThermal/submit_runs.sh b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/submit_runs.sh
new file mode 100755
index 0000000000000000000000000000000000000000..77a75ae5ddaa96c9139bf50ee6e6586c6dbc0a8a
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d/ccThermal/submit_runs.sh
@@ -0,0 +1,17 @@
+#!/usr/bin/env bash
+
+#-------------------------------------------
+. VARIABLES
+
+for tt in "${time_array[@]}"
+do
+for ss in "${isim_array[@]}"
+do
+   echo "submitting runs for = " $tt'_'$ss
+
+   sbatch s_runs1.sh $tt $ss
+   sbatch s_runs2.sh $tt $ss
+   sbatch s_runs3.sh $tt $ss
+
+done 
+done
diff --git a/offlineRT/c_cluster_thermal_ipa3d/find_missing_simulations.sh b/offlineRT/c_cluster_thermal_ipa3d/find_missing_simulations.sh
new file mode 100755
index 0000000000000000000000000000000000000000..e0ab46047a25e8b61045c504db6dfd7c74e94807
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d/find_missing_simulations.sh
@@ -0,0 +1,39 @@
+#!/usr/bin/env bash
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01" "02" "03" "04")
+
+pathh=$(pwd)
+#----------------------------------------------------------------------------------------
+ffname1='ipa3d'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A $ffname1'_'$it'_'$dm'_isim'$isim/output 2>/dev/null)" ]
+then
+  echo "file exist (${ffname1}_'${it}'_'${dm}'_isim'${isim}'.out.abs.spc')"
+else
+  echo "file does not exist (${ffname1}_'${it}'_'${dm}'_isim'${isim}'.out.abs.spc')"
+  #count=(($i + 1))
+  # let's put these into a file for rerun
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname1}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_ipa3d.sh
+
+fi
+
+done
+done
+done
+
+#-----------------------
diff --git a/offlineRT/c_cluster_thermal_ipa3d_cg/add_to_submit_file.txt b/offlineRT/c_cluster_thermal_ipa3d_cg/add_to_submit_file.txt
new file mode 100644
index 0000000000000000000000000000000000000000..19439a7678e164092aacfea016d57e4018e80baf
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_cg/add_to_submit_file.txt
@@ -0,0 +1,12 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#---------------------------------
+
+
+wait
diff --git a/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/VARIABLES b/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/VARIABLES
new file mode 100644
index 0000000000000000000000000000000000000000..fe5ae23a33f7e08e5bb3b28b4d89f5a0b2840476
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/VARIABLES
@@ -0,0 +1,9 @@
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400") 
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export dom1=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17")
+
+export dom2=("18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01") ##("02" "03" "04")
diff --git a/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/s_runs1.sh b/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/s_runs1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..760f1d53357df75668e30123f499577720ae0c67
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/s_runs1.sh
@@ -0,0 +1,31 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+###SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='thermal'
+#timee="${time_array[0]}"
+#timee=$1
+num='01'
+#num=$2
+
+echo "Job started @ $(date)"
+for tt in "${time_array[@]}"
+do
+for dm in "${dom[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}_cg/${ffname}_${tt}_${dm}_isim${num}/libsetup.inp &
+done
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/step1_makeInpFiles.sh b/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/step1_makeInpFiles.sh
new file mode 100755
index 0000000000000000000000000000000000000000..fcfb8001568a4ec0508ed3474ab0dd6cf12e0d67
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/step1_makeInpFiles.sh
@@ -0,0 +1,154 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+pathh=$(cd ../../../ && pwd)
+ffname='ipa3d'
+
+echo
+echo "Shell script sucessfully running!"
+echo
+
+echo
+echo "BUILD FOLDER NAMES AND INPUT FILES"
+echo
+
+#First go out of run folder:
+cd ..
+
+SCRIPTDIR=$(pwd)
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+    CLOUD_WC_FILE_INP="${pathh}/wc3d_cg_${dm}_202201${it}33Z.dat"
+    CLOUD_IC_FILE_INP="${pathh}/ic3d_cg_${dm}_202201${it}33Z.dat"
+    ZVALUES="$(head -n2 ${CLOUD_WC_FILE_INP} | tail -n1 | cut -d " " -f 3-)"
+
+      for isim in "${isim_array[@]}"
+      do
+      echo
+      echo "isim = " $isim
+      if [ $isim == 01 ]
+      then
+         ice_p='fu'
+         ice_h='#ic_habit rough-aggregate'
+      elif [ $isim == 02 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit ghm'
+      elif [ $isim == 03 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit solid-column'
+      else
+         ice_p='baum_v36'
+         ice_h='ic_habit rough-aggregate'
+      fi 	 
+      # Create libradtran output file name:
+      OUT_FILE="${ffname}_${it}_${dm}_isim${isim}.out"
+
+      echo 'OUT_FILE = ' $OUT_FILE
+
+      FOLDER_NAME="${ffname}_${it}_${dm}_isim${isim}"
+
+      mkdir $FOLDER_NAME
+
+      cd $FOLDER_NAME
+
+      echo 'Entering folder name = ' $FOLDER_NAME
+      # Create output folder for results:
+      mkdir output
+
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+
+albedo 0.009000000000000008
+source thermal  
+mol_abs_param Fu   
+wavelength_index 7  18   
+output_process sum   
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver twomaxrnd
+ipa_3d
+
+heating_rate layer_fd
+mc_backward_output heat K_per_day
+
+mc_sample_grid 41 33 1.9 3.3
+
+mc_basename $SCRIPTDIR/$FOLDER_NAME/output/$OUT_FILE
+
+wc_properties hu interpolate
+ic_properties $ice_p interpolate
+$ice_h
+
+wc_file 3D $CLOUD_WC_FILE_INP
+ic_file 3D $CLOUD_IC_FILE_INP
+
+quiet
+EOF
+### END OF INPUT FILE ###########
+#################################
+cat >ther_HR.sh<<EOF
+#! /bin/bash
+
+#SBATCH --account=bb1135
+#SBATCH --job-name=mystic_dom01.run
+#SBATCH --partition=shared
+#SBATCH --nodes=1
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#SBATCH --exclusive
+
+cd $(pwd)
+echo "Job started @ \$(date)"
+$LIBRAD/bin/uvspec < $INP_FILE_NAME
+echo "Job finished @ \$(date)"
+EOF
+
+        #Leave ipa3d run folder:
+        cd ..
+
+        done #over isim
+
+done # over dom
+done # over itime
+
+echo
+echo 
+echo 'The END'
diff --git a/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/submit_runs.sh b/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/submit_runs.sh
new file mode 100755
index 0000000000000000000000000000000000000000..21637528ea5ef932d16c174eb55cf819186a3041
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_cg/ccThermal/submit_runs.sh
@@ -0,0 +1,15 @@
+#!/usr/bin/env bash
+
+#-------------------------------------------
+. VARIABLES
+
+for tt in "${time_array[@]}"
+do
+for ss in "${isim_array[@]}"
+do
+   echo "submitting runs for = " $tt'_'$ss
+
+   sbatch s_runs1.sh $tt $ss
+
+done 
+done
diff --git a/offlineRT/c_cluster_thermal_ipa3d_cg/find_missing_simulations.sh b/offlineRT/c_cluster_thermal_ipa3d_cg/find_missing_simulations.sh
new file mode 100755
index 0000000000000000000000000000000000000000..086f35bc2fe65d973dc41811af4484c941f260d9
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_cg/find_missing_simulations.sh
@@ -0,0 +1,39 @@
+#!/usr/bin/env bash
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01")
+
+pathh=$(pwd)
+#----------------------------------------------------------------------------------------
+ffname1='ipa3d'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A $ffname1'_'$it'_'$dm'_isim'$isim/output 2>/dev/null)" ]
+then
+  echo "file exist (${ffname1}_'${it}'_'${dm}'_isim'${isim}'.out.abs.spc')"
+else
+  echo "file does not exist (${ffname1}_'${it}'_'${dm}'_isim'${isim}'.out.abs.spc')"
+  #count=(($i + 1))
+  # let's put these into a file for rerun
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname1}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_ipa3d.sh
+
+fi
+
+done
+done
+done
+
+#-----------------------
diff --git a/offlineRT/c_cluster_thermal_ipa3d_dl/add_to_submit_file.txt b/offlineRT/c_cluster_thermal_ipa3d_dl/add_to_submit_file.txt
new file mode 100644
index 0000000000000000000000000000000000000000..19439a7678e164092aacfea016d57e4018e80baf
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_dl/add_to_submit_file.txt
@@ -0,0 +1,12 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#---------------------------------
+
+
+wait
diff --git a/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/VARIABLES b/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/VARIABLES
new file mode 100644
index 0000000000000000000000000000000000000000..fe5ae23a33f7e08e5bb3b28b4d89f5a0b2840476
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/VARIABLES
@@ -0,0 +1,9 @@
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400") 
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export dom1=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17")
+
+export dom2=("18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01") ##("02" "03" "04")
diff --git a/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/s_runs1.sh b/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/s_runs1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..dc2938352c573b17322773840d3c9f6e5a663339
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/s_runs1.sh
@@ -0,0 +1,31 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+###SBATCH --mem=800000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname='ipa3d'
+ssource='thermal'
+#timee="${time_array[0]}"
+#timee=$1
+num='01'
+#num=$2
+
+echo "Job started @ $(date)"
+for tt in "${time_array[@]}"
+do
+for dm in "${dom[@]}"
+do	
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname}_dl/${ffname}_${tt}_${dm}_isim${num}/libsetup.inp &
+done
+done
+wait
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/step1_makeInpFiles.sh b/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/step1_makeInpFiles.sh
new file mode 100755
index 0000000000000000000000000000000000000000..5f8aa9483488d1fa882a8e1078fb6348ba14100e
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/step1_makeInpFiles.sh
@@ -0,0 +1,154 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+pathh=$(cd ../../../ && pwd)
+ffname='ipa3d'
+
+echo
+echo "Shell script sucessfully running!"
+echo
+
+echo
+echo "BUILD FOLDER NAMES AND INPUT FILES"
+echo
+
+#First go out of run folder:
+cd ..
+
+SCRIPTDIR=$(pwd)
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+    CLOUD_WC_FILE_INP="${pathh}/wc3d_dl_${dm}_202201${it}33Z.dat"
+    CLOUD_IC_FILE_INP="${pathh}/ic3d_dl_${dm}_202201${it}33Z.dat"
+    ZVALUES="$(head -n2 ${CLOUD_WC_FILE_INP} | tail -n1 | cut -d " " -f 3-)"
+
+      for isim in "${isim_array[@]}"
+      do
+      echo
+      echo "isim = " $isim
+      if [ $isim == 01 ]
+      then
+         ice_p='fu'
+         ice_h='#ic_habit rough-aggregate'
+      elif [ $isim == 02 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit ghm'
+      elif [ $isim == 03 ]
+      then
+         ice_p='baum_v36'
+         ice_h='ic_habit solid-column'
+      else
+         ice_p='baum_v36'
+         ice_h='ic_habit rough-aggregate'
+      fi 	 
+      # Create libradtran output file name:
+      OUT_FILE="${ffname}_${it}_${dm}_isim${isim}.out"
+
+      echo 'OUT_FILE = ' $OUT_FILE
+
+      FOLDER_NAME="${ffname}_${it}_${dm}_isim${isim}"
+
+      mkdir $FOLDER_NAME
+
+      cd $FOLDER_NAME
+
+      echo 'Entering folder name = ' $FOLDER_NAME
+      # Create output folder for results:
+      mkdir output
+
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+
+albedo 0.009000000000000008
+source thermal  
+mol_abs_param Fu   
+wavelength_index 7  18   
+output_process sum   
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver rodents
+ipa_3d
+
+heating_rate layer_fd
+mc_backward_output heat K_per_day
+
+mc_sample_grid 41 33 1.9 3.3
+
+mc_basename $SCRIPTDIR/$FOLDER_NAME/output/$OUT_FILE
+
+wc_properties hu interpolate
+ic_properties $ice_p interpolate
+$ice_h
+
+wc_file 3D $CLOUD_WC_FILE_INP
+ic_file 3D $CLOUD_IC_FILE_INP
+
+quiet
+EOF
+### END OF INPUT FILE ###########
+#################################
+cat >ther_HR.sh<<EOF
+#! /bin/bash
+
+#SBATCH --account=bb1135
+#SBATCH --job-name=mystic_dom01.run
+#SBATCH --partition=shared
+#SBATCH --nodes=1
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#SBATCH --exclusive
+
+cd $(pwd)
+echo "Job started @ \$(date)"
+$LIBRAD/bin/uvspec < $INP_FILE_NAME
+echo "Job finished @ \$(date)"
+EOF
+
+        #Leave the run folder:
+        cd ..
+
+        done #over isim
+
+done # over dom
+done # over itime
+
+echo
+echo 
+echo 'The END'
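The case block at the top of this script maps the ensemble member ID `isim` to a libRadtran ice-optics configuration. A minimal Python sketch of that same mapping (the function name is hypothetical; the option strings are taken directly from the script — member "01" uses the Fu parameterization with the habit line commented out, all others use Baum v3.6 with a specific ice habit):

```python
# Mapping from ensemble member ID (isim) to the (ic_properties, habit line)
# pair selected by the bash case block in step1_makeInpFiles.sh.
def ice_options(isim):
    table = {
        "01": ("fu", "#ic_habit rough-aggregate"),   # habit line disabled for Fu
        "02": ("baum_v36", "ic_habit ghm"),
        "03": ("baum_v36", "ic_habit solid-column"),
    }
    # any other member falls back to Baum v3.6 with rough aggregates
    return table.get(isim, ("baum_v36", "ic_habit rough-aggregate"))

print(ice_options("03"))  # -> ('baum_v36', 'ic_habit solid-column')
```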
diff --git a/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/submit_runs.sh b/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/submit_runs.sh
new file mode 100755
index 0000000000000000000000000000000000000000..21637528ea5ef932d16c174eb55cf819186a3041
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_dl/ccThermal/submit_runs.sh
@@ -0,0 +1,15 @@
+#!/usr/bin/env bash
+
+#-------------------------------------------
+. VARIABLES
+
+for tt in "${time_array[@]}"
+do
+for ss in "${isim_array[@]}"
+do
+   echo "submitting runs for = " $tt'_'$ss
+
+   sbatch s_runs1.sh $tt $ss
+
+done 
+done
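`submit_runs.sh` enumerates the outer product of the time and ensemble-member arrays from `VARIABLES` and submits one batch job per pair. The same enumeration in Python, with truncated stand-in arrays for illustration (the real lists come from the `VARIABLES` file):

```python
from itertools import product

# Stand-ins for the exported time_array / isim_array in VARIABLES.
time_array = ["05T1000", "05T1030"]
isim_array = ["01"]

# One entry per (time, member) pair, in the same order as the nested bash loops.
jobs = [f"{tt}_{ss}" for tt, ss in product(time_array, isim_array)]
print(jobs)  # each entry maps to one `sbatch s_runs1.sh $tt $ss` call
```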
diff --git a/offlineRT/c_cluster_thermal_ipa3d_dl/find_missing_simulations.sh b/offlineRT/c_cluster_thermal_ipa3d_dl/find_missing_simulations.sh
new file mode 100755
index 0000000000000000000000000000000000000000..086f35bc2fe65d973dc41811af4484c941f260d9
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_ipa3d_dl/find_missing_simulations.sh
@@ -0,0 +1,39 @@
+#!/usr/bin/env bash
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01")
+
+pathh=$(pwd)
+#----------------------------------------------------------------------------------------
+ffname1='ipa3d'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A $ffname1'_'$it'_'$dm'_isim'$isim/output 2>/dev/null)" ]
+then
+  echo "file exists (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+else
+  echo "file does not exist (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+  # collect the missing runs in a file for resubmission
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname1}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_ipa3d.sh
+
+fi
+
+done
+done
+done
+
+#-----------------------
diff --git a/offlineRT/c_cluster_thermal_mystic/add_to_submit_file.txt b/offlineRT/c_cluster_thermal_mystic/add_to_submit_file.txt
new file mode 100644
index 0000000000000000000000000000000000000000..6aa3cba2250c5ec60bfca8db6b889e3d1db067fb
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_mystic/add_to_submit_file.txt
@@ -0,0 +1,12 @@
+#!/bin/bash
+#SBATCH --partition=interactive
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+#SBATCH --mem=256000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#---------------------------------
+
+
+wait
diff --git a/offlineRT/c_cluster_thermal_mystic/ccThermal/VARIABLES b/offlineRT/c_cluster_thermal_mystic/ccThermal/VARIABLES
new file mode 100644
index 0000000000000000000000000000000000000000..24817247af7017b8984cd21bb0624442b439cd45
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_mystic/ccThermal/VARIABLES
@@ -0,0 +1,7 @@
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400") 
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01" "02" "03" "04" "05" "06" "07" "08" "09" "10")
+
+
diff --git a/offlineRT/c_cluster_thermal_mystic/ccThermal/s_runs1.sh b/offlineRT/c_cluster_thermal_mystic/ccThermal/s_runs1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..cfeb6010fe3a07f5c17e19495ce26a9e41ba0139
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_mystic/ccThermal/s_runs1.sh
@@ -0,0 +1,31 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+###SBATCH --mem=256000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname1='mystic'
+ffname2='mysti'
+ssource='thermal'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+
+for dm in "${dom[@]}"
+do
+
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname1}/${ffname2}_${timee}_${dm}_isim${num}/libsetup.inp &
+
+done
+wait
+
+echo "Job finished @ $(date)"
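`s_runs1.sh` fans out one `uvspec` process per subdomain with `&` and synchronizes with `wait`. A minimal Python sketch of that fan-out pattern (function and argument names are hypothetical; a generic command stands in for `uvspec` so the sketch runs without libRadtran installed):

```python
import shlex
import subprocess

def run_all(domains, solver_cmd, inp_template):
    """Launch one solver process per subdomain in the background and block
    until all have finished, like the backgrounded uvspec loop + `wait`."""
    procs = []
    for dm in domains:
        inp = inp_template.format(dm=dm)  # this domain's libsetup.inp path
        # the real script redirects stdin from `inp`; here solver_cmd is a
        # placeholder command, so stdin/stdout are just silenced
        p = subprocess.Popen(shlex.split(solver_cmd),
                             stdin=subprocess.DEVNULL,
                             stdout=subprocess.DEVNULL)
        procs.append(p)
    return [p.wait() for p in procs]  # like bash `wait`: join all children
```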
diff --git a/offlineRT/c_cluster_thermal_mystic/ccThermal/s_runs2.sh b/offlineRT/c_cluster_thermal_mystic/ccThermal/s_runs2.sh
new file mode 100755
index 0000000000000000000000000000000000000000..9463699a30ae5624368beb9f0b428dc7545d3fc8
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_mystic/ccThermal/s_runs2.sh
@@ -0,0 +1,31 @@
+#!/bin/bash
+#SBATCH --partition=compute
+#SBATCH --account=bb1135
+#SBATCH --nodes=1
+###SBATCH --mem=256000
+#SBATCH --exclusive
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+
+
+#---------------------------------
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+ffname1='mystic'
+ffname2='mcipa'
+ssource='thermal'
+timee=$1
+num=$2
+
+echo "Job started @ $(date)"
+
+for dm in "${dom[@]}"
+do
+
+/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/c_cluster_${ssource}_${ffname1}/${ffname2}_${timee}_${dm}_isim${num}/libsetup.inp &
+
+done
+wait
+
+echo "Job finished @ $(date)"
diff --git a/offlineRT/c_cluster_thermal_mystic/ccThermal/step1_makeInpFiles.sh b/offlineRT/c_cluster_thermal_mystic/ccThermal/step1_makeInpFiles.sh
new file mode 100755
index 0000000000000000000000000000000000000000..54e8d304bbf53385876fc1633df14469afe640a3
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_mystic/ccThermal/step1_makeInpFiles.sh
@@ -0,0 +1,161 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+IDENT=$1
+
+pathh=$(cd ../../../ && pwd)
+ffname='mysti'
+
+if [ "$IDENT" == "mcipa" ]
+then
+    MCIPA="mc_ipa"
+    ffname='mcipa'
+fi
+
+echo
+echo "Shell script successfully running!"
+echo
+
+echo
+echo "BUILD FOLDER NAMES AND INPUT FILES"
+echo
+
+#First go out of run folder:
+cd ..
+
+SCRIPTDIR=$(pwd)
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+    CLOUD_WC_FILE_INP=''$pathh'/wc3d_'$dm'_202201'$it'33Z.dat'
+    CLOUD_IC_FILE_INP=''$pathh'/ic3d_'$dm'_202201'$it'33Z.dat'
+    ZVALUES="$(head -n2 ${CLOUD_WC_FILE_INP} | tail -n1 | cut -d " " -f 3-)"
+
+      for isim in "${isim_array[@]}"
+      do
+      echo
+      echo "isim = " $isim
+      #if [ $isim == 01 ]
+      #then
+      #   ice_p='fu'
+      #   ice_h='#ic_habit rough-aggregate'
+      #elif [ $isim == 02 ]
+      #then
+      #   ice_p='baum_v36'
+      #   ice_h='ic_habit ghm'
+      #elif [ $isim == 03 ]
+      #then
+      #   ice_p='baum_v36'
+      #   ice_h='ic_habit solid-column'
+      #else
+      #   ice_p='baum_v36'
+      #   ice_h='ic_habit rough-aggregate'
+      #fi 	 
+      # Create libradtran output file name:
+      OUT_FILE=$ffname'_'$it'_'$dm'_isim'$isim'.out'
+
+      echo 'OUT_FILE = ' $OUT_FILE
+
+      FOLDER_NAME=$ffname'_'$it'_'$dm'_isim'$isim
+
+      mkdir -p $FOLDER_NAME
+
+      cd $FOLDER_NAME
+
+      echo 'Entering folder name = ' $FOLDER_NAME
+      # Create output folder for results (-p keeps reruns from failing):
+      mkdir -p output
+
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+
+albedo 0.009000000000000008
+source thermal  
+mol_abs_param Fu   
+wavelength_index 7  18   
+output_process sum   
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver mystic
+$MCIPA
+
+mc_forward_output heating K_per_day
+mc_photons 72500000
+
+mc_sample_grid 420 343 0.227 0.394
+
+mc_basename $SCRIPTDIR/$FOLDER_NAME/output/$OUT_FILE
+
+wc_properties hu interpolate
+ic_properties fu interpolate
+
+wc_file 3D $CLOUD_WC_FILE_INP
+ic_file 3D $CLOUD_IC_FILE_INP
+
+quiet
+EOF
+### END OF INPUT FILE ###########
+#################################
+cat >ther_HR.sh<<EOF
+#! /bin/bash
+
+#SBATCH --account=bb1135
+#SBATCH --job-name=mystic_dom01.run
+#SBATCH --partition=shared
+#SBATCH --nodes=1
+#SBATCH --output=libther.sh.%j.out
+#SBATCH --error=libther.sh.%j.err
+#SBATCH --exclusive
+
+cd $(pwd)
+echo "Job started @ \$(date)"
+$LIBRAD/bin/uvspec < $INP_FILE_NAME
+echo "Job finished @ \$(date)"
+EOF
+
+        #Leave mysti/mcipa folder:
+        cd ..
+
+        done #over isim
+
+done # over dom
+done # over itime
+
+echo
+echo 
+echo 'The END'
diff --git a/offlineRT/c_cluster_thermal_mystic/ccThermal/submit_runs.sh b/offlineRT/c_cluster_thermal_mystic/ccThermal/submit_runs.sh
new file mode 100755
index 0000000000000000000000000000000000000000..6452e686c08d695e0bfeddcb9098a2ac42e840a3
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_mystic/ccThermal/submit_runs.sh
@@ -0,0 +1,16 @@
+#!/usr/bin/env bash
+
+#-------------------------------------------
+. VARIABLES
+
+for tt in "${time_array[@]}"
+do
+for ss in "${isim_array[@]}"
+do
+   echo "submitting runs for = " $tt'_'$ss
+
+   sbatch s_runs1.sh $tt $ss
+   sbatch s_runs2.sh $tt $ss
+
+done 
+done
diff --git a/offlineRT/c_cluster_thermal_mystic/find_missing_simulations.sh b/offlineRT/c_cluster_thermal_mystic/find_missing_simulations.sh
new file mode 100755
index 0000000000000000000000000000000000000000..2623b6927aa33b794041f597355896a6bd42db07
--- /dev/null
+++ b/offlineRT/c_cluster_thermal_mystic/find_missing_simulations.sh
@@ -0,0 +1,68 @@
+#!/usr/bin/env bash
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01" "02" "03" "04" "05" "06" "07" "08" "09" "10")
+
+pathh=$(pwd)
+#----------------------------------------------------------------------------------------
+ffname1='mysti'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A $ffname1'_'$it'_'$dm'_isim'$isim/output 2>/dev/null)" ]
+then
+  echo "file exists (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+else
+  echo "file does not exist (${ffname1}_${it}_${dm}_isim${isim}.out.abs.spc)"
+  # collect the missing runs in a file for resubmission
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname1}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_mysti.sh
+
+fi
+
+done
+done
+done
+
+#---------------------------- mcipa
+ffname2='mcipa'
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+
+for isim in "${isim_array[@]}"
+do
+
+if [ -n "$(ls -A $ffname2'_'$it'_'$dm'_isim'$isim/output 2>/dev/null)" ]
+then
+  echo "file exists (${ffname2}_${it}_${dm}_isim${isim}.out.abs.spc)"
+else
+  echo "file does not exist (${ffname2}_${it}_${dm}_isim${isim}.out.abs.spc)"
+  # collect the missing runs in a file for resubmission
+  echo "/home/b/b381185/libRadtran/bin/uvspec < ${pathh}/${ffname2}_${it}_${dm}_isim${isim}/libsetup.inp &" >> submit_mcipa.sh
+
+fi
+
+done
+done
+done
+
+#-----------------------
diff --git a/offlineRT/convert_libradtran_data_to_netcdf_ipa3d.py b/offlineRT/convert_libradtran_data_to_netcdf_ipa3d.py
new file mode 100644
index 0000000000000000000000000000000000000000..81bd13a3878f635eb6f672efb08a3607b7accf65
--- /dev/null
+++ b/offlineRT/convert_libradtran_data_to_netcdf_ipa3d.py
@@ -0,0 +1,194 @@
+# Behrooz Keshtgar, KIT
+
+# loading modules
+
+import xarray as xr
+import numpy as np
+import pandas as pd
+import warnings
+warnings.filterwarnings("ignore")
+
+
+print('start converting ipa3d outputs to netcdf files')
+# Dictionary for loading simulations
+simdict = {
+         'LC1-LES-471x667km-lon25-lat40-300m-0006' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+         'LC1-LES-471x667km-lon30-lat53-300m-0005' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+         'LC1-LES-471x667km-lon40-lat44-300m-0003' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+         'LC1-LES-471x667km-lon50-lat48-300m-0004' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+          }
+
+# here change the simulation path
+simm = list(simdict.keys())[0]
+path = '/work/bb1135/icon_output/'+simm+'/input4libradtran/libradtran/'
+#---------------------------------------
+# number of model layers
+height = np.arange(0,150,1)
+nz = 150
+nx = 420
+ny = 343
+# cutting boundaries
+nx2 = 344
+ny2 = 281
+
+# solver names for clear-sky hr
+solvers_name=['rodents'] 
+k = 0
+#---------------------------------------
+# loop over time steps
+for time in ['05T1000','05T1030','05T1100','05T1130','05T1200','05T1230','05T1300','05T1330','05T1400']:   
+    # loop over solvers
+    for solver in ['ipa3d']:
+        # loop over sources (thermal/solar)
+        for source in ['thermal','solar']:
+            # loop over sim types
+            for nsim in ['01','02','03','04']: # 01: ICE FU, 02: ICE Baum-ghm, 03: ICE Baum-sc, 04: ICE Baum-rg
+                print('working on datasets for:', time+'/'+solver+'/'+source+'/'+nsim)
+                # list conatining paths to the results
+                simpath = []
+                # list holding names of the files
+                sims=[]
+                # list holding names of the files for clear-sky results
+                sims2=[]
+                # loop over subdomains
+                for dom in range(1,37):
+                    siim = solvers_name[k]+'_'+time+'_'+str(dom)+'_isim'+nsim
+                    sims2.append(siim)
+                    sim = solver+'_'+time+'_'+str(dom)+'_isim'+nsim
+                    simpath.append(path+'c_cluster_'+source+'_'+solver+'/'+sim+'/')
+                    sims.append(sim)
+
+                # empty list to store heating rates
+                thr_3d = []
+                for i in range(len(simpath)):
+                    print('****loading ascii file:', sims[i])
+                    hr = (pd.read_table(simpath[i]+'output/'+sims[i]+'.out.abs.spc', header=None, sep=r'\s+', usecols=[4]).values)[:,0]
+                    #---------------------------------------
+                    temp = np.zeros((nx, ny, nz))
+                    for iz in range(nz):  # separate index: do not shadow the outer loop variable
+                        temp[:,:,iz] = np.reshape(hr[nx*ny*iz:nx*ny*(iz+1)], (nx,ny))
+                    # cutting boundaries
+                    temp2 = temp[38:420-38, 31:343-31, :]
+                    thr_3d.append(temp2*1.4)  # * cp/cv factor in ICON
+
+                # loading clear sky heating rates
+                cshr = []
+                print('****loading clear-sky hr')
+                for i in range (len(sims2)):
+                    cshr_t = (pd.read_table(path+source+'_clear_sky/output/'+sims2[i].replace('isim'+nsim,'cs_hr')+'.out', header=None, sep=r'\s+', usecols=[1]).values)[:,0]
+                    tmp = cshr_t*1.4 # x cp/cv factor in ICON
+                    # broadcasting to 3D structure
+                    tmp2 = np.broadcast_to(tmp,(nx2,ny2,nz)).copy()
+                    cshr.append(tmp2)
+
+                print('****create dataset')
+                # creating datasets
+
+                # one cloudy and one clear-sky variable per subdomain
+                ds_vars = {}
+                for d in range(36):
+                    ds_vars['ddt_dom'+str(d+1)] = xr.DataArray(thr_3d[d], dims=('lon','lat','height'))
+                    ds_vars['ddt_cs_dom'+str(d+1)] = xr.DataArray(cshr[d], dims=('lon','lat','height'))
+
+                ds = xr.Dataset(ds_vars,
+                                coords={"lat": (["lat"], np.arange(0,ny2)),
+                                        "lon": (["lon"], np.arange(0,nx2)),
+                                        'height': (["height"], np.arange(0,nz))})
+
+                # let's merge them together here
+                print('****merging subdomains into the big domain')
+                ds_j = []
+                ds_j2 = []
+                for j in range(1,37,6):
+                    ds_i = xr.concat([ds['ddt_dom'+str(j)+''],ds['ddt_dom'+str(j+1)+''],
+                                      ds['ddt_dom'+str(j+2)+''],ds['ddt_dom'+str(j+3)+''],
+                                      ds['ddt_dom'+str(j+4)+''],ds['ddt_dom'+str(j+5)+'']],dim='lon')
+                    ds_j.append(ds_i)
+
+                    ds_i2 = xr.concat([ds['ddt_cs_dom'+str(j)+''],ds['ddt_cs_dom'+str(j+1)+''],
+                                      ds['ddt_cs_dom'+str(j+2)+''],ds['ddt_cs_dom'+str(j+3)+''],
+                                      ds['ddt_cs_dom'+str(j+4)+''],ds['ddt_cs_dom'+str(j+5)+'']],dim='lon')
+                    ds_j2.append(ds_i2)
+                # concat along y dim
+                ds_f1 = xr.concat([ds_j[0],ds_j[1],ds_j[2],ds_j[3],ds_j[4],ds_j[5]],dim='lat')
+                ds_f2 = xr.concat([ds_j2[0],ds_j2[1],ds_j2[2],ds_j2[3],ds_j2[4],ds_j2[5]],dim='lat')
+                ds_1 = xr.merge([ds_f1,ds_f2])
+                ds_1['ddt_radlw'] = ds_1['ddt_dom1'] - ds_1['ddt_cs_dom1']
+                ds_1.coords['lon'] = np.arange(0,344*6)
+                ds_1.coords['lat'] = np.arange(0,281*6)
+                ds_1 = ds_1.expand_dims('time')
+                print('****save')
+                ds_1.to_netcdf('/work/bb1135/icon_output/'+simm+'/input4libradtran/libradtran/output_netcdf/ds_librad2_'+time+'_'+source+'_'+solver+'_'+nsim+'.nc')
+                print('----------------------------------------------------------------------------')
+
+print('finished')
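The converter above rebuilds each subdomain cube from libRadtran's flat, level-by-level output column and broadcasts the 1-D clear-sky profile to the same 3-D shape. A self-contained numpy sketch of both steps, with toy dimensions standing in for the script's nx=420, ny=343, nz=150:

```python
import numpy as np

nx, ny, nz = 4, 3, 2  # toy stand-ins for the real grid dimensions

# libRadtran writes the heating rate as one flat column, level by level;
# the script rebuilds the (nx, ny, nz) cube one level at a time:
hr = np.arange(nx * ny * nz, dtype=float)  # fake flat column
temp = np.zeros((nx, ny, nz))
for iz in range(nz):
    temp[:, :, iz] = hr[nx * ny * iz : nx * ny * (iz + 1)].reshape(nx, ny)

# equivalent single reshape + axis move (the level index varies slowest):
temp_alt = np.moveaxis(hr.reshape(nz, nx, ny), 0, -1)
assert np.array_equal(temp, temp_alt)

# clear-sky rates are a 1-D vertical profile; broadcasting copies it
# to every (lon, lat) column, as np.broadcast_to does in the script:
cs_profile = np.linspace(0.0, -2.0, nz)
cs3d = np.broadcast_to(cs_profile, (nx, ny, nz)).copy()
assert cs3d.shape == (nx, ny, nz)
```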
diff --git a/offlineRT/convert_libradtran_data_to_netcdf_ipa3d_nwp.py b/offlineRT/convert_libradtran_data_to_netcdf_ipa3d_nwp.py
new file mode 100644
index 0000000000000000000000000000000000000000..48a4efd20dcf6f89a0834b5fb37a226504b05a7c
--- /dev/null
+++ b/offlineRT/convert_libradtran_data_to_netcdf_ipa3d_nwp.py
@@ -0,0 +1,196 @@
+# Behrooz Keshtgar, KIT
+
+# loading modules
+
+import xarray as xr
+import numpy as np
+import pandas as pd
+import warnings
+warnings.filterwarnings("ignore")
+
+
+print('start converting ipa3d outputs to netcdf files')
+# path to the libradtran results
+
+# Dictionary for loading simulations
+simdict = {
+         'LC1-LES-471x667km-lon25-lat40-300m-0006' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+         'LC1-LES-471x667km-lon30-lat53-300m-0005' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+         'LC1-LES-471x667km-lon40-lat44-300m-0003' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+         'LC1-LES-471x667km-lon50-lat48-300m-0004' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+          }
+
+# here change the simulation path
+simm = list(simdict.keys())[0]
+path = '/work/bb1135/icon_output/'+simm+'/input4libradtran/libradtran/'
+#---------------------------------------
+# number of model layers
+height = np.arange(0,150,1)
+nz = 150
+nx = 41
+ny = 33
+
+# solver names for clear-sky hr
+solvers_name=['rodents']
+k = 0
+
+# NWP clouds with or without fractions, change between suffix (cg/dl)
+cldt = 'cg'
+#---------------------------------------
+# loop over time steps
+for time in ['05T1000','05T1030','05T1100','05T1130','05T1200','05T1230','05T1300','05T1330','05T1400']:
+    # loop over solvers
+    for solver in ['ipa3d']:
+        # loop over sources (thermal/solar)
+        for source in ['thermal','solar']:
+            # loop over sim types
+            for nsim in ['01']: # 01: ICE FU, 02: ICE Baum-ghm, 03: ICE Baum-sc, 04: ICE Baum-rg
+                print('working on datasets for:', time+'/'+solver+'/'+source+'/'+nsim)
+                # list containing paths to the results
+                simpath = []
+                # list holding names of the files
+                sims=[]
+                # list holding names of the files for clear-sky results
+                sims2=[]
+                # loop over subdomains
+                for dom in range(1,37):
+                    siim = solvers_name[k]+'_'+time+'_'+str(dom)+'_isim'+nsim
+                    sims2.append(siim)
+                    sim = solver+'_'+time+'_'+str(dom)+'_isim'+nsim
+                    simpath.append(path+'c_cluster_'+source+'_'+solver+'_'+cldt+'/'+sim+'/')
+                    sims.append(sim)
+
+                # empty list to store heating rates
+                thr_3d = []
+                for i in range(len(simpath)):
+                    print('****loading ascii file:', sims[i])
+                    #---------------------------------------
+                    hr = (pd.read_table(simpath[i]+'output/'+sims[i]+'.out.abs.spc', header=None, sep=r'\s+', usecols=[4]).values)[:,0]
+                    #print('****creating 3d structure for file:', sims[i])
+                    temp = np.zeros((nx, ny, nz))
+                    for iz in range(nz):  # separate index: do not shadow the outer loop variable
+                        temp[:,:,iz] = np.reshape(hr[nx*ny*iz:nx*ny*(iz+1)], (nx,ny))
+                    thr_3d.append(temp*1.4)  # * cp/cv factor in ICON
+
+                # loading clear sky heating rates
+                cshr = []
+                print('****loading clear-sky hr')
+                for i in range (len(sims2)):
+                    cshr_t = (pd.read_table(path+source+'_clear_sky/output/'+sims2[i].replace('isim'+nsim,'cs_hr')+'.out', header=None, sep=r'\s+', usecols=[1]).values)[:,0]
+                    tmp = cshr_t*1.4 # x cp/cv factor in ICON
+                    # broadcasting to 3D structure
+                    tmp2 = np.broadcast_to(tmp,(nx,ny,nz)).copy()
+                    cshr.append(tmp2)
+
+                print('****create dataset')
+                # creating datasets
+                z1d    = (pd.read_table(path+source+'_clear_sky/output/'+sims2[i].replace('isim'+nsim,'cs_hr')+'.out', header=None, sep=r'\s+', usecols=[0]).values)[:,0]
+
+                ds = xr.Dataset({
+                'half_levels': xr.DataArray(z1d, dims=('height')),
+                'ddt_dom1': xr.DataArray(thr_3d[0], dims=('lon','lat','height')),
+                'ddt_dom2': xr.DataArray(thr_3d[1], dims=('lon','lat','height')),
+                'ddt_dom3': xr.DataArray(thr_3d[2], dims=('lon','lat','height')),
+                'ddt_dom4': xr.DataArray(thr_3d[3], dims=('lon','lat','height')),
+                'ddt_dom5': xr.DataArray(thr_3d[4], dims=('lon','lat','height')),
+                'ddt_dom6': xr.DataArray(thr_3d[5], dims=('lon','lat','height')),
+                'ddt_dom7': xr.DataArray(thr_3d[6], dims=('lon','lat','height')),
+                'ddt_dom8': xr.DataArray(thr_3d[7], dims=('lon','lat','height')),
+                'ddt_dom9': xr.DataArray(thr_3d[8], dims=('lon','lat','height')),
+                'ddt_dom10': xr.DataArray(thr_3d[9], dims=('lon','lat','height')),
+                'ddt_dom11': xr.DataArray(thr_3d[10], dims=('lon','lat','height')),
+                'ddt_dom12': xr.DataArray(thr_3d[11], dims=('lon','lat','height')),
+                'ddt_dom13': xr.DataArray(thr_3d[12], dims=('lon','lat','height')),
+                'ddt_dom14': xr.DataArray(thr_3d[13], dims=('lon','lat','height')),
+                'ddt_dom15': xr.DataArray(thr_3d[14], dims=('lon','lat','height')),
+                'ddt_dom16': xr.DataArray(thr_3d[15], dims=('lon','lat','height')),
+                'ddt_dom17': xr.DataArray(thr_3d[16], dims=('lon','lat','height')),
+                'ddt_dom18': xr.DataArray(thr_3d[17], dims=('lon','lat','height')),
+                'ddt_dom19': xr.DataArray(thr_3d[18], dims=('lon','lat','height')),
+                'ddt_dom20': xr.DataArray(thr_3d[19], dims=('lon','lat','height')),
+                'ddt_dom21': xr.DataArray(thr_3d[20], dims=('lon','lat','height')),
+                'ddt_dom22': xr.DataArray(thr_3d[21], dims=('lon','lat','height')),
+                'ddt_dom23': xr.DataArray(thr_3d[22], dims=('lon','lat','height')),
+                'ddt_dom24': xr.DataArray(thr_3d[23], dims=('lon','lat','height')),
+                'ddt_dom25': xr.DataArray(thr_3d[24], dims=('lon','lat','height')),
+                'ddt_dom26': xr.DataArray(thr_3d[25], dims=('lon','lat','height')),
+                'ddt_dom27': xr.DataArray(thr_3d[26], dims=('lon','lat','height')),
+                'ddt_dom28': xr.DataArray(thr_3d[27], dims=('lon','lat','height')),
+                'ddt_dom29': xr.DataArray(thr_3d[28], dims=('lon','lat','height')),
+                'ddt_dom30': xr.DataArray(thr_3d[29], dims=('lon','lat','height')),
+                'ddt_dom31': xr.DataArray(thr_3d[30], dims=('lon','lat','height')),
+                'ddt_dom32': xr.DataArray(thr_3d[31], dims=('lon','lat','height')),
+                'ddt_dom33': xr.DataArray(thr_3d[32], dims=('lon','lat','height')),
+                'ddt_dom34': xr.DataArray(thr_3d[33], dims=('lon','lat','height')),
+                'ddt_dom35': xr.DataArray(thr_3d[34], dims=('lon','lat','height')),
+                'ddt_dom36': xr.DataArray(thr_3d[35], dims=('lon','lat','height')),
+
+                **{f'ddt_cs_dom{i+1}': xr.DataArray(cshr[i], dims=('lon','lat','height'))
+                   for i in range(36)},
+
+                },
+                coords={"lat": (["lat"], np.arange(0,ny)),
+                                "lon": (["lon"], np.arange(0,nx)),
+                                'height':(["height"],np.arange(0,nz)),})
+
+                # let's merge them together here
+                print('****merging subdomains into the big domain')
+                ds_j = []
+                ds_j2 = []
+                for j in range(1,37,6):
+                    ds_i = xr.concat([ds[f'ddt_dom{j+m}'] for m in range(6)], dim='lon')
+                    ds_j.append(ds_i)
+
+                    ds_i2 = xr.concat([ds[f'ddt_cs_dom{j+m}'] for m in range(6)], dim='lon')
+                    ds_j2.append(ds_i2)
+                # concat along y dim
+                ds_f1 = xr.concat(ds_j, dim='lat')
+                ds_f2 = xr.concat(ds_j2, dim='lat')
+                ds_1 = xr.merge([ds_f1,ds_f2])
+                ds_1['ddt_radlw'] = ds_1['ddt_dom1'] - ds_1['ddt_cs_dom1']
+                ds_1.coords['lon'] = np.arange(0,41*6)
+                ds_1.coords['lat'] = np.arange(0,33*6)
+                ds_1 = ds_1.expand_dims('time')
+                print('****save')
+                ds_1.to_netcdf('/work/bb1135/icon_output/'+simm+'/input4libradtran/libradtran/output_netcdf/ds_librad_'+cldt+'_'+time+'_'+source+'_'+solver+'_'+nsim+'.nc')
+                print('----------------------------------------------------------------------------')
+
+print('finished')
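The build-then-merge pattern used above (36 per-subdomain variables concatenated along `lon` within each row and then along `lat` across rows) can be sketched compactly. This toy example uses an illustrative 2x2 tiling of 2x2x1 subdomains instead of the 6x6 tiling of the real scripts; all names and sizes are for illustration only:

```python
import numpy as np
import xarray as xr

nxs, nys, nz = 2, 2, 1          # toy subdomain size (illustrative)
nrow = ncol = 2                 # toy 2x2 tiling instead of the 6x6 above
sub = [np.full((nxs, nys, nz), i, dtype=float) for i in range(nrow * ncol)]

# build all per-subdomain variables with a dict comprehension
ds = xr.Dataset(
    {f'ddt_dom{i+1}': xr.DataArray(sub[i], dims=('lon', 'lat', 'height'))
     for i in range(nrow * ncol)})

# concat along lon within each row, then along lat across rows
rows = [xr.concat([ds[f'ddt_dom{r*ncol + c + 1}'] for c in range(ncol)], dim='lon')
        for r in range(nrow)]
full = xr.concat(rows, dim='lat')

# re-index the merged coordinates, as done after the merge in the scripts
full = full.assign_coords(lon=np.arange(nxs * ncol), lat=np.arange(nys * nrow))
```

The same two-stage `xr.concat` reproduces the scripts' row-wise then column-wise stitching of the full domain.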
diff --git a/offlineRT/convert_libradtran_data_to_netcdf_mystic.py b/offlineRT/convert_libradtran_data_to_netcdf_mystic.py
new file mode 100644
index 0000000000000000000000000000000000000000..2fb892b147ff4d00e26f90f8406e8afc0f242404
--- /dev/null
+++ b/offlineRT/convert_libradtran_data_to_netcdf_mystic.py
@@ -0,0 +1,204 @@
+# Behrooz Keshtgar, KIT
+
+# loading modules
+
+import xarray as xr
+import numpy as np
+import pandas as pd
+import warnings
+warnings.filterwarnings("ignore")
+
+
+print('start converting mystic/mcipa outputs to netcdf files')
+# Dictionary for loading simulations
+simdict = {
+         'LC1-LES-471x667km-lon25-lat40-300m-0006' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+         'LC1-LES-471x667km-lon30-lat53-300m-0005' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+         'LC1-LES-471x667km-lon40-lat44-300m-0003' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+         'LC1-LES-471x667km-lon50-lat48-300m-0004' : {'res':'300m', 'radiation':4, 'mphy':4}, # Only cloud radiation
+          }
+
+# here change the simulation path
+simm = list(simdict.keys())[0]
+path = '/work/bb1135/icon_output/'+simm+'/input4libradtran/libradtran/'
+#---------------------------------------
+# number of model layers
+height = np.arange(0,150,1)
+nz = 150
+nx = 420 
+ny = 343 
+# cutting boundaries
+nx2 = 344
+ny2 = 281
+
+# solver name for clear-sky hr
+solvers_name=['mystic']
+k = 0
+#---------------------------------------
+# loop over time steps
+for time in ['05T1000','05T1030','05T1100','05T1130','05T1200','05T1230','05T1300','05T1330','05T1400']:
+    # loop over solvers
+    for solver in ['mysti','mcipa']:
+        # loop over sources (thermal/solar)
+        for source in ['thermal','solar']:
+            print('working on datasets for:', time+'/'+solver+'/'+source)
+            # loop over subdomains
+            thr1_3d=[]
+            # list holding names of the files for clear-sky results
+            sims2=[]
+            for dom in range(1,37): 
+                # list containing paths to the results
+                simpath = []
+                # list holding names of the files
+                sims=[]
+                
+                #----------
+                siim = solvers_name[k]+'_'+time+'_'+str(dom)+'_isim01'
+                sims2.append(siim)
+                #----------
+                # loop over number of simulations
+                for nsim in ['01','02','03','04','05','06','07','08','09','10']:
+                    sim = solver+'_'+time+'_'+str(dom)+'_isim'+nsim
+                    simpath.append(path+'c_cluster_'+source+'_mystic/'+sim+'/')
+                    sims.append(sim)
+          
+                # empty list to store heating rates
+                thr_3d = []
+                for i in range(len(simpath)):
+                    print('****loading ascii file:', sims[i])
+                    hr = pd.read_table(simpath[i]+'output/'+sims[i]+'.out.abs.spc',
+                                       header=None, sep=r'\s+', usecols=[4]).values[:, 0]
+                    thr_3d.append(hr)
+                # change to numpy array and take the mean over the 10 simulations
+                tmpp = np.array(thr_3d)
+                temp = tmpp.mean(axis=0)
+                #print('****creating 3d structure')
+                temp1 = np.zeros((nx, ny, nz))
+                for i in range(nz):
+                    temp1[:,:,i] = np.reshape(temp[nx*ny*i:nx*ny*i+nx*ny],(nx,ny))
+                # cutting boundaries
+                temp2 = temp1[38:420-38,31:343-31,:]    
+                thr1_3d.append(temp2*1.4) #  *cp/cv factor in ICON
+                del tmpp
+
+            # loading clear sky heating rates
+            cshr = []
+            print('****loading clear-sky hr')
+            for i in range (len(sims2)):
+                cshr_t = pd.read_table(path+source+'_clear_sky/output/'+sims2[i].replace('isim01','cs_hr')+'.out',
+                                       header=None, sep=r'\s+', usecols=[1]).values[:, 0]
+                tmp = cshr_t*1.4 # x cp/cv factor in ICON 
+                # broadcasting to 3D structure
+                tmp2 = np.broadcast_to(tmp,(nx2,ny2,nz)).copy()
+                cshr.append(tmp2)
+                
+            print('****create dataset')
+            # creating datasets
+
+            ds = xr.Dataset({
+            #'half_levels': xr.DataArray(z1d, dims=('height')),
+            **{f'ddt_dom{i+1}': xr.DataArray(thr1_3d[i], dims=('lon','lat','height'))
+               for i in range(36)},
+
+            **{f'ddt_cs_dom{i+1}': xr.DataArray(cshr[i], dims=('lon','lat','height'))
+               for i in range(36)},
+
+            },
+            coords={"lat": (["lat"], np.arange(0,ny2)), 
+                            "lon": (["lon"], np.arange(0,nx2)),
+                            'height':(["height"],np.arange(0,nz)),})
+
+            # let's merge them together here
+            print('****merging subdomains into the big domain')
+            ds_j = []
+            ds_j2 = []
+            # concat along x dim
+            for j in range(1,37,6):
+                ds_i = xr.concat([ds[f'ddt_dom{j+m}'] for m in range(6)], dim='lon')
+                ds_j.append(ds_i)
+
+                ds_i2 = xr.concat([ds[f'ddt_cs_dom{j+m}'] for m in range(6)], dim='lon')
+                ds_j2.append(ds_i2)
+            # concat along y dim    
+            ds_f1 = xr.concat(ds_j, dim='lat')
+            ds_f2 = xr.concat(ds_j2, dim='lat')
+            ds_1 = xr.merge([ds_f1,ds_f2])
+            ds_1['ddt_radlw'] = ds_1['ddt_dom1'] - ds_1['ddt_cs_dom1']
+            ds_1.coords['lon'] = np.arange(0,344*6)
+            ds_1.coords['lat'] = np.arange(0,281*6)
+            ds_1 = ds_1.expand_dims('time')
+            print('****save')
+            ds_1.to_netcdf('/work/bb1135/icon_output/'+simm+'/input4libradtran/libradtran/output_netcdf/ds_librad_'+time+'_'+source+'_'+solver+'_01.nc')
+            print('----------------------------------------------------------------------------')
+
+print('finished')            
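The per-level loop above that fills `temp1[:,:,i]` from flat slices of the ASCII column is equivalent to a single reshape plus transpose. A small self-contained sketch (grid sizes are illustrative, not the real `nx, ny, nz`):

```python
import numpy as np

nx, ny, nz = 3, 2, 4                          # toy grid sizes (illustrative)
flat = np.arange(nx * ny * nz, dtype=float)   # stand-in for the ASCII column

# one reshape + transpose: (height, lon, lat) blocks -> (lon, lat, height)
cube = flat.reshape(nz, nx, ny).transpose(1, 2, 0)

# cross-check against the explicit per-level loop used in the script
ref = np.zeros((nx, ny, nz))
for i in range(nz):
    ref[:, :, i] = flat[nx*ny*i : nx*ny*(i+1)].reshape(nx, ny)
assert np.array_equal(cube, ref)
```

The vectorized form avoids the Python-level loop and makes the assumed layout of the file (level-major, then `lon`, then `lat`) explicit in one line.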
diff --git a/offlineRT/input_for_libradtran.ipynb b/offlineRT/input_for_libradtran.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..93de53b01e9b96f339482f4f7deffb6b3b250cbe
--- /dev/null
+++ b/offlineRT/input_for_libradtran.ipynb
@@ -0,0 +1,1114 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "   ## I C O N 2 M Y S T I C\n",
+    "   \n",
+    "   ## Generating input files for LibRadTran from ICON-LEM output files\n",
+    "   \n",
+    "   Original code by Leonard Scheck (LMU), 2021.1\n",
+    "   \n",
+    "   Modified by Behrooz Keshtgar (KIT), 2023:\n",
+    "   \n",
+    "   + Convert the standalone Python script into an interactive Jupyter notebook and include only the necessary functions.\n",
+    "   + Generate atmospheric background profiles from ICON-LEM output files\n",
+    "   + Generate NWP homogeneous grid-box clouds and homogeneous clouds with cloud fraction from LEM clouds at the specified resolution\n",
+    "   + Derive atmospheric gas concentration profiles following the routines implemented in the ICON model\n",
+    "   + Remap ICON output from triangular grid to regular lat/lon grid and save them as Netcdf files\n",
+    "\n",
+    "----------------------------------------------------------   \n",
+    "Required inputs to run this Jupyter notebook are \n",
+    "\n",
+    "+ ICON grid file\n",
+    "+ ICON output files including cloud water and ice mass content and their effective radii (tot_qc_dia,tot_qi_dia,reff_qc,reff_qi)\n",
+    "+ ICON output files including atmospheric variables: temperature, pressure, density, ozone density, specific humidity, height and pressure at half levels (temp,pres,rho,O3,qv,z_ifc,pres_ifc)\n",
+    "+ Other variables from ICON simulation including solar zenith angle (sza) "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Libraries"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import os, sys, time, pickle, netCDF4\n",
+    "from time import perf_counter\n",
+    "from matplotlib import pyplot as plt\n",
+    "import numpy as np\n",
+    "from numba import jit, float64, float32, int32\n",
+    "import xarray as xr\n",
+    "import os\n",
+    "from scipy import interpolate\n",
+    "import pandas as pd\n",
+    "import matplotlib as mpl\n",
+    "import math"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Constants"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# these values are extracted from ICON, gas concentrations values are those used in the baroclinic life cycle simulation\n",
+    "vpp_ch4=np.array([1.25e-01,  683.0, -1.43])\n",
+    "vpp_n2o=np.array([1.20e-02, 1395.0, -1.43])\n",
+    "\n",
+    "vmr_ch4 = 1650.0e-09\n",
+    "vmr_n2o = 396.0e-09\n",
+    "\n",
+    "vmr_o2  = 0.20946  # Volume Mixing Ratio (mol mol–1), mol/mol *1e6 -> ppm\n",
+    "vmr_co2 = 348.0e-6 # 0.000348 mol/mol : (348 ppmv)\n",
+    "\n",
+    "# we need to change their unit to cm**-3\n",
+    "# I used this converter https://www.lenntech.com/calculators/ppm/converter-parts-per-million.htm to convert them\n",
+    "o2  = 2.95*1e-1  # kg/m3\n",
+    "co2 = 647*1e-6   # kg/m3\n",
+    "\n",
+    "R = 287.04 # J⋅kg−1⋅K−1 # gas constant of dry air from ICON\n",
+    "k = 1.38064852*1e-23 # m2 kg s-2 K-1 # Boltzmann constant\n",
+    "\n",
+    "''' # From ICON-ecRad\n",
+    "! Molar masses (g mol-1) of dry air and the various gases above\n",
+    "  real(jprb), parameter :: IAirMolarMass = 28.970\n",
+    "  real(jprb), parameter, dimension(0:NMaxGases) :: IGasMolarMass = (/ &\n",
+    "       & 0.0_jprb,        & ! Gas not present\n",
+    "       & 18.0152833_jprb, & ! H2O\n",
+    "       & 44.011_jprb,     & ! CO2\n",
+    "       & 47.9982_jprb,    & ! O3\n",
+    "       & 44.013_jprb,     & ! N2O\n",
+    "       & 28.0101_jprb,    & ! CO\n",
+    "       & 16.043_jprb,     & ! CH4\n",
+    "       & 31.9988_jprb,    & ! O2\n",
+    "       & 137.3686_jprb,   & ! CFC11\n",
+    "       & 120.914_jprb,    & ! CFC12\n",
+    "       & 86.469_jprb,     & ! HCFC22\n",
+    "       & 153.823_jprb,    & ! CCl4\n",
+    "       & 46.0055_jprb /)    ! NO2\n",
+    "'''       \n",
+    "avo   = 6.02214179e23 # !> [1/mo]    Avogadro constant\n",
+    "\n",
+    "m_d   = 4.810580854822417e-26  # molecular mass of dry air kg\n",
+    "m_o3  = 7.970287262200115e-26  # molecular mass of ozone kg\n",
+    "m_h2o = 2.991508172585458e-26  # molecular mass of h2o kg\n",
+    "m_co2 = 7.308197238577473e-26  # molecular mass of co2 kg\n",
+    "m_o2  = 5.3135241466744e-26    # molecular mass of o2 kg\n",
+    "\n",
+    "# density of water and ice from ICON\n",
+    "rhoh2o =  1000  # kg/m³\n",
+    "rhoice =  916.7 # kg/m³"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Functions"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# ICON routine for interpolating a variable from full level to half level\n",
+    "def interpolate_hl(var1,var2,var3):\n",
+    "    var_hl = var3*0.0\n",
+    "    for jk in range(1,150):\n",
+    "        var_hl[jk,:] = (var1[jk-1,:] * var2[jk-1,:]  * ( var2[jk,:] - var3[jk,:] ) + \n",
+    "                   var1[jk,:] * var2[jk,:] * ( var3[jk,:] - var2[jk-1,:])) / ( var3[jk,:] * (var2[jk,:] - var2[jk-1,:] ) )\n",
+    "        \n",
+    "    var_hl[150,:] = var1[149,:] + (var3[150,:] - var2[149,:]) * (var1[149-1,:] - var1[149,:])/(var2[149-1,:] - var2[149,:])\n",
+    "    \n",
+    "    var_hl[0,:] = var1[0,:] + ( var3[0,:] - var2[0,:] ) * (var1[0,:] - var_hl[1,:]) / (var2[0,:] - var3[1,:] )\n",
+    "        \n",
+    "    return var_hl\n",
+    "\n",
+    "# ICON routine for deriving gas profile according to a tanh profile\n",
+    "def gas_profile(vmr_gas,pres,xp):\n",
+    "    gas_pro = pres*0.0\n",
+    "    zx_m = (vmr_gas+xp[0]*vmr_gas)*0.5\n",
+    "    zx_d = (vmr_gas-xp[0]*vmr_gas)*0.5\n",
+    "    \n",
+    "    gas_pro = (1.-(zx_d/zx_m)*np.tanh(np.log(pres/xp[1]) /xp[2])) * zx_m\n",
+    "    \n",
+    "    return gas_pro\n",
+    "#-----------------------------------------------------------------------------------------------------------\n",
+    "# function for writing ascii files without cloud fraction\n",
+    "def write_mc_cloud_file( beta, reff, dx, dy, zcoord, fname='wc3d.dat', flag=2 ) :\n",
+    "    \"\"\"Write 3D extinction coefficient + effective radius file in the libradtran 3D cloud file format\"\"\"\n",
+    "\n",
+    "    print('write_mc_cloud_file', fname, beta.max(), reff.max(), beta.shape, dx, dy, zcoord[1]-zcoord[0])\n",
+    "\n",
+    "    # flag = 2 --> beta [1/km],          reff [micron] are supplied for each cell\n",
+    "    # flag = 3 --> water content [g/m3], reff [micron] are supplied for each cell\n",
+    "    \n",
+    "    a = np.swapaxes(beta, 0, 1)\n",
+    "    b = np.swapaxes(reff, 0, 1)    \n",
+    "    nx, ny, nz = a.shape #lon,lat,lev\n",
+    "    with open(fname,'w') as f:\n",
+    "        f.write(\"%d %d %d %d\\n\" % (nx,ny,nz,flag))\n",
+    "        f.write(\"%f %f \" % (dx/1e3,dy/1e3))\n",
+    "        f.write( ' '.join([ '%f'%z for z in zcoord/1e3 ]) + '\\n' )\n",
+    "        \n",
+    "        ## For radiative transfer with MYSTIC, it would be okay to skip the points in the domain where there is no cloud.\n",
+    "        \n",
+    "        #f.writelines( [ (\"%d %d %d %f %f\\n\" % (i+1,j+1,k+1,a[i,j,k],b[i,j,k]) if beta[j,i,k] > 1e-9 else '') \\\n",
+    "        #                for i in range(nx) for j in range(ny) for k in range(nz) ] )\n",
+    "        \n",
+    "        #f.writelines( [ (\"%d %d %d %e %d\\n\" % (1,1,k+1,1e-20,20)) \\\n",
+    "        #                for k in range(nz) ] )\n",
+    "        \n",
+    "        ## Keeping everything  \n",
+    "        f.writelines( [ (\"%d %d %d %f %f\\n\" % (i+1,j+1,k+1,a[i,j,k],b[i,j,k])) \\\n",
+    "                        for i in range(nx) for j in range(ny) for k in range(nz) ] )\n",
+    "\n",
+    "# function for writing ascii files with cloud fraction (Behrooz Keshtgar, KIT)       \n",
+    "def write_mc_cloud_file_frac(beta, reff, frac, dx, dy, zcoord, fname='wc3d.dat', flag=2 ) : \n",
+    "    \"\"\"Write 3D extinction coefficient + effective radius + cloud fraction file in the libradtran 3D cloud file format\"\"\"\n",
+    "\n",
+    "    print('write_mc_cloud_file_frac')\n",
+    "\n",
+    "    a = np.swapaxes(beta, 0, 1)\n",
+    "    b = np.swapaxes(reff, 0, 1)\n",
+    "    c = np.swapaxes(frac, 0, 1)\n",
+    "    nx, ny, nz = a.shape #lon,lat,lev\n",
+    "    with open(fname,'w') as f:\n",
+    "        f.write(\"%d %d %d %d\\n\" % (nx,ny,nz,flag))\n",
+    "        f.write(\"%f %f \" % (dx/1e3,dy/1e3))\n",
+    "        f.write( ' '.join([ '%f'%z for z in zcoord/1e3 ]) + '\\n' )\n",
+    "        \n",
+    "        f.writelines( [ (\"%d %d %d %f %f %f\\n\" % (i+1,j+1,k+1,a[i,j,k],b[i,j,k],c[i,j,k])) \\\n",
+    "                        for i in range(nx) for j in range(ny) for k in range(nz) ] )\n",
+    "        \n",
+    "def get_subdomain( lat_min, lat_max, lon_min, lon_max, gridfile, datafile1, datafile2, datafile3, datafile4) :\n",
+    "    \"\"\"\n",
+    "    Return model variables and grid for the cells with cell centers in the specified lat-lon rectangle\n",
+    "    :param lat_min:  minimum latitude  [rad]\n",
+    "    :param lat_max:  maximum latitude  [rad]\n",
+    "    :param lon_min:  minimum longitude [rad]\n",
+    "    :param lon_max:  maximum longitude [rad]\n",
+    "    :param gridfile:  opened ICON grid file (netCDF4 Dataset)\n",
+    "    :param datafile1..datafile4: opened ICON output files (netCDF4 Dataset)\n",
+    "    :return:         grid dictionary, model variables dictionary\n",
+    "    \"\"\"\n",
+    "\n",
+    "    # determine subdomain grid .........................................................................................\n",
+    "\n",
+    "    print('    [icon_subdomain.get_subdomain_data] opening horizontal grid file...')\n",
+    "    #grid_full = netCDF4.Dataset( args.gridfile,'r')\n",
+    "    grid_full = gridfile\n",
+    "    #if args.verbose :\n",
+    "    print('    full model grid : %f < lon <%f, %f < lat < %f' % (\n",
+    "    np.array(grid_full.variables['vlon']).min()*180/np.pi, np.array(grid_full.variables['vlon']).max()*180/np.pi,\n",
+    "    np.array(grid_full.variables['vlat']).min()*180/np.pi, np.array(grid_full.variables['vlat']).max()*180/np.pi ))\n",
+    "\n",
+    "    print('*** constructing subdomain grid'); starttime = perf_counter()\n",
+    "\n",
+    "    nvertices_full = len(grid_full.dimensions['vertex'])\n",
+    "    ncells_full    = len(grid_full.dimensions['cell'])\n",
+    "\n",
+    "    cell_indices             = np.zeros( ncells_full,    dtype=np.int32 ) - 1\n",
+    "    vertex_indices           = np.zeros( nvertices_full, dtype=np.int32 ) - 1\n",
+    "    translate_cell_indices   = np.zeros( ncells_full,    dtype=np.int32 ) - 1\n",
+    "    translate_vertex_indices = np.zeros( nvertices_full, dtype=np.int32 ) - 1\n",
+    "\n",
+    "    print('    determining subdomain indices for %f <= lat < %f, %f <= lon < %f' % (lat_min*180/np.pi, lat_max*180/np.pi, lon_min*180/np.pi, lon_max*180/np.pi))\n",
+    "\n",
+    "    clon                = np.array(grid_full.variables['clon'])\n",
+    "    clat                = np.array(grid_full.variables['clat'])\n",
+    "    vlon                = np.array(grid_full.variables['vlon'])\n",
+    "    vlat                = np.array(grid_full.variables['vlat'])\n",
+    "    cell_area           = np.array(grid_full.variables['cell_area'])\n",
+    "    vertex_of_cell      = np.array(grid_full.variables['vertex_of_cell'])      - 1\n",
+    "    neighbor_cell_index = np.array(grid_full.variables['neighbor_cell_index']) - 1\n",
+    "    cells_of_vertex      = np.array(grid_full.variables['cells_of_vertex'])      - 1\n",
+    "    ncells, nvertices = subdomain_indices( clon, clat, vlon, vlat, vertex_of_cell,\n",
+    "                                           lat_min, lat_max, lon_min, lon_max,\n",
+    "                                           cell_indices, vertex_indices, translate_cell_indices, translate_vertex_indices )\n",
+    "    print('    subdomain contains %d of %d cells and %d of %d vertices' % ( ncells, ncells_full, nvertices, nvertices_full ))\n",
+    "    print('    sqrt(area) of first cell : {:.0f}m'.format(np.sqrt(cell_area[0])))\n",
+    "\n",
+    "    print('    converting horizontal grid...')\n",
+    "    grid = dict()\n",
+    "    grid['ncells']    = ncells\n",
+    "    grid['nvertices'] = nvertices\n",
+    "    grid['clat'] = clat[cell_indices[:ncells]]\n",
+    "    grid['clon'] = clon[cell_indices[:ncells]]\n",
+    "    grid['cell_area'] = cell_area[cell_indices[:ncells]]\n",
+    "    grid['vlat'] = vlat[vertex_indices[:nvertices]]\n",
+    "    grid['vlon'] = vlon[vertex_indices[:nvertices]]\n",
+    "    grid['neighbor_cell_index'] = translate_cell_indices[ neighbor_cell_index[:,cell_indices[:ncells]] ]\n",
+    "    grid['vertex_of_cell']      = translate_vertex_indices[ vertex_of_cell[:,cell_indices[:ncells]] ]\n",
+    "    grid['cells_of_vertex']      = translate_cell_indices[ cells_of_vertex[:,vertex_indices[:nvertices]] ]\n",
+    "    grid['full_grid_cell_indices'] = cell_indices[:ncells]\n",
+    "    print('*** subdomain grid construction took %f seconds' % (perf_counter() - starttime))\n",
+    "     \n",
+    "    ########################################################################################\n",
+    "    #modelstate1 = datafile1\n",
+    "    #modelstate2 = datafile2\n",
+    "    #print('    ...which is a NetCDF file...')\n",
+    "    #modelvars_available = list(modelstate1.variables.keys())\n",
+    "    #modelvars_available = modelvars_available+list(modelstate2.variables.keys()) #behrooz\n",
+    "    #nz_full = len(modelstate1.dimensions['height'])\n",
+    "    #ftype='netcdf'\n",
+    "    \n",
+    "    #########################################################################################\n",
+    "    num_file = [datafile1,datafile2,datafile3,datafile4]\n",
+    "    print('    opening model state files')\n",
+    "    if len(num_file) > 1 : # several model state files have been specified\n",
+    "        print('    ...which are actually several files...')\n",
+    "        #modelvarfiles = args.modelvarsfile.split(':')\n",
+    "        modelstates = []\n",
+    "        modelstates_variables = []\n",
+    "        modelvars_available = []\n",
+    "        for i in range(len(num_file)) :\n",
+    "            print('    --- opening file ', i, ' = ')\n",
+    "            modelstates.append(num_file[i])#(netCDF4.Dataset( mvf, 'r'))\n",
+    "            modelstates_variables.append(list(modelstates[-1].variables.keys()))\n",
+    "            print('                 which contains', modelstates_variables[-1])\n",
+    "            modelvars_available += modelstates_variables[-1]\n",
+    "\n",
+    "            if 'height' in modelstates[-1].dimensions :\n",
+    "                print('                 and has a dimension height of size', end=' ')\n",
+    "                nz_full = len(modelstates[-1].dimensions['height'])\n",
+    "                print(nz_full)\n",
+    "        modelstate = modelstates[0]\n",
+    "    #else :\n",
+    "    #    modelstate = netCDF4.Dataset( args.modelvarsfile, 'r')\n",
+    "    #    print('    ...which is a NetCDF file...')\n",
+    "    #    modelvars_available = list(modelstate.variables.keys())\n",
+    "    #    nz_full = len(modelstate.dimensions['height'])\n",
+    "    ftype='netcdf'\n",
+    "    \n",
+    "    ########\n",
+    "    print('    available variables : ', modelvars_available)\n",
+    "    print('    number of layers    : ', nz_full)\n",
+    "    #modelstates = None\n",
+    "    if modelstates is None : # only one model state file was specified\n",
+    "        modelstates = [ modelstate ]\n",
+    "        modelstates_variables = [ modelvars_available ]\n",
+    "\n",
+    "    # list available output times ......................................................................................\n",
+    "\n",
+    "    if ftype == 'netcdf' :\n",
+    "        outputtimes = modelstate.variables['time'][:]\n",
+    "        print('    output times available :')\n",
+    "        for ti, ot in enumerate(outputtimes) :\n",
+    "            ot_date = int(ot)\n",
+    "            ot_time = ot-ot_date\n",
+    "            ot_hour = int(ot_time*24.0)\n",
+    "            ot_min  = int((ot_time*24.0 - ot_hour)*60.0)\n",
+    "            print('      (', ti, ') --- ',  ot_date, ot_hour, ot_min)\n",
+    "        print('    selected time index : ', 0)\n",
+    "    else :\n",
+    "        # the time index is hardcoded to 0 above; time indices > 0 are probably\n",
+    "        # not yet supported for grib files\n",
+    "        pass\n",
+    "\n",
+    "\n",
+    "    # read subdomain model variables ...................................................................................\n",
+    "    if grid['ncells'] <= 0 :\n",
+    "\n",
+    "        print('    subdomain does not contain any model grid cells... ', end=' ')\n",
+    "        modelvars = {}\n",
+    "\n",
+    "    else :\n",
+    "\n",
+    "        print('    extracting model data... ', end=' ')\n",
+    "\n",
+    "        # dimension check ..............................................................................................\n",
+    "\n",
+    "        if ftype == 'netcdf' :\n",
+    "            ncells_full_model    = len(modelstate.dimensions['ncells'])\n",
+    "            if ncells_full_model != ncells_full :\n",
+    "                print('ERROR: ncells mismatch between grid and model data', ncells_full_model, ncells_full)\n",
+    "        else :\n",
+    "            # we cannot check this easily for grib files -> assume everything is ok...\n",
+    "            ncells_full_model = ncells_full\n",
+    "\n",
+    "        # determine vertical part to be read ...........................................................................\n",
+    "\n",
+    "        nz = nz_full - 0\n",
+    "        #kl=0\n",
+    "        #kh=nz\n",
+    "        print('    using %d of %d layers... ' % ( nz, nz_full ))   \n",
+    "        \n",
+    "        # read variables ...............................................................................................\n",
+    "\n",
+    "        read_in_chunks = True # is faster...\n",
+    "\n",
+    "        modelvars = dict()\n",
+    "        # generate_mystic_files below also needs temp, pres, o3, tot_qv_dia, pres_ifc,\n",
+    "        # clc, cosz_bz and the ddt_temp_rad* tendencies, so read the full variable list\n",
+    "        for v in ['z_ifc','tot_qc_dia','tot_qi_dia','rho','reff_qc_ecrad','reff_qi_ecrad','temp','pres','o3','tot_qv_dia',\n",
+    "                  'ddt_temp_radlwnw','ddt_temp_radlwcs','ddt_temp_radswnw','ddt_temp_radswcs','pres_ifc','clc','cosz_bz','tsfctrad'] :\n",
+    "            if '.' in v :\n",
+    "                vfile, vname = v.split('.')\n",
+    "                print('        - reading [{}->{}]'.format(vfile,vname), end=' ')\n",
+    "            else :\n",
+    "                vname = v\n",
+    "                vfile = v\n",
+    "                print('        - reading [{}]'.format(vname), end=' ')\n",
+    "\n",
+    "\n",
+    "            kl = 0\n",
+    "            if vname != 'z_ifc' :\n",
+    "                kh = kl + nz\n",
+    "            else :\n",
+    "                kh = kl + nz + 1\n",
+    "\n",
+    "            for istate in range(len(modelstates)) :\n",
+    "                if vfile in modelstates_variables[istate] or ftype == 'grib':\n",
+    "                    if len(modelstates) > 1 :\n",
+    "                        print('<file {}>'.format(istate), end=' ')\n",
+    "                    if ftype == 'netcdf' :\n",
+    "                        if read_in_chunks :\n",
+    "                            modelvars[vname] = read_part_of_variable( modelstates[istate], vfile, cell_indices[:ncells], time_index=0 )[:,...]\n",
+    "                        else :\n",
+    "                            modelvars[vname] = modelstates[istate].variables[vfile][0,:,cell_indices[:ncells]][:,...]\n",
+    "                    else :\n",
+    "                        if modelstates[istate][vfile].ndim > 1 :\n",
+    "                            modelvars[vname] = np.transpose( modelstates[istate][vfile][cell_indices[:ncells],:][...,:] )\n",
+    "                        else :\n",
+    "                            modelvars[vname] = np.transpose( modelstates[istate][vfile][cell_indices[:ncells]] )\n",
+    "\n",
+    "            if vname == 'clc' and modelvars[vname].max() > 1: # convert cloud cover from % to fraction\n",
+    "                modelvars[vname] /= 100.0\n",
+    "\n",
+    "            print(' with shape ', modelvars[vname].shape)\n",
+    "\n",
+    "            \n",
+    "    print('    get_subdomain: done. ')    \n",
+    "\n",
+    "    return grid, modelvars      \n",
+    "    \n",
+    "def read_part_of_variable( modelvars_full, vname, ci, check=False, time_index=None, verbose=False ) :\n",
+    "\n",
+    "    if len(modelvars_full.variables[vname].shape) == 3 :\n",
+    "        has_timedim = True\n",
+    "    else :\n",
+    "        has_timedim = False\n",
+    "\n",
+    "    nlevels = modelvars_full.variables[vname].shape[-2]\n",
+    "    ncells = ci.size\n",
+    "    ncells_full = modelvars_full.variables[vname].shape[-1]\n",
+    "    modelvar_part = np.zeros( (nlevels,ncells) ) # omit leading time dimension    \n",
+    "\n",
+    "    starttime = perf_counter()\n",
+    "\n",
+    "    nchunks = 50\n",
+    "    ncells_chunk = ncells_full // nchunks\n",
+    "    for ic in range(nchunks) :\n",
+    "        index_min = ic*ncells_chunk\n",
+    "        index_max = (ic+1)*ncells_chunk\n",
+    "        if ic == nchunks-1 : index_max = ncells_full\n",
+    "\n",
+    "        idcs = np.where( (ci >= index_min) & (ci < index_max) )\n",
+    "        nrelevant = len(idcs[0])\n",
+    "        if verbose :\n",
+    "            print('   --- chunk %d [ %d <= index < %d ] : %d relevant' % (ic, index_min, index_max, nrelevant))\n",
+    "        if nrelevant > 0 :\n",
+    "            if has_timedim :\n",
+    "                chunk = np.array(modelvars_full.variables[vname][time_index,:,index_min:index_max])\n",
+    "            else :\n",
+    "                chunk = np.array(modelvars_full.variables[vname][:,index_min:index_max])\n",
+    "            chunk_idcs = np.array(ci[idcs],dtype=int)-index_min\n",
+    "            modelvar_part[:,np.asarray(idcs,dtype=int)[0,:]] = chunk[:,chunk_idcs]\n",
+    "\n",
+    "    print('   --- reading and distributing chunks took %f seconds...' % (perf_counter() - starttime))\n",
+    "\n",
+    "    if check :\n",
+    "        print('ok, checking...')\n",
+    "        if has_timedim :\n",
+    "            fullvar = np.array(modelvars_full.variables[vname][time_index,:,:])\n",
+    "        else :\n",
+    "            fullvar = np.array(modelvars_full.variables[vname][:,:])\n",
+    "        redvar = fullvar[:,ci]\n",
+    "        print('DEVIATION ', (modelvar_part-redvar).min(), (modelvar_part-redvar).max())\n",
+    "        fullvar = ''\n",
+    "        redvar = ''\n",
+    "\n",
+    "    return modelvar_part    \n",
+    "\n",
+    "@jit(nopython=True,nogil=True)\n",
+    "def subdomain_indices( clon, clat, vlon, vlat, vertex_of_cell, lat_min, lat_max, lon_min, lon_max,\n",
+    "                       cell_indices, vertex_indices, translate_cell_indices, translate_vertex_indices ) :\n",
+    "    \"\"\"Save the indices of the cells whose center lies within the given region\n",
+    "       and the indices of vertices forming these cells in cell_indices and vertex_indices.\n",
+    "       Returns number of cells and number of vertices.\"\"\"\n",
+    "\n",
+    "    icell = 0\n",
+    "    for i in range(clon.size) :\n",
+    "        if (clon[i] >= lon_min) and (clon[i] < lon_max) and (clat[i] >= lat_min) and (clat[i] < lat_max) :\n",
+    "          cell_indices[icell] = i\n",
+    "          translate_cell_indices[i] = icell\n",
+    "          icell += 1\n",
+    "          for ii in range(3) :\n",
+    "              if vertex_of_cell[ii,i] > -1 :\n",
+    "                  vertex_indices[ vertex_of_cell[ii,i] ] = 1 # mark as required\n",
+    "\n",
+    "    ivertex = 0\n",
+    "    for i in range(vlon.size) :\n",
+    "        if vertex_indices[i] > 0 :\n",
+    "            vertex_indices[ivertex] = i\n",
+    "            translate_vertex_indices[i] = ivertex\n",
+    "            ivertex += 1\n",
+    "\n",
+    "    return icell, ivertex\n",
+    "\n",
+    "def generate_latlon_grid( lat_min, lon_min, lat_max, lon_max, nlat, nlon, dim=1, first_dim='lon' ) :\n",
+    "    \"\"\"Generate regular latitude-longitude grid with the specified limits and resolution\"\"\"\n",
+    "\n",
+    "    r =    { 'lat_min':lat_min, 'lon_min':lon_min, 'lat_max':lat_max, 'lon_max':lon_max, 'nlat':nlat, 'nlon':nlon,\n",
+    "             'dlat':(lat_max-lat_min)/nlat, 'dlon':(lon_max-lon_min)/nlon,\n",
+    "             'lat':lat_min + (lat_max-lat_min)*np.arange(nlat+1)/float(nlat),\n",
+    "             'lon':lon_min + (lon_max-lon_min)*np.arange(nlon+1)/float(nlon) }\n",
+    "\n",
+    "    if dim == 2 : # create also coordinate 2d-fields\n",
+    "        if first_dim == 'lon' :\n",
+    "            lon2d, lat2d = np.meshgrid( r['lon'], r['lat'], sparse=False, indexing='ij' )\n",
+    "            # lon changes with first index, lat with second\n",
+    "            #print 'TESTLATLON lon2d 1 ', lon2d[0,0], lon2d[1,0], lon2d[0,1]\n",
+    "            #print 'TESTLATLON lat2d 1 ', lat2d[0,0], lat2d[1,0], lat2d[0,1]\n",
+    "        elif first_dim == 'lat' :\n",
+    "            lat2d, lon2d = np.meshgrid( r['lat'], r['lon'], sparse=False, indexing='ij' )\n",
+    "            # lat changes with first index, lon with second\n",
+    "            #print 'TESTLATLON lon2d 2 ', lon2d[0,0], lon2d[1,0], lon2d[0,1]\n",
+    "            #print 'TESTLATLON lat2d 2 ', lat2d[0,0], lat2d[1,0], lat2d[0,1]\n",
+    "        else :\n",
+    "            raise ValueError('generate_latlon_grid: I do not understand first_dim='+first_dim)\n",
+    "\n",
+    "        r.update({ 'lon2d':lon2d, 'lat2d':lat2d })\n",
+    "\n",
+    "    return r\n",
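+    "\n",
+    "# quick sanity check for generate_latlon_grid (illustrative values, safe to delete):\n",
+    "# a 10 x 20 deg box split into 5 x 10 cells has dlat = 2 deg and nlat+1 = 6 latitude edges\n",
+    "_g = generate_latlon_grid( 0.0, 0.0, 10.0, 20.0, 5, 10 )\n",
+    "assert _g['dlat'] == 2.0 and _g['lat'].size == 6\n",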
+    "\n",
+    "def tri2latlon( ll_grid, tri_grid, tri_var, method='fine', silent=False, nsearch=3, oversample=1 ) :\n",
+    "    \"\"\"\n",
+    "    Map variable defined on triangular grid onto regular lat-lon grid.\n",
+    "    The lat_min, lon_min values ll_grid correspond to the cell centers.\n",
+    "    \"\"\"\n",
+    "\n",
+    "    nlat, nlon = ll_grid['nlat'], ll_grid['nlon']\n",
+    "    latlon_var = np.zeros((nlat, nlon),dtype=np.float64)\n",
+    "        \n",
+    "    if oversample == 1 :\n",
+    "        latlon_hits = np.zeros((nlat, nlon),dtype=np.int32)\n",
+    "        misses = tri2latlon_fine( latlon_var, latlon_hits,\n",
+    "                                ll_grid['lat_min'], ll_grid['lon_min'], ll_grid['dlat'], ll_grid['dlon'],\n",
+    "                                tri_var.astype(np.float64), tri_grid['clat'].astype(np.float64), tri_grid['clon'].astype(np.float64),\n",
+    "                                tri_grid['vlat'].astype(np.float64), tri_grid['vlon'].astype(np.float64), tri_grid['vertex_of_cell'].astype(np.int32), nsearch )\n",
+    "        if not silent : print('misses in tri2latlon_fine (triangles -> quads) : ', misses)\n",
+    "    \n",
+    "    else :\n",
+    "        # map triangle data onto finer (factor oversample) grid, then coarsen to target resolution\n",
+    "        # (-> latmin and lonmin must be adjusted)\n",
+    "        latlon_hits =  np.zeros((nlat*oversample, nlon*oversample),dtype=np.int32)\n",
+    "        latlon_var_ref = np.zeros((nlat*oversample, nlon*oversample),dtype=np.float64)\n",
+    "\n",
+    "        misses = tri2latlon_fine( latlon_var_ref, latlon_hits,\n",
+    "                                ll_grid['lat_min'] - ll_grid['dlat']/2 + ll_grid['dlat']/(2*oversample),\n",
+    "                                ll_grid['lon_min'] - ll_grid['dlon']/2 + ll_grid['dlon']/(2*oversample),\n",
+    "                                ll_grid['dlat'] / oversample,\n",
+    "                                ll_grid['dlon'] / oversample,\n",
+    "                                tri_var.astype(np.float64), tri_grid['clat'].astype(np.float64), tri_grid['clon'].astype(np.float64),\n",
+    "                                tri_grid['vlat'].astype(np.float64), tri_grid['vlon'].astype(np.float64), tri_grid['vertex_of_cell'].astype(np.int32), nsearch )\n",
+    "        if not silent : print('misses in tri2latlon_fine (triangles -> quads) : ', misses)\n",
+    "\n",
+    "        # average over blocks of size oversample * oversample\n",
+    "        for i in range(oversample) :\n",
+    "            for j in range(oversample) :\n",
+    "                latlon_var += latlon_var_ref[i::oversample,j::oversample]\n",
+    "        latlon_var /= oversample**2\n",
+    "        latlon_var2 = coarsen_regular_2d_grid( latlon_var_ref, oversample )\n",
+    "        d = np.abs(latlon_var-latlon_var2)\n",
+    "        print('Coarsened results: ', d.max(), d.mean(), d.mean()/latlon_var.mean() )\n",
+    "\n",
+    "    return latlon_var\n",
+    "\n",
+    "@jit(nopython=True, nogil=True)\n",
+    "def det2d( ux, uy, vx, vy ) :\n",
+    "    return ux*vy - uy*vx\n",
+    "\n",
+    "@jit(nopython=True, nogil=True)\n",
+    "def point_in_triangle( lat, lon, vlat, vlon, include_edges=False ) :\n",
+    "\n",
+    "    # see http://mathworld.wolfram.com/TriangleInterior.html\n",
+    "    d12 = det2d( vlon[1]-vlon[0], vlat[1]-vlat[0], vlon[2]-vlon[0], vlat[2]-vlat[0] )\n",
+    "    if d12 != 0 :\n",
+    "        a   =   ( det2d( lon, lat, vlon[2]-vlon[0], vlat[2]-vlat[0] ) \\\n",
+    "                - det2d( vlon[0], vlat[0], vlon[2]-vlon[0], vlat[2]-vlat[0] ) ) / d12\n",
+    "        b   = - ( det2d( lon, lat, vlon[1]-vlon[0], vlat[1]-vlat[0] ) \\\n",
+    "                - det2d( vlon[0], vlat[0], vlon[1]-vlon[0], vlat[1]-vlat[0] ) ) / d12\n",
+    "    else :\n",
+    "        a = 0\n",
+    "        b = 0\n",
+    "\n",
+    "    inside = False\n",
+    "    if include_edges :\n",
+    "        if (a>=0) and (b>=0) and (a+b<=1) :\n",
+    "            inside = True\n",
+    "    else :\n",
+    "        if (a>0) and (b>0) and (a+b<1) :\n",
+    "            inside = True\n",
+    "    return inside\n",
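+    "\n",
+    "# minimal sanity check for point_in_triangle (illustrative unit triangle, safe to delete):\n",
+    "# (0.1, 0.1) lies inside the triangle with vertices (0,0), (1,0), (0,1); (1.0, 1.0) does not\n",
+    "assert point_in_triangle( 0.1, 0.1, np.array([0.,1.,0.]), np.array([0.,0.,1.]) )\n",
+    "assert not point_in_triangle( 1.0, 1.0, np.array([0.,1.,0.]), np.array([0.,0.,1.]) )\n",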
+    "\n",
+    "@jit('int32(         float64[:,:], int32[:,:],  float64, float64, float64, float64, float64[:], float64[:], float64[:], float64[:], float64[:], int32[:,:],     int32   )', nopython=True, nogil=True)\n",
+    "def tri2latlon_fine( latlon_var,   latlon_hits, lat_min, lon_min, dlat,    dlon,    tri_var,    clat,       clon,       vlat,       vlon,       vertex_of_cell, nsearch ) :\n",
+    "    \"\"\"Assume lat-lon grid is finer than unstructured grid so that each triangle contains at least one lat-lon quad\"\"\"\n",
+    "\n",
+    "    nlat, nlon = latlon_var.shape\n",
+    "    trilat = np.zeros(3)\n",
+    "    trilon = np.zeros(3)\n",
+    "\n",
+    "    # determine dimensions of first triangle\n",
+    "    cidx=0\n",
+    "    trilat[0] = vlat[vertex_of_cell[0,cidx]]\n",
+    "    trilat[1] = vlat[vertex_of_cell[1,cidx]]\n",
+    "    trilat[2] = vlat[vertex_of_cell[2,cidx]]\n",
+    "    trilon[0] = vlon[vertex_of_cell[0,cidx]]\n",
+    "    trilon[1] = vlon[vertex_of_cell[1,cidx]]\n",
+    "    trilon[2] = vlon[vertex_of_cell[2,cidx]]\n",
+    "    dlat_tri = trilat.max() - trilat.min()\n",
+    "    dlon_tri = trilon.max() - trilon.min()\n",
+    "\n",
+    "    # search area\n",
+    "    nsearch = int(np.maximum( dlat_tri/dlat, dlon_tri/dlon ))+1 # overrides the nsearch argument\n",
+    "\n",
+    "    for cidx in range(clon.size) :\n",
+    "\n",
+    "        # find cell containing triangle center\n",
+    "        ilat = int( np.round( (clat[cidx]-lat_min)/dlat ) )\n",
+    "        ilon = int( np.round( (clon[cidx]-lon_min)/dlon ) )\n",
+    "\n",
+    "        #if ilat < 0 or ilat >= nlat or ilon < 0 or ilon >= nlon :\n",
+    "        if ilat < -nsearch or ilat >= nlat+nsearch or ilon < -nsearch or ilon >= nlon+nsearch :\n",
+    "            continue\n",
+    "\n",
+    "        # get vertices of triangle\n",
+    "        trilat[0] = vlat[vertex_of_cell[0,cidx]]\n",
+    "        trilat[1] = vlat[vertex_of_cell[1,cidx]]\n",
+    "        trilat[2] = vlat[vertex_of_cell[2,cidx]]\n",
+    "        trilon[0] = vlon[vertex_of_cell[0,cidx]]\n",
+    "        trilon[1] = vlon[vertex_of_cell[1,cidx]]\n",
+    "        trilon[2] = vlon[vertex_of_cell[2,cidx]]\n",
+    "\n",
+    "        # identify neighbor cells with centers in the same triangle\n",
+    "        for i in range( np.maximum(ilat-nsearch,0), np.minimum(ilat+1+nsearch,nlat) ) :\n",
+    "            for j in range( np.maximum(ilon-nsearch,0), np.minimum(ilon+1+nsearch,nlon) ) :\n",
+    "                if point_in_triangle( i*dlat+lat_min, j*dlon+lon_min, trilat, trilon ) :\n",
+    "                    latlon_hits[ i, j ] += 1\n",
+    "                    latlon_var[  i, j ] += tri_var[cidx]\n",
+    "\n",
+    "    misses = 0\n",
+    "    for ilat in range(nlat) :\n",
+    "        for ilon in range(nlon) :\n",
+    "            if latlon_hits[ ilat, ilon ] == 0 :  # this should not be necessary, but it is...\n",
+    "                if (ilat > 0) and (ilat < nlat-1) and (ilon > 0) and (ilon < nlon-1) :\n",
+    "                    s = latlon_hits[ ilat-1:ilat+2, ilon-1:ilon+2 ].sum()\n",
+    "                    if s > 0 :\n",
+    "                        latlon_var[  ilat, ilon ] = (  latlon_hits[ ilat-1:ilat+2, ilon-1:ilon+2 ] \\\n",
+    "                                                     * latlon_var[ ilat-1:ilat+2, ilon-1:ilon+2 ]).sum() / s\n",
+    "                        misses += 1\n",
+    "\n",
+    "            elif latlon_hits[ ilat, ilon ] > 1 : # this also should not be necessary, but it is...\n",
+    "                latlon_var[  ilat, ilon ] /= latlon_hits[ ilat, ilon ] # average over multiple hits\n",
+    "                latlon_hits[ ilat, ilon ] = 1\n",
+    "\n",
+    "    return misses\n",
+    "\n",
+    "@jit(nopython=True,nogil=True)\n",
+    "def coarsen_regular_2d_grid( arr2d, cfac ) :\n",
+    "    \"\"\"Coarsen 2d array by factor cfac, i.e. average over blocks of size cfac * cfac\"\"\"\n",
+    "    nx, ny = arr2d.shape\n",
+    "    nxc, nyc = nx // cfac, ny // cfac\n",
+    "    arr2dc = np.zeros((nxc,nyc),dtype=arr2d.dtype)\n",
+    "    for ic in range(nxc) :\n",
+    "        for jc in range(nyc) :\n",
+    "            for i in range(cfac) :\n",
+    "                for j in range(cfac) :\n",
+    "                    arr2dc[ic,jc] += arr2d[ic*cfac+i,jc*cfac+j]\n",
+    "            arr2dc[ic,jc] /= cfac**2\n",
+    "    return arr2dc\n",
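+    "\n",
+    "# quick sanity check for coarsen_regular_2d_grid (illustrative values, safe to delete):\n",
+    "# a 4x4 array coarsened by a factor of 2 gives the 2x2 array of block means\n",
+    "_demo = coarsen_regular_2d_grid( np.arange(16.0).reshape(4,4), 2 )\n",
+    "assert np.allclose( _demo, [[2.5, 4.5], [10.5, 12.5]] )\n",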
+    "\n",
+    "##################################################################\n",
+    "#                        MAIN FUNCTION\n",
+    "###################################################################\n",
+    "\n",
+    "def generate_mystic_files(lat_cam,lon_cam,dx,dy,dx1,dy1,lat_marg,lon_marg,dlat,dlon,dlat_px,dlon_px,grid_o,data_1,data_2,data_3,data_4,z_max,t_h,dom,nx,ny):\n",
+    "    \n",
+    "    lat_min, lat_max, lon_min, lon_max = lat_cam - dlat/2, lat_cam + dlat/2, lon_cam - dlon/2, lon_cam + dlon/2\n",
+    "    print('    extracting model state in {}<lat<{}, {}<lon<{}'.format( lat_min, lat_max, lon_min, lon_max ))\n",
+    "\n",
+    "     \n",
+    "    grid, modelvars = get_subdomain( (lat_min - lat_marg) * np.pi/180, (lat_max + lat_marg) * np.pi/180,\n",
+    "                                     (lon_min - lon_marg) * np.pi/180, (lon_max + lon_marg) * np.pi/180,grid_o,data_1,data_2,data_3,data_4)\n",
+    "    \n",
+    "    # find camera column\n",
+    "    idx_cam = np.argmin( (grid['clat']*180.0/np.pi - lat_cam)**2 + (grid['clon']*180.0/np.pi - lon_cam)**2 )\n",
+    "    print('    camera column: {} (clat={}, clon={})'.format( idx_cam, grid['clat'][idx_cam]*180/np.pi, grid['clon'][idx_cam]*180/np.pi ) )\n",
+    "    \n",
+    "    # finding the solar zenith angle at the center of the subdomain\n",
+    "    csza=modelvars['cosz_bz'][0,idx_cam]\n",
+    "    sza = math.degrees(math.acos(csza))\n",
+    "    \n",
+    "    # for the cloud files (heating_rate layer_fd) write half levels with indices 1:151 (roughly 0-29 km)\n",
+    "    z_hl = modelvars['z_ifc'][:,idx_cam]\n",
+    "    z_icon = z_hl[1:151]\n",
+    "    # full levels\n",
+    "    z_fl = 0.5*( z_hl[1:] + z_hl[:-1] )\n",
+    "    dz = np.abs( modelvars['z_ifc'][1:,:] - modelvars['z_ifc'][:-1,:] )\n",
+    "    #nz = zlayers.size\n",
+    "    nz = z_hl.size -1\n",
+    "    \n",
+    "    print('deriving CH4 and N2O profiles')\n",
+    "    ch4 = gas_profile(vmr_ch4,modelvars['pres_ifc'][:,:],vpp_ch4)\n",
+    "    n2o = gas_profile(vmr_n2o,modelvars['pres_ifc'][:,:],vpp_n2o)\n",
+    "    \n",
+    "    # domain mean values\n",
+    "    ch4_dm = ch4.mean(axis=1)\n",
+    "    n2o_dm = n2o.mean(axis=1)\n",
+    "   \n",
+    "    print ('[icon2mystic] Writing CH4 and N2O profiles')\n",
+    "    # One should put these profiles in the LibRadTran/data subdirectory to be used in the analysis \n",
+    "    with open('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/afglus_ch4_vmr_'+dom+'_'+t_h+'.dat','w') as f:\n",
+    "        f.writelines( [ (\"%e %e \\n\" % (z_hl[i]/1e3,ch4_dm[i])) \\\n",
+    "                        for i in range(z_hl.size)] )\n",
+    "\n",
+    "    with open('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/afglus_n2o_vmr_'+dom+'_'+t_h+'.dat','w') as f:\n",
+    "        f.writelines( [ (\"%e %e \\n\" % (z_hl[i]/1e3,n2o_dm[i])) \\\n",
+    "                        for i in range(z_hl.size)] ) \n",
+    "    \n",
+    "    print('Deriving temp qv, o3, density at half levels')\n",
+    "    temp_hl = interpolate_hl(modelvars['temp'],modelvars['pres'],modelvars['pres_ifc'])\n",
+    "    qv_hl   = interpolate_hl(modelvars['tot_qv_dia'],modelvars['pres'],modelvars['pres_ifc']) \n",
+    "    o3_hl   = interpolate_hl(modelvars['o3'],modelvars['pres'],modelvars['pres_ifc'])\n",
+    "    rho_hl  = ((modelvars['pres_ifc'])/(R*temp_hl))\n",
+    "    pres_hl = modelvars['pres_ifc']\n",
+    "    \n",
+    "    ## getting correct units for Ozone and water vapor mass concentrations\n",
+    "    print('min/mean/max of density: ', modelvars['rho'].min(), modelvars['rho'].mean(), modelvars['rho'].max() )\n",
+    "    o3_hl_n = o3_hl* rho_hl # kg/m3\n",
+    "    qv_hl_n = qv_hl* rho_hl # kg/m3\n",
+    "    \n",
+    "    # adjusting units of water and ice mass concentrations\n",
+    "    print('*** computing LWC, IWC, reff_qc, reff_qi...'); starttime = time.perf_counter()\n",
+    "    IWC = np.maximum( modelvars['tot_qi_dia'], 0 ) * modelvars['rho'] # kg/m3\n",
+    "    LWC = np.maximum( modelvars['tot_qc_dia'], 0 ) * modelvars['rho'] # kg/m3\n",
+    "    \n",
+    "    #print('*** computing ice and water optical depth ...')\n",
+    "    tau_lwc = 3*LWC*dz/(2*rhoh2o*modelvars['reff_qc_ecrad'])\n",
+    "    tau_iwc = 3*IWC*dz/(2*rhoice*modelvars['reff_qi_ecrad'])\n",
+    "    \n",
+    "    # adjust the LWC and IWC according to limits used in the LibRadTran\n",
+    "    LWC_n = tau_lwc/((3*dz)/(2*rhoh2o*np.clip(modelvars['reff_qc_ecrad'], 5.0*1e-6, 25.0*1e-6)))\n",
+    "    IWC_n = tau_iwc/((3*dz)/(2*rhoice*np.clip(modelvars['reff_qi_ecrad'], 20.0*1e-6, 60.0*1e-6)))\n",
+    "\n",
+    "    print('[icon2mystic] Mapping model state to rectangular grid...')\n",
+    "    print('    horizontal resolution: {}km --> dlat_px={}deg, dlon_px={}deg'.format( dx, dlat_px, dlon_px ))\n",
+    "\n",
+    "    # Expand subdomains to overlap with neighboring subdomains \n",
+    "    lat_max1 = lat_max + lat_marg\n",
+    "    lat_min1 = lat_min - lat_marg\n",
+    "    lon_max1 = lon_max + lon_marg\n",
+    "    lon_min1 = lon_min - lon_marg\n",
+    "    \n",
+    "    # define lat-lon grid\n",
+    "    nlat = int( (lat_max-lat_min) / dlat_px )\n",
+    "    nlon = int( (lon_max-lon_min) / dlon_px )\n",
+    "    \n",
+    "    nlat1 = int( (lat_max1-lat_min1) / dlat_px )\n",
+    "    nlon1 = int( (lon_max1-lon_min1) / dlon_px )\n",
+    "    print('    lat-lon grid pixel size : nlon={} x nlat={}'.format(nlon,nlat) )\n",
+    "    print('    lat-lon grid extended pixel size : nlon={} x nlat={}'.format(nlon1,nlat1) )\n",
+    "\n",
+    "    print('*** calling generate_lat_lon_grid'); starttime = time.perf_counter()\n",
+    "    # without expansion (atmospheric background profiles)\n",
+    "    latlon_grid  = generate_latlon_grid( lat_min * np.pi/180, lon_min * np.pi/180, lat_max * np.pi/180, lon_max * np.pi/180, nlat, nlon, dim=2 )\n",
+    "    # with expansion (LibRadTran 3D cloud files)\n",
+    "    latlon_grid1  = generate_latlon_grid( lat_min1 * np.pi/180, lon_min1 * np.pi/180, lat_max1 * np.pi/180, lon_max1 * np.pi/180, nlat1, nlon1, dim=2 )\n",
+    "    print('*** generate_lat_lon_grid took %f seconds' % (time.perf_counter() - starttime))\n",
+    "    print( latlon_grid.keys() )\n",
+    "\n",
+    "    # horizontal grid overview plot\n",
+    "    fig, ax = plt.subplots(figsize=(8,8))\n",
+    "    ax.scatter( grid_o['vlon'][:]*180.0/np.pi, grid_o['vlat'][:]*180.0/np.pi, c='g', alpha=0.3, label='original_triangle grid vertices' )\n",
+    "    ax.scatter( grid['vlon']*180.0/np.pi, grid['vlat']*180.0/np.pi, c='b', alpha=0.3, label='triangle grid vertices' )\n",
+    "    ax.scatter( latlon_grid1['lon2d']*180.0/np.pi, latlon_grid1['lat2d']*180.0/np.pi, marker='.', c='r', s=1, alpha=0.3, label='cartesian grid cell centers' )\n",
+    "    ax.scatter( lon_cam, lat_cam, marker='X', c='#666666', s=100, label='camera position' )\n",
+    "    ax.scatter( grid['clon'][idx_cam]*180/np.pi, grid['clat'][idx_cam]*180/np.pi, marker='X', c='k', s=100, label='center of camera column' )\n",
+    "    ax.plot( (lon_min1, lon_max1, lon_max1, lon_min1, lon_min1), (lat_min1, lat_min1, lat_max1, lat_max1, lat_min1), 'k', label='cartesian grid boundary' )\n",
+    "    ax.legend(title='horizontal grid', loc='upper left', bbox_to_anchor=(1., 0., 0.3, 1.0), frameon=False)\n",
+    "    \n",
+    "    print('[icon2mystic] Mapping coordinates to cartesian grid...')\n",
+    "    cartvar = {}\n",
+    "    for v in ['clon','clat'] :\n",
+    "        cartvar[v] = tri2latlon( latlon_grid, grid, grid[v], method='fine', silent=True )\n",
+    "\n",
+    "    print('shape of the remapped data',cartvar['clat'].shape)\n",
+    "    \n",
+    "    print('[icon2mystic] Mapping effective radii to cartesian grid...')\n",
+    "    for v in ['reff_qc_ecrad','reff_qi_ecrad'] :\n",
+    "        cartvar[v] = np.zeros((nlat1,nlon1,nz))\n",
+    "        print('*** calling tri2latlon'); starttime = time.perf_counter()\n",
+    "        for k in range(nz) :        \n",
+    "            cartvar[v][:,:,k] = tri2latlon( latlon_grid1, grid, modelvars[v][k,:], method='fine', silent=True)\n",
+    "        print('*** tri2latlon took %f seconds' % (time.perf_counter() - starttime))\n",
+    "        \n",
+    "    print('[icon2mystic] Mapping LWC and IWC to cartesian grid...')\n",
+    "    modelvars['LWC'] = np.nan_to_num(LWC_n,nan=0.0) # just to make sure we do not have nan values\n",
+    "    modelvars['IWC'] = np.nan_to_num(IWC_n,nan=0.0)\n",
+    "    # remap\n",
+    "    for v in ['LWC','IWC'] :\n",
+    "        cartvar[v] = np.zeros((nlat1,nlon1,nz))\n",
+    "        print('*** calling tri2latlon'); starttime = time.perf_counter()\n",
+    "        for k in range(nz) :        \n",
+    "            cartvar[v][:,:,k] = tri2latlon( latlon_grid1, grid, modelvars[v][k,:], method='fine', silent=True )\n",
+    "        print('*** tri2latlon took %f seconds' % (time.perf_counter() - starttime))\n",
+    "        \n",
+    "    print('[icon2mystic] remap ICON radiative temperature tendencies ...')\n",
+    "    for v in ['ddt_temp_radlwnw','ddt_temp_radlwcs','ddt_temp_radswnw','ddt_temp_radswcs','clc'] :\n",
+    "        cartvar[v] = np.zeros((nlat,nlon,nz))\n",
+    "        print('*** calling tri2latlon'); starttime = time.perf_counter()\n",
+    "        for k in range(nz) :\n",
+    "            cartvar[v][:,:,k] = tri2latlon( latlon_grid, grid, modelvars[v][k,:], method='fine', silent=True )\n",
+    "        print('*** tri2latlon took %f seconds' % (time.perf_counter() - starttime))\n",
+    "    \n",
+    "    # deriving cloud radiative heating\n",
+    "    cartvar['ddt_radlw'] = cartvar['ddt_temp_radlwnw'] - cartvar['ddt_temp_radlwcs']\n",
+    "    cartvar['ddt_radsw'] = cartvar['ddt_temp_radswnw'] - cartvar['ddt_temp_radswcs']\n",
+    "    \n",
+    "    print('creating a dataset and saving the data')\n",
+    "    ds = xr.Dataset(data_vars={\"qc\":([\"lat\",\"lon\",'height'],cartvar['LWC'][:,:,::-1]), \n",
+    "                               \"qi\":([\"lat\",\"lon\",'height'],cartvar['IWC'][:,:,::-1]),\n",
+    "                               \"clc\":([\"lat\",\"lon\",'height'],cartvar['clc'][:,:,::-1]),\n",
+    "                               \"ddt_radlwnw\":([\"lat\",\"lon\",'height'],cartvar['ddt_temp_radlwnw'][:,:,::-1]),\n",
+    "                               \"ddt_radlwcs\":([\"lat\",\"lon\",'height'],cartvar['ddt_temp_radlwcs'][:,:,::-1]),\n",
+    "                               \"ddt_radswnw\":([\"lat\",\"lon\",'height'],cartvar['ddt_temp_radswnw'][:,:,::-1]),\n",
+    "                               \"ddt_radswcs\":([\"lat\",\"lon\",'height'],cartvar['ddt_temp_radswcs'][:,:,::-1]),\n",
+    "                               \"ddt_radlw\":([\"lat\",\"lon\",'height'],cartvar['ddt_radlw'][:,:,::-1]),\n",
+    "                               \"ddt_radsw\":([\"lat\",\"lon\",'height'],cartvar['ddt_radsw'][:,:,::-1]),\n",
+    "                               \"rqi\":([\"lat\",\"lon\",'height'],cartvar['reff_qi_ecrad'][:,:,::-1]),\n",
+    "                               \"rqc\":([\"lat\",\"lon\",'height'],cartvar['reff_qc_ecrad'][:,:,::-1]),\n",
+    "                               \"z_fl\":(['height'],z_fl[::-1]),\n",
+    "                           \n",
+    "                           },\n",
+    "                coords={\"lat\": ([\"lat\"], np.arange(0,nlat)), \n",
+    "                        \"lon\": ([\"lon\"], np.arange(0,nlon)),\n",
+    "                        'height':([\"height\"],np.arange(0,nz))})\n",
+    "    \n",
+    "    # save as netcdf file\n",
+    "    ds.to_netcdf('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/icon_'+dom+'_'+t_h+'.nc')\n",
+    "    ########################################################################\n",
+    "    # preparing atmospheric background profiles\n",
+    "    print('[icon2mystic] remap atmospheric components to cartesian grid...')\n",
+    "    var_name = ['o3_hl','pres_hl','temp_hl','rho_hl','qv_hl']\n",
+    "    nm = 0\n",
+    "    for v in [o3_hl_n,pres_hl,temp_hl,rho_hl,qv_hl_n] :\n",
+    "        cartvar[var_name[nm]] = np.zeros((nlat,nlon,nz+1))\n",
+    "        print('*** calling tri2latlon'); starttime = time.perf_counter()\n",
+    "        for k in range(nz+1) :        \n",
+    "            cartvar[var_name[nm]][:,:,k] = tri2latlon( latlon_grid, grid, v[k,:], method='fine', silent=True )\n",
+    "        print('*** tri2latlon took %f seconds' % (time.perf_counter() - starttime))\n",
+    "        nm = nm + 1 \n",
+    "    \n",
+    "    # taking domain mean\n",
+    "    for n in ['pres_dm','temp_dm','rho_dm','o3_dm','qv_dm']:\n",
+    "        cartvar[n.replace(\"_dm\", \"_hl\")][cartvar[n.replace(\"_dm\", \"_hl\")] == 0] = np.nan \n",
+    "        cartvar[n] = np.nanmean(cartvar[n.replace(\"_dm\", \"_hl\")],axis=(0,1))\n",
+    "    \n",
+    "    print ('[icon2mystic] Writing background_atmospheric_profile')\n",
+    "    with open('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/atmosphere_mean_'+dom+'_'+t_h+'.dat','w') as f:\n",
+    "        # add one extra level here to avoid a crash in the radiative transfer with the NCA solver\n",
+    "        f.writelines( [ (\"%f %e %e %e %e %e %e %e \\n\" % ((z_hl[0]/1e3)+0.1,cartvar['pres_dm'][0]/1e2,\n",
+    "                                                  cartvar['temp_dm'][0],cartvar['rho_dm'][0]/m_d*1e-6,\n",
+    "                                                  cartvar['o3_dm'][0]/m_o3*1e-6,\n",
+    "                                                  cartvar['o3_dm'][0]*0.0 + (o2/m_o2)*1e-6,\n",
+    "                                                  cartvar['qv_dm'][0]/m_h2o*1e-6,\n",
+    "                                                  cartvar['o3_dm'][0]*0.0 + (co2/m_co2)*1e-6))] )\n",
+    "        \n",
+    "        f.writelines( [ (\"%f %e %e %e %e %e %e %e \\n\" % (z_hl[i]/1e3,cartvar['pres_dm'][i]/1e2,\n",
+    "                                                  cartvar['temp_dm'][i],cartvar['rho_dm'][i]/m_d*1e-6,\n",
+    "                                                  cartvar['o3_dm'][i]/m_o3*1e-6,\n",
+    "                                                  cartvar['o3_dm'][i]*0.0 + (o2/m_o2)*1e-6,\n",
+    "                                                  cartvar['qv_dm'][i]/m_h2o*1e-6,\n",
+    "                                                  cartvar['o3_dm'][i]*0.0 + (co2/m_co2)*1e-6)) \\\n",
+    "                        for i in range(nz+1)] )    \n",
+    "    ########################################################################    \n",
+    "        \n",
+    "    ## Creating NWP homogeneous grid-box clouds and homogeneous clouds with cloud fraction from LEM clouds at the resolution of 2.5 km \n",
+    "    print('  finding total cloudy pixels from both ice and water clouds')\n",
+    "    cartvar['ic_wc_tot'] = cartvar['LWC'] + cartvar['IWC']\n",
+    "    # empty array to store the coarse-grained LWC+IWC and cloud fraction\n",
+    "    for n in ['LWC_cg','IWC_cg','reff_qc_cg','reff_qi_cg','cf_cg','LWC_dl','IWC_dl','reff_qc_dl','reff_qi_dl']:\n",
+    "        cartvar[n] = np.zeros((nx,ny,150))  \n",
+    "    # chunk the fields into coarse blocks (axes swapped so x comes first)\n",
+    "    tc1 = np.swapaxes(cartvar['LWC'], 0, 1)\n",
+    "    tc2 = np.swapaxes(cartvar['IWC'], 0, 1)\n",
+    "    tc3 = np.swapaxes(cartvar['reff_qc_ecrad'], 0, 1)\n",
+    "    tc4 = np.swapaxes(cartvar['reff_qi_ecrad'], 0, 1)\n",
+    "    tc5 = np.swapaxes(cartvar['ic_wc_tot'], 0, 1)\n",
+    "\n",
+    "    tm1 = np.array_split(tc1[:,:,:],nx,axis=0) #2.5 km grid spacing\n",
+    "    tm2 = np.array_split(tc2[:,:,:],nx,axis=0)\n",
+    "    tm3 = np.array_split(tc3[:,:,:],nx,axis=0)\n",
+    "    tm4 = np.array_split(tc4[:,:,:],nx,axis=0)\n",
+    "    tm5 = np.array_split(tc5[:,:,:],nx,axis=0)\n",
+    "    \n",
+    "    # empty array \n",
+    "    lc1 = np.zeros((150))\n",
+    "    lc2 = np.zeros((150))\n",
+    "    lc3 = np.zeros((150))\n",
+    "    lc4 = np.zeros((150))\n",
+    "    lc5 = np.zeros((150))\n",
+    "    for i in range(len(tm1)):\n",
+    "        tmm_1 = np.array_split(tm1[i],ny,axis=1)\n",
+    "        tmm_2 = np.array_split(tm2[i],ny,axis=1)\n",
+    "        tmm_3 = np.array_split(tm3[i],ny,axis=1)\n",
+    "        tmm_4 = np.array_split(tm4[i],ny,axis=1)\n",
+    "        tmm_5 = np.array_split(tm5[i],ny,axis=1)\n",
+    "        for j in range(len(tmm_1)):  \n",
+    "            for k in range(150):\n",
+    "                tmmm_1 = tmm_1[j][:,:,k]\n",
+    "                tmmm_2 = tmm_2[j][:,:,k]\n",
+    "                tmmm_3 = tmm_3[j][:,:,k]\n",
+    "                tmmm_4 = tmm_4[j][:,:,k]\n",
+    "                tmmm_5 = tmm_5[j][:,:,k]\n",
+    "\n",
+    "                nln, nlt = tmmm_1.shape\n",
+    "                # cloud water\n",
+    "                cloudy1 = tmmm_1[np.nonzero(tmmm_1)]\n",
+    "                if cloudy1.size > 0 :\n",
+    "                    lc1[k] = cloudy1.mean()\n",
+    "                    cartvar['LWC_cg'][i,j,k] = lc1[k]\n",
+    "                # cloud ice\n",
+    "                cloudy2 = tmmm_2[np.nonzero(tmmm_2)]\n",
+    "                if cloudy2.size > 0 :\n",
+    "                    lc2[k] = cloudy2.mean()\n",
+    "                    cartvar['IWC_cg'][i,j,k] = lc2[k]\n",
+    "                # cloud water reff\n",
+    "                cloudy3 = tmmm_3[np.nonzero(tmmm_3)]\n",
+    "                if cloudy3.size > 0 :\n",
+    "                    lc3[k] = cloudy3.mean()\n",
+    "                    cartvar['reff_qc_cg'][i,j,k] = lc3[k]\n",
+    "                # cloud ice reff\n",
+    "                cloudy4 = tmmm_4[np.nonzero(tmmm_4)]\n",
+    "                if cloudy4.size > 0 :\n",
+    "                    lc4[k] = cloudy4.mean()\n",
+    "                    cartvar['reff_qi_cg'][i,j,k] = lc4[k]\n",
+    "                # cloud fraction\n",
+    "                cloudy5 = tmmm_5[np.nonzero(tmmm_5)]\n",
+    "                if cloudy5.size > 0 :\n",
+    "                    lc5[k] = (cloudy5.size/nln/nlt)\n",
+    "                    cartvar['cf_cg'][i,j,k] = lc5[k]\n",
+    "                # NWP homogeneous grid-box clouds\n",
+    "                cartvar['LWC_dl'][i,j,k] = tmmm_1.mean()\n",
+    "                cartvar['IWC_dl'][i,j,k] = tmmm_2.mean()\n",
+    "                cartvar['reff_qc_dl'][i,j,k] = tmmm_3.mean()\n",
+    "                cartvar['reff_qi_dl'][i,j,k] = tmmm_4.mean()\n",
+    "\n",
+    "    # ensure there are no NaN values\n",
+    "    cartvar['LWC_cg'] = np.nan_to_num(cartvar['LWC_cg'],nan=0.0)\n",
+    "    cartvar['IWC_cg'] = np.nan_to_num(cartvar['IWC_cg'],nan=0.0)\n",
+    "    cartvar['reff_qc_cg'] = np.nan_to_num(cartvar['reff_qc_cg'],nan=0.0)\n",
+    "    cartvar['reff_qi_cg'] = np.nan_to_num(cartvar['reff_qi_cg'],nan=0.0)\n",
+    "    cartvar['LWC_dl'] = np.nan_to_num(cartvar['LWC_dl'],nan=0.0)\n",
+    "    cartvar['IWC_dl'] = np.nan_to_num(cartvar['IWC_dl'],nan=0.0)\n",
+    "    cartvar['reff_qc_dl'] = np.nan_to_num(cartvar['reff_qc_dl'],nan=0.0)\n",
+    "    cartvar['reff_qi_dl'] = np.nan_to_num(cartvar['reff_qi_dl'],nan=0.0)\n",
+    "    cartvar['cf_cg'] = np.nan_to_num(cartvar['cf_cg'],nan=0.0)\n",
+    "    \n",
+    "    # Save (default reff clipping: ice 20-60, wc 5-25)\n",
+    "    print('[icon2mystic] Writing MYSTIC files...') \n",
+    "    # NWP homogeneous clouds with cloud fraction\n",
+    "    write_mc_cloud_file_frac(np.swapaxes(cartvar['LWC_cg'][:,:,::-1]*1000,0,1), np.swapaxes(np.clip( cartvar['reff_qc_cg'][:,:,::-1]*1e6,  5.0, 25.0 ),0,1), np.swapaxes(cartvar['cf_cg'][:,:,::-1],0,1), dx1*1e3, dy1*1e3, z_hl[::-1], fname='/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/wc3d_cg_'+dom+'_'+t_h+'.dat', flag=4 )\n",
+    "    write_mc_cloud_file_frac(np.swapaxes(cartvar['IWC_cg'][:,:,::-1]*1000,0,1), np.swapaxes(np.clip( cartvar['reff_qi_cg'][:,:,::-1]*1e6, 20.0, 60.0 ),0,1), np.swapaxes(cartvar['cf_cg'][:,:,::-1],0,1), dx1*1e3, dy1*1e3, z_hl[::-1], fname='/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/ic3d_cg_'+dom+'_'+t_h+'.dat', flag=4 )\n",
+    "    # NWP homogeneous grid-box clouds\n",
+    "    write_mc_cloud_file(np.swapaxes(cartvar['LWC_dl'][:,:,::-1]*1000,0,1), np.swapaxes(np.clip( cartvar['reff_qc_dl'][:,:,::-1]*1e6,  5.0, 25.0 ),0,1), dx1*1e3, dy1*1e3, z_hl[::-1], fname='/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/wc3d_dl_'+dom+'_'+t_h+'.dat', flag=3 )\n",
+    "    write_mc_cloud_file(np.swapaxes(cartvar['IWC_dl'][:,:,::-1]*1000,0,1), np.swapaxes(np.clip( cartvar['reff_qi_dl'][:,:,::-1]*1e6, 20.0, 60.0 ),0,1), dx1*1e3, dy1*1e3, z_hl[::-1], fname='/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/ic3d_dl_'+dom+'_'+t_h+'.dat', flag=3 )\n",
+    "    # LEM clouds\n",
+    "    write_mc_cloud_file( cartvar['LWC'][:,:,::-1]*1000, np.clip( cartvar['reff_qc_ecrad'][:,:,::-1]*1e6,  5.0, 25.0 ), dx*1e3, dy*1e3, z_hl[::-1], fname='/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/wc3d_'+dom+'_'+t_h+'.dat', flag=3 )\n",
+    "    write_mc_cloud_file( cartvar['IWC'][:,:,::-1]*1000, np.clip( cartvar['reff_qi_ecrad'][:,:,::-1]*1e6, 20.0, 60.0 ), dx*1e3, dy*1e3, z_hl[::-1], fname='/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/ic3d_'+dom+'_'+t_h+'.dat', flag=3 )\n",
+    "    \n",
+    "    return sza,cartvar,z_hl"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Define inputs and call the 'generate_mystic_files' function"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Information about defining the grid parameters for remapping and coarse-graining\n",
+    "# Here I follow the routines outlined in the MPI grid generator.\n",
+    "\n",
+    "# ICON-LEM triangular planar grid \n",
+    "\n",
+    "resolution = 300\n",
+    "x_length = 471000\n",
+    "y_length = 667000\n",
+    "\n",
+    "# from planar-grid routine \n",
+    "\n",
+    "edge_length = 1.5196713713 * resolution\n",
+    "x_no_of_columns = int(round(x_length / edge_length))\n",
+    "edge_length = x_length / x_no_of_columns\n",
+    "y_height = edge_length * np.sin(np.deg2rad(60)) # triangle height\n",
+    "y_no_of_rows = int(round(y_length / y_height))\n",
+    "\n",
+    "# lon and lat step\n",
+    "x_lon_range = 6\n",
+    "y_lat_range = 6\n",
+    "x_lon_step  = x_lon_range / (x_no_of_columns*2) \n",
+    "y_lat_step  = y_lat_range / y_no_of_rows\n",
+    "\n",
+    "print('X_increment in degrees:', x_lon_step)\n",
+    "print('Y_increment in degrees:', y_lat_step)\n",
+    "\n",
+    "print('Longitude extension of the LEM domain:', x_lon_step * 2*x_no_of_columns)\n",
+    "print('Latitude extension of the LEM domain:', y_lat_step * y_no_of_rows)\n",
+    "print('Number of total grid cells:', 2 * x_no_of_columns * y_no_of_rows) # should be 3489474\n",
+    "\n",
+    "print('Number of points in the x_direction of the Cartesian grid:',2 * x_no_of_columns)\n",
+    "print('Number of points in the y_direction of the Cartesian grid:',y_no_of_rows)\n",
+    "\n",
+    "print('X_increment in meters:',x_length/(6/x_lon_step))\n",
+    "print('Y_increment in meters:',y_length/(6/y_lat_step))\n",
+    "\n",
+    "print('Number of NWP boxes that fit into the 1°x1° subdomains in the x direction:',1/x_lon_step)\n",
+    "print('Number of NWP boxes that fit into the 1°x1° subdomains in the y direction:',1/y_lat_step)\n",
+    "\n",
+    "'''\n",
+    "a = ((471/6)/344)*(344/41)\n",
+    "b = ((667/6)/281)*(281/33)\n",
+    "print(a)\n",
+    "print(b)\n",
+    "print('resolution of derived rectangles at 300 m ',np.sqrt(0.22819767441860464*0.3956109134045077))\n",
+    "print('resolution of derived rectangles at 2500 m ',np.sqrt(1.9146341463414633*3.368686868686869))\n",
+    "'''"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# ICON grid\n",
+    "grid_o = netCDF4.Dataset('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/grid_DOM01.nc','r')\n",
+    "\n",
+    "# Margin of lat/lon for extending the subdomains for overlapping \n",
+    "lat_marg = 0.11\n",
+    "lon_marg = 0.11\n",
+    "\n",
+    "# Extension of lat/lon for creating subdomains\n",
+    "dlat = 1 \n",
+    "dlon = 1\n",
+    "\n",
+    "# X and Y increments in km for remapping LEM clouds\n",
+    "dx = 0.227 \n",
+    "dy = 0.394 \n",
+    "\n",
+    "# X and Y increments in km for NWP clouds\n",
+    "dx1 = 1.9 \n",
+    "dy1 = 3.3 \n",
+    "\n",
+    "# number of NWP boxes in 1°x1° subdomains \n",
+    "nx = 41\n",
+    "ny = 33\n",
+    "\n",
+    "# lat/lon increments in degrees\n",
+    "dlat_px = 0.003552397868561279 \n",
+    "dlon_px = 0.002904162633107454 \n",
+    "z_max = 13 # not used\n",
+    "\n",
+    "#############################################\n",
+    "# List of ICON-LEM simulations\n",
+    "\n",
+    "#'LC1-LES-471x667km-lon25-lat40-300m-0006' : {'res':'300', 'radiation':4, 'mphy':0.8},  dom01: shallow cumulus clouds\n",
+    "#'LC1-LES-471x667km-lon40-lat44-300m-0003' : {'res':'300', 'radiation':4, 'mphy':0.8},  dom02: WCB ascent\n",
+    "#'LC1-LES-471x667km-lon30-lat53-300m-0005' : {'res':'300', 'radiation':4, 'mphy':0.8},  dom03: WCB cyclonic outflow\n",
+    "#'LC1-LES-471x667km-lon50-lat48-300m-0004' : {'res':'300', 'radiation':4, 'mphy':0.8},  dom04: WCB anticyclonic outflow"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# looping through time and calling 'generate_mystic_files' function\n",
+    "\n",
+    "# Repeat this for the other ICON-LEM simulations by adjusting the path to the simulation output and the lat/lon center of the first subdomain\n",
+    "\n",
+    "for t_h in ['20220105T100033Z','20220105T103033Z','20220105T110033Z','20220105T113033Z','20220105T120033Z','20220105T123033Z','20220105T130033Z','20220105T133033Z','20220105T140033Z']:\n",
+    "    \n",
+    "    data_1 = netCDF4.Dataset('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/icon-cld3d_ML_'+t_h+'.nc','r')\n",
+    "    data_2 = netCDF4.Dataset('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/icon-atm3d_ML_'+t_h+'.nc','r')\n",
+    "    data_3 = netCDF4.Dataset('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/icon-ddt_temp_ML_'+t_h+'.nc','r')\n",
+    "    data_4 = netCDF4.Dataset('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/icon-radbz_ML_'+t_h+'.nc','r')\n",
+    "    \n",
+    "    num = 1\n",
+    "    solar_za = []\n",
+    "    for j in range(6): \n",
+    "        lat_cam = 37.5 + j # needs to be adjusted for other LEM domains\n",
+    "        for i in range(6):\n",
+    "            lon_cam = 22.5 + i # needs to be adjusted for other LEM domains\n",
+    "            print('center:',lon_cam,lat_cam)\n",
+    "            #calling the final function\n",
+    "            ss = generate_mystic_files(lat_cam,lon_cam,dx,dy,dx1,dy1,lat_marg,lon_marg,dlat,dlon,dlat_px,dlon_px,grid_o,data_1,data_2,data_3,data_4,z_max,t_h,str(num),nx,ny)\n",
+    "            solar_za.append(ss)\n",
+    "            num = num + 1\n",
+    "    # Write down solar zenith angles        \n",
+    "    with open('/work/bb1135/icon_output/LC1-LES-471x667km-lon25-lat40-300m-0006/input4libradtran/sza_'+t_h+'.dat', 'w') as f:\n",
+    "        for item in solar_za:\n",
+    "            f.write('\"%s\" ' % item)        "
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "pycrh",
+   "language": "python",
+   "name": "pycrh"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.8.5"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
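The nested coarse-graining loops in the notebook above compute, for every ~2.5 km block and model level, an in-cloud mean (mean over nonzero pixels), a grid-box mean for the homogeneous NWP clouds, and a cloud fraction (share of nonzero pixels). A minimal NumPy sketch of that per-level block statistic, with hypothetical array and function names:

```python
import numpy as np

def coarse_grain_level(field2d, nx, ny):
    """Split a 2-D cloud field into nx*ny blocks and return, per block,
    the in-cloud mean (over nonzero pixels), the grid-box mean (over all
    pixels), and the cloud fraction (share of nonzero pixels)."""
    incloud = np.zeros((nx, ny))
    gridbox = np.zeros((nx, ny))
    cfrac = np.zeros((nx, ny))
    for i, rows in enumerate(np.array_split(field2d, nx, axis=0)):
        for j, block in enumerate(np.array_split(rows, ny, axis=1)):
            cloudy = block[np.nonzero(block)]
            if cloudy.size > 0:
                incloud[i, j] = cloudy.mean()
                cfrac[i, j] = cloudy.size / block.size
            gridbox[i, j] = block.mean()
    return incloud, gridbox, cfrac

# toy field: one quarter of the first 4x4 block is cloudy with LWC = 0.2
lwc = np.zeros((8, 8))
lwc[:2, :2] = 0.2
ic, gb, cf = coarse_grain_level(lwc, 2, 2)
print(ic[0, 0], gb[0, 0], cf[0, 0])  # 0.2 0.05 0.25
```

Using `np.array_split` rather than `reshape` keeps the split valid even when the block counts do not divide the field dimensions evenly, which is why the notebook uses it as well.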
diff --git a/offlineRT/solar_clear_sky/input_run.sh b/offlineRT/solar_clear_sky/input_run.sh
new file mode 100755
index 0000000000000000000000000000000000000000..77c2f68906a9d81649c0f96702792e3d13cd798e
--- /dev/null
+++ b/offlineRT/solar_clear_sky/input_run.sh
@@ -0,0 +1,121 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+
+pathh=$(cd ../../ && pwd)
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+mkdir -p output
+cd output
+
+for sol in "${solver[@]}"
+do
+    echo 'solver = ' $sol
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+i=0
+for dm in "${dom[@]}"
+do
+echo 'dom = ' $dm
+# look up the solar zenith angle array for this time step via indirect expansion
+# (equivalent to an if/elif chain over sza_05T1000 ... sza_05T1400)
+sza_ref="sza_${it}[$i]"
+sza="${!sza_ref}"
+echo 'sza = ' "$sza"
+i=$((i+1))
+
+OUT_FILE=$sol'_'$it'_'$dm'_cs_hr.out'
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+#source solar /work/bb1135/b381185/tools/libRadtran/data/data/solar_flux/kato
+
+albedo 0.07
+source solar
+
+sza ${sza}
+#phi ${phi}
+  
+mol_abs_param Fu   
+wavelength_index 1  7   
+output_process sum   
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver $sol
+
+heating_rate layer_fd 
+
+zout 0.000000 0.020000 0.051914 0.090769 0.135002 0.183759 0.236489 0.292801 0.352401 0.415061 0.480594 0.548850 0.619700 0.693036 0.768763 0.846801 0.927078 1.009531 1.094104 1.180746 1.269413 1.360064 1.452663 1.547177 1.643575 1.741831 1.841918 1.943816 2.047502 2.152959 2.260168 2.369114 2.479784 2.592163 2.706240 2.822004 2.939446 3.058556 3.179328 3.301754 3.425828 3.551544 3.678898 3.807886 3.938505 4.070751 4.204624 4.340121 4.477242 4.615985 4.756352 4.898344 5.041960 5.187202 5.334074 5.482577 5.632714 5.784490 5.937907 6.092970 6.249683 6.408053 6.568084 6.729782 6.893155 7.058208 7.224949 7.393386 7.563527 7.735380 7.908954 8.084258 8.261303 8.440098 8.620654 8.802982 8.987094 9.173002 9.360718 9.550255 9.741626 9.934847 10.129930 10.326890 10.525744 10.726509 10.929198 11.133832 11.340426 11.549001 11.759574 11.972166 12.186796 12.403487 12.622261 12.843139 13.066146 13.291306 13.518643 13.748186 13.979961 14.213994 14.450318 14.688961 14.929955 15.173333 15.419130 15.667381 15.918122 16.171394 16.427235 16.685688 16.946798 17.210609 17.477171 17.746536 18.018753 18.293882 18.571980 18.853111 19.137339 19.424730 19.715363 20.009314 20.306667 20.607512 20.911943 21.220062 21.531981 21.847816 22.167702 22.491774 22.820189 23.153118 23.490747 23.833288 24.180979 24.534081 24.892902 25.257799 25.629181 26.007547 26.393496 26.787775 27.191345 27.605484 28.031990 28.473600 28.934996 29.426264
+
+quiet
+
+EOF
+### END OF INPUT FILE ###########
+
+#run libradtran
+$LIBRAD/bin/uvspec <$INP_FILE_NAME> $OUT_FILE
+
+done
+done
+done
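The run script above reads the solar zenith angles through per-time bash arrays (`sza_05T1000` ... `sza_05T1400`) sourced from VARIABLES; those values originate in the `sza_<time>.dat` files that the notebook writes as space-separated, double-quoted numbers. A small sketch, assuming that quoted format, of reading such a file back in Python (file name and helper are hypothetical):

```python
import os
import shlex
import tempfile

def read_sza(path):
    """Parse a sza_*.dat file of space-separated quoted values into floats."""
    with open(path) as f:
        # shlex.split strips the surrounding double quotes
        return [float(tok) for tok in shlex.split(f.read())]

# demo with a temporary file in the same format the notebook writes
with tempfile.NamedTemporaryFile("w", suffix=".dat", delete=False) as f:
    f.write('"31.2" "30.8" "29.9" ')
    tmp = f.name

angles = read_sza(tmp)
os.remove(tmp)
print(angles)  # [31.2, 30.8, 29.9]
```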
diff --git a/offlineRT/thermal_clear_sky/VARIABLES b/offlineRT/thermal_clear_sky/VARIABLES
new file mode 100644
index 0000000000000000000000000000000000000000..69d8009a75e82705fc2afd011be80f62873ce382
--- /dev/null
+++ b/offlineRT/thermal_clear_sky/VARIABLES
@@ -0,0 +1,11 @@
+export solver=("mystic" "twostr")
+
+export time_array=("05T1000" "05T1030" "05T1100" "05T1130" "05T1200" "05T1230" "05T1300" "05T1330" "05T1400")
+
+export dom=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17" "18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export dom1=("1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "12" "13" "14" "15" "16" "17")
+
+export dom2=("18" "19" "20" "21" "22" "23" "24" "25" "26" "27" "28" "29" "30" "31" "32" "33" "34" "35" "36")
+
+export isim_array=("01" "04")
diff --git a/offlineRT/thermal_clear_sky/input_run.sh b/offlineRT/thermal_clear_sky/input_run.sh
new file mode 100755
index 0000000000000000000000000000000000000000..49f07ad6278124d158b1c3492211736c3d4210ae
--- /dev/null
+++ b/offlineRT/thermal_clear_sky/input_run.sh
@@ -0,0 +1,78 @@
+#!/usr/bin/env bash
+
+. VARIABLES
+
+pathh=$(cd ../../ && pwd)
+INP_FILE_NAME='libsetup.inp'
+LIBRAD="/home/b/b381185/libRadtran"
+
+mkdir -p output
+cd output
+
+for sol in "${solver[@]}"
+do
+    echo 'solver = ' $sol
+
+for it in "${time_array[@]}"
+do
+    echo 'time = ' $it
+
+for dm in "${dom[@]}"
+do
+    echo 'dom = ' $dm
+
+OUT_FILE=$sol'_'$it'_'$dm'_cs_hr.out'
+
+
+############################
+# CREATE INPUT FILE HERE !!!
+cat > $INP_FILE_NAME << EOF
+data_files_path $LIBRAD/data/
+atmosphere_file ${pathh}/atmosphere_mean_${dm}_202201${it}33Z.dat
+
+albedo 0.009000000000000008
+source thermal  
+mol_abs_param Fu   
+wavelength_index 7  18   
+output_process sum   
+
+# gas profiles
+#mixing_ratio CO2 348
+#mixing_ratio O2 209460
+#mixing_ratio CH4 1.650 
+#mixing_ratio N2O 0.396
+
+mixing_ratio F11 0.0
+mixing_ratio F12 0.0
+mixing_ratio F22 0.0
+mixing_ratio NO2 0.0
+
+mol_modify O4 0.0 DU
+mol_modify BRO 0.0 DU
+mol_modify OCLO 0.0 DU
+mol_modify HCHO 0.0 DU
+mol_modify SO2 0.0 DU
+mol_modify CO 0.0 DU
+mol_modify N2 0.0 DU
+
+#surface
+albedo_library IGBP
+brdf_rpv_type 17
+
+rte_solver $sol
+
+heating_rate layer_fd 
+
+zout 0.000000 0.020000 0.051914 0.090769 0.135002 0.183759 0.236489 0.292801 0.352401 0.415061 0.480594 0.548850 0.619700 0.693036 0.768763 0.846801 0.927078 1.009531 1.094104 1.180746 1.269413 1.360064 1.452663 1.547177 1.643575 1.741831 1.841918 1.943816 2.047502 2.152959 2.260168 2.369114 2.479784 2.592163 2.706240 2.822004 2.939446 3.058556 3.179328 3.301754 3.425828 3.551544 3.678898 3.807886 3.938505 4.070751 4.204624 4.340121 4.477242 4.615985 4.756352 4.898344 5.041960 5.187202 5.334074 5.482577 5.632714 5.784490 5.937907 6.092970 6.249683 6.408053 6.568084 6.729782 6.893155 7.058208 7.224949 7.393386 7.563527 7.735380 7.908954 8.084258 8.261303 8.440098 8.620654 8.802982 8.987094 9.173002 9.360718 9.550255 9.741626 9.934847 10.129930 10.326890 10.525744 10.726509 10.929198 11.133832 11.340426 11.549001 11.759574 11.972166 12.186796 12.403487 12.622261 12.843139 13.066146 13.291306 13.518643 13.748186 13.979961 14.213994 14.450318 14.688961 14.929955 15.173333 15.419130 15.667381 15.918122 16.171394 16.427235 16.685688 16.946798 17.210609 17.477171 17.746536 18.018753 18.293882 18.571980 18.853111 19.137339 19.424730 19.715363 20.009314 20.306667 20.607512 20.911943 21.220062 21.531981 21.847816 22.167702 22.491774 22.820189 23.153118 23.490747 23.833288 24.180979 24.534081 24.892902 25.257799 25.629181 26.007547 26.393496 26.787775 27.191345 27.605484 28.031990 28.473600 28.934996 29.426264
+
+quiet
+#verbose
+EOF
+### END OF INPUT FILE ###########
+
+#run libradtran
+$LIBRAD/bin/uvspec <$INP_FILE_NAME> $OUT_FILE
+
+done
+done
+done
diff --git a/sims/README.md b/sims/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..b56f9e2d7faf64bddac361f39bfb4e0ffadd25bf
--- /dev/null
+++ b/sims/README.md
@@ -0,0 +1,11 @@
+This directory contains the scripts for setting up the ICON model for both the baroclinic life cycle simulation with ICON-NWP and the large eddy model simulations with ICON-LEM.
+
+* The **planar_grid** subdirectory contains the script to generate the planar grid for large eddy model simulations using the MPI grid generator.
+
+* The **preprocessing** subdirectory contains the scripts to prepare the initial and lateral boundary conditions from the ICON-NWP simulation for ICON-LEM simulations with DWD_ICON_tools and CDO.
+
+* The **runscript** subdirectory contains the ICON runscripts for the baroclinic life cycle and large eddy model simulations.
+  - The baroclinic life cycle simulation follows the same model setup as in Keshtgar et al., 2023 (https://wcd.copernicus.org/articles/4/115/2023/).
+  - BLC simulation: LC1-channel-4000x9000km-2km-0002
+  - The model setup procedure is described here: https://gitlab.phaidra.org/climate/keshtgar-etal-crh-cyclone-wcd2022/-/tree/main/blc_initial_conditions
+
diff --git a/sims/planar_grid/grid.create_plane.run b/sims/planar_grid/grid.create_plane.run
index 45862612c2e1b35bf0f88ccae87cda105e9d1957..d9f737903feb273731cfb402160a6716c5df0f1d 100755
--- a/sims/planar_grid/grid.create_plane.run
+++ b/sims/planar_grid/grid.create_plane.run
@@ -92,44 +92,29 @@ WORKDIR=${basedir}/grids
 #-----------------------------------------------------------------------------
 set -x
 #-----------------------------------------------------------------------------
-# Defining grid properties
-
-#lon_center=40.0
-#lat_center=44.0
-#lon_range=10
-#lat_range=12
-
-# ideally length and width can be derived according to:
-#((x_length_km=lon_range*102))
-#((y_length_km=lat_range*96))
-
-# using latitude/longitude distance calculator (https://www.nhc.noaa.gov/gccalc.shtml)
-#x_length_km=785
-#y_length_km=1333
-
-#resolution=300
-#create_raggedOrthogonal_ofTriangles $x_length_km $y_length_km $resolution
-#create_limited_area_boundaries "raggedOrthogonal_${x_length_km}x${y_length_km}_${resolution}.nc" "raggedOrthogonal_${x_length_km}x${y_length_km}_${resolution}_with_boundary.nc"
-
-#----------------------------------------------------------------------------
-# Defining grid properties
+# Defining grid properties for each LEM domain
+# choose the lat/lon center of the desired domain
+# dom01: shallow cumulus clouds   lon_center=25.0, lat_center=40.0
+# dom02: WCB ascent region        lon_center=40.0, lat_center=44.0
+# dom03: WCB cyclonic outflow     lon_center=30.0, lat_center=53.0
+# dom04: WCB anticyclonic outflow lon_center=50.0, lat_center=48.0
 
 lon_center=40.0
 lat_center=44.0
 lon_range=6
 lat_range=6
 
-# ideally length and width can be derived according to:
-#((x_length_km=lon_range*102))
-#((y_length_km=lat_range*96))
-
 # using latitude/longitude distance calculator (https://www.nhc.noaa.gov/gccalc.shtml)
+
 x_length_km=471
 y_length_km=667
 
 resolution=300
+
 create_raggedOrthogonal_ofTriangles $x_length_km $y_length_km $resolution
-create_limited_area_boundaries "raggedOrthogonal_${x_length_km}x${y_length_km}_${resolution}.nc" "raggedOrthogonal_${x_length_km}x${y_length_km}_${resolution}_with_boundary.nc"
+
+# defining lateral boundary cells
+create_limited_area_boundaries "raggedOrthogonal_${x_length_km}x${y_length_km}_${resolution}.nc" "raggedOrthogonal_${x_length_km}x${y_length_km}_${resolution}_with_boundary_dom01.nc"
 
 exit
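The grid parameters above (471 x 667 km at 300 m) follow the planar-grid arithmetic of the MPI grid generator, which the preprocessing notebook repeats. A quick self-contained check of the implied triangular cell count (the 3489474 figure quoted in the notebook):

```python
import math

# planar-grid arithmetic as in the MPI grid generator routine
resolution = 300                       # m
x_length, y_length = 471_000, 667_000  # m

edge_length = 1.5196713713 * resolution
x_no_of_columns = round(x_length / edge_length)
edge_length = x_length / x_no_of_columns              # adjust to fit exactly
y_height = edge_length * math.sin(math.radians(60))   # triangle height
y_no_of_rows = round(y_length / y_height)

n_cells = 2 * x_no_of_columns * y_no_of_rows
print(x_no_of_columns, y_no_of_rows, n_cells)  # 1033 1689 3489474
```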
 
diff --git a/sims/preprocessing_scripts/1.extpar_nest_plane.sh b/sims/preprocessing_scripts/1.extpar_nest_plane.sh
new file mode 100755
index 0000000000000000000000000000000000000000..63fc45b3c232a43ece4d150ade4bec266bfc4bb7
--- /dev/null
+++ b/sims/preprocessing_scripts/1.extpar_nest_plane.sh
@@ -0,0 +1,166 @@
+#! /bin/bash
+set -x
+ulimit -s unlimited
+#=============================================================================
+# OpenMP environment variables
+# ----------------------------
+export OMP_NUM_THREADS=4
+export ICON_THREADS=$OMP_NUM_THREADS
+export OMP_SCHEDULE="guided"
+export OMP_DYNAMIC="false"
+export OMP_STACKSIZE=500M
+#
+# MPI variables
+# -------------
+no_of_nodes=${SLURM_JOB_NUM_NODES:=1}
+mpi_procs_pernode=$((128 / OMP_NUM_THREADS))
+((mpi_total_procs=no_of_nodes * mpi_procs_pernode))
+
+ulimit -s $((4 * 1024 * 1024))
+ulimit -c 0
+
+
+# Loading modules 
+
+module purge
+module load cdo/2.0.5-gcc-11.2.0
+module load python3/2022.01-gcc-11.2.0
+module load nco/5.0.6-gcc-11.2.0
+
+module list
+
+#--------------------------------------------------------------------------------------
+# ICONTOOLS directory
+
+ICONTOOLS_DIR=/home/b/b381185/dwd_icon_tools/icontools
+
+BINARY_ICONSUB=iconsub
+BINARY_REMAP=iconremap
+BINARY_GRIDGEN=icongridgen
+
+#--------------------------------------------------------------------------------------
+# here choose the grid for different LEM domains
+# dom01: shallow cumulus clouds   lon_center=25.0, lat_center=40.0
+# dom02: WCB ascent region        lon_center=40.0, lat_center=44.0
+# dom03: WCB cyclonic outflow     lon_center=30.0, lat_center=53.0
+# dom04: WCB anticyclonic outflow  lon_center=50.0, lat_center=48.0
+
+
+mkdir -p plane_nest_300m_r6x6_2mom_25_40_0001
+cd plane_nest_300m_r6x6_2mom_25_40_0001
+
+# import the grid from grid_generator
+gridfile=raggedOrthogonal_471x667_300_with_boundary_dom01.nc
+cp /work/bb1135/from_Mistral/bb1135/b381185/tools/GridGenerator_master/grids/$gridfile ./
+#---------------------------------------------------------------------------------------
+# 1- Remapping the external parameters file onto the nest grid
+# Aquaplanet extpar file
+
+for field in SOILTYP FR_LAND ICE PLCOV_MX LAI_MX RSMIN URBAN FOR_D FOR_E EMIS_RAD ROOTDP Z0 NDVI_MAX topography_c SSO_STDH SSO_THETA SSO_GAMMA SSO_SIGMA T_CL FR_LAKE DEPTH_LK topography_v LU_CLASS_FRACTION NDVI NDVI_MRAT AER_BC AER_DUST AER_ORG AER_SO4 AER_SS ALB ALNID ALUVD lon lat clon clat clon_vertices clat_vertices ; do
+
+cat >> NAMELIST_ICONREMAP_FIELDS << EOF_2A
+!
+&input_field_nml
+ inputname      = "${field}"
+ outputname     = "${field}"
+ intp_method    = 3
+/
+EOF_2A
+
+done
+
+cat NAMELIST_ICONREMAP_FIELDS
+
+cat > NAMELIST_ICONREMAP << EOF_2C
+&remap_nml
+ in_grid_filename  = '../inputs/icon_grid_0010_R02B04_G.nc'
+ in_filename       = '../inputs/icon_extpar_0010_R02B04_G_aquaplanet.nc'
+ in_type           = 2
+ out_grid_filename = '$gridfile'
+ out_filename      = 'extpar_remapped.nc'
+ out_type          = 2
+ out_filetype      = 4
+ l_have3dbuffer    = .false.
+ ncstorage_file    = "ncstorage.tmp"
+/
+EOF_2C
+
+
+${ICONTOOLS_DIR}/${BINARY_REMAP} \
+            --remap_nml NAMELIST_ICONREMAP                                  \
+            --input_field_nml NAMELIST_ICONREMAP_FIELDS 2>&1
+
+#-----------------------------------------------------------------------------
+# clean-up
+
+rm -f ncstorage.tmp*
+rm -f nml.log  NAMELIST_SUB NAMELIST_ICONREMAP NAMELIST_ICONREMAP_FIELDS
+
+#-----------------------------------------------------------------------------
+
+
+# 2- Remapping the ozone file onto the nest grid
+
+# APE O3 file for irad_o3 = 4
+
+for field in O3 ; do
+
+cat >> NAMELIST_ICONREMAP_FIELDS << EOF_2A
+!
+&input_field_nml
+ inputname      = "${field}"
+ outputname     = "${field}"
+ intp_method    = 3
+/
+EOF_2A
+
+done
+
+cat > NAMELIST_ICONREMAP << EOF_2C
+&remap_nml
+ in_grid_filename  = ''
+ in_filename       = '../inputs/ape_o3_R2B04_1Pa_cell.t63grid.nc'
+ in_type           = 1
+ out_grid_filename = '$gridfile'
+ out_filename      = 'ape_O3_remapped.nc'
+ out_type          = 2
+ out_filetype      = 4
+ l_have3dbuffer    = .false.
+ ncstorage_file    = "ncstorage.tmp"
+/
+EOF_2C
+
+${ICONTOOLS_DIR}/${BINARY_REMAP} \
+            --remap_nml NAMELIST_ICONREMAP                                  \
+            --input_field_nml NAMELIST_ICONREMAP_FIELDS 2>&1
+
+#-----------------------------------------------------------------------------
+# clean-up
+
+rm -f ncstorage.tmp*
+rm -f nml.log  NAMELIST_SUB NAMELIST_ICONREMAP NAMELIST_ICONREMAP_FIELDS
+
+#-----------------------------------------------------------------------------
+# Correction of remapped extpar file
+
+ncatted -a rawdata,global,c,c,"GLOBCOVER2009, FAO DSMW, GLOBE, Lake Database" extpar_remapped.nc
+
+# Run the Python script that corrects the SOILTYP data type and expands the time dimension to 12 months
+
+python ../inputs/extpar_helper.py
+
+# modification of grid global attributes
+python ../inputs/grid_change.py $gridfile
+#----------------------------------------------------------------------------
+
+# Correction of remapped ozone file
+
+ncrename -d plev,level ape_O3_remapped.nc
+ncrename -v plev,level ape_O3_remapped.nc
+
+# clean 
+rm extpar_remapped.nc 
+
+
+
+
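The two `ncrename` calls above rename the pressure dimension and its coordinate variable from `plev` to `level`. For illustration, the same dimension-and-variable rename can be sketched with xarray on a toy dataset (the dataset below is a stand-in, not the actual remapped ozone file):

```python
import numpy as np
import xarray as xr

# Toy stand-in for ape_O3_remapped.nc: O3 on a small pressure/cell grid
ds = xr.Dataset(
    {"O3": (("plev", "ncells"), np.full((3, 4), 1e-6))},
    coords={"plev": [100.0, 500.0, 1000.0]},
)

# One rename covers both ncrename calls: the dimension and the
# coordinate variable named "plev" both become "level"
renamed = ds.rename({"plev": "level"})

assert "level" in renamed.dims and "plev" not in renamed.dims
```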
diff --git a/sims/preprocessing_scripts/2.iconnwp_4iconnest_init_lc1.sh b/sims/preprocessing_scripts/2.iconnwp_4iconnest_init_lc1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..3cca85d73fc538a6fc8519deeec6103174d8be89
--- /dev/null
+++ b/sims/preprocessing_scripts/2.iconnwp_4iconnest_init_lc1.sh
@@ -0,0 +1,162 @@
+#! /bin/bash
+set -x
+ulimit -s unlimited
+#=============================================================================
+# OpenMP environment variables
+# ----------------------------
+export OMP_NUM_THREADS=4
+export ICON_THREADS=$OMP_NUM_THREADS
+export OMP_SCHEDULE="guided"
+export OMP_DYNAMIC="false"
+export OMP_STACKSIZE=500M
+#
+# MPI variables
+# -------------
+no_of_nodes=${SLURM_JOB_NUM_NODES:=1}
+mpi_procs_pernode=$((128 / OMP_NUM_THREADS))
+((mpi_total_procs=no_of_nodes * mpi_procs_pernode))
+
+ulimit -s $((4 * 1024 * 1024))
+ulimit -c 0
+
+
+# Loading modules
+
+module purge
+module load cdo/2.0.5-gcc-11.2.0
+module load python3/2022.01-gcc-11.2.0
+module load nco/5.0.6-gcc-11.2.0
+
+module list
+
+#--------------------------------------------------------------------------------------
+# ICONTOOLS directory
+
+ICONTOOLS_DIR=/home/b/b381185/dwd_icon_tools/icontools
+
+BINARY_ICONSUB=iconsub
+BINARY_REMAP=iconremap
+BINARY_GRIDGEN=icongridgen
+
+#=============================================================================
+# here choose the grid for different LEM domains
+# dom01: shallow cumulus clouds   lon_center=25.0, lat_center=40.0
+# dom02: WCB ascent region        lon_center=40.0, lat_center=44.0
+# dom03: WCB cyclonic outflow     lon_center=30.0, lat_center=53.0
+# dom04: WCB anticyclonic outflow  lon_center=50.0, lat_center=48.0
+#=============================================================================
+
+EXPNAME=plane_nest_300m_r6x6_2mom_25_40_0001
+
+gridfile=raggedOrthogonal_471x667_300_with_boundary_dom01.nc
+
+INDATDIR=/work/bb1135/icon_output/LC1-channel-4000x9000km-2km-0002  # the folder for the icon-nwp input
+OUTDATDIR=/work/bb1135/LES_Simulations/initial_conditions/${EXPNAME}    # the folder for the remapped output
+
+in_grid_File=/work/bb1135/icon_output/LC1-channel-4000x9000km-2km-0002/grid_DOM01.nc
+out_grid_File=/work/bb1135/LES_Simulations/initial_conditions/${EXPNAME}/$gridfile
+
+in_data_File=${INDATDIR}/icon-fg_ML_202201
+out_data_File=${OUTDATDIR}/icon-fg_ML_nest
+
+### interpolation method
+# 2 -> interpolation : conservative (gives problems in the outermost cells in HDCP2-DE)
+# 3 -> interpolation : rbf: scalar  (RBF method, needs correct values for rbf_scale_scalar)
+intp_method=3
+
+# abort if the output directory is not set
+if [ ${OUTDATDIR}.notset = .notset ]; then
+    echo "OUTDATDIR not set"
+    exit 1
+fi
+# create the experiment output directory, if not already there
+if [ ! -d $OUTDATDIR ]; then
+    mkdir -p $OUTDATDIR
+fi
+#
+cd $OUTDATDIR
+#
+
+for day in 05 ; do
+
+for hour in 06 ; do 
+
+rm -f indata.nc indata-vn.nc
+cp ${in_data_File}${day}T${hour}0030Z.nc indata.nc
+# add grid to in_data_File, needed for vn interpolation
+# ATTENTION: make sure to use cdo/1.7.0-magicsxx-gcc48; other cdo versions
+# can lead to a crash of the vn remapping with the dwd icontools
+
+cdo -P 32 setgrid,${in_grid_File} -selname,vn ${in_data_File}${day}T${hour}0030Z.nc indata-vn.nc
+
+# create the remapping namelist (template obtained from Matthias Brueck)
+# ------------------------
+cat > ${OUTDATDIR}/tmp.nml << REMAP_NML_EOF
+! REMAPPING NAMELIST FILE
+!
+&remap_nml
+ in_grid_filename   = '${in_grid_File}'
+ in_filename        = 'indata.nc'
+ in_type            = 2
+ out_grid_filename  = '${out_grid_File}'
+ out_filename       = 'outdata.nc'
+ out_type           = 2
+ out_filetype       = 5
+ !s_maxsize         = 1000000
+ lsynthetic_grid    = .FALSE.
+/
+REMAP_NML_EOF
+
+for field in  w rho theta_v qv qc qi qr qs tke u v pres_sfc temp pres z_ifc t_2m td_2m u_10m v_10m fr_land gz0 t_g t_ice h_ice alb_si qv_s fr_seaice t_sk t_seasfc w_i t_so w_so w_so_ice t_snow w_snow rho_snow h_snow freshsnow snowfrac_lc rho_snow_mult t_snow_mult wliq_snow wtot_snow dzh_snow ; do 
+
+cat >> ${OUTDATDIR}/tmp.nml << REMAP_NML_EOF 
+! 
+&input_field_nml  
+ inputname      = "${field}"  
+ outputname     = "${field}"  
+ intp_method    = ${intp_method} 
+/ 
+REMAP_NML_EOF
+
+done
+
+cat > ${OUTDATDIR}/tmp-vn.nml << REMAP_NML_EOF
+! REMAPPING NAMELIST FILE
+!
+&remap_nml
+ in_grid_filename   = '${in_grid_File}'
+ in_filename        = 'indata-vn.nc'
+ in_type            = 2
+ out_grid_filename  = '${out_grid_File}'
+ out_filename       = 'outdata-vn.nc'
+ out_type           = 2
+ out_filetype       = 5
+ !s_maxsize         = 1000000
+ lsynthetic_grid    = .FALSE.
+/
+! DEFINITION FOR INPUT DATA FIELD
+!
+&input_field_nml
+ inputname      = "vn"
+ outputname     = "vn"
+!! intp_method    = ${intp_method}
+/
+REMAP_NML_EOF
+
+# Hint: option -vvvvvvvvvvvv activates a lot of diagnostic output
+#${START} ${ICONTOOLS_DIR}/${BINARY_REMAP} --remap_nml ${OUTDATDIR}/tmp.nml    2>&1
+#${START} ${ICONTOOLS_DIR}/${BINARY_REMAP} --remap_nml ${OUTDATDIR}/tmp-vn.nml 2>&1
+
+${ICONTOOLS_DIR}/${BINARY_REMAP} --remap_nml ${OUTDATDIR}/tmp.nml    2>&1
+${ICONTOOLS_DIR}/${BINARY_REMAP} --remap_nml ${OUTDATDIR}/tmp-vn.nml 2>&1
+
+# merge interpolated output
+cdo -O merge outdata.nc outdata-vn.nc ${out_data_File}_202201${day}T${hour}0030Z.nc
+
+# clean up
+rm -f indata.nc indata-vn.nc outdata.nc outdata-vn.nc
+
+done # end of loop over times (hours)
+
+done
+##cd ${RUNSCRIPTDIR}
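The input files consumed by the day/hour loop above follow a fixed time-stamp convention (`202201` + day + `T` + hour + `0030Z.nc`). A standalone sketch of that name assembly (the path prefix is dropped here as a placeholder):

```shell
#!/bin/bash
# Reconstruct the input file name the loop copies for day=05, hour=06
in_data_File=icon-fg_ML_202201   # placeholder without the INDATDIR path
day=05
hour=06
name="${in_data_File}${day}T${hour}0030Z.nc"
echo "$name"   # icon-fg_ML_20220105T060030Z.nc
```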
diff --git a/sims/preprocessing_scripts/3.iconnwp_4iconnest_latbc_lc1.sh b/sims/preprocessing_scripts/3.iconnwp_4iconnest_latbc_lc1.sh
new file mode 100755
index 0000000000000000000000000000000000000000..f66b7e6654e0ab1798c99410b5e160dd2620cc8d
--- /dev/null
+++ b/sims/preprocessing_scripts/3.iconnwp_4iconnest_latbc_lc1.sh
@@ -0,0 +1,139 @@
+#! /bin/bash
+set -x
+ulimit -s unlimited
+#=============================================================================
+# OpenMP environment variables
+# ----------------------------
+export OMP_NUM_THREADS=4
+export ICON_THREADS=$OMP_NUM_THREADS
+export OMP_SCHEDULE="guided"
+export OMP_DYNAMIC="false"
+export OMP_STACKSIZE=500M
+#
+# MPI variables
+# -------------
+no_of_nodes=${SLURM_JOB_NUM_NODES:=1}
+mpi_procs_pernode=$((128 / OMP_NUM_THREADS))
+((mpi_total_procs=no_of_nodes * mpi_procs_pernode))
+
+ulimit -s $((4 * 1024 * 1024))
+ulimit -c 0
+
+
+# Loading modules
+
+module purge
+module load cdo/2.0.5-gcc-11.2.0
+module load python3/2022.01-gcc-11.2.0
+module load nco/5.0.6-gcc-11.2.0
+
+module list
+
+#--------------------------------------------------------------------------------------
+# ICONTOOLS directory
+
+ICONTOOLS_DIR=/home/b/b381185/dwd_icon_tools/icontools
+
+BINARY_ICONSUB=iconsub
+BINARY_REMAP=iconremap
+BINARY_GRIDGEN=icongridgen
+
+#=============================================================================
+# here choose the grid for different LEM domains
+# dom01: shallow cumulus clouds   lon_center=25.0, lat_center=40.0
+# dom02: WCB ascent region        lon_center=40.0, lat_center=44.0
+# dom03: WCB cyclonic outflow     lon_center=30.0, lat_center=53.0
+# dom04: WCB anticyclonic outflow  lon_center=50.0, lat_center=48.0
+#=============================================================================
+
+EXPNAME=plane_nest_300m_r6x6_2mom_25_40_0001
+
+gridfile=raggedOrthogonal_471x667_300_with_boundary_dom01.nc
+
+INDATDIR=/work/bb1135/icon_output/LC1-channel-4000x9000km-2km-0002  # the folder for the icon-nwp input
+OUTDATDIR=/work/bb1135/LES_Simulations/initial_conditions/${EXPNAME}    # the folder for the remapped output
+
+in_grid_File=/work/bb1135/icon_output/LC1-channel-4000x9000km-2km-0002/grid_DOM01.nc
+out_grid_File=/work/bb1135/LES_Simulations/initial_conditions/${EXPNAME}/$gridfile
+
+in_data_File=${INDATDIR}/icon-fg_ML_202201
+out_data_File=${OUTDATDIR}/icon-fg_ML_nest
+
+
+### interpolation method
+# 2 -> interpolation : conservative (gives problems in the outermost cells in HDCP2-DE)
+# 3 -> interpolation : rbf: scalar  (RBF method, needs correct values for rbf_scale_scalar)
+intp_method=3
+
+# abort if the output directory is not set
+if [ ${OUTDATDIR}.notset = .notset ]; then
+    echo "OUTDATDIR not set"
+    exit 1
+fi
+# create the experiment output directory, if not already there
+if [ ! -d $OUTDATDIR ]; then
+    mkdir -p $OUTDATDIR
+fi
+#
+cd $OUTDATDIR
+#
+
+year=2022
+month=01
+
+for day in 05 ; do
+
+for hour in 06 07 08 09 10 11 12 13 14 15; do
+
+#rm -f indata.nc indata-vn.nc
+#cp ${in_data_File}${day}T${hour}0030Z.nc indata.nc
+
+# create the remapping namelist (template obtained from Matthias Brueck)
+# ------------------------
+cat > ${OUTDATDIR}/tmp.nml << REMAP_NML_EOF
+! REMAPPING NAMELIST FILE
+!
+&remap_nml
+ in_grid_filename   = '${in_grid_File}'
+ in_filename        = '${in_data_File}${day}T${hour}0030Z.nc'
+ in_type            = 2
+ out_grid_filename  = '${out_grid_File}'
+ out_filename       = 'outdata.nc'
+ out_type           = 2
+ out_filetype       = 5
+ !s_maxsize         = 1000000
+ lsynthetic_grid    = .FALSE.
+/
+REMAP_NML_EOF
+
+for field in z_ifc U V W THETA_V RHO QV QC QI QR QS ; do 
+
+cat >> ${OUTDATDIR}/tmp.nml << REMAP_NML_EOF 
+! 
+&input_field_nml  
+ inputname      = "${field}"  
+ outputname     = "${field}"  
+ intp_method    = ${intp_method} 
+/ 
+REMAP_NML_EOF
+
+done
+
+#${START} ${ICONTOOLS_DIR}/${BINARY_REMAP} --remap_nml ${OUTDATDIR}/tmp.nml    2>&1
+${ICONTOOLS_DIR}/${BINARY_REMAP} --remap_nml ${OUTDATDIR}/tmp.nml    2>&1
+
+#${START}${ICONTOOLS_DIR}/${BINARY_REMAP} --remap_nml ${OUTDATDIR}/tmp-vn.nml 2>&1
+
+
+# merge interpolated output
+#cdo -O merge outdata.nc outdata-vn.nc ${out_data_File}_${year}-${month}-${day}T${hour}.nc
+
+mv outdata.nc ${out_data_File}_${year}-${month}-${day}T${hour}.nc
+
+# clean up
+#rm -f indata.nc 
+
+done # end of loop over times (hours)
+
+done
+##cd ${RUNSCRIPTDIR}
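The per-field heredoc loop used in both remap scripts appends one `&input_field_nml` group per field name. The pattern can be reduced to a standalone sketch (the file name `tmp_fields.nml` and the field list are placeholders):

```shell
#!/bin/bash
# Append one &input_field_nml group per field, the same pattern the
# remap scripts use to build their field-namelist files
out=tmp_fields.nml
: > "$out"                       # start from an empty file
for field in U V W QV; do
cat >> "$out" << EOF
&input_field_nml
 inputname      = "${field}"
 outputname     = "${field}"
 intp_method    = 3
/
EOF
done
grep -c '&input_field_nml' "$out"   # one group per field
```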
diff --git a/sims/preprocessing_scripts/inputs/ape_o3_R2B04_1Pa_cell.t63grid.nc b/sims/preprocessing_scripts/inputs/ape_o3_R2B04_1Pa_cell.t63grid.nc
new file mode 100644
index 0000000000000000000000000000000000000000..3ab36a5f7056f1ffeb8807fede792752df55106a
Binary files /dev/null and b/sims/preprocessing_scripts/inputs/ape_o3_R2B04_1Pa_cell.t63grid.nc differ
diff --git a/sims/preprocessing_scripts/inputs/extpar_helper.py b/sims/preprocessing_scripts/inputs/extpar_helper.py
new file mode 100644
index 0000000000000000000000000000000000000000..d7d12df0b5cf5cf674950b4508fd2d395a8c966b
--- /dev/null
+++ b/sims/preprocessing_scripts/inputs/extpar_helper.py
@@ -0,0 +1,33 @@
+import xarray as xr
+import numpy as np
+
+ds = xr.open_dataset('extpar_remapped.nc')
+
+# ICON-style time stamps for the 12 climatological months
+time_array = np.array([1.111011e+07, 1.111021e+07, 1.111031e+07, 1.111041e+07,
+                       1.111051e+07, 1.111061e+07, 1.111071e+07, 1.111081e+07,
+                       1.111091e+07, 1.111101e+07, 1.111111e+07, 1.111121e+07])
+
+# Duplicate the single-time dataset once per month, stamping each copy
+# with the corresponding time value
+newds_list = []
+for ind in range(12):
+    temp = ds.copy(deep=True)
+    temp['time'] = temp['time']*0 + time_array[ind]
+    newds_list.append(temp)
+
+# Datasets with disjoint time values merge into one 12-step time axis
+newds = xr.merge(newds_list)
+
+# SOILTYP is an index field; cast it back to integer after remapping
+newds['SOILTYP'] = newds['SOILTYP'].astype('int32', copy=True)
+
+# Restore the global and per-variable attributes lost in the merge
+newds.attrs = ds.attrs
+for var in ds.data_vars.keys():
+    newds[var].attrs = ds[var].attrs
+
+# Uniform climatological deep-soil temperature for the aquaplanet setup
+newds['T_CL'][:] = 285
+
+newds.to_netcdf('extpar_remapped_12_months.nc')
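The time-axis expansion in `extpar_helper.py` relies on `xr.merge` combining copies with disjoint time stamps into a single 12-step axis. A self-contained sketch of that behavior on a toy dataset (the variable name and values below are illustrative, not the real extpar fields):

```python
import numpy as np
import xarray as xr

# Toy single-time field standing in for the remapped extpar data
ds = xr.Dataset(
    {"NDVI": (("time", "cell"), np.full((1, 4), 0.5))},
    coords={"time": [0.0]},
)

# Stamp twelve deep copies with distinct time values, as the script does
monthly = []
for m in range(12):
    tmp = ds.copy(deep=True)
    tmp["time"] = tmp["time"] * 0 + (1.111011e7 + m * 100)  # same stamp spacing
    monthly.append(tmp)

# Disjoint time coordinates merge into one 12-step time dimension,
# with each copy filling its own slot
expanded = xr.merge(monthly)
assert expanded.sizes["time"] == 12
```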
diff --git a/sims/preprocessing_scripts/inputs/grid_change.py b/sims/preprocessing_scripts/inputs/grid_change.py
new file mode 100644
index 0000000000000000000000000000000000000000..63336a16ee94cd77a81f0ae39f0f01424d787e33
--- /dev/null
+++ b/sims/preprocessing_scripts/inputs/grid_change.py
@@ -0,0 +1,12 @@
+import xarray as xr
+import sys
+
+# Usage: python grid_change.py <gridfile>
+grid = xr.open_dataset(sys.argv[1])
+
+# Adjust the global grid attributes expected by the nested channel setup:
+# set the grid-geometry flag and triple the domain length
+grid.attrs['grid_geometry'] = 3.
+grid.attrs['domain_length'] = grid.attrs['domain_length']*3
+
+grid.to_netcdf('./new_'+str(sys.argv[1]))
diff --git a/sims/preprocessing_scripts/inputs/icon_extpar_0010_R02B04_G_aquaplanet.nc b/sims/preprocessing_scripts/inputs/icon_extpar_0010_R02B04_G_aquaplanet.nc
new file mode 100644
index 0000000000000000000000000000000000000000..c6774c9dc41d738ead93452afdf209fe691b42de
Binary files /dev/null and b/sims/preprocessing_scripts/inputs/icon_extpar_0010_R02B04_G_aquaplanet.nc differ
diff --git a/sims/preprocessing_scripts/inputs/icon_grid_0010_R02B04_G.nc b/sims/preprocessing_scripts/inputs/icon_grid_0010_R02B04_G.nc
new file mode 100644
index 0000000000000000000000000000000000000000..7325b1d31f823c96675f965d0e4681b2c27d85b7
Binary files /dev/null and b/sims/preprocessing_scripts/inputs/icon_grid_0010_R02B04_G.nc differ
diff --git a/sims/preprocessing_scripts/map_file.latbc b/sims/preprocessing_scripts/map_file.latbc
new file mode 100644
index 0000000000000000000000000000000000000000..e8e34cd89f3f66d61760f92cbe10e665a4be2425
--- /dev/null
+++ b/sims/preprocessing_scripts/map_file.latbc
@@ -0,0 +1,21 @@
+# Dictionary for mapping between internal names and GRIB2 shortNames
+# needed by GRIB2 read procedures.
+#
+# internal name     GRIB2 shortName
+u                   u
+v                   v
+w                   w
+temp                temp
+pres                pres
+qv                  qv
+qc                  qc
+qi                  qi
+qr                  qr
+qs                  qs
+pres_sfc            LNPS
+#z_ifc               HHL
+vn                  VN
+GEOSP               GEOSP
+GEOP_ML             GEOP_ML
+theta_v             theta_v
+rho                 rho
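For reference, a dictionary file in this two-column format can be parsed with a few lines of Python (this is an illustration only, not ICON's actual GRIB2 read procedure; the sample entries mirror the file above):

```python
from io import StringIO

# Toy two-column dictionary in the same format as map_file.latbc
sample = StringIO(
    "# internal name     GRIB2 shortName\n"
    "u                   u\n"
    "pres_sfc            LNPS\n"
    "#z_ifc               HHL\n"
)

mapping = {}
for line in sample:
    line = line.strip()
    if not line or line.startswith("#"):
        continue  # skip comments and blank lines
    internal, short = line.split()
    mapping[internal] = short

assert mapping == {"u": "u", "pres_sfc": "LNPS"}
```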
diff --git a/sims/runscripts/README b/sims/runscripts/README
deleted file mode 100644
index 37c3ad1b70c71d0c8527b49bdc0fef3c9d3ca97f..0000000000000000000000000000000000000000
--- a/sims/runscripts/README
+++ /dev/null
@@ -1,9 +0,0 @@
-simulations
-
-LC1-channel-4000x9000km-2km-0002        : cloud radiaton with two-moment microphysics, ecrad
-
-LC1-LES-471x667km-lon40-lat44-300m-0003 : cloud radiaton with two-moment microphysics, ecrad LATBC: NWP_0002
-LC1-LES-471x667km-lon50-lat48-300m-0004 : cloud radiaton with two-moment microphysics, ecrad LATBC: NWP_0002
-LC1-LES-471x667km-lon30-lat53-300m-0005 : cloud radiaton with two-moment microphysics, ecrad LATBC: NWP_0002
-LC1-LES-471x667km-lon25-lat40-300m-0006 : cloud radiaton with two-moment microphysics, ecrad LATBC: NWP_0002
-
diff --git a/sims/runscripts/README.md b/sims/runscripts/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..28fbf3289894f190b85484dfed21386f020e4684
--- /dev/null
+++ b/sims/runscripts/README.md
@@ -0,0 +1,13 @@
+**ICON simulations**
+
+*Baroclinic life cycle simulation with ICON-NWP*
+
+LC1-channel-4000x9000km-2km-0002        : Only with cloud radiation, two-moment microphysics, and ecrad
+
+*Large eddy model simulations with ICON-LEM*
+
+LC1-LES-471x667km-lon40-lat44-300m-0003 : Only with cloud radiation, two-moment microphysics, ecrad, LATBC: NWP_0002
+LC1-LES-471x667km-lon50-lat48-300m-0004 : Only with cloud radiation, two-moment microphysics, ecrad, LATBC: NWP_0002
+LC1-LES-471x667km-lon30-lat53-300m-0005 : Only with cloud radiation, two-moment microphysics, ecrad, LATBC: NWP_0002
+LC1-LES-471x667km-lon25-lat40-300m-0006 : Only with cloud radiation, two-moment microphysics, ecrad, LATBC: NWP_0002
+