Commit 6d9d62e3 authored by Michael Blaschek

Update README.md, Jet-Cluster.md, SSH-VPN-VNC/README.md, Python/Your-First-Notebook-onJet_v2.ipynb, SRVX8.md, SRVX2.md, SRVX1.md

parent eba76118
# Getting Started
Welcome to the HPC @IMG @UNIVIE. Please follow these steps to become a productive member of our department and make good use of the computer resources.
**Efficiency is key.**
1. Connect to Jet
2. Load environment (libraries, compilers, interpreter, tools)
3. Checkout Code, Program, Compile, Test
4. Submit to compute Nodes
[How to SSH / VNC / VPN](SSH-VPN-VNC/README.md)

## SSH
- [x] Terminal, Putty, ...
- [x] Network access @UNIVIE

Use a terminal or [Putty](https://www.putty.org/) to connect to the server via SSH:
```bash
ssh [user]@jet01.img.univie.ac.at
```
Replace `[user]` with the username given by the [sysadmin](mailto:michael.blaschek@univie.ac.at) and supply your password when asked. This gives you access to the login node. If you are outside the university network, please go to [VPN](#VPN) below.

## VPN
- [x] `u:account`

The Jet cluster is only accessible from within the virtual network of the University of Vienna. Access from outside therefore has to be granted via the [VPN-Service](https://vpn.univie.ac.at). Go there, log in with your `u:account`, and download the *Big-IP Edge* client for your system.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/01_download_neu.png)
Links:
* [ZID-VPN](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/)
* Linux (Ubuntu, generic), Windows, Mac: [VPN user guides](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/anleitungen/)
* Arch-based AUR package: [AUR f5fpc](https://aur.archlinux.org/packages/f5fpc/)

Follow the install instructions for Windows, Mac and Linux and make sure the software works.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/08_verbinden.png)
On Windows and Mac you get a GUI that asks for the VPN server (`vpn.univie.ac.at`) and the username and password of your `u:account`. On Linux execute the following:
```bash
f5fpc -s -t vpn.univie.ac.at -u [user]
```
The connection status can be checked with `f5fpc --info`.

## VNC
The login nodes (`jet01` and `jet02`) allow running a `VNC` server.
:construction:
## Jupyterhub
<img src="https://jupyter.org/assets/hublogo.svg" width="300px">

Login with your jet credentials, choose a job, and JupyterLab will be launched.

<img src="Documentation/jet-job2.png" width="500px">

There are several kernels available and some help can be found:
- [Python/](Python/)
- [Tutorial on Jet](Python/Your-First-Notebook-onJet_v2.ipynb)
## User Quotas and Restrictions
Node Setup
| Name | Value |
| --- | --- |
| Product | ThinkSystem SR650 |
| Distro | RedHatEnterprise 8.2 Ootpa |
| Kernel | 4.18.0-147.el8.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz |
| Cores | 2 CPU, 20 physical cores per CPU, total 80 logical CPU units |
| BaseBoard | Lenovo -\[7X06CTO1WW\]- Intel Corporation C624 Series Chipset |
| CPU Time | 350 kh |
| Memory | 755 GB Total |
| Memory/Core | 18.9 GB |
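The Memory/Core row follows directly from the two rows above it (total memory divided by the 40 physical cores); as a quick sanity check:

```python
# Memory per physical core on a Jet node: total RAM / physical cores
total_memory_gb = 755      # "Memory" row above
physical_cores = 2 * 20    # "Cores" row above: 2 CPUs x 20 cores each

mem_per_core = total_memory_gb / physical_cores
print(mem_per_core)  # 18.875, i.e. the ~18.9 GB listed above
```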
Global file system (GPFS) is present on all nodes with about 1 PB (~1000 TB) of ...
The typical installation of an Intel cluster has the Intel compiler suite (`intel-parallel-studio`) and the open-source GNU compilers installed. Based on these two compilers (`intel`, `gnu`), there are usually two versions of each scientific software package.
Major Libraries:
- OpenMPI (3.1.6, 4.0.5)
- HDF5
- NetCDF (C, Fortran)
- ECCODES from [ECMWF](https://confluence.ecmwf.int/display/ECC)
- Math libraries, e.g. intel-mkl, lapack, scalapack
These software libraries are usually handled by environment modules.

<img src="http://modules.sourceforge.net/modules_red.svg" width="300px">

## Currently installed modules
```bash
$ module av
------------------------------------ /jetfs/spack/share/spack/modules/linux-rhel8-haswell -------------------------------------
intel-parallel-studio/composer.2017.7-intel-17.0.7-disfj2g

--------------------------------- /jetfs/spack/share/spack/modules/linux-rhel8-skylake_avx512 ---------------------------------
anaconda2/2019.10-gcc-8.3.1-5pou6ji                          nco/4.9.3-intel-20.0.2-dhlqiyo
anaconda3/2019.10-gcc-8.3.1-tmy5mgp                          ncview/2.1.8-gcc-8.3.1-s2owtzw
anaconda3/2020.07-gcc-8.3.1-weugqkf                          ncview/2.1.8-intel-20.0.2-3taqdda
anaconda3/2020.11-gcc-8.3.1-gramgir                          netcdf-c/4.7.4-gcc-8.3.1-fh4nn6k
cdo/1.9.8-gcc-8.3.1-ipgvzeh                                  netcdf-c/4.7.4-gcc-8.3.1-MPI3.1.6-y5gknpt
eccodes/2.18.0-gcc-8.3.1-s7clum3                             netcdf-c/4.7.4-intel-20.0.2-337uqtc
eccodes/2.18.0-intel-20.0.2-6tadpgr                          netcdf-fortran/4.5.3-gcc-8.3.1-kfd2vkj
enstools/2020.11.dev-gcc-8.3.1-fes7kgo                       netcdf-fortran/4.5.3-gcc-8.3.1-MPI3.1.6-rjcxqb6
esmf/7.1.0r-gcc-8.3.1-4fijz4q                                netcdf-fortran/4.5.3-intel-20.0.2-irdm5gq
gcc/8.3.1-gcc-8.3.1-pp3wjou                                  netlib-lapack/3.8.0-gcc-8.3.1-ue37lic
geos/3.8.1-gcc-8.3.1-o76leir                                 netlib-scalapack/2.1.0-gcc-8.3.1-pbkjymd
hdf5/1.10.7-gcc-8.3.1-MPI3.1.6-vm3avor                       openblas/0.3.10-gcc-8.3.1-ncess5c
hdf5/1.12.0-gcc-8.3.1-awl4atl                                openmpi/3.1.6-gcc-8.3.1-rk5av53
hdf5/1.12.0-intel-20.0.2-ezeotzr                             openmpi/3.1.6-intel-20.0.2-ubasrpk
intel-mkl/2020.3.279-gcc-8.3.1-5xeezjw                       openmpi/4.0.5-gcc-8.3.1-773ztsv
intel-mkl/2020.3.279-intel-20.0.2-m7bxged                    openmpi/4.0.5-intel-20.0.2-4wfaaz4
intel-oneapi-compilers/2021.2.0-oneapi-2021.2.0-6kdzddx      openmpi/4.0.5-oneapi-2021.2.0-hrfsxrd
intel-oneapi-mpi/2021.2.0-oneapi-2021.2.0-haqpxfl            parallel-netcdf/1.12.1-gcc-8.3.1-MPI3.1.6-gng2jcu
intel-parallel-studio/composer.2020.2-intel-20.0.2-zuot22y   parallel-netcdf/1.12.1-gcc-8.3.1-xxrhtxn
julia/1.5.2-gcc-8.3.1-3iwgkf7                                parallel-netcdf/1.12.1-intel-20.0.2-sgz3yqs
libemos/4.5.9-gcc-8.3.1-h3lqu2n                              perl/5.32.0-intel-20.0.2-2d23x7l
miniconda2/4.7.12.1-gcc-8.3.1-zduqggv                        proj/7.1.0-gcc-8.3.1-xcjaco5
miniconda3/4.8.2-gcc-8.3.1-3m7b6t2                           zlib/1.2.11-gcc-8.3.1-bbbpnzp
ncl/6.6.2-gcc-8.3.1-MPI3.1.6-3dxuv5f                         zlib/1.2.11-intel-20.0.2-3h374ov
nco/4.9.3-gcc-8.3.1-g7o6lao
```
Using [environment modules](https://modules.readthedocs.io/en/latest/) it is possible to have different software libraries (versions, compilers) side by side and ready to be loaded. Be aware that some libraries depend on others. It is recommended to load the highest-level library first to check which dependencies are loaded as well, e.g.:
```bash
$ module load eccodes/2.18.0-intel-20.0.2-6tadpgr
```
loads the `ECCODES` library and all its dependencies, e.g. the Intel compilers, as indicated by the naming.
```bash
$ module list
Currently Loaded Modulefiles:
 1) zlib/1.2.11-intel-20.0.2-3h374ov     3) hdf5/1.12.0-intel-20.0.2-ezeotzr              5) netcdf-c/4.7.4-intel-20.0.2-337uqtc
 2) openmpi/4.0.5-intel-20.0.2-4wfaaz4   4) parallel-netcdf/1.12.1-intel-20.0.2-sgz3yqs   6) eccodes/2.18.0-intel-20.0.2-6tadpgr
```
%% Cell type:markdown id: tags:
# Welcome to Jet
<img src='https://homepage.univie.ac.at/michael.blaschek/openfiles/rocket.png' width='100px'><img src='https://homepage.univie.ac.at/michael.blaschek/openfiles/uni.jpg' width='400px'>
You have done it! Your Notebook Server is running on the Jet Cluster.
Welcome! You can skip this introduction if you already know your way around; it is meant as a starting point and is not complete.
If you need more information about notebooks, have a look at [my Homepage](https://homepage.univie.ac.at/michael.blaschek/#python); there are also numerous tutorials on the internet. Feel free to ask me and to give me feedback.
Help can be found here:
* Go to Help -> JupyterLab Reference
* Go to Help -> Jupyter Reference
* [Tour](https://jupyterlab.readthedocs.io/en/stable/user/interface.html)
In the following I will show you some useful things.
First: this notebook is in your home directory, but it is also available on `/jetfs/scratch`.
%% Cell type:markdown id: tags:
## Check your current Host
%% Cell type:code id: tags:
``` python
!hostnamectl
```
%% Output
Static hostname: localhost.localdomain
Transient hostname: jet03.jet.local
Icon name: computer-server
Chassis: server
Operating System: CentOS Linux 8 (Core)
CPE OS Name: cpe:/o:centos:centos:8
Kernel: Linux 4.18.0-147.el8.x86_64
Architecture: x86-64
%% Cell type:markdown id: tags:
## Who is running this process
%% Cell type:code id: tags:
``` python
!whoami
```
%% Output
mblaschek
%% Cell type:code id: tags:
``` python
# Where are you
!pwd
```
%% Output
/jetfs/home/mblaschek/Documents
%% Cell type:code id: tags:
``` python
# Show my Current Jobs in the Queue
!squeue -u $(id -u) -l
```
%% Output
Fri Oct 16 23:10:58 2020
JOBID PARTITION NAME USER STATE TIME TIME_LIMI NODES NODELIST(REASON)
235 compute jupyters mblasche RUNNING 6:29:07 8:00:00 1 jet03
%% Cell type:markdown id: tags:
There you can see your `TIME_LIMIT` of 8 hours again and how much time has already passed on your current jobs.
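If you want those values programmatically (e.g. to stop work before the limit is hit), the `squeue` output can be split into columns. A minimal sketch, assuming the whitespace-separated layout shown above; `parse_squeue_row` is a hypothetical helper, not part of Slurm:

```python
# Parse one squeue data row into a dict keyed by the header columns
def parse_squeue_row(header, row):
    return dict(zip(header.split(), row.split()))

header = "JOBID PARTITION NAME USER STATE TIME TIME_LIMI NODES NODELIST(REASON)"
row = "235 compute jupyters mblasche RUNNING 6:29:07 8:00:00 1 jet03"

job = parse_squeue_row(header, row)
print(job["STATE"], job["TIME"], job["TIME_LIMI"])  # RUNNING 6:29:07 8:00:00
```

Note this simple split breaks if a field itself contains spaces; for robust scripting, `squeue --format` with an explicit delimiter is the safer route.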
%% Cell type:code id: tags:
``` python
# Import sys and check the executable (so what python you are using)
import sys
print(sys.executable)
```
%% Output
/jetfs/spack/opt/spack/linux-rhel8-skylake_avx512/gcc-8.3.1/miniconda3-4.8.2-3m7b6t2kgedyr3jnd2nasmgiq7wm27iv/bin/python
%% Cell type:markdown id: tags:
**Notice the funny path?**
Yes, this is SPACK, your nice scientific package manager.
It allows us to build custom libraries and applications tailored to our needs!
Please note that software currently installed in /opt on JET01 is not available here.
Software under /opt is deprecated and will be removed after a full transition
to the current libraries.
More information on [SPACK](https://spack.io/)
## Check your `PATH` environment variable, which can be useful to know
%% Cell type:code id: tags:
``` python
%env PATH
```
%% Output
'/jetfs/spack/opt/spack/linux-rhel8-skylake_avx512/gcc-8.3.1/miniconda3-4.8.2-3m7b6t2kgedyr3jnd2nasmgiq7wm27iv/bin:/bin:/jetfs/spack/bin:/jetfs/userservices:/opt/jupyterhub/bin:/opt/slurm/bin:/sbin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/share/Modules/bin::/jetfs/home/mblaschek/bin:/jetfs/home/mblaschek/.local/bin'
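`PATH` is just a colon-separated list (notice the empty entry from the `::` above); from Python you can inspect it entry by entry:

```python
import os

# Split PATH into individual directories, dropping empty entries (from "::")
entries = [p for p in os.environ.get("PATH", "").split(os.pathsep) if p]
for entry in entries:
    print(entry)
```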
%% Cell type:markdown id: tags:
## Check what modules are currently loaded in this Notebook session
%% Cell type:code id: tags:
``` python
!module list --no-pager
```
%% Output
Currently Loaded Modulefiles:
1) miniconda3/4.8.2-gcc-8.3.1-3m7b6t2
%% Cell type:markdown id: tags:
Put the file `load_modules_into_jupyter.conf` into your home directory and your notebook environment will have these modules loaded.
It is not possible to load these modules via magic (Python magic ;)).
**The file needs an empty line at the end...**
%% Cell type:code id: tags:
``` python
%%writefile ~/load_modules_into_jupyter.conf
eccodes/2.18.0-gcc-8.3.1-s7clum3
```
%% Output
Writing /jetfs/home/mblaschek/load_modules_into_jupyter.conf
%% Cell type:code id: tags:
``` python
!cat ~/load_modules_into_jupyter.conf
```
%% Output
eccodes/2.18.0-gcc-8.3.1-s7clum3
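Because the trailing empty line is easy to lose in editors, you can also write the file from Python and append the newline explicitly. A sketch using the same module as above:

```python
import os

modules = ["eccodes/2.18.0-gcc-8.3.1-s7clum3"]

conf = os.path.expanduser("~/load_modules_into_jupyter.conf")
with open(conf, "w") as f:
    # one module per line, plus the required empty line at the end
    f.write("\n".join(modules) + "\n\n")

# sanity check: the file really ends with an empty line
with open(conf) as f:
    assert f.read().endswith("\n\n")
```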
%% Cell type:markdown id: tags:
**Restarting the kernel will now show the loaded modules**
%% Cell type:code id: tags:
``` python
# after restarting
!module list --no-pager
```
%% Output
Currently Loaded Modulefiles:
1) miniconda3/4.8.2-gcc-8.3.1-3m7b6t2
2) openmpi/4.0.5-gcc-8.3.1-773ztsv
3) hdf5/1.12.0-gcc-8.3.1-awl4atl
4) parallel-netcdf/1.12.1-gcc-8.3.1-xxrhtxn
5) netcdf-c/4.7.4-gcc-8.3.1-fh4nn6k
6) eccodes/2.18.0-gcc-8.3.1-s7clum3
%% Cell type:markdown id: tags:
## Currently available Modules
%% Cell type:code id: tags:
``` python
# --no-pager is needed only inside notebook
!module avail --no-pager
```
%% Output
--------- /jetfs/spack/share/spack/modules/linux-rhel8-skylake_avx512 ----------
anaconda2/2019.10-gcc-8.3.1-5pou6ji
anaconda3/2019.10-gcc-8.3.1-tmy5mgp
anaconda3/2020.07-gcc-8.3.1-weugqkf
cdo/1.9.8-gcc-8.3.1-ipgvzeh
eccodes/2.18.0-gcc-8.3.1-s7clum3
hdf5/1.12.0-gcc-8.3.1-awl4atl
intel-mkl/2020.3.279-gcc-8.3.1-5xeezjw
miniconda2/4.7.12.1-gcc-8.3.1-zduqggv
miniconda3/4.8.2-gcc-8.3.1-3m7b6t2
netcdf-c/4.7.4-gcc-8.3.1-fh4nn6k
netcdf-fortran/4.5.3-gcc-8.3.1-kfd2vkj
openmpi/3.1.6-gcc-8.3.1-rk5av53
openmpi/4.0.5-gcc-8.3.1-773ztsv
parallel-netcdf/1.12.1-gcc-8.3.1-gng2jcu
parallel-netcdf/1.12.1-gcc-8.3.1-xxrhtxn
%% Cell type:markdown id: tags:
# Software on JET
Currently:
- [x] Anaconda (2,3)
- [x] Miniconda (2,3)
- [x] OPENMPI (4.0.5, 3.1.6) optimized for JET
- [x] HDF5 (1.12) optimized with OPENMPI
- [x] NetCDF (Parallel (NC3), NC4->HDF5)
- [x] Eccodes (2.18)
- [x] CDO
- [ ] Intel compiled Libraries (License)
- [ ] Emoslib
- [ ] Ecaccess
- [ ] RTTOV
%% Cell type:markdown id: tags:
## Currently there are a few choices:
1. Python (2, 3)
* Anaconda
* Miniconda
2. Julia (not yet)
3. R (not yet)
%% Cell type:markdown id: tags:
## Install Python packages
%% Cell type:code id: tags:
``` python
# Please use the --user flag to install the packages into .local/lib/python...
%pip install --user numpy scipy
```
%% Output
Requirement already satisfied: numpy in /jetfs/home/mblaschek/.local/lib/python3.8/site-packages (1.19.2)
Requirement already satisfied: scipy in /jetfs/home/mblaschek/.local/lib/python3.8/site-packages (1.5.2)
Note: you may need to restart the kernel to use updated packages.
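Rather than hard-coding the `python3.8` path when extending the search path, the user site directory can be queried from the standard `site` module; a sketch:

```python
import site
import sys

# Where `pip install --user` puts packages for this interpreter
user_site = site.getusersitepackages()
print(user_site)

# Make it importable in the running kernel if it is not already
if user_site not in sys.path:
    sys.path.append(user_site)
```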
%% Cell type:code id: tags:
``` python
import os
import sys
# Need to update search Path / only necessary here for the first time
sys.path.append(os.path.expandvars("$HOME/.local/lib/python3.8/site-packages"))
```
%% Cell type:code id: tags:
``` python
import numpy as np
import scipy as sp
# WORKS! ;)
```
%% Cell type:markdown id: tags:
## Using conda
%% Cell type:code id: tags:
``` python
!conda info
```
%% Output
active environment : None
user config file : /jetfs/home/mblaschek/.condarc
populated config files : /jetfs/home/mblaschek/.condarc
conda version : 4.8.5
conda-build version : not installed
python version : 3.8.1.final.0
virtual packages : __glibc=2.28
base environment : /jetfs/spack/opt/spack/linux-rhel8-skylake_avx512/gcc-8.3.1/miniconda3-4.8.2-3m7b6t2kgedyr3jnd2nasmgiq7wm27iv (read only)
channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/linux-64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /jetfs/spack/opt/spack/linux-rhel8-skylake_avx512/gcc-8.3.1/miniconda3-4.8.2-3m7b6t2kgedyr3jnd2nasmgiq7wm27iv/pkgs
/jetfs/home/mblaschek/.conda/pkgs
envs directories : /jetfs/home/mblaschek/.conda/envs
/jetfs/spack/opt/spack/linux-rhel8-skylake_avx512/gcc-8.3.1/miniconda3-4.8.2-3m7b6t2kgedyr3jnd2nasmgiq7wm27iv/envs
platform : linux-64
user-agent : conda/4.8.5 requests/2.22.0 CPython/3.8.1 Linux/4.18.0-147.el8.x86_64 centos/8.2.2004 glibc/2.28
UID:GID : 54212:100
netrc file : None
offline mode : False
%% Cell type:code id: tags:
``` python
# Create an environment with conda
!conda create -y -n myenv python=3.7 numpy ipykernel
```
%% Output
Collecting package metadata (current_repodata.json): done
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: done
## Package Plan ##
environment location: /jetfs/home/mblaschek/.conda/envs/myenv
added / updated specs:
- ipykernel
- numpy
- python=3.7
The following NEW packages will be INSTALLED:
_libgcc_mutex pkgs/main/linux-64::_libgcc_mutex-0.1-main
backcall pkgs/main/noarch::backcall-0.2.0-py_0
blas pkgs/main/linux-64::blas-1.0-mkl
ca-certificates pkgs/main/linux-64::ca-certificates-2020.10.14-0
certifi pkgs/main/linux-64::certifi-2020.6.20-py37_0
decorator pkgs/main/noarch::decorator-4.4.2-py_0
intel-openmp pkgs/main/linux-64::intel-openmp-2020.2-254
ipykernel pkgs/main/linux-64::ipykernel-5.3.4-py37h5ca1d4c_0
ipython pkgs/main/linux-64::ipython-7.18.1-py37h5ca1d4c_0
ipython_genutils pkgs/main/linux-64::ipython_genutils-0.2.0-py37_0
jedi pkgs/main/linux-64::jedi-0.17.2-py37_0
jupyter_client pkgs/main/noarch::jupyter_client-6.1.7-py_0
jupyter_core pkgs/main/linux-64::jupyter_core-4.6.3-py37_0
ld_impl_linux-64 pkgs/main/linux-64::ld_impl_linux-64-2.33.1-h53a641e_7
libedit pkgs/main/linux-64::libedit-3.1.20191231-h14c3975_1
libffi pkgs/main/linux-64::libffi-3.3-he6710b0_2
libgcc-ng pkgs/main/linux-64::libgcc-ng-9.1.0-hdf63c60_0
libsodium pkgs/main/linux-64::libsodium-1.0.18-h7b6447c_0
libstdcxx-ng pkgs/main/linux-64::libstdcxx-ng-9.1.0-hdf63c60_0
mkl pkgs/main/linux-64::mkl-2020.2-256
mkl-service pkgs/main/linux-64::mkl-service-2.3.0-py37he904b0f_0
mkl_fft pkgs/main/linux-64::mkl_fft-1.2.0-py37h23d657b_0
mkl_random pkgs/main/linux-64::mkl_random-1.1.1-py37h0573a6f_0
ncurses pkgs/main/linux-64::ncurses-6.2-he6710b0_1
numpy pkgs/main/linux-64::numpy-1.19.1-py37hbc911f0_0
numpy-base pkgs/main/linux-64::numpy-base-1.19.1-py37hfa32c7d_0
openssl pkgs/main/linux-64::openssl-1.1.1h-h7b6447c_0
parso pkgs/main/noarch::parso-0.7.0-py_0
pexpect pkgs/main/linux-64::pexpect-4.8.0-py37_1
pickleshare pkgs/main/linux-64::pickleshare-0.7.5-py37_1001
pip pkgs/main/linux-64::pip-20.2.3-py37_0
prompt-toolkit pkgs/main/noarch::prompt-toolkit-3.0.8-py_0
ptyprocess pkgs/main/linux-64::ptyprocess-0.6.0-py37_0
pygments pkgs/main/noarch::pygments-2.7.1-py_0
python pkgs/main/linux-64::python-3.7.9-h7579374_0
python-dateutil pkgs/main/noarch::python-dateutil-2.8.1-py_0
pyzmq pkgs/main/linux-64::pyzmq-19.0.2-py37he6710b0_1
readline pkgs/main/linux-64::readline-8.0-h7b6447c_0
setuptools pkgs/main/linux-64::setuptools-50.3.0-py37hb0f4dca_1
six pkgs/main/noarch::six-1.15.0-py_0
sqlite pkgs/main/linux-64::sqlite-3.33.0-h62c20be_0
tk pkgs/main/linux-64::tk-8.6.10-hbc83047_0
tornado pkgs/main/linux-64::tornado-6.0.4-py37h7b6447c_1
traitlets pkgs/main/noarch::traitlets-5.0.5-py_0
wcwidth pkgs/main/noarch::wcwidth-0.2.5-py_0
wheel pkgs/main/noarch::wheel-0.35.1-py_0
xz pkgs/main/linux-64::xz-5.2.5-h7b6447c_0
zeromq pkgs/main/linux-64::zeromq-4.3.3-he6710b0_3
zlib pkgs/main/linux-64::zlib-1.2.11-h7b6447c_3
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
# $ conda activate myenv
#
# To deactivate an active environment, use
#
# $ conda deactivate
%% Cell type:code id: tags:
``` python
!~/.conda/envs/myenv/bin/python -m ipykernel install --user --name MYENV --display-name "MYCONDA"
```
%% Output
Installed kernelspec MYENV in /jetfs/home/mblaschek/.local/share/jupyter/kernels/myenv
%% Cell type:code id: tags:
``` python
!jupyter kernelspec list
```
%% Output
Available kernels:
myenv /jetfs/home/mblaschek/.local/share/jupyter/kernels/myenv
python3 /jetfs/spack/opt/spack/linux-rhel8-skylake_avx512/gcc-8.3.1/miniconda3-4.8.2-3m7b6t2kgedyr3jnd2nasmgiq7wm27iv/share/jupyter/kernels/python3
%% Cell type:markdown id: tags:
## Custom Environment Python Environment
%% Cell type:code id: tags:
``` python
# Set up a virtual environment for your Python in the directory: myfancyenv
!python -m venv myfancyenv
```
%% Cell type:code id: tags:
``` python
# Install some packages into that environment;
# ipykernel is important to use it within jupyterlab/notebook
!./myfancyenv/bin/pip -q install numpy scipy pandas ipykernel
```
%% Output
WARNING: You are using pip version 19.2.3, however version 20.2.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
%% Cell type:code id: tags:
``` python
# Make this environment available via the Launcher (+) sign on the left
!./myfancyenv/bin/python -m ipykernel install --user --name fancy --display-name "My Python"
```
%% Output
Installed kernelspec fancy in /jetfs/home/mblaschek/.local/share/jupyter/kernels/fancy
%% Cell type:markdown id: tags:
Wait a bit and the new python environment should be available on your launchers.
%% Cell type:code id: tags:
``` python
!jupyter kernelspec list
```
%% Output
Available kernels:
fancy /jetfs/home/mblaschek/.local/share/jupyter/kernels/fancy
myenv /jetfs/home/mblaschek/.local/share/jupyter/kernels/myenv
python3 /jetfs/spack/opt/spack/linux-rhel8-skylake_avx512/gcc-8.3.1/miniconda3-4.8.2-3m7b6t2kgedyr3jnd2nasmgiq7wm27iv/share/jupyter/kernels/python3
%% Cell type:markdown id: tags:
There you go. Your personal environment.
# If you need help, feel free to ask me. Enjoy!
Missing solutions are very welcome.
[Go to Description](Jet-Cluster.md)
### Scientific Software
- GNU compilers, Intel compilers, OpenMPI, HDF5, NetCDF, Eccodes, ...
- Anaconda Python, Julia
### JupyterHub
[JET01-Jupyterhub](https://jet01.img.univie.ac.at)
Teaching and webaccess server
[Go to Description](SRVX1.md)
### Architecture
| Name | Value |
| --- | --- |
| Product | PowerEdge R720xd |
| Distro | CentOS 6.10 Final |
| Kernel | 2.6.32-754.29.2.el6.centos.plus.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) CPU E5-2690 0 @ 2.90GHz |
| Cores | 2 CPU, 8 physical cores per CPU, total 32 logical CPU units |
| BaseBoard | Dell Inc. 0C4Y3R Intel Corporation C600/X79 series chipset |
| CPU time | 140 kh |
| Memory | 190 GB Total |
| Memory/Core | 11.9 GB |
| Network | 10 Gbit/s |
### Scientific Software
- GNU compilers, OpenMPI, HDF5, NetCDF, Eccodes, ...
- Anaconda Python
```bash
$ module av
------------- /home/opt/spack/share/spack/modules/linux-centos6-sandybridge -------------
anaconda3/2020.07-gcc-5.3.0 hdf5/1.10.7-gcc-5.3.0 openmpi/3.1.6-gcc-5.3.0
eccodes/2.18.0-gcc-5.3.0 miniconda3/4.8.2-gcc-5.3.0 proj/7.1.0-gcc-5.3.0
gcc/5.3.0-gcc-5.3.0 netcdf-c/4.7.4-gcc-5.3.0 zlib/1.2.11-gcc-5.3.0
git/2.29.0-gcc-5.3.0 netcdf-fortran/4.5.3-gcc-5.3.0
```
### JupyterHub
Direct access: [SRVX1-Jupyterhub](https://srvx1.img.univie.ac.at)
Computing Node @UZA2
[Go to Description](SRVX2.md)
### Architecture
| Name | Value |
| --- | --- |
| Product | PowerEdge R940 |
| Distro | CentOS 8.2.2004 Core |
| Kernel | 4.18.0-147.el8.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz |
| Cores | 4 CPU, 20 physical cores per CPU, total 160 logical CPU units |
| BaseBoard | Dell Inc. 0D41HC Intel Corporation C621 Series Chipset |
| CPU time | 700 kh |
| Memory | 376 GB Total |
| Memory/Core | 9.4 GB |
### Software
- GNU compilers, Intel compilers, OpenMPI, HDF5, NetCDF, Eccodes, ...
- Anaconda Python
```bash
$ module av
----------- /home/spack-root/share/spack/modules/linux-centos8-skylake_avx512 -----------
anaconda3/2020.07-gcc-8.3.1 miniconda3/4.8.2-gcc-8.3.1
eccodes/2.18.0-intel-20.0.2 netcdf-c/4.7.4-gcc-8.3.1
eccodes/2.19.1-gcc-8.3.1 netcdf-c/4.7.4-intel-20.0.2
eccodes/2.19.1-intel-20.0.2 netcdf-fortran/4.5.3-gcc-8.3.1
hdf5/1.10.7-gcc-8.3.1 netcdf-fortran/4.5.3-intel-20.0.2
hdf5/1.10.7-intel-20.0.2 openmpi/3.1.6-gcc-8.3.1
intel-parallel-studio/composer.2020.2-intel-20.0.2 openmpi/3.1.6-intel-20.0.2
```
## Information on SRVX8 - Research
Researchers @UZA2

[Go to Description](SRVX8.md)
### Architecture
| Name | Value |
| --- | --- |
| Product | PowerEdge R730xd |
| Distro | CentOS 6.10 Final |
| Kernel | 2.6.32-754.29.2.el6.centos.plus.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) CPU E5-2697 v3 @ 2.60GHz |
| Cores | 2 CPU, 14 physical cores per CPU, total 56 logical CPU units |
| BaseBoard | Dell Inc. 0599V5 Intel Corporation C610/X99 series chipset |
| CPU time | 245 kh |
| Memory | 504 GB Total |
| Memory/Core | 18 GB |
| Network | 10 Gbit/s |
### Software
- GNU compilers, OpenMPI, HDF5, NetCDF, Eccodes, ...
- Anaconda Python
```bash
----------------- /opt/spack/share/spack/modules/linux-centos6-haswell/ -----------------
anaconda3/2020.07-gcc-5.3.0 netcdf-c/4.7.4-gcc-5.3.0
eccodes/2.18.0-gcc-5.3.0 netcdf-fortran/4.5.3-gcc-5.3.0
gcc/4.4.7-gcc-5.3.0 openmpi/3.1.6-gcc-5.3.0
gcc/5.3.0-gcc-5.3.0 proj/7.1.0-gcc-5.3.0
hdf5/1.10.7-gcc-5.3.0 python/3.8.5-gcc-5.3.0
miniconda3/4.8.2-gcc-5.3.0 zlib/1.2.11-gcc-5.3.0
```
## Information on VSC - Research
Researchers can use the private nodes at VSC-4.
Note:
> https://vsc.ac.at/training

[Go to Description](VSC.md)
### Architecture
| Name | Value |
| --- | --- |
| Product | ? |
| Distro | CentOS 7.6 |
| Kernel | 3.10.0-957.el7.x86_64 |
| Processor | Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz |
| Cores | 2 CPU, 20 physical cores per CPU, total 80 logical CPU units |
| BaseBoard | ? |
| CPU time | 2102 kh |
| Memory | 378 GB Total |
| Memory/Core | 13.5 GB |
| Network | 10 Gbit/s |
### Software
- GNU compilers, OpenMPI, HDF5, NetCDF, Eccodes (only GNU)
- Anaconda Python
- Matlab
## Getting Started
### SSH [How to SSH / VNC / VPN](SSH-VPN-VNC/README.md)
- [x] Terminal, Putty, ...
- [x] Network access @UNIVIE
Use a terminal or [Putty](https://www.putty.org/) to connect with SSH to the server
```bash
ssh [user]@srvx1.img.univie.ac.at
```
Please replace `[user]` with the username given by the [sysadmin](mailto:michael.blaschek@univie.ac.at) and supply your password when asked. This should give you access to the login node. If you are outside of the university network, please go to [VPN](#VPN) below.
### VPN
- [x] `u:account`
The jet cluster is only accessible from within the virtual network of the University of Vienna. Therefore, access from outside has to be granted via the [VPN-Service](https://vpn.univie.ac.at). Go there, log in with your `u:account`, and download the *Big-IP Edge* client for your system.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/01_download_neu.png)
Links:
* [ZID-VPN](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/)
* Linux (Ubuntu, Generic), Windows, Mac: [VPN user guides](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/anleitungen/)
* Arch based AUR package [AUR f5fpc](https://aur.archlinux.org/packages/f5fpc/)
Follow the install instructions for Windows, Mac and Linux and make sure the software works.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/08_verbinden.png)
On Windows and Mac you get a graphical client that asks for the VPN server (`vpn.univie.ac.at`) and the username and password of your `u:account`. On Linux, execute the following:
```bash
f5fpc -s -t vpn.univie.ac.at -u [user]
```
The status can be checked with `f5fpc --info`.
### Jupyterhub
<img src="https://jupyter.org/assets/hublogo.svg" width="300px">
Download/Use any of these required Authenticator Apps:
After registering, the teacher/admin has to grant you access before you can log in.
:construction:
## System Information
| Name | Value |
| --- | --- |
| Product | PowerEdge R720xd |
| Distro | CentOS 6.10 Final |
| Kernel | 2.6.32-754.29.2.el6.centos.plus.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) CPU E5-2690 0 @ 2.90GHz |
| Cores | 2 CPU, 8 physical cores per CPU, total 32 logical CPU units |
| BaseBoard | Dell Inc. 0C4Y3R Intel Corporation C600/X79 series chipset |
| CPU time | 140 kh |
| Memory | 190 GB Total |
| Memory/Core | 11.9 GB |
## Software
The typical installation of an Intel cluster has the Intel compiler suite (`intel-parallel-studio`) and the open-source GNU compilers installed. Based on these two compilers (`intel`, `gnu`), there are usually two versions of each scientific software package.
Major Libraries:
- OpenMPI (3.1.6, 4.0.5)
- HDF5
- NetCDF (C, Fortran)
- ECCODES from [ECMWF](https://confluence.ecmwf.int/display/ECC)
- Math libraries, e.g. intel-mkl, lapack, scalapack
- Interpreters: Python, Julia
- Tools: cdo, ncl, nco, ncview
These software libraries are usually handled by environment modules.
<img src="http://modules.sourceforge.net/modules_red.svg" width="300px">
## Currently installed modules
```
$ module av
-------------------------------- /home/opt/spack/share/spack/modules/linux-centos6-sandybridge --------------------------------
anaconda3/2020.07-gcc-5.3.0 gcc/5.3.0-gcc-5.3.0 netcdf-c/4.7.4-gcc-5.3.0 zlib/1.2.11-gcc-5.3.0
cdo/1.9.9-gcc-5.3.0 git/2.29.0-gcc-5.3.0 netcdf-fortran/4.5.3-gcc-5.3.0
eccodes/2.18.0-gcc-5.3.0 hdf5/1.10.7-gcc-5.3.0 openmpi/3.1.6-gcc-5.3.0
enstools/2020.11.dev-gcc-5.3.0 miniconda3/4.8.2-gcc-5.3.0 proj/7.1.0-gcc-5.3.0
```
Using [environment modules](https://modules.readthedocs.io/en/latest/) it is possible to have different software libraries (versions, compilers) side by side and ready to be loaded. Be aware that some libraries depend on others. It is recommended to load the highest-level library first and check which dependencies are loaded along with it, e.g.:
```
$ module load eccodes/2.18.0-gcc-5.3.0
Loading eccodes/2.18.0-gcc-5.3.0
Loading requirement: zlib/1.2.11-gcc-5.3.0 openmpi/3.1.6-gcc-5.3.0 hdf5/1.10.7-gcc-5.3.0 netcdf-c/4.7.4-gcc-5.3.0
```
loads the `ECCODES` library and all of its dependencies, built with the compiler (intel or gnu) indicated by the module name suffix.
```
$ module list
Currently Loaded Modulefiles:
1) zlib/1.2.11-gcc-5.3.0 3) hdf5/1.10.7-gcc-5.3.0 5) eccodes/2.18.0-gcc-5.3.0
2) openmpi/3.1.6-gcc-5.3.0 4) netcdf-c/4.7.4-gcc-5.3.0
```
`module list` shows the currently loaded modules and reports five modules in total: `ECCODES` plus its four dependencies. Thus, it is not necessary to load the other libraries manually, as they are pulled in as dependencies of `ECCODES`.
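The Modules tool also exports the currently loaded set in the `LOADEDMODULES` environment variable as a colon-separated list, which is handy in job scripts. A minimal sketch (the variable's value is hard-coded here for illustration, standing in for a real session):

```shell
# LOADEDMODULES holds the loaded modules as a colon-separated list;
# a hard-coded example value stands in for a real `module load` session
LOADEDMODULES="zlib/1.2.11-gcc-5.3.0:openmpi/3.1.6-gcc-5.3.0:eccodes/2.18.0-gcc-5.3.0"
# print one module per line, similar to what `module list` reports
echo "$LOADEDMODULES" | tr ':' '\n'
```

This is useful, for example, to record the exact software environment in a batch job's log file.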
## Best Practice

## Networking
# S R V X 2
[[_TOC_]]
## Getting Started
See: [How to SSH / VNC / VPN](SSH-VPN-VNC/README.md)
## System Information

| Name | Value |
| --- | --- |
| Product | PowerEdge R940 |
| Distro | CentOS 8.2.2004 Core |
| Kernel | 4.18.0-147.el8.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz |
| Cores | 4 CPU, 20 physical cores per CPU, total 160 logical CPU units |
| BaseBoard | Dell Inc. 0D41HC Intel Corporation C621 Series Chipset |
| CPU time | 700 kh |
| Memory | 376 GB Total |
| Memory/Core | 9.4 GB |
## Software
```
$ module av
------------------------------ /home/spack-root/share/spack/modules/linux-centos8-skylake_avx512 ------------------------------
anaconda3/2020.07-gcc-8.3.1 intel-parallel-studio/composer.2020.2-intel-20.0.2 netcdf-fortran/4.5.3-gcc-8.3.1
eccodes/2.18.0-intel-20.0.2 libemos/4.5.9-gcc-8.3.1 netcdf-fortran/4.5.3-intel-20.0.2
eccodes/2.19.1-gcc-8.3.1 miniconda3/4.8.2-gcc-8.3.1 netlib-lapack/3.8.0-gcc-8.3.1
eccodes/2.19.1-intel-20.0.2 miniconda3/4.9.2-gcc-8.3.1 netlib-scalapack/2.1.0-gcc-8.3.1
eccodes/2.21.0-intel-20.0.2 ncl/6.6.2-gcc-8.3.1 openblas/0.3.12-gcc-8.3.1
hdf5/1.10.7-gcc-8.3.1 netcdf-c/4.7.4-gcc-8.3.1 openmpi/3.1.6-gcc-8.3.1
hdf5/1.10.7-intel-20.0.2 netcdf-c/4.7.4-intel-20.0.2 openmpi/3.1.6-intel-20.0.2
```
## Best Practice

## Networking
## Getting Started
See: [How to SSH / VNC / VPN](SSH-VPN-VNC/README.md)
## System Information

| Name | Value |
| --- | --- |
| Product | PowerEdge R730xd |
| Distro | CentOS 6.10 Final |
| Kernel | 2.6.32-754.29.2.el6.centos.plus.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) CPU E5-2697 v3 @ 2.60GHz |
| Cores | 2 CPU, 14 physical cores per CPU, total 56 logical CPU units |
| BaseBoard | Dell Inc. 0599V5 Intel Corporation C610/X99 series chipset |
| CPU time | 245 kh |
| Memory | 504 GB Total |
| Memory/Core | 18 GB |
## Software
Software is installed in numerous places. This is a legacy system without a central software manager.
A reinstall is planned for summer 2021.
## Best Practice

## Networking
```
Host login
```
and replacing `[USERNAME]` and `[U:Account USERNAME]` with your usernames. Using such a file allows you to connect with `ssh srvx2`, using the correct server address and the specified username. Copy this file to `login.univie.ac.at` as well, and you can use commands like `ssh -t login ssh jet` to connect directly to `jet` via the `login` gateway.
If you want to use ssh-keys you can also use different keys in `.ssh/config` per server with `IdentityFile ~/.ssh/id_rsa_for_server`.
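Putting these pieces together, a minimal `~/.ssh/config` might look like the sketch below (usernames are placeholders; `ProxyJump` requires OpenSSH 7.3 or newer and is an alternative to `ssh -t login ssh jet`):

```
Host login
    HostName login.univie.ac.at
    User [U:Account USERNAME]

Host jet
    HostName jet01.img.univie.ac.at
    User [USERNAME]
    ProxyJump login
    IdentityFile ~/.ssh/id_rsa_for_server
```

With such a file in place, a plain `ssh jet` tunnels through the `login` gateway automatically.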
**From eduroam**: You should be able to log in as above.
For adding resolutions that match your display's resolution, have a look here: [add
Note: `$DISPLAY` is an environment variable that usually holds your VNC display number.
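A VNC server with display number N listens on TCP port 5900 + N, so the port can be derived from `$DISPLAY`. A minimal sketch, assuming a simple display value of the form `:N` (hard-coded here for illustration):

```shell
# a VNC DISPLAY such as ":1" maps to TCP port 5900 + 1 = 5901
DISPLAY=":1"                 # hard-coded example value
port=$((5900 + ${DISPLAY#:}))
echo "$port"                 # → 5901
```

This is handy when setting up an SSH tunnel to the VNC server, e.g. forwarding local port 5901.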
# VPN
Requirements:
- `u:account`
Some servers are only accessible from within the virtual network of the University of Vienna. Therefore, access from outside has to be granted via the [VPN-Service](https://vpn.univie.ac.at). Go there, log in with your `u:account`, and download the *Big-IP Edge* client for your system.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/01_download_neu.png)
Links:
* [ZID-VPN](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/)
* Linux (Ubuntu, Generic), Windows, Mac: [VPN user guides](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/anleitungen/)
* Arch based AUR package [AUR f5fpc](https://aur.archlinux.org/packages/f5fpc/)
Follow the install instructions for Windows, Mac and Linux and make sure the software works.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/08_verbinden.png)
On Windows and Mac you get a graphical client that asks for the VPN server (`vpn.univie.ac.at`) and the username and password of your `u:account`. On Linux, execute the following:
```bash
f5fpc -s -t vpn.univie.ac.at -u [user]
```
The status can be checked with `f5fpc --info`.
## Screen
[Screen](https://wiki.ubuntuusers.de/Screen/) is a terminal session manager that allows you to start processes and reconnect to them after a disconnection.