Commit 6d9d62e3 authored by Michael Blaschek

Update README.md, Jet-Cluster.md, SSH-VPN-VNC/README.md, Python/Your-First-Notebook-onJet_v2.ipynb, SRVX8.md, SRVX2.md, SRVX1.md

parent eba76118
# Getting Started
Welcome to the HPC @IMG @UNIVIE. Please follow these steps to become a productive member of our department and make good use of the computer resources.
**Efficiency is key.**
1. Connect to Jet
2. Load environment (libraries, compilers, interpreter, tools)
3. Checkout Code, Program, Compile, Test
4. Submit to compute Nodes
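The last step can be sketched as a minimal batch job script. This is a sketch under assumptions: the scheduler on the compute nodes is not named on this page (a SLURM-style scheduler is assumed here), the module name is copied from the installed modules listed later on this page, and `./my_program` is a placeholder for your own executable.

```bash
#!/bin/bash
# Minimal job-script sketch -- assumes a SLURM scheduler; adjust the
# directives and submission command to the scheduler actually in use.
#SBATCH --job-name=test
#SBATCH --ntasks=20
#SBATCH --time=01:00:00
module load openmpi/4.0.5-gcc-8.3.1-773ztsv   # load what you compiled with
srun ./my_program                             # placeholder executable
```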
## SSH
- [x] Terminal, Putty, ...
- [x] Network access @UNIVIE
Use a terminal or [Putty](https://www.putty.org/) to connect with SSH to the server
```bash
ssh [user]@jet01.img.univie.ac.at
```
Please replace `[user]` with the username given by the [sysadmin](mailto:michael.blaschek@univie.ac.at) and supply your password when asked. This should give you access to the login node. If you are outside of the university network, please go to [VPN](#vpn) below.
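Typing the full host name every time can be avoided with an entry in `~/.ssh/config` (the alias `jet` is just an example; pick any name):

```
Host jet
    HostName jet01.img.univie.ac.at
    User [user]
```

Afterwards `ssh jet` is enough. The same mechanism is described in more detail in [How to SSH / VNC / VPN](SSH-VPN-VNC/README.md).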
## VPN
- [x] `u:account`
The jet cluster is only accessible from within the virtual network of the University of Vienna. Therefore, access from outside has to be granted via the [VPN-Service](https://vpn.univie.ac.at). Go there, log in with your `u:account`, and download the *Big-IP Edge* client for your system.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/01_download_neu.png)
Links:
* [ZID-VPN](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/)
* Linux (Ubuntu, Generic), Windows, Mac: [VPN user guides](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/anleitungen/)
* Arch based AUR package [AUR f5fpc](https://aur.archlinux.org/packages/f5fpc/)
Follow the install instructions for Windows, Mac and Linux and make sure the software works.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/08_verbinden.png)
On Windows and Mac you get a GUI that asks for the VPN server (`vpn.univie.ac.at`) and the username and password of your `u:account`. On Linux execute the following:
```bash
f5fpc -s -t vpn.univie.ac.at -u [user]
```
The status can be checked with `f5fpc --info`.
## VNC
The login nodes (`jet01` and `jet02`) allow running a `VNC` server.
:construction:
[How to SSH / VNC / VPN](SSH-VPN-VNC/README.md)
## Jupyterhub
<img src="https://jupyter.org/assets/hublogo.svg" width="300px">
Log in with your jet credentials, choose a job, and JupyterLab will be launched.
<img src="Documentation/jet-job2.png" width="500px">
:construction:
There are several kernels available and some help can be found:
- [Python/](Python/)
- [Tutorial on Jet](Python/Your-First-Notebook-onJet_v2.ipynb)
## User Quotas and Restrictions
## Node Setup
| Name | Value |
| --- | --- |
| Product | ThinkSystem SR650 |
| Distro | RedHatEnterprise 8.2 Ootpa |
| Kernel | 4.18.0-147.el8.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz |
| Cores | 2 CPU, 20 physical cores per CPU, total 80 logical CPU units |
| BaseBoard | Lenovo -\[7X06CTO1WW\]- Intel Corporation C624 Series Chipset |
| CPU Time | 350 kh |
| Memory | 755 GB Total |
| Memory/Core | 18.9 GB |
Global file system (GPFS) is present on all nodes with about 1 PB (~1000 TB) of storage.
The typical installation of an Intel cluster has the Intel compiler suite (`intel-parallel-studio`) and the open-source GNU compilers installed. Based on these two compilers (`intel`, `gnu`), there are usually two versions of each scientific software package.
Major Libraries:
- OpenMPI (3.1.6, 4.0.5)
- HDF5
- NetCDF (C, Fortran)
- ECCODES from [ECMWF](https://confluence.ecmwf.int/display/ECC)
- Math libraries, e.g. intel-mkl, lapack, scalapack
- Interpreters: Python, Julia
- Tools: cdo, ncl, nco, ncview
These software libraries are usually handled by environment modules.
<img src="http://modules.sourceforge.net/modules_red.svg" width="300px">
## Currently installed modules
```bash
$ module av
------------------------------------ /jetfs/spack/share/spack/modules/linux-rhel8-haswell -------------------------------------
intel-parallel-studio/composer.2017.7-intel-17.0.7-disfj2g
--------------------------------- /jetfs/spack/share/spack/modules/linux-rhel8-skylake_avx512 ---------------------------------
anaconda2/2019.10-gcc-8.3.1-5pou6ji nco/4.9.3-intel-20.0.2-dhlqiyo
anaconda3/2019.10-gcc-8.3.1-tmy5mgp ncview/2.1.8-gcc-8.3.1-s2owtzw
anaconda3/2020.07-gcc-8.3.1-weugqkf ncview/2.1.8-intel-20.0.2-3taqdda
anaconda3/2020.11-gcc-8.3.1-gramgir netcdf-c/4.7.4-gcc-8.3.1-fh4nn6k
cdo/1.9.8-gcc-8.3.1-ipgvzeh netcdf-c/4.7.4-gcc-8.3.1-MPI3.1.6-y5gknpt
eccodes/2.18.0-gcc-8.3.1-s7clum3 netcdf-c/4.7.4-intel-20.0.2-337uqtc
eccodes/2.18.0-intel-20.0.2-6tadpgr netcdf-fortran/4.5.3-gcc-8.3.1-kfd2vkj
enstools/2020.11.dev-gcc-8.3.1-fes7kgo netcdf-fortran/4.5.3-gcc-8.3.1-MPI3.1.6-rjcxqb6
esmf/7.1.0r-gcc-8.3.1-4fijz4q netcdf-fortran/4.5.3-intel-20.0.2-irdm5gq
gcc/8.3.1-gcc-8.3.1-pp3wjou netlib-lapack/3.8.0-gcc-8.3.1-ue37lic
geos/3.8.1-gcc-8.3.1-o76leir netlib-scalapack/2.1.0-gcc-8.3.1-pbkjymd
hdf5/1.10.7-gcc-8.3.1-MPI3.1.6-vm3avor openblas/0.3.10-gcc-8.3.1-ncess5c
hdf5/1.12.0-gcc-8.3.1-awl4atl openmpi/3.1.6-gcc-8.3.1-rk5av53
hdf5/1.12.0-intel-20.0.2-ezeotzr openmpi/3.1.6-intel-20.0.2-ubasrpk
intel-mkl/2020.3.279-gcc-8.3.1-5xeezjw openmpi/4.0.5-gcc-8.3.1-773ztsv
intel-mkl/2020.3.279-intel-20.0.2-m7bxged openmpi/4.0.5-intel-20.0.2-4wfaaz4
intel-oneapi-compilers/2021.2.0-oneapi-2021.2.0-6kdzddx openmpi/4.0.5-oneapi-2021.2.0-hrfsxrd
intel-oneapi-mpi/2021.2.0-oneapi-2021.2.0-haqpxfl parallel-netcdf/1.12.1-gcc-8.3.1-MPI3.1.6-gng2jcu
intel-parallel-studio/composer.2020.2-intel-20.0.2-zuot22y parallel-netcdf/1.12.1-gcc-8.3.1-xxrhtxn
julia/1.5.2-gcc-8.3.1-3iwgkf7 parallel-netcdf/1.12.1-intel-20.0.2-sgz3yqs
libemos/4.5.9-gcc-8.3.1-h3lqu2n perl/5.32.0-intel-20.0.2-2d23x7l
miniconda2/4.7.12.1-gcc-8.3.1-zduqggv proj/7.1.0-gcc-8.3.1-xcjaco5
miniconda3/4.8.2-gcc-8.3.1-3m7b6t2 zlib/1.2.11-gcc-8.3.1-bbbpnzp
ncl/6.6.2-gcc-8.3.1-MPI3.1.6-3dxuv5f zlib/1.2.11-intel-20.0.2-3h374ov
nco/4.9.3-gcc-8.3.1-g7o6lao
```
Using [environment modules](https://modules.readthedocs.io/en/latest/) it is possible to have different software libraries (versions, compilers) side by side and ready to be loaded. Be aware that some libraries depend on others. It is recommended to load the highest-level library first to check which dependencies are loaded as well, e.g.:
```bash
$ module load eccodes/2.18.0-intel-20.0.2-6tadpgr
```
loads the `ECCODES` library and all its dependencies, e.g. the Intel compilers, as indicated by the naming.
```bash
$ module list
Currently Loaded Modulefiles:
 1) zlib/1.2.11-intel-20.0.2-3h374ov     3) hdf5/1.12.0-intel-20.0.2-ezeotzr              5) netcdf-c/4.7.4-intel-20.0.2-337uqtc
 2) openmpi/4.0.5-intel-20.0.2-4wfaaz4   4) parallel-netcdf/1.12.1-intel-20.0.2-sgz3yqs   6) eccodes/2.18.0-intel-20.0.2-6tadpgr
```
Missing solutions are very welcome.
[Go to Description](Jet-Cluster.md)
### JupyterHub
[JET01-Jupyterhub](https://jet01.img.univie.ac.at)
Teaching and webaccess server
[Go to Description](SRVX1.md)
### JupyterHub
Direct Access: [Goto SRVX1-Jupyterhub](https://srvx1.img.univie.ac.at)
Computing Node @UZA2
[Go to Description](SRVX2.md)
## Information on SRVX8 - Research
Researchers @UZA2
[Go to Description](SRVX8.md)
### Software
- GNU compilers, OpenMPI, HDF5, NetCDF, Eccodes, ...
- Anaconda Python
```bash
----------------- /opt/spack/share/spack/modules/linux-centos6-haswell/ -----------------
anaconda3/2020.07-gcc-5.3.0 netcdf-c/4.7.4-gcc-5.3.0
eccodes/2.18.0-gcc-5.3.0 netcdf-fortran/4.5.3-gcc-5.3.0
gcc/4.4.7-gcc-5.3.0 openmpi/3.1.6-gcc-5.3.0
gcc/5.3.0-gcc-5.3.0 proj/7.1.0-gcc-5.3.0
hdf5/1.10.7-gcc-5.3.0 python/3.8.5-gcc-5.3.0
miniconda3/4.8.2-gcc-5.3.0 zlib/1.2.11-gcc-5.3.0
```
## Information on VSC - Research
Researchers can use the private nodes at VSC-4.
Note:
> https://vsc.ac.at/training
[Go to Description](VSC.md)
### Architecture
| Name | Value |
| --- | --- |
| Product | ? |
| Distro | CentOS 7.6 |
| Kernel | 3.10.0-957.el7.x86_64 |
| Processor | Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz |
| Cores | 2 CPU, 20 physical cores per CPU, total 80 logical CPU units |
| BaseBoard | ? |
| CPU time | 2102 kh |
| Memory | 378 GB Total |
| Memory/Core | 13.5 GB |
| Network | 10 Gbit/s |
### Software
- GNU compilers, OpenMPI, HDF5, NetCDF, Eccodes (only GNU)
- Anaconda Python
- Matlab
## Getting Started
### SSH
- [x] Terminal, Putty, ...
- [x] Network access @UNIVIE
Use a terminal or [Putty](https://www.putty.org/) to connect with SSH to the server
```bash
ssh [user]@srvx1.img.univie.ac.at
```
Please replace `[user]` with the username given by the [sysadmin](mailto:michael.blaschek@univie.ac.at) and supply your password when asked. This should give you access to the login node. If you are outside of the university network, please go to [VPN](#vpn) below.
### VPN
- [x] `u:account`
[How to SSH / VNC / VPN](SSH-VPN-VNC/README.md)
### Jupyterhub
<img src="https://jupyter.org/assets/hublogo.svg" width="300px">
Download/Use any of these required Authenticator Apps:
After registering, the teacher/admin has to grant you access before you can log in.
:construction:
## System Information
| Name | Value |
| --- | --- |
| Product | PowerEdge R720xd |
| Distro | CentOS 6.10 Final |
| Kernel | 2.6.32-754.29.2.el6.centos.plus.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) CPU E5-2690 0 @ 2.90GHz |
| Cores | 2 CPU, 8 physical cores per CPU, total 32 logical CPU units |
| BaseBoard | Dell Inc. 0C4Y3R Intel Corporation C600/X79 series chipset |
| CPU time | 140 kh |
| Memory | 190 GB Total |
| Memory/Core | 11.9 GB |
## Software
The typical installation of an Intel cluster has the Intel compiler suite (`intel-parallel-studio`) and the open-source GNU compilers installed. Based on these two compilers (`intel`, `gnu`), there are usually two versions of each scientific software package.
Major Libraries:
- OpenMPI (3.1.6, 4.0.5)
- HDF5
- NetCDF (C, Fortran)
- ECCODES from [ECMWF](https://confluence.ecmwf.int/display/ECC)
- Math libraries, e.g. intel-mkl, lapack, scalapack
- Interpreters: Python, Julia
- Tools: cdo, ncl, nco, ncview
These software libraries are usually handled by environment modules.
<img src="http://modules.sourceforge.net/modules_red.svg" width="300px">
## Currently installed modules
```
$ module av
-------------------------------- /home/opt/spack/share/spack/modules/linux-centos6-sandybridge --------------------------------
anaconda3/2020.07-gcc-5.3.0 gcc/5.3.0-gcc-5.3.0 netcdf-c/4.7.4-gcc-5.3.0 zlib/1.2.11-gcc-5.3.0
cdo/1.9.9-gcc-5.3.0 git/2.29.0-gcc-5.3.0 netcdf-fortran/4.5.3-gcc-5.3.0
eccodes/2.18.0-gcc-5.3.0 hdf5/1.10.7-gcc-5.3.0 openmpi/3.1.6-gcc-5.3.0
enstools/2020.11.dev-gcc-5.3.0 miniconda3/4.8.2-gcc-5.3.0 proj/7.1.0-gcc-5.3.0
```
Using [environment modules](https://modules.readthedocs.io/en/latest/) it is possible to have different software libraries (versions, compilers) side by side and ready to be loaded. Be aware that some libraries depend on others. It is recommended to load the highest-level library first to check which dependencies are loaded as well, e.g.:
```
$ module load eccodes/2.18.0-gcc-5.3.0
Loading eccodes/2.18.0-gcc-5.3.0
Loading requirement: zlib/1.2.11-gcc-5.3.0 openmpi/3.1.6-gcc-5.3.0 hdf5/1.10.7-gcc-5.3.0 netcdf-c/4.7.4-gcc-5.3.0
```
loads the `ECCODES` library and all its dependencies, e.g. the Intel or GNU compilers, as indicated by the naming.
```
$ module list
Currently Loaded Modulefiles:
1) zlib/1.2.11-gcc-5.3.0 3) hdf5/1.10.7-gcc-5.3.0 5) eccodes/2.18.0-gcc-5.3.0
2) openmpi/3.1.6-gcc-5.3.0 4) netcdf-c/4.7.4-gcc-5.3.0
```
`module list` shows the currently loaded modules: `ECCODES` plus the four libraries loaded as its dependencies. It is therefore not necessary to load these libraries manually.
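The module names follow the Spack convention `package/version-compiler-version[-hash]`, so the compiler a library was built with can be read directly from its name. A small shell illustration (the example name is taken from the listing above):

```shell
# Split a Spack-style module name into package and version with plain
# POSIX parameter expansion.
m="eccodes/2.18.0-gcc-5.3.0"
pkg=${m%%/*}     # part before the slash: package name
rest=${m#*/}     # part after the slash
ver=${rest%%-*}  # version: everything before the first dash
echo "$pkg $ver"
```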
## Best Practice
## Networking
# S R V X 2
[[_TOC_]]
## Getting Started
### SSH
### VPN
### Jupyterhub
[How to SSH / VNC / VPN](SSH-VPN-VNC/README.md)
## System Information
| Name | Value |
| --- | --- |
| Product | PowerEdge R940 |
| Distro | CentOS 8.2.2004 Core |
| Kernel | 4.18.0-147.el8.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz |
| Cores | 4 CPU, 20 physical cores per CPU, total 160 logical CPU units |
| BaseBoard | Dell Inc. 0D41HC Intel Corporation C621 Series Chipset |
| CPU time | 700 kh |
| Memory | 376 GB Total |
| Memory/Core | 9.4 GB |
## Software
```
$ module av
------------------------------ /home/spack-root/share/spack/modules/linux-centos8-skylake_avx512 ------------------------------
anaconda3/2020.07-gcc-8.3.1 intel-parallel-studio/composer.2020.2-intel-20.0.2 netcdf-fortran/4.5.3-gcc-8.3.1
eccodes/2.18.0-intel-20.0.2 libemos/4.5.9-gcc-8.3.1 netcdf-fortran/4.5.3-intel-20.0.2
eccodes/2.19.1-gcc-8.3.1 miniconda3/4.8.2-gcc-8.3.1 netlib-lapack/3.8.0-gcc-8.3.1
eccodes/2.19.1-intel-20.0.2 miniconda3/4.9.2-gcc-8.3.1 netlib-scalapack/2.1.0-gcc-8.3.1
eccodes/2.21.0-intel-20.0.2 ncl/6.6.2-gcc-8.3.1 openblas/0.3.12-gcc-8.3.1
hdf5/1.10.7-gcc-8.3.1 netcdf-c/4.7.4-gcc-8.3.1 openmpi/3.1.6-gcc-8.3.1
hdf5/1.10.7-intel-20.0.2 netcdf-c/4.7.4-intel-20.0.2 openmpi/3.1.6-intel-20.0.2
```
## Best Practice
## Networking
# S R V X 8
## Getting Started
### SSH
### VPN
### Jupyterhub
[How to SSH / VNC / VPN](SSH-VPN-VNC/README.md)
## System Information
| Name | Value |
| --- | --- |
| Product | PowerEdge R730xd |
| Distro | CentOS 6.10 Final |
| Kernel | 2.6.32-754.29.2.el6.centos.plus.x86_64 GNU/Linux |
| Processor | Intel(R) Xeon(R) CPU E5-2697 v3 @ 2.60GHz |
| Cores | 2 CPU, 14 physical cores per CPU, total 56 logical CPU units |
| BaseBoard | Dell Inc. 0599V5 Intel Corporation C610/X99 series chipset |
| CPU time | 245 kh |
| Memory | 504 GB Total |
| Memory/Core | 18 GB |
## Software
Software is installed in numerous places. This is a legacy system with no software controller.
A reinstall is planned for summer 2021.
## Best Practice
## Networking
```
# (excerpt of ~/.ssh/config; earlier entries not shown)
Host login
```
and replacing `[USERNAME]` and `[U:Account USERNAME]` with your usernames. Using such a file allows you to connect with `ssh srvx2`, using the correct server address and the specified username. Copy this file to `login.univie.ac.at` as well, and you can use commands like `ssh -t login ssh jet` to connect directly to `jet` via the `login` gateway.
If you want to use ssh-keys you can also use different keys in `.ssh/config` per server with `IdentityFile ~/.ssh/id_rsa_for_server`.
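A minimal sketch for creating such a per-server key (the file name is just an example; any key type accepted by the server works):

```shell
# Generate a dedicated ed25519 key pair; -N '' means no passphrase
# (set one for real use). The file name is an example.
keyfile="$HOME/.ssh/id_ed25519_jet"
mkdir -p "$HOME/.ssh"
ssh-keygen -q -t ed25519 -N '' -f "$keyfile"
ls "$keyfile" "$keyfile.pub"   # private and public key
```

Install the public key on the server once with `ssh-copy-id -i "$keyfile.pub" [user]@jet01.img.univie.ac.at`, then point the matching `Host` entry at it via `IdentityFile`.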
**From eduroam**: You should be able to log in as above.
For adding resolutions according to your display's resolution, have a look here: [add
Note: `$DISPLAY` is an environment variable that is usually set to your VNC server port.
# VPN
Requirements:
- `u:account`
Some servers are only accessible from within the virtual network of the University of Vienna. Therefore, access from outside has to be granted via the [VPN-Service](https://vpn.univie.ac.at). Go there, log in with your `u:account`, and download the *Big-IP Edge* client for your system.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/01_download_neu.png)
Links:
* [ZID-VPN](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/)
* Linux (Ubuntu, Generic), Windows, Mac: [VPN user guides](https://vpn.univie.ac.at/f5-w-68747470733a2f2f7a69642e756e697669652e61632e6174$$/vpn/anleitungen/)
* Arch based AUR package [AUR f5fpc](https://aur.archlinux.org/packages/f5fpc/)
Follow the install instructions for Windows, Mac and Linux and make sure the software works.
![](https://zid.univie.ac.at/fileadmin/user_upload/d_zid/zid-open/daten/datennetz/vpn/Windows/08_verbinden.png)
On Windows and Mac you get a GUI that asks for the VPN server (`vpn.univie.ac.at`) and the username and password of your `u:account`. On Linux execute the following:
```bash
f5fpc -s -t vpn.univie.ac.at -u [user]
```
The status can be checked with `f5fpc --info`.
## Screen
[Screen](https://wiki.ubuntuusers.de/Screen/) is a terminal session manager that allows you to start processes and reconnect to them after disconnecting.
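The basic work cycle, as a short command sketch (the session name `work` is an example):

```bash
screen -S work    # start a named session
# ... start your long-running process, then detach with Ctrl-a d
screen -ls        # list running sessions
screen -r work    # reattach later, even from a new SSH login
```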