SRVX1
Teaching Hub and Access Point from the web
Getting Started
Steps:
- Request access (will be done for you by the lecturer)
- As Staff, access using SSH - How to SSH / VNC / VPN (see the example below)
- As Student, access using the Teaching Hub - How to connect using the TeachingHub
- Access Services https://srvx1.img.univie.ac.at
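For staff, the SSH login is a one-line command; a minimal sketch, assuming your srvx1 user name and that the SSH host matches the web address above:
$ ssh <username>@srvx1.img.univie.ac.at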
System Information
Name | Value |
---|---|
Product | PowerEdge R940 |
Processor | Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz |
Cores | 4 CPUs, 20 physical cores per CPU, 160 logical CPUs in total |
CPU time | 700 kh |
Memory | 754 GB Total |
Memory/Core | 9.4 GB |
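If you want to verify these numbers after logging in, standard Linux tools report them; a minimal sketch, not specific to SRVX1:
$ lscpu | grep -E 'Socket|Core|^CPU\(s\)'
$ free -h | head -n 2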
Greeter
----------------------------------------------
131.130.157.11 _ . , . .
* / \_ * / \_ _ * * /\'_
/ \ / \, (( . _/ /
. /\/\ /\/ :' __ \_ ` _^/ ^/
/ \/ \ _/ \-'\ * /.' ^_ \
/\ .- `. \/ \ /==~=-=~=-=-;. _/ \ -
/ `-.__ ^ / .-'.--\ =-=~_=-=~=^/ _ `--./
/SRVX1 `. / / `.~-^=-=~=^=.-' '
----------------------------------------------
Services
SRVX1 is the central access point to IMG services: go to srvx.img.univie.ac.at
Currently running:
- TeachingHub (Jupyterhub)
- Webdata - File hosting
- YoPass - Password Sharing
- iLibrarian - Paper Management (development)
- Filetransfer.sh - File sharing service (development)
Jupyterhub
SRVX1 serves a teaching JupyterHub with JupyterLab, which allows easy access for students and teachers. Access: https://srvx1.img.univie.ac.at/hub
Signup is only granted by teachers and requires a srvx1 user account. A new password is needed and a TOTP (time-based one-time password) will be created.
You need an authenticator app on your phone or computer to generate the TOTP codes.
Please note that a synchronized clock on the TOTP device is required; otherwise the time-based passwords will be out of sync and authentication fails. After registering, the teacher/admin has to grant you access before you can log in.
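TOTP codes are only valid for a short time window, so the clock of the device generating them must be accurate. If that device is a Linux machine, a minimal check (standard systemd tooling, not specific to SRVX1) is:
$ timedatectl | grep -i synchronized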
Software
The typical installation of an Intel server has the Intel compiler suite (intel-parallel-studio, intel-oneapi) and the open-source GNU compilers installed. Based on these two compiler families (intel, gnu), there are usually two versions of each scientific software package (see the example below).
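A minimal sketch of switching between the two compiler families via modules; the module names are taken from the listing further below, and it is assumed the modules put the compilers on your PATH:
$ module load gcc/8.5.0-gcc-8.5rhel8
$ gfortran --version
$ module purge
$ module load intel-parallel-studio/composer.2020.4-intel-20.0.4
$ ifort --version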
Major Libraries:
- OpenMPI (3.1.6, 4.0.5)
- HDF5
- NetCDF (C, Fortran)
- ECCODES from ECMWF
- Math libraries, e.g. intel-mkl, lapack, scalapack
- Interpreters: Python, Julia
- Tools: cdo, ncl, nco, ncview
These software libraries are usually handled by environment modules.
Currently installed modules
Please note that new versions might already be installed.
$ module av
--------------- /home/swd/spack/share/spack/modules/linux-rhel8-skylake_avx512 ----------------
anaconda3/2020.11-gcc-8.5.0 matlab/R2020b-gcc-8.5.0 proj/8.1.0-gcc-8.5.0
anaconda3/2021.05-gcc-8.5.0 miniconda2/4.7.12.1-gcc-8.5.0 python/3.8.12-gcc-8.5.0
autoconf/2.69-oneapi-2021.2.0 miniconda3/4.10.3-gcc-8.5.0
autoconf/2.71-oneapi-2021.2.0 nco/4.9.3-intel-20.0.4
cdo/1.9.10-gcc-8.5.0 nco/5.0.1-gcc-8.5.0
eccodes/2.19.1-gcc-8.5.0 ncview/2.1.8-gcc-8.5.0
eccodes/2.19.1-gcc-8.5.0-MPI3.1.6 netcdf-c/4.6.3-gcc-8.5.0-MPI3.1.6
eccodes/2.19.1-intel-20.0.4 netcdf-c/4.6.3-intel-20.0.4-MPI3.1.6
eccodes/2.19.1-intel-20.0.4-MPI3.1.6 netcdf-c/4.7.4-gcc-8.5.0
eccodes/2.21.0-gcc-8.5.0 netcdf-c/4.7.4-intel-20.0.4
eccodes/2.21.0-gcc-8.5.0-MPI3.1.6 netcdf-fortran/4.5.2-gcc-8.5.0-MPI3.1.6
eccodes/2.21.0-intel-20.0.4 netcdf-fortran/4.5.2-intel-20.0.4-MPI3.1.6
gcc/8.5.0-gcc-8.5rhel8 netcdf-fortran/4.5.3-gcc-8.5.0
geos/3.8.1-gcc-8.5.0 netlib-lapack/3.9.1-gcc-8.5.0
hdf5/1.10.7-gcc-8.5.0 netlib-lapack/3.9.1-intel-20.0.4
hdf5/1.10.7-gcc-8.5.0-MPI3.1.6 netlib-lapack/3.9.1-oneapi-2021.2.0
hdf5/1.10.7-intel-20.0.4-MPI3.1.6 netlib-scalapack/2.1.0-gcc-8.5.0
hdf5/1.12.0-intel-20.0.4 netlib-scalapack/2.1.0-gcc-8.5.0-MPI3.1.6
hdf5/1.12.0-oneapi-2021.2.0 openblas/0.3.18-gcc-8.5.0
intel-mkl/2020.4.304-intel-20.0.4 openmpi/3.1.6-gcc-8.5.0
intel-oneapi-compilers/2021.2.0-oneapi-2021.2.0 openmpi/3.1.6-intel-20.0.4
intel-oneapi-dal/2021.2.0-oneapi-2021.2.0 openmpi/4.0.5-gcc-8.5.0
intel-oneapi-mkl/2021.2.0-oneapi-2021.2.0 openmpi/4.0.5-intel-20.0.4
intel-oneapi-mpi/2021.2.0-oneapi-2021.2.0 parallel-netcdf/1.12.2-gcc-8.5.0
intel-parallel-studio/composer.2020.4-intel-20.0.4 parallel-netcdf/1.12.2-gcc-8.5.0-MPI3.1.6
libemos/4.5.9-gcc-8.5.0-MPI3.1.6 perl/5.32.0-intel-20.0.4
------------------------------------------------------ /home/swd/modules ------------------------------------------------------
anaconda3/leo-current-gcc-8.3.1 idl/8.2-sp1 micromamba/0.15.2 pypy/7.3.5 xconv/1.94
ecaccess/4.0.2 intelpython/2021.4.0.3353 ncl/6.6.2 shpc/0.0.33
For details on how to use environment modules, go to Using Environment Modules. A short usage example:
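A minimal sketch of loading one of the modules listed above (version strings may have changed; check module av first):
$ module load netcdf-fortran/4.5.3-gcc-8.5.0
$ module list
$ module unload netcdf-fortran/4.5.3-gcc-8.5.0
$ module purge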
User services
There is a script collection that is accessible via the userservices command, e.g. running:
$ userservices
Usage: userservices [service] [Options]
Available Services:
------------------------------------------------------------------------
archive --- Submit files/folders to ZID Archive
fetch-sysinfo --- Display system information
filesender --- Transfer files to ACONET filesender (requires account)
fix-permissions --- fix file/directory permissions
home-dir-check --- Check home directory/configuration
modules --- Pretty print environment modules
transfersh --- Transfer files/directories (IMGW subnet)
weather --- Retrieve weather information
yopass --- Send messages/small files to YoPass (encrypted)
------------------------------------------------------------------------
These scripts are intended to help with certain known problems.
Report problems to: michael.blaschek@univie.ac.at
These are scripts in a common directory. Feel free to copy or edit as you like. Note that some services like filesender require an ACONET account (accessible via your u:account).
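A minimal sketch of calling two of the listed services, assuming they need no further arguments; other services may require options of their own:
$ userservices modules
$ userservices home-dir-check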
Container Hub
Currently there is the possibility to run Singularity containers on all our servers. This is very similar to Docker, but much more secure for multi-user servers. Almost every Docker container can be converted into a Singularity container. Some of the build recipes use Docker.
There are a number of prepared containers, but more can be added. If you have a wish or an existing container that is useful for others, please share. A usage example follows the list below.
containers:
- root: /home/swd/containers
- available:
- RTTOV:
- RTTOV: 12.3
- compiler: gcc:7.3.0 (anaconda)
- path: /home/swd/containers/rttov-jupyter/jup3rttov.sif
- os: centos:6.10
- python: 3.7.4
- singularity: 3.5.2
- packages:
- anaconda3
- jupyter jupyterlab numpy matplotlib pandas xarray bottleneck dask numba scipy netcdf4 cartopy h5netcdf nc-time-axis cfgrib eccodes nodejs
- apps:
- atlas
- lab
- notebook
- rtcoef
- rthelp
- rttest
- description: Use for running RTTOV simulations with Python.
- LIBEMOS:
- LIBEMOS: 4.5.1
- compiler: gcc:7.5.0
- path: /home/swd/containers/libemos-dev/libemos-dev.sif
- os: ubuntu:18.04
- singularity: 3.8.6
- packages:
- openmpi:2.1.1-8
- fftw3:3.3.7-1
- eccodes:2.6.0
- libemos
- description: Use for building flexextract with working libemos dependencies.
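A minimal sketch of using the containers above with Singularity; the paths and the lab app name are taken from the list, everything else is standard Singularity usage:
$ singularity run /home/swd/containers/libemos-dev/libemos-dev.sif
$ singularity exec /home/swd/containers/rttov-jupyter/jup3rttov.sif python3 --version
$ singularity run --app lab /home/swd/containers/rttov-jupyter/jup3rttov.sif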