diff --git a/.docs/index.md b/.docs/index.md
index 79c657b4d9cfe98fd1804379486c7b1ebf8822bb..e5aa318c57d4774ff8ae3f4a47a5c90644220295 100644
--- a/.docs/index.md
+++ b/.docs/index.md
@@ -23,12 +23,6 @@ We present a database repository system that allows researchers to ingest data i
 through common interfaces, provides efficient access to arbitrary subsets of data even when the underlying data store is
 evolving, allows reproducing of query results and supports findable-, accessible-, interoperable- and reusable data.
 
-## Community
-
-These institutions use DBRepo and integrated it into their repository infrastructure
-
-<img src="images/logos.png" width="100%" alt="Logos of TU Darmstadt, TU Wien, University Malaya" />
-
 ## More Information
 
 - Demonstration instance [https://dbrepo1.ec.tuwien.ac.at](https://dbrepo1.ec.tuwien.ac.at)
diff --git a/.docs/system-services-analyse.md b/.docs/system-services-analyse.md
index add62db0e279ae2e2deb8e3cd1f127f963129f58..5a9e08be36e64d95da663ebcac1a63d5e87e2a99 100644
--- a/.docs/system-services-analyse.md
+++ b/.docs/system-services-analyse.md
@@ -38,32 +38,7 @@ the [Storage Service](../system-services-storage), analysis for data types and p
 
 ### Examples
 
-Given a [CSV-file](https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-datasets/-/raw/master/gps.csv) 
-containing GPS-data `gps.csv` already uploaded in the `dbrepo-upload` bucket of the Storage Service with key `gps.csv`:
-
-```shell
-curl -X POST \
-  -d '{"filename":"gps.csv","separator":","}'
-  http://<hostname>:5000/api/analyse/determinedt
-```
-
-This results in the response:
-
-```json
-{
-    "columns": {
-        "ID": "bigint",
-        "KEY": "varchar",
-        "OBJECTID": "bigint",
-        "LBEZEICHNUNG": "varchar",
-        "LTYP": "bigint",
-        "LTYPTXT": "varchar",
-        "LAT": "decimal",
-        "LNG": "decimal"
-    },
-    "separator": ","
-}
-```
+See the [usage page](../usage-analyse).
 
 ## Limitations
 
diff --git a/.docs/system-services-storage.md b/.docs/system-services-storage.md
index e7767cbc4e09583147acb3becc6de931c1710c7e..f6b9ae28ee48fe5fa8dc319228ffe09ee4e3f030 100644
--- a/.docs/system-services-storage.md
+++ b/.docs/system-services-storage.md
@@ -33,42 +33,7 @@ The default configuration creates two buckets `dbrepo-upload`, `dbrepo-download`
 
 ### Examples
 
-Upload a CSV-file into the `dbrepo-upload` bucket with the AWS CLI:
-
-```console
-$ aws --endpoint-url http://<hostname>:9000 \
-    s3 \
-    cp /path/to/file.csv \
-    s3://dbrepo-upload/
-upload: /path/to/file.csv to s3://dbrepo-upload/file.csv
-```
-
-You can list the buckets:
-
-```console
-$ aws --endpoint-url http://<hostname>:9000 \
-    s3 \
-    ls
-2023-12-03 16:23:15 dbrepo-download
-2023-12-03 16:28:05 dbrepo-upload
-```
-
-And list the files in the bucket `dbrepo-upload` with:
-
-```console
-$ aws --endpoint-url http://<hostname>:9000 \
-    s3 \
-    ls \
-    dbrepo-upload
-2023-12-03 16:28:05     535219 file.csv
-```
-
-Alternatively, you can use the middleware of the [User Interface](../system-other-ui/) to upload files.
-
-Alternatively, you can use a S3-compatible client:
-
-* [boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) (generic Python implementation of S3)
-* AWS SDK (tailored towards Amazon S3)
+See the [usage page](../usage-storage).
 
 ## Limitations
 
diff --git a/.docs/system-services-upload.md b/.docs/system-services-upload.md
index 287d3a371780762f8213aec189a05d301e7c6788..656b82b8692a80aaacac6f8a1283492c9ea2ee61 100644
--- a/.docs/system-services-upload.md
+++ b/.docs/system-services-upload.md
@@ -20,16 +20,7 @@ We use the [TUS](https://tus.io/) open protocol for resumable file uploads which
 
 ### Examples
 
-Upload a CSV-file into the `dbrepo-upload` bucket with the console 
-via `http://<hostname>/admin/storage/browser/dbrepo-upload`.
-
-
-
-We recommend using a TUS-compatible client:
-
-* [tus-java-client](https://github.com/tus/tus-java-client) (Java)
-* [tus-js-client](https://github.com/tus/tus-js-client) (JavaScript/Node.js)
-* [tusd](https://github.com/tus/tusd) (Go)
+See the [usage page](../usage-upload).
 
 ## Limitations
 
diff --git a/.docs/usage-analyse.md b/.docs/usage-analyse.md
new file mode 100644
index 0000000000000000000000000000000000000000..92e4b5c15417357e21847c7d5ae61e6ac70fe781
--- /dev/null
+++ b/.docs/usage-analyse.md
@@ -0,0 +1,32 @@
+---
+author: Martin Weise
+---
+
+# Analyse Service
+
+Given a [CSV-file](https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-datasets/-/raw/master/gps.csv) `gps.csv`
+containing GPS data that has already been uploaded to the `dbrepo-upload` bucket of the Storage Service with key `gps.csv`:
+
+```shell
+curl -X POST \
+  -H "Content-Type: application/json" \
+  -d '{"filename":"gps.csv","separator":","}' \
+  http://<hostname>:5000/api/analyse/determinedt
+```
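+
+The same request can be sent from Python. A minimal sketch using only the standard
+library; the hostname, port, and path are taken from the `curl` example above and
+must be adapted to your deployment:
+
+```python
+import json
+import urllib.request
+
+# Assumption: the Analyse Service is reachable at this address.
+url = "http://<hostname>:5000/api/analyse/determinedt"
+
+payload = json.dumps({"filename": "gps.csv", "separator": ","}).encode()
+request = urllib.request.Request(
+    url,
+    data=payload,
+    headers={"Content-Type": "application/json"},
+    method="POST",
+)
+
+# The response body is the JSON document shown below.
+with urllib.request.urlopen(request) as response:
+    print(json.loads(response.read()))
+```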
+
+This results in the response:
+
+```json
+{
+    "columns": {
+        "ID": "bigint",
+        "KEY": "varchar",
+        "OBJECTID": "bigint",
+        "LBEZEICHNUNG": "varchar",
+        "LTYP": "bigint",
+        "LTYPTXT": "varchar",
+        "LAT": "decimal",
+        "LNG": "decimal"
+    },
+    "separator": ","
+}
+```
\ No newline at end of file
diff --git a/.docs/usage-storage.md b/.docs/usage-storage.md
new file mode 100644
index 0000000000000000000000000000000000000000..214698710e8086ae1dd40ada21561cd98b1127ff
--- /dev/null
+++ b/.docs/usage-storage.md
@@ -0,0 +1,42 @@
+---
+author: Martin Weise
+---
+
+# Storage Service
+
+Upload a CSV-file into the `dbrepo-upload` bucket with the AWS CLI:
+
+```console
+$ aws --endpoint-url http://<hostname>:9000 \
+    s3 \
+    cp /path/to/file.csv \
+    s3://dbrepo-upload/
+upload: /path/to/file.csv to s3://dbrepo-upload/file.csv
+```
+
+You can list the buckets:
+
+```console
+$ aws --endpoint-url http://<hostname>:9000 \
+    s3 \
+    ls
+2023-12-03 16:23:15 dbrepo-download
+2023-12-03 16:28:05 dbrepo-upload
+```
+
+And list the files in the bucket `dbrepo-upload` with:
+
+```console
+$ aws --endpoint-url http://<hostname>:9000 \
+    s3 \
+    ls \
+    dbrepo-upload
+2023-12-03 16:28:05     535219 file.csv
+```
+
+Alternatively, you can use the middleware of the [User Interface](../system-other-ui/) to upload files.
+
+You can also use an S3-compatible client:
+
+* [boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) (Python SDK for S3-compatible storage)
+* AWS SDK (tailored towards Amazon S3)
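+
+As a sketch, the AWS CLI steps above can be reproduced with boto3; the endpoint
+and paths mirror the examples above, and credentials are assumed to be configured
+in the usual AWS credential locations:
+
+```python
+import boto3
+
+# Assumption: same MinIO endpoint as in the AWS CLI examples above.
+client = boto3.client("s3", endpoint_url="http://<hostname>:9000")
+
+# Upload the file; the basename becomes the object key, as with `aws s3 cp`.
+client.upload_file("/path/to/file.csv", "dbrepo-upload", "file.csv")
+
+# List the objects in the `dbrepo-upload` bucket.
+response = client.list_objects_v2(Bucket="dbrepo-upload")
+for obj in response.get("Contents", []):
+    print(obj["Key"], obj["Size"])
+```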
diff --git a/.docs/usage-upload.md b/.docs/usage-upload.md
index d01050cb916abe7ae6484360ffc69be9aee6b700..e48bc364c1c41fc1141afd0cc2dd4ff09696c4cd 100644
--- a/.docs/usage-upload.md
+++ b/.docs/usage-upload.md
@@ -4,7 +4,17 @@ author: Martin Weise
 
 # Upload Service
 
-Uploads a file `file.csv` in 200 byte chunks.
+Upload a CSV-file into the `dbrepo-upload` bucket with the web console
+at `http://<hostname>/admin/storage/browser/dbrepo-upload`.
+
+We recommend using a TUS-compatible client:
+
+* [tus-py-client](https://github.com/tus/tus-py-client) (Python)
+* [tus-java-client](https://github.com/tus/tus-java-client) (Java)
+* [tus-js-client](https://github.com/tus/tus-js-client) (JavaScript/Node.js)
+* [tusd](https://github.com/tus/tusd) (Go)
+
+You can also upload a file `file.csv` in 200-byte chunks with Python:
 
 === "Python"
 
diff --git a/mkdocs.yml b/mkdocs.yml
index 2e236a5dc435c8b8ff2455b1f32cbcb5a1831473..b4dfba732c38d719639ace21b964457fde77f84e 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -31,10 +31,13 @@ nav:
       - Search Database Dashboard: system-other-search-dashboard.md
   - Usage:
     - Overview: usage-overview.md
-    - Authentication Service: usage-auth.md
-    - Broker Service: usage-broker.md
-    - Identifier Service: usage-identifier.md
-    - Upload Service: usage-upload.md
+    - Services:
+      - Analyse Service: usage-analyse.md
+      - Authentication Service: usage-auth.md
+      - Broker Service: usage-broker.md
+      - Identifier Service: usage-identifier.md
+      - Upload Service: usage-upload.md
+      - Storage Service: usage-storage.md
   - publications.md
   - contact.md
 extra_css:
diff --git a/.docs/requirements.txt b/requirements.txt
similarity index 100%
rename from .docs/requirements.txt
rename to requirements.txt