Verified Commit 818758d5 authored by Martin Weise

Updated docs and install script checks

parent c8363492
3 merge requests: !237 Resolve "Proxy settings", !232 Draft: Master, !231 CI: Remove build for log-service
@@ -20,8 +20,8 @@ For this small, local, test deployment any modern hardware would suffice, we rec
 the following settings.
 - min. 8 vCPU cores
-- min. 16GB RAM memory
-- min. 200GB SSD storage
+- min. 8GB free RAM memory
+- min. 200GB free SSD storage
 - min. 100Mbit/s connection
 *Optional*: public IP-address if you want to secure the deployment with a (free) TLS-certificate from Let's Encrypt.
@@ -134,6 +134,22 @@ In case the deployment is unsuccessful, we have explanations on their origin and
 `sudo netstat -tulpn` (sudo is necessary for the process id) and then stop the service or application
 gracefully or force a stop via `kill -15 PID` (not recommended).
+**IllegalArgumentException values less than -1 bytes are not supported**
+: *Origin*: Your deployment machine (e.g. laptop, virtual machine) appears to not have enough RAM assigned.
+: *Solution*: Assign more RAM to the deployment machine (e.g. add vRAM to the virtual machine).
+
+## Next Steps
+
+You should now be able to view the front end at [http://localhost](http://localhost).
+Please be warned that the default configuration is not intended for public deployments. It is only intended to give you
+a running system within minutes to play around with and explore features. It is strongly advised to change
+the default `.env` environment variables.
+
+Next, create a [user account](../usage-overview/#create-user-account) and
+then [create a database](../usage-overview/#create-database) to [import a dataset](../usage-overview/#import-dataset).
+
 ## Security
 !!! warning "Known security issues with the default configuration"
...
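The RAM troubleshooting entry above can be pre-empted by checking free memory before deploying. A minimal sketch of such a check, assuming the 8GB threshold from the hardware requirements; the `free -g` output is hardcoded as a sample here so the example runs deterministically anywhere:

```shell
# Pre-flight RAM check (sketch). In a real check, replace the sample with: free -g
sample='              total        used        free      shared  buff/cache   available
Mem:             15           4           8           0           2          10
Swap:             1           0           1'
MIN_RAM_GB=8
FREE_RAM_GB=$(echo "$sample" | awk 'NR==2 {print $4}')  # column 4 of the "Mem:" row
if [ "$FREE_RAM_GB" -lt "$MIN_RAM_GB" ]; then
  echo "insufficient RAM: ${FREE_RAM_GB}GB free, ${MIN_RAM_GB}GB required"
else
  echo "RAM ${FREE_RAM_GB}GB OK"
fi
```

With the sample output above this prints `RAM 8GB OK`, since exactly the minimum is free.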
.docs/images/screenshots/import-database-dump-step-1.png (103 KiB)
.docs/images/screenshots/import-database-dump-step-2.png (133 KiB)
.docs/images/screenshots/import-database-dump-step-3.png (106 KiB)
@@ -11,7 +11,7 @@ We give usage examples of the most important use-cases we identified.
 | [Create User Account](#create-user-account) | :white_check_mark: | :white_check_mark: | :x: | :white_check_mark: |
 | [Create Database](#create-database) | :white_check_mark: | :white_check_mark: | :x: | :white_check_mark: |
 | [Import Dataset](#import-dataset) | :white_check_mark: | :white_check_mark: | :x: | :white_check_mark: |
-| [Import Database Dump](#import-database-dump) | :x: | :x: | :white_check_mark: | :x: |
+| [Import Database Dump](#import-database-dump) | :white_check_mark: | :white_check_mark: | :x: | :x: |
 | [Import Live Data](#import-live-data) | :x: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
 | [Export Subset](#export-subset) | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
 | [Assign Database PID](#assign-database-pid) | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
@@ -541,19 +541,58 @@ access to. This is the default for self-created databases like above in [Create
 A user wants to import a database dump in `.sql` (or in `.sql.gz`) format into DBRepo.
-=== "JDBC API"
+=== "UI"
+    First, create a new database as described in the [Create Database](#create-database) use-case above. Then, import
+    the database dump `dump.sql` via the [MySQL Workbench](https://www.mysql.com/products/workbench/) client, which is
+    semi-compatible with MariaDB databases, i.e. core features work but some status/performance features do not.
+
+    Set up a new connection in the MySQL Workbench (cf. Figure 14) by clicking the small
+    ":material-plus-circle-outline:" button :material-numeric-1-circle-outline: to open the dialog. In the opened dialog,
+    fill out the connection parameters (for local deployments the hostname is `127.0.0.1` and the port `3307` for the
+    [Data Database](../system-databases-data/)) :material-numeric-2-circle-outline:.
+    The default credentials are username `root` and password `dbrepo`; type the password in
+    :material-numeric-3-circle-outline: and click the "OK" button. Then finish the setup of the new connection by
+    clicking the "OK" button :material-numeric-4-circle-outline:.
+
+    <figure markdown>
+    ![Setup New Connection in MySQL Workbench](images/screenshots/import-database-dump-step-1.png)
+    <figcaption>Figure 14: Setup New Connection in MySQL Workbench.</figcaption>
+    </figure>
+
+    Now you should be able to see some statistics for the Data Database (cf. Figure 15), especially that it is running
+    :material-numeric-1-circle-outline: and basic connection and version information :material-numeric-2-circle-outline:.
+
+    <figure markdown>
+    ![Server status of the Data Database in MySQL Workbench](images/screenshots/import-database-dump-step-2.png)
+    <figcaption>Figure 15: Server status of the Data Database in MySQL Workbench.</figcaption>
+    </figure>
+
+    Then proceed to import the database dump `dump.sql` by clicking "Data Import/Restore"
+    :material-numeric-1-circle-outline: and select "Import from Self-Contained File" in the Import Options. Then
+    select the `dump.sql` file in the file path selection. Last, select the database you want to import this `dump.sql`
+    into :material-numeric-4-circle-outline: (you can also create a new database for the import by clicking "New...").
+    The import starts after clicking "Start Import" :material-numeric-5-circle-outline:.
+
+    <figure markdown>
+    ![Data Import/Restore in MySQL Workbench](images/screenshots/import-database-dump-step-3.png)
+    <figcaption>Figure 16: Data Import/Restore in MySQL Workbench.</figcaption>
+    </figure>
+=== "Terminal"
     First, create a new database as described in the [Create Database](#create-database) use-case above. Then, import
     the database dump `dump.sql` via the `mariadb` client.
     ```bash
-    mariadb -u USERNAME -pYOURPASSWORD db_name < dump.sql
+    mariadb -h127.0.0.1 -P3307 -uUSERNAME -pYOURPASSWORD db_name < dump.sql
     ```
     Alternatively, if your database dump is compressed, import the `dump.sql.gz` by piping it through `gunzip`.
     ```bash
-    gunzip < dump.sql.gz | mysql -u root -pYOURPASSWORD db_name
+    gunzip < dump.sql.gz | mysql -h127.0.0.1 -P3307 -uUSERNAME -pYOURPASSWORD db_name
     ```
 The [Metadata Service](../system-services-metadata) periodically (by default configuration every 60 seconds) checks
...
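The compressed-dump pipeline above can be tried without a running database server. In this sketch `cat` stands in for the `mariadb` client so the `gunzip` stage can be verified anywhere; file names and the toy SQL are illustrative:

```shell
# Create a toy dump, compress it, and stream it through gunzip as the docs describe.
tmp=$(mktemp -d)
printf 'CREATE TABLE t (id INT);\nINSERT INTO t VALUES (1);\n' > "$tmp/dump.sql"
gzip "$tmp/dump.sql"                   # produces dump.sql.gz, removes the original
result=$(gunzip < "$tmp/dump.sql.gz")  # a real import would pipe this into the mariadb client instead
echo "$result"
rm -r "$tmp"
```

The decompressed stream is byte-identical to the original `dump.sql`, which is exactly what the database client receives in the real pipeline.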
 #!/bin/bash
+MIN_CPU=8
+MIN_RAM=8
+SKIP_CHECKS=${SKIP_CHECKS:-0}
-# dependency
-docker info > /dev/null
-if [ $? -ne 0 ]; then
-  echo "Docker is not installed (or accessible in bash) on your system:"
-  echo ""
-  echo " - install docker from https://docs.docker.com/desktop/install/linux-install/"
-  echo " - make sure the docker executable is in \$PATH"
-  exit 2
-fi
+# checks
+if [[ $SKIP_CHECKS -eq 0 ]]; then
+  echo "[✨] Startup check ..."
+  docker info > /dev/null
+  if [[ $? -ne 0 ]]; then
+    echo "Docker is not installed (or accessible in bash) on your system:"
+    echo ""
+    echo " - install docker from https://docs.docker.com/desktop/install/linux-install/"
+    echo " - make sure the docker executable is in \$PATH"
+    exit 2
+  else
+    echo "$(docker --version) OK"
+  fi
+  CPU=$(cat /proc/cpuinfo | grep processor | wc -l)
+  if [[ $CPU -lt $MIN_CPU ]]; then
+    echo "You do not have enough vCPU resources available:"
+    echo ""
+    echo " - we found ${CPU} vCPU cores instead of the necessary ${MIN_CPU}"
+    echo " - if you believe this is a mistake, skip startup checks with the SKIP_CHECKS=1 flag"
+    exit 3
+  else
+    echo "vCPU ${CPU} OK"
+  fi
+  RAM=$(free -g -t | awk 'NR==2 {print $4}')
+  if [[ $RAM -lt $MIN_RAM ]]; then
+    echo "You do not have enough free RAM resources:"
+    echo ""
+    echo " - we found ${RAM}GB RAM (free) instead of the necessary ${MIN_RAM}GB"
+    echo " - if you believe this is a mistake, skip startup checks with the SKIP_CHECKS=1 flag"
+    exit 4
+  else
+    echo "RAM ${RAM}GB OK"
+  fi
+else
+  echo "[✨] Skipping checks ..."
+fi
# environment
 echo "[🚀] Gathering environment ..."
 mkdir -p ./dist
-curl -sSL -o ./docker-compose.yml https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/docker-compose.prod.yml
+curl -sSL -o ./docker-compose.yml https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/docker-compose.prod.yml
-curl -sSL -o ./dist/setup-schema_local.sql https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-metadata-db/setup-schema_local.sql
+curl -sSL -o ./dist/setup-schema_local.sql https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-metadata-db/setup-schema_local.sql
-curl -sSL -o ./dist/rabbitmq.conf https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-broker-service/rabbitmq.conf
+curl -sSL -o ./dist/rabbitmq.conf https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-broker-service/rabbitmq.conf
-curl -sSL -o ./dist/enabled_plugins https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-broker-service/enabled_plugins
+curl -sSL -o ./dist/enabled_plugins https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-broker-service/enabled_plugins
-curl -sSL -o ./dist/cert.pem https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-broker-service/cert.pem
+curl -sSL -o ./dist/cert.pem https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-broker-service/cert.pem
-curl -sSL -o ./dist/pubkey.pem https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-broker-service/pubkey.pem
+curl -sSL -o ./dist/pubkey.pem https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-broker-service/pubkey.pem
-curl -sSL -o ./dist/definitions.json https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-broker-service/definitions.json
+curl -sSL -o ./dist/definitions.json https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-broker-service/definitions.json
-curl -sSL -o ./dist/dbrepo.conf https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-gateway-service/dbrepo.conf
+curl -sSL -o ./dist/dbrepo.conf https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-gateway-service/dbrepo.conf
-curl -sSL -o ./dist/opensearch_dashboards.yml https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-search-db/opensearch_dashboards.yml
+curl -sSL -o ./dist/opensearch_dashboards.yml https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-search-db/opensearch_dashboards.yml
-curl -sSL -o ./dist/dbrepo.config.json https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-ui/dbrepo.config.json
+curl -sSL -o ./dist/dbrepo.config.json https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-ui/dbrepo.config.json
-curl -sSL -o ./dist/s3_config.json https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/dev/dbrepo-storage-service/s3_config.json
+curl -sSL -o ./dist/s3_config.json https://gitlab.phaidra.org/fair-data-austria-db-repository/fda-services/-/raw/master/dbrepo-storage-service/s3_config.json
 echo "[📦] Pulling images ..."
 docker compose pull
@@ -45,4 +75,5 @@ if [ $? -eq 0 ]; then
   echo ""
   echo " docker compose logs -f"
   echo ""
+  echo "Read about next steps online: https://www.ifs.tuwien.ac.at/infrastructures/dbrepo/latest/deployment-docker-compose/#next-steps"
 fi
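The new `SKIP_CHECKS` toggle in the install script relies on bash default expansion: the checks run unless the caller exports `SKIP_CHECKS=1` first (e.g. `SKIP_CHECKS=1 bash install.sh`; the script name here is an assumption). A minimal sketch of the idiom:

```shell
# ${VAR:-default}: use the environment value if set, otherwise fall back to the default.
unset SKIP_CHECKS                # clean slate for the demo
SKIP_CHECKS=${SKIP_CHECKS:-0}
echo "default: $SKIP_CHECKS"     # prints "default: 0"

SKIP_CHECKS=1
SKIP_CHECKS=${SKIP_CHECKS:-0}    # an explicitly set value wins over the fallback
echo "override: $SKIP_CHECKS"    # prints "override: 1"
```

Note that `${VAR:-0}` also substitutes the fallback when the variable is set but empty, which is usually the desired behavior for an opt-in flag like this.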