Verified Commit 23bee4dd authored by Martin Weise's avatar Martin Weise

Merge branch 'master' into release-1.4.4

parents 4c4e9653 bbb04201
2 merge requests: !296 (Dev), !294 (Master)
Showing 339 additions and 151 deletions
......@@ -33,89 +33,78 @@ This package supports Python 3.11+.
## Quickstart
Create a table and import a .csv file from your computer.
Get public data from a table as pandas `DataFrame`:
```python
from dbrepo.RestClient import RestClient
from dbrepo.api.dto import CreateTableColumn, ColumnType, CreateTableConstraints
client = RestClient(endpoint='https://test.dbrepo.tuwien.ac.at', username="foo",
password="bar")
# analyse csv
analysis = client.analyse_datatypes(file_path="sensor.csv", separator=",")
print(f"Analysis result: {analysis}")
# -> columns=(date=date, precipitation=decimal, lat=decimal, lng=decimal), separator=,
# line_termination=\n
# create table
table = client.create_table(database_id=1,
name="Sensor Data",
constraints=CreateTableConstraints(checks=['precipitation >= 0'],
uniques=[['precipitation']]),
columns=[CreateTableColumn(name="date",
type=ColumnType.DATE,
dfid=3, # YYYY-MM-dd
primary_key=True,
null_allowed=False),
CreateTableColumn(name="precipitation",
type=ColumnType.DECIMAL,
size=10,
d=4,
primary_key=False,
null_allowed=True),
CreateTableColumn(name="lat",
type=ColumnType.DECIMAL,
size=10,
d=4,
primary_key=False,
null_allowed=True),
CreateTableColumn(name="lng",
type=ColumnType.DECIMAL,
size=10,
d=4,
primary_key=False,
null_allowed=True)])
print(f"Create table result {table}")
# -> (id=1, internal_name=sensor_data, ...)
client.import_table_data(database_id=1, table_id=1, file_path="sensor.csv", separator=",",
skip_lines=1, line_encoding="\n")
print(f"Finished.")
client = RestClient(endpoint="https://dbrepo1.ec.tuwien.ac.at")
# Get a small data slice of just three rows
df = client.get_table_data(database_id=7, table_id=13, page=0, size=3, df=True)
print(df)
# x_coord component unit ... value stationid meantype
# 0 16.52617 Feinstaub (PM10) µg/m³ ... 21.0 01:0001 HMW
# 1 16.52617 Feinstaub (PM10) µg/m³ ... 23.0 01:0001 HMW
# 2 16.52617 Feinstaub (PM10) µg/m³ ... 26.0 01:0001 HMW
#
# [3 rows x 12 columns]
```
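Conceptually, the datatype analysis above resembles ordinary dtype inference on the CSV. A rough illustration using pandas (this is not DBRepo's implementation, just the same idea applied to a two-row sample shaped like the sensor data):

```python
import io

import pandas as pd

# Two sample rows in the same shape as sensor.csv from the Quickstart.
csv = ("date,precipitation,lat,lng\n"
       "2008-12-01,0.0,-36.0653583,146.9112214\n"
       "2008-12-02,3.6,-36.0653583,146.9112214\n")

# pandas infers the decimal columns as float64; the date column is parsed explicitly.
df = pd.read_csv(io.StringIO(csv), parse_dates=["date"])
print(df.dtypes)
```

Note that `analyse_datatypes` additionally detects the separator and line termination, which pandas is told explicitly here.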
The library is well-documented, please see the [full documentation](../sphinx) or
the [PyPI page](https://pypi.org/project/dbrepo/).
Import data into a table:
## Supported Features & Best-Practices
```python
import pandas as pd
from dbrepo.RestClient import RestClient
- Manage user account ([docs](../usage-overview/#create-user-account))
- Manage databases ([docs](../usage-overview/#create-database))
- Manage database access & visibility ([docs](../usage-overview/#private-database-access))
- Import dataset ([docs](../usage-overview/#private-database-access))
- Create persistent identifiers ([docs](../usage-overview/#assign-database-pid))
- Execute queries ([docs](../usage-overview/#export-subset))
- Get data from tables/views/subsets
client = RestClient(endpoint="https://dbrepo1.ec.tuwien.ac.at", username="foo",
password="bar")
# column values as lists: pandas requires an index for all-scalar values
df = pd.DataFrame(data={'x_coord': [16.52617], 'component': ['Feinstaub (PM10)'],
                        'unit': ['µg/m³'], ...})
client.import_table_data(database_id=7, table_id=13, file_name_or_data_frame=df)
```
## Secrets
## Supported Features & Best-Practices
Storing credentials directly in the notebook is not recommended, as they will end up in version control (Git, etc.). Use
environment variables instead:
- Manage user
account ([docs](https://www.ifs.tuwien.ac.at/infrastructures/dbrepo/1.4.4/api/#create-user-account))
- Manage
databases ([docs](https://www.ifs.tuwien.ac.at/infrastructures/dbrepo/usage-overview/#create-database))
- Manage database access &
visibility ([docs](https://www.ifs.tuwien.ac.at/infrastructures/dbrepo/1.4.4/api/#create-database))
- Import
dataset ([docs](https://www.ifs.tuwien.ac.at/infrastructures/dbrepo/1.4.4/api/#import-dataset))
- Create persistent
identifiers ([docs](https://www.ifs.tuwien.ac.at/infrastructures/dbrepo/1.4.4/api/#assign-database-pid))
- Execute
queries ([docs](https://www.ifs.tuwien.ac.at/infrastructures/dbrepo/1.4.4/api/#export-subset))
- Get data from tables/views/subsets
```properties title=".env"
DBREPO_ENDPOINT=https://test.dbrepo.tuwien.ac.at
DBREPO_USERNAME=foo
DBREPO_PASSWORD=bar
DBREPO_SECURE=True
## Configure
All credentials can optionally be set or overridden with environment variables. This is especially useful when sharing
Jupyter Notebooks: create a hidden `.env` file and load it:
``` title=".env"
REST_API_ENDPOINT="https://dbrepo1.ec.tuwien.ac.at"
REST_API_USERNAME="foo"
REST_API_PASSWORD="bar"
REST_API_SECURE="True"
AMQP_API_HOST="https://dbrepo1.ec.tuwien.ac.at"
AMQP_API_PORT="5672"
AMQP_API_USERNAME="foo"
AMQP_API_PASSWORD="bar"
AMQP_API_VIRTUAL_HOST="dbrepo"
REST_UPLOAD_ENDPOINT="https://dbrepo1.ec.tuwien.ac.at/api/upload/files"
```
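To load such a `.env` file without an extra dependency, a minimal stdlib-only loader can be sketched as follows (packages like `python-dotenv` do the same more robustly; the parsing rules here are an assumption: plain `KEY=VALUE` lines, surrounding quotes stripped, no interpolation):

```python
import os


def load_env(path: str = ".env") -> None:
    """Minimal .env loader: KEY=VALUE lines, quotes stripped, no interpolation."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # skip blanks, comments and lines without an assignment
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # do not override variables that are already set
            os.environ.setdefault(key.strip(), value.strip().strip('"'))


if os.path.exists(".env"):
    load_env()
```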
Then use the default constructor of the `RestClient` to e.g. analyse a CSV. Your secrets are automatically passed:
You can reduce the log output by raising the log level, e.g. to `INFO`:
```python title="analysis.py"
```python
from dbrepo.RestClient import RestClient
client = RestClient()
analysis = client.analyse_datatypes(file_path="sensor.csv", separator=",")
import logging
logging.getLogger().setLevel(logging.INFO)
...
client = RestClient(...)
```
## Future
......
---
author: Martin Weise
---
## tl;dr
[:fontawesome-solid-database:  Dataset](https://handle.stage.datacite.org/10.82556/gd17-aq82){ .md-button .md-button--primary target="_blank" }
[:material-file-document:  Archive](https://doi.org/10.48436/mtha8-w2406){ .md-button .md-button--secondary target="_blank" }
## Description
This digital record contains historical air pollution and air quality data from approximately 20 air monitoring stations
in Vienna, spanning the years from 1980 to 2021. The data was provided by the Umweltbundesamt and is stored in its
original form in this record. This record forms the basis of an analysis carried out in a bachelor's thesis at the TU
Wien.
## Solution
<figure markdown>
![Jupyter Notebook](../../images/screenshots/air-notebook.png){ .img-border }
<figcaption>Figure 1: Jupyter Notebook accessing data on DBRepo using the Python Library.</figcaption>
</figure>
## DBRepo Features
- [x] Import complex dataset
- [x] System versioning
- [x] Subset exploration
- [x] Aggregated views
- [x] Precise identification & PIDs of query tables
- [x] External data access for analysis
## Acknowledgement
This work was part of a cooperation with the [Umweltbundesamt](https://www.umweltbundesamt.at/).
<img src="../../images/logos/umweltbundesamt.png" width=100 />
\ No newline at end of file
---
author: Martin Weise
---
## tl;dr
[:fontawesome-solid-database: &nbsp;Dataset](https://dbrepo1.ec.tuwien.ac.at/pid/15){ .md-button .md-button--primary target="_blank" }
[:simple-github: &nbsp;Archive](https://github.com/CSSEGISandData/COVID-19){ .md-button .md-button--secondary target="_blank" }
## Description
This dataset contains the daily COVID-19 data provided publicly
by [Johns Hopkins University](https://coronavirus.jhu.edu/about/how-to-use-our-data).
## Solution
We imported their daily snapshots, provided as 1145 versioned .csv files in their Git repository archive, into DBRepo
as system-versioned data that can be queried. Because the COVID-19 pandemic was still ongoing during this project, the
continuing daily snapshots required the import script to be maintained and kept correct.
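Since the JHU daily reports are named by date, the list of snapshot files to (re)import can be generated programmatically. A sketch (the date range and file-naming pattern are illustrative assumptions, and the actual DBRepo import call is left as a comment):

```python
from datetime import date, timedelta

# JHU daily reports follow a MM-DD-YYYY.csv naming scheme; this date range
# is illustrative, not the exact range used in the project.
start, end = date(2020, 1, 22), date(2023, 3, 9)

snapshots = []
d = start
while d <= end:
    snapshots.append(d.strftime("%m-%d-%Y") + ".csv")
    d += timedelta(days=1)

# each snapshot would then be imported, e.g.
# client.import_table_data(database_id=..., table_id=...,
#                          file_path=snapshot, separator=",")
print(len(snapshots), snapshots[0], snapshots[-1])
```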
## DBRepo Features
- [x] Data pipeline from Git repository
- [x] System versioning
- [x] Subset exploration
---
author: Martin Weise
---
## tl;dr
tbd
## Description
TBD
## Solution
TBD
## DBRepo Features
- [x] Import through CSV-dataset upload
- [x] Data views implementing embargo period (24 hours)
- [x] External access from Grafana Dashboard
---
author: Martin Weise
---
## tl;dr
[:fontawesome-solid-database: &nbsp;Dataset](https://dbrepo1.ec.tuwien.ac.at/pid/34){ .md-button .md-button--primary target="_blank" }
[:material-file-document: &nbsp;Archive](https://gitlab.tuwien.ac.at/martin.weise/fairnb){ .md-button .md-button--secondary target="_blank" }
## Description
We use a dataset collected by [Aljanaki et al.](https://www2.projects.science.uu.nl/memotion/emotifydata/), consisting
of 400 MP3 music files, each with a playtime of one minute and labeled with one of four genres: rock, pop, classical
and electronic (100 files per genre). The genre is used as the label for the ML model. By generating MFCC
vectors and training an SVM, the model can classify the genre of the provided .mp3 files with an accuracy of 76.25%.
<figure markdown>
![](../../images/screenshots/mfcc-jupyter.png){ .img-border }
<figcaption>Figure 1: Accuracy of predictions matrix in Jupyter Notebook.</figcaption>
</figure>
## Solution
DBRepo is used as the relational storage for the raw and aggregated features, the prediction results, and the splits of
the training and test data. For each of the 400 .mp3 files, 40 MFCC feature vectors are generated. This data is stored
in aggregated form in the [`aggregated_features`](https://dbrepo1.ec.tuwien.ac.at/pid/47) table.
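Storing the train/test split in the database is only useful if the split itself is reproducible. A sketch of such a split over the 400 file ids (the 80/20 ratio and the fixed seed are assumptions, not taken from the use case):

```python
import numpy as np

rng = np.random.default_rng(seed=42)   # fixed seed -> the split is reproducible
file_ids = rng.permutation(400)        # one id per .mp3 file
train_ids, test_ids = file_ids[:320], file_ids[320:]  # assumed 80/20 split
```

The two id lists can then be written to dedicated split tables, e.g. via `client.import_table_data(...)`, so later runs train and evaluate on exactly the same files.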
## DBRepo Features
- [x] Database as storage for machine learning data
- [x] System versioning
- [x] Subset exploration
- [x] Precise identification & PIDs of database tables
- [x] External data access for analysis
......@@ -15,7 +15,7 @@ maintenance, quality of products and ultimately process efficiency and -producti
<figure markdown>
![](../../images/screenshots/power.png)
<figcaption>Figure 1: aaaa from <a href="https://publik.tuwien.ac.at/files/PubDat_252294.pdf">Hacksteiner (2016)</a>.</figcaption>
<figcaption>Figure 1: Total power usage of machine floor TU Pilotfabrik, image from <a href="https://publik.tuwien.ac.at/files/PubDat_252294.pdf">Hacksteiner (2016)</a>.</figcaption>
</figure>
## Solution
......
---
author: Martin Weise
---
## tl;dr
[:fontawesome-solid-database: &nbsp;Dataset](https://handle.stage.datacite.org/10.82556/g2ac-vh88){ .md-button .md-button--primary target="_blank" }
[:simple-jupyter: &nbsp;Notebook](https://binder.science.datalab.tuwien.ac.at/v2/git/https%3A%2F%2Fgitlab.tuwien.ac.at%2Fmartin.weise%2Ftres/HEAD){ .md-button .md-button--secondary target="_blank" }
## Description
This digital record contains historical air pollution and air quality data from approximately 20 air monitoring stations
in Vienna, spanning the years from 1980 to 2021. The data was provided by the Umweltbundesamt and is stored in its
original form in this record. This record forms the basis of an analysis carried out in a bachelor's thesis at the TU
Wien.
## Solution
<figure markdown>
![Jupyter Notebook](../../images/screenshots/air-notebook.png){ .img-border }
<figcaption>Figure 1: Jupyter Notebook accessing data on DBRepo using the Python Library.</figcaption>
</figure>
## DBRepo Features
- [x] Import complex dataset
- [x] System versioning
- [x] Subset exploration
- [x] Aggregated views
- [x] Precise identification & PIDs of query tables
- [x] External data access for analysis
## Acknowledgement
This work was part of a cooperation with the [Umweltbundesamt](https://www.umweltbundesamt.at/).
<img src="../../images/logos/umweltbundesamt.png" width=100 />
\ No newline at end of file
---
author: Martin Weise
---
## tl;dr
[:fontawesome-solid-database: &nbsp;Dataset](https://dbrepo1.ec.tuwien.ac.at/database/27/info){ .md-button .md-button--primary target="_blank" }
[:simple-jupyter: &nbsp;Notebook](https://binder.science.datalab.tuwien.ac.at/v2/git/https%3A%2F%2Fgitlab.tuwien.ac.at%2Ffairdata%2Fxps/HEAD){ .md-button .md-button--secondary target="_blank" }
## Description
X-ray Photoelectron Spectroscopy (XPS) is one of the most widely used methods in materials science. Irradiating solid
materials with X-rays ejects electrons that are bound close to the atomic nucleus. With XPS data being
highly reproducible once machine parameters are known and understood, the demand for creating a comprehensive database
connecting material properties to compositions via XPS spectra becomes evident.
## Solution
We read XPS data from the VAMAS-encoded format and inserted it into a
[database schema](https://gitlab.tuwien.ac.at/fairdata/xps/-/blob/e17860399b1b109c72b01888766f37193dde5870/sql/create_schema.sql)
that captures the VAMAS schema. It can then be read using the Python Library, which executes a database query in SQL to
obtain only the experiment data (cf. the [subset page](https://dbrepo1.ec.tuwien.ac.at/database/27/subset/10/info)).
<figure markdown>
![Jupyter Notebook](../../images/screenshots/xps-notebook.png){ .img-border }
<figcaption>Figure 1: Jupyter Notebook accessing data on DBRepo using the Python Library.</figcaption>
</figure>
Using the DataFrame representation of the Python Library and the [`plotly`](https://pypi.org/project/plotly/) library,
we can visualize the ordinate values directly in the Jupyter Notebook.
<figure markdown>
![Three charts displaying surface analysis data of C, O and Su](../../images/screenshots/xps-chart.png){ .img-border }
<figcaption>Figure 2: Plot of ordinate values encoded within the experiment block.</figcaption>
</figure>
## DBRepo Features
- [x] Data preservation of VAMAS-encoded XPS data
- [x] Subset exploration
- [x] External visualization of the database
- [x] Replication of experiments using only open-source software
## Acknowledgement
This work was part of a cooperation with the [Institute of Applied Physics](http://www.iap.tuwien.ac.at/).
<img src="../../images/logos/iap.png" width=100 />
\ No newline at end of file
.docs/images/logos/iap.png

2.93 KiB

.docs/images/logos/umweltbundesamt.png

9.24 KiB

.docs/images/screenshots/air-notebook.png

124 KiB

.docs/images/screenshots/mfcc-jupyter.png

79.5 KiB

.docs/images/screenshots/xps-chart.png

113 KiB → 61.4 KiB (image replaced)
.docs/images/screenshots/xps-notebook.png

121 KiB

......@@ -2,8 +2,6 @@
author: Martin Weise
---
## Why use DBRepo?
Digital repositories increasingly face the problem of making the databases in their
collections accessible. Challenges revolve around organizing, searching and retrieving content stored within databases and
constitute a major technical burden, as their internal representation differs greatly from the static documents most digital
......@@ -21,28 +19,29 @@ evolving, allows reproducing of query results and supports findable-, accessible
DBRepo makes your dataset searchable without extra effort: most metadata is generated automatically for data in your
databases. The fast and powerful OpenSearch database allows a fast retrieval of any information. Adding semantic mapping
through a suggestion-feature, allows machines to properly understand the context of your data. [Learn more.](../system-services-search/)
through a suggestion feature allows machines to properly understand the context of your data.
[Learn more](../concepts/search).
### Citable datasets
Adopting the recommendations of the RDA-WGDC, arbitrary subsets can be precisely, persistently identified using
system-versioned tables of MariaDB and the DataCite schema for minting DOIs. External systems, e.g. metadata harvesters
(OpenAIRE, Google Datasets) can access these datasets through OAI-PMH, JSON-LD and FAIR Signposting protocols.
[Learn more.](../system-services-metadata/)
(OpenAIRE, Google Datasets) can access these datasets through OAI-PMH, JSON-LD and FAIR Signposting protocols.
[Learn more](../concepts/pid).
### Powerful API for Data Scientists
With our strongly typed Python Library, Data Scientists can import, export and work with data from Jupyter Notebook or
Python script, optionally using Pandas DataFrames. For example: the AMQP API Client can collect continuous data from
edge devices like sensors and store them asynchronous in DBRepo. [Learn more.](../usage-python/)
edge devices like sensors and store them asynchronously in DBRepo. [Learn more](../api/python).
### Cloud Native
Our lightweight Helm chart allows for installations on any cloud provider or private-cloud setting that has an
underlying PV storage provider. DBRepo can be installed from
the [Artifact Hub](https://artifacthub.io/packages/helm/dbrepo/dbrepo) repository. Databases are managed as MariaDB
Galera Cluster with high degree of availability ensuring your data is always accessible.
[Learn more.](../deployment-helm/)
Galera Cluster with a high degree of availability, ensuring your data is always accessible.
[Learn more](../kubernetes).
## Demo Site
......
......@@ -125,7 +125,7 @@ build-helm:
- echo "$CI_REGISTRY_PASSWORD" | docker login --username "$CI_REGISTRY_USER" --password-stdin $CI_REGISTRY_URL
script:
- apk add sed helm curl
- helm package ./helm/dbrepo --sign --key 'Martin Weise' --keyring ./secring.gpg --destination ./build
- helm package ./helm/dbrepo --destination ./build
verify-install-script:
image: docker.io/docker:24-dind
......
......@@ -167,19 +167,19 @@
},
"boto3": {
"hashes": [
"sha256:7e8418b47dd43954a9088d504541bed8a42b6d06e712d02befba134c1c4d7c6d",
"sha256:7f676daef674fe74f34ce4063228eccc6e60c811f574720e31f230296c4bf29a"
"sha256:b781d267dd5e7583966e05697f6bd45e2f46c01dc619ba0860b042963ee69296",
"sha256:c163fb7135a94e7b8c8c478a44071c843f05e212fa4bec3105f8a437ecbf1bcb"
],
"index": "pypi",
"version": "==1.34.126"
"version": "==1.34.130"
},
"botocore": {
"hashes": [
"sha256:7a8ccb6a7c02456757a984a3a44331b6f51c94cb8b9b287cd045122fd177a4b0",
"sha256:7eff883c638fe30e0b036789df32d851e093d12544615a3b90062b42ac85bdbc"
"sha256:a242b3b0a836b14f308a309565cd63e88654cec238f9b73abbbd3c0526db4c81",
"sha256:a3b36e9dac1ed31c4cb3a5c5e540a7d8a9b90ff1d17f87734e674154b41776d8"
],
"markers": "python_version >= '3.8'",
"version": "==1.34.126"
"version": "==1.34.130"
},
"certifi": {
"hashes": [
......@@ -887,45 +887,54 @@
},
"numpy": {
"hashes": [
"sha256:03a8c78d01d9781b28a6989f6fa1bb2c4f2d51201cf99d3dd875df6fbd96b23b",
"sha256:08beddf13648eb95f8d867350f6a018a4be2e5ad54c8d8caed89ebca558b2818",
"sha256:1af303d6b2210eb850fcf03064d364652b7120803a0b872f5211f5234b399f20",
"sha256:1dda2e7b4ec9dd512f84935c5f126c8bd8b9f2fc001e9f54af255e8c5f16b0e0",
"sha256:2a02aba9ed12e4ac4eb3ea9421c420301a0c6460d9830d74a9df87efa4912010",
"sha256:2e4ee3380d6de9c9ec04745830fd9e2eccb3e6cf790d39d7b98ffd19b0dd754a",
"sha256:3373d5d70a5fe74a2c1bb6d2cfd9609ecf686d47a2d7b1d37a8f3b6bf6003aea",
"sha256:47711010ad8555514b434df65f7d7b076bb8261df1ca9bb78f53d3b2db02e95c",
"sha256:4c66707fabe114439db9068ee468c26bbdf909cac0fb58686a42a24de1760c71",
"sha256:50193e430acfc1346175fcbdaa28ffec49947a06918b7b92130744e81e640110",
"sha256:52b8b60467cd7dd1e9ed082188b4e6bb35aa5cdd01777621a1658910745b90be",
"sha256:60dedbb91afcbfdc9bc0b1f3f402804070deed7392c23eb7a7f07fa857868e8a",
"sha256:62b8e4b1e28009ef2846b4c7852046736bab361f7aeadeb6a5b89ebec3c7055a",
"sha256:666dbfb6ec68962c033a450943ded891bed2d54e6755e35e5835d63f4f6931d5",
"sha256:675d61ffbfa78604709862923189bad94014bef562cc35cf61d3a07bba02a7ed",
"sha256:679b0076f67ecc0138fd2ede3a8fd196dddc2ad3254069bcb9faf9a79b1cebcd",
"sha256:7349ab0fa0c429c82442a27a9673fc802ffdb7c7775fad780226cb234965e53c",
"sha256:7ab55401287bfec946ced39700c053796e7cc0e3acbef09993a9ad2adba6ca6e",
"sha256:7e50d0a0cc3189f9cb0aeb3a6a6af18c16f59f004b866cd2be1c14b36134a4a0",
"sha256:95a7476c59002f2f6c590b9b7b998306fba6a5aa646b1e22ddfeaf8f78c3a29c",
"sha256:96ff0b2ad353d8f990b63294c8986f1ec3cb19d749234014f4e7eb0112ceba5a",
"sha256:9fad7dcb1aac3c7f0584a5a8133e3a43eeb2fe127f47e3632d43d677c66c102b",
"sha256:9ff0f4f29c51e2803569d7a51c2304de5554655a60c5d776e35b4a41413830d0",
"sha256:a354325ee03388678242a4d7ebcd08b5c727033fcff3b2f536aea978e15ee9e6",
"sha256:a4abb4f9001ad2858e7ac189089c42178fcce737e4169dc61321660f1a96c7d2",
"sha256:ab47dbe5cc8210f55aa58e4805fe224dac469cde56b9f731a4c098b91917159a",
"sha256:afedb719a9dcfc7eaf2287b839d8198e06dcd4cb5d276a3df279231138e83d30",
"sha256:b3ce300f3644fb06443ee2222c2201dd3a89ea6040541412b8fa189341847218",
"sha256:b97fe8060236edf3662adfc2c633f56a08ae30560c56310562cb4f95500022d5",
"sha256:bfe25acf8b437eb2a8b2d49d443800a5f18508cd811fea3181723922a8a82b07",
"sha256:cd25bcecc4974d09257ffcd1f098ee778f7834c3ad767fe5db785be9a4aa9cb2",
"sha256:d209d8969599b27ad20994c8e41936ee0964e6da07478d6c35016bc386b66ad4",
"sha256:d5241e0a80d808d70546c697135da2c613f30e28251ff8307eb72ba696945764",
"sha256:edd8b5fe47dab091176d21bb6de568acdd906d1887a4584a15a9a96a1dca06ef",
"sha256:f870204a840a60da0b12273ef34f7051e98c3b5961b61b0c2c1be6dfd64fbcd3",
"sha256:ffa75af20b44f8dba823498024771d5ac50620e6915abac414251bd971b4529f"
"sha256:04494f6ec467ccb5369d1808570ae55f6ed9b5809d7f035059000a37b8d7e86f",
"sha256:0a43f0974d501842866cc83471bdb0116ba0dffdbaac33ec05e6afed5b615238",
"sha256:0e50842b2295ba8414c8c1d9d957083d5dfe9e16828b37de883f51fc53c4016f",
"sha256:0ec84b9ba0654f3b962802edc91424331f423dcf5d5f926676e0150789cb3d95",
"sha256:17067d097ed036636fa79f6a869ac26df7db1ba22039d962422506640314933a",
"sha256:1cde1753efe513705a0c6d28f5884e22bdc30438bf0085c5c486cdaff40cd67a",
"sha256:1e72728e7501a450288fc8e1f9ebc73d90cfd4671ebbd631f3e7857c39bd16f2",
"sha256:2635dbd200c2d6faf2ef9a0d04f0ecc6b13b3cad54f7c67c61155138835515d2",
"sha256:2ce46fd0b8a0c947ae047d222f7136fc4d55538741373107574271bc00e20e8f",
"sha256:34f003cb88b1ba38cb9a9a4a3161c1604973d7f9d5552c38bc2f04f829536609",
"sha256:354f373279768fa5a584bac997de6a6c9bc535c482592d7a813bb0c09be6c76f",
"sha256:38ecb5b0582cd125f67a629072fed6f83562d9dd04d7e03256c9829bdec027ad",
"sha256:3e8e01233d57639b2e30966c63d36fcea099d17c53bf424d77f088b0f4babd86",
"sha256:3f6bed7f840d44c08ebdb73b1825282b801799e325bcbdfa6bc5c370e5aecc65",
"sha256:4554eb96f0fd263041baf16cf0881b3f5dafae7a59b1049acb9540c4d57bc8cb",
"sha256:46e161722e0f619749d1cd892167039015b2c2817296104487cd03ed4a955995",
"sha256:49d9f7d256fbc804391a7f72d4a617302b1afac1112fac19b6c6cec63fe7fe8a",
"sha256:4d2f62e55a4cd9c58c1d9a1c9edaedcd857a73cb6fda875bf79093f9d9086f85",
"sha256:5f64641b42b2429f56ee08b4f427a4d2daf916ec59686061de751a55aafa22e4",
"sha256:63b92c512d9dbcc37f9d81b123dec99fdb318ba38c8059afc78086fe73820275",
"sha256:6d7696c615765091cc5093f76fd1fa069870304beaccfd58b5dcc69e55ef49c1",
"sha256:79e843d186c8fb1b102bef3e2bc35ef81160ffef3194646a7fdd6a73c6b97196",
"sha256:821eedb7165ead9eebdb569986968b541f9908979c2da8a4967ecac4439bae3d",
"sha256:84554fc53daa8f6abf8e8a66e076aff6ece62de68523d9f665f32d2fc50fd66e",
"sha256:8d83bb187fb647643bd56e1ae43f273c7f4dbcdf94550d7938cfc32566756514",
"sha256:903703372d46bce88b6920a0cd86c3ad82dae2dbef157b5fc01b70ea1cfc430f",
"sha256:9416a5c2e92ace094e9f0082c5fd473502c91651fb896bc17690d6fc475128d6",
"sha256:9a1712c015831da583b21c5bfe15e8684137097969c6d22e8316ba66b5baabe4",
"sha256:9c27f0946a3536403efb0e1c28def1ae6730a72cd0d5878db38824855e3afc44",
"sha256:a356364941fb0593bb899a1076b92dfa2029f6f5b8ba88a14fd0984aaf76d0df",
"sha256:a7039a136017eaa92c1848152827e1424701532ca8e8967fe480fe1569dae581",
"sha256:acd3a644e4807e73b4e1867b769fbf1ce8c5d80e7caaef0d90dcdc640dfc9787",
"sha256:ad0c86f3455fbd0de6c31a3056eb822fc939f81b1618f10ff3406971893b62a5",
"sha256:b4c76e3d4c56f145d41b7b6751255feefae92edbc9a61e1758a98204200f30fc",
"sha256:b6f6a8f45d0313db07d6d1d37bd0b112f887e1369758a5419c0370ba915b3871",
"sha256:c5a59996dc61835133b56a32ebe4ef3740ea5bc19b3983ac60cc32be5a665d54",
"sha256:c73aafd1afca80afecb22718f8700b40ac7cab927b8abab3c3e337d70e10e5a2",
"sha256:cee6cc0584f71adefe2c908856ccc98702baf95ff80092e4ca46061538a2ba98",
"sha256:cef04d068f5fb0518a77857953193b6bb94809a806bd0a14983a8f12ada060c9",
"sha256:cf5d1c9e6837f8af9f92b6bd3e86d513cdc11f60fd62185cc49ec7d1aba34864",
"sha256:e61155fae27570692ad1d327e81c6cf27d535a5d7ef97648a17d922224b216de",
"sha256:e7f387600d424f91576af20518334df3d97bc76a300a755f9a8d6e4f5cadd289",
"sha256:ed08d2703b5972ec736451b818c2eb9da80d66c3e84aed1deeb0c345fefe461b",
"sha256:fbd6acc766814ea6443628f4e6751d0da6593dae29c08c0b2606164db026970c",
"sha256:feff59f27338135776f6d4e2ec7aeeac5d5f7a08a83e80869121ef8164b74af9"
],
"index": "pypi",
"version": "==1.26.4"
"version": "==2.0.0"
},
"opensearch-py": {
"hashes": [
......@@ -1352,11 +1361,11 @@
},
"setuptools": {
"hashes": [
"sha256:54faa7f2e8d2d11bcd2c07bed282eef1046b5c080d1c32add737d7b5817b1ad4",
"sha256:f211a66637b8fa059bb28183da127d4e86396c991a942b028c6650d4319c3fd0"
"sha256:01a1e793faa5bd89abc851fa15d0a0db26f160890c7102cd8dce643e886b47f5",
"sha256:d9b8b771455a97c8a9f3ab3448ebe0b29b5e105f1228bba41028be116985a267"
],
"markers": "python_version >= '3.8'",
"version": "==70.0.0"
"version": "==70.1.0"
},
"six": {
"hashes": [
......@@ -1400,11 +1409,11 @@
},
"urllib3": {
"hashes": [
"sha256:450b20ec296a467077128bff42b73080516e71b56ff59a60a02bef2232c4fa9d",
"sha256:d0570876c61ab9e520d776c38acbbb5b05a776d3f9ff98a5c8fd5162a444cf19"
"sha256:a448b2f64d686155468037e1ace9f2d2199776e17f0a46610480d311f73e3472",
"sha256:dd505485549a7a552833da5e6063639d0d177c04f23bc3864e41e5dc5f612168"
],
"markers": "python_version >= '3.8'",
"version": "==2.2.1"
"version": "==2.2.2"
},
"werkzeug": {
"hashes": [
......@@ -1996,11 +2005,11 @@
},
"urllib3": {
"hashes": [
"sha256:450b20ec296a467077128bff42b73080516e71b56ff59a60a02bef2232c4fa9d",
"sha256:d0570876c61ab9e520d776c38acbbb5b05a776d3f9ff98a5c8fd5162a444cf19"
"sha256:a448b2f64d686155468037e1ace9f2d2199776e17f0a46610480d311f73e3472",
"sha256:dd505485549a7a552833da5e6063639d0d177c04f23bc3864e41e5dc5f612168"
],
"markers": "python_version >= '3.8'",
"version": "==2.2.1"
"version": "==2.2.2"
},
"wrapt": {
"hashes": [
......
No preview for this file type
No preview for this file type
......@@ -95,6 +95,35 @@ public class SubsetServiceIntegrationTest extends AbstractUnitTest {
assertEquals(0.0, response.getResult().get(2).get("rainfall"));
}
@Test
public void execute_joinWithAlias_succeeds() throws QueryStoreInsertException, TableMalformedException, SQLException,
QueryNotFoundException, InterruptedException, UserNotFoundException, NotAllowedException,
RemoteUnavailableException, ServiceException, DatabaseNotFoundException {
/* pre-condition */
Thread.sleep(1000) /* wait for test container some more */;
/* mock */
when(metadataServiceGateway.getUserById(QUERY_1_CREATED_BY))
.thenReturn(QUERY_1_CREATOR);
/* test */
final QueryResultDto response = queryService.execute(DATABASE_1_PRIVILEGED_DTO, QUERY_7_STATEMENT, Instant.now(), USER_1_ID, 0L, 10L, null, null);
assertNotNull(response);
assertNotNull(response.getId());
assertNotNull(response.getHeaders());
assertEquals(5, response.getHeaders().size());
assertEquals(List.of(Map.of("id", 0), Map.of("date", 1), Map.of("location", 2), Map.of("lat", 3), Map.of("lng", 4)), response.getHeaders());
assertNotNull(response.getResult());
assertEquals(1, response.getResult().size());
/* row 0 */
assertEquals(BigInteger.valueOf(1L), response.getResult().get(0).get("id"));
assertEquals(Instant.ofEpochSecond(1228089600), response.getResult().get(0).get("date"));
assertEquals("Albury", response.getResult().get(0).get("location"));
assertEquals(-36.0653583, response.getResult().get(0).get("lat"));
assertEquals(146.9112214, response.getResult().get(0).get("lng"));
}
@Test
public void execute_oneResult_succeeds() throws QueryStoreInsertException, TableMalformedException, SQLException,
QueryNotFoundException, InterruptedException, UserNotFoundException, NotAllowedException,
......