
User Guide#

piveau-metrics offers the following methods of interaction:

  • A comprehensive RESTful API that offers all functionalities of piveau-metrics
  • A single-page application (piveau-metrics-ui), mainly aimed at end users. This UI is considered deprecated and is no longer developed; a new UI is planned.

Using the API#

The main API and interaction point is offered by the piveau-metrics-cache service. You can find the OpenAPI documentation on the base route of the service (e.g. http://localhost:8185, if you followed the quick start guide), or alternatively in the service's repository on GitLab. The documentation describes all available endpoints and interaction methods. To get you started, we will follow the steps for creating a catalogue and a dataset from the piveau-hub user documentation, with the metrics services enabled. This requires that you have piveau-hub and the piveau-metrics services running and the API key for the repo at hand; you can find the latter in the configuration of the piveau-hub-repo. We also assume that the piveau-hub-repo is configured to use the metrics services, as shown in the example Docker Compose file.
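The sketch below shows roughly what those two creation steps could look like from Python. It assumes the PUT /catalogues/{id} and PUT /datasets/{id}?catalogue={id} endpoints and the X-API-Key header described in the piveau-hub user documentation; the URLs, file names and key are placeholders for your setup.

import requests

HUB = "http://localhost:8081"      # piveau-hub-repo base URL from the quick start guide
API_KEY = "your-repo-api-key"      # placeholder; taken from the piveau-hub-repo configuration
HEADERS = {"Content-Type": "text/turtle", "X-API-Key": API_KEY}

# Turtle payloads as given in the piveau-hub user documentation
catalogue_ttl = open("example-catalogue.ttl").read()
dataset_ttl = open("example-dataset.ttl").read()

# 1. Create the catalogue
requests.put(f"{HUB}/catalogues/example-catalogue", data=catalogue_ttl, headers=HEADERS)

# 2. Create the dataset inside that catalogue; with the metrics integration enabled,
#    this is what feeds the metrics services
requests.put(f"{HUB}/datasets/example-dataset",
             params={"catalogue": "example-catalogue"},
             data=dataset_ttl, headers=HEADERS)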

Accessing the metrics#

After creating the catalogue and the dataset, you can find the metrics graph here:

GET http://localhost:8081/datasets/example-dataset/metrics

Info

The dataset has to be created while the metrics services are running.
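As a quick check from Python, you can request that metrics graph directly; the Accept header below is an assumption, pick whatever RDF serialization your setup supports:

import requests

resp = requests.get(
    "http://localhost:8081/datasets/example-dataset/metrics",
    headers={"Accept": "text/turtle"},   # assumption: the endpoint serves Turtle on request
)
resp.raise_for_status()
print(resp.text)   # the metrics graph for the dataset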

Using the cache API#

The cache provides a simple REST API that returns metrics for single datasets and distributions, as well as aggregated metrics for whole catalogues, countries, or the entire portal, as JSON.

Data Refresh#

To get the metrics metadata for the aggregations from the triplestore into the cache, the cache has to retrieve them. This can be configured to happen on a schedule; in this guide, we trigger it manually via the refresh endpoint.

GET http://localhost:8185/admin/refresh
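From Python, the manual trigger is a one-liner; depending on your configuration the admin routes may be protected by an API key, this sketch assumes the open quick-start setup:

import requests

# Trigger a manual refresh of the cached aggregations
resp = requests.get("http://localhost:8185/admin/refresh")
print(resp.status_code)   # a 2xx status indicates the refresh was accepted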

Getting Dataset and Distribution#

To get the metrics data for a single dataset, you can call

GET http://localhost:8185/datasets/example-dataset

For the metrics data for all distributions of a dataset, you can call

GET http://localhost:8185/datasets/example-dataset/distributions

This includes all metrics data for each of the dataset's distributions, including the SHACL violations.

Info

This data comes directly from the triplestore, so no cache refresh is needed for these endpoints.
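Put together, the two calls could look like this in Python (the exact shape of the JSON response is best checked against the OpenAPI documentation):

import requests

CACHE = "http://localhost:8185"   # piveau-metrics-cache base URL from the quick start guide

# Metrics for the dataset itself
dataset_metrics = requests.get(f"{CACHE}/datasets/example-dataset").json()

# Metrics for all of its distributions, including the SHACL violations
distribution_metrics = requests.get(f"{CACHE}/datasets/example-dataset/distributions").json()

print(dataset_metrics)
print(distribution_metrics)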

Getting aggregated data#

The cache API offers aggregated results over a) all datasets, b) all datasets in a single catalogue, and c) all datasets from a specific country. You can retrieve the most current aggregations as well as a timeline.

To get the most current aggregation over all datasets, you can call

GET http://localhost:8185/global
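For example, in Python:

import requests

# Most recent aggregation over all datasets in the portal
global_metrics = requests.get("http://localhost:8185/global").json()
print(global_metrics)

The catalogue- and country-level aggregations as well as the timeline endpoints follow the same pattern; their exact paths are listed in the cache's OpenAPI documentation.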

Using the shacl-validation API#

The SHACL validation service validates a dataset against a schema. It is part of the metrics pipeline, but also offers an API to validate a dataset manually. The service can validate against different schemas, e.g. DCAT-AP 1, DCAT-AP 2, or DCAT-AP.de 2.

To validate a dataset against the strictest version of DCAT-AP 2.1.1, you can use this call:

POST http://localhost:8181/shacl/validation/report?shapeModel=dcatap211level1
Content-Type: text/turtle

@prefix dcat: <http://www.w3.org/ns/dcat#> .
@prefix dct: <http://purl.org/dc/terms/> .

<https://example.io/set/data/example-dataset>
    a dcat:Dataset ;
    dct:title "Example Dataset 2"@en ;
    dct:description "This is an example Dataset" ;
    dcat:distribution <https://example.io/set/distribution/1> .

<https://example.io/set/distribution/1>
    a dcat:Distribution ;
    dcat:accessURL <http://a-csv-file.com> ;
    dct:format <http://publications.europa.eu/resource/authority/file-type/CSV> ;
    dct:title "Example Distribution 1" .