The quality checker relies on the [demonstrator 4.2](https://git.rwth-aachen.de/fair-ds/ap-4-2-demonstrator/ap-4.2-data-validation-and-quality-assurance-demonstrator) to perform the checks. Thus, RuQaD relies
on further maintenance by the demonstrator's development team.
<!--*\<(Optional) Directory/File Location>*-->
##### Directory/File Location
**Source code:** `src/ruqad/qualitycheck.py`
<!--*\<(Optional) Fulfilled Requirements>*
*\<(optional) Open Issues/Problems/Risks>*-->
##### Open issues #####
- The demonstrator 4.2 service currently runs as GitLab pipeline jobs, which introduces
...
...
#### RuQaD Crawler
<!--*\<Purpose/Responsibility>*-->
##### Purpose/Responsibility
The RuQaD Crawler executes metadata checks and bundles data and metadata for insertion into the
BatCAT dataspace.
<!--*\<Interface(s)>*-->
##### Interface(s)
The crawler is implemented as a Python module with a function `trigger_crawler(...)` that searches
for data and quality-check files, evaluates them, and inserts the results into the BatCAT
dataspace. It uses LinkAhead's crawler framework for metadata checks, object creation, and
interaction with the target dataspace.
<!-- *\<(optional) Open Issues/Problems/Risks>* -->
<!-- #### \<Name interface 1> -->
<!-- … -->
<!-- #### \<Name interface m> -->
### Level 2
#### White Box RuQaD Crawler
...
...
This functionality is extended by custom converters and data transformers.
- **Converter:** Custom conversion plugin to create resulting (sub-)entities.
- **Transformer:** Custom conversion plugin to transform input data into properties.
<!-- *\<(optionally) describe important interfaces>* -->
##### Crawler wrapper #####
<b style="color: red; font-size: 32pt">TODO: rewrite from overall view to wrapper component description</b>
<!--*\<Purpose/Responsibility>* -->
###### Purpose / Responsibility
The crawler wrapper scans files in specific directories of the file system and synchronizes them
with the LinkAhead instance. Before insertion and updates of `Records` in LinkAhead, a metadata
...
...
validation error messages and prevents insertions or updates of the scan result. The
component also checks the FAIRness of the data exported from kadi4mat (in ELN
format).
<!--*\<Interface(s)>*-->
###### Interface(s)
The crawler uses:
- A cfood (file in YAML format) which specifies the mapping from data found on the file system to `Records` in LinkAhead.
- A definition of the identifiables (file in YAML format) which defines the properties that are needed to uniquely identify `Records` of the data model in LinkAhead.
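Each identifiable maps a record type to the minimal set of properties needed to uniquely identify its `Records`. A minimal sketch of such a file (the record type and property names below are invented placeholders; the real definitions ship with the RuQaD sources):

```yaml
# Hypothetical identifiables definition -- names are illustrative only.
Dataset:
  - name
  - checksum
Sample:
  - external_id
```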
...
...
The interface is a Python function implemented as a module in the RuQaD demonstrator. The function calls the scanner and main crawler functions of the