Commit c0c2ea17 authored by Daniel Hornung

Merge branch 'f-revert-rename-optional' into 'f-sav-converter'

MAINT: Rename `h5_crawler` back to `h5-crawler`

See merge request !172
parents f1917c55 5414650b
3 related merge requests:
!178 FIX: #96 Better error output for crawl.py script.
!172 MAINT: Rename `h5_crawler` back to `h5-crawler`
!171 sav/spss converter
Pipeline #52184 passed
@@ -34,7 +34,7 @@ RUN rm -r /git/.git
 # Install pycaosdb.ini for the tests
 RUN mv /git/.docker/tester_pycaosdb.ini /git/integrationtests/pycaosdb.ini
-RUN cd /git/ && pip3 install .[h5_crawler,spss]
+RUN cd /git/ && pip3 install .[h5-crawler,spss]
 WORKDIR /git/integrationtests
 # wait for server,
@@ -131,7 +131,7 @@ unittest_py3.8:
 # TODO: Use f-branch logic here
 - pip install git+https://gitlab.indiscale.com/caosdb/src/caosdb-pylib.git@dev
 - pip install git+https://gitlab.indiscale.com/caosdb/src/caosdb-advanced-user-tools.git@dev
-- pip install .[h5_crawler,spss]
+- pip install .[h5-crawler,spss]
 # actual test
 - caosdb-crawler --help
 - pytest --cov=caosdb -vv ./unittests
@@ -168,7 +168,7 @@ unittest_py3.13:
 # TODO: Use f-branch logic here
 - pip install git+https://gitlab.indiscale.com/caosdb/src/caosdb-pylib.git@dev
 - (! pip install git+https://gitlab.indiscale.com/caosdb/src/caosdb-advanced-user-tools.git@dev)
-- (! pip install .[h5_crawler,spss])
+- (! pip install .[h5-crawler,spss])
 # actual test
 - (! caosdb-crawler --help)
 - (! pytest --cov=caosdb -vv ./unittests)
@@ -44,7 +44,7 @@ console_scripts =
     csv_to_datamodel = caoscrawler.scripts.generators:csv_to_datamodel_main
 [options.extras_require]
-h5_crawler =
+h5-crawler =
     h5py >= 3.8
     numpy
 spss =
@@ -260,13 +260,13 @@ HDF5 Converters
 For treating `HDF5 Files
 <https://docs.hdfgroup.org/hdf5/develop/_s_p_e_c.html>`_, there are in total
-four individual converters corresponding to the internal structure of HDF5 files:
-the :ref:`H5FileConverter` which opens the file itself and creates further
-structure elements from HDF5 groups, datasets, and included multi-dimensional
-arrays that are in turn treated by the :ref:`H5GroupConverter`, the
-:ref:`H5DatasetConverter`, and the :ref:`H5NdarrayConverter`, respectively. You
-need to install the LinkAhead crawler with its optional ``h5crawler`` dependency
-for using these converters.
+four individual converters corresponding to the internal structure of HDF5
+files: the :ref:`H5FileConverter` which opens the file itself and creates
+further structure elements from HDF5 groups, datasets, and included
+multi-dimensional arrays that are in turn treated by the
+:ref:`H5GroupConverter`, the :ref:`H5DatasetConverter`, and the
+:ref:`H5NdarrayConverter`, respectively. You need to install the LinkAhead
+crawler with its optional ``h5-crawler`` dependency for using these converters.
 The basic idea when crawling HDF5 files is to treat them very similar to
 :ref:`dictionaries <DictElement Converter>` in which the attributes on root,
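For reference, the renamed extra is exactly what the install commands in the Dockerfile and CI jobs above request. A minimal sketch of installing the crawler with the optional HDF5 dependency; the source-checkout command is taken from this diff, while the distribution name `caoscrawler` in the second command is an assumption and may differ from the actual published package name:

    # From a source checkout, with the optional HDF5 and SPSS dependencies (as in the CI jobs):
    pip3 install .[h5-crawler,spss]
    # Or only the HDF5 extra from the (assumed) published package:
    pip3 install caoscrawler[h5-crawler]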