caosdb / Software / CaosDB Crawler

Merge request !220: F doc workflow
Overview: 0 · Commits: 4 · Pipelines: 1 · Changes: 1
Merged: Henrik tom Wörden requested to merge f-doc-workflow into dev, 2 months ago
Commit b3669164, "DOC: minor rephrasing", authored by Henrik tom Wörden, 2 months ago
src/doc/workflow.rst (+10 −5)
@@ -4,17 +4,18 @@ Crawler Workflow
The LinkAhead crawler aims to provide a very flexible framework for synchronizing
data on file systems (or potentially other sources of information) with a
running LinkAhead instance. The workflow that is used in the scientific environment
should be chosen according to the user's needs. It is also possible to combine
multiple workflows or use them in parallel.

In this document we will describe several workflows for crawler operation.
Local Crawler Operation
-----------------------
A very simple setup that can also reliably be used for testing
sets up the crawler on a local computer. The files that
are being crawled need to be visible to both the locally
running crawler and the LinkAhead server.
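One common way to make the same files visible to both sides is to bind-mount a host directory into a docker-based LinkAhead deployment. The sketch below is only an illustration: the image name and the container-side mount point are assumptions, not part of this documentation, and must be adapted to your installation.

```shell
# Sketch only: "linkahead-image" and the mount point are placeholders.
# The host directory ./extroot is shared with the container, so the
# locally running crawler and the LinkAhead server read the same files.
docker run -d --name linkahead \
  -v "$PWD/extroot:/opt/caosdb/mnt/extroot" \
  linkahead-image
```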
Prerequisites
+++++++++++++
@@ -58,3 +59,7 @@ Running the crawler
The following command line assumes that the extroot folder visible in the
LinkAhead docker container is located in "../extroot"::

   caosdb-crawler -i identifiables.yml --prefix /extroot --debug --provenance=provenance.yml -s update cfood.yml ../extroot/ExperimentalData/
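To make the path assumptions concrete, the sketch below recreates the expected directory layout under the current directory. The directory names follow the command above; in the real setup the tree lives at ``../extroot`` on the host and is visible as ``/extroot`` inside the container, which is why ``--prefix /extroot`` is passed.

```shell
# Illustrative only: recreate the layout the crawler invocation expects.
mkdir -p extroot/ExperimentalData
ls extroot
# prints: ExperimentalData
```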
Server Side Crawler Operation
-----------------------------
To be filled.