Commit 963d36e8 authored by Alexander Schlemmer

Merge branch 'f-logging' into 'dev'

ENH: enable logging for commandline use

See merge request !159
parents 24c61846 c853fc02
2 merge requests: !178 FIX: #96 Better error output for crawl.py script., !159 ENH: enable logging for commandline use
Pipeline #51845 passed with warnings
@@ -74,6 +74,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 * The `identifiable_adapters.IdentifiableAdapter` uses entity ids (negative for
   entities that don't exist remotely) instead of entity objects for keeping
   track of references.
+* Log output is either written to $SHARED_DIR/ (when this variable is set) or just to the terminal.

 ### Deprecated ###
@@ -1550,11 +1550,19 @@ def crawler_main(crawled_directory_path: str,
     try:
         crawler = Crawler(securityMode=securityMode)
-        # setup logging and reporting if serverside execution
-        if "SHARED_DIR" in os.environ:
+        if "SHARED_DIR" in os.environ:  # setup logging and reporting if serverside execution
             userlog_public, htmluserlog_public, debuglog_public = configure_server_side_logging()
+            # TODO make this optional
             _create_status_record(
-                get_config_setting("public_host_url") + "/Shared/" + htmluserlog_public, crawler.run_id)
+                get_config_setting("public_host_url") + "/Shared/" + htmluserlog_public,
+                crawler.run_id)
+        else:  # setup stdout logging for other cases
+            root_logger = logging.getLogger()
+            root_logger.setLevel(level=(logging.DEBUG if debug else logging.INFO))
+            handler = logging.StreamHandler(stream=sys.stdout)
+            handler.setLevel(logging.DEBUG if debug else logging.INFO)
+            root_logger.addHandler(handler)
+            logger.handlers.clear()

         debug_tree = DebugTree()
         crawled_data = scan_directory(
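The fallback branch added above uses only the standard library, so the same behaviour can be reproduced outside the crawler. The following is a minimal sketch, not code from this commit: the helper name setup_commandline_logging and its debug flag are hypothetical stand-ins that mirror the logic crawler_main now runs when SHARED_DIR is unset.

# Minimal sketch (hypothetical helper) mirroring the SHARED_DIR fallback above.
import logging
import os
import sys


def setup_commandline_logging(debug: bool = False) -> None:
    """Route log records to stdout when no SHARED_DIR is available."""
    level = logging.DEBUG if debug else logging.INFO
    root_logger = logging.getLogger()
    root_logger.setLevel(level)
    if "SHARED_DIR" not in os.environ:
        # Command-line case: one StreamHandler on stdout, same level as the root logger.
        handler = logging.StreamHandler(stream=sys.stdout)
        handler.setLevel(level)
        root_logger.addHandler(handler)
    # The server-side case (SHARED_DIR set) is handled separately in the commit,
    # via configure_server_side_logging().


if __name__ == "__main__":
    setup_commandline_logging(debug=True)
    logging.getLogger("caoscrawler.demo").info("this line appears on the terminal")

The logger.handlers.clear() call in the committed code presumably leaves the module-level logger with no handlers of its own, so its records propagate to the root logger's stdout handler and are not printed twice.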