Crawler cache access fails on Windows

Summary

Running the Hello World example from the caoscrawler documentation fails with the following traceback:

>>> inserts, updates = crawler.synchronize(commit_changes=True, unique_names=True, crawled_data=[hello_rec])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\caosdb\miniconda3\lib\site-packages\caoscrawler\crawl.py", line 868, in synchronize
    update_cache = UpdateCache()
  File "C:\Users\caosdb\miniconda3\lib\site-packages\caosadvancedtools\cache.py", line 86, in __init__
    self.create_cache()
  File "C:\Users\caosdb\miniconda3\lib\site-packages\caosadvancedtools\cache.py", line 385, in create_cache
    self.run_sql_commands([
  File "C:\Users\caosdb\miniconda3\lib\site-packages\caosadvancedtools\cache.py", line 162, in run_sql_commands
    conn = sqlite3.connect(self.db_file)
sqlite3.OperationalError: unable to open database file

Expected Behavior

The synchronize() call should complete without errors and return the lists of inserted and updated entities.

Actual Behavior

synchronize() raises sqlite3.OperationalError: unable to open database file while constructing the UpdateCache (caosadvancedtools/cache.py, run_sql_commands), before any data is synchronized.
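
For reference, sqlite3 raises this error when it cannot create or open the database file, most commonly because the parent directory of the database path does not exist or is not writable; sqlite3.connect() creates the file itself but never its directories. A minimal standalone demonstration (the path below is made up for illustration):

>>> import sqlite3
>>> sqlite3.connect(r"C:\no\such\directory\cache.db")
Traceback (most recent call last):
  ...
sqlite3.OperationalError: unable to open database file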

Steps to Reproduce the Problem


  1. Follow the Hello World example from the caoscrawler documentation on a Windows machine; the failing synchronize() call is reconstructed in the sketch after this list.
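
A minimal sketch of the failing call, reconstructed from the traceback above. The record setup is an assumption on my part and may differ from the documented Hello World example; the record name and the parent RecordType HelloWorld are placeholders:

>>> import caosdb as db
>>> from caoscrawler.crawl import Crawler
>>> hello_rec = db.Record(name="hello")      # placeholder record name
>>> hello_rec.add_parent(name="HelloWorld")  # assumed RecordType
>>> crawler = Crawler()
>>> inserts, updates = crawler.synchronize(commit_changes=True, unique_names=True, crawled_data=[hello_rec])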

Specifications

  • Version: Installed with pip install caoscrawler
  • Platform: Windows 10 Pro with Miniconda (Python 3.8.5)

Possible fixes

Do you have ideas on how the issue could be resolved?
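
One unverified suspicion: the cache database location used by caosadvancedtools may be a POSIX-style path (e.g. under /tmp, or a directory that is never created on Windows), so sqlite3.connect() cannot open the file there. If that diagnosis holds, a platform-independent default along the following lines could resolve it; default_cache_file and the caosdb subdirectory are hypothetical names for illustration, not the library's actual API:

import os
import tempfile

def default_cache_file(name="crawler_update_cache.db"):
    # tempfile.gettempdir() resolves to /tmp on Linux and to the user's
    # TEMP directory on Windows, so a valid base directory always exists.
    cache_dir = os.path.join(tempfile.gettempdir(), "caosdb")
    # sqlite3.connect() never creates parent directories, so do it here.
    os.makedirs(cache_dir, exist_ok=True)
    return os.path.join(cache_dir, name)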
