
ENH: add security levels that may prevent updates or inserts

Henrik tom Wörden requested to merge f-authorization into dev

Summary

Add the security modes "RETRIEVE", "INSERT" and "UPDATE".
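As a sketch, the three modes could be selected via the crawler's `-s` flag, as in the test commands further below. The lowercase spellings and the exact semantics per mode (read-only / insert-only / insert-and-update) are assumptions inferred from the mode names and from the `-s insert` invocations in the test environment section:

```shell
# Sketch only; lowercase spellings for -s are assumed from the "-s insert" usage below.
# RETRIEVE: read-only, the crawler neither inserts nor updates (assumed semantics).
SHARED_DIR=/tmp python3 -m caoscrawler.crawl -s retrieve -i identifiables.yml dataset_cfoods.yml data
# INSERT: new records may be inserted, existing ones are not updated (assumed semantics).
SHARED_DIR=/tmp python3 -m caoscrawler.crawl -s insert -i identifiables.yml dataset_cfoods.yml data
# UPDATE: both inserts and updates are permitted (assumed semantics).
SHARED_DIR=/tmp python3 -m caoscrawler.crawl -s update -i identifiables.yml dataset_cfoods.yml data
```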

Focus

  • Please check whether you deem the test meaningful (think hard!): is it stable against later changes, and does it meaningfully test the behavior?

Test Environment

Unit and integration tests.

Please make sure that `sendmail_to_file` exists in your PATH (e.g. by adding a symlink in `.local/bin` pointing to the deploy `sendmail` script).
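One possible way to set this up (the target path of the deploy `sendmail` script is a hypothetical placeholder; adjust it to your checkout):

```shell
# Create the symlink; /path/to/deploy/sendmail_to_file is a placeholder, not the real path.
mkdir -p ~/.local/bin
ln -sf /path/to/deploy/sendmail_to_file ~/.local/bin/sendmail_to_file
# Verify that the command is now found on PATH (assumes ~/.local/bin is on PATH).
command -v sendmail_to_file
```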

In the following, we set SHARED_DIR=/tmp to simulate the behavior of the SSS.

Run the following in `integrationtests/test_data/extroot/realworld_example`:

```shell
# insert the data model
python3 -m caosadvancedtools.models.parser --sync schema/dataset.schema.json
python3 -m caosadvancedtools.models.parser --sync schema/dataspace.schema.json
# run the crawler
SHARED_DIR=/tmp python3 -m caoscrawler.crawl --debug -i identifiables.yml -s insert dataset_cfoods.yml data
# make a change to the data (a non-identifying property), e.g. change the comment in
# `data/35/03_raw_data/001_dataset1/metadata.json`
# run the crawler again
SHARED_DIR=/tmp python3 -m caoscrawler.crawl --debug -i identifiables.yml -s insert dataset_cfoods.yml data
# check the mail in `/tmp/mail`; you will see a note on an unauthorized change.
# authorize it using the run id:
SHARED_DIR=/tmp python3 -m caoscrawler.authorize 0f5fc284-0cb8-11ed-8862-88b11163657d
```
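To inspect the notification mail, assuming (per the `sendmail_to_file` note above) that mails are written as files under `/tmp/mail`:

```shell
# List and read the mails written by sendmail_to_file (assumed location: /tmp/mail).
ls /tmp/mail
cat /tmp/mail/*
```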

Once you have tried these steps and they worked, please add the instructions for manually conducting such a test to the README.md in the integration tests.

Check List for the Author

Please prepare your MR for review. Be sure to write a summary and a focus, and create GitLab comments for the reviewer. These should guide the reviewer through the changes, explain them, and point out open questions. For further good practices, have a look at our review guidelines.

  • All automated tests pass
  • Reference related issues
  • Up-to-date CHANGELOG.md (or not necessary)
  • Annotations in code (GitLab comments)
    • Intent of new code
    • Problems with old code
    • Why this implementation?

Check List for the Reviewer

  • I understand the intent of this MR
  • All automated tests pass
  • Up-to-date CHANGELOG.md (or not necessary)
  • The test environment setup works and the intended behavior is reproducible in the test environment
  • In-code documentation and comments are up-to-date.
  • Check: Are there specifications? Are they satisfied?

For further good practices have a look at our review guidelines.

Edited by Florian Spreckelsen
