Data Information System (DAISY) is a data bookkeeping application designed to help biomedical research institutions with their GDPR compliance.
For more information, please refer to the official Daisy documentation.
You are encouraged to try Daisy for yourself using our DEMO deployment.
- docker: https://docs.docker.com/install/
- Get the source code:

  ```bash
  git clone [email protected]:elixir-luxembourg/daisy.git
  cd daisy
  ```

- Create your settings file:

  ```bash
  cp elixir_daisy/settings_local.template.py elixir_daisy/settings_local.py
  ```

  Optional: edit the file `elixir_daisy/settings_local.py` to adapt it to your environment.
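  A minimal sketch of such an override, assuming standard Django settings; check `settings_local.template.py` for the options DAISY actually supports:

  ```python
  # elixir_daisy/settings_local.py -- illustrative values only.
  # DEBUG and ALLOWED_HOSTS are standard Django settings; whether DAISY
  # expects further keys here is defined by settings_local.template.py.

  DEBUG = False  # keep debugging off outside local development

  # Hostnames this instance may be served under
  ALLOWED_HOSTS = ["localhost", "daisy.example.org"]
  ```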
- Build the daisy docker image:

  ```bash
  docker-compose up --build
  ```

  Wait for the build to finish and keep the process running.

- Open a new shell and go to the daisy folder.

- Build the database:

  ```bash
  docker-compose exec web python manage.py migrate
  ```

- Build the solr schema:

  ```bash
  docker-compose exec web python manage.py build_solr_schema -c /solr/daisy/conf -r daisy -u default
  ```

- Compile and deploy static files:

  ```bash
  docker-compose exec web python manage.py collectstatic
  ```
- Create initial data in the database:

  ```bash
  docker-compose exec web bash -c "cd core/fixtures/ && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/edda.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hpo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hdo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hgnc.json"
  docker-compose exec web python manage.py load_initial_data
  ```

  Initial data includes, for instance, controlled vocabulary terms and an initial list of institutions and cohorts. This step can take several minutes to complete.
- Load demo data:

  ```bash
  docker-compose exec web python manage.py load_demo_data
  ```

  This will create mock datasets and projects, and a demo admin account.

- Optional: import users from an active directory instance:

  ```bash
  docker-compose exec web python manage.py import_users
  ```
- Build the search index:

  ```bash
  docker-compose exec web python manage.py rebuild_index -u default
  ```

- Browse to https://localhost. A demo admin account is available: username `admin`, password `demo`.
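If the site does not come up, a quick first check is to confirm all services are running; this is plain docker-compose, nothing DAISY-specific:

```bash
# List the stack's services and their current state
docker-compose ps
```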
In addition to loading the initial data, the DAISY database can be populated by importing Project, Dataset and Partner records from JSON files, using the commands `import_projects`, `import_datasets` and `import_partners` respectively.
The import commands accept one JSON file (flag `-f`):

```bash
docker-compose exec web python manage.py <COMMAND> -f ${PATH_TO_JSON_FILE}
```

where `${PATH_TO_JSON_FILE}` is the path to a JSON file containing the record definitions. See the file `daisy/data/demo/projects.json` as an example.
Alternatively, you can specify a directory containing multiple JSON files to be imported with the `-d` flag:

```bash
docker-compose exec web python manage.py <COMMAND> -d ${PATH_TO_DIR}
```
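Imports will fail on malformed JSON, so it can be worth syntax-checking a file before importing it. A small sketch using Python's standard `json.tool` module; `${PATH_TO_JSON_FILE}` is the same placeholder as above:

```bash
# Verify the file parses as JSON before handing it to an import command
python -m json.tool ${PATH_TO_JSON_FILE} > /dev/null && echo "well-formed JSON"
```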
Information in the DAISY database can be exported to JSON files. The command for export is given below:

```bash
docker-compose exec web python manage.py export_partners -f ${JSON_FILE}
```

where `${JSON_FILE}` is the path to the JSON file that will be produced. In addition to `export_partners`, you can run `export_projects` and `export_datasets` in the same way.
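To snapshot all three record types at once, the export commands can be run in a loop; a sketch, borrowing the date-stamping convention of the backup command below (the file names are arbitrary, and the `-f` path is resolved inside the web container):

```bash
# Export partners, projects and datasets to date-stamped JSON files
for kind in partners projects datasets; do
    docker-compose exec web python manage.py export_${kind} \
        -f ${kind}_`date +%y-%m-%d`.json
done
```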
- Create a database backup:

  ```bash
  docker-compose exec db pg_dump daisy --port=5432 --username=daisy --no-password --clean > backup_`date +%y-%m-%d`.sql
  ```
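  Should you later need to restore such a backup, piping it into `psql` in the `db` container is a reasonable sketch (the dump was made with `--clean`, so existing objects are dropped and recreated; replace the file name with your actual backup):

  ```bash
  # -T disables TTY allocation so the file is piped through stdin cleanly
  cat backup_YY-MM-DD.sql | docker-compose exec -T db psql --username=daisy daisy
  ```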
- Make sure the docker containers are stopped:

  ```bash
  docker-compose stop
  ```

- Get the latest Daisy release:

  ```bash
  git checkout master
  git pull
  ```

- Rebuild and start the docker containers:

  ```bash
  docker-compose up --build
  ```

  Open a new terminal window to execute the following commands.

- Update the database schema:

  ```bash
  docker-compose exec web python manage.py migrate
  ```

- Update the solr schema:

  ```bash
  docker-compose exec web python manage.py build_solr_schema -c /solr/daisy/conf -r daisy -u default
  ```

- Collect static files:

  ```bash
  docker-compose exec web python manage.py collectstatic
  ```
- Reload initial data (optional).

  IMPORTANT NOTE: The initial data package provides default values for various lookup lists, e.g. data sensitivity classes, document or data types. If, while using DAISY, you have customized these default lists, please keep in mind that running the `load_initial_data` command during an update will re-introduce those default values. If this is not desired, then please skip the reloading of initial data during your update. You can manage lookup lists through the application interface.

  ```bash
  docker-compose exec web bash -c "cd core/fixtures/ && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/edda.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hpo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hdo.json && wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hgnc.json"
  docker-compose exec web python manage.py load_initial_data
  ```

  IMPORTANT NOTE: This step can take several minutes to complete.
- Rebuild the search index:

  ```bash
  docker-compose exec web python manage.py rebuild_index -u default
  ```

- Reimport the users (optional).

  If LDAP was used during the initial setup to import users, they have to be imported again:

  ```bash
  docker-compose exec web python manage.py import_users
  ```
See DEPLOYMENT.
To be completed.
```bash
./manage.py import_users
```
Single file mode:

```bash
./manage.py import_projects -f path/to/json_file.json
```

Batch mode:

```bash
./manage.py import_projects -d path/to/dir/with/json/files/
```

Available commands: `import_projects`, `import_datasets`, `import_partners`.
In case of problems, add the `--verbose` flag to the command and take a look inside `./log/daisy.log`.
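For instance, to re-run a failing import with verbose output and then inspect the end of the log (the paths are the placeholders used above):

```bash
# Re-run with verbose output, then show the last lines of the log
./manage.py import_projects -f path/to/json_file.json --verbose
tail -n 50 ./log/daisy.log
```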
Install the frontend dependencies:

```bash
cd web/static/vendor/
npm ci
```

Build the frontend assets:

```bash
cd web/static/vendor
npm run-script build
```

Run the development server:

```bash
./manage.py runserver
```

By default, the Django development server listens on http://127.0.0.1:8000.
The following command will install the test dependencies and execute the tests:

```bash
python setup.py pytest
```

If the test dependencies are already installed, the tests can also be run by executing:

```bash
pytest
```
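During development it is often faster to run just part of the suite; these are standard pytest options, not DAISY-specific ones (the module path and keyword are placeholders):

```bash
# Run a single test module
pytest path/to/test_module.py

# Run only tests whose names match a keyword expression
pytest -k "some_keyword"
```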
To get access to the admin page, you must log in with a superuser account.
In the `Users` section, you can give any user the `staff` status, and they will then be able to access any project/dataset.