diff --git a/doc/sphinx-guides/source/admin/make-data-count.rst b/doc/sphinx-guides/source/admin/make-data-count.rst index 8a96e949ff9..521ee4f97ca 100644 --- a/doc/sphinx-guides/source/admin/make-data-count.rst +++ b/doc/sphinx-guides/source/admin/make-data-count.rst @@ -146,7 +146,9 @@ Configuring Your Dataverse Installation for Make Data Count Citations Please note: as explained in the note above about limitations, this feature is not available to Dataverse installations that use Handles. -To configure your Dataverse installation to pull citations from the test vs. production DataCite server see :ref:`doi.dataciterestapiurlstring` in the Installation Guide. +To configure your Dataverse installation to pull citations from the test vs. +production DataCite server see :ref:`dataverse.pid.datacite.rest-api-url` in +the Installation Guide. Please note that in the curl example, Bash environment variables are used with the idea that you can set a few environment variables and copy and paste the examples as is. For example, "$DOI" could become "doi:10.5072/FK2/BL2IBM" by issuing the following export command from Bash: diff --git a/doc/sphinx-guides/source/api/native-api.rst b/doc/sphinx-guides/source/api/native-api.rst index 5cf90359001..5c6f328703e 100644 --- a/doc/sphinx-guides/source/api/native-api.rst +++ b/doc/sphinx-guides/source/api/native-api.rst @@ -485,6 +485,8 @@ You should expect an HTTP 200 ("OK") response and JSON indicating the database I .. note:: Only a Dataverse installation account with superuser permissions is allowed to include files when creating a dataset via this API. Adding files this way only adds their file metadata to the database, you will need to manually add the physical files to the file system. +.. 
_api-import-dataset:
+
 Import a Dataset into a Dataverse Collection
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -2932,7 +2934,7 @@ Each user can get a dump of their basic information in JSON format by passing in
 curl -H "X-Dataverse-key:$API_TOKEN" $SERVER_URL/api/users/:me
-.. _pids-api:
+
 Managing Harvesting Server and Sets
 -----------------------------------
@@ -3043,6 +3045,10 @@ The fully expanded example above (without the environment variables) looks like
 Only users with superuser permissions may delete harvesting sets.
+
+
+.. _pids-api:
+
 PIDs
 ----
diff --git a/doc/sphinx-guides/source/developers/testing.rst b/doc/sphinx-guides/source/developers/testing.rst
index 7bde4055e33..e711b862463 100755
--- a/doc/sphinx-guides/source/developers/testing.rst
+++ b/doc/sphinx-guides/source/developers/testing.rst
@@ -79,6 +79,17 @@ greatly extended parameterized testing. Some guidance how to write those:
 - https://blog.codefx.org/libraries/junit-5-parameterized-tests/
 - See also some examples in our codebase.
+JUnit 5 Test Helper Extensions
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Our codebase provides little helpers to ease dealing with state during tests.
+Some tests might need to change something which should be restored after the test has run.
+
+For unit tests, the most common need is to set a JVM setting just for the current test.
+Please use the ``@JvmSetting(key = JvmSettings.XXX, value = "")`` annotation on a test method or
+a test class to set and clear the property automatically.
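To make the behavior of such a helper concrete, here is a small, self-contained sketch of the save/set/restore cycle an extension like ``@JvmSetting`` has to perform around each test. This is illustrative only — the class and method names are assumptions, not the actual Dataverse extension, which is driven by the annotation instead:

```java
/**
 * Sketch (hypothetical, simplified) of what a test helper for JVM settings
 * must do: remember the previous value of a system property, apply the
 * override for the duration of the test, and restore the old state afterwards
 * so nothing leaks between tests.
 */
public class JvmSettingSketch {

    /** Runs an action with a system property temporarily overridden. */
    public static void withProperty(String key, String value, Runnable action) {
        String previous = System.getProperty(key); // remember old state
        System.setProperty(key, value);            // apply the override
        try {
            action.run();                          // run the "test" body
        } finally {
            if (previous == null) {
                System.clearProperty(key);         // property was absent before
            } else {
                System.setProperty(key, previous); // restore the old value
            }
        }
    }

    public static void main(String[] args) {
        withProperty("dataverse.test.setting", "on",
            () -> System.out.println(System.getProperty("dataverse.test.setting")));
        // After the call, the property is cleared again:
        System.out.println(System.getProperty("dataverse.test.setting"));
    }
}
```

The ``try``/``finally`` is the important part: restoration happens even when the test throws, which is what makes such a helper safe to stack across many tests.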
+ + Observing Changes to Code Coverage ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ diff --git a/doc/sphinx-guides/source/installation/config.rst b/doc/sphinx-guides/source/installation/config.rst index 5c227417271..b8a91435d61 100644 --- a/doc/sphinx-guides/source/installation/config.rst +++ b/doc/sphinx-guides/source/installation/config.rst @@ -144,36 +144,71 @@ As the person installing the Dataverse Software, you may or may not be a local m Persistent Identifiers and Publishing Datasets ---------------------------------------------- -Persistent identifiers are a required and integral part of the Dataverse Software. They provide a URL that is guaranteed to resolve to the datasets or files they represent. The Dataverse Software currently supports creating identifiers using DOI and Handle. +Persistent identifiers are a required and integral part of the Dataverse +Software. They provide a URL that is guaranteed to resolve to the datasets or +files they represent. The Dataverse Software currently supports creating +identifiers using DOI and Handle. -By default, the installer configures a default DOI namespace (10.5072) with DataCite as the registration provider. Please note that as of the release 4.9.3, we can no longer use EZID as the provider. Unlike EZID, DataCite requires that you register for a test account, configured with your own prefix (please contact support@datacite.org). Once you receive the login name, password, and prefix for the account, configure the credentials in your domain.xml, as the following two JVM options:: +By default, the installer configures a default DOI namespace (10.5072) with +DataCite as the registration provider. Please note that as of the release +4.9.3, we can no longer use EZID as the provider. Unlike EZID, DataCite +requires that you register for a test account, configured with your own prefix +(please contact support@datacite.org). - -Ddoi.username=... - -Ddoi.password=... 
+Once you receive the login name, password, and prefix for the account, +configure the credentials via :ref:`dataverse.pid.datacite.username` and +:ref:`dataverse.pid.datacite.password`, then restart Payara. -and restart Payara. The prefix can be configured via the API (where it is referred to as "Authority"): +The prefix can be configured via the API (where it is referred to as +"Authority"): ``curl -X PUT -d 10.xxxx http://localhost:8080/api/admin/settings/:Authority`` -Once this is done, you will be able to publish datasets and files, but the persistent identifiers will not be citable, and they will only resolve from the DataCite test environment (and then only if the Dataverse installation from which you published them is accessible - DOIs minted from your laptop will not resolve). Note that any datasets or files created using the test configuration cannot be directly migrated and would need to be created again once a valid DOI namespace is configured. - -To properly configure persistent identifiers for a production installation, an account and associated namespace must be acquired for a fee from a DOI or HDL provider. **DataCite** (https://www.datacite.org) is the recommended DOI provider (see https://dataverse.org/global-dataverse-community-consortium for more on joining DataCite) but **EZID** (http://ezid.cdlib.org) is an option for the University of California according to https://www.cdlib.org/cdlinfo/2017/08/04/ezid-doi-service-is-evolving/ . **Handle.Net** (https://www.handle.net) is the HDL provider. - -Once you have your DOI or Handle account credentials and a namespace, configure your Dataverse installation to use them using the JVM options and database settings below. 
+Once this is done, you will be able to publish datasets and files, but the +persistent identifiers will not be citable, and they will only resolve from the +DataCite test environment (and then only if the Dataverse installation from +which you published them is accessible - DOIs minted from your laptop will not +resolve). Note that any datasets or files created using the test configuration +cannot be directly migrated and would need to be created again once a valid DOI +namespace is configured. + +To properly configure persistent identifiers for a production installation, an +account and associated namespace must be acquired for a fee from a DOI or HDL +provider. **DataCite** (https://www.datacite.org) is the recommended DOI +provider (see https://dataverse.org/global-dataverse-community-consortium for +more on joining DataCite) but **EZID** (http://ezid.cdlib.org) is an option for +the University of California according to +https://www.cdlib.org/cdlinfo/2017/08/04/ezid-doi-service-is-evolving/ . +**Handle.Net** (https://www.handle.net) is the HDL provider. + +Once you have your DOI or Handle account credentials and a namespace, configure +your Dataverse installation to use them using the JVM options and database +settings below. Configuring Your Dataverse Installation for DOIs ++++++++++++++++++++++++++++++++++++++++++++++++ -By default, your Dataverse installation attempts to register DOIs for each dataset and file under a test authority, though you must apply for your own credentials as explained above. +By default, your Dataverse installation attempts to register DOIs for each +dataset and file under a test authority, though you must apply for your own +credentials as explained above. 
Here are the configuration options for DOIs: -**JVM Options:** +**JVM Options for DataCite:** + +- :ref:`dataverse.pid.datacite.mds-api-url` +- :ref:`dataverse.pid.datacite.rest-api-url` +- :ref:`dataverse.pid.datacite.username` +- :ref:`dataverse.pid.datacite.password` + +**JVM Options for EZID:** + +As stated above, with very few exceptions, you will not be able to use +this provider. -- :ref:`doi.baseurlstring` -- :ref:`doi.username` -- :ref:`doi.password` -- :ref:`doi.dataciterestapiurlstring` +- :ref:`dataverse.pid.ezid.api-url` +- :ref:`dataverse.pid.ezid.username` +- :ref:`dataverse.pid.ezid.password` **Database Settings:** @@ -188,7 +223,8 @@ Here are the configuration options for DOIs: Configuring Your Dataverse Installation for Handles +++++++++++++++++++++++++++++++++++++++++++++++++++ -Here are the configuration options for handles: +Here are the configuration options for handles. Most notably, you need to +change the ``:Protocol`` setting, as it defaults to DOI usage. **JVM Options:** @@ -270,6 +306,8 @@ If you wish to change which store is used by default, you'll need to delete the It is also possible to set maximum file upload size limits per store. See the :ref:`:MaxFileUploadSizeInBytes` setting below. +.. _storage-files-dir: + File Storage ++++++++++++ @@ -1285,35 +1323,69 @@ When changing values these values with ``asadmin``, you'll need to delete the ol It's also possible to change these values by stopping Payara, editing ``payara5/glassfish/domains/domain1/config/domain.xml``, and restarting Payara. +.. _dataverse.fqdn: + dataverse.fqdn ++++++++++++++ -If the Dataverse installation has multiple DNS names, this option specifies the one to be used as the "official" host name. For example, you may want to have dataverse.example.edu, and not the less appealing server-123.socsci.example.edu to appear exclusively in all the registered global identifiers, Data Deposit API records, etc. 
+The URL to access your Dataverse installation gets used in multiple places:
+
+- Email confirmation links
+- Password reset links
+- Generating a Private URL
+- PID minting
+- Exporting to Schema.org format (and showing JSON-LD in HTML's ``<head>`` tag)
+- Exporting to DDI format
+- Which Dataverse installation an "external tool" should return to
+- URLs embedded in SWORD API responses
+- ...
-The password reset feature requires ``dataverse.fqdn`` to be configured.
+Usually it will follow the pattern ``https://<fqdn>/``.
+The FQDN part of your Dataverse installation URL can be determined by setting ``dataverse.fqdn``.
-.. note::
+**Notes:**
- Do note that whenever the system needs to form a service URL, by default, it will be formed with ``https://`` and port 443. I.e.,
- ``https://{dataverse.fqdn}/``
- If that does not suit your setup, you can define an additional option, ``dataverse.siteUrl``, explained below.
+- The URL will default to using ``https://`` and no additional port information. If that does not suit your setup, you
+  can define an additional option, ``dataverse.siteUrl``, :ref:`explained below <dataverse.siteUrl>`, which always
+  takes precedence.
+- Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_FQDN``.
+  Defaults to ``localhost`` when used with ``mp.config.profile=ct``
 
 .. _dataverse.siteUrl:
 
 dataverse.siteUrl
 +++++++++++++++++
 
-.. note::
+Some environments may require using a different URL pattern to access your installation. You might need to use
+HTTP without "S", a non-standard port and so on. This is especially useful in development or testing environments.
+
+You can provide a custom tailored site URL via ``dataverse.siteUrl``, which always takes precedence.
+Example: ``dataverse.siteUrl=http://localhost:8080``
+
+**Notes:**
+
+- This setting may be used in combination with variable replacement, referencing :ref:`dataverse.fqdn` with
+  ``./asadmin create-jvm-options "\-Ddataverse.siteUrl=http\://\${dataverse.fqdn}\:8080"``
+- Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_SITEURL``.
+  Defaults to ``http://${dataverse.fqdn}:8080`` when used with ``mp.config.profile=ct``
- and specify the protocol and port number you would prefer to be used to advertise the URL for your Dataverse installation.
- For example, configured in domain.xml:
- ``-Ddataverse.fqdn=dataverse.example.edu``
- ``-Ddataverse.siteUrl=http://${dataverse.fqdn}:8080``
 
 dataverse.files.directory
 +++++++++++++++++++++++++
 
-This is how you configure the path Dataverse uses for temporary files. (File store specific dataverse.files.\.directory options set the permanent data storage locations.)
+Please provide an absolute path to a directory backed by some mounted file system. This directory is used for a number
+of purposes:
+
+1. ``<dataverse.files.directory>/temp`` after uploading, data is temporarily stored here for ingest and/or before
+   shipping to the final storage destination.
+2. ``<dataverse.files.directory>/sword`` a place to store uploads via the :doc:`../api/sword` before transfer
+   to final storage location and/or ingest.
+3. ``<dataverse.files.directory>/<authority>/<identifier>`` data location for file system imports, see
+   :ref:`api-import-dataset`.
+4. ``<dataverse.files.directory>/googlecloudkey.json`` used with :ref:`Google Cloud Configuration` for BagIt exports.
+
+This directory might also be used for permanent storage of data, but this setting is independent from
+:ref:`storage-files-dir` configuration.
 
 dataverse.auth.password-reset-timeout-in-minutes
 ++++++++++++++++++++++++++++++++++++++++++++++++
@@ -1371,6 +1443,61 @@ Defaults to ``5432``, the default PostgreSQL port.
 
 Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_DB_PORT``.
 
+.. 
_dataverse.solr.host: + +dataverse.solr.host ++++++++++++++++++++ + +The hostname of a Solr server to connect to. Remember to restart / redeploy Dataverse after changing the setting +(as with :ref:`:SolrHostColonPort`). + +Defaults to ``localhost``. + +Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_SOLR_HOST``. +Defaults to ``solr``, when used with ``mp.config.profile=ct`` (:ref:`see below <:ApplicationServerSettings>`). + +dataverse.solr.port ++++++++++++++++++++ + +The Solr server port to connect to. Remember to restart / redeploy Dataverse after changing the setting +(as with :ref:`:SolrHostColonPort`). + +Defaults to ``8983``, the default Solr port. + +Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_SOLR_PORT``. + +dataverse.solr.core ++++++++++++++++++++ + +The name of the Solr core to use for this Dataverse installation. Might be used to switch to a different core quickly. +Remember to restart / redeploy Dataverse after changing the setting (as with :ref:`:SolrHostColonPort`). + +Defaults to ``collection1``. + +Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_SOLR_CORE``. + +dataverse.solr.protocol ++++++++++++++++++++++++ + +The Solr server URL protocol for the connection. Remember to restart / redeploy Dataverse after changing the setting +(as with :ref:`:SolrHostColonPort`). + +Defaults to ``http``, but might be set to ``https`` for extra secure Solr installations. + +Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_SOLR_PROTOCOL``. + +dataverse.solr.path ++++++++++++++++++++ + +The path part of the Solr endpoint URL (e.g. ``/solr/collection1`` of ``http://localhost:8389/solr/collection1``). +Might be used to target a Solr API at non-default places. Remember to restart / redeploy Dataverse after changing the +setting (as with :ref:`:SolrHostColonPort`). 
+ +Defaults to ``/solr/${dataverse.solr.core}``, interpolating the core name when used. Make sure to include the variable +when using it to configure your core name! + +Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_SOLR_PATH``. + dataverse.rserve.host +++++++++++++++++++++ @@ -1420,77 +1547,180 @@ dataverse.dataAccess.thumbnail.pdf.limit For limiting the size (in bytes) of thumbnail images generated from files. The default is 1000000 bytes (1 MB). -.. _doi.baseurlstring: -doi.baseurlstring -+++++++++++++++++ +.. _dataverse.pid.datacite.mds-api-url: -As of this writing, "https://mds.datacite.org" (DataCite) and "https://ezid.cdlib.org" (EZID) are the main valid values. +dataverse.pid.datacite.mds-api-url +++++++++++++++++++++++++++++++++++ -While the above two options are recommended because they have been tested by the Dataverse Project Team, it is also possible to use a DataCite Client API as a proxy to DataCite. In this case, requests made to the Client API are captured and passed on to DataCite for processing. The application will interact with the DataCite Client API exactly as if it were interacting directly with the DataCite API, with the only difference being the change to the base endpoint URL. +Configure the basic endpoint of the `DataCite MDS API `_, +used to mint and manage DOIs. Valid values are "https://mds.datacite.org" and "https://mds.test.datacite.org" +(see also note below). -For example, the Australian Data Archive (ADA) successfully uses the Australian National Data Service (ANDS) API (a proxy for DataCite) to mint their DOIs through their Dataverse installation using a ``doi.baseurlstring`` value of "https://researchdata.ands.org.au/api/doi/datacite" as documented at https://documentation.ands.org.au/display/DOC/ANDS+DataCite+Client+API . 
As ADA did for ANDS DOI minting, any DOI provider (and their corresponding DOI configuration parameters) other than DataCite must be tested with the Dataverse Software to establish whether or not it will function properly.
+Out of the box, the installer script configures your installation to use a `DataCite MDS Test API
+base URL `_. You can delete it like this:
-Out of the box, the Dataverse Software is configured to use a test MDS DataCite base URL string. You can delete it like this:
+``./asadmin delete-jvm-options '-Ddataverse.pid.datacite.mds-api-url=https\://mds.test.datacite.org'``
-``./asadmin delete-jvm-options '-Ddoi.baseurlstring=https\://mds.test.datacite.org'``
+Then, to switch to `production DataCite `__,
+you can issue the following command:
-Then, to switch to production DataCite, you can issue the following command:
+``./asadmin create-jvm-options '-Ddataverse.pid.datacite.mds-api-url=https\://mds.datacite.org'``
-``./asadmin create-jvm-options '-Ddoi.baseurlstring=https\://mds.datacite.org'``
+Without setting an option, this always defaults to the testing API endpoint.
-See also these related database settings below:
+**Notes:**
-- :ref:`:DoiProvider`
-- :ref:`:Protocol`
-- :ref:`:Authority`
-- :ref:`:Shoulder`
+- See also these related database settings below: :ref:`:DoiProvider`,
+  :ref:`:Protocol`, :ref:`:Authority`, :ref:`:Shoulder`.
+- Can also be set via *MicroProfile Config API* sources, e.g. the environment
+  variable ``DATAVERSE_PID_DATACITE_MDS_API_URL``.
+- This setting was formerly known as ``doi.baseurlstring`` and has been renamed.
+  You should delete and re-add it.
+- While using DataCite directly is recommended because it is tested by the Dataverse
+  Project Team plus field tested with most installations, it is also possible
+  to use a DataCite Client API as a proxy to DataCite. `Since the launch of DataCite Fabrica in
+  2019, the only example by Australian National Data Services (ANDS) has been decommissioned
+  `_.
-.. 
_doi.dataciterestapiurlstring:
-doi.dataciterestapiurlstring
-++++++++++++++++++++++++++++
+.. _dataverse.pid.datacite.rest-api-url:
-This configuration option affects the ``updateCitationsForDataset`` API endpoint documented under :ref:`MDC-updateCitationsForDataset` in the Admin Guide as well as the /pids/* API.
+dataverse.pid.datacite.rest-api-url
++++++++++++++++++++++++++++++++++++
-As of this writing, "https://api.datacite.org" (DataCite) and "https://api.test.datacite.org" (DataCite Testing) are the main valid values.
+Configure the basic endpoint of the `DataCite REST API `_,
+currently used for :doc:`/admin/make-data-count` integration and :ref:`Native PIDs API
+<pids-api>` information retrieval. Valid values are "https://api.datacite.org" and
+"https://api.test.datacite.org".
-Out of the box, the Dataverse Software is configured to use a test DataCite REST API base URL string. You can delete it like this:
+Out of the box, the installer configures your installation to use a `DataCite
+REST Test API base URL `_. You
+can delete it like this:
-``./asadmin delete-jvm-options '-Ddoi.dataciterestapiurlstring=https\://api.test.datacite.org'``
+``./asadmin delete-jvm-options '-Ddataverse.pid.datacite.rest-api-url=https\://api.test.datacite.org'``
-Then, to switch to production DataCite, you can issue the following command:
+Then, to switch to `production DataCite `__,
+you can issue the following command:
-``./asadmin create-jvm-options '-Ddoi.dataciterestapiurlstring=https\://api.datacite.org'``
+``./asadmin create-jvm-options '-Ddataverse.pid.datacite.rest-api-url=https\://api.datacite.org'``
-For backward compatibility, if this option is not defined, the value of '-Ddoi.mdcbaseurlstring' is used if set. If not the default used is "https\://api.datacite.org:.
+Without setting an option, this always defaults to the testing API endpoint.
-See also these related database settings below:
+**Notes:**
-- :ref:`:MDCLogPath`
-- :ref:`:DisplayMDCMetrics`
+- See also these related database settings below: :ref:`:MDCLogPath`,
+  :ref:`:DisplayMDCMetrics`.
+- Can also be set via *MicroProfile Config API* sources, e.g. the environment
+  variable ``DATAVERSE_PID_DATACITE_REST_API_URL``.
+- This setting was formerly known as ``doi.dataciterestapiurlstring`` or
+  ``doi.mdcbaseurlstring`` and has been renamed. You should delete and re-add it.
-.. _doi.username:
-doi.username
-++++++++++++
+.. _dataverse.pid.datacite.username:
+
+dataverse.pid.datacite.username
++++++++++++++++++++++++++++++++
-Used in conjuction with ``doi.baseurlstring``.
+DataCite uses `HTTP Basic authentication `_
+for `Fabrica `_ and their APIs. You need to provide
+the same credentials to Dataverse Software to mint and manage DOIs for you.
 
 Once you have a username from your provider, you can enter it like this:
 
-``./asadmin create-jvm-options '-Ddoi.username=YOUR_USERNAME_HERE'``
+``./asadmin create-jvm-options '-Ddataverse.pid.datacite.username=YOUR_USERNAME_HERE'``
-.. _doi.password:
+**Notes:**
-doi.password
-++++++++++++
+- Used in conjunction with :ref:`dataverse.pid.datacite.mds-api-url`,
+  :ref:`dataverse.pid.datacite.rest-api-url` and :ref:`dataverse.pid.datacite.password`.
+- Can also be set via *MicroProfile Config API* sources, e.g. the environment
+  variable ``DATAVERSE_PID_DATACITE_USERNAME``.
+- This setting was formerly known as ``doi.username`` and has been renamed.
+  You should delete and re-add it.
+
+.. _dataverse.pid.datacite.password:
+
+dataverse.pid.datacite.password
++++++++++++++++++++++++++++++++
+
+Once you have a password from your provider, you should create a password alias.
+This avoids storing it in clear text, although you could use a JVM option `to reference
+a different place `__.
+
+``./asadmin create-password-alias dataverse.pid.datacite.password``
+
+It will allow you to enter the password while not echoing the characters.
+To manage these, read up on `Payara docs about password aliases `__.
-Used in conjuction with ``doi.baseurlstring``.
+**Notes:**
+
+- Used in conjunction with :ref:`dataverse.pid.datacite.mds-api-url`,
+  :ref:`dataverse.pid.datacite.rest-api-url` and :ref:`dataverse.pid.datacite.username`.
+- Can also be set via *MicroProfile Config API* sources, e.g. the environment
+  variable ``DATAVERSE_PID_DATACITE_PASSWORD`` (although you shouldn't use
+  environment variables for passwords).
+- This setting was formerly known as ``doi.password`` and has been renamed.
+  You should delete the old JVM option and the wrapped password alias, then recreate
+  with the new alias name as above.
+
+
+.. _dataverse.pid.ezid.api-url:
+
+dataverse.pid.ezid.api-url
+++++++++++++++++++++++++++
+
+The EZID DOI provider is likely not an option if you are `not associated with
+California Digital Library (CDL) or Purdue University
+`_.
+
+Defaults to ``https://ezid.cdlib.org``.
+
+Can also be set via *MicroProfile Config API* sources, e.g. the environment
+variable ``DATAVERSE_PID_EZID_API_URL``. This setting was formerly known as
+``doi.baseurlstring`` and has been renamed. You should delete and re-add it.
+
+
+.. _dataverse.pid.ezid.username:
+
+dataverse.pid.ezid.username
++++++++++++++++++++++++++++
+
+The EZID DOI provider is likely not an option if you are `not associated with
+California Digital Library (CDL) or Purdue University
+`_.
+
+Works the same way as :ref:`dataverse.pid.datacite.username`, but for the EZID DOI
+provider.
+
+Can also be set via *MicroProfile Config API* sources, e.g. the environment
+variable ``DATAVERSE_PID_EZID_USERNAME``.
+
+This setting was formerly known as ``doi.username`` and has been renamed. You
+should delete and re-add it.
+.. 
_dataverse.pid.ezid.password: + +dataverse.pid.ezid.password ++++++++++++++++++++++++++++ + +The EZID DOI provider is likely not an option if you are `not associated with +California Digital Library (CDL) or Purdue University +`_. + +Works the same way as :ref:`dataverse.pid.datacite.password`, but for the EZID DOI +provider. + +Can also be set via *MicroProfile Config API* sources, e.g. the environment +variable ``DATAVERSE_PID_EZID_PASSWORD`` (although you shouldn't use +environment variables for passwords). + +This setting was formerly known as ``doi.password`` and has been renamed. You +should delete the old JVM option and the wrapped password alias, then recreate +as shown for :ref:`dataverse.pid.datacite.password` but with the EZID alias +name. -Once you have a password from your provider, you can enter it like this: -``./asadmin create-jvm-options '-Ddoi.password=YOUR_PASSWORD_HERE'`` .. _dataverse.handlenet.admcredfile: @@ -1586,6 +1816,21 @@ To facilitate large file upload and download, the Dataverse Software installer b and restart Payara to apply your change. +mp.config.profile ++++++++++++++++++ + +MicroProfile Config 2.0 defines the `concept of "profiles" `_. +They can be used to change configuration values by context. This is used in Dataverse to change some configuration +defaults when used inside container context rather classic installations. + +As per the spec, you will need to set the configuration value ``mp.config.profile`` to ``ct`` as early as possible. +This is best done with a system property: + +``./asadmin create-system-properties 'mp.config.profile=ct'`` + +You might also create your own profiles and use these, please refer to the upstream documentation linked above. + + .. _database-settings: Database Settings @@ -1718,15 +1963,22 @@ By default the footer says "Copyright © [YYYY]" but you can add text after the :DoiProvider ++++++++++++ -As of this writing "DataCite" and "EZID" are the only valid options for production installations. 
Developers using Dataverse Software 4.10+ are welcome to use the keyword "FAKE" to configure a non-production installation with an non-resolving, in-code provider, which will basically short-circuit the DOI publishing process. ``:DoiProvider`` is only needed if you are using DOI.
+As of this writing "DataCite" and "EZID" are the only valid options for
+production installations. Developers using Dataverse Software 4.10+ are welcome
+to use the keyword "FAKE" to configure a non-production installation with a
+non-resolving, in-code provider, which will basically short-circuit the DOI
+publishing process. ``:DoiProvider`` is only needed if you are using DOI.
 
 ``curl -X PUT -d DataCite http://localhost:8080/api/admin/settings/:DoiProvider``
 
-This setting relates to the ``:Protocol``, ``:Authority``, ``:Shoulder``, and ``:IdentifierGenerationStyle`` database settings below as well as the following JVM options:
+This setting relates to the ``:Protocol``, ``:Authority``, ``:Shoulder``, and
+``:IdentifierGenerationStyle`` database settings below as well as the following
+JVM options:
-- :ref:`doi.baseurlstring`
-- :ref:`doi.username`
-- :ref:`doi.password`
+- :ref:`dataverse.pid.datacite.mds-api-url`
+- :ref:`dataverse.pid.datacite.rest-api-url`
+- :ref:`dataverse.pid.datacite.username`
+- :ref:`dataverse.pid.datacite.password`
 
 .. _:Protocol:
 
@@ -2073,6 +2325,8 @@ Limit the number of files in a zip that your Dataverse installation will accept.
 
 ``curl -X PUT -d 2048 http://localhost:8080/api/admin/settings/:ZipUploadFilesLimit``
 
+.. _:SolrHostColonPort:
+
 :SolrHostColonPort
 ++++++++++++++++++
 
@@ -2080,6 +2334,8 @@ By default your Dataverse installation will attempt to connect to Solr on port 8
 
 ``curl -X PUT -d localhost:8983 http://localhost:8080/api/admin/settings/:SolrHostColonPort``
 
+**Note:** instead of using a database setting, you could alternatively use JVM settings like :ref:`dataverse.solr.host`.
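As background for the MicroProfile Config environment variables mentioned throughout (``DATAVERSE_SOLR_HOST``, ``DATAVERSE_PID_DATACITE_MDS_API_URL``, and so on): the MicroProfile Config specification derives the environment-variable form of a property name by replacing every non-alphanumeric character with ``_`` and upper-casing the result. A minimal sketch of that mapping rule (the class and method names are illustrative, not part of the codebase):

```java
import java.util.Locale;

/** Demonstrates the MicroProfile Config property-name-to-env-var mapping rule. */
public class EnvVarNameSketch {

    /** Replace each non-alphanumeric character with '_' and upper-case the result. */
    public static String toEnvVar(String propertyName) {
        return propertyName.replaceAll("[^A-Za-z0-9]", "_").toUpperCase(Locale.ROOT);
    }

    public static void main(String[] args) {
        System.out.println(toEnvVar("dataverse.solr.host"));
        // -> DATAVERSE_SOLR_HOST
        System.out.println(toEnvVar("dataverse.pid.datacite.mds-api-url"));
        // -> DATAVERSE_PID_DATACITE_MDS_API_URL
    }
}
```

This is why both dots and hyphens in the JVM option names disappear in the environment-variable spellings listed in the sections above.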
+ :SolrFullTextIndexing +++++++++++++++++++++ diff --git a/modules/dataverse-parent/pom.xml b/modules/dataverse-parent/pom.xml index c2f536c4d41..1e2fb989603 100644 --- a/modules/dataverse-parent/pom.xml +++ b/modules/dataverse-parent/pom.xml @@ -164,7 +164,7 @@ 1.15.0 - 0.4.1 + 2.10.1 4.13.1 5.7.0 diff --git a/pom.xml b/pom.xml index ce9f1c4b63d..525d134d382 100644 --- a/pom.xml +++ b/pom.xml @@ -601,9 +601,9 @@ test - org.microbean - microbean-microprofile-config - ${microbean-mpconfig.version} + io.smallrye.config + smallrye-config + ${smallrye-mpconfig.version} test @@ -641,10 +641,17 @@ **/*.xml **/firstNames/*.* **/*.xsl - **/*.properties **/services/* + + src/main/resources + + true + + **/*.properties + + diff --git a/src/main/java/edu/harvard/iq/dataverse/DOIDataCiteRegisterService.java b/src/main/java/edu/harvard/iq/dataverse/DOIDataCiteRegisterService.java index 218e4c85474..9e685d03f43 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DOIDataCiteRegisterService.java +++ b/src/main/java/edu/harvard/iq/dataverse/DOIDataCiteRegisterService.java @@ -23,6 +23,8 @@ import javax.persistence.EntityManager; import javax.persistence.PersistenceContext; import javax.persistence.TypedQuery; + +import edu.harvard.iq.dataverse.settings.JvmSettings; import org.apache.commons.text.StringEscapeUtils; import org.jsoup.Jsoup; import org.jsoup.nodes.Document; @@ -53,7 +55,11 @@ public class DOIDataCiteRegisterService { private DataCiteRESTfullClient getClient() throws IOException { if (client == null) { - client = new DataCiteRESTfullClient(System.getProperty("doi.baseurlstring"), System.getProperty("doi.username"), System.getProperty("doi.password")); + client = new DataCiteRESTfullClient( + JvmSettings.DATACITE_MDS_API_URL.lookup(), + JvmSettings.DATACITE_USERNAME.lookup(), + JvmSettings.DATACITE_PASSWORD.lookup() + ); } return client; } diff --git a/src/main/java/edu/harvard/iq/dataverse/DOIDataCiteServiceBean.java 
b/src/main/java/edu/harvard/iq/dataverse/DOIDataCiteServiceBean.java index e7dd49a6926..a07f691bc60 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DOIDataCiteServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/DOIDataCiteServiceBean.java @@ -13,6 +13,7 @@ import javax.ejb.EJB; import javax.ejb.Stateless; +import edu.harvard.iq.dataverse.settings.JvmSettings; import org.apache.commons.httpclient.HttpException; import org.apache.commons.httpclient.HttpStatus; @@ -219,9 +220,9 @@ public void deleteIdentifier(DvObject dvObject) throws IOException, HttpExceptio private void deleteDraftIdentifier(DvObject dvObject) throws IOException { //ToDo - incorporate into DataCiteRESTfulClient - String baseUrl = systemConfig.getDataCiteRestApiUrlString(); - String username = System.getProperty("doi.username"); - String password = System.getProperty("doi.password"); + String baseUrl = JvmSettings.DATACITE_REST_API_URL.lookup(); + String username = JvmSettings.DATACITE_USERNAME.lookup(); + String password = JvmSettings.DATACITE_PASSWORD.lookup(); GlobalId doi = dvObject.getGlobalId(); /** * Deletes the DOI from DataCite if it can. 
Returns 204 if PID was deleted diff --git a/src/main/java/edu/harvard/iq/dataverse/DOIEZIdServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/DOIEZIdServiceBean.java index d21caf32411..2ddaca3468c 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DOIEZIdServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/DOIEZIdServiceBean.java @@ -1,5 +1,6 @@ package edu.harvard.iq.dataverse; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.ucsb.nceas.ezid.EZIDException; import edu.ucsb.nceas.ezid.EZIDService; import edu.ucsb.nceas.ezid.EZIDServiceRequest; @@ -26,10 +27,10 @@ public class DOIEZIdServiceBean extends AbstractGlobalIdServiceBean { public DOIEZIdServiceBean() { logger.log(Level.FINE,"Constructor"); - baseURLString = System.getProperty("doi.baseurlstring"); + baseURLString = JvmSettings.EZID_API_URL.lookup(); ezidService = new EZIDService(baseURLString); - USERNAME = System.getProperty("doi.username"); - PASSWORD = System.getProperty("doi.password"); + USERNAME = JvmSettings.EZID_USERNAME.lookup(); + PASSWORD = JvmSettings.EZID_PASSWORD.lookup(); logger.log(Level.FINE, "Using baseURLString {0}", baseURLString); try { ezidService.login(USERNAME, PASSWORD); diff --git a/src/main/java/edu/harvard/iq/dataverse/Dataset.java b/src/main/java/edu/harvard/iq/dataverse/Dataset.java index c60ea7020bd..1c883e31e36 100644 --- a/src/main/java/edu/harvard/iq/dataverse/Dataset.java +++ b/src/main/java/edu/harvard/iq/dataverse/Dataset.java @@ -33,6 +33,8 @@ import javax.persistence.Table; import javax.persistence.Temporal; import javax.persistence.TemporalType; + +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.util.BundleUtil; import edu.harvard.iq.dataverse.util.StringUtil; @@ -515,11 +517,8 @@ private Collection getCategoryNames() { @Deprecated public Path getFileSystemDirectory() { Path studyDir = null; - - String filesRootDirectory = System.getProperty("dataverse.files.directory"); - if (filesRootDirectory == 
null || filesRootDirectory.equals("")) { - filesRootDirectory = "/tmp/files"; - } + + String filesRootDirectory = JvmSettings.FILES_DIRECTORY.lookup(); if (this.getAlternativePersistentIndentifiers() != null && !this.getAlternativePersistentIndentifiers().isEmpty()) { for (AlternativePersistentIdentifier api : this.getAlternativePersistentIndentifiers()) { diff --git a/src/main/java/edu/harvard/iq/dataverse/HandlenetServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/HandlenetServiceBean.java index 1a8ee8a85e8..df16991b51e 100644 --- a/src/main/java/edu/harvard/iq/dataverse/HandlenetServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/HandlenetServiceBean.java @@ -24,8 +24,6 @@ import java.io.File; import java.io.FileInputStream; -import java.net.InetAddress; -import java.net.UnknownHostException; import java.util.*; import java.util.logging.Level; import java.util.logging.Logger; @@ -34,6 +32,7 @@ import java.security.PrivateKey; /* Handlenet imports: */ +import edu.harvard.iq.dataverse.util.SystemConfig; import net.handle.hdllib.AbstractMessage; import net.handle.hdllib.AbstractResponse; import net.handle.hdllib.AdminRecord; @@ -247,21 +246,7 @@ private String getRegistrationUrl(DvObject dvObject) { } public String getSiteUrl() { - logger.log(Level.FINE,"getSiteUrl"); - String hostUrl = System.getProperty("dataverse.siteUrl"); - if (hostUrl != null && !"".equals(hostUrl)) { - return hostUrl; - } - String hostName = System.getProperty("dataverse.fqdn"); - if (hostName == null) { - try { - hostName = InetAddress.getLocalHost().getCanonicalHostName(); - } catch (UnknownHostException e) { - return null; - } - } - hostUrl = "https://" + hostName; - return hostUrl; + return SystemConfig.getDataverseSiteUrlStatic(); } private byte[] readKey(final String file) { diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Info.java b/src/main/java/edu/harvard/iq/dataverse/api/Info.java index 4fe5cba5b9f..fd7824c15cf 100644 --- 
a/src/main/java/edu/harvard/iq/dataverse/api/Info.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Info.java @@ -1,5 +1,6 @@ package edu.harvard.iq.dataverse.api; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import edu.harvard.iq.dataverse.util.SystemConfig; import javax.ejb.EJB; @@ -44,7 +45,7 @@ public Response getInfo() { @GET @Path("server") public Response getServer() { - return response( req -> ok(systemConfig.getDataverseServer())); + return response( req -> ok(JvmSettings.FQDN.lookup())); } @GET diff --git a/src/main/java/edu/harvard/iq/dataverse/api/MakeDataCountApi.java b/src/main/java/edu/harvard/iq/dataverse/api/MakeDataCountApi.java index 8f6ec6b1c7d..92848cb9cbc 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/MakeDataCountApi.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/MakeDataCountApi.java @@ -6,6 +6,7 @@ import edu.harvard.iq.dataverse.makedatacount.DatasetExternalCitationsServiceBean; import edu.harvard.iq.dataverse.makedatacount.DatasetMetrics; import edu.harvard.iq.dataverse.makedatacount.DatasetMetricsServiceBean; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.util.SystemConfig; import java.io.FileReader; @@ -141,7 +142,10 @@ public Response updateCitationsForDataset(@PathParam("id") String id) throws Mal // DataCite wants "doi=", not "doi:". 
String authorityPlusIdentifier = persistentId.replaceFirst("doi:", ""); // Request max page size and then loop to handle multiple pages - URL url = new URL(systemConfig.getDataCiteRestApiUrlString() + "/events?doi=" + authorityPlusIdentifier + "&source=crossref&page[size]=1000"); + URL url = new URL(JvmSettings.DATACITE_REST_API_URL.lookup() + + "/events?doi=" + + authorityPlusIdentifier + + "&source=crossref&page[size]=1000"); logger.fine("Retrieving Citations from " + url.toString()); boolean nextPage = true; JsonArrayBuilder dataBuilder = Json.createArrayBuilder(); diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Pids.java b/src/main/java/edu/harvard/iq/dataverse/api/Pids.java index 5a2acf3209f..61def31ee79 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Pids.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Pids.java @@ -5,6 +5,7 @@ import edu.harvard.iq.dataverse.engine.command.impl.DeletePidCommand; import edu.harvard.iq.dataverse.engine.command.impl.ReservePidCommand; import edu.harvard.iq.dataverse.pidproviders.PidUtil; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.util.BundleUtil; import java.util.Arrays; import javax.ejb.Stateless; @@ -46,9 +47,13 @@ public Response getPid(@QueryParam("persistentId") String persistentId) { } catch (WrappedResponse ex) { return error(Response.Status.FORBIDDEN, BundleUtil.getStringFromBundle("api.errors.invalidApiToken")); } - String baseUrl = systemConfig.getDataCiteRestApiUrlString(); - String username = System.getProperty("doi.username"); - String password = System.getProperty("doi.password"); + + // FIXME: Even before changing to MPCONFIG retrieval, this was pinned to be DataCite specific! + // Should this be extended to EZID and other PID systems like Handle? 
+ String baseUrl = JvmSettings.DATACITE_REST_API_URL.lookup(); + String username = JvmSettings.DATACITE_USERNAME.lookup(); + String password = JvmSettings.DATACITE_PASSWORD.lookup(); + try { JsonObjectBuilder result = PidUtil.queryDoi(persistentId, baseUrl, username, password); return ok(result); diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordConfigurationImpl.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordConfigurationImpl.java index ce5f9415fcc..1e506c6a0b1 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordConfigurationImpl.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordConfigurationImpl.java @@ -1,5 +1,6 @@ package edu.harvard.iq.dataverse.api.datadeposit; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.util.SystemConfig; import java.io.File; import java.util.Arrays; @@ -86,37 +87,32 @@ public boolean storeAndCheckBinary() { @Override public String getTempDirectory() { - String tmpFileDir = System.getProperty(SystemConfig.FILES_DIRECTORY); - if (tmpFileDir != null) { - String swordDirString = tmpFileDir + File.separator + "sword"; - File swordDirFile = new File(swordDirString); - /** - * @todo Do we really need this check? It seems like we do because - * if you create a dataset via the native API and then later try to - * upload a file via SWORD, the directory defined by - * dataverse.files.directory may not exist and we get errors deep in - * the SWORD library code. Could maybe use a try catch in the doPost - * method of our SWORDv2MediaResourceServlet. - */ - if (swordDirFile.exists()) { + // will throw a runtime exception when not found + String tmpFileDir = JvmSettings.FILES_DIRECTORY.lookup(); + + String swordDirString = tmpFileDir + File.separator + "sword"; + File swordDirFile = new File(swordDirString); + /** + * @todo Do we really need this check? 
It seems like we do because + * if you create a dataset via the native API and then later try to + * upload a file via SWORD, the directory defined by + * dataverse.files.directory may not exist and we get errors deep in + * the SWORD library code. Could maybe use a try catch in the doPost + * method of our SWORDv2MediaResourceServlet. + */ + if (swordDirFile.exists()) { + return swordDirString; + } else { + boolean mkdirSuccess = swordDirFile.mkdirs(); + if (mkdirSuccess) { + logger.info("Created directory " + swordDirString); return swordDirString; } else { - boolean mkdirSuccess = swordDirFile.mkdirs(); - if (mkdirSuccess) { - logger.info("Created directory " + swordDirString); - return swordDirString; - } else { - String msgForSwordUsers = ("Could not determine or create SWORD temp directory. Check logs for details."); - logger.severe(msgForSwordUsers + " Failed to create " + swordDirString); - // sadly, must throw RunTimeException to communicate with SWORD user - throw new RuntimeException(msgForSwordUsers); - } + String msgForSwordUsers = ("Could not determine or create SWORD temp directory. Check logs for details."); + logger.severe(msgForSwordUsers + " Failed to create " + swordDirString); + // sadly, must throw RunTimeException to communicate with SWORD user + throw new RuntimeException(msgForSwordUsers); } - } else { - String msgForSwordUsers = ("JVM option \"" + SystemConfig.FILES_DIRECTORY + "\" not defined. 
Check logs for details."); - logger.severe(msgForSwordUsers); - // sadly, must throw RunTimeException to communicate with SWORD user - throw new RuntimeException(msgForSwordUsers); } } diff --git a/src/main/java/edu/harvard/iq/dataverse/batch/jobs/importer/filesystem/FileRecordJobListener.java b/src/main/java/edu/harvard/iq/dataverse/batch/jobs/importer/filesystem/FileRecordJobListener.java index 6b82a665c17..039048d06fe 100644 --- a/src/main/java/edu/harvard/iq/dataverse/batch/jobs/importer/filesystem/FileRecordJobListener.java +++ b/src/main/java/edu/harvard/iq/dataverse/batch/jobs/importer/filesystem/FileRecordJobListener.java @@ -59,6 +59,7 @@ import org.apache.commons.io.IOUtils; +import java.io.File; import java.io.FileReader; import java.io.IOException; import java.sql.Timestamp; @@ -79,7 +80,7 @@ @Dependent public class FileRecordJobListener implements ItemReadListener, StepListener, JobListener { - public static final String SEP = System.getProperty("file.separator"); + public static final String SEP = File.separator; private static final UserNotification.Type notifyType = UserNotification.Type.FILESYSTEMIMPORT; diff --git a/src/main/java/edu/harvard/iq/dataverse/batch/jobs/importer/filesystem/FileRecordReader.java b/src/main/java/edu/harvard/iq/dataverse/batch/jobs/importer/filesystem/FileRecordReader.java index b3d3a7107a6..f3e2f9c8b13 100644 --- a/src/main/java/edu/harvard/iq/dataverse/batch/jobs/importer/filesystem/FileRecordReader.java +++ b/src/main/java/edu/harvard/iq/dataverse/batch/jobs/importer/filesystem/FileRecordReader.java @@ -54,7 +54,7 @@ @Dependent public class FileRecordReader extends AbstractItemReader { - public static final String SEP = System.getProperty("file.separator"); + public static final String SEP = File.separator; @Inject JobContext jobContext; diff --git a/src/main/java/edu/harvard/iq/dataverse/batch/util/LoggingUtil.java b/src/main/java/edu/harvard/iq/dataverse/batch/util/LoggingUtil.java index 4a778dc7abb..a2f76ca953d 
100644 --- a/src/main/java/edu/harvard/iq/dataverse/batch/util/LoggingUtil.java +++ b/src/main/java/edu/harvard/iq/dataverse/batch/util/LoggingUtil.java @@ -154,8 +154,8 @@ public static Logger getJobLogger(String jobId) { try { Logger jobLogger = Logger.getLogger("job-"+jobId); FileHandler fh; - String logDir = System.getProperty("com.sun.aas.instanceRoot") + System.getProperty("file.separator") - + "logs" + System.getProperty("file.separator") + "batch-jobs" + System.getProperty("file.separator"); + String logDir = System.getProperty("com.sun.aas.instanceRoot") + File.separator + + "logs" + File.separator + "batch-jobs" + File.separator; checkCreateLogDirectory( logDir ); fh = new FileHandler(logDir + "job-" + jobId + ".log"); logger.log(Level.INFO, "JOB LOG: " + logDir + "job-" + jobId + ".log"); diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/GoogleCloudSubmitToArchiveCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/GoogleCloudSubmitToArchiveCommand.java index af4c960c2d6..46b2781266e 100644 --- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/GoogleCloudSubmitToArchiveCommand.java +++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/GoogleCloudSubmitToArchiveCommand.java @@ -1,40 +1,38 @@ package edu.harvard.iq.dataverse.engine.command.impl; +import com.google.auth.oauth2.ServiceAccountCredentials; +import com.google.cloud.storage.Blob; +import com.google.cloud.storage.Bucket; +import com.google.cloud.storage.Storage; +import com.google.cloud.storage.StorageOptions; import edu.harvard.iq.dataverse.DOIDataCiteRegisterService; import edu.harvard.iq.dataverse.DataCitation; import edu.harvard.iq.dataverse.Dataset; -import edu.harvard.iq.dataverse.DatasetVersion; import edu.harvard.iq.dataverse.DatasetLock.Reason; +import edu.harvard.iq.dataverse.DatasetVersion; import edu.harvard.iq.dataverse.authorization.Permission; import edu.harvard.iq.dataverse.authorization.users.ApiToken; import 
edu.harvard.iq.dataverse.engine.command.Command; import edu.harvard.iq.dataverse.engine.command.DataverseRequest; import edu.harvard.iq.dataverse.engine.command.RequiredPermissions; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.util.bagit.BagGenerator; import edu.harvard.iq.dataverse.util.bagit.OREMap; import edu.harvard.iq.dataverse.workflow.step.Failure; import edu.harvard.iq.dataverse.workflow.step.WorkflowStepResult; +import org.apache.commons.codec.binary.Hex; -import java.io.BufferedInputStream; +import java.io.File; import java.io.FileInputStream; -import java.io.FileNotFoundException; import java.io.IOException; import java.io.PipedInputStream; import java.io.PipedOutputStream; import java.nio.charset.Charset; import java.security.DigestInputStream; import java.security.MessageDigest; -import java.security.NoSuchAlgorithmException; import java.util.Map; import java.util.logging.Logger; -import org.apache.commons.codec.binary.Hex; -import com.google.auth.oauth2.ServiceAccountCredentials; -import com.google.cloud.storage.Blob; -import com.google.cloud.storage.Bucket; -import com.google.cloud.storage.Storage; -import com.google.cloud.storage.StorageOptions; - @RequiredPermissions(Permission.PublishDataset) public class GoogleCloudSubmitToArchiveCommand extends AbstractSubmitToArchiveCommand implements Command { @@ -54,10 +52,12 @@ public WorkflowStepResult performArchiveSubmission(DatasetVersion dv, ApiToken t logger.fine("Project: " + projectName + " Bucket: " + bucketName); if (bucketName != null && projectName != null) { Storage storage; - try { - FileInputStream fis = new FileInputStream(System.getProperty("dataverse.files.directory") + System.getProperty("file.separator")+ "googlecloudkey.json"); + + String cloudKeyFile = JvmSettings.FILES_DIRECTORY.lookup() + File.separator + "googlecloudkey.json"; + + try (FileInputStream cloudKeyStream = new FileInputStream(cloudKeyFile)) { storage = StorageOptions.newBuilder() 
- .setCredentials(ServiceAccountCredentials.fromStream(fis)) + .setCredentials(ServiceAccountCredentials.fromStream(cloudKeyStream)) .setProjectId(projectName) .build() .getService(); diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/ImportFromFileSystemCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/ImportFromFileSystemCommand.java index 64beba82450..5f31ea756eb 100644 --- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/ImportFromFileSystemCommand.java +++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/ImportFromFileSystemCommand.java @@ -12,17 +12,20 @@ import edu.harvard.iq.dataverse.engine.command.RequiredPermissions; import edu.harvard.iq.dataverse.engine.command.exception.CommandException; import edu.harvard.iq.dataverse.engine.command.exception.IllegalCommandException; -import static edu.harvard.iq.dataverse.util.json.NullSafeJsonBuilder.jsonObjectBuilder; -import java.io.File; -import java.util.Properties; -import java.util.logging.Level; -import java.util.logging.Logger; +import edu.harvard.iq.dataverse.settings.JvmSettings; + import javax.batch.operations.JobOperator; import javax.batch.operations.JobSecurityException; import javax.batch.operations.JobStartException; import javax.batch.runtime.BatchRuntime; import javax.json.JsonObject; import javax.json.JsonObjectBuilder; +import java.io.File; +import java.util.Properties; +import java.util.logging.Level; +import java.util.logging.Logger; + +import static edu.harvard.iq.dataverse.util.json.NullSafeJsonBuilder.jsonObjectBuilder; @RequiredPermissions(Permission.EditDataset) public class ImportFromFileSystemCommand extends AbstractCommand { @@ -69,18 +72,20 @@ public JsonObject execute(CommandContext ctxt) throws CommandException { logger.info(error); throw new IllegalCommandException(error, this); } - File directory = new File(System.getProperty("dataverse.files.directory") - + File.separator + dataset.getAuthority() + 
File.separator + dataset.getIdentifier()); - // TODO: - // The above goes directly to the filesystem directory configured by the - // old "dataverse.files.directory" JVM option (otherwise used for temp - // files only, after the Multistore implementation (#6488). - // We probably want package files to be able to use specific stores instead. - // More importantly perhaps, the approach above does not take into account - // if the dataset may have an AlternativePersistentIdentifier, that may be - // designated isStorageLocationDesignator() - i.e., if a different identifer - // needs to be used to name the storage directory, instead of the main/current - // persistent identifier above. + + File directory = new File( + String.join(File.separator, JvmSettings.FILES_DIRECTORY.lookup(), + dataset.getAuthority(), dataset.getIdentifier())); + + // TODO: The above goes directly to the filesystem directory configured by the + // old "dataverse.files.directory" JVM option (otherwise used for temp + // files only, after the Multistore implementation (#6488). + // We probably want package files to be able to use specific stores instead. + // More importantly perhaps, the approach above does not take into account + // if the dataset may have an AlternativePersistentIdentifier, that may be + // designated isStorageLocationDesignator() - i.e., if a different identifier + // needs to be used to name the storage directory, instead of the main/current + // persistent identifier above. if (!isValidDirectory(directory)) { String error = "Dataset directory is invalid. " + directory; logger.info(error); @@ -93,11 +98,10 @@ public JsonObject execute(CommandContext ctxt) throws CommandException { throw new IllegalCommandException(error, this); } - File uploadDirectory = new File(System.getProperty("dataverse.files.directory") - + File.separator + dataset.getAuthority() + File.separator + dataset.getIdentifier() - + File.separator + uploadFolder); - // TODO: - // see the comment above.
+ File uploadDirectory = new File(String.join(File.separator, JvmSettings.FILES_DIRECTORY.lookup(), + dataset.getAuthority(), dataset.getIdentifier(), uploadFolder)); + + // TODO: see the comment above. if (!isValidDirectory(uploadDirectory)) { String error = "Upload folder is not a valid directory."; logger.info(error); diff --git a/src/main/java/edu/harvard/iq/dataverse/export/ddi/DdiExportUtil.java b/src/main/java/edu/harvard/iq/dataverse/export/ddi/DdiExportUtil.java index 1952acb67a3..166de10952b 100644 --- a/src/main/java/edu/harvard/iq/dataverse/export/ddi/DdiExportUtil.java +++ b/src/main/java/edu/harvard/iq/dataverse/export/ddi/DdiExportUtil.java @@ -32,18 +32,15 @@ import edu.harvard.iq.dataverse.export.DDIExporter; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; -import static edu.harvard.iq.dataverse.util.SystemConfig.FQDN; -import static edu.harvard.iq.dataverse.util.SystemConfig.SITE_URL; import edu.harvard.iq.dataverse.util.BundleUtil; import edu.harvard.iq.dataverse.util.FileUtil; +import edu.harvard.iq.dataverse.util.SystemConfig; import edu.harvard.iq.dataverse.util.json.JsonUtil; import edu.harvard.iq.dataverse.util.xml.XmlPrinter; import java.io.ByteArrayOutputStream; import java.io.IOException; import java.io.OutputStream; -import java.net.InetAddress; -import java.net.UnknownHostException; import java.nio.file.Files; import java.nio.file.Path; import java.nio.file.Paths; @@ -1300,7 +1297,7 @@ private static void writeNotesElement(XMLStreamWriter xmlw, DatasetVersionDTO da // harvesting *all* files are encoded as otherMats; even tabular ones. 
private static void createOtherMats(XMLStreamWriter xmlw, List<FileDTO> fileDtos) throws XMLStreamException { // The preferred URL for this dataverse, for cooking up the file access API links: - String dataverseUrl = getDataverseSiteUrl(); + String dataverseUrl = SystemConfig.getDataverseSiteUrlStatic(); for (FileDTO fileDTo : fileDtos) { // We'll continue using the scheme we've used before, in DVN2-3: non-tabular files are put into otherMat, @@ -1347,7 +1344,7 @@ private static void createOtherMats(XMLStreamWriter xmlw, List<FileDTO> fileDtos private static void createOtherMatsFromFileMetadatas(XMLStreamWriter xmlw, List<FileMetadata> fileMetadatas) throws XMLStreamException { // The preferred URL for this dataverse, for cooking up the file access API links: - String dataverseUrl = getDataverseSiteUrl(); + String dataverseUrl = SystemConfig.getDataverseSiteUrlStatic(); for (FileMetadata fileMetadata : fileMetadatas) { // We'll continue using the scheme we've used before, in DVN2-3: non-tabular files are put into otherMat, @@ -1563,33 +1560,6 @@ private static void saveJsonToDisk(String datasetVersionAsJson) throws IOExcepti Files.write(Paths.get("/tmp/out.json"), datasetVersionAsJson.getBytes()); } - /** - * The "official", designated URL of the site; - * can be defined as a complete URL; or derived from the - * "official" hostname.
If none of these options is set, - * defaults to the InetAddress.getLocalHOst() and https; - */ - private static String getDataverseSiteUrl() { - String hostUrl = System.getProperty(SITE_URL); - if (hostUrl != null && !"".equals(hostUrl)) { - return hostUrl; - } - String hostName = System.getProperty(FQDN); - if (hostName == null) { - try { - hostName = InetAddress.getLocalHost().getCanonicalHostName(); - } catch (UnknownHostException e) { - hostName = null; - } - } - - if (hostName != null) { - return "https://" + hostName; - } - - return "http://localhost:8080"; - } - @@ -1901,7 +1871,7 @@ private static void createVarDDI(XMLStreamWriter xmlw, DataVariable dv, FileMeta } private static void createFileDscr(XMLStreamWriter xmlw, DatasetVersion datasetVersion) throws XMLStreamException { - String dataverseUrl = getDataverseSiteUrl(); + String dataverseUrl = SystemConfig.getDataverseSiteUrlStatic(); for (FileMetadata fileMetadata : datasetVersion.getFileMetadatas()) { DataFile dataFile = fileMetadata.getDataFile(); diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xrecord.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xrecord.java index 7e115c78f06..4485b798658 100644 --- a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xrecord.java +++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xrecord.java @@ -8,14 +8,12 @@ import edu.harvard.iq.dataverse.Dataset; import edu.harvard.iq.dataverse.export.ExportException; import edu.harvard.iq.dataverse.export.ExportService; -import static edu.harvard.iq.dataverse.util.SystemConfig.FQDN; -import static edu.harvard.iq.dataverse.util.SystemConfig.SITE_URL; import java.io.ByteArrayOutputStream; import java.io.IOException; import java.io.InputStream; import java.io.OutputStream; -import java.net.InetAddress; -import java.net.UnknownHostException; + +import edu.harvard.iq.dataverse.util.SystemConfig; import org.apache.poi.util.ReplacingInputStream; /** @@ -149,7 +147,7 
@@ private void writeMetadataStream(InputStream inputStream, OutputStream outputStr private String customMetadataExtensionRef(String identifier) { String ret = "<" + METADATA_FIELD + " directApiCall=\"" - + getDataverseSiteUrl() + + SystemConfig.getDataverseSiteUrlStatic() + DATAVERSE_EXTENDED_METADATA_API + "?exporter=" + DATAVERSE_EXTENDED_METADATA_FORMAT @@ -164,21 +162,4 @@ private String customMetadataExtensionRef(String identifier) { private boolean isExtendedDataverseMetadataMode(String formatName) { return DATAVERSE_EXTENDED_METADATA_FORMAT.equals(formatName); } - - private String getDataverseSiteUrl() { - String hostUrl = System.getProperty(SITE_URL); - if (hostUrl != null && !"".equals(hostUrl)) { - return hostUrl; - } - String hostName = System.getProperty(FQDN); - if (hostName == null) { - try { - hostName = InetAddress.getLocalHost().getCanonicalHostName(); - } catch (UnknownHostException e) { - return null; - } - } - hostUrl = "https://" + hostName; - return hostUrl; - } } diff --git a/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java index bd66e822c20..7ddefc0a6ba 100644 --- a/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java @@ -30,6 +30,7 @@ import edu.harvard.iq.dataverse.datavariable.VariableMetadataUtil; import edu.harvard.iq.dataverse.datavariable.VariableServiceBean; import edu.harvard.iq.dataverse.harvest.client.HarvestingClient; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import edu.harvard.iq.dataverse.util.FileUtil; import edu.harvard.iq.dataverse.util.StringUtil; @@ -85,6 +86,8 @@ import org.apache.tika.metadata.Metadata; import org.apache.tika.parser.ParseContext; import org.apache.tika.sax.BodyContentHandler; +import org.eclipse.microprofile.config.Config; +import 
org.eclipse.microprofile.config.ConfigProvider; import org.xml.sax.ContentHandler; @Stateless @@ -92,6 +95,7 @@ public class IndexServiceBean { private static final Logger logger = Logger.getLogger(IndexServiceBean.class.getCanonicalName()); + private static final Config config = ConfigProvider.getConfig(); @PersistenceContext(unitName = "VDCNet-ejbPU") private EntityManager em; @@ -152,13 +156,16 @@ public class IndexServiceBean { public static final String HARVESTED = "Harvested"; private String rootDataverseName; private Dataverse rootDataverseCached; - private SolrClient solrServer; + SolrClient solrServer; private VariableMetadataUtil variableMetadataUtil; @PostConstruct public void init() { - String urlString = "http://" + systemConfig.getSolrHostColonPort() + "/solr/collection1"; + String protocol = JvmSettings.SOLR_PROT.lookup(); + String path = JvmSettings.SOLR_PATH.lookup(); + + String urlString = protocol + "://" + systemConfig.getSolrHostColonPort() + path; solrServer = new HttpSolrClient.Builder(urlString).build(); rootDataverseName = findRootDataverseCached().getName(); diff --git a/src/main/java/edu/harvard/iq/dataverse/search/SolrClientService.java b/src/main/java/edu/harvard/iq/dataverse/search/SolrClientService.java index f00ece9aacc..355c6c198e4 100644 --- a/src/main/java/edu/harvard/iq/dataverse/search/SolrClientService.java +++ b/src/main/java/edu/harvard/iq/dataverse/search/SolrClientService.java @@ -5,6 +5,7 @@ */ package edu.harvard.iq.dataverse.search; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.util.SystemConfig; import java.io.IOException; import java.util.logging.Logger; @@ -15,6 +16,8 @@ import javax.inject.Named; import org.apache.solr.client.solrj.SolrClient; import org.apache.solr.client.solrj.impl.HttpSolrClient; +import org.eclipse.microprofile.config.Config; +import org.eclipse.microprofile.config.ConfigProvider; /** * @@ -30,6 +33,7 @@ @Singleton public class SolrClientService { private 
static final Logger logger = Logger.getLogger(SolrClientService.class.getCanonicalName()); + private static final Config config = ConfigProvider.getConfig(); @EJB SystemConfig systemConfig; @@ -38,9 +42,11 @@ public class SolrClientService { @PostConstruct public void init() { - String urlString = "http://" + systemConfig.getSolrHostColonPort() + "/solr/collection1"; - solrClient = new HttpSolrClient.Builder(urlString).build(); + String protocol = JvmSettings.SOLR_PROT.lookup(); + String path = JvmSettings.SOLR_PATH.lookup(); + String urlString = protocol + "://" + systemConfig.getSolrHostColonPort() + path; + solrClient = new HttpSolrClient.Builder(urlString).build(); } @PreDestroy diff --git a/src/main/java/edu/harvard/iq/dataverse/settings/JvmSettings.java b/src/main/java/edu/harvard/iq/dataverse/settings/JvmSettings.java new file mode 100644 index 00000000000..fd051cd776b --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/settings/JvmSettings.java @@ -0,0 +1,153 @@ +package edu.harvard.iq.dataverse.settings; + +import org.eclipse.microprofile.config.ConfigProvider; + +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Collections; +import java.util.List; +import java.util.Optional; +import java.util.stream.Collectors; + +/** + * Enum to store each and every JVM-based setting as a reference, + * much like the enum {@link SettingsServiceBean.Key} for DB settings. + * + * To be able to have more control over JVM settings names, + * avoid typos, maybe create lists of settings and so on, + * this enum will provide the place to add any old and new + * settings that are destined to be made at the JVM level. + * + * Further extensions of this class include + * - adding predicates for validation and + * - offering injecting parameters into keys (as used with the file access subsystem) and + * - adding data manipulation for aliased config names. + */ +public enum JvmSettings { + // the upmost root scope - every setting shall start with it. 
+ PREFIX("dataverse"), + + // GENERAL SETTINGS + VERSION(PREFIX, "version"), + BUILD(PREFIX, "build"), + FQDN(PREFIX, "fqdn"), + SITE_URL(PREFIX, "siteUrl"), + + // FILES SETTINGS + SCOPE_FILES(PREFIX, "files"), + FILES_DIRECTORY(SCOPE_FILES, "directory"), + + // SOLR INDEX SETTINGS + SCOPE_SOLR(PREFIX, "solr"), + SOLR_HOST(SCOPE_SOLR, "host"), + SOLR_PORT(SCOPE_SOLR, "port"), + SOLR_PROT(SCOPE_SOLR, "protocol"), + SOLR_CORE(SCOPE_SOLR, "core"), + SOLR_PATH(SCOPE_SOLR, "path"), + + // PERSISTENT IDENTIFIER SETTINGS + SCOPE_PID(PREFIX, "pid"), + + // PROVIDER EZID (legacy) - these settings were formerly kept together with DataCite ones + SCOPE_PID_EZID(SCOPE_PID, "ezid"), + EZID_API_URL(SCOPE_PID_EZID, "api-url", "doi.baseurlstring"), + EZID_USERNAME(SCOPE_PID_EZID, "username", "doi.username"), + EZID_PASSWORD(SCOPE_PID_EZID, "password", "doi.password"), + + // PROVIDER DATACITE + SCOPE_PID_DATACITE(SCOPE_PID, "datacite"), + DATACITE_MDS_API_URL(SCOPE_PID_DATACITE, "mds-api-url", "doi.baseurlstring"), + DATACITE_REST_API_URL(SCOPE_PID_DATACITE, "rest-api-url", "doi.dataciterestapiurlstring", "doi.mdcbaseurlstring"), + DATACITE_USERNAME(SCOPE_PID_DATACITE, "username", "doi.username"), + DATACITE_PASSWORD(SCOPE_PID_DATACITE, "password", "doi.password"), + + + ; + + private static final String SCOPE_SEPARATOR = "."; + + private final String key; + private final String scopedKey; + private final JvmSettings parent; + private final List<String> oldNames; + + JvmSettings(String key) { + this.key = key; + this.scopedKey = key; + this.parent = null; + this.oldNames = List.of(); + } + + JvmSettings(JvmSettings scope, String key) { + this.key = key; + this.scopedKey = scope.scopedKey + SCOPE_SEPARATOR + key; + this.parent = scope; + this.oldNames = List.of(); + } + + JvmSettings(JvmSettings scope, String key, String... 
oldNames) { + this.key = key; + this.scopedKey = scope.scopedKey + SCOPE_SEPARATOR + key; + this.parent = scope; + this.oldNames = Arrays.stream(oldNames).collect(Collectors.toUnmodifiableList()); + } + + private static final List aliased = new ArrayList<>(); + static { + for (JvmSettings setting : JvmSettings.values()) { + if (!setting.oldNames.isEmpty()) { + aliased.add(setting); + } + } + } + + /** + * Get all settings having old names to include them in {@link edu.harvard.iq.dataverse.settings.source.AliasConfigSource} + * @return List of settings with old alias names. Can be empty, but will not be null. + */ + public static List getAliasedSettings() { + return Collections.unmodifiableList(aliased); + } + + /** + * Return a list of old names to be used as aliases for backward compatibility. + * Will return empty list if no old names present. + * + * @return List of old names, may be empty, but never null. + */ + public List getOldNames() { + return oldNames; + } + + /** + * Retrieve the scoped key for this setting. Scopes are separated by dots. + * + * @return The scoped key (or the key if no scope). Example: dataverse.subscope.subsubscope.key + */ + public String getScopedKey() { + return this.scopedKey; + } + + /** + * Lookup this setting via MicroProfile Config as a required option (it will fail if not present). + * @throws java.util.NoSuchElementException - if the property is not defined or is defined as an empty string + * @return The setting as a String + */ + public String lookup() { + // This must be done with the full-fledged lookup, as we cannot store the config in an instance or static + // variable, as the alias config source depends on this enum (circular dependency). This is easiest + // avoided by looking up the static cached config at the cost of a method invocation. + return ConfigProvider.getConfig().getValue(this.getScopedKey(), String.class); + } + + /** + * Lookup this setting via MicroProfile Config as an optional setting. 
+ * @return The setting as String wrapped in a (potentially empty) Optional + */ + public Optional<String> lookupOptional() { + // This must be done with the full-fledged lookup, as we cannot store the config in an instance or static + // variable, as the alias config source depends on this enum (circular dependency). This is easiest + // avoided by looking up the static cached config at the cost of a method invocation. + return ConfigProvider.getConfig().getOptionalValue(this.getScopedKey(), String.class); + } +} diff --git a/src/main/java/edu/harvard/iq/dataverse/settings/SettingsServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/settings/SettingsServiceBean.java index 12ae777f3f8..c60918bb1ab 100644 --- a/src/main/java/edu/harvard/iq/dataverse/settings/SettingsServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/settings/SettingsServiceBean.java @@ -174,7 +174,12 @@ public enum Key { * */ SearchRespectPermissionRoot, - /** Solr hostname and port, such as "localhost:8983". */ + /** + * Solr hostname and port, such as "localhost:8983". + * @deprecated New installations should not use this database setting, but use {@link JvmSettings#SOLR_HOST} + * and {@link JvmSettings#SOLR_PORT}.
+ */ + @Deprecated(forRemoval = true, since = "2022-07-01") SolrHostColonPort, /** Enable full-text indexing in solr up to max file size */ SolrFullTextIndexing, //true or false (default) diff --git a/src/main/java/edu/harvard/iq/dataverse/settings/source/AliasConfigSource.java b/src/main/java/edu/harvard/iq/dataverse/settings/source/AliasConfigSource.java index fbdbd982085..ebc415cdd5f 100644 --- a/src/main/java/edu/harvard/iq/dataverse/settings/source/AliasConfigSource.java +++ b/src/main/java/edu/harvard/iq/dataverse/settings/source/AliasConfigSource.java @@ -1,5 +1,6 @@ package edu.harvard.iq.dataverse.settings.source; +import edu.harvard.iq.dataverse.settings.JvmSettings; import org.eclipse.microprofile.config.ConfigProvider; import org.eclipse.microprofile.config.spi.ConfigSource; @@ -7,6 +8,7 @@ import java.net.URL; import java.util.HashMap; import java.util.HashSet; +import java.util.List; import java.util.Map; import java.util.Properties; import java.util.Set; @@ -16,9 +18,6 @@ /** * Enable using an old name for a new config name. * Usages will be logged and this source will ALWAYS stand back if the new name is used anywhere. - * - * By using a DbSettingConfigSource value (dataverse.settings.fromdb.XXX) as old name, we can - * alias a new name to an old db setting, enabling backward compatibility. */ public final class AliasConfigSource implements ConfigSource { @@ -33,11 +32,20 @@ public AliasConfigSource() { Properties aliasProps = readAliases(ALIASES_PROP_FILE); // store in our aliases map importAliases(aliasProps); + // also store all old names from JvmSettings + importJvmSettings(JvmSettings.getAliasedSettings()); } catch (IOException e) { logger.info("Could not read from "+ALIASES_PROP_FILE+". Skipping MPCONFIG alias setup.");
} } + void importJvmSettings(List<JvmSettings> aliasedSettings) { + aliasedSettings.forEach( + setting -> setting.getOldNames().forEach( + oldName -> aliases.put(setting.getScopedKey(), oldName))); + } + + Properties readAliases(String filePath) throws IOException { // get resource from classpath ClassLoader classLoader = this.getClass().getClassLoader(); diff --git a/src/main/java/edu/harvard/iq/dataverse/settings/source/DbSettingConfigHelper.java b/src/main/java/edu/harvard/iq/dataverse/settings/source/DbSettingConfigHelper.java deleted file mode 100644 index 7b9783dee06..00000000000 --- a/src/main/java/edu/harvard/iq/dataverse/settings/source/DbSettingConfigHelper.java +++ /dev/null @@ -1,27 +0,0 @@ -package edu.harvard.iq.dataverse.settings.source; - -import edu.harvard.iq.dataverse.settings.SettingsServiceBean; - -import javax.annotation.PostConstruct; -import javax.ejb.EJB; -import javax.ejb.Singleton; -import javax.ejb.Startup; - -/** - * This is a small helper bean for the MPCONFIG DbSettingConfigSource. - * As it is a singleton and built at application start (=deployment), it will inject the (stateless) - * settings service into the MPCONFIG POJO once it's ready. - * - * MPCONFIG requires it's sources to be POJOs. No direct dependency injection possible.
- */ -@Singleton -@Startup -public class DbSettingConfigHelper { - @EJB - SettingsServiceBean settingsSvc; - - @PostConstruct - public void injectService() { - DbSettingConfigSource.injectSettingsService(settingsSvc); - } -} diff --git a/src/main/java/edu/harvard/iq/dataverse/settings/source/DbSettingConfigSource.java b/src/main/java/edu/harvard/iq/dataverse/settings/source/DbSettingConfigSource.java deleted file mode 100644 index 838cd415819..00000000000 --- a/src/main/java/edu/harvard/iq/dataverse/settings/source/DbSettingConfigSource.java +++ /dev/null @@ -1,83 +0,0 @@ -package edu.harvard.iq.dataverse.settings.source; - -import edu.harvard.iq.dataverse.settings.Setting; -import edu.harvard.iq.dataverse.settings.SettingsServiceBean; -import org.eclipse.microprofile.config.spi.ConfigSource; - -import java.time.Duration; -import java.time.Instant; -import java.util.Map; -import java.util.Set; -import java.util.concurrent.ConcurrentHashMap; -import java.util.logging.Logger; - -/** - * A caching wrapper around SettingServiceBean to provide database settings to MicroProfile Config API. - * Please be aware that this class relies on dependency injection during the application startup. - * Values will not be available before and a severe message will be logged to allow monitoring (potential race conditions) - * The settings will be cached for at least one minute, avoiding unnecessary database calls. 
- */ -public class DbSettingConfigSource implements ConfigSource { - - private static final Logger logger = Logger.getLogger(DbSettingConfigSource.class.getCanonicalName()); - private static final ConcurrentHashMap<String, String> properties = new ConcurrentHashMap<>(); - private static Instant lastUpdate; - private static SettingsServiceBean settingsSvc; - public static final String PREFIX = "dataverse.settings.fromdb"; - - /** - * Let the SettingsServiceBean be injected by DbSettingConfigHelper with PostConstruct - * @param injected - */ - public static void injectSettingsService(SettingsServiceBean injected) { - settingsSvc = injected; - updateProperties(); - } - - /** - * Retrieve settings from the database via service and update cache. - */ - public static void updateProperties() { - // skip if the service has not been injected yet - if (settingsSvc == null) { - return; - } - properties.clear(); - Set<Setting> dbSettings = settingsSvc.listAll(); - dbSettings.forEach(s -> properties.put(PREFIX+"."+s.getName().substring(1) + (s.getLang() == null ? "" : "."+s.getLang()), s.getContent())); - lastUpdate = Instant.now(); - } - - @Override - public Map<String, String> getProperties() { - // if the cache is at least XX number of seconds old, update before serving data. - if (lastUpdate == null || Instant.now().minus(Duration.ofSeconds(60)).isAfter(lastUpdate)) { - updateProperties(); - } - return properties; - } - - @Override - public Set<String> getPropertyNames() { - return getProperties().keySet(); - } - - @Override - public int getOrdinal() { - return 50; - } - - @Override - public String getValue(String key) { - // log usages for which this has been designed, but not yet ready to serve...
- if (settingsSvc == null && key.startsWith(PREFIX)) { - logger.severe("MPCONFIG DbSettingConfigSource not ready yet, but requested for '"+key+"'."); - } - return getProperties().getOrDefault(key, null); - } - - @Override - public String getName() { - return "DataverseDB"; - } -} diff --git a/src/main/java/edu/harvard/iq/dataverse/settings/spi/DbSettingConfigSourceProvider.java b/src/main/java/edu/harvard/iq/dataverse/settings/spi/DbSettingConfigSourceProvider.java deleted file mode 100644 index 856a2c64a01..00000000000 --- a/src/main/java/edu/harvard/iq/dataverse/settings/spi/DbSettingConfigSourceProvider.java +++ /dev/null @@ -1,14 +0,0 @@ -package edu.harvard.iq.dataverse.settings.spi; - -import edu.harvard.iq.dataverse.settings.source.DbSettingConfigSource; -import org.eclipse.microprofile.config.spi.ConfigSource; -import org.eclipse.microprofile.config.spi.ConfigSourceProvider; - -import java.util.Arrays; - -public class DbSettingConfigSourceProvider implements ConfigSourceProvider { - @Override - public Iterable<ConfigSource> getConfigSources(ClassLoader forClassLoader) { - return Arrays.asList(new DbSettingConfigSource()); - } -} \ No newline at end of file diff --git a/src/main/java/edu/harvard/iq/dataverse/util/FileUtil.java b/src/main/java/edu/harvard/iq/dataverse/util/FileUtil.java index 64dadc54a4a..66c60c1150e 100644 --- a/src/main/java/edu/harvard/iq/dataverse/util/FileUtil.java +++ b/src/main/java/edu/harvard/iq/dataverse/util/FileUtil.java @@ -40,6 +40,7 @@ import edu.harvard.iq.dataverse.ingest.IngestServiceShapefileHelper; import edu.harvard.iq.dataverse.ingest.IngestableDataChecker; import edu.harvard.iq.dataverse.license.License; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.util.file.BagItFileHandler; import edu.harvard.iq.dataverse.util.file.CreateDataFileResult; import edu.harvard.iq.dataverse.util.file.BagItFileHandlerFactory; @@ -1386,11 +1387,8 @@ public static boolean canIngestAsTabular(String mimeType) { }
public static String getFilesTempDirectory() { - String filesRootDirectory = System.getProperty("dataverse.files.directory"); - if (filesRootDirectory == null || filesRootDirectory.equals("")) { - filesRootDirectory = "/tmp/files"; - } - + + String filesRootDirectory = JvmSettings.FILES_DIRECTORY.lookup(); String filesTempDirectory = filesRootDirectory + "/temp"; if (!Files.exists(Paths.get(filesTempDirectory))) { diff --git a/src/main/java/edu/harvard/iq/dataverse/util/SystemConfig.java b/src/main/java/edu/harvard/iq/dataverse/util/SystemConfig.java index bd27405fae5..2a2493cc2ac 100644 --- a/src/main/java/edu/harvard/iq/dataverse/util/SystemConfig.java +++ b/src/main/java/edu/harvard/iq/dataverse/util/SystemConfig.java @@ -1,18 +1,28 @@ package edu.harvard.iq.dataverse.util; import com.ocpsoft.pretty.PrettyContext; - import edu.harvard.iq.dataverse.DataFile; import edu.harvard.iq.dataverse.DataverseServiceBean; import edu.harvard.iq.dataverse.DvObjectContainer; import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean; import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinAuthenticationProvider; import edu.harvard.iq.dataverse.authorization.providers.oauth2.AbstractOAuth2AuthenticationProvider; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import edu.harvard.iq.dataverse.validation.PasswordValidatorUtil; -import java.io.FileInputStream; -import java.io.IOException; -import java.io.InputStream; +import org.eclipse.microprofile.config.Config; +import org.eclipse.microprofile.config.ConfigProvider; +import org.passay.CharacterRule; + +import javax.ejb.EJB; +import javax.ejb.Stateless; +import javax.inject.Named; +import javax.json.Json; +import javax.json.JsonArray; +import javax.json.JsonObject; +import javax.json.JsonReader; +import javax.json.JsonString; +import javax.json.JsonValue; import java.io.StringReader; import java.net.InetAddress; import java.net.UnknownHostException;
@@ -23,25 +33,12 @@ import java.util.List; import java.util.Map; import java.util.MissingResourceException; -import java.util.Properties; +import java.util.Optional; import java.util.ResourceBundle; import java.util.logging.Logger; import java.util.regex.Matcher; import java.util.regex.Pattern; -import javax.ejb.EJB; -import javax.ejb.Stateless; -import javax.inject.Named; -import javax.json.Json; -import javax.json.JsonArray; -import javax.json.JsonObject; -import javax.json.JsonReader; -import javax.json.JsonString; -import javax.json.JsonValue; - -import org.passay.CharacterRule; -import org.apache.commons.io.IOUtils; - /** * System-wide configuration */ @@ -50,6 +47,7 @@ public class SystemConfig { private static final Logger logger = Logger.getLogger(SystemConfig.class.getCanonicalName()); + private static final Config config = ConfigProvider.getConfig(); @EJB SettingsServiceBean settingsService; @@ -61,28 +59,7 @@ public class SystemConfig { AuthenticationServiceBean authenticationService; public static final String DATAVERSE_PATH = "/dataverse/"; - - /** - * A JVM option for the advertised fully qualified domain name (hostname) of - * the Dataverse installation, such as "dataverse.example.com", which may - * differ from the hostname that the server knows itself as. - * - * The equivalent in DVN 3.x was "dvn.inetAddress". - */ - public static final String FQDN = "dataverse.fqdn"; - - /** - * A JVM option for specifying the "official" URL of the site. - * Unlike the FQDN option above, this would be a complete URL, - * with the protocol, port number etc. - */ - public static final String SITE_URL = "dataverse.siteUrl"; - - /** - * A JVM option for where files are stored on the file system. - */ - public static final String FILES_DIRECTORY = "dataverse.files.directory"; - + /** * Some installations may not want download URLs to their files to be * available in Schema.org JSON-LD output.
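The new JvmSettings enum that SystemConfig now relies on builds a dotted, scoped key per constant (parent scope plus its own key) and resolves it at call time. As a rough self-contained sketch of that scoped-key pattern — a hypothetical stand-in that reads plain system properties instead of going through the MicroProfile Config API, and not the actual Dataverse class:

```java
import java.util.Optional;

// Minimal sketch of the scoped-key idea behind JvmSettings (hypothetical
// stand-in): each enum constant either opens a new scope or appends its
// own key to the parent's scoped key, separated by a dot.
public class ScopedSettingsSketch {

    public enum Setting {
        PREFIX("dataverse"),
        SCOPE_SOLR(PREFIX, "solr"),
        SOLR_HOST(SCOPE_SOLR, "host"),
        SOLR_PORT(SCOPE_SOLR, "port");

        public final String scopedKey;

        // root scope: the key is the scoped key
        Setting(String key) {
            this.scopedKey = key;
        }

        // nested setting: parent scope + "." + own key
        Setting(Setting scope, String key) {
            this.scopedKey = scope.scopedKey + "." + key;
        }

        // required lookup: fails loudly when the value is missing
        public String lookup() {
            return lookupOptional().orElseThrow(
                    () -> new IllegalStateException(scopedKey + " is not set"));
        }

        // optional lookup: callers decide how to handle absence
        public Optional<String> lookupOptional() {
            return Optional.ofNullable(System.getProperty(scopedKey));
        }
    }

    public static void main(String[] args) {
        System.setProperty("dataverse.solr.host", "localhost");

        System.out.println(Setting.SOLR_HOST.scopedKey); // dataverse.solr.host
        System.out.println(Setting.SOLR_HOST.lookup());  // localhost
        System.out.println(Setting.SOLR_PORT.lookupOptional().orElse("8983")); // 8983
    }
}
```

The real enum additionally records old setting names per constant so that AliasConfigSource can keep serving legacy keys such as doi.baseurlstring, and it delegates to ConfigProvider.getConfig() so values can come from any MicroProfile Config source (env vars, system properties, microprofile-config.properties).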
@@ -95,12 +72,6 @@ public class SystemConfig { */ private static final String PASSWORD_RESET_TIMEOUT_IN_MINUTES = "dataverse.auth.password-reset-timeout-in-minutes"; - /** - * A common place to find the String for a sane Solr hostname:port - * combination. - */ - private String saneDefaultForSolrHostColonPort = "localhost:8983"; - /** * The default number of datafiles that we allow to be created through * zip file upload. @@ -109,9 +80,8 @@ public class SystemConfig { public static final long defaultZipDownloadLimit = 104857600L; // 100MB private static final int defaultMultipleUploadFilesLimit = 1000; private static final int defaultLoginSessionTimeout = 480; // = 8 hours - - private static String appVersionString = null; - private static String buildNumberString = null; + + private String buildNumber = null; private static final String JVM_TIMER_SERVER_OPTION = "dataverse.timerServer"; @@ -132,137 +102,61 @@ public String getVersion() { // candidate for being moved into some kind of an application-scoped caching // service... some CachingService @Singleton - ? (L.A. 5.8) public String getVersion(boolean withBuildNumber) { - - if (appVersionString == null) { - - // The Version Number is no longer supplied in a .properties file - so - // we can't just do - // return BundleUtil.getStringFromBundle("version.number", null, ResourceBundle.getBundle("VersionNumber", Locale.US)); - // - // Instead, we'll rely on Maven placing the version number into the - // Manifest, and getting it from there: - // (this is considered a better practice, and will also allow us - // to maintain this number in only one place - the pom.xml file) - // -- L.A. 4.0.2 - - // One would assume, that once the version is in the MANIFEST.MF, - // as Implementation-Version:, it would be possible to obtain - // said version simply as - // appVersionString = getClass().getPackage().getImplementationVersion(); - // alas - that's not working, for whatever reason. 
(perhaps that's - // only how it works with jar-ed packages; not with .war files). - // People on the interwebs suggest that one should instead - // open the Manifest as a resource, then extract its attributes. - // There were some complications with that too. Plus, relying solely - // on the MANIFEST.MF would NOT work for those of the developers who - // are using "in place deployment" (i.e., where - // Netbeans runs their builds directly from the local target - // directory, bypassing the war file deployment; and the Manifest - // is only available in the .war file). For that reason, I am - // going to rely on the pom.properties file, and use java.util.Properties - // to read it. We have to look for this file in 2 different places - // depending on whether this is a .war file deployment, or a - // developers build. (the app-level META-INF is only populated when - // a .war file is built; the "maven-archiver" directory, on the other - // hand, is only available when it's a local build deployment). - // So, long story short, I'm resorting to the convoluted steps below. - // It may look hacky, but it should actually be pretty solid and - // reliable. - - - // First, find the absolute path url of the application persistence file - // always supplied with the Dataverse app: - java.net.URL fileUrl = Thread.currentThread().getContextClassLoader().getResource("META-INF/persistence.xml"); - String filePath = null; - - - if (fileUrl != null) { - filePath = fileUrl.getFile(); - if (filePath != null) { - InputStream mavenPropertiesInputStream = null; - String mavenPropertiesFilePath; - Properties mavenProperties = new Properties(); - - - filePath = filePath.replaceFirst("/[^/]*$", "/"); - // Using a relative path, find the location of the maven pom.properties file. - // First, try to look for it in the app-level META-INF. 
This will only be - available if it's a war file deployment: - mavenPropertiesFilePath = filePath.concat("../../../META-INF/maven/edu.harvard.iq/dataverse/pom.properties"); - - try { - mavenPropertiesInputStream = new FileInputStream(mavenPropertiesFilePath); - } catch (IOException ioex) { - // OK, let's hope this is a local dev. build. - // In that case the properties file should be available in - // the maven-archiver directory: - - mavenPropertiesFilePath = filePath.concat("../../../../maven-archiver/pom.properties"); - - // try again: - - try { - mavenPropertiesInputStream = new FileInputStream(mavenPropertiesFilePath); - } catch (IOException ioex2) { - logger.warning("Failed to find and/or open for reading the pom.properties file."); - mavenPropertiesInputStream = null; - } - } - - if (mavenPropertiesInputStream != null) { - try { - mavenProperties.load(mavenPropertiesInputStream); - appVersionString = mavenProperties.getProperty("version"); - } catch (IOException ioex) { - logger.warning("caught IOException trying to read and parse the pom properties file."); - } finally { - IOUtils.closeQuietly(mavenPropertiesInputStream); - } - } - - } else { - logger.warning("Null file path representation of the location of persistence.xml in the webapp root directory!"); - } - } else { - logger.warning("Could not find the location of persistence.xml in the webapp root directory!"); - } - - - if (appVersionString == null) { - // still null? - defaulting to 4.0: - appVersionString = "4.0"; - } - } + // Retrieve the version via MPCONFIG + // NOTE: You may override the version via all methods of MPCONFIG. + // It defaults to reading from the microprofile-config.properties source, + // which contains a Maven property reference to ${project.version}. + // When packaging the app to deploy it, Maven will replace this, rendering it a static entry. + // NOTE: MicroProfile Config will cache the entry for us in internal maps.
+ String appVersion = JvmSettings.VERSION.lookup(); if (withBuildNumber) { - if (buildNumberString == null) { - // (build number is still in a .properties file in the source tree; it only - // contains a real build number if this war file was built by - // Jenkins) - + if (buildNumber == null) { + // (build number is still in a .properties file in the source tree; it only + // contains a real build number if this war file was built by Jenkins) + // TODO: might be replaced with same trick as for version via Maven property w/ empty default try { - buildNumberString = ResourceBundle.getBundle("BuildNumber").getString("build.number"); + buildNumber = ResourceBundle.getBundle("BuildNumber").getString("build.number"); } catch (MissingResourceException ex) { - buildNumberString = null; + buildNumber = null; + } + + // Also try to read the build number via MicroProfile Config if not already present from the + // properties file (so it can be overridden by env var or other source) + if (buildNumber == null || buildNumber.isEmpty()) { + buildNumber = JvmSettings.BUILD.lookupOptional().orElse(""); + } } - if (buildNumberString != null && !buildNumberString.equals("")) { - return appVersionString + " build " + buildNumberString; - } - } + if (!buildNumber.equals("")) { + return appVersion + " build " + buildNumber; + } + } - return appVersionString; + return appVersion; } - + + /** + * Retrieve the Solr endpoint in "host:port" form, to be used with a Solr client. + * + * This will retrieve the setting from either the database ({@link SettingsServiceBean.Key#SolrHostColonPort}) or + * via the MicroProfile Config API (properties {@link JvmSettings#SOLR_HOST} and {@link JvmSettings#SOLR_PORT}). + * + * A database setting always takes precedence. If not given via other config sources, a default from + * resources/META-INF/microprofile-config.properties is used. (It's possible to use profiles.)
+ * + * @return Solr endpoint as string "hostname:port" + */ public String getSolrHostColonPort() { - String SolrHost; - if ( System.getenv("SOLR_SERVICE_HOST") != null && System.getenv("SOLR_SERVICE_HOST") != ""){ - SolrHost = System.getenv("SOLR_SERVICE_HOST"); - } - else SolrHost = saneDefaultForSolrHostColonPort; - String solrHostColonPort = settingsService.getValueForKey(SettingsServiceBean.Key.SolrHostColonPort, SolrHost); - return solrHostColonPort; + // Get from MPCONFIG. Might be configured by a sysadmin or simply return the default shipped with + // resources/META-INF/microprofile-config.properties. + // NOTE: containers should use the system property mp.config.profile=ct for sane container usage defaults + String host = JvmSettings.SOLR_HOST.lookup(); + String port = JvmSettings.SOLR_PORT.lookup(); + + // DB setting takes precedence over all. If not present, will return default from above. + return Optional.ofNullable(settingsService.getValueForKey(SettingsServiceBean.Key.SolrHostColonPort)) + .orElse(host + ":" + port); } public boolean isProvCollectionEnabled() { @@ -340,32 +234,58 @@ public static int getMinutesUntilPasswordResetTokenExpires() { } /** - * The "official", designated URL of the site; - * can be defined as a complete URL; or derived from the - * "official" hostname. If none of these options is set, - * defaults to the InetAddress.getLocalHOst() and https; - * These are legacy JVM options. Will be eventualy replaced - * by the Settings Service configuration. + * Lookup (or construct) the designated URL of this instance from configuration. + * + * Can be defined as a complete URL via dataverse.siteUrl; or derived from the hostname + * dataverse.fqdn and HTTPS. If none of these options is set, defaults to the + * {@link InetAddress#getLocalHost} and HTTPS. + * + * NOTE: This method does not provide any validation.
+ * TODO: The behaviour of this method is subject to a later change, see + * https://github.com/IQSS/dataverse/issues/6636 + * + * @return The designated URL of this instance as per configuration. */ public String getDataverseSiteUrl() { return getDataverseSiteUrlStatic(); } + /** + * Lookup (or construct) the designated URL of this instance from configuration. + * + * Can be defined as a complete URL via dataverse.siteUrl; or derived from the hostname + * dataverse.fqdn and HTTPS. If none of these options is set, defaults to the + * {@link InetAddress#getLocalHost} and HTTPS. + * + * NOTE: This method does not provide any validation. + * TODO: The behaviour of this method is subject to a later change, see + * https://github.com/IQSS/dataverse/issues/6636 + * + * @return The designated URL of this instance as per configuration. + */ public static String getDataverseSiteUrlStatic() { - String hostUrl = System.getProperty(SITE_URL); - if (hostUrl != null && !"".equals(hostUrl)) { - return hostUrl; + // If dataverse.siteUrl has been configured, simply return it + Optional<String> siteUrl = JvmSettings.SITE_URL.lookupOptional(); + if (siteUrl.isPresent()) { + return siteUrl.get(); } - String hostName = System.getProperty(FQDN); - if (hostName == null) { - try { - hostName = InetAddress.getLocalHost().getCanonicalHostName(); - } catch (UnknownHostException e) { - return null; - } + + // Otherwise try to look up the dataverse.fqdn setting and default to HTTPS + Optional<String> fqdn = JvmSettings.FQDN.lookupOptional(); + if (fqdn.isPresent()) { + return "https://" + fqdn.get(); + } + + // Last resort - get the server's local name and use it. + // BEWARE - this is dangerous. + // 1) A server might have a different name than your repository URL. + // 2) The underlying reverse DNS lookup might point to a different name than your repository URL. + // 3) If this server has multiple IPs assigned, which one will it be for the lookup?
+ try { + return "https://" + InetAddress.getLocalHost().getCanonicalHostName(); + } catch (UnknownHostException e) { + return null; } - hostUrl = "https://" + hostName; - return hostUrl; } /** @@ -375,22 +295,6 @@ public String getPageURLWithQueryString() { return PrettyContext.getCurrentInstance().getRequestURL().toURL() + PrettyContext.getCurrentInstance().getRequestQueryString().toQueryString(); } - /** - * The "official" server's fully-qualified domain name: - */ - public String getDataverseServer() { - // still reliese on a JVM option: - String fqdn = System.getProperty(FQDN); - if (fqdn == null) { - try { - fqdn = InetAddress.getLocalHost().getCanonicalHostName(); - } catch (UnknownHostException e) { - return null; - } - } - return fqdn; - } - public String getGuidesBaseUrl() { String saneDefault = "https://guides.dataverse.org"; String guidesBaseUrl = settingsService.getValueForKey(SettingsServiceBean.Key.GuidesBaseUrl, saneDefault); @@ -1090,11 +994,6 @@ public boolean isDatafileValidationOnPublishEnabled() { public boolean directUploadEnabled(DvObjectContainer container) { return Boolean.getBoolean("dataverse.files." + container.getEffectiveStorageDriverId() + ".upload-redirect"); } - - public String getDataCiteRestApiUrlString() { - //As of 5.0 the 'doi.dataciterestapiurlstring' is the documented jvm option. Prior versions used 'doi.mdcbaseurlstring' or were hardcoded to api.datacite.org, so the defaults are for backward compatibility. 
- return System.getProperty("doi.dataciterestapiurlstring", System.getProperty("doi.mdcbaseurlstring", "https://api.datacite.org")); - } public boolean isExternalDataverseValidationEnabled() { return settingsService.getValueForKey(SettingsServiceBean.Key.DataverseMetadataValidatorScript) != null; diff --git a/src/main/resources/META-INF/microprofile-config.properties b/src/main/resources/META-INF/microprofile-config.properties index 09d71dfbf3a..b5af966ee80 100644 --- a/src/main/resources/META-INF/microprofile-config.properties +++ b/src/main/resources/META-INF/microprofile-config.properties @@ -1,5 +1,35 @@ +# GENERAL +# Will be replaced by a Maven property in /target via filtering +dataverse.version=${project.version} +dataverse.build= + +# Default only for containers! (keep mimicking the current behaviour; +# changing that is part of https://github.com/IQSS/dataverse/issues/6636) +%ct.dataverse.fqdn=localhost +%ct.dataverse.siteUrl=http://${dataverse.fqdn}:8080 + +# FILES +dataverse.files.directory=/tmp/dataverse + +# SEARCH INDEX +dataverse.solr.host=localhost +# Activating mp config profile -Dmp.config.profile=ct changes default to "solr" as DNS name +%ct.dataverse.solr.host=solr +dataverse.solr.port=8983 +dataverse.solr.protocol=http +dataverse.solr.core=collection1 +dataverse.solr.path=/solr/${dataverse.solr.core} + # DATABASE dataverse.db.host=localhost dataverse.db.port=5432 dataverse.db.user=dataverse dataverse.db.name=dataverse + +# PERSISTENT IDENTIFIER PROVIDERS # EZID +dataverse.pid.ezid.api-url=https://ezid.cdlib.org + +# DataCite +dataverse.pid.datacite.mds-api-url=https://mds.test.datacite.org +dataverse.pid.datacite.rest-api-url=https://api.test.datacite.org \ No newline at end of file diff --git a/src/main/resources/META-INF/services/org.eclipse.microprofile.config.spi.ConfigSourceProvider b/src/main/resources/META-INF/services/org.eclipse.microprofile.config.spi.ConfigSourceProvider index f2e23ca1b4e..796f03d7ce3 100644 --- a/src/main/resources/META-INF/services/org.eclipse.microprofile.config.spi.ConfigSourceProvider
+++ b/src/main/resources/META-INF/services/org.eclipse.microprofile.config.spi.ConfigSourceProvider @@ -1,2 +1 @@ edu.harvard.iq.dataverse.settings.spi.AliasConfigSourceProvider -edu.harvard.iq.dataverse.settings.spi.DbSettingConfigSourceProvider diff --git a/src/test/java/edu/harvard/iq/dataverse/export/SchemaDotOrgExporterTest.java b/src/test/java/edu/harvard/iq/dataverse/export/SchemaDotOrgExporterTest.java index b5453e75fe5..7119dfaf834 100644 --- a/src/test/java/edu/harvard/iq/dataverse/export/SchemaDotOrgExporterTest.java +++ b/src/test/java/edu/harvard/iq/dataverse/export/SchemaDotOrgExporterTest.java @@ -6,9 +6,9 @@ import edu.harvard.iq.dataverse.license.LicenseServiceBean; import edu.harvard.iq.dataverse.mocks.MockDatasetFieldSvc; -import static edu.harvard.iq.dataverse.util.SystemConfig.SITE_URL; import static edu.harvard.iq.dataverse.util.SystemConfig.FILES_HIDE_SCHEMA_DOT_ORG_DOWNLOAD_URLS; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import edu.harvard.iq.dataverse.util.json.JsonParser; import edu.harvard.iq.dataverse.util.json.JsonUtil; @@ -31,6 +31,8 @@ import javax.json.Json; import javax.json.JsonObject; import javax.json.JsonReader; + +import edu.harvard.iq.dataverse.util.testing.JvmSetting; import org.junit.jupiter.api.BeforeAll; import org.junit.jupiter.api.AfterAll; import org.junit.jupiter.api.Test; @@ -64,6 +66,7 @@ public static void tearDownClass() { * Test of exportDataset method, of class SchemaDotOrgExporter.
*/ @Test + @JvmSetting(key = JvmSettings.SITE_URL, value = "https://librascholar.org") public void testExportDataset() throws Exception { File datasetVersionJson = new File("src/test/resources/json/dataset-finch2.json"); String datasetVersionAsJson = new String(Files.readAllBytes(Paths.get(datasetVersionJson.getAbsolutePath()))); @@ -92,7 +95,6 @@ public void testExportDataset() throws Exception { Dataverse dataverse = new Dataverse(); dataverse.setName("LibraScholar"); dataset.setOwner(dataverse); - System.setProperty(SITE_URL, "https://librascholar.org"); boolean hideFileUrls = false; if (hideFileUrls) { System.setProperty(FILES_HIDE_SCHEMA_DOT_ORG_DOWNLOAD_URLS, "true"); diff --git a/src/test/java/edu/harvard/iq/dataverse/search/IndexServiceBeanTest.java b/src/test/java/edu/harvard/iq/dataverse/search/IndexServiceBeanTest.java index ad4647e4898..aab6af660cb 100644 --- a/src/test/java/edu/harvard/iq/dataverse/search/IndexServiceBeanTest.java +++ b/src/test/java/edu/harvard/iq/dataverse/search/IndexServiceBeanTest.java @@ -1,18 +1,5 @@ package edu.harvard.iq.dataverse.search; -import static org.junit.Assert.assertTrue; - -import java.io.IOException; -import java.util.Arrays; -import java.util.Set; -import java.util.logging.Logger; -import java.util.stream.Collectors; - -import org.apache.solr.client.solrj.SolrServerException; -import org.junit.Before; -import org.junit.Test; -import org.mockito.Mockito; - import edu.harvard.iq.dataverse.ControlledVocabularyValue; import edu.harvard.iq.dataverse.Dataset; import edu.harvard.iq.dataverse.DatasetField; @@ -26,21 +13,47 @@ import edu.harvard.iq.dataverse.MetadataBlock; import edu.harvard.iq.dataverse.branding.BrandingUtil; import edu.harvard.iq.dataverse.mocks.MocksFactory; +import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import edu.harvard.iq.dataverse.util.SystemConfig; +import edu.harvard.iq.dataverse.util.testing.JvmSetting; +import org.apache.solr.client.solrj.SolrServerException;
+import org.apache.solr.client.solrj.impl.HttpSolrClient; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.InjectMocks; +import org.mockito.Mock; +import org.mockito.Mockito; +import org.mockito.junit.jupiter.MockitoExtension; +import java.io.IOException; +import java.util.Arrays; +import java.util.Set; +import java.util.logging.Logger; +import java.util.stream.Collectors; + +import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertTrue; + +@ExtendWith(MockitoExtension.class) public class IndexServiceBeanTest { private static final Logger logger = Logger.getLogger(IndexServiceBeanTest.class.getCanonicalName()); private IndexServiceBean indexService; private Dataverse dataverse; - @Before + @Mock + private SettingsServiceBean settingsService; + @InjectMocks + private SystemConfig systemConfig = new SystemConfig(); + + @BeforeEach public void setUp() { dataverse = MocksFactory.makeDataverse(); dataverse.setDataverseType(DataverseType.UNCATEGORIZED); indexService = new IndexServiceBean(); - indexService.systemConfig = new SystemConfig(); + indexService.systemConfig = systemConfig; indexService.settingsService = Mockito.mock(SettingsServiceBean.class); indexService.dataverseService = Mockito.mock(DataverseServiceBean.class); indexService.datasetFieldService = Mockito.mock(DatasetFieldServiceBean.class); @@ -48,6 +61,36 @@ public void setUp() { Mockito.when(indexService.dataverseService.findRootDataverse()).thenReturn(dataverse); } + + @Test + public void testInitWithDefaults() { + // given + String url = "http://localhost:8983/solr/collection1"; + + // when + indexService.init(); + + // then + HttpSolrClient client = (HttpSolrClient) indexService.solrServer; + assertEquals(url, client.getBaseURL()); + } + + + @Test + @JvmSetting(key = JvmSettings.SOLR_HOST, value
= "foobar") + @JvmSetting(key = JvmSettings.SOLR_PORT, value = "1234") + @JvmSetting(key = JvmSettings.SOLR_CORE, value = "test") + void testInitWithConfig() { + // given + String url = "http://foobar:1234/solr/test"; + + // when + indexService.init(); + + // then + HttpSolrClient client = (HttpSolrClient) indexService.solrServer; + assertEquals(url, client.getBaseURL()); + } @Test public void TestIndexing() throws SolrServerException, IOException { diff --git a/src/test/java/edu/harvard/iq/dataverse/search/SolrClientServiceTest.java b/src/test/java/edu/harvard/iq/dataverse/search/SolrClientServiceTest.java new file mode 100644 index 00000000000..a3b3c8a2080 --- /dev/null +++ b/src/test/java/edu/harvard/iq/dataverse/search/SolrClientServiceTest.java @@ -0,0 +1,59 @@ +package edu.harvard.iq.dataverse.search; + +import edu.harvard.iq.dataverse.settings.JvmSettings; +import edu.harvard.iq.dataverse.settings.SettingsServiceBean; +import edu.harvard.iq.dataverse.util.SystemConfig; +import edu.harvard.iq.dataverse.util.testing.JvmSetting; +import org.apache.solr.client.solrj.impl.HttpSolrClient; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; +import org.mockito.InjectMocks; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; + +import static org.junit.jupiter.api.Assertions.assertEquals; + +@ExtendWith(MockitoExtension.class) +class SolrClientServiceTest { + + @Mock + SettingsServiceBean settingsServiceBean; + @InjectMocks + SystemConfig systemConfig; + SolrClientService clientService = new SolrClientService(); + + @BeforeEach + void setUp() { + clientService.systemConfig = systemConfig; + } + + @Test + void testInitWithDefaults() { + // given + String url = "http://localhost:8983/solr/collection1"; + + // when + clientService.init(); + + // then + HttpSolrClient client = (HttpSolrClient) clientService.getSolrClient(); + assertEquals(url, client.getBaseURL()); + } 
+ + @Test + @JvmSetting(key = JvmSettings.SOLR_HOST, value = "foobar") + @JvmSetting(key = JvmSettings.SOLR_PORT, value = "1234") + @JvmSetting(key = JvmSettings.SOLR_CORE, value = "test") + void testInitWithConfig() { + // given + String url = "http://foobar:1234/solr/test"; + + // when + clientService.init(); + + // then + HttpSolrClient client = (HttpSolrClient) clientService.getSolrClient(); + assertEquals(url, client.getBaseURL()); + } +} \ No newline at end of file diff --git a/src/test/java/edu/harvard/iq/dataverse/settings/source/AliasConfigSourceTest.java b/src/test/java/edu/harvard/iq/dataverse/settings/source/AliasConfigSourceTest.java deleted file mode 100644 index 36c4d99f743..00000000000 --- a/src/test/java/edu/harvard/iq/dataverse/settings/source/AliasConfigSourceTest.java +++ /dev/null @@ -1,41 +0,0 @@ -package edu.harvard.iq.dataverse.settings.source; - -import org.junit.jupiter.api.Test; - -import java.io.IOException; -import java.util.Properties; - -import static org.junit.jupiter.api.Assertions.*; - -class AliasConfigSourceTest { - - AliasConfigSource source = new AliasConfigSource(); - - @Test - void getValue() { - // given - System.setProperty("dataverse.hello.foobar", "test"); - Properties aliases = new Properties(); - aliases.setProperty("dataverse.goodbye.foobar", "dataverse.hello.foobar"); - - // when - source.importAliases(aliases); - - // then - assertEquals("test", source.getValue("dataverse.goodbye.foobar")); - } - - @Test - void readImportTestAliasesFromFile() throws IOException { - // given - System.setProperty("dataverse.old.example", "test"); - String filePath = "test-microprofile-aliases.properties"; - - // when - Properties aliases = source.readAliases(filePath); - source.importAliases(aliases); - - // then - assertEquals("test", source.getValue("dataverse.new.example")); - } -} \ No newline at end of file diff --git a/src/test/java/edu/harvard/iq/dataverse/settings/source/DbSettingConfigSourceTest.java 
b/src/test/java/edu/harvard/iq/dataverse/settings/source/DbSettingConfigSourceTest.java deleted file mode 100644 index 9ceca24aadf..00000000000 --- a/src/test/java/edu/harvard/iq/dataverse/settings/source/DbSettingConfigSourceTest.java +++ /dev/null @@ -1,48 +0,0 @@ -package edu.harvard.iq.dataverse.settings.source; - -import edu.harvard.iq.dataverse.settings.Setting; -import edu.harvard.iq.dataverse.settings.SettingsServiceBean; -import org.junit.jupiter.api.BeforeEach; -import org.junit.jupiter.api.MethodOrderer.OrderAnnotation; -import org.junit.jupiter.api.Order; -import org.junit.jupiter.api.Test; -import org.junit.jupiter.api.TestMethodOrder; -import org.junit.jupiter.api.extension.ExtendWith; -import org.mockito.Mock; -import org.mockito.Mockito; -import org.mockito.junit.jupiter.MockitoExtension; - -import java.util.Arrays; -import java.util.HashSet; -import java.util.Set; - -import static org.junit.jupiter.api.Assertions.*; - -@ExtendWith(MockitoExtension.class) -@TestMethodOrder(OrderAnnotation.class) -class DbSettingConfigSourceTest { - - DbSettingConfigSource dbSource = new DbSettingConfigSource(); - @Mock - SettingsServiceBean settingsSvc; - - @Test - @Order(1) - void testEmptyIfNoSettingsService() { - assertEquals(null, dbSource.getValue("foobar")); - assertDoesNotThrow(DbSettingConfigSource::updateProperties); - } - - @Test - @Order(2) - void testDataRetrieval() { - Set settings = new HashSet<>(Arrays.asList(new Setting(":FooBar", "hello"), new Setting(":FooBarI18N", "de", "hallo"))); - Mockito.when(settingsSvc.listAll()).thenReturn(settings); - - DbSettingConfigSource.injectSettingsService(settingsSvc); - - assertEquals("hello", dbSource.getValue("dataverse.settings.fromdb.FooBar")); - assertEquals("hallo", dbSource.getValue("dataverse.settings.fromdb.FooBarI18N.de")); - } - -} \ No newline at end of file diff --git a/src/test/java/edu/harvard/iq/dataverse/util/SystemConfigTest.java b/src/test/java/edu/harvard/iq/dataverse/util/SystemConfigTest.java 
index 891b029f521..8c944ff0a5b 100644 --- a/src/test/java/edu/harvard/iq/dataverse/util/SystemConfigTest.java +++ b/src/test/java/edu/harvard/iq/dataverse/util/SystemConfigTest.java @@ -1,13 +1,103 @@ package edu.harvard.iq.dataverse.util; +import edu.harvard.iq.dataverse.settings.JvmSettings; +import edu.harvard.iq.dataverse.settings.SettingsServiceBean; +import edu.harvard.iq.dataverse.util.testing.JvmSetting; import org.junit.jupiter.api.Test; +import org.junit.jupiter.api.extension.ExtendWith; import org.junit.jupiter.params.ParameterizedTest; import org.junit.jupiter.params.provider.CsvSource; +import org.mockito.InjectMocks; +import org.mockito.Mock; +import org.mockito.junit.jupiter.MockitoExtension; import static org.junit.jupiter.api.Assertions.assertEquals; +import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.mockito.Mockito.doReturn; +@ExtendWith(MockitoExtension.class) class SystemConfigTest { + @InjectMocks + SystemConfig systemConfig = new SystemConfig(); + @Mock + SettingsServiceBean settingsService; + + @Test + @JvmSetting(key = JvmSettings.VERSION, value = "100.100") + void testGetVersion() { + // given + String version = "100.100"; + + // when + String result = systemConfig.getVersion(false); + + // then + assertEquals(version, result); + } + + @Test + @JvmSetting(key = JvmSettings.VERSION, value = "100.100") + @JvmSetting(key = JvmSettings.BUILD, value = "FOOBAR") + void testGetVersionWithBuild() { + // given + String version = "100.100"; + String build = "FOOBAR"; + + // when + String result = systemConfig.getVersion(true); + + // then + assertTrue(result.startsWith(version), "'" + result + "' not starting with " + version); + assertTrue(result.contains("build")); + + // Cannot test the build number here - a bundle file may be present, which is not under test control + //assertTrue(result.endsWith(build), "'" + result + "' not ending with " + build); + } + + @Test + @JvmSetting(key = 
JvmSettings.SOLR_HOST, value = "foobar") + @JvmSetting(key = JvmSettings.SOLR_PORT, value = "1234") + void testGetSolrHostColonPortNoDBEntry() { + // given + String hostPort = "foobar:1234"; + + // when + doReturn(null).when(settingsService).getValueForKey(SettingsServiceBean.Key.SolrHostColonPort); + String result = systemConfig.getSolrHostColonPort(); + + // then + assertEquals(hostPort, result); + } + + @Test + @JvmSetting(key = JvmSettings.SOLR_HOST, value = "foobar") + @JvmSetting(key = JvmSettings.SOLR_PORT, value = "1234") + void testGetSolrHostColonPortWithDBEntry() { + // given + String dbEntry = "hello:4321"; + + // when + doReturn(dbEntry).when(settingsService).getValueForKey(SettingsServiceBean.Key.SolrHostColonPort); + String result = systemConfig.getSolrHostColonPort(); + + // then + assertEquals(dbEntry, result); + } + + @Test + void testGetSolrHostColonPortDefault() { + // given + String hostPort = "localhost:8983"; + + // when + doReturn(null).when(settingsService).getValueForKey(SettingsServiceBean.Key.SolrHostColonPort); + String result = systemConfig.getSolrHostColonPort(); + + // then + assertEquals(hostPort, result); + } + @Test void testGetLongLimitFromStringOrDefault_withNullInput() { long defaultValue = 5L; diff --git a/src/test/java/edu/harvard/iq/dataverse/util/testing/JvmSetting.java b/src/test/java/edu/harvard/iq/dataverse/util/testing/JvmSetting.java new file mode 100644 index 00000000000..a8c0e1f7481 --- /dev/null +++ b/src/test/java/edu/harvard/iq/dataverse/util/testing/JvmSetting.java @@ -0,0 +1,62 @@ +package edu.harvard.iq.dataverse.util.testing; + +import org.junit.jupiter.api.extension.ExtendWith; +import org.junit.jupiter.api.parallel.ResourceAccessMode; +import org.junit.jupiter.api.parallel.ResourceLock; +import org.junit.jupiter.api.parallel.Resources; + +import java.lang.annotation.ElementType; +import java.lang.annotation.Inherited; +import java.lang.annotation.Repeatable; +import java.lang.annotation.Retention; +import 
java.lang.annotation.RetentionPolicy; +import java.lang.annotation.Target; + + +/** + * {@code @JvmSetting} is a JUnit Jupiter extension to set the value of a + * JVM setting (internally a system property) for a test execution. + * + * <p>The key and value of the JVM setting to be set must be specified via + * {@link #key()} and {@link #value()}. After the annotated method has been + * executed, the initial default value is restored.</p> + * + * <p>{@code @JvmSetting} can be used on the method and on the class level. + * It is repeatable and inherited from higher-level containers. If a class is + * annotated, the configured property will be set before every test inside that + * class. Any method level configurations will override the class level + * configurations.</p>
+ * + * Parallel execution of tests using this extension is prohibited by using + * resource locking provided by JUnit 5 - system properties are a global state, + * these tests NEED to run serially. + */ +@Retention(RetentionPolicy.RUNTIME) +@Target({ ElementType.METHOD, ElementType.TYPE }) +@Inherited +@Repeatable(JvmSetting.JvmSettings.class) +@ExtendWith(JvmSettingExtension.class) +@ResourceLock(value = Resources.SYSTEM_PROPERTIES, mode = ResourceAccessMode.READ_WRITE) +public @interface JvmSetting { + + /** + * The key of the system property to be set. + */ + edu.harvard.iq.dataverse.settings.JvmSettings key(); + + /** + * The value of the system property to be set. + */ + String value(); + + /** + * Containing annotation of repeatable {@code @JvmSetting}. + */ + @Retention(RetentionPolicy.RUNTIME) + @Target({ ElementType.METHOD, ElementType.TYPE }) + @Inherited + @ExtendWith(JvmSettingExtension.class) + @interface JvmSettings { + JvmSetting[] value(); + } +} diff --git a/src/test/java/edu/harvard/iq/dataverse/util/testing/JvmSettingExtension.java b/src/test/java/edu/harvard/iq/dataverse/util/testing/JvmSettingExtension.java new file mode 100644 index 00000000000..de18df0f575 --- /dev/null +++ b/src/test/java/edu/harvard/iq/dataverse/util/testing/JvmSettingExtension.java @@ -0,0 +1,49 @@ +package edu.harvard.iq.dataverse.util.testing; + +import org.junit.jupiter.api.extension.AfterTestExecutionCallback; +import org.junit.jupiter.api.extension.BeforeTestExecutionCallback; +import org.junit.jupiter.api.extension.ExtensionContext; + +public class JvmSettingExtension implements BeforeTestExecutionCallback, AfterTestExecutionCallback { + + private ExtensionContext.Store getStore(ExtensionContext context) { + return context.getStore(ExtensionContext.Namespace.create(getClass(), context.getRequiredTestClass(), context.getRequiredTestMethod())); + } + + @Override + public void beforeTestExecution(ExtensionContext extensionContext) throws Exception { +
extensionContext.getTestMethod().ifPresent(method -> { + JvmSetting[] settings = method.getAnnotationsByType(JvmSetting.class); + for (JvmSetting setting : settings) { + // get the setting ... + String oldSetting = System.getProperty(setting.key().getScopedKey()); + + // if present - store in context to restore later + if (oldSetting != null) { + getStore(extensionContext).put(setting.key().getScopedKey(), oldSetting); + } + + // set to new value + System.setProperty(setting.key().getScopedKey(), setting.value()); + } + }); + } + + @Override + public void afterTestExecution(ExtensionContext extensionContext) throws Exception { + extensionContext.getTestMethod().ifPresent(method -> { + JvmSetting[] settings = method.getAnnotationsByType(JvmSetting.class); + for (JvmSetting setting : settings) { + // get a stored setting from context + String oldSetting = getStore(extensionContext).remove(setting.key().getScopedKey(), String.class); + // if present before, restore + if (oldSetting != null) { + System.setProperty(setting.key().getScopedKey(), oldSetting); + // if NOT present before, delete + } else { + System.clearProperty(setting.key().getScopedKey()); + } + } + }); + } +} diff --git a/src/test/resources/META-INF/microprofile-config.properties b/src/test/resources/META-INF/microprofile-config.properties new file mode 100644 index 00000000000..21f70b53896 --- /dev/null +++ b/src/test/resources/META-INF/microprofile-config.properties @@ -0,0 +1,11 @@ +# DEFAULTS FOR TESTS +# Unlike src/main/resources/META-INF/microprofile-config.properties, this file will not be included in +# a packaged WAR. It can be used to provide sane defaults for things like unit tests on classes requiring +# some sort of configuration. + +# PersistentIdentifierServiceBeanTest loads all the providers, which makes the EZID provider reach out +# to the service - switching to example.org keeps test executions from DDoSing the real endpoint.
+dataverse.pid.ezid.api-url=http://example.org +# EZID also requires a username and a password in production; provide defaults for unit testing. +dataverse.pid.ezid.username=Dataverse Unit Test +dataverse.pid.ezid.password=supersecret \ No newline at end of file
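
Note for reviewers: the new `@JvmSetting` annotation plus `JvmSettingExtension` boil down to a save/override/restore cycle on process-global `System` properties. The mechanism can be sketched standalone with plain JDK calls — the class name `JvmSettingSketch` and the property key `dataverse.test.example` below are made up for illustration and are not part of the patch:

```java
// Standalone sketch of the save/restore logic in JvmSettingExtension,
// using only java.lang.System. Class name and property key are illustrative.
public class JvmSettingSketch {

    private static final String KEY = "dataverse.test.example";
    private static String saved;

    // beforeTestExecution: remember the current value (may be null), then override it
    public static void before(String testValue) {
        saved = System.getProperty(KEY);
        System.setProperty(KEY, testValue);
    }

    // afterTestExecution: restore the previous value, or clear the key if none existed
    public static void after() {
        if (saved != null) {
            System.setProperty(KEY, saved);
        } else {
            System.clearProperty(KEY);
        }
    }

    public static void main(String[] args) {
        before("foobar");
        String duringTest = System.getProperty(KEY); // "foobar" while the test runs
        after();
        String afterTest = System.getProperty(KEY);  // back to null afterwards
        System.out.println(duringTest + " / " + afterTest); // prints "foobar / null"
    }
}
```

Because system properties are shared across the whole JVM, the real extension additionally takes a JUnit 5 `@ResourceLock` on `Resources.SYSTEM_PROPERTIES`, which is why tests using `@JvmSetting` never run in parallel.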