7000 mpconfig for JVM settings #8810

Closed
wants to merge 25 commits
Changes from all commits
156f81f
fix(settings): remove DbConfigSource #7680
poikilotherm Jun 20, 2022
0ab159d
deps(settings): switch to Smallrye MP config #7000
poikilotherm Jun 20, 2022
436cd66
feat(settings): start new enum JvmSettings #7000
poikilotherm Jun 20, 2022
5e63b0f
feat(settings): filter MPCONFIG for Maven properties #7000
poikilotherm Jun 20, 2022
4f1964e
feat(settings): add Dataverse version to MPCONFIG #7000
poikilotherm Jun 20, 2022
4cd3573
refactor(settings): simplify SystemConfig.getVersion #7000
poikilotherm Jun 20, 2022
94b8d88
feat(tests): add JUnit5 JVM setting helper
poikilotherm Jun 21, 2022
91ccdc4
docs(dev): add notes about @JvmSetting for unit tests
poikilotherm Jun 21, 2022
770dd23
refactor(settings): make Solr endpoint configurable via MPCONFIG #7000
poikilotherm Jun 21, 2022
edb0b64
feat(settings,solr): make Solr URL details configurable
poikilotherm Jun 21, 2022
7c98cf1
docs(settings): add Solr MPCONFIG options to guides #7000
poikilotherm Jun 21, 2022
561c965
feat(settings): add fluent JvmSettings.lookup() #7000
poikilotherm Jun 22, 2022
983e49f
refactor(settings): use new JvmSettings.lookup() #7000
poikilotherm Jun 22, 2022
a5aab2d
refactor(settings): replace lookups of dataverse.files.directory with…
poikilotherm Jun 22, 2022
ae8368c
docs(settings): provide more detail for dataverse.files.directory
poikilotherm Jun 22, 2022
6e7978c
refactor(settings): store scoped key on construction for JvmSettings
poikilotherm Jun 23, 2022
a2edcf0
docs(settings): mark :SolrHostColonPort with @Deprecated #7000
poikilotherm Jun 23, 2022
8cb8d97
feat(settings): add old names / alias support in JvmSettings #7000
poikilotherm Jun 23, 2022
2d84352
refactor(settings): replace dataverse.fqdn and siteUrl lookups via MP…
poikilotherm Jun 27, 2022
5278a82
docs(settings): update fqdn and siteUrl desc
poikilotherm Jun 27, 2022
065ec61
style: replace system prop 'file.separator' with File.separator
poikilotherm Jun 27, 2022
b43315a
refactor(settings): rename and retrieve EZID/DataCite settings via MP…
poikilotherm Jun 27, 2022
51c2793
docs(api): fix wrong position of PID API anchor
poikilotherm Jun 29, 2022
c541538
docs(settings): update DOI provider JVM options #7000
poikilotherm Jun 29, 2022
4e41d25
test(pid): add defaults for EZID via MPCONFIG
poikilotherm Jun 29, 2022
4 changes: 3 additions & 1 deletion doc/sphinx-guides/source/admin/make-data-count.rst
@@ -146,7 +146,9 @@ Configuring Your Dataverse Installation for Make Data Count Citations

Please note: as explained in the note above about limitations, this feature is not available to Dataverse installations that use Handles.

To configure your Dataverse installation to pull citations from the test vs. production DataCite server see :ref:`doi.dataciterestapiurlstring` in the Installation Guide.
To configure your Dataverse installation to pull citations from the test vs.
production DataCite server see :ref:`dataverse.pid.datacite.rest-api-url` in
the Installation Guide.

Please note that in the curl example, Bash environment variables are used with the idea that you can set a few environment variables and copy and paste the examples as is. For example, "$DOI" could become "doi:10.5072/FK2/BL2IBM" by issuing the following export command from Bash:

8 changes: 7 additions & 1 deletion doc/sphinx-guides/source/api/native-api.rst
@@ -485,6 +485,8 @@ You should expect an HTTP 200 ("OK") response and JSON indicating the database I

.. note:: Only a Dataverse installation account with superuser permissions is allowed to include files when creating a dataset via this API. Adding files this way only adds their file metadata to the database, you will need to manually add the physical files to the file system.

.. _api-import-dataset:

Import a Dataset into a Dataverse Collection
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

@@ -2932,7 +2934,7 @@ Each user can get a dump of their basic information in JSON format by passing in

curl -H "X-Dataverse-key:$API_TOKEN" $SERVER_URL/api/users/:me

.. _pids-api:


Managing Harvesting Server and Sets
-----------------------------------
@@ -3043,6 +3045,10 @@ The fully expanded example above (without the environment variables) looks like

Only users with superuser permissions may delete harvesting sets.



.. _pids-api:

PIDs
----

11 changes: 11 additions & 0 deletions doc/sphinx-guides/source/developers/testing.rst
@@ -79,6 +79,17 @@ greatly extended parameterized testing. Some guidance how to write those:
- https://blog.codefx.org/libraries/junit-5-parameterized-tests/
- See also some examples in our codebase.

JUnit 5 Test Helper Extensions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Our codebase provides small helper extensions to ease dealing with state during tests.
Some tests need to change state that must be restored after the test has run.

For unit tests, the most common case is setting a JVM system property just for the
current test. Please use the ``@JvmSetting(key = JvmSettings.XXX, value = "")`` annotation
on a test method or a test class to set and clear the property automatically.


Observing Changes to Code Coverage
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

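The ``@JvmSetting`` test helper added on this branch boils down to remembering a system property, setting it for the duration of the test, and restoring or clearing it afterwards. Below is a minimal, self-contained sketch of that set-and-restore pattern; the class and method names (`JvmSettingSketch`, `withSetting`) are illustrative, not part of the Dataverse codebase.

```java
import java.util.function.Supplier;

public class JvmSettingSketch {

    /** Runs an action with a system property set, restoring the previous state afterwards. */
    public static <T> T withSetting(String key, String value, Supplier<T> action) {
        String previous = System.getProperty(key); // remember the prior state
        System.setProperty(key, value);
        try {
            return action.get();
        } finally {
            if (previous == null) {
                System.clearProperty(key);         // the property did not exist before
            } else {
                System.setProperty(key, previous); // restore the old value
            }
        }
    }

    public static void main(String[] args) {
        String seen = withSetting("dataverse.example.setting", "on",
                () -> System.getProperty("dataverse.example.setting"));
        System.out.println(seen);                                            // on
        System.out.println(System.getProperty("dataverse.example.setting")); // null
    }
}
```

The annotation-based extension does the same around each test method, so a test never leaks a JVM setting into the next one.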
402 changes: 329 additions & 73 deletions doc/sphinx-guides/source/installation/config.rst

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion modules/dataverse-parent/pom.xml
@@ -164,7 +164,7 @@

<!-- Testing dependencies -->
<testcontainers.version>1.15.0</testcontainers.version>
<microbean-mpconfig.version>0.4.1</microbean-mpconfig.version>
<smallrye-mpconfig.version>2.10.1</smallrye-mpconfig.version>

<junit.version>4.13.1</junit.version>
<junit.jupiter.version>5.7.0</junit.jupiter.version>
15 changes: 11 additions & 4 deletions pom.xml
@@ -601,9 +601,9 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.microbean</groupId>
<artifactId>microbean-microprofile-config</artifactId>
<version>${microbean-mpconfig.version}</version>
<groupId>io.smallrye.config</groupId>
<artifactId>smallrye-config</artifactId>
<version>${smallrye-mpconfig.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
@@ -641,10 +641,17 @@
<include>**/*.xml</include>
<include>**/firstNames/*.*</include>
<include>**/*.xsl</include>
<include>**/*.properties</include>
<include>**/services/*</include>
</includes>
</resource>
<resource>
<directory>src/main/resources</directory>
<!-- Filter files matching here for Maven properties and replace -->
<filtering>true</filtering>
<includes>
<include>**/*.properties</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
@@ -23,6 +23,8 @@
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.TypedQuery;

import edu.harvard.iq.dataverse.settings.JvmSettings;
import org.apache.commons.text.StringEscapeUtils;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
@@ -53,7 +55,11 @@ public class DOIDataCiteRegisterService {

private DataCiteRESTfullClient getClient() throws IOException {
if (client == null) {
client = new DataCiteRESTfullClient(System.getProperty("doi.baseurlstring"), System.getProperty("doi.username"), System.getProperty("doi.password"));
client = new DataCiteRESTfullClient(
JvmSettings.DATACITE_MDS_API_URL.lookup(),
JvmSettings.DATACITE_USERNAME.lookup(),
JvmSettings.DATACITE_PASSWORD.lookup()
);
}
return client;
}
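Across these files the PR replaces scattered `System.getProperty("doi.…")` calls with enum constants and a fluent `lookup()`. The real `JvmSettings` enum is backed by MicroProfile Config (SmallRye) and stores its scoped key on construction; the sketch below shows only the pattern, using plain system properties. The constant and key names mirror ones seen in this diff, but the exact key spellings here are assumptions, and the `/tmp/files` default comes from the removed `Dataset.getFileSystemDirectory()` code.

```java
// Illustrative enum-based settings registry with a fluent lookup().
// Not the actual Dataverse implementation, which delegates to MicroProfile Config.
public enum SettingsSketch {
    DATACITE_USERNAME("dataverse.pid.datacite.username"),
    DATACITE_PASSWORD("dataverse.pid.datacite.password"),
    FILES_DIRECTORY("dataverse.files.directory", "/tmp/files");

    private final String scopedKey;
    private final String defaultValue;

    SettingsSketch(String scopedKey) { this(scopedKey, null); }

    SettingsSketch(String scopedKey, String defaultValue) {
        this.scopedKey = scopedKey;       // store the fully scoped key on construction
        this.defaultValue = defaultValue;
    }

    /** Required lookup: throws a runtime exception when the setting is absent and has no default. */
    public String lookup() {
        String value = System.getProperty(scopedKey, defaultValue);
        if (value == null) {
            throw new IllegalStateException("Setting " + scopedKey + " is not configured");
        }
        return value;
    }

    public static void main(String[] args) {
        System.setProperty("dataverse.pid.datacite.username", "demo");
        System.out.println(DATACITE_USERNAME.lookup()); // demo
        System.out.println(FILES_DIRECTORY.lookup());   // /tmp/files (falls back to the default)
    }
}
```

Centralizing keys in one enum gives a single place for renames and old-name aliases, which is exactly what the later commits in this PR add.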
@@ -13,6 +13,7 @@
import javax.ejb.EJB;
import javax.ejb.Stateless;

import edu.harvard.iq.dataverse.settings.JvmSettings;
import org.apache.commons.httpclient.HttpException;
import org.apache.commons.httpclient.HttpStatus;

@@ -219,9 +220,9 @@ public void deleteIdentifier(DvObject dvObject) throws IOException, HttpExceptio
private void deleteDraftIdentifier(DvObject dvObject) throws IOException {

//ToDo - incorporate into DataCiteRESTfulClient
String baseUrl = systemConfig.getDataCiteRestApiUrlString();
String username = System.getProperty("doi.username");
String password = System.getProperty("doi.password");
String baseUrl = JvmSettings.DATACITE_REST_API_URL.lookup();
String username = JvmSettings.DATACITE_USERNAME.lookup();
String password = JvmSettings.DATACITE_PASSWORD.lookup();
GlobalId doi = dvObject.getGlobalId();
/**
* Deletes the DOI from DataCite if it can. Returns 204 if PID was deleted
@@ -1,5 +1,6 @@
package edu.harvard.iq.dataverse;

import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.ucsb.nceas.ezid.EZIDException;
import edu.ucsb.nceas.ezid.EZIDService;
import edu.ucsb.nceas.ezid.EZIDServiceRequest;
@@ -26,10 +27,10 @@ public class DOIEZIdServiceBean extends AbstractGlobalIdServiceBean {

public DOIEZIdServiceBean() {
logger.log(Level.FINE,"Constructor");
baseURLString = System.getProperty("doi.baseurlstring");
baseURLString = JvmSettings.EZID_API_URL.lookup();
ezidService = new EZIDService(baseURLString);
USERNAME = System.getProperty("doi.username");
PASSWORD = System.getProperty("doi.password");
USERNAME = JvmSettings.EZID_USERNAME.lookup();
PASSWORD = JvmSettings.EZID_PASSWORD.lookup();
logger.log(Level.FINE, "Using baseURLString {0}", baseURLString);
try {
ezidService.login(USERNAME, PASSWORD);
9 changes: 4 additions & 5 deletions src/main/java/edu/harvard/iq/dataverse/Dataset.java
@@ -33,6 +33,8 @@
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;

import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.util.BundleUtil;
import edu.harvard.iq.dataverse.util.StringUtil;

@@ -515,11 +517,8 @@ private Collection<String> getCategoryNames() {
@Deprecated
public Path getFileSystemDirectory() {
Path studyDir = null;

String filesRootDirectory = System.getProperty("dataverse.files.directory");
if (filesRootDirectory == null || filesRootDirectory.equals("")) {
filesRootDirectory = "/tmp/files";
}

String filesRootDirectory = JvmSettings.FILES_DIRECTORY.lookup();

if (this.getAlternativePersistentIndentifiers() != null && !this.getAlternativePersistentIndentifiers().isEmpty()) {
for (AlternativePersistentIdentifier api : this.getAlternativePersistentIndentifiers()) {
19 changes: 2 additions & 17 deletions src/main/java/edu/harvard/iq/dataverse/HandlenetServiceBean.java
@@ -24,8 +24,6 @@

import java.io.File;
import java.io.FileInputStream;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.*;
import java.util.logging.Level;
import java.util.logging.Logger;
@@ -34,6 +32,7 @@
import java.security.PrivateKey;

/* Handlenet imports: */
import edu.harvard.iq.dataverse.util.SystemConfig;
import net.handle.hdllib.AbstractMessage;
import net.handle.hdllib.AbstractResponse;
import net.handle.hdllib.AdminRecord;
@@ -247,21 +246,7 @@ private String getRegistrationUrl(DvObject dvObject) {
}

public String getSiteUrl() {
logger.log(Level.FINE,"getSiteUrl");
String hostUrl = System.getProperty("dataverse.siteUrl");
if (hostUrl != null && !"".equals(hostUrl)) {
return hostUrl;
}
String hostName = System.getProperty("dataverse.fqdn");
if (hostName == null) {
try {
hostName = InetAddress.getLocalHost().getCanonicalHostName();
} catch (UnknownHostException e) {
return null;
}
}
hostUrl = "https://" + hostName;
return hostUrl;
return SystemConfig.getDataverseSiteUrlStatic();
}

private byte[] readKey(final String file) {
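The `getSiteUrl()` body deleted above is now centralized behind `SystemConfig.getDataverseSiteUrlStatic()`. For reference, the fallback chain that the removed code implemented looks like this self-contained sketch (the class name `SiteUrlSketch` is illustrative; the property names and logic come directly from the removed lines):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class SiteUrlSketch {

    /** Fallback chain of the removed getSiteUrl(): explicit siteUrl, then
     *  FQDN, then the machine's canonical host name, prefixed with https://. */
    public static String getSiteUrl() {
        String hostUrl = System.getProperty("dataverse.siteUrl");
        if (hostUrl != null && !hostUrl.isEmpty()) {
            return hostUrl;                    // explicit site URL wins
        }
        String hostName = System.getProperty("dataverse.fqdn");
        if (hostName == null) {
            try {
                hostName = InetAddress.getLocalHost().getCanonicalHostName();
            } catch (UnknownHostException e) {
                return null;                   // no way to determine a host name
            }
        }
        return "https://" + hostName;
    }

    public static void main(String[] args) {
        System.setProperty("dataverse.fqdn", "demo.dataverse.org");
        System.out.println(getSiteUrl()); // https://demo.dataverse.org
    }
}
```

Moving this into one static method means Handle, DOI, and API code all derive the site URL the same way instead of each re-implementing the chain.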
3 changes: 2 additions & 1 deletion src/main/java/edu/harvard/iq/dataverse/api/Info.java
@@ -1,5 +1,6 @@
package edu.harvard.iq.dataverse.api;

import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.SystemConfig;
import javax.ejb.EJB;
@@ -44,7 +45,7 @@ public Response getInfo() {
@GET
@Path("server")
public Response getServer() {
return response( req -> ok(systemConfig.getDataverseServer()));
return response( req -> ok(JvmSettings.FQDN.lookup()));
}

@GET
@@ -6,6 +6,7 @@
import edu.harvard.iq.dataverse.makedatacount.DatasetExternalCitationsServiceBean;
import edu.harvard.iq.dataverse.makedatacount.DatasetMetrics;
import edu.harvard.iq.dataverse.makedatacount.DatasetMetricsServiceBean;
import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.util.SystemConfig;

import java.io.FileReader;
@@ -141,7 +142,10 @@ public Response updateCitationsForDataset(@PathParam("id") String id) throws Mal
// DataCite wants "doi=", not "doi:".
String authorityPlusIdentifier = persistentId.replaceFirst("doi:", "");
// Request max page size and then loop to handle multiple pages
URL url = new URL(systemConfig.getDataCiteRestApiUrlString() + "/events?doi=" + authorityPlusIdentifier + "&source=crossref&page[size]=1000");
URL url = new URL(JvmSettings.DATACITE_REST_API_URL.lookup() +
"/events?doi=" +
authorityPlusIdentifier +
"&source=crossref&page[size]=1000");
logger.fine("Retrieving Citations from " + url.toString());
boolean nextPage = true;
JsonArrayBuilder dataBuilder = Json.createArrayBuilder();
11 changes: 8 additions & 3 deletions src/main/java/edu/harvard/iq/dataverse/api/Pids.java
@@ -5,6 +5,7 @@
import edu.harvard.iq.dataverse.engine.command.impl.DeletePidCommand;
import edu.harvard.iq.dataverse.engine.command.impl.ReservePidCommand;
import edu.harvard.iq.dataverse.pidproviders.PidUtil;
import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.util.BundleUtil;
import java.util.Arrays;
import javax.ejb.Stateless;
@@ -46,9 +47,13 @@ public Response getPid(@QueryParam("persistentId") String persistentId) {
} catch (WrappedResponse ex) {
return error(Response.Status.FORBIDDEN, BundleUtil.getStringFromBundle("api.errors.invalidApiToken"));
}
String baseUrl = systemConfig.getDataCiteRestApiUrlString();
String username = System.getProperty("doi.username");
String password = System.getProperty("doi.password");

// FIXME: Even before changing to MPCONFIG retrieval, this was pinned to be DataCite specific!
// Should this be extended to EZID and other PID systems like Handle?
String baseUrl = JvmSettings.DATACITE_REST_API_URL.lookup();
String username = JvmSettings.DATACITE_USERNAME.lookup();
String password = JvmSettings.DATACITE_PASSWORD.lookup();

try {
JsonObjectBuilder result = PidUtil.queryDoi(persistentId, baseUrl, username, password);
return ok(result);
@@ -1,5 +1,6 @@
package edu.harvard.iq.dataverse.api.datadeposit;

import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.util.SystemConfig;
import java.io.File;
import java.util.Arrays;
@@ -86,37 +87,32 @@ public boolean storeAndCheckBinary() {

@Override
public String getTempDirectory() {
String tmpFileDir = System.getProperty(SystemConfig.FILES_DIRECTORY);
if (tmpFileDir != null) {
String swordDirString = tmpFileDir + File.separator + "sword";
File swordDirFile = new File(swordDirString);
/**
* @todo Do we really need this check? It seems like we do because
* if you create a dataset via the native API and then later try to
* upload a file via SWORD, the directory defined by
* dataverse.files.directory may not exist and we get errors deep in
* the SWORD library code. Could maybe use a try catch in the doPost
* method of our SWORDv2MediaResourceServlet.
*/
if (swordDirFile.exists()) {
// will throw a runtime exception when not found
String tmpFileDir = JvmSettings.FILES_DIRECTORY.lookup();

String swordDirString = tmpFileDir + File.separator + "sword";
File swordDirFile = new File(swordDirString);
/**
* @todo Do we really need this check? It seems like we do because
* if you create a dataset via the native API and then later try to
* upload a file via SWORD, the directory defined by
* dataverse.files.directory may not exist and we get errors deep in
* the SWORD library code. Could maybe use a try catch in the doPost
* method of our SWORDv2MediaResourceServlet.
*/
if (swordDirFile.exists()) {
return swordDirString;
} else {
boolean mkdirSuccess = swordDirFile.mkdirs();
if (mkdirSuccess) {
logger.info("Created directory " + swordDirString);
return swordDirString;
} else {
boolean mkdirSuccess = swordDirFile.mkdirs();
if (mkdirSuccess) {
logger.info("Created directory " + swordDirString);
return swordDirString;
} else {
String msgForSwordUsers = ("Could not determine or create SWORD temp directory. Check logs for details.");
logger.severe(msgForSwordUsers + " Failed to create " + swordDirString);
// sadly, must throw RunTimeException to communicate with SWORD user
throw new RuntimeException(msgForSwordUsers);
}
String msgForSwordUsers = ("Could not determine or create SWORD temp directory. Check logs for details.");
logger.severe(msgForSwordUsers + " Failed to create " + swordDirString);
// sadly, must throw RunTimeException to communicate with SWORD user
throw new RuntimeException(msgForSwordUsers);
}
} else {
String msgForSwordUsers = ("JVM option \"" + SystemConfig.FILES_DIRECTORY + "\" not defined. Check logs for details.");
logger.severe(msgForSwordUsers);
// sadly, must throw RunTimeException to communicate with SWORD user
throw new RuntimeException(msgForSwordUsers);
}
}

@@ -59,6 +59,7 @@

import org.apache.commons.io.IOUtils;

import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.sql.Timestamp;
@@ -79,7 +80,7 @@
@Dependent
public class FileRecordJobListener implements ItemReadListener, StepListener, JobListener {

public static final String SEP = System.getProperty("file.separator");
public static final String SEP = File.separator;

private static final UserNotification.Type notifyType = UserNotification.Type.FILESYSTEMIMPORT;

@@ -54,7 +54,7 @@
@Dependent
public class FileRecordReader extends AbstractItemReader {

public static final String SEP = System.getProperty("file.separator");
public static final String SEP = File.separator;

@Inject
JobContext jobContext;
@@ -154,8 +154,8 @@ public static Logger getJobLogger(String jobId) {
try {
Logger jobLogger = Logger.getLogger("job-"+jobId);
FileHandler fh;
String logDir = System.getProperty("com.sun.aas.instanceRoot") + System.getProperty("file.separator")
+ "logs" + System.getProperty("file.separator") + "batch-jobs" + System.getProperty("file.separator");
String logDir = System.getProperty("com.sun.aas.instanceRoot") + File.separator
+ "logs" + File.separator + "batch-jobs" + File.separator;
checkCreateLogDirectory( logDir );
fh = new FileHandler(logDir + "job-" + jobId + ".log");
logger.log(Level.INFO, "JOB LOG: " + logDir + "job-" + jobId + ".log");
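The last few hunks replace `System.getProperty("file.separator")` with the `java.io.File.separator` constant, which is resolved once and cannot be clobbered at runtime. A possible further step, not taken in this PR, is to build such paths with `java.nio.file.Paths` and skip manual separator handling entirely. The instance-root value below is an example only:

```java
import java.io.File;
import java.nio.file.Path;
import java.nio.file.Paths;

public class PathBuildingSketch {
    public static void main(String[] args) {
        String instanceRoot = "/opt/payara/glassfish/domains/domain1"; // example value only

        // Style used in the PR: concatenate with the File.separator constant
        // instead of looking up the "file.separator" system property each time.
        String logDir = instanceRoot + File.separator + "logs"
                + File.separator + "batch-jobs" + File.separator;
        System.out.println(logDir);

        // Equivalent with java.nio: separators are handled by the platform Path type.
        Path logPath = Paths.get(instanceRoot, "logs", "batch-jobs");
        System.out.println(logPath);
    }
}
```

Both forms produce a platform-correct path; the `Path` form additionally normalizes and composes cleanly with the rest of `java.nio.file`.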