Destroy Dataset: Cannot destroy a dataset that has a mapped shape file. #4093
We talked about this on Wednesday during sprint planning and I think we agreed we should fix it with a cascade delete. A related post from TDL just came in: https://groups.google.com/d/msg/dataverse-community/84beQsFuC9w/LZF_dHn-AAAJ
What about deleting the file from WorldMap?
Hmm. Well, they should start with the …
Once we fix this bug, we'd like to destroy this dataset: https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/T2VIXT
Another one, good test case in prod: https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/4WL81Z
I just made pull request #4245 and am moving this to Code Review at https://waffle.io/IQSS/dataverse
Moving this into develop to take a stab at utilizing cascade / JPA.
Moved the JPA mapping to the DataFile, which correctly triggers the cascade.
In addition to the currently committed code, I have an extra small piece that deletes the map from WorldMap itself, as that does not seem to happen in the current flows (I may have missed it, though). The piece of code I have still hits an error in tests, though, so it'll have to wait.
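For reference, the mapping change described above amounts to something like the following in JPA. This is a minimal sketch with assumed entity shapes and field names, not the actual committed code:

```java
// Minimal sketch of the cascade mapping; class shapes and field names are
// assumptions, not the actual Dataverse source. (In practice each entity
// lives in its own file.)
import java.util.List;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;

@Entity
public class DataFile {
    @Id
    @GeneratedValue
    private Long id;

    // With the mapping on the DataFile side, deleting a DataFile cascades
    // to its MapLayerMetadata rows, so the delete no longer violates the
    // fk_maplayermetadata_datafile_id foreign key from the server log.
    @OneToMany(mappedBy = "dataFile", cascade = CascadeType.REMOVE)
    private List<MapLayerMetadata> mapLayerMetadata;
}

@Entity
class MapLayerMetadata {
    @Id
    @GeneratedValue
    private Long id;

    // Owning side: this maps the datafile_id column referenced by the
    // foreign key constraint.
    @ManyToOne
    @JoinColumn(name = "datafile_id")
    private DataFile dataFile;
}
```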
If it fails, log and keep going.
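Roughly, that log-and-continue approach could look like the sketch below. WorldMapService and deleteLayer are placeholder names, not the real Dataverse API:

```java
// Illustrative only: WorldMapService and deleteLayer(...) are placeholder
// names for whatever client actually talks to WorldMap.
import java.util.logging.Level;
import java.util.logging.Logger;

public class WorldMapCleanup {
    private static final Logger logger =
        Logger.getLogger(WorldMapCleanup.class.getName());

    /**
     * Best-effort removal of the remote WorldMap layer: if the remote call
     * fails, log the failure and let the destroy proceed anyway.
     */
    public void deleteRemoteLayer(WorldMapService worldMap, long dataFileId) {
        try {
            worldMap.deleteLayer(dataFileId);
        } catch (Exception e) {
            logger.log(Level.WARNING,
                "Failed to delete WorldMap layer for DataFile " + dataFileId
                + "; continuing with destroy.", e);
        }
    }
}

// Placeholder interface so the sketch compiles on its own.
interface WorldMapService {
    void deleteLayer(long dataFileId) throws Exception;
}
```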
Thanks for the continued work on this. I mentioned in Sprint Planning earlier today that it's OK if we move this to the backlog instead of continuing to spend time here. If there's some value here (it seems to work for non-restricted files?), we should consider merging what we have instead of trying to track down all cases.
Below is my summary of the state of this branch. Without a local dev environment for this, it is hard to move forward. I've tried following the guides and working with Kevin to point my local instance at the dev geoconnect, but the connection fails. I could try setting up geoconnect locally, but that seems out of scope. I could also keep sending builds off to Kevin and testing remotely, but that's extremely slow and a poor use of time. The branch as-is could be pulled into development, but I don't like how little it actually improves things.

Current status

Works:

Broken:

Not in scope but broken:
One other option to move forward would be to just rely on manual steps to do this cleanup (a sketch of the database step follows below). This would mean:

For unrestricted datasets:

For restricted datasets:

This does not work for deletion of individual DataFiles, because that action can be performed by the end user. We have yet to try fixing that one with code, though, and it may be a simple code fix.
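On the database side, the manual cleanup amounts to deleting the maplayermetadata row(s) that reference the DataFile before rerunning the destroy. A rough one-off sketch, where the JDBC URL, credentials, and id are placeholders (the same DELETE could just as well be run by hand in psql):

```java
// One-off cleanup sketch; connection details and the DataFile id are
// placeholders. Requires the PostgreSQL JDBC driver on the classpath.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class MapLayerMetadataCleanup {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/dvndb", "dvnapp", "secret")) {
            // Remove the maplayermetadata row(s) pointing at the DataFile,
            // clearing the fk_maplayermetadata_datafile_id constraint that
            // blocks the destroy (see the server log at the end of this issue).
            try (PreparedStatement ps = conn.prepareStatement(
                    "DELETE FROM maplayermetadata WHERE datafile_id = ?")) {
                ps.setLong(1, 3047705L); // id of the DataFile being destroyed
                System.out.println("Deleted " + ps.executeUpdate() + " row(s)");
            }
        }
    }
}
```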
Thanks for putting a bow on this. Moving to backlog; we can pick it up in the future.
Need to remove this dataset from prod when fixed: doi:10.7910/DVN/SCYBXS
…side, and possible remote layer data. #4093
@kcondon You wrote in this issue "Need to remove this dataset from prod when fixed: doi:10.7910/DVN/SCYBXS". Do you mind if I use the destroy endpoint to try to remove it? I just used it to destroy two other test datasets with mapped shape files, but Harvard Dataverse sent me an email about how it "failed to delete the WorldMap layer associated with the restricted file". I don't remember seeing any restricted files in the two datasets, and I'm wondering if I'd get the same message after trying to destroy doi:10.7910/DVN/SCYBXS, which has no restricted files.
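For reference, the destroy call goes through the native API's destroy endpoint; a rough Java sketch of the request, with the server URL and token handling reduced to placeholders:

```java
// Sketch of calling the destroy endpoint (superuser-only); the server URL
// is a placeholder and error handling is omitted for brevity.
import java.net.HttpURLConnection;
import java.net.URL;

public class DestroyDataset {
    public static void main(String[] args) throws Exception {
        String server = "https://dataverse.harvard.edu"; // placeholder
        String pid = "doi:10.7910/DVN/SCYBXS";
        URL url = new URL(server
            + "/api/datasets/:persistentId/destroy/?persistentId=" + pid);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("DELETE");
        // A superuser API token is required; read it from the environment.
        conn.setRequestProperty("X-Dataverse-key", System.getenv("API_TOKEN"));
        System.out.println("HTTP " + conn.getResponseCode());
    }
}
```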
@jggautier I'm not responsible for destroying any datasets. I believe people recorded some problem ones in this ticket so if you are destroying any of those, just note it, thanks.
Okay, these datasets were mentioned in this thread:

- https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/T2VIXT
- https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/4WL81Z
- doi:10.7910/DVN/SCYBXS

The first was already destroyed. I just destroyed the other two. Neither had restricted files, but each time I used the endpoint, an email like the one below was still sent to my superuser account email address:
I guess I'm just including this here "for the record" since it doesn't matter much: these are test files, so we don't need to make an effort to delete them from the internet (as we would if they contained PII), and the WorldMap integration is going away eventually.
This is currently in production, v4.7.1
Server log error:
Local Exception Stack:
Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: update or delete on table "dvobject" violates foreign key constraint "fk_maplayermetadata_datafile_id" on table "maplayermetadata"
Detail: Key (id)=(3047705) is still referenced from table "maplayermetadata".
Error Code: 0
Call: DELETE FROM DVOBJECT WHERE (ID = ?)
bind => [1 parameter bound]
Query: DeleteObjectQuery([DataFile id:3047705 name:null])
at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:340)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.processExceptionForCommError(DatabaseAccessor.java:1611)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:898)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeNoSelect(DatabaseAccessor.java:962)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.basicExecuteCall(DatabaseAccessor.java:631)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.executeCall(DatabaseAccessor.java:558)
at org.eclipse.persistence.internal.sessions.AbstractSession.basicExecuteCall(AbstractSession.java:2002)
at org.eclipse.persistence.sessions.server.ClientSession.executeCall(ClientSession.java:298)
at org.eclipse.persistence.internal.queries.DatasourceCallQueryMechanism.executeCall(DatasourceCallQueryMechanism.java:242)
at org.eclipse.persistence.internal.queries.DatasourceCallQueryMechanism.deleteObject(DatasourceCallQueryMechanism.java:203)
at org.eclipse.persistence.internal.queries.StatementQueryMechanism.deleteObject(StatementQueryMechanism.java:104)
at org.eclipse.persistence.queries.DeleteObjectQuery.executeDatabaseQuery(DeleteObjectQuery.java:218)
at org.eclipse.persistence.queries.DatabaseQuery.execute(DatabaseQuery.java:899)
at org.eclipse.persistence.queries.DatabaseQuery.executeInUnitOfWork(DatabaseQuery.java:798)
at org.eclipse.persistence.queries.ObjectLevelModifyQuery.executeInUnitOfWorkObjectLevelModifyQuery(ObjectLevelModifyQuery.java:108)
at org.eclipse.persistence.queries.DeleteObjectQuery.executeInUnitOfWorkObjectLevelModifyQuery(DeleteObjectQuery.java:119)
at org.eclipse.persistence.queries.ObjectLevelModifyQuery.executeInUnitOfWork(ObjectLevelModifyQuery.java:85)