docs(no-code): Adding documentation for no-migration upgrade option #2656

Merged · 3 commits · Jun 5, 2021
31 changes: 29 additions & 2 deletions docs/advanced/no-code-upgrade.md
@@ -52,9 +52,12 @@ run the datahub-upgrade job, which will run the above docker container to migrate

### Step 2: Execute Migration Job

#### Docker Compose Deployments - Preserve Data

If you do not need to migrate your existing data, you can instead follow the Docker Compose Deployments - Lose All Existing Data section below.

To migrate existing data, the easiest option is to execute the `run_upgrade.sh` script located under `docker/datahub-upgrade/nocode`.

```shell
cd docker/datahub-upgrade/nocode
./run_upgrade.sh
```
@@ -74,6 +77,30 @@ You can either
To see the required environment variables, see the [datahub-upgrade](../../docker/datahub-upgrade/README.md)
documentation.
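If you run the upgrade container directly rather than via the script, the database, Kafka, and Elasticsearch connection settings are passed as environment variables. The variable names below are illustrative assumptions, not a definitive list; confirm them against the datahub-upgrade README linked above:

```shell
# Illustrative env file for the upgrade container. Variable names and values
# are assumptions for the default quickstart topology; verify them against
# the datahub-upgrade README before use.
EBEAN_DATASOURCE_USERNAME=datahub
EBEAN_DATASOURCE_PASSWORD=datahub
EBEAN_DATASOURCE_HOST=mysql:3306
KAFKA_BOOTSTRAP_SERVER=broker:29092
ELASTICSEARCH_HOST=elasticsearch
ELASTICSEARCH_PORT=9200
```

Point your container runtime at this file (for example with `docker run --env-file`) so the upgrade job can reach the same backing stores as your running DataHub instance.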

#### Docker Compose Deployments - Lose All Existing Data

This path is the quickest but will wipe your DataHub database.
If you need your current data migrated, refer to the Docker Compose Deployments - Preserve Data section above.
If you are OK with losing your data and re-ingesting, this approach is simplest.

```shell
# make sure you are on the latest
git checkout master
git pull origin master

# wipe all your existing data and turn off all processes
./docker/nuke.sh

# spin up latest datahub
./docker/quickstart.sh

# re-ingest data, for example, to ingest sample data:
./docker/ingestion/ingestion.sh
```
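Because `nuke.sh` irreversibly deletes all data, you may want a confirmation guard around the destructive step. The wrapper below is purely illustrative and not part of the DataHub repo; it only proceeds when the operator types the literal word `wipe`:

```shell
#!/bin/sh
# Illustrative safety wrapper (an assumption, not part of the DataHub repo):
# require explicit confirmation before running the destructive wipe.
confirm_wipe() {
  # Succeeds (exit 0) only if the caller types exactly "wipe".
  printf 'This will DELETE all DataHub data. Type "wipe" to continue: '
  read -r answer
  [ "$answer" = "wipe" ]
}

# Example wiring, commented out so this snippet is safe to source:
# if confirm_wipe; then
#   ./docker/nuke.sh && ./docker/quickstart.sh
# else
#   echo "Aborted; no data was touched." >&2
# fi
```

Keeping the actual `nuke.sh` invocation behind the guard means a stray paste of the upgrade commands cannot wipe the database without an interactive confirmation.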

After that, you will be upgraded and good to go.


##### How to fix the "listening to port 5005" issue

A fix for this issue has been published to the `acryldata/datahub-upgrade:head` tag. Please pull the latest master and rerun the upgrade job.