Merge pull request #23 from catenax-ng/main
[Release 3.1]  Merging code to Master for release3.1
Siegfriedk authored May 16, 2023
2 parents b04df2d + c1f5019 commit a108585
Showing 70 changed files with 2,105 additions and 688 deletions.
11 changes: 7 additions & 4 deletions CHANGELOG.md
@@ -2,8 +2,9 @@
All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),

## [Unreleased]
## [2.0.0] - 2023-05-05
- Removed token log statement from logs.
- EDC version 0.3.0 changes for multiple BPN.
- Error handling & input validation messages for exceptions during upload / creation.
- Manufacturer country code list.
- Cancel contract agreement on provider side.
@@ -14,6 +15,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- To find out which contract belongs to which dataset, users have to download the history file

## [1.9.1] - 2023-03-24

### Fixed
- Helm charts fixed with default values.
- Database dependency updated in charts.
@@ -126,8 +128,9 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Compliance with Catena-X Guidelines
- Integration with Digital Twin registry service.

[unreleased]: https://github.com/eclipse-tractusx/dft-backend/compare/dftbackend-1.9.1...main
[1.9.1]: https://github.com/eclipse-tractusx/dft-backend/compare/1.9.0...dftbackend-1.9.1
[unreleased]: https://github.com/eclipse-tractusx/dft-backend/compare/2.0.0...main
[2.0.0]: https://github.com/eclipse-tractusx/dft-backend/compare/1.9.1...2.0.0
[1.9.1]: https://github.com/eclipse-tractusx/dft-backend/compare/1.9.0...1.9.1
[1.9.0]: https://github.com/eclipse-tractusx/dft-backend/compare/dft-backend-1.8.1...1.9.0
[1.8.1]: https://github.com/eclipse-tractusx/dft-backend/compare/dft-backend-1.8.0...dft-backend-1.8.1
[1.8.0]: https://github.com/eclipse-tractusx/dft-backend/compare/dft-backend-1.7.0...dft-backend-1.8.0
156 changes: 79 additions & 77 deletions DEPENDENCIES

Large diffs are not rendered by default.

79 changes: 33 additions & 46 deletions InstallationGuide.md
@@ -1,50 +1,39 @@
# Installation Guide
## Product DFT
Install from the command line:

docker container run -d --name [container_name] ghcr.io/catenax-ng/tx-dft-backend:[tag]
It is necessary to inject the environment variables, credentials and URLs that can be found in the application.properties file.
For more details, please refer to the configuration section of [README.md](README.md).
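
For example, a run command that injects a few of the variables documented in the configuration tables below (container name, tag and values are placeholders; adjust them to your environment):

```
# Placeholder values -- replace with your own environment.
# Each -e flag maps to one of the environment variables documented below.
docker container run -d --name sde-backend \
  -e MANUFACTURERID=CatenaX \
  -e EDC_HOSTNAME=https://edc.example.org \
  -e EDC_APIKEYHEADER=X-Api_Key \
  -e EDC_APIKEY=123456 \
  ghcr.io/catenax-ng/tx-dft-backend:latest
```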

### RUN SDE Backend in ArgoCD
A Helm chart is available for ArgoCD deployment. If no specific version is specified in the deployment, ArgoCD automatically picks up the latest app/chart version and deploys it to the environment using the Helm chart.

In values.yaml, all required configuration values are set to `default`. Change these values to match your environment; for reference, see the configuration example section.

As part of the ArgoCD deployment using the Helm chart, the PostgreSQL database dependency is provisioned automatically. For EDC, Digital Twin and Portal, however, you need to provide valid details as required by the configuration; otherwise the SDE service will start with the default configuration but will not work as expected.
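
To see which keys the chart exposes before overriding them, you can dump the default values into a file and edit it alongside your ArgoCD application definition. The repository alias and chart name below are illustrative and depend on how you added the chart:

```
# Dump the chart's default values (everything set to "default") for editing.
# "eclipse-tractusx-dft-backend/dft-backend" is an assumed repo/chart reference.
helm show values eclipse-tractusx-dft-backend/dft-backend > my-values.yaml
```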

### RUN SDE Backend in k8s cluster
#### Prerequisites
- k8s cluster/ minikube
- helm
- Docker

In values.yaml, all required configuration values are set to `default`. Change these values to match your environment; for reference, see the configuration example section.

```
helm repo add eclipse-tractusx-dft-backend https://github.com/eclipse-tractusx/dft-backend/tree/main/charts
helm install release-name eclipse-tractusx/dft-backend
```
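
If you maintain your own values file, an install might look like the following sketch; the release name, chart reference and namespace are placeholders:

```
# Install the chart with customized values; all names are illustrative.
helm install sde-backend eclipse-tractusx-dft-backend/dft-backend \
  -f my-values.yaml \
  --namespace dft --create-namespace
```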

### RUN SDE Backend Locally
#### Prerequisites
- JDK 18
- Postgres 11.9.13

#### Steps
1. Clone the GitHub Repository - https://github.com/eclipse-tractusx/dft-backend.
2. Get your instance of PostgreSQL running (create a new database named **dftdb**).
3. Set up your project environment to use JDK 18.
4. Provide the required application configuration in application.properties, as described in the configuration sections below.
5. Start the SDE Spring Boot application from your IDE using the main class, or use the Spring CLI (a minimal sketch follows below).
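
A minimal sketch of steps 2 and 5, assuming a local PostgreSQL instance and that the repository ships the standard Maven wrapper with the Spring Boot plugin:

```
# Step 2: create the database expected by SDE (local Postgres assumed).
createdb -U postgres dftdb

# Step 5: build and start the application from the repository root
# (assumes the standard Maven wrapper and spring-boot-maven-plugin).
./mvnw spring-boot:run
```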

It is necessary to inject the environment variables, credentials and URLs that can be found in the application.properties file.
#### CatenaX variables
| Property name  | Environment Variable Name | Description        | Example Value |
|----------------|---------------------------|--------------------|---------------|
| manufacturerId | MANUFACTURERID | Id of manufacturer | CatenaX |


#### Digital Twins variables:
| Property name | Environment Variable Name | Description | Example Value |
|-------------------------------------------|-------------------------------------------|------------------------------------------------|----------------------------------|
| digital-twins.hostname | DIGITAL-TWINS_HOSTNAME | hostname for Digital Twins | https:// |
| digital-twins.authentication.url | DIGITAL-TWINS_AUTHENTICATION_URL | authentication url for Digital Twins | https:// |
| digital-twins.authentication.clientId | DIGITAL-TWINS_AUTHENTICATION_CLIENTID | client ID authentication for Digital Twins | sa-cl6-cx-4 |
| digital-twins.authentication.clientSecret | DIGITAL-TWINS_AUTHENTICATION_CLIENTSECRET | client secret authentication for Digital Twins | VrL8uSG5Tn3NrFiY39vs0klTmlvsRRmo |

The values are in the [Vault].
<i><b>You must create a GitHub token to access it.</b></i>

#### EDC variables:
| Property name | Environment Variable Name | Description | Example Value |
|------------------|---------------------------|-----------------------------------------------|---------------|
| edc.hostname     | EDC_HOSTNAME              | EDC hostname                                             | https://      |
| edc.apiKeyHeader | EDC_APIKEYHEADER          | API key header for EDC                                   | X-Api_Key     |
| edc.apiKey       | EDC_APIKEY                | API key for EDC                                          | 123456        |
| dft.hostname     | DFT_HOSTNAME              | hostname for DFT                                         | https://      |
| dft.apiKeyHeader | DFT_APIKEYHEADER          | authentication key header for the EDC asset payload URL  | Api-Key       |
| dft.apiKey       | DFT_APIKEY                | authentication key for the EDC asset payload URL         | someCode      |
| edc.enabled      | EDC_ENABLED               | enable / disable EDC                                     | true / false  |

#### Keycloak variables:
| Property name | Environment Variable Name | Description | Example Value |
|-----------------------------------|---------------------------|------------------------------|---------------|
| connector.discovery.token-url     | KEYCLOCK_HOSTNAME         | Keycloak hostname (token URL) | https://      |
| connector.discovery.clientId      | KEYCLOCK_CLIENTID         | Keycloak client ID            | X-Api_Key     |
| connector.discovery.clientSecret  | KEYCLOCK_CLIENTSECRET     | Keycloak client secret        | 123456        |
| portal.backend.hostname           | PORTAL_HOSTNAME           | Portal backend hostname       | https://      |


The values are in the [Vault].
<i><b>You must create a GitHub token to access it.</b></i>
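
Instead of environment variables, the same properties can also be passed as command-line arguments when running the packaged jar, assuming the application follows the usual Spring Boot convention of forwarding program arguments to SpringApplication. The jar path and values below are placeholders:

```
# Property names come from the tables above; jar path and values are placeholders.
java -jar target/dft-backend.jar \
  --manufacturerId=CatenaX \
  --digital-twins.hostname=https://dt-registry.example.org \
  --edc.hostname=https://edc.example.org \
  --edc.apiKey=123456 \
  --connector.discovery.token-url=https://keycloak.example.org/realms/example/protocol/openid-connect/token
```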

## Upload a file:
When a .csv file is uploaded, the program checks whether the file is a SerialPartTypization or an AssemblyPartRelationship; there is a pipeline for each one.
@@ -109,7 +98,5 @@ Apart from both uploads, batch upload is an additional feature added to DFT.
The .csv file is loaded into memory, the content is saved, and then the file is removed from memory.


If the file is not a .csv, it is read, processed, and is considered as FAILED.
