Merge branch 'master' of github.com:apache/superset into dynamic_dashboard_component

# Conflicts:
#	superset-frontend/src/dashboard/components/BuilderComponentPane.tsx
simcha90 committed Nov 4, 2021
2 parents 369afd4 + 1c12167 commit 278303d
Showing 117 changed files with 6,839 additions and 8,077 deletions.
3 changes: 3 additions & 0 deletions UPDATING.md
@@ -26,11 +26,14 @@ assists people when migrating to a new version.

### Breaking Changes

- [17290](https://github.com/apache/superset/pull/17290): Bumps pandas to `1.3.4` and pyarrow to `5.0.0`
- [16660](https://github.com/apache/incubator-superset/pull/16660): The `columns` Jinja parameter has been renamed to `table_columns` to make the `columns` query object parameter available in the Jinja context.
- [16711](https://github.com/apache/incubator-superset/pull/16711): The `url_param` Jinja function will now escape the result by default. For instance, the value `O'Brien` will now be changed to `O''Brien`. To disable this behavior, call `url_param` with `escape_result` set to `False`: `url_param("my_key", "my default", escape_result=False)`.
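The new default escaping amounts to doubling single quotes, as in standard SQL string literals. A minimal sketch of the behavior described above (the function name here is illustrative, not Superset's internal API):

```python
def escape_result(value: str) -> str:
    # Double single quotes so the value is safe inside a SQL
    # string literal -- the behavior described for url_param above.
    return value.replace("'", "''")

print(escape_result("O'Brien"))  # O''Brien
```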

### Potential Downtime

- [16756](https://github.com/apache/incubator-superset/pull/16756): a change which renames the `dbs.allow_csv_upload` column to `dbs.allow_file_upload` via a (potentially locking) DDL operation.

### Deprecations

### Other
8 changes: 6 additions & 2 deletions docs/src/pages/docs/Connecting to Databases/firebolt.mdx
@@ -14,11 +14,15 @@ Superset has been tested on `firebolt-sqlalchemy>=0.0.1`.
The recommended connection string is:

```diff
-firebolt://{username}:{password}@{host}/{database}
+firebolt://{username}:{password}@{database}
+or
+firebolt://{username}:{password}@{database}/{engine_name}
```

Here's a connection string example of Superset connecting to a Firebolt database:

```diff
-firebolt://email@domain:password@host/sample_database
+firebolt://email@domain:password@sample_database
+or
+firebolt://email@domain:password@sample_database/sample_engine
```
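Since the Firebolt username is typically an email address, it contains an `@` of its own; percent-encoding the credentials keeps the URI parseable. A small illustrative helper (not part of `firebolt-sqlalchemy`), assuming the new host-less URI format shown above:

```python
from typing import Optional
from urllib.parse import quote_plus

def firebolt_uri(username: str, password: str,
                 database: str, engine_name: Optional[str] = None) -> str:
    # Percent-encode credentials so characters like '@' in an
    # email-style username don't break URL parsing.
    uri = f"firebolt://{quote_plus(username)}:{quote_plus(password)}@{database}"
    if engine_name:
        uri += f"/{engine_name}"
    return uri

print(firebolt_uri("email@domain", "password", "sample_database", "sample_engine"))
# firebolt://email%40domain:password@sample_database/sample_engine
```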
62 changes: 45 additions & 17 deletions docs/src/pages/docs/Connecting to Databases/google-bigquery.mdx
@@ -11,31 +11,55 @@ version: 1

The recommended connector library for BigQuery is
[pybigquery](https://github.com/mxmzdlv/pybigquery).

### Install BigQuery Driver

Follow the steps [here](/docs/databases/dockeradddrivers) about how to
install new database drivers when setting up Superset locally via docker-compose.

```
echo "pybigquery" >> ./docker/requirements-local.txt
```

### Connecting to BigQuery

When adding a new BigQuery connection in Superset, you'll need to add the GCP Service Account
credentials file (as a JSON).

1. Create your Service Account via the Google Cloud Platform control panel, provide it access to the
appropriate BigQuery datasets, and download the JSON configuration file for the service account.

2. In Superset, you can either upload that JSON or add the JSON blob in the following format (this
should be the content of your credentials JSON file):

```
{
  "type": "service_account",
  "project_id": "...",
  "private_key_id": "...",
  "private_key": "...",
  "client_email": "...",
  "client_id": "...",
  "auth_uri": "...",
  "token_uri": "...",
  "auth_provider_x509_cert_url": "...",
  "client_x509_cert_url": "..."
}
```

The resulting file should have this structure:

![CleanShot 2021-10-22 at 04 18 11](https://user-images.githubusercontent.com/52086618/138352958-a18ef9cb-8880-4ef1-88c1-452a9f1b8105.gif)

3. Additionally, you can connect via a SQLAlchemy URI instead. The connection string for BigQuery
looks like:

```
bigquery://{project_id}
```

Go to the **Advanced** tab, and add a JSON blob to the **Secure Extra** field in the database
configuration form with the following format:

```
{
  "credentials_info": <contents of credentials JSON file>
}
```

The resulting file should have this structure:

```
{
  "credentials_info": {
    "type": "service_account",
    "project_id": "...",
    "private_key_id": "...",
    "private_key": "...",
    "client_email": "...",
    "client_id": "...",
    "auth_uri": "...",
    "token_uri": "...",
    "auth_provider_x509_cert_url": "...",
    "client_x509_cert_url": "..."
  }
}
```

You should then be able to connect to your BigQuery datasets.

![CleanShot 2021-10-22 at 04 47 08](https://user-images.githubusercontent.com/52086618/138354340-df57f477-d3e5-42d4-b032-d901c69d2213.gif)

To be able to upload CSV or Excel files to BigQuery in Superset, you'll need to also add the
[pandas_gbq](https://github.com/pydata/pandas-gbq) library.
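The **Secure Extra** blob is just the downloaded service-account file wrapped in a `credentials_info` envelope. A minimal sketch of producing it (the helper name and file path are illustrative):

```python
import json

def build_secure_extra(credentials_path: str) -> str:
    # Wrap the downloaded service-account JSON in the
    # {"credentials_info": ...} envelope Superset expects.
    with open(credentials_path) as f:
        creds = json.load(f)
    return json.dumps({"credentials_info": creds}, indent=2)
```

Paste the returned string into the **Secure Extra** field on the **Advanced** tab.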
2 changes: 1 addition & 1 deletion docs/src/pages/docs/Connecting to Databases/index.mdx
@@ -42,7 +42,7 @@ A list of some of the recommended packages.
|[Elasticsearch](/docs/databases/elasticsearch)|```pip install elasticsearch-dbapi```|```elasticsearch+http://{user}:{password}@{host}:9200/```|
|[Exasol](/docs/databases/exasol)|```pip install sqlalchemy-exasol```|```exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC```|
|[Google Sheets](/docs/databases/google-sheets)|```pip install shillelagh[gsheetsapi]```|```gsheets://```|
-|[Firebolt](/docs/databases/firebolt)|```pip install firebolt-sqlalchemy```|```firebolt://{username}:{password}@{host}/{database}```|
+|[Firebolt](/docs/databases/firebolt)|```pip install firebolt-sqlalchemy```|```firebolt://{username}:{password}@{database} or firebolt://{username}:{password}@{database}/{engine_name}```|
|[Hologres](/docs/databases/hologres)|```pip install psycopg2```|```postgresql+psycopg2://<UserName>:<DBPassword>@<Database Host>/<Database Name>```|
|[IBM Db2](/docs/databases/ibm-db2)|```pip install ibm_db_sa```|```db2+ibm_db://```|
|[IBM Netezza Performance Server](/docs/databases/netezza)|```pip install nzalchemy```|```netezza+nzpy://<UserName>:<DBPassword>@<Database Host>/<Database Name>```|
6 changes: 3 additions & 3 deletions docs/src/pages/docs/installation/kubernetes.mdx
@@ -24,14 +24,14 @@ helm repo add superset https://apache.github.io/superset
"superset" has been added to your repositories
```

-1. View charts in repo
+2. View charts in repo
```sh
helm search repo superset
NAME CHART VERSION APP VERSION DESCRIPTION
superset/superset 0.1.1 1.0 Apache Superset is a modern, enterprise-ready b...
```

-1. Configure your setting overrides
+3. Configure your setting overrides

Just like any typical Helm chart, you'll need to craft a `values.yaml` file that defines/overrides any of the values exposed in the default [values.yaml](https://github.com/apache/superset/tree/master/helm/superset/values.yaml), or in any of the charts it depends on:

Expand All @@ -40,7 +40,7 @@ Just like any typical Helm chart, you'll need to craft a `values.yaml` file that

More info down below on some important overrides you might need.
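As a starting point, a minimal `my-values.yaml` might look like this (key names follow the chart's default `values.yaml`; the host and override below are placeholders):

```yaml
# my-values.yaml -- overrides merged over the chart defaults
configOverrides:
  my_override: |
    # Appended to superset_config.py inside the container
    ROW_LIMIT = 5000
ingress:
  enabled: true
  hosts:
    - superset.example.com
```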

-1. Install and run
+4. Install and run

```sh
helm upgrade --install --values my-values.yaml superset superset/superset
2 changes: 1 addition & 1 deletion helm/superset/Chart.yaml
@@ -22,7 +22,7 @@ maintainers:
- name: craig-rueda
email: [email protected]
url: https://github.com/craig-rueda
-version: 0.3.11
+version: 0.3.12
dependencies:
- name: postgresql
version: 10.2.0
3 changes: 3 additions & 0 deletions helm/superset/templates/ingress.yaml
@@ -30,6 +30,9 @@ metadata:
{{ toYaml . | indent 4 }}
{{- end }}
spec:
+  {{- if .Values.ingress.ingressClassName }}
+  ingressClassName: {{ .Values.ingress.ingressClassName }}
+  {{- end }}
{{- if .Values.ingress.tls }}
tls:
{{- range .Values.ingress.tls }}
2 changes: 1 addition & 1 deletion helm/superset/values.yaml
@@ -165,8 +165,8 @@ service:

ingress:
enabled: false
+# ingressClassName: nginx
annotations: {}
-# kubernetes.io/ingress.class: nginx
# kubernetes.io/tls-acme: "true"
## Extend timeout to allow long running queries.
# nginx.ingress.kubernetes.io/proxy-connect-timeout: "300"
4 changes: 2 additions & 2 deletions requirements/base.txt
@@ -168,7 +168,7 @@ packaging==21.0
# via
# bleach
# deprecation
-pandas==1.2.5
+pandas==1.3.4
# via apache-superset
parsedatetime==2.6
# via apache-superset
Expand All @@ -178,7 +178,7 @@ polyline==1.4.0
# via apache-superset
prison==0.2.1
# via flask-appbuilder
-pyarrow==4.0.1
+pyarrow==5.0.0
# via apache-superset
pycparser==2.20
# via cffi
4 changes: 2 additions & 2 deletions requirements/development.in
@@ -17,11 +17,11 @@
# under the License.
-r base.in
flask-cors>=2.0.0
-mysqlclient==1.4.2.post1
+mysqlclient==2.0.3
pillow>=8.3.1,<9
pydruid>=0.6.1,<0.7
pyhive[hive]>=0.6.1
-psycopg2-binary==2.8.5
+psycopg2-binary==2.9.1
tableschema
thrift>=0.11.0,<1.0.0
progress>=1.5,<2
6 changes: 3 additions & 3 deletions requirements/development.txt
@@ -1,4 +1,4 @@
-# SHA1:dbd3e93a11a36fc6b18d6194ac96ba29bd0ad2a8
+# SHA1:a2fe77c9b8bffc8c8f3de4df6709c8be957c2f87
#
# This file is autogenerated by pip-compile-multi
# To update, run:
@@ -36,15 +36,15 @@ jsonlines==2.0.0
# via tabulator
linear-tsv==1.1.0
# via tabulator
-mysqlclient==1.4.2.post1
+mysqlclient==2.0.3
# via -r requirements/development.in
openpyxl==3.0.7
# via tabulator
pillow==8.3.1
# via -r requirements/development.in
progress==1.6
# via -r requirements/development.in
-psycopg2-binary==2.8.5
+psycopg2-binary==2.9.1
# via -r requirements/development.in
pure-sasl==0.6.2
# via thrift-sasl
3 changes: 3 additions & 0 deletions requirements/testing.in
@@ -19,13 +19,16 @@
docker
flask-testing
freezegun
+google-cloud-bigquery
ipdb
# pinning ipython as pip-compile-multi was bringing higher version
# of the ipython that was not found in CI
ipython
openapi-spec-validator
openpyxl
+pandas_gbq
parameterized
+pybigquery
pyfakefs
pyhive[presto]>=0.6.3
pylint==2.9.6
31 changes: 21 additions & 10 deletions requirements/testing.txt
@@ -1,4 +1,4 @@
-# SHA1:a36e63b551290f1060a819fe4f1f50bc6200403c
+# SHA1:4aabffca9a6688f2911d6f8697495e7045a529d0
#
# This file is autogenerated by pip-compile-multi
# To update, run:
@@ -31,7 +31,7 @@ flask-testing==0.8.1
# via -r requirements/testing.in
freezegun==1.1.0
# via -r requirements/testing.in
-google-api-core[grpc]==2.1.0
+google-api-core[grpc]==2.2.1
# via
# google-cloud-bigquery
# google-cloud-bigquery-storage
Expand All @@ -49,8 +49,9 @@ google-auth-oauthlib==0.4.6
# via
# pandas-gbq
# pydata-google-auth
-google-cloud-bigquery[bqstorage,pandas]==2.28.0
+google-cloud-bigquery[bqstorage,pandas]==2.29.0
# via
+#   -r requirements/testing.in
#   apache-superset
#   pandas-gbq
#   pybigquery
Expand All @@ -60,14 +61,19 @@ google-cloud-core==2.1.0
# via google-cloud-bigquery
google-crc32c==1.3.0
# via google-resumable-media
-google-resumable-media==2.0.3
+google-resumable-media==2.1.0
# via google-cloud-bigquery
googleapis-common-protos==1.53.0
# via google-api-core
-grpcio==1.41.0
-# via
-#   google-api-core
-#   grpcio-status
+grpcio==1.41.1
+# via
+#   google-api-core
+#   google-cloud-bigquery
+#   grpcio-status
+grpcio-status==1.41.1
+# via google-api-core
iniconfig==1.1.1
# via pytest
ipdb==0.13.9
@@ -99,7 +105,9 @@ openapi-schema-validator==0.1.5
openapi-spec-validator==0.3.1
# via -r requirements/testing.in
pandas-gbq==0.15.0
-# via apache-superset
+# via
+#   -r requirements/testing.in
+#   apache-superset
parameterized==0.8.1
# via -r requirements/testing.in
parso==0.8.2
Expand All @@ -110,15 +118,16 @@ pickleshare==0.7.5
# via ipython
prompt-toolkit==3.0.19
# via ipython
-proto-plus==1.19.2
+proto-plus==1.19.7
# via
# google-cloud-bigquery
# google-cloud-bigquery-storage
-protobuf==3.18.1
+protobuf==3.19.1
# via
# google-api-core
# google-cloud-bigquery
# googleapis-common-protos
# grpcio-status
# proto-plus
ptyprocess==0.7.0
# via pexpect
Expand All @@ -129,7 +138,9 @@ pyasn1==0.4.8
pyasn1-modules==0.2.8
# via google-auth
pybigquery==0.10.2
-# via apache-superset
+# via
+#   -r requirements/testing.in
+#   apache-superset
pydata-google-auth==1.2.0
# via pandas-gbq
pyfakefs==4.5.0
6 changes: 3 additions & 3 deletions setup.py
@@ -55,7 +55,7 @@ def get_git_sha() -> str:

setup(
name="apache-superset",
-    description=("A modern, enterprise-ready business intelligence web application"),
+    description="A modern, enterprise-ready business intelligence web application",
long_description=long_description,
long_description_content_type="text/markdown",
version=version_string,
@@ -90,15 +90,15 @@ def get_git_sha() -> str:
"isodate",
"markdown>=3.0",
"msgpack>=1.0.0, <1.1",
-        "pandas>=1.2.2, <1.3",
+        "pandas>=1.3.0, <1.4",
"parsedatetime",
"pgsanity",
"polyline",
"pyparsing>=2.4.7, <3.0.0",
"python-dateutil",
"python-dotenv",
"python-geohash",
-        "pyarrow>=4.0.1, <4.1",
+        "pyarrow>=5.0.0, <6.0",
"pyyaml>=5.4",
"PyJWT>=1.7.1, <2",
"redis",
@@ -21,6 +21,13 @@ import { getChartAlias, Slice } from 'cypress/utils/vizPlugins';
export const WORLD_HEALTH_DASHBOARD = '/superset/dashboard/world_health/';
export const TABBED_DASHBOARD = '/superset/dashboard/tabbed_dash/';

+export const testItems = {
+  dashboard: 'Cypress Sales Dashboard',
+  dataset: 'Vehicle Sales',
+  chart: 'Cypress chart',
+  defaultNameDashboard: '[ untitled dashboard ]',
+};

export const CHECK_DASHBOARD_FAVORITE_ENDPOINT =
'/superset/favstar/Dashboard/*/count';
