diff --git a/airbyte-integrations/connectors/destination-databricks/README.md b/airbyte-integrations/connectors/destination-databricks/README.md
index 4ce7f30e1cfd4..8a0f3d57c9606 100644
--- a/airbyte-integrations/connectors/destination-databricks/README.md
+++ b/airbyte-integrations/connectors/destination-databricks/README.md
@@ -6,9 +6,8 @@ For information about how to use this connector within Airbyte, see [the User Do

## Databricks JDBC Driver

This connector requires a JDBC driver to connect to a Databricks cluster. The driver is developed by Simba.

-{% hint style="warning" %}
+WARNING:
+
Before building or using this connector, you must agree to the [JDBC ODBC driver license](https://databricks.com/jdbc-odbc-driver-license). This means that you can only use this driver to connect third-party applications to Apache Spark SQL within a Databricks offering using the ODBC and/or JDBC protocols.
-{% endhint %}

This is currently a private connector that is only available on Airbyte Cloud. We are working on a solution to make it publicly available (issue [\#6043](https://github.com/airbytehq/airbyte/issues/6043)).

diff --git a/docs/api-documentation.md b/docs/api-documentation.md
index 430f14e94ba01..cefe654159e27 100644
--- a/docs/api-documentation.md
+++ b/docs/api-documentation.md
@@ -1,9 +1,11 @@

# API documentation

-{% hint style="warning" %}
+:::caution
+
For Airbyte Open-Source you don't need the API Token for authentication; all endpoints can be accessed using the API without it. **Note**: Airbyte Cloud does not currently support API access.
-{% endhint %}
+
+:::

Our Configuration API is still in alpha and might change. You won’t lose any functionality, but you may need to update your code to catch up to any backwards-incompatible changes in the API.

diff --git a/docs/archive/examples/postgres-replication.md b/docs/archive/examples/postgres-replication.md
index aa7a57ba86fd7..dbb2edf785707 100644
--- a/docs/archive/examples/postgres-replication.md
+++ b/docs/archive/examples/postgres-replication.md
@@ -101,9 +101,11 @@ Now let's verify that this worked. Let's output the contents of the destination

docker exec airbyte-destination psql -U postgres -c "SELECT * FROM public.users;"
```

-{% hint style="info" %}
+:::info
+
Don't worry about the awkward `public_users` name for now; we are currently working on an update to allow users to configure their destination table names!
-{% endhint %}
+
+:::

You should see the rows from the source database inside the destination database!

diff --git a/docs/archive/examples/slack-history.md b/docs/archive/examples/slack-history.md
index f44a7ecb8507f..6b602a8f09e67 100644
--- a/docs/archive/examples/slack-history.md
+++ b/docs/archive/examples/slack-history.md
@@ -29,9 +29,11 @@ docker run -it --rm \

That's it!

-{% hint style="info" %}
+:::info
+
MeiliSearch stores data in $\(pwd\)/data.ms, so if you prefer to store it somewhere else, just adjust this path.
-{% endhint %}
+
+:::

## 2. Replicate Your Slack Messages to MeiliSearch
diff --git a/docs/connector-development/cdk-faros-js.md b/docs/connector-development/cdk-faros-js.md
index b1b33e0fb793e..4b977778923cf 100644
--- a/docs/connector-development/cdk-faros-js.md
+++ b/docs/connector-development/cdk-faros-js.md
@@ -1,4 +1,4 @@
-# Connector Development Kit \(Javascript\)
+# Connector Development Kit (Javascript)

The [Faros AI TypeScript/JavaScript CDK](https://github.com/faros-ai/airbyte-connectors/tree/main/faros-airbyte-cdk) allows you to build Airbyte connectors quickly, similarly to how our [Python CDK](cdk-python/) does. This CDK currently offers support for creating Airbyte source connectors for:

diff --git a/docs/connector-development/cdk-python/README.md b/docs/connector-development/cdk-python/README.md
index 0e25dcc78fb04..83cad8b48de54 100644
--- a/docs/connector-development/cdk-python/README.md
+++ b/docs/connector-development/cdk-python/README.md
@@ -1,4 +1,4 @@
-# Connector Development Kit \(Python\)
+# Connector Development Kit (Python)

The Airbyte Python CDK is a framework for rapidly developing production-grade Airbyte connectors. The CDK currently offers helpers specifically for creating Airbyte source connectors for:

diff --git a/docs/connector-development/connector-specification-reference.md b/docs/connector-development/connector-specification-reference.md
index 6e9f94d3de06e..6d3fa257be024 100644
--- a/docs/connector-development/connector-specification-reference.md
+++ b/docs/connector-development/connector-specification-reference.md
@@ -81,9 +81,11 @@ this will display a multi-line textbox in the UI like the following screenshot:

In some cases, a connector needs to accept one out of many options. For example, a connector might need to know the compression codec of the file it will read, which will render in the Airbyte UI as a list of the available codecs. In JSONSchema, this can be expressed using the [oneOf](https://json-schema.org/understanding-json-schema/reference/combining.html#oneof) keyword.

-{% hint style="info" %}
+:::info
+
Some connectors may follow an older format for dropdown lists; we are currently migrating away from that to this standard.
-{% endhint %}
+
+:::

In order for the Airbyte UI to correctly render a specification, however, a few extra rules must be followed:

diff --git a/docs/connector-development/testing-connectors/README.md b/docs/connector-development/testing-connectors/README.md
index bba1a3fd12168..3ccd1c1046cb5 100644
--- a/docs/connector-development/testing-connectors/README.md
+++ b/docs/connector-development/testing-connectors/README.md
@@ -30,9 +30,11 @@ For instance:

```
### 3. Requesting GitHub PR Integration Test Runs

-{% hint style="warning" %}
+:::caution
+
This option is not available to PRs from forks, so it is effectively limited to Airbyte employees.
-{% endhint %}
+
+:::

If you don't want to handle secrets, you're making a relatively minor change, or you want to ensure the connector's integration test will run remotely, you should request builds on GitHub. You can request an integration test run by creating a comment with a slash command.
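For illustration, such a comment is usually a single line naming the connector's directory. A minimal sketch, with a hypothetical connector path:

```bash
# Posted as a comment on the GitHub PR (the connector path is a placeholder):
/test connector=connectors/source-example
```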
diff --git a/docs/connector-development/testing-connectors/legacy-standard-source-tests.md b/docs/connector-development/testing-connectors/legacy-standard-source-tests.md index 68ec07a3b267f..4b9959d794a38 100644 --- a/docs/connector-development/testing-connectors/legacy-standard-source-tests.md +++ b/docs/connector-development/testing-connectors/legacy-standard-source-tests.md @@ -61,9 +61,11 @@ First, you can run the image locally. Connectors should have instructions in the ### 2. Requesting GitHub PR Integration Test Runs -{% hint style="warning" %} +:::caution + This option is not available to PRs from forks, so it is effectively limited to Airbyte employees. -{% endhint %} + +::: If you don't want to handle secrets, you're making a relatively minor change, or you want to ensure the connector's integration test will run remotely, you should request builds on GitHub. You can request an integration test run by creating a comment with a slash command. diff --git a/docs/connector-development/tutorials/building-a-java-destination.md b/docs/connector-development/tutorials/building-a-java-destination.md index 2e77e40d13062..8361dd127fa03 100644 --- a/docs/connector-development/tutorials/building-a-java-destination.md +++ b/docs/connector-development/tutorials/building-a-java-destination.md @@ -21,13 +21,17 @@ Docker and Java with the versions listed in the [tech stack section](../../under * Step 7: Write unit tests or integration tests * Step 8: Update the docs \(in `docs/integrations/destinations/<destination-name>.md`\) -{% hint style="info" %} +:::info + All `./gradlew` commands must be run from the root of the airbyte project. -{% endhint %} -{% hint style="info" %} +::: + +:::info + If you need help with any step of the process, feel free to submit a PR with your progress and any questions you have, or ask us on [slack](https://slack.airbyte.io). -{% endhint %} + +::: ## Explaining Each Step @@ -53,9 +57,11 @@ You can build the destination by running: This compiles the Java code for your destination and builds a Docker image with the connector. At this point, we haven't implemented anything of value yet, but once we do, you'll use this command to compile your code and Docker image. -{% hint style="info" %} +:::info + Airbyte uses Gradle to manage Java dependencies. To add dependencies for your connector, manage them in the `build.gradle` file inside your connector's directory. -{% endhint %} + +::: #### Iterating on your implementation @@ -156,9 +162,11 @@ To implement the `write` Airbyte operation, implement the `getConsumer` method i * [Local CSV](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/destination-csv/src/main/java/io/airbyte/integrations/destination/csv/CsvDestination.java#L90) * [Postgres](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connectors/destination-postgres/src/main/java/io/airbyte/integrations/destination/postgres/PostgresDestination.java) -{% hint style="info" %} +:::info + The Postgres destination leverages the `AbstractJdbcDestination` superclass which makes it extremely easy to create a destination for a database or data warehouse if it has a compatible JDBC driver. If the destination you are implementing has a JDBC driver, be sure to check out `AbstractJdbcDestination`. -{% endhint %} + +::: For a brief overview on the Airbyte catalog check out [the Beginner's Guide to the Airbyte Catalog](../../understanding-airbyte/beginners-guide-to-catalog.md). 
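Once the build succeeds, a quick way to smoke-test the image is to invoke the connector's `spec` command directly; this is a sketch, and the image name is a placeholder:

```bash
# Ask the freshly built connector image for its specification
# (replace destination-example with your connector's name):
docker run --rm airbyte/destination-example:dev spec
```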
diff --git a/docs/connector-development/tutorials/building-a-python-destination.md b/docs/connector-development/tutorials/building-a-python-destination.md index c6afcbf3a4bd6..8693cb164e9c3 100644 --- a/docs/connector-development/tutorials/building-a-python-destination.md +++ b/docs/connector-development/tutorials/building-a-python-destination.md @@ -21,9 +21,11 @@ Docker and Python with the versions listed in the [tech stack section](../../und * Step 7: Write unit tests or integration tests * Step 8: Update the docs \(in `docs/integrations/destinations/<destination-name>.md`\) -{% hint style="info" %} +:::info + If you need help with any step of the process, feel free to submit a PR with your progress and any questions you have, or ask us on [slack](https://slack.airbyte.io). Also reference the KvDB python destination implementation if you want to see an example of a working destination. -{% endhint %} + +::: ## Explaining Each Step @@ -79,9 +81,11 @@ The destination interface is described in detail in the [Airbyte Specification]( The generated files fill in a lot of information for you and have docstrings describing what you need to do to implement each method. The next few steps are just implementing that interface. -{% hint style="info" %} +:::info + All logging should be done through the `self.logger` object available in the `Destination` class. Otherwise, logs will not be shown properly in the Airbyte UI. -{% endhint %} + +::: Everyone develops differently but here are 3 ways that we recommend iterating on a destination. Consider using whichever one matches your style. diff --git a/docs/connector-development/tutorials/building-a-python-source.md b/docs/connector-development/tutorials/building-a-python-source.md index 2f9911b136220..7aeaa820f9e94 100644 --- a/docs/connector-development/tutorials/building-a-python-source.md +++ b/docs/connector-development/tutorials/building-a-python-source.md @@ -8,9 +8,11 @@ This article provides a checklist for how to create a python source. Each step i Docker, Python, and Java with the versions listed in the [tech stack section](../../understanding-airbyte/tech-stack.md). -{% hint style="info" %} +:::info + All the commands below assume that `python` points to a version of python >3.7. On some systems, `python` points to a Python2 installation and `python3` points to Python3. If this is the case on your machine, substitute all `python` commands in this guide with `python3` . Otherwise, make sure to install Python 3 before beginning. -{% endhint %} + +::: ## Checklist @@ -29,13 +31,17 @@ All the commands below assume that `python` points to a version of python >3. * Step 11: Add the connector to the API/UI \(by adding an entry in `airbyte-config/init/src/main/resources/seed/source_definitions.yaml`\) * Step 12: Add docs \(in `docs/integrations/sources/<source-name>.md`\) -{% hint style="info" %} +:::info + Each step of the Creating a Source checklist is explained in more detail below. -{% endhint %} -{% hint style="info" %} +::: + +:::info + All `./gradlew` commands must be run from the root of the airbyte project. -{% endhint %} + +::: ### Submitting a Source to Airbyte @@ -45,9 +51,11 @@ All `./gradlew` commands must be run from the root of the airbyte project. * Once the config is stored in Github Secrets, edit `.github/workflows/test-command.yml` and `.github/workflows/publish-command.yml` to inject the config into the build environment. 
* Edit the `airbyte/tools/bin/ci_credentials.sh` script to pull the script from the build environment and write it to `secrets/config.json` during the build.

-{% hint style="info" %}
+:::info
+
If you have a question about a step in the Submitting a Source to Airbyte checklist, include it in your PR or ask it on [slack](https://slack.airbyte.io).
-{% endhint %}
+
+:::

## Explaining Each Step

@@ -96,9 +104,11 @@ The commands we ran above created a virtual environment for your source. If you

Pretty much all it takes to create a source is to implement the `Source` interface. The template fills in a lot of information for you and has extensive docstrings describing what you need to do to implement each method. The next 4 steps are just implementing that interface.

-{% hint style="info" %}
+:::info
+
All logging should be done through the `logger` object passed into each method. Otherwise, logs will not be shown in the Airbyte UI.
-{% endhint %}
+
+:::

#### Iterating on your implementation

@@ -175,9 +185,11 @@ The Standard Tests are a set of tests that run against all sources. These tests

You can run the tests using `./gradlew :airbyte-integrations:connectors:source-<source-name>:integrationTest`. Make sure to run this command from the Airbyte repository root.

-{% hint style="info" %}
+:::info
+
In some rare cases we make exceptions and allow a source to not need to pass all the standard tests. If for some reason you think your source cannot reasonably pass one of the test cases, reach out to us on github or slack, and we can determine whether there's a change we can make so that the test will pass or if we should skip that test for your source.
-{% endhint %}
+
+:::

### Step 9: Write unit tests and/or integration tests

diff --git a/docs/contributing-to-airbyte/README.md b/docs/contributing-to-airbyte/README.md
index 848cda65002b2..342cb0ba04979 100644
--- a/docs/contributing-to-airbyte/README.md
+++ b/docs/contributing-to-airbyte/README.md
@@ -60,9 +60,11 @@ It's easy to add your own connector to Airbyte! **Since Airbyte connectors are e

For sources, simply head over to our [Python CDK](../connector-development/cdk-python/).

-{% hint style="info" %}
+:::info
+
The CDK currently does not support creating destinations, but it will very soon.
-{% endhint %}
+
+:::

* See [Building new connectors](../connector-development/) to get started.
* Since we frequently build connectors in Python, on top of Singer or in Java, we've created generator libraries to get you started quickly: [Build Python Source Connectors](../connector-development/tutorials/building-a-python-source.md) and [Build Java Destination Connectors](../connector-development/tutorials/building-a-java-destination.md)

diff --git a/docs/contributing-to-airbyte/developing-locally.md b/docs/contributing-to-airbyte/developing-locally.md
index daa5b8ad01ecd..7bfdd219e0774 100644
--- a/docs/contributing-to-airbyte/developing-locally.md
+++ b/docs/contributing-to-airbyte/developing-locally.md
@@ -8,9 +8,11 @@ The following technologies are required to build Airbyte locally.

4. `Docker`
5. `Jq`

-{% hint style="info" %}
+:::info
+
Manually switching between different language versions can get hairy. We recommend using a version manager such as [`pyenv`](https://github.com/pyenv/pyenv) or [`jenv`](https://github.com/jenv/jenv).
-{% endhint %}
+
+:::

To start contributing:

@@ -26,9 +28,11 @@ To start contributing:

## Build with `gradle`

-{% hint style="info" %}
+:::info
+
If you're using Mac M1 \(Apple Silicon\) machines, you may run into some problems (Temporal failing during runs, and some connectors not working). See the [GitHub issue](https://github.com/airbytehq/airbyte/issues/2017) for more information.
-{% endhint %}
+
+:::

To compile and build just the platform \(not all the connectors\):

@@ -40,28 +44,34 @@ This will build all the code and run all the unit tests.

`SUB_BUILD=PLATFORM ./gradlew build` creates all the necessary artifacts \(Webapp, Jars and Docker images\) so that you can run Airbyte locally. Since this builds everything, it can take some time.

-{% hint style="info" %}
+:::info
+
Optionally, you may pass a `VERSION` environment variable to the gradle build command. If present, gradle will use this value as a tag for all created artifacts (both Jars and Docker images). If unset, gradle will default to using the current VERSION in `.env` for Jars, and `dev` as the Docker image tag.
-{% endhint %}
-{% hint style="info" %}
+
+:::
+
+:::info
+
Gradle will use all CPU cores by default. If Gradle uses too much/too little CPU, tuning the number of CPU cores it uses to better suit a dev's needs can help. Adjust this in one of the following ways:

1. Setting an env var: `export GRADLE_OPTS="-Dorg.gradle.workers.max=3"`.
2. Setting a cli option: `SUB_BUILD=PLATFORM ./gradlew build --max-workers 3`
3. Setting the `org.gradle.workers.max` property in the `gradle.properties` file.

A good rule of thumb is to set this to \(\# of cores - 1\).
-{% endhint %}
-{% hint style="info" %}
+
+:::
+
+:::info
+
On Mac, if you run into an error while compiling openssl \(this happens when running pip install\), you may need to explicitly add these flags to your bash profile so that the C compiler can find the appropriate libraries.

```text
export LDFLAGS="-L/usr/local/opt/openssl/lib"
export CPPFLAGS="-I/usr/local/opt/openssl/include"
```
-{% endhint %}
+
+:::

## Run in `dev` mode with `docker-compose`

diff --git a/docs/deploying-airbyte/local-deployment.md b/docs/deploying-airbyte/local-deployment.md
index c6d49cccd4938..d6ce39c0b1c6a 100644
--- a/docs/deploying-airbyte/local-deployment.md
+++ b/docs/deploying-airbyte/local-deployment.md
@@ -1,8 +1,9 @@

# Local Deployment

-{% hint style="info" %}
+:::info
These instructions have been tested on MacOS, Windows 10 and Ubuntu 20.04.
-{% endhint %}
+
+:::

## Setup & launch Airbyte

@@ -15,13 +16,14 @@

cd airbyte
docker-compose up
```

-{% hint style="info" %}
+:::info
If you're using Mac M1 \(Apple Silicon\) machines, you must [build and run Airbyte locally in dev mode](../contributing-to-airbyte/developing-locally.md). Some users using Macs with an M1 chip are facing problems running Airbyte even with a locally built version of the platform. You can subscribe to [Issue \#2017](https://github.com/airbytehq/airbyte/issues/2017) and get updates on M1-related issues.
-{% endhint %}
+
+:::

* In your browser, just visit [http://localhost:8000](http://localhost:8000)
* Start moving some data!
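If the UI does not come up right away, the sketch below shows one way to confirm the containers started cleanly; it assumes the default port mapping:

```bash
# List the Airbyte containers and check that the UI answers on port 8000
docker-compose ps
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000
```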
diff --git a/docs/deploying-airbyte/on-aws-ec2.md b/docs/deploying-airbyte/on-aws-ec2.md
index 645b77c111d4d..18da4e1608daf 100644
--- a/docs/deploying-airbyte/on-aws-ec2.md
+++ b/docs/deploying-airbyte/on-aws-ec2.md
@@ -1,8 +1,10 @@
-# On AWS \(EC2\)
+# On AWS (EC2)
+
+:::info

-{% hint style="info" %}
The instructions have been tested on `Amazon Linux 2 AMI (HVM)`
-{% endhint %}
+
+:::

## Create a new instance

@@ -46,9 +48,11 @@

## Install environment

-{% hint style="info" %}
+:::info
+
Note: The following commands will be entered either on your local terminal or in your ssh session on the instance terminal. The comments above each command block will indicate where to enter the commands.
-{% endhint %}
+
+:::

* Connect to your instance

@@ -106,15 +110,19 @@ docker-compose up -d

## Connect to Airbyte

-{% hint style="danger" %}
+:::danger
+
For security reasons, we strongly recommend not exposing Airbyte on publicly accessible Internet ports. Future versions will add support for SSL & Authentication.
-{% endhint %}
+
+:::

* Create ssh tunnel for port 8000

-{% hint style="info" %}
+:::info
+
If you want to use different ports you will need to modify `API_URL` in your `.env` file and restart Airbyte.
-{% endhint %}
+
+:::

```bash
# In your workstation terminal

diff --git a/docs/deploying-airbyte/on-aws-ecs.md b/docs/deploying-airbyte/on-aws-ecs.md
index 63ccca5c0ee21..6c6b0e29dbfa9 100644
--- a/docs/deploying-airbyte/on-aws-ecs.md
+++ b/docs/deploying-airbyte/on-aws-ecs.md
@@ -1,8 +1,10 @@
-# On AWS ECS \(Coming Soon\)
+# On AWS ECS (Coming Soon)
+
+:::info

-{% hint style="info" %}
We do not currently support deployment on ECS.
-{% endhint %}
+
+:::

The current iteration is not compatible with ECS. Airbyte currently relies on docker containers being able to create other docker containers. ECS does not permit containers to do this. We will be revising this strategy soon, so that we can be compatible with ECS and other container services.

diff --git a/docs/deploying-airbyte/on-azure-vm-cloud-shell.md b/docs/deploying-airbyte/on-azure-vm-cloud-shell.md
index 316e054f6a94e..fbb3c450f863d 100644
--- a/docs/deploying-airbyte/on-azure-vm-cloud-shell.md
+++ b/docs/deploying-airbyte/on-azure-vm-cloud-shell.md
@@ -1,8 +1,10 @@
-# On Azure\(VM\)
+# On Azure (VM)
+
+:::info

-{% hint style="info" %}
The instructions have been tested on `Azure VM Linux (ubuntu 18.04)`
-{% endhint %}
+
+:::

## Launch Azure Cloud Shell

@@ -118,13 +120,17 @@ sudo docker-compose up -d

## Connect to Airbyte

-{% hint style="danger" %}
+:::danger
+
For security reasons, we strongly recommend not exposing Airbyte on publicly accessible Internet ports. Future versions will add support for SSL & Authentication.
-{% endhint %}
-{% hint style="info" %}
+
+:::
+
+:::info
+
This part assumes that you have access to a terminal on your workstation
-{% endhint %}
+
+:::

* Create ssh tunnel for port 8000

diff --git a/docs/deploying-airbyte/on-digitalocean-droplet.md b/docs/deploying-airbyte/on-digitalocean-droplet.md
index 808920bba67e1..6db2bcd78d378 100644
--- a/docs/deploying-airbyte/on-digitalocean-droplet.md
+++ b/docs/deploying-airbyte/on-digitalocean-droplet.md
@@ -1,4 +1,4 @@
-# On DigitalOcean \(Droplet\)
+# On DigitalOcean (Droplet)

The instructions have been tested on `DigitalOcean Droplet ($5)`.
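The ssh tunnel step referenced in the EC2 and Azure guides above generally looks like the following sketch; the key path, user, and host are placeholders to adapt to your instance:

```bash
# Forward local port 8000 to the instance so the UI is reachable at
# http://localhost:8000 (SSH_KEY, user, and INSTANCE_IP are placeholders)
ssh -i $SSH_KEY -L 8000:localhost:8000 -N -f ec2-user@$INSTANCE_IP
```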
diff --git a/docs/deploying-airbyte/on-gcp-compute-engine.md b/docs/deploying-airbyte/on-gcp-compute-engine.md index 65378cfc538c0..c33d0a40654a5 100644 --- a/docs/deploying-airbyte/on-gcp-compute-engine.md +++ b/docs/deploying-airbyte/on-gcp-compute-engine.md @@ -1,8 +1,10 @@ -# On GCP \(Compute Engine\) +# On GCP (Compute Engine) + +:::info -{% hint style="info" %} The instructions have been tested on `Debian GNU/Linux 10 (buster)` -{% endhint %} + +::: ## Create a new instance @@ -20,9 +22,11 @@ The instructions have been tested on `Debian GNU/Linux 10 (buster)` ## Install environment -{% hint style="info" %} +:::info + Note: The following commands will be entered either on your local terminal or in your ssh session on the instance terminal. The comments above each command block will indicate where to enter the commands. -{% endhint %} + +::: * Set variables in your terminal @@ -116,9 +120,11 @@ docker-compose up -d ## Connect to Airbyte -{% hint style="danger" %} +:::danger + For security reasons, we strongly recommend to not expose Airbyte publicly. Future versions will add support for SSL & Authentication. -{% endhint %} + +::: * Create ssh tunnel. diff --git a/docs/deploying-airbyte/on-kubernetes.md b/docs/deploying-airbyte/on-kubernetes.md index d208f0760ba58..e2867f591eb2f 100644 --- a/docs/deploying-airbyte/on-kubernetes.md +++ b/docs/deploying-airbyte/on-kubernetes.md @@ -1,4 +1,4 @@ -# On Kubernetes \(Beta\) +# On Kubernetes (Beta) ## Overview diff --git a/docs/integrations/README.md b/docs/integrations/README.md index 2cd69adfdb3dd..ed17d625cf7fc 100644 --- a/docs/integrations/README.md +++ b/docs/integrations/README.md @@ -1,8 +1,10 @@ # Connector Catalog -{% hint style="info" %} +:::info + Some connectors on the following list are not yet available on Airbyte Cloud. -{% endhint %} + +::: ## Connector Release Stages diff --git a/docs/integrations/destinations/gcs.md b/docs/integrations/destinations/gcs.md index 3ada50c095946..72a837926d587 100644 --- a/docs/integrations/destinations/gcs.md +++ b/docs/integrations/destinations/gcs.md @@ -1,4 +1,4 @@ -# Google Cloud Storage \(GCS\) +# Google Cloud Storage (GCS) ## Overview diff --git a/docs/integrations/destinations/local-csv.md b/docs/integrations/destinations/local-csv.md index ec4b3a6a35d7c..f0b02d405af15 100644 --- a/docs/integrations/destinations/local-csv.md +++ b/docs/integrations/destinations/local-csv.md @@ -1,8 +1,10 @@ # Local CSV -{% hint style="danger" %} +:::danger + This destination is meant to be used on a local workstation and won't work on Kubernetes -{% endhint %} + +::: ## Overview diff --git a/docs/integrations/destinations/local-json.md b/docs/integrations/destinations/local-json.md index aea607f486d37..e885d9d3b9da1 100644 --- a/docs/integrations/destinations/local-json.md +++ b/docs/integrations/destinations/local-json.md @@ -1,8 +1,10 @@ # Local JSON -{% hint style="danger" %} +:::danger + This destination is meant to be used on a local workstation and won't work on Kubernetes -{% endhint %} + +::: ## Overview diff --git a/docs/integrations/destinations/mssql.md b/docs/integrations/destinations/mssql.md index 8346c156fb1de..b85e0b14c6eb4 100644 --- a/docs/integrations/destinations/mssql.md +++ b/docs/integrations/destinations/mssql.md @@ -11,9 +11,11 @@ ## Output Schema -{% hint style="warning" %} +:::caution + Tables in MSSQL destinations will be prefixed by `_airbyte_raw` due to the fact that MSSQL does not currently support basic normalization. 
This prefix cannot be removed and this is normal behavior. -{% endhint %} + +::: Each stream will be output into its own table in SQL Server. Each table will contain 3 columns: diff --git a/docs/integrations/destinations/redshift.md b/docs/integrations/destinations/redshift.md index f44eb2b92cb0c..4925b67d610be 100644 --- a/docs/integrations/destinations/redshift.md +++ b/docs/integrations/destinations/redshift.md @@ -45,9 +45,11 @@ You will need to choose an existing database or create a new database that will 2. Allow connections from Airbyte to your Redshift cluster \(if they exist in separate VPCs\) 3. A staging S3 bucket with credentials \(for the COPY strategy\). -{% hint style="info" %} +:::info + Even if your Airbyte instance is running on a server in the same VPC as your Redshift cluster, you may need to place them in the **same security group** to allow connections between the two. -{% endhint %} + +::: ### Setup guide diff --git a/docs/integrations/sources/drupal.md b/docs/integrations/sources/drupal.md index 7c144e87e86e0..f7e3591c75eb7 100644 --- a/docs/integrations/sources/drupal.md +++ b/docs/integrations/sources/drupal.md @@ -4,9 +4,11 @@ ## Sync overview -{% hint style="warning" %} +:::caution + You will only be able to connect to a self-hosted instance of Drupal using these instructions. -{% endhint %} + +::: Drupal can run on MySQL, Percona, MariaDb, MSSQL, MongoDB, Postgres, or SQL-Lite. If you're not using SQL-lite, you can use Airbyte to sync your Drupal instance by connecting to the underlying database using the appropriate Airbyte connector: @@ -15,9 +17,11 @@ Drupal can run on MySQL, Percona, MariaDb, MSSQL, MongoDB, Postgres, or SQL-Lite * [Mongo](mongodb-v2.md) * [Postgres](postgres.md) -{% hint style="info" %} +:::info + Reach out to your service representative or system admin to find the parameters required to connect to the underlying database -{% endhint %} + +::: ### Output schema diff --git a/docs/integrations/sources/google-ads.md b/docs/integrations/sources/google-ads.md index 23f75428ffa20..9a6b503b870e7 100644 --- a/docs/integrations/sources/google-ads.md +++ b/docs/integrations/sources/google-ads.md @@ -1,8 +1,10 @@ # Google Ads -{% hint style="warning" %} +:::caution + If you don't already have a developer token from Google Ads, make sure you follow the [instructions](google-ads.md#how-to-apply-for-the-developer-token) so your request doesn't get denied. -{% endhint %} + +::: ## Features diff --git a/docs/integrations/sources/http-request.md b/docs/integrations/sources/http-request.md index bce9807d6b8b3..ebefc687a7bd6 100644 --- a/docs/integrations/sources/http-request.md +++ b/docs/integrations/sources/http-request.md @@ -1,8 +1,10 @@ # HTTP Request (Graveyarded) -{% hint style="warning" %} +:::caution + This connector is graveyarded and will not be receiving any updates from the Airbyte team. Its functionalities have been replaced by the [Airbyte CDK](../../connector-development/cdk-python/README.md), which allows you to create source connectors for any HTTP API. -{% endhint %} + +::: ## Overview diff --git a/docs/integrations/sources/hubspot.md b/docs/integrations/sources/hubspot.md index 8428dd3b9b2b7..ffabe35eca3df 100644 --- a/docs/integrations/sources/hubspot.md +++ b/docs/integrations/sources/hubspot.md @@ -67,9 +67,11 @@ Depending on the type of engagement, different properties will be set for that o 2. Fill out a start date 3. You're done. 
-{% hint style="info" %} +:::info + HubSpot's API will [rate limit](https://developers.hubspot.com/docs/api/usage-details) the amount of records you can sync daily, so make sure that you are on the appropriate plan if you are planning on syncing more than 250,000 records per day. -{% endhint %} + +::: ### Requirements \(Airbyte Open-Source\) @@ -91,9 +93,11 @@ To obtain the API Key for the account, go to settings -> integrations \(under See HubSpot [docs](https://legacydocs.hubspot.com/docs/methods/oauth2/oauth2-quickstart) if you need help finding these fields -{% hint style="info" %} +:::info + HubSpot's API will [rate limit](https://developers.hubspot.com/docs/api/usage-details) the amount of records you can sync daily, so make sure that you are on the appropriate plan if you are planning on syncing more than 250,000 records per day. -{% endhint %} + +::: ## Rate Limiting & Performance diff --git a/docs/integrations/sources/magento.md b/docs/integrations/sources/magento.md index 69718d8dbcfc8..d398d0d363fde 100644 --- a/docs/integrations/sources/magento.md +++ b/docs/integrations/sources/magento.md @@ -6,9 +6,11 @@ Magento runs on MySQL. You can use Airbyte to sync your Magento instance by connecting to the underlying database using the [MySQL connector](mysql.md). -{% hint style="info" %} +:::info + Reach out to your service representative or system admin to find the parameters required to connect to the underlying database -{% endhint %} + +::: ### Output schema diff --git a/docs/integrations/sources/microsoft-dynamics-customer-engagement.md b/docs/integrations/sources/microsoft-dynamics-customer-engagement.md index 88ae5f951e0d9..7c0467f85c675 100644 --- a/docs/integrations/sources/microsoft-dynamics-customer-engagement.md +++ b/docs/integrations/sources/microsoft-dynamics-customer-engagement.md @@ -6,9 +6,11 @@ MS Dynamics Customer Engagement runs on [MSSQL](https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/deploy/system-requirements-required-technologies?view=op-9-1) database. You can use the [MSSQL connector](mssql.md) to sync your MS Dynamics Customer Engagement instance by connecting to the underlying database. -{% hint style="info" %} +:::info + Reach out to your service representative or system admin to find the parameters required to connect to the underlying database -{% endhint %} + +::: ### Output schema diff --git a/docs/integrations/sources/microsoft-dynamics-gp.md b/docs/integrations/sources/microsoft-dynamics-gp.md index d5a8317905aff..00c72bb5e480a 100644 --- a/docs/integrations/sources/microsoft-dynamics-gp.md +++ b/docs/integrations/sources/microsoft-dynamics-gp.md @@ -6,9 +6,11 @@ MS Dynamics GP runs on the [MSSQL](https://docs.microsoft.com/en-us/dynamics-gp/installation/installing-on-first-computer) database. You can use the [MSSQL connector](mssql.md) to sync your MS Dynamics GP instance by connecting to the underlying database. 
-{% hint style="info" %} +:::info + Reach out to your service representative or system admin to find the parameters required to connect to the underlying database -{% endhint %} + +::: ### Output schema diff --git a/docs/integrations/sources/microsoft-dynamics-nav.md b/docs/integrations/sources/microsoft-dynamics-nav.md index d98aa67282cdc..8b1a6fabe2a98 100644 --- a/docs/integrations/sources/microsoft-dynamics-nav.md +++ b/docs/integrations/sources/microsoft-dynamics-nav.md @@ -6,9 +6,11 @@ MS Dynamics NAV runs on the [MSSQL](https://docs.microsoft.com/en-us/dynamics-nav/installation-considerations-for-microsoft-sql-server) database. You can use the [MSSQL connector](mssql.md) to sync your MS Dynamics NAV instance by connecting to the underlying database. -{% hint style="info" %} +:::info + Reach out to your service representative or system admin to find the parameters required to connect to the underlying database -{% endhint %} + +::: ### Output schema diff --git a/docs/integrations/sources/mssql.md b/docs/integrations/sources/mssql.md index 276219142e68a..6e9de424edc72 100644 --- a/docs/integrations/sources/mssql.md +++ b/docs/integrations/sources/mssql.md @@ -1,4 +1,4 @@ -# Microsoft SQL Server \(MSSQL\) +# Microsoft SQL Server (MSSQL) ## Features diff --git a/docs/integrations/sources/okta.md b/docs/integrations/sources/okta.md index 99e07056785b6..1f3dbc64be9b7 100644 --- a/docs/integrations/sources/okta.md +++ b/docs/integrations/sources/okta.md @@ -43,9 +43,11 @@ The connector is restricted by normal Okta [requests limitation](https://develop In order to pull data out of your Okta instance, you need to create an [API Token](https://developer.okta.com/docs/guides/create-an-api-token/overview/). -{% hint style="info" %} +:::info + Different Okta APIs require different admin privilege levels. API tokens inherit the privilege level of the admin account used to create them -{% endhint %} + +::: 1. Sign in to your Okta organization as a user with [administrator privileges](https://help.okta.com/en/prod/okta_help_CSH.htm#ext_Security_Administrators) 2. Access the API page: In the Admin Console, select API from the Security menu and then select the Tokens tab. 
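Once a token is created, it can be sanity-checked with a direct API call; the domain below is a placeholder, and `SSWS` is Okta's token authorization scheme:

```bash
# Fetch a single user to verify the token works (replace domain and token):
curl -s -H "Authorization: SSWS ${OKTA_API_TOKEN}" \
  "https://your-domain.okta.com/api/v1/users?limit=1"
```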
diff --git a/docs/integrations/sources/oracle-peoplesoft.md b/docs/integrations/sources/oracle-peoplesoft.md
index 535bd1cd16dde..3e721147b3bb7 100644
--- a/docs/integrations/sources/oracle-peoplesoft.md
+++ b/docs/integrations/sources/oracle-peoplesoft.md
@@ -10,9 +10,11 @@ Oracle PeopleSoft can run on the [Oracle, MSSQL, or IBM DB2](https://docs.oracle

* [MSSQL](mssql.md)
* [Oracle](oracle.md)

-{% hint style="info" %}
+:::info
+
Reach out to your service representative or system admin to find the parameters required to connect to the underlying database
-{% endhint %}
+
+:::

### Output schema

diff --git a/docs/integrations/sources/oracle-siebel-crm.md b/docs/integrations/sources/oracle-siebel-crm.md
index 5a1e40c9f70f9..ee73bd1321602 100644
--- a/docs/integrations/sources/oracle-siebel-crm.md
+++ b/docs/integrations/sources/oracle-siebel-crm.md
@@ -10,9 +10,11 @@ Oracle Siebel CRM can run on the [Oracle, MSSQL, or IBM DB2](https://docs.oracle

* [MSSQL](mssql.md)
* [Oracle](oracle.md)

-{% hint style="info" %}
+:::info
+
Reach out to your service representative or system admin to find the parameters required to connect to the underlying database
-{% endhint %}
+
+:::

### Output schema

diff --git a/docs/integrations/sources/sap-business-one.md b/docs/integrations/sources/sap-business-one.md
index bfd3ac62e3bf6..9acbfb75fd2e0 100644
--- a/docs/integrations/sources/sap-business-one.md
+++ b/docs/integrations/sources/sap-business-one.md
@@ -6,9 +6,11 @@ SAP Business One can run on the MSSQL or SAP HANA databases. If your instance is

deployed on MSSQL, you can use Airbyte to sync your SAP Business One instance by using the [MSSQL connector](mssql.md).

-{% hint style="info" %}
+:::info
+
Reach out to your service representative or system admin to find the parameters required to connect to the underlying database
-{% endhint %}
+
+:::

### Output schema

diff --git a/docs/integrations/sources/slack.md b/docs/integrations/sources/slack.md
index f513bb6e49c4d..123a7c4a379b8 100644
--- a/docs/integrations/sources/slack.md
+++ b/docs/integrations/sources/slack.md
@@ -62,15 +62,19 @@ You can get more detailed information about this type of authentication by readi

### Setup guide

-{% hint style="info" %}
+:::info
+
If you are using a "legacy" Slack API, skip to the Legacy API Key section below.
-{% endhint %}
+
+:::

In order to pull data out of your Slack instance, you need to create a Slack App. This may sound daunting, but it is actually pretty straightforward. Slack supplies [documentation](https://api.slack.com/start) on how to build apps. Feel free to follow that if you want to do something fancy. We'll describe the steps we followed to create the Slack App for this tutorial.

-{% hint style="info" %}
+:::info
+
This tutorial assumes that you are an administrator on your Slack instance. If you are not, you will need to coordinate with your administrator on the steps that require setting permissions for your app.
-{% endhint %}
+
+:::

1. Go to the [apps page](https://api.slack.com/apps)
2. Click "Create New App"

diff --git a/docs/integrations/sources/spree-commerce.md b/docs/integrations/sources/spree-commerce.md
index 7e582a7856523..5271bfe99a032 100644
--- a/docs/integrations/sources/spree-commerce.md
+++ b/docs/integrations/sources/spree-commerce.md
@@ -9,9 +9,11 @@ Spree Commerce can run on the MySQL or Postgres databases.
You can use Airbyte to sync your Spree Commerce instance by connecting to the underlying database using the appropriate Airbyte connector:

* [MySQL](mysql.md)
* [Postgres](postgres.md)

-{% hint style="info" %}
+:::info
+
Reach out to your service representative or system admin to find the parameters required to connect to the underlying database
-{% endhint %}
+
+:::

### Output schema

diff --git a/docs/integrations/sources/sugar-crm.md b/docs/integrations/sources/sugar-crm.md
index 9a6f5ce9f05df..27e2960ff1056 100644
--- a/docs/integrations/sources/sugar-crm.md
+++ b/docs/integrations/sources/sugar-crm.md
@@ -4,9 +4,11 @@

## Sync overview

-{% hint style="warning" %}
+:::caution
+
You will only be able to connect to a self-hosted instance of Sugar CRM using these instructions.
-{% endhint %}
+
+:::

Sugar CRM can run on the MySQL, MSSQL, Oracle, or Db2 databases. You can use Airbyte to sync your Sugar CRM instance by connecting to the underlying database using the appropriate Airbyte connector:

@@ -15,13 +17,17 @@ Sugar CRM can run on the MySQL, MSSQL, Oracle, or Db2 databases. You can use Air

* [MSSQL](mssql.md)
* [Oracle](oracle.md)

-{% hint style="info" %}
+:::info
+
To use Oracle or DB2, you'll require an Enterprise or Ultimate Sugar subscription.
-{% endhint %}
-{% hint style="info" %}
+
+:::
+
+:::info
+
Reach out to your service representative or system admin to find the parameters required to connect to the underlying database
-{% endhint %}
+
+:::

### Output schema

diff --git a/docs/integrations/sources/wordpress.md b/docs/integrations/sources/wordpress.md
index 4a3a3c27ceff7..8d70e8fbfe607 100644
--- a/docs/integrations/sources/wordpress.md
+++ b/docs/integrations/sources/wordpress.md
@@ -6,9 +6,11 @@

Wordpress runs on a MySQL database. You can use Airbyte to sync your Wordpress instance by connecting to the underlying MySQL database and leveraging the [MySQL](mysql.md) connector.

-{% hint style="info" %}
+:::info
+
Reach out to your service representative or system admin to find the parameters required to connect to the underlying database
-{% endhint %}
+
+:::

### Output schema

diff --git a/docs/integrations/sources/zencart.md b/docs/integrations/sources/zencart.md
index 81b731d4d8265..4d52400355320 100644
--- a/docs/integrations/sources/zencart.md
+++ b/docs/integrations/sources/zencart.md
@@ -6,9 +6,11 @@

Zencart runs on a MySQL database. You can use Airbyte to sync your Zencart instance by connecting to the underlying MySQL database and leveraging the [MySQL](mysql.md) connector.

-{% hint style="info" %}
+:::info
+
Reach out to your service representative or system admin to find the parameters required to connect to the underlying database
-{% endhint %}
+
+:::

### Output schema

diff --git a/docs/operator-guides/configuring-airbyte-db.md b/docs/operator-guides/configuring-airbyte-db.md
index 5283c41ebe102..1b57f2ae90733 100644
--- a/docs/operator-guides/configuring-airbyte-db.md
+++ b/docs/operator-guides/configuring-airbyte-db.md
@@ -58,9 +58,11 @@ CONFIG_DATABASE_URL=jdbc:postgresql://<host>:<port>/<database>?<extra-parameters

## Initializing the database

-{% hint style="info" %}
+:::info
+
This step is only required when you set up Airbyte with a custom database for the first time.
-{% endhint %}
+
+:::

If you provide an empty database to Airbyte and start Airbyte up for the first time, the server will automatically create the relevant tables in your database, and copy the data.
Please make sure: diff --git a/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md b/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md index 9ea574b9fecfd..ff3cdaf32c140 100644 --- a/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md +++ b/docs/operator-guides/transformation-and-normalization/transformations-with-airbyte.md @@ -1,4 +1,4 @@ -# Transformations with Airbyte \(Part 3/3\) +# Transformations with Airbyte (Part 3/3) ## Overview diff --git a/docs/operator-guides/transformation-and-normalization/transformations-with-dbt.md b/docs/operator-guides/transformation-and-normalization/transformations-with-dbt.md index aabd62a406893..e7ea6b4158bb3 100644 --- a/docs/operator-guides/transformation-and-normalization/transformations-with-dbt.md +++ b/docs/operator-guides/transformation-and-normalization/transformations-with-dbt.md @@ -1,4 +1,4 @@ -# Transformations with dbt \(Part 2/3\) +# Transformations with dbt (Part 2/3) ## Overview diff --git a/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md b/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md index df085355c6e41..3f6c9357d2c1c 100644 --- a/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md +++ b/docs/operator-guides/transformation-and-normalization/transformations-with-sql.md @@ -1,4 +1,4 @@ -# Transformations with SQL \(Part 1/3\) +# Transformations with SQL (Part 1/3) ## Transformations with SQL \(Part 1/3\) diff --git a/docs/operator-guides/upgrading-airbyte.md b/docs/operator-guides/upgrading-airbyte.md index aa52f4627b7cc..cf1b6c6ba7ae3 100644 --- a/docs/operator-guides/upgrading-airbyte.md +++ b/docs/operator-guides/upgrading-airbyte.md @@ -53,9 +53,11 @@ If you use custom connectors, this upgrade requires your all of your connector s If you did not start Airbyte from the root of the Airbyte monorepo, you may run into issues where existing orphaned Airbyte configurations will prevent you from upgrading with the automatic process. To fix this, we will need to globally remove these lost Airbyte configurations. You can do this with `docker volume rm $(docker volume ls -q | grep airbyte)`. -{% hint style="danger" %} +:::danger + This will completely reset your Airbyte deployment back to scratch and you will lose all data. -{% endhint %} + +::: ## Upgrading on K8s \(0.27.0-alpha and above\) diff --git a/docs/operator-guides/using-the-airflow-airbyte-operator.md b/docs/operator-guides/using-the-airflow-airbyte-operator.md index a365bd3462a0e..e0297827fc510 100644 --- a/docs/operator-guides/using-the-airflow-airbyte-operator.md +++ b/docs/operator-guides/using-the-airflow-airbyte-operator.md @@ -6,9 +6,11 @@ description: Start triggering Airbyte jobs with Apache Airflow in minutes Airbyte is an official community provider for the Apache Airflow project. The Airbyte operator allows you to trigger synchronization jobs in Apache Airflow, and this tutorial will walk through configuring your Airflow DAG to do so. -{% hint style="warning" %} +:::caution + Due to some difficulties in setting up Airflow, we recommend first trying out the deployment using the local example [here](https://github.com/airbytehq/airbyte/tree/master/resources/examples/airflow), as it contains accurate configuration required to get the Airbyte operator up and running. 
-{% endhint %} + +::: The Airbyte Provider documentation on Airflow project can be found [here](https://airflow.apache.org/docs/apache-airflow-providers-airbyte/stable/index.html). diff --git a/docs/project-overview/changelog/platform.md b/docs/project-overview/changelog/platform.md index e93dc7d850768..d27e9246e3387 100644 --- a/docs/project-overview/changelog/platform.md +++ b/docs/project-overview/changelog/platform.md @@ -68,16 +68,20 @@ This is the changelog for Airbyte Platform. For our connector changelog, please ## [10-21-2021 - 0.30.22](https://github.com/airbytehq/airbyte/releases/tag/v0.30.22-alpha) * We now support experimental deployment of Airbyte on Macbooks with M1 chips! -{% hint style="info" %} +:::info + This interim patch period mostly contained stability changes for Airbyte Cloud, so we skipped from `0.30.16` to `0.30.22`. -{% endhint %} + +::: ## [10-07-2021 - 0.30.16](https://github.com/airbytehq/airbyte/releases/tag/v0.30.16-alpha) * On Kubernetes deployments, you can now configure the Airbyte Worker Pod's image pull policy. -{% hint style="info" %} +:::info + This interim patch period mostly contained stability changes for Airbyte Cloud, so we skipped from `0.30.2` to `0.30.16`. -{% endhint %} + +::: ## [09-30-2021 - 0.30.2](https://github.com/airbytehq/airbyte/releases/tag/v0.30.2-alpha) * Fixed a bug that would fail Airbyte upgrades for deployments with sync notifications. diff --git a/docs/quickstart/add-a-destination.md b/docs/quickstart/add-a-destination.md index 5802e1c6786da..1204df4038a15 100644 --- a/docs/quickstart/add-a-destination.md +++ b/docs/quickstart/add-a-destination.md @@ -6,9 +6,11 @@ The resulting files will be located in `/tmp/airbyte_local/json_data` To set it up, just follow the instructions on the screenshot below. -{% hint style="info" %} +:::info + You might have to wait ~30 seconds before the fields show up because it is the first time you're using Airbyte. -{% endhint %} + +:::  diff --git a/docs/quickstart/add-a-source.md b/docs/quickstart/add-a-source.md index 3d299d4b3194f..e6fbc14731093 100644 --- a/docs/quickstart/add-a-source.md +++ b/docs/quickstart/add-a-source.md @@ -6,9 +6,11 @@ Our demo source will pull data from an external API, which will pull down the in To set it up, just follow the instructions on the screenshot below. -{% hint style="info" %} +:::info + You might have to wait ~30 seconds before the fields show up because it is the first time you're using Airbyte. -{% endhint %} + +:::  diff --git a/docs/quickstart/getting-started.md b/docs/quickstart/getting-started.md index 82e994ec5d369..44cbeacb5be1f 100644 --- a/docs/quickstart/getting-started.md +++ b/docs/quickstart/getting-started.md @@ -30,9 +30,11 @@ The source we are creating will pull data from an external API. It will replicat To set it up, just follow the instructions on the screenshot below. -{% hint style="info" %} +:::info + You might have to wait ~30 seconds before the fields show up because it is the first time you're using Airbyte. -{% endhint %} + +:::  @@ -44,9 +46,11 @@ The resulting files will be located in `/tmp/airbyte_local/json_data` To set it up, just follow the instructions on the screenshot below. -{% hint style="info" %} +:::info + You might have to wait ~30 seconds before the fields show up because it is the first time you're using Airbyte. 
-{% endhint %} + +:::  diff --git a/docs/understanding-airbyte/basic-normalization.md b/docs/understanding-airbyte/basic-normalization.md index 4ae5615e1f7ff..a8127852c4323 100644 --- a/docs/understanding-airbyte/basic-normalization.md +++ b/docs/understanding-airbyte/basic-normalization.md @@ -2,9 +2,11 @@ ## High-Level Overview -{% hint style="info" %} +:::info + The high-level overview contains all the information you need to use Basic Normalization when pulling from APIs. Information past that can be read for advanced or educational purposes. -{% endhint %} + +::: When you run your first Airbyte sync without the basic normalization, you'll notice that your data gets written to your destination as one data column with a JSON blob that contains all of your data. This is the `_airbyte_raw_` table that you may have seen before. Why do we create this table? A core tenet of ELT philosophy is that data should be untouched as it moves through the E and L stages so that the raw data is always accessible. If an unmodified version of the data exists in the destination, it can be retransformed without needing to sync data again. diff --git a/docs/understanding-airbyte/cdc.md b/docs/understanding-airbyte/cdc.md index 18367d899a7cb..3b381493651dc 100644 --- a/docs/understanding-airbyte/cdc.md +++ b/docs/understanding-airbyte/cdc.md @@ -1,4 +1,4 @@ -# Change Data Capture \(CDC\) +# Change Data Capture (CDC) ## What is log-based incremental replication? diff --git a/docs/understanding-airbyte/glossary.md b/docs/understanding-airbyte/glossary.md index 7741893902c6b..37d017f25b76a 100644 --- a/docs/understanding-airbyte/glossary.md +++ b/docs/understanding-airbyte/glossary.md @@ -42,24 +42,30 @@ Airbyte spits out tables with the prefix `_airbyte_raw_`. This is your replicate ### AirbyteCatalog -{% hint style="info" %} +:::info + This is only relevant for individuals who want to create a connector. -{% endhint %} + +::: This refers to how you define the data that you can retrieve from a Source. For example, if you want to retrieve information from an API, the data that you can receive needs to be defined clearly so that Airbyte can have a clear expectation of what endpoints are supported and what the objects that the streams return look like. This is represented as a sort of schema that Airbyte can interpret. Learn more [here](beginners-guide-to-catalog.md). ### Airbyte Specification -{% hint style="info" %} +:::info + This is only relevant for individuals who want to create a connector. -{% endhint %} + +::: This refers to the functions that a Source or Destination must implement to successfully retrieve data and load it, respectively. Implementing these functions using the Airbyte Specification makes a Source or Destination work correctly. Learn more [here](airbyte-specification.md). ### Temporal -{% hint style="info" %} +:::info + This is only relevant for individuals who want to learn about or contribute to our underlying platform. -{% endhint %} + +::: [Temporal](https://temporal.io) is a development kit that lets you create workflows, parallelize them, and handle failures/retries gracefully. We use it to reliably schedule each step of the ELT process, and a Temporal service is always deployed with each Airbyte installation. 
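To make the `_airbyte_raw_` tables described earlier in this glossary concrete, here is a minimal sketch of inspecting one in a Postgres destination; the container, schema, and stream names are hypothetical:

```bash
# Peek at the raw JSON records Airbyte lands before normalization
# (container and table names are placeholders):
docker exec airbyte-destination psql -U postgres \
  -c "SELECT _airbyte_data FROM public._airbyte_raw_users LIMIT 5;"
```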
diff --git a/docs/understanding-airbyte/namespaces.md b/docs/understanding-airbyte/namespaces.md index 3fe323cc6e1a3..c2ff1af2ef4a7 100644 --- a/docs/understanding-airbyte/namespaces.md +++ b/docs/understanding-airbyte/namespaces.md @@ -2,9 +2,11 @@ ## High-Level Overview -{% hint style="info" %} +:::info + The high-level overview contains all the information you need to use Namespaces when pulling from APIs. Information past that can be read for advanced or educational purposes. -{% endhint %} + +::: When looking through our connector docs, you'll notice that some sources and destinations support "Namespaces." These allow you to organize and separate your data into groups in the destination if the destination supports it. In most cases, namespaces are schemas in the database you're replicating to. If your desired destination doesn't support it, you can ignore this feature.