diff --git a/docs/versioned_docs/version-7.0/auth/azure.md b/docs/versioned_docs/version-7.0/auth/azure.md
deleted file mode 100644
index ce872f914f25..000000000000
--- a/docs/versioned_docs/version-7.0/auth/azure.md
+++ /dev/null
@@ -1,182 +0,0 @@
----
-sidebar_label: Azure
----
-
-# Azure Active Directory Authentication
-
-To get started, run the setup command:
-
-```bash
-yarn rw setup auth azure-active-directory
-```
-
-This installs all the packages, writes all the files, and makes all the code
-modifications you need. For a detailed explanation of all the api- and web-side
-changes that aren't exclusive to Azure, see the top-level
-[Authentication](../authentication.md) doc. For now, let's focus on Azure's
-side of things.
-
-Follow the steps in [Single-page application: App registration](https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-spa-app-registration).
-After registering your app, you'll be redirected to its "Overview" section.
-We're interested in two credentials here, "Application (client) ID" and "Directory (tenant) ID".
-Go ahead and copy "Application (client) ID" to your `.env` file as `AZURE_ACTIVE_DIRECTORY_CLIENT_ID`.
-But "Directory (tenant) ID" needs a bit more explanation.
-
-Azure has an option called "Authority". It's a URL that specifies a directory that MSAL (Microsoft Authentication Library) can request tokens from.
-You can read more about it [here](https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-client-application-configuration#authority),
-but to cut to the chase, you probably want `https://login.microsoftonline.com/${tenantId}` as your Authority, where `tenantId` is "Directory (tenant) ID".
-
-After substituting your app's "Directory (tenant) ID" in the URL, add it to your `.env` file as `AZURE_ACTIVE_DIRECTORY_AUTHORITY`.
-All together now:
-
-```bash title=".env"
-AZURE_ACTIVE_DIRECTORY_CLIENT_ID="..."
-# Where `tenantId` is your app's "Directory (tenant) ID"
-AZURE_ACTIVE_DIRECTORY_AUTHORITY="https://login.microsoftonline.com/${tenantId}"
-```
-
-Ok, back to [Single-page application: App registration](https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-spa-app-registration).
-At the end, it says...
-
-> Next, configure the app registration with a Redirect URI to specify where the Microsoft identity platform should redirect the client along with any security tokens.
-> Use the steps appropriate for the version of MSAL.js you're using in your application:
->
-> - MSAL.js 2.0 with auth code flow (recommended)
-> - MSAL.js 1.0 with implicit flow
-
-Redwood uses [MSAL.js 2.0 with auth code flow](https://learn.microsoft.com/en-us/azure/active-directory/develop/scenario-spa-app-registration#redirect-uri-msaljs-20-with-auth-code-flow), so follow the steps there next.
-When it asks you for a Redirect URI, enter `http://localhost:8910` and `http://localhost:8910/login`, and copy these into your `.env` file as `AZURE_ACTIVE_DIRECTORY_REDIRECT_URI` and `AZURE_ACTIVE_DIRECTORY_LOGOUT_REDIRECT_URI`:
-
-:::tip Can't add multiple URIs?
-
-Configure one, then you'll be able to configure another.
-
-:::
-
-```bash title=".env"
-AZURE_ACTIVE_DIRECTORY_CLIENT_ID="..."
-# Where `tenantId` is your app's "Directory (tenant) ID"
-AZURE_ACTIVE_DIRECTORY_AUTHORITY="https://login.microsoftonline.com/${tenantId}"
-AZURE_ACTIVE_DIRECTORY_REDIRECT_URI="http://localhost:8910"
-AZURE_ACTIVE_DIRECTORY_LOGOUT_REDIRECT_URI="http://localhost:8910/login"
-```
-
-That's it for .env vars. Don't forget to include them in the `includeEnvironmentVariables` array in `redwood.toml`:
-
-```toml title="redwood.toml"
-[web]
- # ...
- includeEnvironmentVariables = [
- "AZURE_ACTIVE_DIRECTORY_CLIENT_ID",
- "AZURE_ACTIVE_DIRECTORY_AUTHORITY",
- "AZURE_ACTIVE_DIRECTORY_REDIRECT_URI",
- "AZURE_ACTIVE_DIRECTORY_LOGOUT_REDIRECT_URI",
- ]
-```
-
-Now let's make sure everything works: if this is a brand new project, generate
-a home page. There we'll try to sign up by destructuring `signUp` from the
-`useAuth` hook (import that from `'src/auth'`). We'll also destructure and
-display `isAuthenticated` to see if it worked:
-
-```
-yarn rw g page home /
-```
-
-```tsx title="web/src/pages/HomePage.tsx"
-import { useAuth } from 'src/auth'
-
-const HomePage = () => {
- const { isAuthenticated, signUp } = useAuth()
-
-  return (
-    <>
-      {/* MetaTags, h1, paragraphs, etc. */}
-
-      <p>{JSON.stringify({ isAuthenticated })}</p>
-    </>
-  )
-}
-```
-
-## Roles
-
-To add roles exposed via the `roles` claim, follow [Add app roles to your application and receive them in the token](https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-add-app-roles-in-azure-ad-apps).
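
Once app roles are configured, they surface on the decoded token's `roles` claim, which is what `hasRole` checks against. A quick sketch of the shape (the claim values below are hypothetical):

```typescript
// Hypothetical decoded access token after configuring app roles in Azure.
// The `roles` claim is a plain array of the role values you defined.
const decoded = {
  sub: 'abc123',
  roles: ['Admin', 'Writer'],
}

// On the web side, hasRole('Admin') boils down to a membership check like this
const isAdmin = decoded.roles.includes('Admin')
console.log(isAdmin) // true
```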
-
-## `logIn` Options
-
-`options` in `logIn(options?)` is of type [RedirectRequest](https://azuread.github.io/microsoft-authentication-library-for-js/ref/modules/_azure_msal_browser.html#redirectrequest) and is a good place to pass in optional [scopes](https://docs.microsoft.com/en-us/graph/permissions-reference#user-permissions) to be authorized.
-By default, MSAL sets `scopes` to [/.default](https://docs.microsoft.com/en-us/azure/active-directory/develop/v2-permissions-and-consent#the-default-scope), a built-in scope that every application has and that refers to the static list of permissions configured on the application registration. Furthermore, MSAL adds `openid` and `profile` to all requests. In the example below we explicitly include `User.Read.All` in the login scope.
-
-```jsx
-await logIn({
- scopes: ['User.Read.All'], // becomes ['openid', 'profile', 'User.Read.All']
-})
-```
-
-See [loginRedirect](https://azuread.github.io/microsoft-authentication-library-for-js/ref/classes/_azure_msal_browser.publicclientapplication.html#loginredirect), [PublicClientApplication](https://azuread.github.io/microsoft-authentication-library-for-js/ref/classes/_azure_msal_browser.publicclientapplication.html) class and [Scopes Behavior](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-core/docs/scopes.md#scopes-behavior) for more documentation.
-
-## `getToken` Options
-
-`options` in `getToken(options?)` is of type [RedirectRequest](https://azuread.github.io/microsoft-authentication-library-for-js/ref/modules/_azure_msal_browser.html#redirectrequest).
-By default, `getToken` will be called with scope `['openid', 'profile']`.
-Since Azure Active Directory applies [incremental consent](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/resources-and-scopes.md#dynamic-scopes-and-incremental-consent), we can extend the permissions from the login example by including another scope, for example `Mail.Read`:
-
-```js
-await getToken({
- scopes: ['Mail.Read'], // becomes ['openid', 'profile', 'User.Read.All', 'Mail.Read']
-})
-```
-
-See [acquireTokenSilent](https://azuread.github.io/microsoft-authentication-library-for-js/ref/classes/_azure_msal_browser.publicclientapplication.html#acquiretokensilent), [Resources and Scopes](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/resources-and-scopes.md#resources-and-scopes) or [full class documentation](https://pub.dev/documentation/msal_js/latest/msal_js/PublicClientApplication-class.html#constructors) for more.
-
-## Azure Active Directory B2C-specific configuration
-
-You can design your own auth flow with Azure Active Directory B2C using [hosted user flows](https://docs.microsoft.com/en-us/azure/active-directory-b2c/add-sign-up-and-sign-in-policy?pivots=b2c-user-flow).
-Using it requires two extra settings.
-
-#### Update the .env file:
-
-```bash title=".env"
-AZURE_ACTIVE_DIRECTORY_AUTHORITY=https://{{your-microsoft-tenant-name}}.b2clogin.com/{{your-microsoft-tenant-name}}.onmicrosoft.com/{{your-microsoft-user-flow-id}}
-AZURE_ACTIVE_DIRECTORY_JWT_ISSUER=https://{{your-microsoft-tenant-name}}.b2clogin.com/{{your-microsoft-tenant-id}}/v2.0/
-AZURE_ACTIVE_DIRECTORY_KNOWN_AUTHORITY=https://{{your-microsoft-tenant-name}}.b2clogin.com
-```
-
-Here's an example:
-
-```bash title=".env.example"
-AZURE_ACTIVE_DIRECTORY_AUTHORITY=https://rwauthtestb2c.b2clogin.com/rwauthtestb2c.onmicrosoft.com/B2C_1_signupsignin1
-AZURE_ACTIVE_DIRECTORY_JWT_ISSUER=https://rwauthtestb2c.b2clogin.com/775527ef-8a37-4307-8b3d-cc311f58d922/v2.0/
-AZURE_ACTIVE_DIRECTORY_KNOWN_AUTHORITY=https://rwauthtestb2c.b2clogin.com
-```
-
-And don't forget to add `AZURE_ACTIVE_DIRECTORY_KNOWN_AUTHORITY` to the `includeEnvironmentVariables` array in `redwood.toml`.
-(`AZURE_ACTIVE_DIRECTORY_JWT_ISSUER` is only used on the API side. But more importantly, it's sensitive—do *not* include it in the web side.)
-
-#### Update `activeDirectoryClient` instance
-
-This lets the MSAL web-side client know about our new B2C authority:
-
-```jsx title="web/src/auth.{js,ts}"
-const azureActiveDirectoryClient = new PublicClientApplication({
- auth: {
- clientId: process.env.AZURE_ACTIVE_DIRECTORY_CLIENT_ID,
- authority: process.env.AZURE_ACTIVE_DIRECTORY_AUTHORITY,
- redirectUri: process.env.AZURE_ACTIVE_DIRECTORY_REDIRECT_URI,
- postLogoutRedirectUri:
- process.env.AZURE_ACTIVE_DIRECTORY_LOGOUT_REDIRECT_URI,
- // highlight-next-line
- knownAuthorities: [process.env.AZURE_ACTIVE_DIRECTORY_KNOWN_AUTHORITY]
- },
-})
-```
-
-Now you can call the `logIn` and `logOut` functions from `useAuth()`, and everything should just work.
-
-Here are a few more links to relevant documentation for reference:
-- [Overview of tokens in Azure Active Directory B2C](https://docs.microsoft.com/en-us/azure/active-directory-b2c/tokens-overview)
-- [Working with MSAL.js and Azure AD B2C](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/working-with-b2c.md)
diff --git a/docs/versioned_docs/version-7.0/authentication.md b/docs/versioned_docs/version-7.0/authentication.md
deleted file mode 100644
index 91bb3184de3a..000000000000
--- a/docs/versioned_docs/version-7.0/authentication.md
+++ /dev/null
@@ -1,203 +0,0 @@
----
-description: Set up an authentication provider
----
-
-# Authentication
-
-Redwood has integrated auth end to end, from the web side to the api side.
-On the web side, the router can protect pages via the `PrivateSet` component, and even restrict access at the role-level.
-And if you'd prefer to work with the primitives, the `useAuth` hook exposes all the pieces to build the experience you want.
-
-Likewise, the api side is locked down by default: all SDLs are generated with the `@requireAuth` directive, ensuring that making things publicly available is something that you opt in to rather than out of.
-You can also require auth anywhere in your Services, and even in your serverful or serverless functions.
-
-Last but not least, Redwood provides its own self-hosted, full-featured auth provider: [dbAuth](./auth/dbauth.md).
-
-In this doc, we'll cover auth at a high level.
-All auth providers share the same interface so the information here will be useful no matter which auth provider you use.
-
-## Official integrations
-
-Redwood has a simple API to integrate any auth provider you can think of. But to make it easier for you to get started, Redwood provides official integrations for some of the most popular auth providers out of the box:
-
-- [Auth0](./auth/auth0.md)
-- [Azure Active Directory](./auth/azure.md)
-- [Clerk](./auth/clerk.md)
-- [Firebase](./auth/firebase.md)
-- [Netlify](./auth/netlify.md)
-- [Supabase](./auth/supabase.md)
-- [SuperTokens](./auth/supertokens.md)
-
-:::tip how to tell if an integration is official
-
-To tell if an integration is official, look for the `@redwoodjs` scope.
-For example, Redwood's Auth0 integration comprises two npm packages: `@redwoodjs/auth-auth0-web` and `@redwoodjs/auth-auth0-api`.
-
-:::
-
-Other than bearing the `@redwoodjs` scope, the reason these providers are official is that we're committed to keeping them up to date.
-You can set up any of them via the corresponding auth setup command:
-
-```
-yarn rw setup auth auth0
-```
-
-## The API at a high-level
-
-We mentioned that Redwood has a simple API you can use to integrate any provider you want.
-Whether you roll your own auth provider or choose one of Redwood's integrations, it's good to be familiar with it, so let's dive into it here.
-
-On the web side, there are two components that can be auth enabled: the `RedwoodApolloProvider` in `web/src/App.tsx` and the `Router` in `web/src/Routes.tsx`.
-Both take a `useAuth` prop. If provided, they'll use this hook to get information about the app's auth state. The `RedwoodApolloProvider` uses it to get a token to include in every GraphQL request, and the `Router` uses it to determine if a user has access to private or role-restricted routes.
-
-When you set up an auth provider, the setup command makes a new file, `web/src/auth.ts`. This file's job is to create the `AuthProvider` component and the `useAuth` hook by integrating with the auth provider of your choice. Whenever you need access to the auth context, you'll import the `useAuth` hook from this file. The `RedwoodApolloProvider` and the `Router` are no exceptions:
-
-![web-side-auth](https://user-images.githubusercontent.com/32992335/208549951-469617d7-c798-4d9a-8a29-46efe23cca6a.png)
-
-Once auth is set up on the web side, every GraphQL request includes a JWT (JSON Web Token).
-The api side needs a way of verifying and decoding this token if it's to do anything with it.
-There are two steps to this process:
-
-- decoding the token
-- mapping it into a user object
-
-The `createGraphQLHandler` function in `api/src/functions/graphql.ts` takes two props, `authDecoder` and `getCurrentUser`, for each of these steps (respectively):
-
-```ts title="api/src/functions/graphql.ts"
-// highlight-next-line
-import { authDecoder } from '@redwoodjs/auth-auth0-api'
-import { createGraphQLHandler } from '@redwoodjs/graphql-server'
-
-import directives from 'src/directives/**/*.{js,ts}'
-import sdls from 'src/graphql/**/*.sdl.{js,ts}'
-import services from 'src/services/**/*.{js,ts}'
-
-// highlight-next-line
-import { getCurrentUser } from 'src/lib/auth'
-import { db } from 'src/lib/db'
-import { logger } from 'src/lib/logger'
-
-export const handler = createGraphQLHandler({
- // highlight-start
- authDecoder,
- getCurrentUser,
- // highlight-end
- loggerConfig: { logger, options: {} },
- directives,
- sdls,
- services,
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-### Destructuring the `useAuth` hook
-
-That was auth at a high level.
-Now for a few more details on something you'll probably use a lot, the `useAuth` hook.
-
-The `useAuth` hook provides a streamlined interface to your auth provider's client SDK.
-Most of the functions it returns are self-explanatory, but the options they take depend on the auth provider:
-
-| Name | Description |
-| :---------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `client` | The client instance used in creating the auth provider. Most of the functions here use this under the hood |
-| `currentUser` | An object containing information about the current user as set on the `api` side, or if the user isn't authenticated, `null` |
-| `getToken` | Returns a JWT |
-| `hasRole` | Determines if the current user is assigned a role like `"admin"` or assigned to any of the roles in an array |
-| `isAuthenticated` | A boolean indicating whether or not the user is authenticated |
-| `loading` | If the auth context is loading |
-| `logIn` | Logs a user in |
-| `logOut` | Logs a user out |
-| `reauthenticate` | Refetch auth data and context. (This one is called internally and shouldn't be something you have to reach for often) |
-| `signUp` | Signs a user up |
-| `userMetadata` | An object containing the user's metadata (or profile information), fetched directly from an instance of the auth provider client. Or if the user isn't authenticated, `null` |
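
As a sketch of `hasRole`'s matching behavior described in the table (this is an illustration, not Redwood's actual implementation): a string argument must match one of the current user's roles, and an array argument matches if any element does:

```typescript
// Illustration of hasRole's matching rules (not Redwood's source)
const matchesRole = (
  userRoles: string[],
  rolesToCheck: string | string[]
): boolean => {
  if (typeof rolesToCheck === 'string') {
    return userRoles.includes(rolesToCheck)
  }
  // Array form: any single match is enough
  return rolesToCheck.some((role) => userRoles.includes(role))
}

console.log(matchesRole(['admin'], 'admin')) // true
console.log(matchesRole(['editor'], ['author', 'editor'])) // true
console.log(matchesRole(['editor'], 'admin')) // false
```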
-
-### Protecting routes
-
-You can require that a user be authenticated to navigate to a route by wrapping it in the `PrivateSet` component.
-An unauthenticated user will be redirected to the route specified in the component's `unauthenticated` prop:
-
-```tsx title="web/src/Routes.tsx"
-import { Router, Route, PrivateSet } from '@redwoodjs/router'
-
-const Routes = () => {
-  return (
-    <Router useAuth={useAuth}>
-      <Route path="/" page={HomePage} name="home" />
-      // highlight-next-line
-      <PrivateSet unauthenticated="home">
-        <Route path="/admin" page={AdminPage} name="admin" />
-      </PrivateSet>
-    </Router>
-  )
-}
-```
-
-You can also restrict access by role by passing a role or an array of roles to the `PrivateSet` component's `hasRole` prop:
-
-```tsx title="web/src/Routes.tsx"
-import { Router, Route, PrivateSet } from '@redwoodjs/router'
-
-const Routes = () => {
-  return (
-    <Router useAuth={useAuth}>
-      <Route path="/" page={HomePage} name="home" />
-
-      <PrivateSet unauthenticated="home">
-        <Route path="/secret" page={SecretPage} name="secret" />
-      </PrivateSet>
-
-      // highlight-next-line
-      <PrivateSet unauthenticated="home" hasRole="admin">
-        <Route path="/admin" page={AdminPage} name="admin" />
-      </PrivateSet>
-
-      // highlight-next-line
-      <PrivateSet unauthenticated="home" hasRole={['author', 'editor']}>
-        <Route path="/posts" page={PostsPage} name="posts" />
-      </PrivateSet>
-    </Router>
-  )
-}
-```
-
-### api-side currentUser
-
-We briefly mentioned that GraphQL requests include an `Authorization` header in every request when a user is authenticated.
-The api side verifies and decodes the token in this header via the `authDecoder` function.
-While information about the user is technically available at this point, it's still pretty raw.
-You can map it into a real user object via the `getCurrentUser` function.
-Both these functions are passed to the `createGraphQLHandler` function in `api/src/functions/graphql.ts`:
-
-```ts title="api/src/functions/graphql.ts"
-export const handler = createGraphQLHandler({
- authDecoder,
- getCurrentUser,
- // ...
-})
-
-```
-
-If you're using one of Redwood's official integrations, `authDecoder` comes from the corresponding integration package (in auth0's case, `@redwoodjs/auth-auth0-api`):
-
-```ts
-import { authDecoder } from '@redwoodjs/auth-auth0-api'
-```
-
-If you're rolling your own, you'll have to write it yourself. See the [Custom Auth](./auth/custom.md#api-side) docs for an example.
-
-It's always up to you to write `getCurrentUser`, though the setup command will stub it out for you in `api/src/lib/auth.ts` with plenty of guidance.
-
-`getCurrentUser`'s return is made globally available in the api side's context via `context.currentUser` for convenience.
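
As an illustration, a minimal `getCurrentUser` might just map the decoded token onto the shape your app expects. The field names below are assumptions; a real implementation often looks the user up in the database:

```typescript
// Minimal sketch: map a decoded JWT into a currentUser object.
// Field names are illustrative; real apps usually query the database here.
type Decoded = { sub: string; email?: string; roles?: string[] } | null

const getCurrentUser = async (decoded: Decoded) => {
  if (!decoded) {
    // Unauthenticated requests get a null currentUser
    return null
  }
  return {
    id: decoded.sub,
    email: decoded.email,
    roles: decoded.roles ?? [],
  }
}

getCurrentUser({ sub: 'abc123', email: 'user@example.com' }).then((user) =>
  console.log(user)
)
```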
-
-### Locking down the GraphQL api
-
-Use the `requireAuth` and `skipAuth` [GraphQL directives](directives#secure-by-default-with-built-in-directives) to protect individual GraphQL calls.
diff --git a/docs/versioned_docs/version-7.0/cli-commands.md b/docs/versioned_docs/version-7.0/cli-commands.md
deleted file mode 100644
index ca8e967aadbd..000000000000
--- a/docs/versioned_docs/version-7.0/cli-commands.md
+++ /dev/null
@@ -1,2285 +0,0 @@
----
-description: A comprehensive reference of Redwood's CLI
----
-
-# Command Line Interface
-
-The following is a comprehensive reference of the Redwood CLI. You can get a glimpse of all the commands by scrolling the aside to the right.
-
-The Redwood CLI has two entry-point commands:
-
-1. **redwood** (alias `rw`), which is for developing an application, and
-2. **redwood-tools** (alias `rwt`), which is for contributing to the framework.
-
-This document covers the `redwood` command. For `redwood-tools`, see [Contributing](https://github.com/redwoodjs/redwood/blob/main/CONTRIBUTING.md#cli-reference-redwood-tools) in the Redwood repo.
-
-**A Quick Note on Syntax**
-
-We use [yargs](http://yargs.js.org/) and borrow its syntax here:
-
-```
-yarn redwood generate page <name> [path] --option
-```
-
-- `redwood g page` is the command.
-- `<name>` and `[path]` are positional arguments.
- - `<>` denotes a required argument.
- - `[]` denotes an optional argument.
-- `--option` is an option.
-
-Every argument and option has a type. Here `<name>` and `[path]` are strings and `--option` is a boolean.
-
-You'll also sometimes see arguments with trailing `..` like:
-
-```
-yarn redwood build [side..]
-```
-
-The `..` operator indicates that the argument accepts an array of values. See [Variadic Positional Arguments](https://github.com/yargs/yargs/blob/master/docs/advanced.md#variadic-positional-arguments).
-
-## create redwood-app
-
-Create a Redwood project using the yarn create command:
-
-```
-yarn create redwood-app [option]
-```
-
-| Arguments & Options | Description |
-| :--------------------- | :---------------------------------------------------------------------------------------------------------------------- |
-| `project directory` | Specify the project directory [Required] |
-| `--yarn-install` | Enables the yarn install step and version-requirement checks. You can pass `--no-yarn-install` to disable this behavior |
-| `--typescript`, `--ts` | Generate a TypeScript project. JavaScript by default |
-| `--overwrite` | Create the project even if the specified project directory isn't empty |
-| `--no-telemetry` | Disable sending telemetry events for this create command and all Redwood CLI commands: https://telemetry.redwoodjs.com |
-| `--yarn1` | Use yarn 1 instead of yarn 3 |
-| `--git-init`, `--git` | Initialize a git repo during the install process, disabled by default |
-
-If you run into trouble during the yarn install step, which may happen if you're developing on an external drive and in other miscellaneous scenarios, try the `--yarn1` flag:
-
-```
-yarn create redwood-app my-redwood-project --yarn1
-```
-
-## build
-
-Build for production.
-
-```bash
-yarn redwood build [side..]
-```
-
-We use Babel to transpile the api side into `./api/dist` and Vite to bundle the web side into `./web/dist`.
-
-| Arguments & Options | Description |
-| :------------------ | :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `side` | Which side(s) to build. Choices are `api` and `web`. Defaults to `api` and `web` |
-| `--verbose, -v` | Print more information while building |
-
-#### Usage
-
-See [Builds](builds.md).
-
-#### Example
-
-Running `yarn redwood build` without any arguments generates the Prisma client and builds both sides of your project:
-
-```bash
-~/redwood-app$ yarn redwood build
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood build
- ✔ Generating the Prisma client...
- ✔ Building "api"...
- ✔ Building "web"...
-Done in 17.37s.
-```
-
-Files are output to each side's `dist` directory:
-
-```plaintext {2,6}
-├── api
-│ ├── dist
-│ ├── prisma
-│ └── src
-└── web
- ├── dist
- ├── public
- └── src
-```
-
-## check (alias diagnostics)
-
-Get structural diagnostics for a Redwood project (experimental).
-
-```
-yarn redwood check
-```
-
-#### Example
-
-```bash
-~/redwood-app$ yarn redwood check
-yarn run v1.22.4
-web/src/Routes.js:14:5: error: You must specify a 'notfound' page
-web/src/Routes.js:14:19: error: Duplicate Path
-web/src/Routes.js:15:19: error: Duplicate Path
-web/src/Routes.js:17:40: error: Page component not found
-web/src/Routes.js:17:19: error (INVALID_ROUTE_PATH_SYNTAX): Error: Route path contains duplicate parameter: "/{id}/{id}"
-```
-
-## console (alias c)
-
-Launch an interactive Redwood shell (experimental):
-
-- This has not yet been tested on Windows.
-- The Prisma Client must be generated _prior_ to running this command, e.g. `yarn redwood prisma generate`. This is a known issue.
-
-```
-yarn redwood console
-```
-
-Right now, you can only use the Redwood console to interact with your database (always with `await`):
-
-#### Example
-
-```bash
-~/redwood-app$ yarn redwood console
-yarn run v1.22.4
-> await db.user.findMany()
-> [ { id: 1, email: 'tom@redwoodjs.com', name: 'Tom' } ]
-```
-
-## data-migrate
-
-Data migration tools.
-
-```bash
-yarn redwood data-migrate
-```
-
-| Command | Description |
-| :-------- | :------------------------------------------------------------------------------------------ |
-| `install` | Appends `DataMigration` model to `schema.prisma`, creates `api/db/dataMigrations` directory |
-| `up` | Executes outstanding data migrations |
-
-### data-migrate install
-
-- Appends a `DataMigration` model to `schema.prisma` for tracking which data migrations have already run.
-- Creates a DB migration using `yarn redwood prisma migrate dev --create-only create_data_migrations`.
-- Creates `api/db/dataMigrations` directory to contain data migration scripts
-
-```bash
-yarn redwood data-migrate install
-```
-
-### data-migrate up
-
-Executes outstanding data migrations against the database. Compares the list of files in `api/db/dataMigrations` to the records in the `DataMigration` table in the database and executes any files not present.
-
-If an error occurs during script execution, any remaining scripts are skipped and console output will let you know the error and how many subsequent scripts were skipped.
-
-```bash
-yarn redwood data-migrate up
-```
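
The comparison is conceptually simple. A sketch (hypothetical file names; not Redwood's actual implementation):

```typescript
// Sketch of "which data migrations are outstanding": filenames on disk
// minus entries already recorded in the DataMigration table.
const pendingMigrations = (onDisk: string[], applied: string[]): string[] =>
  onDisk.filter((file) => !applied.includes(file)).sort()

console.log(
  pendingMigrations(
    ['20230101-seed-users.ts', '20230201-backfill-emails.ts'],
    ['20230101-seed-users.ts']
  )
) // [ '20230201-backfill-emails.ts' ]
```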
-
-## dev
-
-Start development servers for api and web.
-
-```bash
-yarn redwood dev [side..]
-```
-
-`yarn redwood dev api` starts the Redwood dev server and `yarn redwood dev web` starts the Vite dev server with Redwood's config.
-
-| Argument | Description |
-| :----------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| `side` | Which dev server(s) to start. Choices are `api` and `web`. Defaults to `api` and `web` |
-| `--forward, --fwd` | String of one or more Vite Dev Server config options. See example usage below |
-
-#### Usage
-
-If you're only working on your sdl and services, you can run just the api server to get GraphQL Playground on port 8911:
-
-```bash
-~/redwood-app$ yarn redwood dev api
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood dev api
-$ /redwood-app/node_modules/.bin/dev-server
-15:04:51 api | Listening on http://localhost:8911
-15:04:51 api | Watching /home/dominic/projects/redwood/redwood-app/api
-15:04:51 api |
-15:04:51 api | Now serving
-15:04:51 api |
-15:04:51 api | ► http://localhost:8911/graphql/
-```
-
-Using `--forward` (alias `--fwd`), you can pass one or more Vite dev server [config options](https://vitejs.dev/config/server-options.html). The following will run the dev server, set the port to `1234`, and disable automatic browser opening.
-
-```bash
-~/redwood-app$ yarn redwood dev --fwd="--port=1234 --open=false"
-```
-
-You may need to access your dev application from a different host, like your mobile device or an SSH tunnel. To resolve the “Invalid Host Header” message, run the following:
-
-```bash
-~/redwood-app$ yarn redwood dev --fwd="--allowed-hosts all"
-```
-
-For the full list of Vite dev server settings, see [this documentation](https://vitejs.dev/config/server-options.html).
-
-For the full list of Server Configuration settings, see [this documentation](app-configuration-redwood-toml.md#api).
-
-## deploy
-
-Deploy your redwood project to a hosting provider target.
-
-**Netlify, Vercel, and Render**
-
-For hosting providers that auto deploy from Git, the deploy command runs the set of steps to build, apply production DB changes, and apply data migrations. In this context, it is often referred to as a Build Command. _Note: for Render, which uses traditional infrastructure, the command also starts Redwood's api server._
-
-**AWS**
-
-This command runs the steps to both build your project _and_ deploy it to AWS.
-
-```
-yarn redwood deploy
-```
-
-| Commands | Description |
-| :---------------------------- | :--------------------------------------- |
-| `serverless ` | Deploy to AWS using Serverless framework |
-| `netlify [...commands]` | Build command for Netlify deploy |
-| `render [...commands]` | Build command for Render deploy |
-| `vercel [...commands]` | Build command for Vercel deploy |
-
-### deploy serverless
-
-Deploy to AWS CloudFront and Lambda using [Serverless](https://www.serverless.com/) framework
-
-```
-yarn redwood deploy serverless
-```
-
-| Options & Arguments | Description |
-| :------------------ | :------------------------------------------------------------------------------------------------------------------------------------------ |
-| `--side` | Which side(s) to deploy [choices: "api", "web"] [default: "web", "api"] |
-| `--stage` | serverless stage, see [serverless stage docs](https://www.serverless.com/blog/stages-and-environments) [default: "production"] |
-| `--pack-only` | Only package the build for deployment |
-| `--first-run` | Use this flag the first time you deploy. The first deploy wizard will walk you through configuring your web side to connect to the api side |
-
-
-### deploy netlify
-
-Build command for Netlify deploy
-
-```
-yarn redwood deploy netlify
-```
-
-| Options | Description |
-| :--------------------- | :-------------------------------------------------- |
-| `--build` | Build for production [default: "true"] |
-| `--prisma` | Apply database migrations [default: "true"] |
-| `--data-migrate, --dm` | Migrate the data in your database [default: "true"] |
-
-#### Example
-The following command will build, apply Prisma DB migrations, and skip data migrations.
-
-```
-yarn redwood deploy netlify --no-data-migrate
-```
-
-:::warning
-While you may be tempted to use the [Netlify CLI](https://cli.netlify.com) commands to [build](https://cli.netlify.com/commands/build) and [deploy](https://cli.netlify.com/commands/deploy) your project directly from your local project directory, doing so **will lead to errors when deploying and/or when running functions**: errors in the function needed for the GraphQL server, but also in other serverless functions.
-
-The main reason for this is that these Netlify CLI commands simply build and deploy: they build your project locally and then push the dist folder. That means that when building a RedwoodJS project, the [Prisma client is generated with binaries matching the operating system at build time](https://cli.netlify.com/commands/link), and not binaries [compatible with the OS](https://www.prisma.io/docs/reference/api-reference/prisma-schema-reference#binarytargets-options) that runs functions on Netlify. Your Prisma client engine may be `darwin` for macOS or `windows` for Windows, but it needs to be `debian-openssl-1.1.x` or `rhel-openssl-1.1.x`. If the client is incompatible, your functions will fail.
-
-Therefore, please follow the [instructions in the Tutorial](tutorial/chapter4/deployment.md#netlify) to sync your GitHub (or other compatible source control service) repository with Netlify and allow their build and deploy system to manage deployments.
-
-The [Netlify CLI](https://cli.netlify.com) still works well for [linking your project to your site](https://cli.netlify.com/commands/link), testing local builds and also using their [dev](https://cli.netlify.com/commands/dev) or [dev --live](https://cli.netlify.com/commands/dev) to share your local dev server via a tunnel.
-:::
-
-### deploy render
-
-Build (web) and Start (api) command for Render deploy. (For usage instructions, see the Render [Deploy Redwood](https://render.com/docs/deploy-redwood) doc.)
-
-```
-yarn redwood deploy render
-```
-
-| Options & Arguments | Description |
-| :--------------------- | :-------------------------------------------------- |
-| `side` | select side to build [choices: "api", "web"] |
-| `--prisma` | Apply database migrations [default: "true"] |
-| `--data-migrate, --dm` | Migrate the data in your database [default: "true"] |
-| `--serve` | Run server for api in production [default: "true"] |
-
-#### Example
-The following command will build the Web side for static-site CDN deployment.
-
-```
-yarn redwood deploy render web
-```
-
-The following command will apply Prisma DB migrations, run data migrations, and start the api server.
-
-```
-yarn redwood deploy render api
-```
-
-### deploy vercel
-
-Build command for Vercel deploy
-
-```
-yarn redwood deploy vercel
-```
-
-| Options | Description |
-| :--------------------- | :-------------------------------------------------- |
-| `--build` | Build for production [default: "true"] |
-| `--prisma` | Apply database migrations [default: "true"] |
-| `--data-migrate, --dm` | Migrate the data in your database [default: "true"] |
-
-#### Example
-The following command will build, apply Prisma DB migrations, and skip data migrations.
-
-```
-yarn redwood deploy vercel --no-data-migrate
-```
-
-## destroy (alias d)
-
-Rollback changes made by the generate command.
-
-```
-yarn redwood destroy <type>
-```
-
-| Command              | Description                                                                     |
-| :------------------- | :------------------------------------------------------------------------------ |
-| `cell <name>`        | Destroy a cell component                                                        |
-| `component <name>`   | Destroy a component                                                             |
-| `function <name>`    | Destroy a Function                                                              |
-| `layout <name>`      | Destroy a layout component                                                      |
-| `page <name> [path]` | Destroy a page and route component                                              |
-| `scaffold <model>`   | Destroy pages, SDL, and Services files based on a given DB schema Model         |
-| `sdl <model>`        | Destroy a GraphQL schema and service component based on a given DB schema Model |
-| `service <name>`     | Destroy a service component                                                     |
-| `directive <name>`   | Destroy a directive                                                             |
-
-## exec
-
-Execute scripts generated by [`yarn redwood generate script <name>`](#generate-script) to run one-off operations, long-running jobs, or utility scripts.
-
-#### Usage
-
-You can pass any flags to the command and use them within your script:
-
-```
-❯ yarn redwood exec syncStripeProducts foo --firstParam 'hello' --two 'world'
-
-[18:13:56] Generating Prisma client [started]
-[18:13:57] Generating Prisma client [completed]
-[18:13:57] Running script [started]
-:: Executing script with args ::
-{ _: [ 'exec', 'foo' ], firstParam: 'hello', two: 'world', '$0': 'rw' }
-[18:13:58] Running script [completed]
-✨ Done in 4.37s.
-```
-
-**Examples of CLI scripts:**
-
-- One-off scripts, such as syncing your Stripe products to your database
-- A background worker you can off-load long-running tasks to
-- Custom seed scripts for your application during development
-
-See [this how to](how-to/background-worker.md) for an example of using exec to run a background worker.
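The flags shown in the output above reach your script as a single parsed object. A hypothetical mini-parser illustrating that shape (the Redwood CLI actually uses yargs, so this is illustrative only):

```javascript
// Illustrative only: roughly how `--firstParam 'hello' --two 'world'` becomes
// the args object printed above. The real parsing is done by yargs in the CLI.
function parseFlags(argv) {
  const args = { _: [] }
  for (let i = 0; i < argv.length; i++) {
    const token = argv[i]
    if (token.startsWith('--')) {
      // each flag consumes the next token as its value
      args[token.slice(2)] = argv[++i]
    } else {
      args._.push(token) // positional arguments
    }
  }
  return args
}

console.log(parseFlags(['exec', 'foo', '--firstParam', 'hello', '--two', 'world']))
// → { _: [ 'exec', 'foo' ], firstParam: 'hello', two: 'world' }
```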
-
-## experimental (alias exp)
-
-Set up and run experimental features.
-
-Some caveats:
-- these features do not follow SemVer (there may be breaking changes in minor and patch releases)
-- these features may be deprecated or removed at any time
-- your feedback is wanted and necessary!
-
-For more information, including details about specific features, see this Redwood Forum category:
-[Experimental Features](https://community.redwoodjs.com/c/experimental-features/25)
-
-**Available Experimental Features**
-View all that can be _set up_:
-```
-yarn redwood experimental --help
-```
-
-## generate (alias g)
-
-Save time by generating boilerplate code.
-
-```
-yarn redwood generate <type>
-```
-
-Some generators require that their argument be a model in your `schema.prisma`. When they do, their argument is named `<model>`.
-
-| Command                | Description                                                                                           |
-| ---------------------- | ----------------------------------------------------------------------------------------------------- |
-| `cell <name>`          | Generate a cell component                                                                             |
-| `component <name>`     | Generate a component                                                                                  |
-| `dataMigration <name>` | Generate a data migration                                                                             |
-| `dbAuth`               | Generate sign in, sign up and password reset pages for dbAuth                                         |
-| `deploy <provider>`    | Generate a deployment configuration                                                                   |
-| `function <name>`      | Generate a Function                                                                                   |
-| `layout <name>`        | Generate a layout component                                                                           |
-| `page <name> [path]`   | Generate a page component                                                                             |
-| `scaffold <model>`     | Generate Pages, SDL, and Services files based on a given DB schema Model. Also accepts `<path/model>` |
-| `sdl <model>`          | Generate a GraphQL schema and service object                                                          |
-| `secret`               | Generate a secret key using a cryptographically-secure source of entropy                              |
-| `service <name>`       | Generate a service component                                                                          |
-| `types`                | Generate types and supplementary code                                                                 |
-| `script <name>`        | Generate a script that can use your services/libs, to execute with `redwood exec <name>`              |
-
-### TypeScript generators
-
-If your project is configured for TypeScript (see the [TypeScript docs](typescript/index)), the generators will automatically detect and generate `.ts`/`.tsx` files for you.
-
-**Undoing a Generator with a Destroyer**
-
-Most generate commands (i.e., everything but `yarn redwood generate dataMigration`) can be undone by their corresponding destroy command. For example, `yarn redwood generate cell` can be undone with `yarn redwood destroy cell`.
-
-### generate cell
-
-Generate a cell component.
-
-```bash
-yarn redwood generate cell <name>
-```
-
-Cells are a signature feature of Redwood. We think they provide a simpler and more declarative approach to data fetching.
-
-| Arguments & Options | Description |
-| -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `name` | Name of the cell |
-| `--force, -f` | Overwrite existing files |
-| `--typescript, --ts` | Generate TypeScript files. Enabled by default if we detect your project is TypeScript                                                                               |
-| `--query`            | Use this flag to specify a custom name for the GraphQL query. The query name must be unique                                                                         |
-| `--list`             | Use this flag to generate a list cell. This flag is needed when dealing with irregular words whose plural and singular are identical, such as equipment or pokemon  |
-| `--tests` | Generate test files [default: true] |
-| `--stories` | Generate Storybook files [default: true] |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-#### Usage
-
-The cell generator supports both single items and lists. See the [Single Item Cell vs List Cell](cells.md#single-item-cell-vs-list-cell) section of the Cell documentation.
-
-See the [Cells](tutorial/chapter2/cells.md) section of the Tutorial for usage examples.
-
-**Destroying**
-
-```
-yarn redwood destroy cell <name>
-```
-
-#### Example
-
-Generating a user cell:
-
-```bash
-~/redwood-app$ yarn redwood generate cell user
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g cell user
- ✔ Generating cell files...
- ✔ Writing `./web/src/components/UserCell/UserCell.test.js`...
- ✔ Writing `./web/src/components/UserCell/UserCell.js`...
-Done in 1.00s.
-```
-
-A cell defines and exports five constants: `QUERY`, `Loading`, `Empty`, `Failure`, and `Success`:
-
-```jsx title="./web/src/components/UserCell/UserCell.js"
-export const QUERY = gql`
- query {
- user {
- id
- }
- }
-`
-
-export const Loading = () => <div>Loading...</div>
-
-export const Empty = () => <div>Empty</div>
-
-export const Failure = ({ error }) => <div>Error: {error.message}</div>
-
-export const Success = ({ user }) => {
-  return JSON.stringify(user)
-}
-```
-
-### generate component
-
-Generate a component.
-
-```bash
-yarn redwood generate component <name>
-```
-
-Redwood loves function components and makes extensive use of React Hooks, which are only enabled in function components.
-
-| Arguments & Options | Description |
-| -------------------- | ------------------------------------------------------------------------------------ |
-| `name` | Name of the component |
-| `--force, -f` | Overwrite existing files |
-| `--typescript, --ts` | Generate TypeScript files. Enabled by default if we detect your project is TypeScript |
-| `--tests` | Generate test files [default: true] |
-| `--stories` | Generate Storybook files [default: true] |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-**Destroying**
-
-```
-yarn redwood destroy component <name>
-```
-
-#### Example
-
-Generating a user component:
-
-```bash
-~/redwood-app$ yarn redwood generate component user
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g component user
- ✔ Generating component files...
- ✔ Writing `./web/src/components/User/User.test.js`...
- ✔ Writing `./web/src/components/User/User.js`...
-Done in 1.02s.
-```
-
-The component will export some JSX telling you where to find it.
-
-```jsx title="./web/src/components/User/User.js"
-const User = () => {
-  return (
-    <div>
-      <h2>{'User'}</h2>
-      <p>{'Find me in ./web/src/components/User/User.js'}</p>
-    </div>
-  )
-}
-
-export default User
-```
-
-### generate dataMigration
-
-Generate a data migration script.
-
-```
-yarn redwood generate dataMigration <name>
-```
-
-Creates a data migration script in `api/db/dataMigrations`.
-
-| Arguments & Options | Description |
-| :------------------ | :----------------------------------------------------------------------- |
-| `name` | Name of the data migration, prefixed with a timestamp at generation time |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-#### Usage
-
-See the [Data Migration](data-migrations.md) docs.
-
-### generate dbAuth
-
-Generate log in, sign up, forgot password, and password reset pages for dbAuth.
-
-```
-yarn redwood generate dbAuth
-```
-
-| Arguments & Options | Description |
-| ------------------- | ------------------------------------------------------------------------------------------------------------------------------------ |
-| `--username-label` | The label to give the username field on the auth forms, e.g. "Email". Defaults to "Username". If not specified you will be prompted |
-| `--password-label` | The label to give the password field on the auth forms, e.g. "Secret". Defaults to "Password". If not specified you will be prompted |
-| `--webAuthn` | Whether or not to add webAuthn support to the log in page. If not specified you will be prompted |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-If you don't want to create your own log in, sign up, forgot password and
-password reset pages from scratch you can use this generator. The pages will be
-available at /login, /signup, /forgot-password, and /reset-password. Check the
-post-install instructions for one change you need to make to those pages: where
-to redirect the user to once their log in/sign up is successful.
-
-If you'd rather create your own, you might want to start from the generated
-pages anyway as they'll contain the other code you need to actually submit the
-log in credentials or sign up fields to the server for processing.
-
-### generate directive
-
-Generate a directive.
-
-```bash
-yarn redwood generate directive <name>
-```
-
-| Arguments & Options | Description |
-| -------------------- | --------------------------------------------------------------------- |
-| `name` | Name of the directive |
-| `--force, -f` | Overwrite existing files |
-| `--typescript, --ts` | Generate TypeScript files (defaults to your project's language target) |
-| `--type` | Directive type [Choices: "validator", "transformer"] |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-#### Usage
-
-See [Redwood Directives](directives.md).
-
-**Destroying**
-
-```
-yarn redwood destroy directive <name>
-```
-
-#### Example
-
-Generating a `myDirective` directive using the interactive command:
-
-```bash
-yarn rw g directive myDirective
-
-? What type of directive would you like to generate? › - Use arrow-keys. Return to submit.
-❯ Validator - Implement a validation: throw an error if criteria not met to stop execution
- Transformer - Modify values of fields or query responses
-```
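As the prompt spells out, the two directive types differ in behavior: a validator throws to halt execution, while a transformer returns a (possibly modified) value. A minimal sketch of each kind of logic, with illustrative names (the generated file wires functions like these up via Redwood's `createValidatorDirective`/`createTransformerDirective` helpers):

```javascript
// Validator: throw if criteria aren't met, to stop execution.
// (Illustrative only; a generated directive receives richer context.)
const validate = ({ context }) => {
  if (!context.currentUser) {
    throw new Error('Not authenticated')
  }
}

// Transformer: return a modified value instead of throwing.
const transform = ({ resolvedValue }) =>
  typeof resolvedValue === 'string' ? resolvedValue.toUpperCase() : resolvedValue
```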
-
-### generate function
-
-Generate a Function.
-
-```
-yarn redwood generate function <name>
-```
-
-Not to be confused with JavaScript functions, capital-F Functions are meant to be deployed to serverless endpoints like AWS Lambda.
-
-| Arguments & Options | Description |
-| -------------------- | ------------------------------------------------------------------------------------ |
-| `name` | Name of the function |
-| `--force, -f` | Overwrite existing files |
-| `--typescript, --ts` | Generate TypeScript files. Enabled by default if we detect your project is TypeScript |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-#### Usage
-
-See the [Custom Function](how-to/custom-function.md) how to.
-
-**Destroying**
-
-```
-yarn redwood destroy function <name>
-```
-
-#### Example
-
-Generating a user function:
-
-```bash
-~/redwood-app$ yarn redwood generate function user
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g function user
- ✔ Generating function files...
- ✔ Writing `./api/src/functions/user.js`...
-Done in 16.04s.
-```
-
-Functions get passed `context` which provides access to things like the current user:
-
-```jsx title="./api/src/functions/user.js"
-export const handler = async (event, context) => {
- return {
- statusCode: 200,
- body: `user function`,
- }
-}
-```
-
-Now if we run `yarn redwood dev api`:
-
-```plaintext {11}
-~/redwood-app$ yarn redwood dev api
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood dev api
-$ /redwood-app/node_modules/.bin/dev-server
-17:21:49 api | Listening on http://localhost:8911
-17:21:49 api | Watching /home/dominic/projects/redwood/redwood-app/api
-17:21:49 api |
-17:21:49 api | Now serving
-17:21:49 api |
-17:21:49 api | ► http://localhost:8911/graphql/
-17:21:49 api | ► http://localhost:8911/user/
-```
-
-### generate layout
-
-Generate a layout component.
-
-```bash
-yarn redwood generate layout <name>
-```
-
-Layouts wrap pages and help you stay DRY.
-
-| Arguments & Options | Description |
-| -------------------- | ------------------------------------------------------------------------------------ |
-| `name` | Name of the layout |
-| `--force, -f` | Overwrite existing files |
-| `--typescript, --ts` | Generate TypeScript files. Enabled by default if we detect your project is TypeScript |
-| `--tests` | Generate test files [default: true] |
-| `--stories` | Generate Storybook files [default: true] |
-| `--skipLink` | Generate a layout with a skip link [default: false] |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-#### Usage
-
-See the [Layouts](tutorial/chapter1/layouts.md) section of the tutorial.
-
-**Destroying**
-
-```
-yarn redwood destroy layout <name>
-```
-
-#### Example
-
-Generating a user layout:
-
-```bash
-~/redwood-app$ yarn redwood generate layout user
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g layout user
- ✔ Generating layout files...
- ✔ Writing `./web/src/layouts/UserLayout/UserLayout.test.js`...
- ✔ Writing `./web/src/layouts/UserLayout/UserLayout.js`...
-Done in 1.00s.
-```
-
-A layout will just export its children:
-
-```jsx title="./web/src/layouts/UserLayout/UserLayout.js"
-const UserLayout = ({ children }) => {
-  return <>{children}</>
-}
-
-export default UserLayout
-```
-
-### generate model
-
-Generate a RedwoodRecord model.
-
-```bash
-yarn redwood generate model <name>
-```
-
-| Arguments & Options | Description |
-| ------------------- | --------------------------------------------------- |
-| `name` | Name of the model (in schema.prisma) |
-| `--force, -f` | Overwrite existing files |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-#### Usage
-
-See the [RedwoodRecord docs](redwoodrecord.md).
-
-#### Example
-
-```bash
-~/redwood-app$ yarn redwood generate model User
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g model User
- ✔ Generating model file...
- ✔ Successfully wrote file `./api/src/models/User.js`
- ✔ Parsing datamodel, generating api/src/models/index.js...
-
- Wrote /Users/rob/Sites/redwoodjs/redwood_record/.redwood/datamodel.json
- Wrote /Users/rob/Sites/redwoodjs/redwood_record/api/src/models/index.js
-
-✨ Done in 3.74s.
-```
-
-Generating a model automatically runs `yarn rw record init` as well.
-
-### generate page
-
-Generates a page component and updates the routes.
-
-```bash
-yarn redwood generate page <name> [path]
-```
-
-`path` can include a route parameter which will be passed to the generated
-page. The syntax for that is `/path/to/page/{routeParam}/more/path`. You can
-also specify the type of the route parameter if needed: `{routeParam:Int}`. If
-`path` isn't specified, or if it's just a route parameter, it will be derived
-from `name` and the route parameter, if specified, will be added to the end.
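Those derivation rules can be sketched as a small helper (hypothetical, for illustration only; the generator's real logic lives inside the Redwood CLI):

```javascript
// Hypothetical illustration of the path-derivation rules described above.
function derivePath(name, path) {
  const base = `/${name.toLowerCase()}`
  if (path === undefined) return base // no path given: derive it from the name
  if (/^\{[^}]+\}$/.test(path)) return `${base}/${path}` // just a route param: append it
  return path // an explicit path wins
}

console.log(derivePath('home')) // → /home
console.log(derivePath('quote', '{id}')) // → /quote/{id}
console.log(derivePath('quote', '{id:Int}')) // → /quote/{id:Int}
```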
-
-This also updates `Routes.js` in `./web/src`.
-
-| Arguments & Options | Description |
-| -------------------- | ------------------------------------------------------------------------------------ |
-| `name` | Name of the page |
-| `path` | URL path to the page. Defaults to `name` |
-| `--force, -f` | Overwrite existing files |
-| `--typescript, --ts` | Generate TypeScript files. Enabled by default if we detect your project is TypeScript |
-| `--tests` | Generate test files [default: true] |
-| `--stories` | Generate Storybook files [default: true] |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-**Destroying**
-
-```
-yarn redwood destroy page <name> [path]
-```
-
-**Examples**
-
-Generating a home page:
-
-```plaintext
-~/redwood-app$ yarn redwood generate page home /
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g page home /
- ✔ Generating page files...
- ✔ Writing `./web/src/pages/HomePage/HomePage.test.js`...
- ✔ Writing `./web/src/pages/HomePage/HomePage.js`...
- ✔ Updating routes file...
-Done in 1.02s.
-```
-
-The page returns JSX telling you where to find it:
-
-```jsx title="./web/src/pages/HomePage/HomePage.js"
-const HomePage = () => {
-  return (
-    <div>
-      <h1>HomePage</h1>
-      <p>Find me in ./web/src/pages/HomePage/HomePage.js</p>
-    </div>
-  )
-}
-
-export default HomePage
-```
-
-And the route is added to `Routes.js`:
-
-```jsx {4} title="./web/src/Routes.js"
-const Routes = () => {
-  return (
-    <Router>
-      <Route path="/" page={HomePage} name="home" />
-      <Route notfound page={NotFoundPage} />
-    </Router>
-  )
-}
-```
-
-Generating a page to show quotes:
-
-```plaintext
-~/redwood-app$ yarn redwood generate page quote {id}
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g page quote {id}
- ✔ Generating page files...
- ✔ Writing `./web/src/pages/QuotePage/QuotePage.stories.js`...
- ✔ Writing `./web/src/pages/QuotePage/QuotePage.test.js`...
- ✔ Writing `./web/src/pages/QuotePage/QuotePage.js`...
- ✔ Updating routes file...
-Done in 1.02s.
-```
-
-The generated page will get the route parameter as a prop:
-
-```jsx {3,9,11} title="./web/src/pages/QuotePage/QuotePage.js"
-import { Link, routes } from '@redwoodjs/router'
-
-const QuotePage = ({ id }) => {
-  return (
-    <>
-      <h1>QuotePage</h1>
-      <p>Find me in "./web/src/pages/QuotePage/QuotePage.js"</p>
-      <p>
-        My default route is named "quote", link to me with `<Link to={routes.quote({ id: 42 })}>Quote 42</Link>`
-      </p>
-      <p>The parameter passed to me is {id}</p>
-    </>
-  )
-}
-
-export default QuotePage
-```
-
-And the route is added to `Routes.js`, with the route parameter added:
-
-```jsx {4} title="./web/src/Routes.js"
-const Routes = () => {
-  return (
-    <Router>
-      <Route path="/quote/{id}" page={QuotePage} name="quote" />
-      <Route path="/" page={HomePage} name="home" />
-      <Route notfound page={NotFoundPage} />
-    </Router>
-  )
-}
-```
-### generate realtime
-
-Generate a boilerplate subscription or live query used with RedwoodJS Realtime.
-
-```bash
-yarn redwood generate realtime <name>
-```
-
-| Arguments & Options | Description                                                                                     |
-| ------------------- | ----------------------------------------------------------------------------------------------- |
-| `name`              | Name of the realtime event to set up                                                            |
-| `-t, --type`        | Choices: `liveQuery`, `subscription`. Optional. If not provided, you will be prompted to select |
-| `--force, -f`       | Overwrite existing files                                                                        |
-
-
-#### Usage
-
-See the Realtime docs for more information on how to [set up RedwoodJS Realtime](#setup-realtime) and use Live Queries and Subscriptions.
-
-**Examples**
-
-Generate a live query.
-
-```bash
-~/redwood-app$ yarn rw g realtime NewLiveQuery
-? What type of realtime event would you like to create? › - Use arrow-keys. Return to submit.
-❯ Live Query
- Create a Live Query to watch for changes in data
- Subscription
-
-✔ What type of realtime event would you like to create? › Live Query
-✔ Checking for realtime environment prerequisites ...
-✔ Adding newlivequery example live query ...
-✔ Generating types ...
-```
-
-Generate a subscription.
-
-```bash
-~/redwood-app$ yarn rw g realtime NewSub
-? What type of realtime event would you like to create? › - Use arrow-keys. Return to submit.
- Live Query
-❯ Subscription - Create a Subscription to watch for events
-
-✔ What type of realtime event would you like to create? › Subscription
-✔ Checking for realtime environment prerequisites ...
-✔ Adding newsub example subscription ...
-✔ Generating types ...
-```
-
-
-### generate scaffold
-
-Generate Pages, SDL, and Services files based on a given DB schema Model. Also accepts `<path/model>`.
-
-```bash
-yarn redwood generate scaffold <model>
-```
-
-A scaffold quickly creates a CRUD for a model by generating the following files and corresponding routes:
-
-- sdl
-- service
-- layout
-- pages
-- cells
-- components
-
-The content of the generated components is different from what you'd get by running them individually.
-
-| Arguments & Options  | Description                                                                                                                                                                                      |
-| -------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| `model`              | Model to scaffold. You can also use `<path/model>` to nest files by type at the given path directory (or directories). For example, `redwood g scaffold admin/post`                               |
-| `--docs`             | Use or set to `true` to generate comments in the SDL for self-documenting your app's GraphQL API. See: [Self-Documenting GraphQL API](./graphql.md#self-documenting-graphql-api) [default: false] |
-| `--force, -f`        | Overwrite existing files                                                                                                                                                                         |
-| `--tailwind`         | Generate a TailwindCSS version of scaffold.css (automatically set to `true` if a TailwindCSS config exists)                                                                                       |
-| `--typescript, --ts` | Generate TypeScript files. Enabled by default if we detect your project is TypeScript                                                                                                            |
-| `--rollback`         | Rollback changes if an error occurs [default: true]                                                                                                                                              |
-
-#### Usage
-
-See [Creating a Post Editor](tutorial/chapter2/getting-dynamic.md#creating-a-post-editor).
-
-**Nesting of Components and Pages**
-
-By default, Redwood nests the components and pages in a directory named after the model. For example (where `post` is the model), `yarn rw g scaffold post` will output the following files, with the components and pages nested in a `Post` directory:
-
-```plaintext {9-20}
- √ Generating scaffold files...
- √ Successfully wrote file `./api/src/graphql/posts.sdl.js`
- √ Successfully wrote file `./api/src/services/posts/posts.js`
- √ Successfully wrote file `./api/src/services/posts/posts.scenarios.js`
- √ Successfully wrote file `./api/src/services/posts/posts.test.js`
- √ Successfully wrote file `./web/src/layouts/PostsLayout/PostsLayout.js`
- √ Successfully wrote file `./web/src/pages/Post/EditPostPage/EditPostPage.js`
- √ Successfully wrote file `./web/src/pages/Post/PostPage/PostPage.js`
- √ Successfully wrote file `./web/src/pages/Post/PostsPage/PostsPage.js`
- √ Successfully wrote file `./web/src/pages/Post/NewPostPage/NewPostPage.js`
- √ Successfully wrote file `./web/src/components/Post/EditPostCell/EditPostCell.js`
- √ Successfully wrote file `./web/src/components/Post/Post/Post.js`
- √ Successfully wrote file `./web/src/components/Post/PostCell/PostCell.js`
- √ Successfully wrote file `./web/src/components/Post/PostForm/PostForm.js`
- √ Successfully wrote file `./web/src/components/Post/Posts/Posts.js`
- √ Successfully wrote file `./web/src/components/Post/PostsCell/PostsCell.js`
- √ Successfully wrote file `./web/src/components/Post/NewPost/NewPost.js`
- √ Adding layout import...
- √ Adding set import...
- √ Adding scaffold routes...
- √ Adding scaffold asset imports...
-```
-
-If you don't want the components and pages nested, you can disable this behavior for your project by adding the following to your `redwood.toml` file:
-
-```
-[generate]
- nestScaffoldByModel = false
-```
-
-Setting `nestScaffoldByModel = true` retains the default behavior, but is not required.
-
-Notes:
-
-1. The nesting directory is always set to be PascalCase.
-
-**Namespacing Scaffolds**
-
-You can namespace your scaffolds by providing `<path/model>`. The layout, pages, cells, and components will be nested in newly created dir(s). In addition, the nesting folder, based upon the model name, is still applied after the path for components and pages, unless turned off in the `redwood.toml` as described above. For example, given a model `user`, running `yarn redwood generate scaffold admin/user` will nest the layout, pages, and components in a newly created `Admin` directory created for each of the `layouts`, `pages`, and `components` folders:
-
-```plaintext {9-20}
-~/redwood-app$ yarn redwood generate scaffold admin/user
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g scaffold admin/user
- ✔ Generating scaffold files...
- ✔ Successfully wrote file `./api/src/graphql/users.sdl.js`
- ✔ Successfully wrote file `./api/src/services/users/users.js`
- ✔ Successfully wrote file `./api/src/services/users/users.scenarios.js`
- ✔ Successfully wrote file `./api/src/services/users/users.test.js`
- ✔ Successfully wrote file `./web/src/layouts/Admin/UsersLayout/UsersLayout.js`
- ✔ Successfully wrote file `./web/src/pages/Admin/User/EditUserPage/EditUserPage.js`
- ✔ Successfully wrote file `./web/src/pages/Admin/User/UserPage/UserPage.js`
- ✔ Successfully wrote file `./web/src/pages/Admin/User/UsersPage/UsersPage.js`
- ✔ Successfully wrote file `./web/src/pages/Admin/User/NewUserPage/NewUserPage.js`
- ✔ Successfully wrote file `./web/src/components/Admin/User/EditUserCell/EditUserCell.js`
- ✔ Successfully wrote file `./web/src/components/Admin/User/User/User.js`
- ✔ Successfully wrote file `./web/src/components/Admin/User/UserCell/UserCell.js`
- ✔ Successfully wrote file `./web/src/components/Admin/User/UserForm/UserForm.js`
- ✔ Successfully wrote file `./web/src/components/Admin/User/Users/Users.js`
- ✔ Successfully wrote file `./web/src/components/Admin/User/UsersCell/UsersCell.js`
- ✔ Successfully wrote file `./web/src/components/Admin/User/NewUser/NewUser.js`
- ✔ Adding layout import...
- ✔ Adding set import...
- ✔ Adding scaffold routes...
- ✔ Adding scaffold asset imports...
-Done in 1.21s.
-```
-
-The routes, wrapped in a [`Set`](router.md#sets-of-routes) with the generated layout, will be nested too:
-
-```jsx {4-9} title="./web/src/Routes.js"
-const Routes = () => {
-  return (
-    <Router>
-      <Set wrap={UsersLayout}>
-        <Route path="/admin/users/new" page={AdminUserNewUserPage} name="adminNewUser" />
-        <Route path="/admin/users/{id:Int}/edit" page={AdminUserEditUserPage} name="adminEditUser" />
-        <Route path="/admin/users/{id:Int}" page={AdminUserUserPage} name="adminUser" />
-        <Route path="/admin/users" page={AdminUserUsersPage} name="adminUsers" />
-      </Set>
-      <Route notfound page={NotFoundPage} />
-    </Router>
-  )
-}
-```
-
-Notes:
-
-1. Each directory in the scaffolded path is always set to be PascalCase.
-2. The scaffold path may be multiple directories deep.
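The two notes above can be illustrated with a small helper (hypothetical; the CLI uses its own casing utilities):

```javascript
// Hypothetical sketch: every directory in a scaffold path becomes PascalCase,
// and the path may be arbitrarily deep.
function pascalCaseScaffoldPath(path) {
  return path
    .split('/')
    .map((part) => part.charAt(0).toUpperCase() + part.slice(1))
    .join('/')
}

console.log(pascalCaseScaffoldPath('admin/user')) // → Admin/User
console.log(pascalCaseScaffoldPath('admin/site/user')) // → Admin/Site/User
```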
-
-**Destroying**
-
-```
-yarn redwood destroy scaffold <model>
-```
-
-Notes:
-
-1. You can also use `<path/model>` to destroy files that were generated under a scaffold path. For example, `redwood d scaffold admin/post`
-2. The destroy command will remove empty folders along the path, provided they are lower than the folder level of component, layout, page, etc.
-3. The destroy scaffold command also follows the `nestScaffoldByModel` setting in the `redwood.toml` file. For example, if you have an existing scaffold that you wish to destroy that does not have the pages and components nested by the model name, you can destroy the scaffold by temporarily setting:
-
-```
-[generate]
- nestScaffoldByModel = false
-```
-
-**Troubleshooting**
-
-If you see `Error: Unknown type: ...`, don't panic!
-It's a known limitation with GraphQL type generation.
-It happens when you generate the SDL of a Prisma model that has relations **before the SDL for the related model exists**.
-Please see [Troubleshooting Generators](./schema-relations#troubleshooting-generators) for help.
-
-### generate script
-
-Generates an arbitrary Node.js script in `./scripts/` that can be run later with the [`yarn redwood exec`](#exec) command.
-
-| Arguments & Options  | Description                                                                           |
-| -------------------- | ------------------------------------------------------------------------------------- |
-| `name`               | Name of the script                                                                    |
-| `--typescript, --ts` | Generate TypeScript files. Enabled by default if we detect your project is TypeScript |
-| `--rollback`         | Rollback changes if an error occurs [default: true]                                   |
-
-Scripts have access to the services and libraries used in your project. Some examples of how this can be useful:
-
-- create special database seed scripts for different scenarios
-- sync products and prices from your payment provider
-- run cleanup jobs on a regular basis, e.g. delete stale/expired data
-- sync data between platforms, e.g. email addresses from your db to your email marketing platform
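For a feel of the shape, here's a cleanup-style sketch. It's illustrative only: a Redwood script default-exports an async function that receives the parsed `args`, and in a real script you'd query `db` from your api side instead of the in-memory rows used here:

```javascript
// scripts/cleanupStale.js (sketch). In a real Redwood script you would
// `export default` this function and replace the in-memory rows with e.g.
// db.session.deleteMany({ where: { updatedAt: { lt: cutoff } } }).
const cleanupStale = async ({ args }) => {
  const days = Number(args.days ?? 30)
  const cutoff = new Date(Date.now() - days * 24 * 60 * 60 * 1000)

  const sessions = [
    { id: 1, updatedAt: new Date('2020-01-01') }, // stale
    { id: 2, updatedAt: new Date() }, // fresh
  ]
  const kept = sessions.filter((s) => s.updatedAt >= cutoff)
  return { deleted: sessions.length - kept.length, kept: kept.length }
}
```

Invoked as `yarn rw exec cleanupStale --days 60`, the `days` flag would arrive in `args`.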
-
-#### Usage
-
-```
-❯ yarn rw g script syncStripeProducts
-
- ✔ Generating script file...
- ✔ Successfully wrote file `./scripts/syncStripeProducts.ts`
- ✔ Next steps...
-
- After modifying your script, you can invoke it like:
-
- yarn rw exec syncStripeProducts
-
- yarn rw exec syncStripeProducts --param1 true
-```
-
-### generate sdl
-
-Generate a GraphQL schema and service object.
-
-```bash
-yarn redwood generate sdl <model>
-```
-
-The sdl generator will inspect your `schema.prisma` and do its best with relations. Schema-to-generator mapping isn't one-to-one yet (and might never be).
-
-
-
-| Arguments & Options  | Description                                                                                                                                                                                      |
-| -------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| `model`              | Model to generate the sdl for                                                                                                                                                                    |
-| `--crud`             | Set to `false`, or use `--no-crud`, if you do not want to generate mutations                                                                                                                     |
-| `--docs`             | Use or set to `true` to generate comments in the SDL for self-documenting your app's GraphQL API. See: [Self-Documenting GraphQL API](./graphql.md#self-documenting-graphql-api) [default: false] |
-| `--force, -f`        | Overwrite existing files                                                                                                                                                                         |
-| `--tests`            | Generate service test and scenario [default: true]                                                                                                                                               |
-| `--typescript, --ts` | Generate TypeScript files. Enabled by default if we detect your project is TypeScript                                                                                                            |
-| `--rollback`         | Rollback changes if an error occurs [default: true]                                                                                                                                              |
-
-> **Note:** The generated sdl will include the `@requireAuth` directive by default to ensure queries and mutations are secure. If your app's queries and mutations are all public, you can set up a custom SDL generator template that applies `@skipAuth` (or a custom validator directive) to suit your application's needs.
-
-**Regenerating the SDL**
-
-Often, as you iterate on your data model, you'll add, remove, or rename fields, and you'll want Redwood to update the generated SDL and service files to match, since that saves you from making the changes manually.
-
-But the `generate` command prevents you from overwriting files accidentally, so you'd normally reach for the `--force` option -- and a force will reset any tests and scenarios you may have written, which you don't want to lose.
-
-In that case, you can run the following to "regenerate" **just** the SDL file, leaving your tests and scenarios intact:
-
-```
-yarn redwood g sdl <model> --force --no-tests
-```
-
-#### Example
-
-```bash
-~/redwood-app$ yarn redwood generate sdl user --force --no-tests
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g sdl user
- ✔ Generating SDL files...
- ✔ Writing `./api/src/graphql/users.sdl.js`...
- ✔ Writing `./api/src/services/users/users.js`...
-Done in 1.04s.
-```
-
-**Destroying**
-
-```
-yarn redwood destroy sdl <model>
-```
-
-#### Example
-
-Generating a user sdl:
-
-```bash
-~/redwood-app$ yarn redwood generate sdl user
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g sdl user
- ✔ Generating SDL files...
- ✔ Writing `./api/src/graphql/users.sdl.js`...
- ✔ Writing `./api/src/services/users/users.scenarios.js`...
- ✔ Writing `./api/src/services/users/users.test.js`...
- ✔ Writing `./api/src/services/users/users.js`...
-Done in 1.04s.
-```
-
-The generated sdl defines a corresponding type, query, create/update inputs, and any mutations. To prevent defining mutations, add the `--no-crud` option.
-
-```jsx title="./api/src/graphql/users.sdl.js"
-export const schema = gql`
- type User {
- id: Int!
- email: String!
- name: String
- }
-
- type Query {
- users: [User!]! @requireAuth
- }
-
- input CreateUserInput {
- email: String!
- name: String
- }
-
- input UpdateUserInput {
- email: String
- name: String
- }
-
- type Mutation {
- createUser(input: CreateUserInput!): User! @requireAuth
- updateUser(id: Int!, input: UpdateUserInput!): User! @requireAuth
- deleteUser(id: Int!): User! @requireAuth
- }
-`
-```
-
-The services file fulfills the query. If the `--no-crud` option is added, this file will be less complex.
-
-```jsx title="./api/src/services/users/users.js"
-import { db } from 'src/lib/db'
-
-export const users = () => {
- return db.user.findMany()
-}
-```
-
-For a model with a relation, the field will be listed in the sdl:
-
-```jsx {8} title="./api/src/graphql/users.sdl.js"
-export const schema = gql`
- type User {
- id: Int!
- email: String!
- name: String
- profile: Profile
- }
-
- type Query {
- users: [User!]! @requireAuth
- }
-
- input CreateUserInput {
- email: String!
- name: String
- }
-
- input UpdateUserInput {
- email: String
- name: String
- }
-
- type Mutation {
- createUser(input: CreateUserInput!): User! @requireAuth
- updateUser(id: Int!, input: UpdateUserInput!): User! @requireAuth
- deleteUser(id: Int!): User! @requireAuth
- }
-`
-```
-
-And the service will export an object with the relation as a property:
-
-```jsx {9-13} title="./api/src/services/users/users.js"
-import { db } from 'src/lib/db'
-
-export const users = () => {
- return db.user.findMany()
-}
-
-export const User = {
-  profile: (_obj, { root }) =>
-    db.user.findUnique({ where: { id: root.id } }).profile(),
-}
-```
-
-**Troubleshooting**
-
-If you see `Error: Unknown type: ...`, don't panic!
-It's a known limitation with GraphQL type generation.
-It happens when you generate the SDL of a Prisma model that has relations **before the SDL for the related model exists**.
-Please see [Troubleshooting Generators](./schema-relations#troubleshooting-generators) for help.
-
-### generate secret
-
-Generate a secret key using a cryptographically-secure source of entropy. Commonly used when setting up dbAuth.
-
-| Arguments & Options | Description |
-| :------------------ | :------------------------------------------------- |
-| `--raw` | Print just the key, without any informational text |
-
-#### Usage
-
-Using the `--raw` option you can easily append a secret key to your .env file, like so:
-
-```
-# yarn v1
-echo "SESSION_SECRET=$(yarn --silent rw g secret --raw)" >> .env
-
-# yarn v3
-echo "SESSION_SECRET=$(yarn rw g secret --raw)" >> .env
-```
-
-### generate service
-
-Generate a service component.
-
-```bash
-yarn redwood generate service
-```
-
-Services are where Redwood puts its business logic. They can be used by your GraphQL API or any other place in your backend code. See [How Redwood Works with Data](tutorial/chapter2/side-quest.md).
-
-| Arguments & Options | Description |
-| -------------------- | ------------------------------------------------------------------------------------ |
-| `name` | Name of the service |
-| `--force, -f` | Overwrite existing files |
-| `--typescript, --ts` | Generate TypeScript files Enabled by default if we detect your project is TypeScript |
-| `--tests` | Generate test and scenario files [default: true] |
-| `--rollback` | Rollback changes if an error occurs [default: true] |
-
-
-**Destroying**
-
-```
-yarn redwood destroy service
-```
-
-#### Example
-
-Generating a user service:
-
-```bash
-~/redwood-app$ yarn redwood generate service user
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood g service user
- ✔ Generating service files...
- ✔ Writing `./api/src/services/users/users.scenarios.js`...
- ✔ Writing `./api/src/services/users/users.test.js`...
- ✔ Writing `./api/src/services/users/users.js`...
-Done in 1.02s.
-```
-
-The generated service component will export a `findMany` query:
-
-```jsx title="./api/src/services/users/users.js"
-import { db } from 'src/lib/db'
-
-export const users = () => {
- return db.user.findMany()
-}
-```
-
-### generate types
-
-Generates supplementary code (project types)
-
-```bash
-yarn redwood generate types
-```
-
-#### Usage
-
-```
-~/redwood-app$ yarn redwood generate types
-yarn run v1.22.10
-$ /redwood-app/node_modules/.bin/redwood g types
-$ /redwood-app/node_modules/.bin/rw-gen
-
-Generating...
-
-- .redwood/schema.graphql
-- .redwood/types/mirror/api/src/services/posts/index.d.ts
-- .redwood/types/mirror/web/src/components/BlogPost/index.d.ts
-- .redwood/types/mirror/web/src/layouts/BlogLayout/index.d.ts
-...
-- .redwood/types/mirror/web/src/components/Post/PostsCell/index.d.ts
-- .redwood/types/includes/web-routesPages.d.ts
-- .redwood/types/includes/all-currentUser.d.ts
-- .redwood/types/includes/web-routerRoutes.d.ts
-- .redwood/types/includes/api-globImports.d.ts
-- .redwood/types/includes/api-globalContext.d.ts
-- .redwood/types/includes/api-scenarios.d.ts
-- api/types/graphql.d.ts
-- web/types/graphql.d.ts
-
-... and done.
-```
-
-## info
-
-Print your system environment information.
-
-```bash
-yarn redwood info
-```
-
-This command is primarily intended for gathering the system information others might need to help you debug:
-
-```bash
-~/redwood-app$ yarn redwood info
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/redwood info
-
- System:
- OS: Linux 5.4 Ubuntu 20.04 LTS (Focal Fossa)
- Shell: 5.0.16 - /usr/bin/bash
- Binaries:
- Node: 13.12.0 - /tmp/yarn--1589998865777-0.9683603763419713/node
- Yarn: 1.22.4 - /tmp/yarn--1589998865777-0.9683603763419713/yarn
- Browsers:
- Chrome: 78.0.3904.108
- Firefox: 76.0.1
- npmPackages:
- @redwoodjs/core: ^0.7.0-rc.3 => 0.7.0-rc.3
-
-Done in 1.98s.
-```
-
-## lint
-
-Lint your files.
-
-```bash
-yarn redwood lint
-```
-
-[Our ESLint configuration](https://github.com/redwoodjs/redwood/blob/master/packages/eslint-config/index.js) is a mix of [ESLint's recommended rules](https://eslint.org/docs/rules/), [React's recommended rules](https://www.npmjs.com/package/eslint-plugin-react#list-of-supported-rules), and a bit of our own stylistic flair:
-
-- no semicolons
-- comma dangle when multiline
-- single quotes
-- always use parentheses around arrow function parameters
-- enforced import sorting
-
-| Option | Description |
-| :------ | :---------------- |
-| `--fix` | Try to fix errors |
-
-## prisma
-
-Run Prisma CLI within the context of a Redwood project.
-
-```
-yarn redwood prisma
-```
-
-Redwood's `prisma` command is a lightweight wrapper around the Prisma CLI. It's the primary way you interact with your database.
-
-> **What do you mean it's a lightweight wrapper?**
->
-> By lightweight wrapper, we mean that we're handling some flags under the hood for you.
-> You can use the Prisma CLI directly (`yarn prisma`), but letting Redwood act as a proxy (`yarn redwood prisma`) saves you a lot of keystrokes.
-> For example, Redwood adds the `--schema=api/db/schema.prisma` flag automatically.
->
-> If you want to know exactly what `yarn redwood prisma` runs, which flags it's passing, etc., it's right at the top:
->
-> ```sh{3}
-> $ yarn redwood prisma migrate dev
-> yarn run v1.22.10
-> $ ~/redwood-app/node_modules/.bin/redwood prisma migrate dev
-> Running prisma cli:
-> yarn prisma migrate dev --schema "~/redwood-app/api/db/schema.prisma"
-> ...
-> ```
-
-Since `yarn redwood prisma` is just an entry point into all the database commands the Prisma CLI has to offer, we won't try to provide an exhaustive reference of everything you can do with it here. Instead, we'll focus on some of the most common commands, those you'll be running on a regular basis, and how they fit into Redwood's workflows.
-
-For the complete list of commands, see the [Prisma CLI Reference](https://www.prisma.io/docs/reference/api-reference/command-reference). It's the authority.
-
-Along with the CLI reference, bookmark Prisma's [Migration Flows](https://www.prisma.io/docs/concepts/components/prisma-migrate/prisma-migrate-flows) doc—it'll prove to be an invaluable resource for understanding `yarn redwood prisma migrate`.
-
-| Command    | Description                                                  |
-| :--------- | :----------------------------------------------------------- |
-| `db`       | Manage your database schema and lifecycle during development |
-| `generate` | Generate artifacts (e.g. Prisma Client)                      |
-| `migrate`  | Update the database schema with migrations                   |
-
-### prisma db
-
-Manage your database schema and lifecycle during development.
-
-```
-yarn redwood prisma db
-```
-
-The `prisma db` namespace contains commands that operate directly against the database.
-
-#### prisma db pull
-
-Pull the schema from an existing database, updating the Prisma schema.
-
-> 👉 Quick link to the [Prisma CLI Reference](https://www.prisma.io/docs/reference/api-reference/command-reference#db-pull).
-
-```
-yarn redwood prisma db pull
-```
-
-This command, formerly `introspect`, connects to your database and adds Prisma models to your Prisma schema that reflect the current database schema.
-
-> Warning: This command will overwrite your current `schema.prisma` file with the new schema. Any manual changes or customizations will be lost. Be sure to back up your current `schema.prisma` file before running `db pull` if it contains important modifications.
-
-#### prisma db push
-
-Push the state from your Prisma schema to your database.
-
-> 👉 Quick link to the [Prisma CLI Reference](https://www.prisma.io/docs/reference/api-reference/command-reference#db-push).
-
-```
-yarn redwood prisma db push
-```
-
-This is your go-to command for prototyping changes to your Prisma schema (`schema.prisma`).
-Prior to `yarn redwood prisma db push`, there wasn't a great way to try out changes to your Prisma schema without creating a migration.
-This command fills the void by "pushing" your `schema.prisma` file to your database without creating a migration. You don't even have to run `yarn redwood prisma generate` afterward—it's all taken care of for you, making it ideal for iterative development.
-
-#### prisma db seed
-
-Seed your database.
-
-> 👉 Quick link to the [Prisma CLI Reference](https://www.prisma.io/docs/reference/api-reference/command-reference#db-seed-preview).
-
-```
-yarn redwood prisma db seed
-```
-
-This command seeds your database by running your project's `seed.js|ts` file which you can find in your `scripts` directory.
-
-Prisma's got a great [seeding guide](https://www.prisma.io/docs/guides/prisma-guides/seed-database) that covers both the concepts and the nuts and bolts.
-
-> **Important:** Prisma Migrate also triggers seeding in the following scenarios:
->
-> - you manually run the `yarn redwood prisma migrate reset` command
-> - the database is reset interactively in the context of using `yarn redwood prisma migrate dev`—for example, as a result of migration history conflicts or database schema drift
->
-> If you want to use `yarn redwood prisma migrate dev` or `yarn redwood prisma migrate reset` without seeding, you can pass the `--skip-seed` flag.
-
-While having a great seed might not be all that important at the start, as soon as you start collaborating with others, it becomes vital.
-
-**How does seeding actually work?**
-
-If you look at your project's `package.json` file, you'll notice a `prisma` section:
-
-```json
- "prisma": {
- "seed": "yarn rw exec seed"
- },
-```
-
-Prisma runs any command found in the `seed` setting when seeding via `yarn rw prisma db seed` or `yarn rw prisma migrate reset`.
-Here we're using the Redwood [`exec` cli command](#exec) that runs a script.
-
-If you wanted to seed your database using a different method (like `psql` and an `.sql` script), you can do so by changing the "seed" script command.
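
For example, a hypothetical `seed` setting that delegates to `psql` might look like this (the script path here is a placeholder, not a file Redwood creates for you):

```json
  "prisma": {
    "seed": "psql $DATABASE_URL -f scripts/seed.sql"
  },
```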
-
-**More About Seeding**
-
-In addition, you can [code along with Ryan Chenkie](https://www.youtube.com/watch?v=2LwTUIqjbPo), and learn how libraries like [faker](https://www.npmjs.com/package/faker) can help you create a large, realistic database fast, especially in tandem with Prisma's [createMany](https://www.prisma.io/docs/reference/api-reference/prisma-client-reference#createmany).
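
As a concrete sketch (with hypothetical helper names, and a hand-rolled generator standing in for faker), building a large array of rows to hand to Prisma's `createMany` might look like:

```javascript
// Hypothetical sketch: generate many fake post rows for db.post.createMany().
// A library like faker would replace the hand-rolled word picker in a real seed.
function makeFakePosts(count) {
  const words = ['coffee', 'raclette', 'succulents', 'waistcoat', 'tote']
  const pick = (i) => words[i % words.length]
  return Array.from({ length: count }, (_, i) => ({
    title: `Post ${i + 1}: ${pick(i)}`,
    body: `All about ${pick(i)} and ${pick(i + 1)}.`,
  }))
}

// In a seed script you'd then run something like:
//   await db.post.createMany({ data: makeFakePosts(1000), skipDuplicates: true })
console.log(makeFakePosts(1000).length) // 1000
```

Note that `skipDuplicates` isn't supported on every database provider, so check Prisma's `createMany` docs for your database.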
-
-**Log Formatting**
-
-If you use the Redwood Logger as part of your seed script, you can pipe the command to the LogFormatter to output prettified logs.
-
-For example, if your `scripts.seed.js` imports the `logger`:
-
-```jsx title="scripts/seed.js"
-import { db } from 'api/src/lib/db'
-import { logger } from 'api/src/lib/logger'
-
-export default async () => {
-  try {
-    const posts = [
-      {
-        title: 'Welcome to the blog!',
-        body: "I'm baby single- origin coffee kickstarter lo.",
-      },
-      {
-        title: 'A little more about me',
-        body: 'Raclette shoreditch before they sold out lyft.',
-      },
-      {
-        title: 'What is the meaning of life?',
-        body: 'Meh waistcoat succulents umami asymmetrical, hoodie post-ironic paleo chillwave tote bag.',
-      },
-    ]
-
-    await Promise.all(
-      posts.map(async (post) => {
-        const newPost = await db.post.create({
-          data: { title: post.title, body: post.body },
-        })
-
-        logger.debug({ data: newPost }, 'Added post')
-      })
-    )
-  } catch (error) {
-    logger.error(error)
-  }
-}
-```
-
-You can pipe the script output to the formatter:
-
-```bash
-yarn rw prisma db seed | yarn rw-log-formatter
-```
-
-> Note: Just be sure to set the `data` attribute so the formatter recognizes the content.
-> For example: `logger.debug({ data: newPost }, 'Added post')`
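
To see why the `data` attribute matters, here's a toy formatter sketch (not the real `rw-log-formatter`, which handles pino's numeric levels and much more) that picks the payload out of a structured log line:

```javascript
// Toy illustration only: parse a JSON log line and pretty-print it,
// appending the `data` payload when one is present.
function formatLogLine(line) {
  const { level, msg, data } = JSON.parse(line)
  const payload = data ? ` ${JSON.stringify(data)}` : ''
  return `[${level}] ${msg}${payload}`
}

// A simplified line like the one logger.debug({ data: newPost }, 'Added post') emits:
const line = JSON.stringify({
  level: 'debug',
  msg: 'Added post',
  data: { id: 1, title: 'Welcome to the blog!' },
})

console.log(formatLogLine(line))
```

Without the `data` attribute, the payload would be invisible to a formatter like this.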
-
-### prisma migrate
-
-Update the database schema with migrations.
-
-> 👉 Quick link to the [Prisma Concepts](https://www.prisma.io/docs/concepts/components/prisma-migrate).
-
-```
-yarn redwood prisma migrate
-```
-
-As a database toolkit, Prisma strives to be as holistic as possible. Prisma Migrate lets you use Prisma schema to make changes to your database declaratively, all while keeping things deterministic and fully customizable by generating the migration steps in a simple, familiar format: SQL.
-
-Since migrate generates plain SQL files, you can edit them before applying the migration by using `yarn redwood prisma migrate dev --create-only`. This creates the migration based on the changes in your Prisma schema but doesn't apply it, giving you the chance to go in and make any modifications you want first. [Daniel Norman's tour of Prisma Migrate](https://www.youtube.com/watch?v=0LKhksstrfg) demonstrates this and more to great effect.
-
-Prisma Migrate has separate commands for applying migrations based on whether you're in dev or in production. The Prisma [Migration flows](https://www.prisma.io/docs/concepts/components/prisma-migrate/prisma-migrate-flows) goes over the difference between these workflows in more detail.
-
-#### prisma migrate dev
-
-Create a migration from changes in Prisma schema, apply it to the database, trigger generators (e.g. Prisma Client).
-
-> 👉 Quick link to the [Prisma CLI Reference](https://www.prisma.io/docs/reference/api-reference/command-reference#migrate-dev).
-
-```
-yarn redwood prisma migrate dev
-```
-
-#### prisma migrate deploy
-
-Apply pending migrations to update the database schema in production/staging.
-
-> 👉 Quick link to the [Prisma CLI Reference](https://www.prisma.io/docs/reference/api-reference/command-reference#migrate-deploy).
-
-```
-yarn redwood prisma migrate deploy
-```
-
-#### prisma migrate reset
-
-This command deletes and recreates the database, or performs a "soft reset" by removing all data, tables, indexes, and other artifacts.
-
-It'll also re-seed your database by automatically running the `db seed` command. See [prisma db seed](#prisma-db-seed).
-
-> **_Important:_** For use in development environments only
-
-## record
-
-> This command is experimental and its behavior may change.
-
-Commands for working with RedwoodRecord.
-
-### record init
-
-Parses `schema.prisma` and caches the datamodel as JSON. Reads relationships between models and adds some configuration in `api/src/models/index.js`.
-
-```
-yarn rw record init
-```
-
-## redwood-tools (alias rwt)
-
-Redwood's companion CLI development tool. You'll be using this if you're contributing to Redwood. See [Contributing](https://github.com/redwoodjs/redwood/blob/main/CONTRIBUTING.md#cli-reference-redwood-tools) in the Redwood repo.
-
-## setup
-
-Initialize configuration and integrate third-party libraries effortlessly.
-
-```
-yarn redwood setup
-```
-
-| Commands | Description |
-| ------------------ | ------------------------------------------------------------------------------------------ |
-| `auth` | Set up auth configuration for a provider |
-| `cache` | Set up cache configuration for memcached or redis |
-| `custom-web-index` | Set up an `index.js` file, so you can customize how Redwood web is mounted in your browser |
-| `deploy` | Set up a deployment configuration for a provider |
-| `generator` | Copy default Redwood generator templates locally for customization |
-| `i18n` | Set up i18n |
-| `package`          | Perform setup actions by running a third-party npm package                                 |
-| `tsconfig` | Add relevant tsconfig so you can start using TypeScript |
-| `ui` | Set up a UI design or style library |
-| `webpack` | Set up a webpack config file in your project so you can add custom config |
-
-### setup auth
-
-Integrate an auth provider.
-
-```
-yarn redwood setup auth
-```
-
-| Arguments & Options | Description |
-| :------------------ | :---------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `provider` | Auth provider to configure. Choices are `auth0`, `azureActiveDirectory`, `clerk`, `dbAuth`, `ethereum`, `firebase`, `goTrue`, `magicLink`, `netlify`, `nhost`, and `supabase` |
-| `--force, -f` | Overwrite existing configuration |
-
-#### Usage
-
-See [Authentication](authentication.md).
-
-### setup cache
-
-This command creates a setup file in `api/src/lib/cache.{ts|js}` for connecting to a Memcached or Redis server and allows caching in services. See the [**Caching** section of the Services docs](/docs/services#caching) for usage.
-
-```
-yarn redwood setup cache
-```
-
-| Arguments & Options | Description |
-| :------------------ | :------------------------------------------------------ |
-| `client` | Name of the client to configure, `memcached` or `redis` |
-| `--force, -f` | Overwrite existing files |
-
-### setup custom-web-index
-
-:::warning This command only applies to projects using Webpack
-
-As of v6, all Redwood projects use Vite by default.
-When switching projects to Vite, we made the decision to add the entry file, `web/src/entry.client.{jsx,tsx}`, back to projects.
-
-:::
-
-Redwood automatically mounts your `<App />` to the DOM, but if you want to customize how that happens, you can use this setup command to generate an `index.js` file in `web/src`.
-
-```
-yarn redwood setup custom-web-index
-```
-
-| Arguments & Options | Description |
-| :------------------ | :----------------------- |
-| `--force, -f` | Overwrite existing files |
-
-### setup generator
-
-Copies a given generator's template files to your local app for customization. The next time you generate that type again, it will use your custom template instead of Redwood's default.
-
-```
-yarn rw setup generator
-```
-
-| Arguments & Options | Description |
-| :------------------ | :------------------------------------------------------------ |
-| `name` | Name of the generator template(s) to copy (see help for list) |
-| `--force, -f` | Overwrite existing copied template files |
-
-#### Usage
-
-If you wanted to customize the page generator template, run the command:
-
-```
-yarn rw setup generator page
-```
-
-And then check `web/generators/page` for the page, storybook and test template files. You don't need to keep all of these templates—you could customize just `page.tsx.template` and delete the others and they would still be generated, but using the default Redwood templates.
-
-The only exception to this rule is the scaffold templates. You'll get four directories, `assets`, `components`, `layouts` and `pages`. If you want to customize any one of the templates in those directories, you will need to keep all the other files inside of that same directory, even if you make no changes besides the one you care about. (This is due to the way the scaffold looks up its template files.) For example, if you wanted to customize only the index page of the scaffold (the one that lists all available records in the database) you would edit `web/generators/scaffold/pages/NamesPage.tsx.template` and keep the other pages in that directory. You _could_ delete the other three directories (`assets`, `components`, `layouts`) if you don't need to customize them.
-
-**Name Variants**
-
-Your template will receive the provided `name` in a number of different variations.
-
-For example, given the name `fooBar` your template will receive the following _variables_ with the given _values_
-
-| Variable | Value |
-| :--------------------- | :--------- |
-| `pascalName` | `FooBar` |
-| `camelName` | `fooBar` |
-| `singularPascalName` | `FooBar` |
-| `pluralPascalName` | `FooBars` |
-| `singularCamelName` | `fooBar` |
-| `pluralCamelName` | `fooBars` |
-| `singularParamName` | `foo-bar` |
-| `pluralParamName` | `foo-bars` |
-| `singularConstantName` | `FOO_BAR` |
-| `pluralConstantName` | `FOO_BARS` |
-
-#### Example
-
-Copying the cell generator templates:
-
-```bash
-~/redwood-app$ yarn rw setup generator cell
-yarn run v1.22.4
-$ /redwood-app/node_modules/.bin/rw setup generator cell
- ✔ Copying generator templates...
- ✔ Wrote templates to /web/generators/cell
-✨ Done in 2.33s.
-```
-
-### setup deploy (config)
-
-Set up a deployment configuration.
-
-```
-yarn redwood setup deploy
-```
-
-| Arguments & Options | Description |
-| :------------------ | :---------------------------------------------------------------------------------------------------- |
-| `provider`          | Deploy provider to configure. Choices are `baremetal`, `coherence`, `edgio`, `flightcontrol`, `netlify`, `render`, `vercel`, or `aws-serverless [deprecated]` |
-| `--database, -d` | Database deployment for Render only [choices: "none", "postgresql", "sqlite"] [default: "postgresql"] |
-| `--force, -f` | Overwrite existing configuration [default: false] |
-
-#### setup deploy netlify
-
-When configuring Netlify deployment, the `setup deploy netlify` command generates a `netlify.toml` [configuration file](https://docs.netlify.com/configure-builds/file-based-configuration/) with the defaults needed to build and deploy a RedwoodJS site on Netlify.
-
-The `netlify.toml` file is a configuration file that specifies how Netlify builds and deploys your site — including redirects, branch and context-specific settings, and more.
-
-This configuration file also defines the settings needed for [Netlify Dev](https://docs.netlify.com/configure-builds/file-based-configuration/#netlify-dev) to detect that your site uses the RedwoodJS framework. Netlify Dev serves your RedwoodJS app as if it runs on the Netlify platform and can serve functions, handle Netlify [headers](https://docs.netlify.com/configure-builds/file-based-configuration/#headers) and [redirects](https://docs.netlify.com/configure-builds/file-based-configuration/#redirects).
-
-Netlify Dev can also create a tunnel from your local development server that allows you to share and collaborate with others using `netlify dev --live`.
-
-```toml title="netlify.toml"
-[dev]
- # To use [Netlify Dev](https://www.netlify.com/products/dev/),
- # install netlify-cli from https://docs.netlify.com/cli/get-started/#installation
- # and then use netlify link https://docs.netlify.com/cli/get-started/#link-and-unlink-sites
- # to connect your local project to a site already on Netlify
- # then run netlify dev and our app will be accessible on the port specified below
- framework = "redwoodjs"
- # Set targetPort to the [web] side port as defined in redwood.toml
- targetPort = 8910
- # Point your browser to this port to access your RedwoodJS app
- port = 8888
-```
-
-In order to use [Netlify Dev](https://www.netlify.com/products/dev/) you need to:
-
-- install the latest [netlify-cli](https://docs.netlify.com/cli/get-started/#installation)
-- use [netlify link](https://docs.netlify.com/cli/get-started/#link-and-unlink-sites) to connect to your Netlify site
-- ensure that the `targetPort` matches the [web] side port in `redwood.toml`
-- run `netlify dev` and your site will be served on the specified `port` (e.g., 8888)
-- if you wish to share your local server with others, you can run `netlify dev --live`
-
-> Note: To detect the RedwoodJS framework, please use netlify-cli v3.34.0 or greater.
-
-### setup mailer
-
-This command adds the necessary packages and files to get started with the RedwoodJS mailer. By default, it also creates an example mail template, which you can skip with the `--skip-examples` flag.
-
-```
-yarn redwood setup mailer
-```
-
-| Arguments & Options | Description |
-| :---------------------- | :----------------------------- |
-| `--force, -f` | Overwrite existing files |
-| `--skip-examples` | Do not include example content, such as a React email template |
-
-### setup package
-
-This command takes a published npm package that you specify, performs some compatibility checks, and then executes its bin script. This allows you to use third-party packages that can provide you with an easy-to-use setup command for the particular functionality they provide.
-
-This command behaves similarly to `yarn dlx` but will attempt to confirm compatibility between the package you are attempting to run and the current version of Redwood you are running. You can bypass this check by passing the `--force` flag if you feel you understand any potential compatibility issues.
-
-```
-yarn redwood setup package
-```
-
-| Arguments & Options | Description |
-| :------------------ | :----------------------- |
-| `--force, -f` | Forgo compatibility checks |
-
-#### Usage
-
-Run the made up `@redwoodjs/setup-example` package:
-```bash
-~/redwood-app$ yarn rw setup package @redwoodjs/setup-example
-```
-
-Run the same package but using a particular npm tag and avoiding any compatibility checks:
-```bash
-~/redwood-app$ yarn rw setup package @redwoodjs/setup-example@beta --force
-```
-
-**Compatibility Checks**
-
-We perform a simple compatibility check in an attempt to make you aware of potential compatibility issues with setup packages you might wish to run. This works by examining the version of `@redwoodjs/core` you are using within your root `package.json`. We compare this value with a compatibility range the npm package specifies in the `engines.redwoodjs` field of its own `package.json`. If the version of `@redwoodjs/core` you are using falls outside of the compatibility range specified by the package you are attempting to run, we will warn you and ask you to confirm that you wish to continue.
-
-It's the responsibility of the package's author to specify the correct compatibility range, so **you should always research the packages you use with this command**. Especially since they will be executing code on your machine!
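
As a rough illustration (not Redwood's actual implementation, which uses a full semver library), the check described above boils down to comparing two `package.json` values:

```javascript
// Simplified sketch of the compatibility check: does the project's
// @redwoodjs/core version fall inside the setup package's engines.redwoodjs
// range? This toy version only understands bounds like ">=6.0.0 <7.0.0".
function parseVersion(v) {
  const [major, minor, patch] = v.replace(/^[^\d]*/, '').split('.').map(Number)
  return { major, minor, patch }
}

function satisfiesRange(version, range) {
  const v = parseVersion(version)
  return range.split(' ').every((bound) => {
    const b = parseVersion(bound)
    // Compare major, then minor, then patch.
    const cmp = (v.major - b.major) || (v.minor - b.minor) || (v.patch - b.patch)
    if (bound.startsWith('>=')) return cmp >= 0
    if (bound.startsWith('<')) return cmp < 0
    return cmp === 0
  })
}

// version comes from the project's @redwoodjs/core dependency;
// the range comes from the setup package's own package.json.
console.log(satisfiesRange('6.3.1', '>=6.0.0 <7.0.0')) // true
console.log(satisfiesRange('7.0.0', '>=6.0.0 <7.0.0')) // false
```

If the check fails, Redwood warns you and asks you to confirm before running the package.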
-
-### setup graphql
-
-This command creates the necessary files to support GraphQL features like fragments and trusted documents.
-
-#### Usage
-
-Run `yarn rw setup graphql` followed by one of the subcommands below.
-
-#### setup graphql fragments
-
-This command creates the necessary configuration to start using [GraphQL Fragments](./graphql/fragments.md).
-
-```
-yarn redwood setup graphql fragments
-```
-
-| Arguments & Options | Description |
-| :------------------ | :--------------------------------------- |
-| `--force, -f` | Overwrite existing files and skip checks |
-
-#### Usage
-
-Run `yarn rw setup graphql fragments`
-
-#### Example
-
-```bash
-~/redwood-app$ yarn rw setup graphql fragments
-✔ Update Redwood Project Configuration to enable GraphQL Fragments
-✔ Generate possibleTypes.ts
-✔ Import possibleTypes in App.tsx
-✔ Add possibleTypes to the GraphQL cache config
-```
-
-#### setup graphql trusted-documents
-
-This command creates the necessary configuration to start using [GraphQL Trusted Documents](./graphql/trusted-documents.md).
-
-
-```
-yarn redwood setup graphql trusted-documents
-```
-
-#### Usage
-
-Run `yarn rw setup graphql trusted-documents`
-
-#### Example
-
-```bash
-~/redwood-app$ yarn rw setup graphql trusted-documents
-✔ Update Redwood Project Configuration to enable GraphQL Trusted Documents ...
-✔ Generating Trusted Documents store ...
-✔ Configuring the GraphQL Handler to use a Trusted Documents store ...
-```
-
-
-If you have not set up the RedwoodJS server file, this command will set it up for you:
-
-```bash
-✔ Adding the experimental server file...
-✔ Adding config to redwood.toml...
-✔ Adding required api packages...
-```
-
-
-### setup realtime
-
-This command creates the necessary files, installs the required packages, and provides examples to set up RedwoodJS Realtime with GraphQL live queries and subscriptions. See the Realtime docs for more information.
-
-```
-yarn redwood setup realtime
-```
-
-| Arguments & Options | Description |
-| :------------------ | :----------------------- |
-| `-e, --includeExamples, --examples` | Include examples of how to implement liveQueries and subscriptions. Default: true. |
-| `--force, -f` | Forgo compatibility checks |
-
-:::note
-
-If the RedwoodJS Server is not set up, it will be installed as well.
-
-:::
-
-#### Usage
-
-Run `yarn rw setup realtime`
-
-#### Example
-
-```bash
-~/redwood-app$ yarn rw setup realtime
-✔ Checking for realtime environment prerequisites ...
-✔ Adding required api packages...
-✔ Adding the realtime api lib ...
-✔ Adding Countdown example subscription ...
-✔ Adding NewMessage example subscription ...
-✔ Adding Auctions example live query ...
-✔ Generating types ...
-```
-
-
-If you have not set up the RedwoodJS server file, this command will set it up for you:
-
-```bash
-✔ Adding the experimental server file...
-✔ Adding config to redwood.toml...
-✔ Adding required api packages...
-```
-
-### setup tsconfig
-
-Add a `tsconfig.json` to both the web and api sides so you can start using [TypeScript](typescript/index).
-
-```
-yarn redwood setup tsconfig
-```
-
-| Arguments & Options | Description |
-| :------------------ | :----------------------- |
-| `--force, -f` | Overwrite existing files |
-
-
-
-### setup ui
-
-Set up a UI design or style library. Right now the choices are [TailwindCSS](https://tailwindcss.com/), [Chakra UI](https://chakra-ui.com/), and [Mantine UI](https://ui.mantine.dev/).
-
-```
-yarn rw setup ui
-```
-
-| Arguments & Options | Description |
-| :------------------ | :-------------------------------------------------------------------------------------- |
-| `library` | Library to configure. Choices are `chakra-ui`, `tailwindcss`, and `mantine` |
-| `--force, -f` | Overwrite existing configuration |
-
-## storybook
-
-Starts Storybook locally.
-
-```bash
-yarn redwood storybook
-```
-
-[Storybook](https://storybook.js.org/docs/react/get-started/introduction) is a tool for UI development that allows you to develop your components in isolation, away from all the conflated cruft of your real app.
-
-> "Props in, views out! Make it simple to reason about."
-
-RedwoodJS supports Storybook by creating stories when generating cells, components, layouts and pages. You can then use these to describe how to render that UI component with representative data.
-
-| Arguments & Options | Description |
-| :------------------ | :------------------------------------------------------------------------------------------------- |
-| `--open` | Open Storybook in your browser on start [default: true]. Pass `--no-open` to disable this behavior |
-| `--build` | Build Storybook |
-| `--port` | Which port to run Storybook on [default: 7910] |
-
-## test
-
-Run Jest tests for api and web.
-
-```bash
-yarn redwood test [side..]
-```
-
-| Arguments & Options | Description |
-| ------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `sides or filter` | Which side(s) to test, and/or a regular expression to match against your test files to filter by |
-| `--help` | Show help |
-| `--version` | Show version number |
-| `--watch` | Run tests related to changed files based on hg/git (uncommitted files). Specify the name or path to a file to focus on a specific set of tests [default: true] |
-| `--watchAll` | Run all tests |
-| `--collectCoverage` | Show test coverage summary and output info to `coverage` directory in project root. See this directory for an .html coverage report |
-| `--clearCache` | Delete the Jest cache directory and exit without running tests |
-| `--db-push` | Syncs the test database with your Prisma schema without requiring a migration. It creates a test database if it doesn't already exist [default: true]. This flag is ignored if your project doesn't have an `api` side. [👉 More details](#prisma-db-push). |
-
-> **Note:** All other flags are passed on to the Jest CLI. So, for example, if you want to update your snapshots you can pass the `-u` flag.
-
-## type-check (alias tsc or tc)
-
-Runs a TypeScript compiler check on both the api and the web sides.
-
-```bash
-yarn redwood type-check [side]
-```
-
-| Arguments & Options | Description |
-| ------------------- | ------------------------------------------------------------------------------ |
-| `side` | Which side(s) to run. Choices are `api` and `web`. Defaults to `api` and `web` |
-
-#### Usage
-
-See [Running Type Checks](typescript/introduction.md#running-type-checks).
-
-## serve
-
-Runs a server that serves both the api and the web sides.
-
-```bash
-yarn redwood serve [side]
-```
-
-> You should run `yarn rw build` before running this command to make sure all the static assets that will be served have been built.
-
-`yarn rw serve` is useful for debugging locally or for self-hosting—deploying a single server into a serverful environment. Since both the api and the web sides run in the same server, CORS isn't a problem.
-
-| Arguments & Options | Description |
-| ------------------- | ------------------------------------------------------------------------------ |
-| `side` | Which side(s) to run. Choices are `api` and `web`. Defaults to `api` and `web` |
-| `--port` | What port should the server run on [default: 8911] |
-| `--socket`          | The socket the server should run on. This takes precedence over port          |
-
-### serve api
-
-Runs a server that only serves the api side.
-
-```
-yarn rw serve api
-```
-
-This command uses `apiUrl` in your `redwood.toml`. Use this command if you want to run just the api side on a server (e.g. running on Render).
-
-| Arguments & Options | Description |
-| ------------------- | ----------------------------------------------------------------- |
-| `--port` | What port should the server run on [default: 8911] |
-| `--socket`          | The socket the server should run on. This takes precedence over port |
-| `--apiRootPath` | The root path where your api functions are served |
-
-For the full list of Server Configuration settings, see [this documentation](app-configuration-redwood-toml.md#api).
-If you want to format your log output, you can pipe the command to the Redwood LogFormatter:
-
-```
-yarn rw serve api | yarn rw-log-formatter
-```
-
-### serve web
-
-Runs a server that only serves the web side.
-
-```
-yarn rw serve web
-```
-
-This command serves the contents in `web/dist`. Use this command if you're debugging (e.g. great for debugging prerender) or if you want to run your api and web sides on separate servers, which is often considered a best practice for scalability (since your api side likely has much higher scaling requirements).
-
-> **But shouldn't I use nginx and/or equivalent technology to serve static files?**
->
-> Probably, but it can be a challenge to set up when you just want something running quickly!
-
-| Arguments & Options | Description |
-| ------------------- | ------------------------------------------------------------------------------------- |
-| `--port` | What port should the server run on [default: 8911] |
-| `--socket`          | The socket the server should run on. This takes precedence over port                  |
-| `--apiHost` | Forwards requests from the `apiUrl` (defined in `redwood.toml`) to the specified host |
-
-If you want to format your log output, you can pipe the command to the Redwood LogFormatter:
-
-```
-yarn rw serve web | yarn rw-log-formatter
-```
-
-## upgrade
-
-Upgrade all `@redwoodjs` packages via an interactive CLI.
-
-```bash
-yarn redwood upgrade
-```
-
-This command does all the heavy-lifting of upgrading to a new release for you.
-
-Besides upgrading to a new stable release, you can use this command to upgrade to either of our unstable releases, `canary` and `rc`, or you can upgrade to a specific release version.
-
-A canary release is published to npm every time a PR is merged to the `main` branch, and when we're getting close to a new release, we publish release candidates.
-
-| Option | Description |
-| :-------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `--dry-run, -d` | Check for outdated packages without upgrading |
-| `--tag, -t`     | Choices are "rc", "canary", "latest", "next", "experimental", or a specific version (e.g. "0.19.3"). WARNING: "canary", "rc", "next", and "experimental" are unstable releases, and "canary" releases often include breaking changes that require codemods when upgrading a project. |
-
-#### Example
-
-Upgrade to the most recent canary:
-
-```bash
-yarn redwood upgrade -t canary
-```
-
-Upgrade to a specific version:
-
-```bash
-yarn redwood upgrade -t 0.19.3
-```
-
-## Background checks
-
-The CLI can check for things in the background, like new versions of the framework, while you dev.
-
-Right now it can only check for new versions.
-If you'd like it to do so, set `notifications.versionUpdates` in the `redwood.toml` file to an array of the release tags you're interested in hearing about.
-
-By default, the CLI won't check for upgrades—you have to opt into it.
-
-You'll see the notification at most once a day, and the CLI checks for new versions at most once a day, so nothing heavy-handed is going on here.
diff --git a/docs/versioned_docs/version-7.0/deploy/edgio.md b/docs/versioned_docs/version-7.0/deploy/edgio.md
deleted file mode 100644
index c821c3ee2333..000000000000
--- a/docs/versioned_docs/version-7.0/deploy/edgio.md
+++ /dev/null
@@ -1,16 +0,0 @@
-# Deploy to Edgio
-
-[Edgio](https://edg.io) extends the capabilities of a traditional CDN by not only hosting your static content, but also providing server-side rendering for progressive web applications as well as caching both your APIs and HTML at the network edge to provide your users with the fastest browsing experience.
-
-## Edgio Deploy Setup
-
-In order to deploy your RedwoodJS project to Edgio, the project must first be initialized with the Edgio CLI.
-
-1. In your project, run the command `yarn rw setup deploy edgio`.
-2. Verify the changes to your project, commit and push to your repository.
-3. Deploy your project to Edgio
- 1. If this is your first time deploying to Edgio, the interactive CLI will prompt to authenticate using your browser. You can start the deploy by running `yarn rw deploy edgio`.
- 2. If you are deploying from a **non-interactive** environment, you will need to create an account on [Edgio Developer Console](https://app.layer0.co) first and setup a [deploy token](https://docs.edg.io/guides/deploy_apps#deploy-from-ci). Once the deploy token is created, save it as a secret to your environment. You can start the deploy by running `yarn rw deploy edgio --token=XXX`.
-4. Follow the link in the output to view your site live once deployment has completed!
-
-For more information on deploying to Edgio, check out the [documentation](https://docs.edg.io).
diff --git a/docs/versioned_docs/version-7.0/deploy/flightcontrol.md b/docs/versioned_docs/version-7.0/deploy/flightcontrol.md
deleted file mode 100644
index 192efd1776b6..000000000000
--- a/docs/versioned_docs/version-7.0/deploy/flightcontrol.md
+++ /dev/null
@@ -1,24 +0,0 @@
----
-description: How to deploy a Redwood app to AWS via Flightcontrol
----
-
-# Deploy to AWS with Flightcontrol
-
-[Flightcontrol](https://www.flightcontrol.dev?ref=redwood) enables any developer to deploy to AWS without being a wizard. It's extremely easy to use but lets you pop the hood and leverage the raw power of AWS when needed. It supports servers, static sites, and databases which makes it a perfect fit for hosting scalable Redwood apps.
-
-## Flightcontrol Deploy Setup
-
-1. In your project, run the command `yarn rw setup deploy flightcontrol --database=YOUR_DB_TYPE` where YOUR_DB_TYPE is `mysql` or `postgresql`
-2. Commit the changes and push to GitHub
-3. If you don't have an account, sign up at [app.flightcontrol.dev/signup](https://app.flightcontrol.dev/signup?ref=redwood)
-4. Create a new project from the onboarding screen or project list
- 1. Connect your Github account and select your repo
- 2. Select "Config Type" as `flightcontrol.json`
- 3. Click "Create Project" and complete any required steps like linking your AWS account.
-5. If using dbAuth, add the session secret key env variable in the Flightcontrol dashboard
-
-
-NOTE: If you are using yarn v1, remove the `installCommand` entries from `flightcontrol.json`
-
-If you have *any* problems or questions, Flightcontrol is very responsive in [their support Discord](https://discord.gg/yY8rSPrD6q).
-
diff --git a/docs/versioned_docs/version-7.0/deploy/vercel.md b/docs/versioned_docs/version-7.0/deploy/vercel.md
deleted file mode 100644
index 5f4e6e33fe04..000000000000
--- a/docs/versioned_docs/version-7.0/deploy/vercel.md
+++ /dev/null
@@ -1,90 +0,0 @@
----
-description: Deploy serverless in an instant with Vercel
----
-
-# Deploy to Vercel
-
->The following instructions assume you have read the [General Deployment Setup](./introduction.md#general-deployment-setup) section above.
-
-## Vercel tl;dr Deploy
-
-If you simply want to experience the Vercel deployment process without a database and/or adding custom code, you can do the following:
-1. create a new redwood project: `yarn create redwood-app ./vercel-deploy`
-2. after your "vercel-deploy" project installation is complete, init git, commit, and add it as a new repo to GitHub, BitBucket, or GitLab
-3. run the command `yarn rw setup deploy vercel` and commit and push changes
-4. use the Vercel [Quick Start](https://vercel.com/#get-started) to deploy
-
-_If you choose this quick deploy experience, the following steps do not apply._
-
-## Redwood Project Setup
-
-If you already have a Redwood project, proceed to the next step.
-
-Otherwise, we recommend experiencing the full Redwood DX via the [Redwood Tutorial](tutorial/foreword.md). Simply return to these instructions when you reach the "Deployment" section.
-
-## Redwood Deploy Configuration
-
-Complete the following two steps. Then save, commit, and push your changes.
-
-### Step 1. Serverless Functions Path
-
-Run the following CLI Command:
-```shell
-yarn rw setup deploy vercel
-```
-
-This updates your `redwood.toml` file, setting `apiUrl = "/api"`.
-
-### Step 2. Database Settings
-
-Follow the steps in the [Prisma and Database](./introduction#3-prisma-and-database) section above. _(Skip this step if your project does not require a database.)_
-
-### Vercel Initial Setup and Configuration
-Either [login](https://vercel.com/login) to your Vercel account and select "Import Project" or use the Vercel [quick start](https://vercel.com/#get-started).
-
-Then select the "Continue" button within the "From Git Repository" section:
-
-
-Next, select the provider where your repo is hosted: GitHub, GitLab, or Bitbucket. You'll be asked to log in and then provide the URL of the repository, e.g. `https://github.com/your-account/your-project.git` for a GitHub repo. Select "Continue".
-
-You'll then need to provide permissions for Vercel to access the repo on your hosting provider.
-
-### Import and Deploy your Project
-Vercel will recognize your repo as a Redwood project and take care of most configuration heavy lifting. You should see the following options and, most importantly, the "Framework Preset" showing RedwoodJS.
-
-
-
-Leave the **Build and Output Settings** at the default settings (unless you know what you're doing and have very specific needs).
-
-In the "Environment Variables" dropdown, add `DATABASE_URL` and your app's database connection string as the value. (Or skip if not applicable.)
-
-> When configuring a database, you'll want to append `?connection_limit=1` to the URI. This is [recommended by Prisma](https://www.prisma.io/docs/reference/tools-and-interfaces/prisma-client/deployment#recommended-connection-limit) when working with relational databases in a Serverless context. For production apps, you should setup [connection pooling](https://redwoodjs.com/docs/connection-pooling).
-
-For example, a Postgres connection string should look like `postgres://<user>:<password>@<host>/<database>?connection_limit=1`
-
-Finally, click the "Deploy" button. You'll hopefully see a build log without errors (warnings are fine) and end up on a screen that looks like this:
-
-
-
-Go ahead, click that "Visit" button. You’ve earned it 🎉
-
-## Vercel Dashboard Settings
-
-From the Vercel Dashboard you can access the full settings and information for your Redwood App. The default settings seem to work just fine for most Redwood projects. Do take a look around, but be sure to check out the [docs as well](https://vercel.com/docs).
-
-From now on, each time you push code to your git repo, Vercel will automatically trigger a deploy of the new code. You can also manually redeploy if you select "Deployments", then the specific deployment from the list, and finally the "Redeploy" option from the vertical dots menu next to "Visit".
-
-## vercel.json configuration
-
-By default, API requests in Vercel have a timeout limit of 15 seconds. To extend this duration, add the code snippet below to your `vercel.json` file. Be aware that increasing the timeout limit is exclusive to Pro plan subscribers, and that the timeout can be increased to a maximum of 300 seconds (5 minutes).
-
-```json
-{
- "functions": {
- "api/src/functions/graphql.*": {
- "maxDuration": 120,
- "runtime": "@vercel/redwood@2.0.5"
- }
- }
-}
-```
diff --git a/docs/versioned_docs/version-7.0/directives.md b/docs/versioned_docs/version-7.0/directives.md
deleted file mode 100644
index b778dc2ef8ce..000000000000
--- a/docs/versioned_docs/version-7.0/directives.md
+++ /dev/null
@@ -1,698 +0,0 @@
----
-description: Customize GraphQL execution
----
-
-# Directives
-
-Redwood Directives are a powerful feature, supercharging your GraphQL-backed Services.
-
-You can think of directives like "middleware" that let you run reusable code during GraphQL execution to perform tasks like authentication and formatting.
-
-Redwood uses them to make it a snap to protect your API Services from unauthorized access.
-
-Here we call those types of directives **Validators**.
-
-You can also use them to transform the output of your query result to modify string values, format dates, shield sensitive data, and more!
-We call those types of directives **Transformers**.
-
-You'll recognize a directive as being 1) preceded by `@` (e.g. `@myDirective`) and 2) declared alongside a field:
-
-```tsx
-type Bar {
- name: String! @myDirective
-}
-```
-
-or a Query or a Mutation:
-
-```tsx
-type Query {
- bars: [Bar!]! @myDirective
-}
-
-type Mutation {
- createBar(input: CreateBarInput!): Bar! @myDirective
-}
-```
-
-You can also define arguments that can be extracted and used when evaluating the directive:
-
-```tsx
-type Bar {
- field: String! @myDirective(roles: ["ADMIN"])
-}
-```
-
-or a Query or Mutation:
-
-```tsx
-type Query {
- bars: [Bar!]! @myDirective(roles: ["ADMIN"])
-}
-```
-
-You can also use directives on relations:
-
-```tsx
-type Baz {
- name: String!
-}
-
-type Bar {
- name: String!
- bazzes: [Baz]! @myDirective
-}
-```
-
-There are many ways to write directives using GraphQL tools and libraries. Believe us, it can get complicated fast.
-
-But, don't fret: Redwood provides an easy and ergonomic way to generate and write your own directives so that you can focus on the implementation logic and not the GraphQL plumbing.
-
-## What is a Redwood Directive?
-
-Redwood directives are purposeful.
-They come in two flavors: **Validators** and **Transformers**.
-
-Whatever flavor of directive you want, all Redwood directives must have the following properties:
-
-- be in the `api/src/directives/{directiveName}` directory, where `directiveName` is the name of the directive
-- have a file named `{directiveName}.{js,ts}` (e.g. `maskedEmail.ts`)
-- export a `schema` and implement either a `validate` or `transform` function
-
-### Understanding the Directive Flow
-
-Since it helps to know a little about the GraphQL phases—specifically the Execution phase—and how Redwood Directives fit in the data-fetching and authentication flow, let's have a quick look at some diagrams.
-
-First, we see the built-in `@requireAuth` Validator directive that can allow or deny access to a Service (a.k.a. a resolver) based on Redwood authentication.
-In this example, the `post(id: Int!)` query is protected using the `@requireAuth` directive.
-
-If the request's context has a `currentUser` and the app's `auth.{js|ts}` determines it `isAuthenticated()`, then the execution phase proceeds to get resolved (for example, the `post({ id })` Service is executed and queries the database using Prisma) and returns the data in the resulting response when execution is done.
-
-![require-auth-directive](https://user-images.githubusercontent.com/1051633/135320891-34dc06fc-b600-4c76-8a35-86bf42c7f179.png)
-
-In this second example, we add the Transformer directive `@welcome` to the `title` field on `Post` in the SDL.
-
-The GraphQL Execution phase proceeds the same as the prior example (because the `post` query is still protected and we'll want to fetch the user's name) and then the `title` field is resolved based on the data fetch query in the service.
-
-Finally after execution is done, then the directive can inspect the `resolvedValue` (here "Welcome to the blog!") and replace the value by inserting the current user's name—"Welcome, Tom, to the blog!"
-
-![welcome-directive](https://user-images.githubusercontent.com/1051633/135320906-5e2d639d-13a1-4aaf-85bf-98529822d244.png)
-
-### Validators
-
-Validators integrate with Redwood's authentication to evaluate whether or not a field, query, or mutation is permitted—that is, if the request context's `currentUser` is authenticated or belongs to one of the permitted roles.
-
-Validators should throw an Error such as `AuthenticationError` or `ForbiddenError` to deny access and simply return to allow.
-
-Here the `@isSubscriber` validator directive checks if the currentUser exists (and therefore is authenticated) and whether or not they have the `SUBSCRIBER` role. If they don't, then access is denied by throwing an error.
-
-```tsx
-import {
- AuthenticationError,
- ForbiddenError,
- createValidatorDirective,
- ValidatorDirectiveFunc,
-} from '@redwoodjs/graphql-server'
-
-export const schema = gql`
- directive @isSubscriber on FIELD_DEFINITION
-`
-
-const validate: ValidatorDirectiveFunc = ({ context }) => {
- if (!context.currentUser) {
- throw new AuthenticationError("You don't have permission to do that.")
- }
-
- if (!context.currentUser.roles?.includes('SUBSCRIBER')) {
- throw new ForbiddenError("You don't have access to do that.")
- }
-}
-
-const isSubscriber = createValidatorDirective(schema, validate)
-
-export default isSubscriber
-```
-
-Since validator directives can access arguments (such as `roles`), you can quickly provide RBAC (Role-based Access Control) to fields, queries and mutations.
-
-```tsx
-import gql from 'graphql-tag'
-
-import { createValidatorDirective } from '@redwoodjs/graphql-server'
-
-import { requireAuth as applicationRequireAuth } from 'src/lib/auth'
-
-export const schema = gql`
- directive @requireAuth(roles: [String]) on FIELD_DEFINITION
-`
-
-const validate = ({ directiveArgs }) => {
- const { roles } = directiveArgs
-
- applicationRequireAuth({ roles })
-}
-
-const requireAuth = createValidatorDirective(schema, validate)
-
-export default requireAuth
-```
-
-All Redwood apps come with two built-in validator directives: `@requireAuth` and `@skipAuth`.
-The `@requireAuth` directive takes optional roles.
-You may use these to protect against unwanted GraphQL access to your data.
-Or explicitly allow public access.
-
-> **Note:** Validators evaluate prior to resolving the field value, so you cannot modify the value and any return value is ignored.
-
-### Transformers
-
-Transformers can access the resolved field value to modify and then replace it in the response.
-Transformers apply to single fields (such as a `User`'s `email`), to collections (such as a set of `Posts` that belong to `User`s), and to query results. Transformers cannot be applied to Mutations.
-
-In the first case of a single field, the directive would return the modified field value. In the latter case, the directive could iterate each `Post` and modify the `title` in each. In all cases, the directive **must** return the same expected "shape" of the data the SDL expects.
-
-> **Note:** you can chain directives to first validate and then transform, such as `@requireAuth @maskedEmail`. Or even combine transformations to cascade formatting a value (you could use `@uppercase` together with `@truncate` to uppercase a title and shorten to 10 characters).
-
-Since transformer directives can access arguments (such as `roles` or `maxLength`) you may fetch those values and use them when applying (or to check if you even should apply) your transformation.
-
-That means that a transformer directive could consider the `permittedRoles` in:
-
-```tsx
-type user {
- email: String! @maskedEmail(permittedRoles: ["ADMIN"])
-}
-```
-
-and if the `currentUser` is an `ADMIN`, then skip the masking transform and simply return the original resolved field value:
-
-```tsx title="./api/src/directives/maskedEmail/maskedEmail.ts"
-import { createTransformerDirective, TransformerDirectiveFunc } from '@redwoodjs/graphql-server'
-
-export const schema = gql`
- directive @maskedEmail(permittedRoles: [String]) on FIELD_DEFINITION
-`
-
-const transform: TransformerDirectiveFunc = ({ context, resolvedValue }) => {
-  return resolvedValue.replace(/[a-zA-Z0-9]/g, '*')
-}
-
-const maskedEmail = createTransformerDirective(schema, transform)
-
-export default maskedEmail
-```
-
-and you would use it in your SDLs like this:
-
-```graphql
-type UserExample {
- id: Int!
- email: String! @maskedEmail # 👈 will replace alphanumeric characters with asterisks in the response!
- name: String
-}
-```
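Stripped of the directive plumbing, the transform above is plain string manipulation. Here's a standalone sketch (the `maskEmail` helper name is hypothetical, not part of the generated code):

```javascript
// Hypothetical standalone version of the masking transform: replaces every
// alphanumeric character with an asterisk, leaving "@" and "." intact so the
// shape of the address survives in the response.
const maskEmail = (resolvedValue) => resolvedValue.replace(/[a-zA-Z0-9]/g, '*')

console.log(maskEmail('tom@example.com')) // "***@*******.***"
```

Note the `g` flag: without it, `String.prototype.replace` masks only the first matching character.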
-
-### Where can I use a Redwood Directive?
-
-A directive can only appear in certain locations in a GraphQL schema or operation. These locations are listed in the directive's definition.
-
-In the `@maskedEmail` example, the directive can only appear in the `FIELD_DEFINITION` location.
-
-An example of a `FIELD_DEFINITION` location is a field that exists on a `Type`:
-
-```graphql
-type UserExample {
- id: Int!
- email: String! @requireAuth
-  name: String @maskedEmail # 👈 will mask `name` in the response!
-}
-
-type Query {
-  userExamples: [UserExample!]! @requireAuth # 👈 will enforce auth when fetching all users
-  userExample(id: Int!): UserExample @requireAuth # 👈 will enforce auth when fetching a single user
-}
-```
-
-> **Note**: Even though GraphQL supports `FIELD_DEFINITION | ARGUMENT_DEFINITION | INPUT_FIELD_DEFINITION | ENUM_VALUE` locations, RedwoodDirectives can **only** be declared on a `FIELD_DEFINITION` — that is, you **cannot** declare a directive in an `Input type`:
->
-> ```graphql
-> input UserExampleInput {
-> email: String! @maskedEmail # 👈 🙅 not allowed on an input
-> name: String! @requireAuth # 👈 🙅 also not allowed on an input
-> }
-> ```
-
-## When Should I Use a Redwood Directive?
-
-As noted in the [GraphQL spec](https://graphql.org/learn/queries/#directives):
-
-> Directives can be useful to get out of situations where you otherwise would need to do string manipulation to add and remove fields in your query. Server implementations may also add experimental features by defining completely new directives.
-
-Here's a helpful guide for deciding when you should use one of Redwood's Validator or Transformer directives:
-
-| | Use | Directive | Custom? | Type |
-| --- | ---------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------- | ------------ |
-| ✅ | Check if the request is authenticated? | `@requireAuth` | Built-in | Validator |
-| ✅ | Check if the user belongs to a role? | `@requireAuth(roles: ["AUTHOR"])` | Built-in | Validator |
-| ✅ | Only allow admins to see emails, but others get a masked value like "###@######.###" | `@maskedEmail(roles: ["ADMIN"])` | Custom | Transformer |
-| 🙅  | Know if the logged-in user can edit the record, and/or values                                                    | N/A - Instead do this check in your service                                                                                                                                  |                                                                                                                         |              |
-| 🙅  | Is my input a valid email address format?                                                                        | N/A - Instead do this check in your service using [Service Validations](services.md#service-validations) or consider [GraphQL Scalars](https://www.graphql-scalars.dev)      |                                                                                                                         |              |
-| 🙅 | I want to remove a field from the response for data filtering; for example, do not include the title of the post | `@skip(if: true )` or `@include(if: false)` | Instead use [core directives](https://graphql.org/learn/queries/#directives) on the GraphQL client query, not the SDL | Core GraphQL |
-
-## Combining, Chaining and Cascading Directives
-
-Now that you've seen what Validator and Transformer directives look like and where and when you may use them, you may wonder: can I use them together? Can I transform the result of a transformer?
-
-The answer is: yes—yes you can!
-
-### Combine Directives on a Query and a Type Field
-
-Let's say you want to only allow logged-in users to be able to query `User` details and you only want un-redacted email addresses to be shown to ADMINs.
-
-You can apply the `@requireAuth` directive to the `user(id: Int!)` query so you have to be logged in.
-Then, you can compose a `@maskedEmail` directive that checks the logged-in user's role membership and if they're not an ADMIN, mask the email address:
-
-```tsx
- type User {
- id: Int!
- name: String!
-    email: String! @maskedEmail(permittedRoles: ["ADMIN"])
- createdAt: DateTime!
- }
-
- type Query {
- user(id: Int!): User @requireAuth
- }
-```
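For a logged-in user without the ADMIN role, querying the protected type might then return a response like this (illustrative values only):

```json
{
  "data": {
    "user": {
      "id": 1,
      "name": "Tom",
      "email": "***@*******.***"
    }
  }
}
```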
-
-Or, let's say I want to only allow logged in users to be able to query User details.
-
-But, I only want ADMIN users to be able to query and fetch the email address.
-
-I can apply the `@requireAuth` directive to the `user(id: Int!)` query so I have to be logged in.
-
-And, I can apply the `@requireAuth` directive to the `email` field with a role argument.
-
-```tsx
- type User {
- id: Int!
- name: String!
-    email: String! @requireAuth(roles: ["ADMIN"])
- createdAt: DateTime!
- }
-
- type Query {
- user(id: Int!): User @requireAuth
- }
-```
-
-Now, if a user who is not an ADMIN queries:
-
-```graphql
-{
-  user(id: 1) {
-    id
-    name
-    createdAt
-  }
-}
-```
-
-They will get a result.
-
-But, if they try to query:
-
-```graphql
-{
-  user(id: 1) {
-    id
-    name
-    email
-    createdAt
-  }
-}
-```
-
-The request will be denied with a forbidden error.
-
-### Chaining a Validator and a Transformer
-
-Similar to the prior example, you may want to chain directives, but the transform doesn't consider authentication or role membership.
-
-For example, here we ensure that anyone trying to query a User and fetch the email must be authenticated.
-
-And then, if they are, apply a mask to the email field.
-
-```tsx
- type User {
- id: Int!
- name: String!
- email: String! @requireAuth @maskedEmail
- createdAt: DateTime!
- }
-```
-
-### Cascade Transformers
-
-Maybe you want to apply multiple field formatting?
-
-If your request event headers include geographic or timezone info, you could compose a custom Transformer directive called `@localTimezone` that inspects the header value and converts `createdAt` from UTC to local time, something often done in the browser.
-
-Then, you can chain the `@dateFormat` Transformer to return just the date portion of the timestamp, not the time.
-
-```tsx
- type User {
- id: Int!
- name: String!
- email: String!
- createdAt: DateTime! @localTimezone @dateFormat
- }
-```
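Conceptually, cascading transformers is just left-to-right function composition over the resolved value. A simplified sketch (both transforms here are illustrative stand-ins, with the timezone conversion reduced to a fixed offset):

```javascript
// Each transformer is a function over the resolved value; chaining
// `@localTimezone @dateFormat` amounts to applying them left to right.
const localTimezone = (isoUtc, offsetHours = -5) =>
  new Date(new Date(isoUtc).getTime() + offsetHours * 3_600_000).toISOString()

const dateFormat = (iso) => iso.slice(0, 10) // keep the date, drop the time

// Apply the transforms in order, feeding each one's output to the next.
const chain = (...transforms) => (value) =>
  transforms.reduce((acc, transform) => transform(acc), value)

console.log(chain(localTimezone, dateFormat)('2021-07-04T03:30:00Z')) // "2021-07-03"
```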
-
-> **Note**: These directives could alternatively be implemented as "operation directives" so the client can use them on a query instead of in the schema. Such directives are a potential future Redwood feature.
-
-## GraphQL Handler Setup
-
-Redwood makes it easy to code, organize, and map your directives into your GraphQL schema.
-Simply add them to the `directives` directory and the `createGraphQLHandler` does all the work.
-
-> **Note**: Redwood has a generator that will do all the heavy lifting setup for you!
-
-```tsx title="api/src/functions/graphql.ts"
-import { createGraphQLHandler } from '@redwoodjs/graphql-server'
-
-import directives from 'src/directives/**/*.{js,ts}' // 👈 directives live here
-import sdls from 'src/graphql/**/*.sdl.{js,ts}'
-import services from 'src/services/**/*.{js,ts}'
-
-import { db } from 'src/lib/db'
-import { logger } from 'src/lib/logger'
-
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: {} },
- directives, // 👈 directives are added to the schema here
- sdls,
- services,
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-## Secure by Default with Built-in Directives
-
-By default, your GraphQL endpoint is open to the world.
-
-That means anyone can request any query and invoke any Mutation.
-Whatever types and fields are defined in your SDL is data that anyone can access.
-
-But Redwood encourages being secure by default by defaulting all queries and mutations to have the `@requireAuth` directive when generating SDL or a service.
-When your app builds and your server starts up, Redwood checks that **all** queries and mutations have `@requireAuth`, `@skipAuth` or a custom directive applied.
-
-If not, then your build will fail:
-
-```bash
- ✖ Verifying graphql schema...
- Building API...
- Cleaning Web...
- Building Web...
- Prerendering Web...
-You must specify one of @requireAuth, @skipAuth or a custom directive for
-- contacts Query
-- posts Query
-- post Query
-- updatePost Mutation
-- deletePost Mutation
-```
-
-or your server won't start up, and you should see that "Schema validation failed":
-
-```bash
-gen | Generating TypeScript definitions and GraphQL schemas...
-gen | 47 files generated
-api | Building... Took 593 ms
-api | [GQL Server Error] - Schema validation failed
-api | ----------------------------------------
-api | You must specify one of @requireAuth, @skipAuth or a custom directive for
-api | - posts Query
-api | - createPost Mutation
-api | - updatePost Mutation
-api | - deletePost Mutation
-```
-
-To correct this, add the appropriate directive to your queries and mutations.
-
-### @requireAuth
-
-It's your responsibility to implement the `requireAuth()` function in your app's `api/src/lib/auth.{js|ts}` to check if the user is properly authenticated and/or has the expected role membership.
-
-The `@requireAuth` directive will call the `requireAuth()` function to determine if the user is authenticated or not.
-
-```tsx title="api/src/lib/auth.ts"
-// ...
-
-export const isAuthenticated = (): boolean => {
- return true // 👈 replace with the appropriate check
-}
-
-// ...
-
-export const requireAuth = ({ roles }: { roles: AllowedRoles }) => {
- if (!isAuthenticated()) {
- throw new AuthenticationError("You don't have permission to do that.")
- }
-
- if (!hasRole({ roles })) {
- throw new ForbiddenError("You don't have access to do that.")
- }
-}
-```
-
-> **Note**: The `auth.ts` file here is the stub for a new RedwoodJS app. Once you have setup auth with your provider, this will enforce a proper authentication check.
-
-### @skipAuth
-
-If, however, you want your query or mutation to be public, then simply use `@skipAuth`.
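
For example (the `Post` type and its fields here are illustrative, not from a real schema), you could expose a public query while keeping mutations protected:

```graphql
type Query {
  # Anyone can read posts; no authentication required
  posts: [Post!]! @skipAuth
}

type Mutation {
  # Only authenticated users with the ADMIN role can create posts
  createPost(input: CreatePostInput!): Post! @requireAuth(roles: "ADMIN")
}
```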
-
-## Custom Directives
-
-Want to write your own directive? You can of course!
-Just generate one using the Redwood CLI; it takes care of the boilerplate and even gives you a handy test!
-
-### Generators
-
-When using the `yarn redwood generate` command,
-you'll be presented with a choice of creating a Validator or a Transformer directive.
-
-```bash
-yarn redwood generate directive myDirective
-
-? What type of directive would you like to generate? › - Use arrow-keys. Return to submit.
-❯ Validator - Implement a validation: throw an error if criteria not met to stop execution
- Transformer - Modify values of fields or query responses
-```
-
-> **Note:** You can pass the `--type` flag with either `validator` or `transformer` to create the desired directive type.
-
-After picking the directive type, the files will be created in your `api/src/directives` directory:
-
-```bash
- ✔ Generating directive file ...
- ✔ Successfully wrote file `./api/src/directives/myDirective/myDirective.test.ts`
- ✔ Successfully wrote file `./api/src/directives/myDirective/myDirective.ts`
- ✔ Generating TypeScript definitions and GraphQL schemas ...
- ✔ Next steps...
-
- After modifying your directive, you can add it to your SDLs e.g.:
- // example todo.sdl.js
- # Option A: Add it to a field
- type Todo {
- id: Int!
- body: String! @myDirective
- }
-
- # Option B: Add it to query/mutation
- type Query {
- todos: [Todo] @myDirective
- }
-```
-
-### Validator
-
-Let's create a `@isSubscriber` directive that checks roles to see if the user is a subscriber.
-
-```bash
-yarn rw g directive isSubscriber --type validator
-```
-
-Next, implement your validation logic in the directive's `validate` function.
-
-Validator directives don't have access to the field value (they're called before the value is resolved), but they do have access to the `context` and `directiveArgs`.
-They can be async or sync.
-If you want to stop execution (because of insufficient permissions, for example), throw an error.
-The return value is ignored.
-
-An example of `directiveArgs` is the `roles` argument in the directive `requireAuth(roles: "ADMIN")`.
-
-```tsx
-const validate: ValidatorDirectiveFunc = ({ context, directiveArgs }) => {
- // You can also modify your directive to take arguments
- // and use the directiveArgs object provided to this function to get values
- logger.debug(directiveArgs, 'directiveArgs in isSubscriber directive')
-
- throw new Error('Implementation missing for isSubscriber')
-}
-```
-
-Here we can access the `context` parameter and then check to see if the `currentUser` is authenticated and if they belong to the `SUBSCRIBER` role:
-
-```tsx title="/api/src/directives/isSubscriber/isSubscriber.ts"
-// ...
-
-const validate: ValidatorDirectiveFunc = ({ context }) => {
- if (!context.currentUser) {
- throw new AuthenticationError("You don't have permission to do that.")
- }
-
- if (!context.currentUser.roles?.includes('SUBSCRIBER')) {
- throw new ForbiddenError("You don't have access to do that.")
- }
-}
-```
-
-#### Writing Validator Tests
-
-When writing a Validator directive test, you'll want to:
-
-- ensure the directive is named consistently and correctly so the directive name maps properly when validating
-- confirm that the directive throws an error when invalid. The Validator directive should always have a reason to throw an error
-
-Since we stub out the `Error('Implementation missing for isSubscriber')` case when generating the Validator directive, these tests should pass.
-But once you begin implementing the validate logic, it's on you to update appropriately.
-
-```tsx
-import { mockRedwoodDirective, getDirectiveName } from '@redwoodjs/testing/api'
-
-import isSubscriber from './isSubscriber'
-
-describe('isSubscriber directive', () => {
- it('declares the directive sdl as schema, with the correct name', () => {
- expect(isSubscriber.schema).toBeTruthy()
- expect(getDirectiveName(isSubscriber.schema)).toBe('isSubscriber')
- })
-
- it('throws an error if validation does not pass', () => {
- const mockExecution = mockRedwoodDirective(isSubscriber, {})
-
- expect(mockExecution).toThrowError('Implementation missing for isSubscriber')
- })
-})
-```
-
-:::tip
-If your Validator Directive is asynchronous, you can use `mockAsyncRedwoodDirective` instead.
-
-```ts
-import { mockAsyncRedwoodDirective } from '@redwoodjs/testing/api'
-
-// ...
-
-describe('isSubscriber directive', () => {
- it('throws an error if validation does not pass', async () => {
- const mockExecution = mockAsyncRedwoodDirective(isSubscriber, {})
- await expect(mockExecution()).rejects.toThrowError(
- 'Implementation missing for isSubscriber'
- )
- })
-})
-```
-
-:::
-
-### Transformer
-
-Let's create a `@maskedEmail` directive that checks roles to see if the user should see the complete email address or if it should be obfuscated from prying eyes:
-
-```bash
-yarn rw g directive maskedEmail --type transformer
-```
-
-Next, implement your validation logic in the directive's `transform` function.
-
-Transformer directives provide `context` and `resolvedValue` parameters and run **after** resolving the value.
-Transformer directives **must** be synchronous, and return a value.
-You can throw an error, if you want to stop executing, but note that the value has already been resolved.
-
-Take note of the `resolvedValue`:
-
-```tsx
-const transform: TransformerDirectiveFunc = ({ context, resolvedValue }) => {
- return resolvedValue.replace('foo', 'bar')
-}
-```
-
-It contains the value of the field the directive was placed on; here, `email`.
-So `resolvedValue` will be the value of the `email` property in the User model: the "original value", so to speak.
-
-Whatever you return from the `transform` function replaces the `email` value in the response.
-
-> 🛎️ **Important**
->
-> You must return a value of the same type. So, if your `resolvedValue` is a `String`, return a `String`. If it's a `Date`, return a `Date`. Otherwise, your data will not match the SDL Type.
-
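To make this concrete, here's a sketch of the masking logic such a directive might use. The helper name and masking rule are our own, not part of the generated code; adapt both to your needs.

```typescript
// Hypothetical helper: keep the first character of the local part and the
// full domain, e.g. "jane@example.com" -> "j***@example.com".
const maskEmail = (email: string): string => {
  const atIndex = email.indexOf('@')
  if (atIndex <= 0) return email // not email-shaped; leave it alone
  return `${email[0]}***${email.slice(atIndex)}`
}

// In the directive itself, you'd return the masked value from `transform`,
// optionally checking roles to decide whether to mask at all:
//
// const transform: TransformerDirectiveFunc = ({ context, resolvedValue }) => {
//   if (context.currentUser?.roles?.includes('ADMIN')) return resolvedValue
//   return maskEmail(resolvedValue)
// }
```
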
-#### Writing Transformer Tests
-
-When writing a Transformer directive test, you'll want to:
-
-- ensure the directive is named consistently and correctly so the directive name maps properly when transforming
-- confirm that the directive returns a value and that it's the expected transformed value
-
-Since we stub out and mock the `mockedResolvedValue` when generating the Transformer directive, these tests should pass.
-
-Here we mock the value `foo` and, since the generated `transform` function replaces `foo` with `bar`, we expect that after execution, the returned value will be `bar`.
-But once you begin implementing the transform logic, it's on you to update appropriately.
-
-```tsx
-import { mockRedwoodDirective, getDirectiveName } from '@redwoodjs/testing/api'
-
-import maskedEmail from './maskedEmail'
-
-describe('maskedEmail directive', () => {
- it('declares the directive sdl as schema, with the correct name', () => {
- expect(maskedEmail.schema).toBeTruthy()
- expect(getDirectiveName(maskedEmail.schema)).toBe('maskedEmail')
- })
-
- it('transforms the value', () => {
- const mockExecution = mockRedwoodDirective(maskedEmail, {
- mockedResolvedValue: 'foo',
- })
-
- expect(mockExecution()).toBe('bar')
- })
-})
-```
-
-:::tip
-If your Transformer Directive is asynchronous, you can use `mockAsyncRedwoodDirective` instead.
-
-```ts
-import { mockAsyncRedwoodDirective } from '@redwoodjs/testing/api'
-
-// ...
-
-import maskedEmail from './maskedEmail'
-
-describe('maskedEmail directive', () => {
- it('transforms the value', async () => {
- const mockExecution = mockAsyncRedwoodDirective(maskedEmail, {
- mockedResolvedValue: 'foo',
- })
-
- await expect(mockExecution()).resolves.toBe('bar')
- })
-})
-```
-:::
diff --git a/docs/versioned_docs/version-7.0/docker.md b/docs/versioned_docs/version-7.0/docker.md
deleted file mode 100644
index 53a1c3aa0a38..000000000000
--- a/docs/versioned_docs/version-7.0/docker.md
+++ /dev/null
@@ -1,680 +0,0 @@
----
-description: Redwood's Dockerfile
----
-
-# Docker
-
-:::note The Dockerfile is experimental
-
-Redwood's Dockerfile is the collective effort of several hard-working community members.
-We've worked hard to optimize it, but expect changes as we collaborate with users and deploy providers.
-
-:::
-
-If you're not familiar with Docker, we recommend going through their [getting started](https://docs.docker.com/get-started/) documentation.
-
-## Set up
-
-To get started, run the setup command:
-
-```
-yarn rw experimental setup-docker
-```
-
-The setup command does several things:
-- writes four files: `Dockerfile`, `.dockerignore`, `docker-compose.dev.yml`, and `docker-compose.prod.yml`
-- adds the `@redwoodjs/api-server` and `@redwoodjs/web-server` packages to the api and web sides respectively
-- edits the `browser.open` setting in the `redwood.toml` (right now, if it's set to `true`, it'll break the dev server when running the `docker-compose.dev.yml`)
-
-## Usage
-
-You can start the dev compose file with:
-
-```
-docker compose -f ./docker-compose.dev.yml up
-```
-
-And the prod compose file with:
-
-```
-docker compose -f ./docker-compose.prod.yml up
-```
-
-:::info make sure to specify build args
-
-If your api side or web side depends on env vars at build time, you may need to supply them with `--build-arg`, or in the compose files.
-
-This is often the most tedious part of setting up Docker. Have ideas of how it could be better? Let us know on the [forums](https://community.redwoodjs.com/)!
-
-:::
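
For example, you might pass a build-time variable through the compose file's `build.args` key. The service layout and variable name below are assumptions; match them to the `ARG`s in your Dockerfile:

```yaml
# Fragment of a hypothetical docker-compose.prod.yml
services:
  web:
    build:
      context: .
      target: web_serve
      args:
        # MY_BUILD_TIME_ENV_VAR is a placeholder; use the ARG names
        # your web side actually needs at build time
        MY_BUILD_TIME_ENV_VAR: ${MY_BUILD_TIME_ENV_VAR}
```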
-
-The first time you do this, you'll have to use the `console` stage to go in and migrate the database—just like you would with a Redwood app on your machine:
-
-```
-docker compose -f ./docker-compose.dev.yml run --rm -it console /bin/bash
-root@...:/home/node/app# yarn rw prisma migrate dev
-```
-
-## The Dockerfile in detail
-
-The documentation here goes through and explains every line of Redwood's Dockerfile.
-If you'd like to see the whole Dockerfile for reference, you can find it [here](https://github.com/redwoodjs/redwood/tree/main/packages/cli/src/commands/experimental/templates/docker/Dockerfile) or by setting it up in your project: `yarn rw experimental setup-docker`.
-
-Redwood takes advantage of [Docker's multi-stage build support](https://docs.docker.com/build/building/multi-stage/) to keep the final production images lean.
-
-### The `base` stage
-
-The `base` stage installs dependencies.
-It's used as the base image for the build stages and the `console` stage.
-
-```Dockerfile
-FROM node:20-bookworm-slim as base
-```
-
-We use a Node.js 20 image as the base image because that's the version Redwood targets.
-"bookworm" is the codename for the current stable distribution of Debian (version 12).
-Lastly, the "slim" variant of the `node:20-bookworm` image only includes what Node.js needs, which reduces the image's size while making it more secure.
-
-:::tip Why not alpine?
-
-While alpine may be smaller, it uses musl, a different C standard library.
-In developing this Dockerfile, we prioritized security over size.
-
-If you know what you're doing feel free to change this—it's your Dockerfile now!
-Just remember to change the `apt-get` instructions further down too if needed.
-
-:::
-
-Moving on, next we have `corepack enable`:
-
-```Dockerfile
-RUN corepack enable
-```
-
-[Corepack](https://nodejs.org/docs/latest-v18.x/api/corepack.html), Node's manager for package managers, needs to be enabled so that Yarn can use the `packageManager` field in your project's root `package.json` to pick the right version of itself.
-If you'd rather check in the binary, you still can, but you'll need to remember to copy it over (i.e. `COPY --chown=node:node .yarn/releases .yarn/releases`).
-
-```Dockerfile
-RUN apt-get update && apt-get install -y \
- openssl \
- # python3 make gcc \
- && rm -rf /var/lib/apt/lists/*
-```
-
-The `node:20-bookworm-slim` image doesn't have [OpenSSL](https://www.openssl.org/), which [seems to be a bug](https://github.com/nodejs/docker-node/issues/1919).
-(It was included in the "bullseye" image, the codename for Debian 11.)
-On Linux, [Prisma needs OpenSSL](https://www.prisma.io/docs/reference/system-requirements#linux-runtime-dependencies), so we install it here via Debian's package manager, APT.
-Python and its dependencies are there ready to be uncommented if you need them. See the [Troubleshooting](#python) section for more information.
-
-[It's recommended](https://docs.docker.com/develop/develop-images/instructions/#apt-get) to combine `apt-get update` and `apt-get install -y` in the same `RUN` statement for cache busting.
-After installing, we clean up the apt cache to keep the layer lean. (Running `apt-get clean` isn't required—[official Debian images do it automatically](https://github.com/moby/moby/blob/03e2923e42446dbb830c654d0eec323a0b4ef02a/contrib/mkimage/debootstrap#L82-L105).)
-
-```Dockerfile
-USER node
-```
-
-This and subsequent `chown` options in `COPY` instructions are for security.
-[Services that can run without privileges should](https://docs.docker.com/develop/develop-images/instructions/#user).
-The Node.js image includes a user, `node`, created with an explicit `uid` and `gid` (`1000`).
-We reuse it.
-
-```Dockerfile
-WORKDIR /home/node/app
-
-COPY --chown=node:node .yarnrc.yml .
-COPY --chown=node:node package.json .
-COPY --chown=node:node api/package.json api/
-COPY --chown=node:node web/package.json web/
-COPY --chown=node:node yarn.lock .
-```
-
-Here we copy the minimum set of files that the `yarn install` step needs.
-The order isn't completely arbitrary—it tries to maximize [Docker's layer caching](https://docs.docker.com/build/cache/).
-We expect `yarn.lock` to change more than the `package.json`s and the `package.json`s to change more than `.yarnrc.yml`.
-That said, it's hard to argue that these files couldn't be arranged differently, or that the `COPY` instructions couldn't be combined.
-The important thing is that they're all here, before the `yarn install` step:
-
-```Dockerfile
-RUN mkdir -p /home/node/.yarn/berry/index
-RUN mkdir -p /home/node/.cache
-
-RUN --mount=type=cache,target=/home/node/.yarn/berry/cache,uid=1000 \
- --mount=type=cache,target=/home/node/.cache,uid=1000 \
- CI=1 yarn install
-```
-
-This step installs all your project's dependencies—production and dev.
-Since we use multi-stage builds, your production images won't pay for the dev dependencies installed in this step.
-The build stages need the dev dependencies.
-
-The `mkdir` steps are a workaround for a permission error. We're working on removing them, but for now if you remove them the install step will probably fail.
-
-This step is a bit more involved than the others.
-It uses a [cache mount](https://docs.docker.com/build/cache/#use-your-package-manager-wisely).
-Yarn operates in three steps: resolution, fetch, and link.
-If you're not careful, the cache for the fetch step basically doubles the number of `node_modules` installed on disk.
-We could disable it altogether, but by using a cache mount, we can still get the benefits without paying twice.
-We set it to the default directory here, but you can change its location in `.yarnrc.yml`.
-If you've done so you'll have to change it here too.
-
-One more thing to note: without setting `CI=1`, depending on the deploy provider, yarn may think it's in a TTY, making the logs difficult to read. With this set, yarn adapts accordingly.
-Enabling CI enables [immutable installs](https://v3.yarnpkg.com/configuration/yarnrc#enableImmutableInstalls) and [inline builds](https://v3.yarnpkg.com/configuration/yarnrc#enableInlineBuilds), both of which are highly recommended.
-
-```Dockerfile
-COPY --chown=node:node redwood.toml .
-COPY --chown=node:node graphql.config.js .
-COPY --chown=node:node .env.defaults .env.defaults
-```
-
-We'll need these config files for the build and production stages.
-The `redwood.toml` file is Redwood's de-facto config file.
-Both the build and serve stages read it to enable and configure functionality.
-
-:::warning `.env.defaults` is ok to include but `.env` is not
-
-If you add a secret to the Dockerfile, it can be excavated.
-While it's technically true that multi stage builds add a sort of security layer, it's not a best practice.
-Leave them out and look to your deploy provider for further configuration.
-
-:::
-
-### The `api_build` stage
-
-The `api_build` stage builds the api side:
-
-```Dockerfile
-FROM base as api_build
-
-# If your api side build relies on build-time environment variables,
-# specify them here as ARGs.
-#
-# ARG MY_BUILD_TIME_ENV_VAR
-
-COPY --chown=node:node api api
-RUN yarn rw build api
-```
-
-After the work we did in the base stage, building the api side amounts to copying in the api directory and running `yarn rw build api`.
-
-### The `api_serve` stage
-
-The `api_serve` stage serves your GraphQL api and functions:
-
-```Dockerfile
-FROM node:20-bookworm-slim as api_serve
-
-RUN corepack enable
-
-RUN apt-get update && apt-get install -y \
- openssl \
- # python3 make gcc \
- && rm -rf /var/lib/apt/lists/*
-```
-
-We don't start from the `base` stage, but begin anew with the `node:20-bookworm-slim` image.
-Since this is a production stage, it's important for it to be as small as possible.
-Docker's [multi-stage builds](https://docs.docker.com/build/building/multi-stage/) enables this.
-
-```Dockerfile
-USER node
-WORKDIR /home/node/app
-
-COPY --chown=node:node .yarnrc.yml .yarnrc.yml
-COPY --chown=node:node package.json .
-COPY --chown=node:node api/package.json api/
-COPY --chown=node:node yarn.lock yarn.lock
-```
-
-Like other `COPY` instructions, ordering these files with care enables layer caching.
-
-```Dockerfile
-RUN mkdir -p /home/node/.yarn/berry/index
-RUN mkdir -p /home/node/.cache
-
-RUN --mount=type=cache,target=/home/node/.yarn/berry/cache,uid=1000 \
- --mount=type=cache,target=/home/node/.cache,uid=1000 \
- CI=1 yarn workspaces focus api --production
-```
-
-This is a critical step for image size.
-We don't use the regular `yarn install` command.
-Using the [official workspaces plugin](https://github.com/yarnpkg/berry/tree/master/packages/plugin-workspace-tools)—which is included by default in yarn v4—we "focus" on the api workspace, only installing its production dependencies.
-
-The cache mount will be populated at this point from the install in the `base` stage, so the fetch step should fly by.
-
-```Dockerfile
-COPY --chown=node:node redwood.toml .
-COPY --chown=node:node graphql.config.js .
-COPY --chown=node:node .env.defaults .env.defaults
-
-COPY --chown=node:node --from=api_build /home/node/app/api/dist /home/node/app/api/dist
-COPY --chown=node:node --from=api_build /home/node/app/api/db /home/node/app/api/db
-COPY --chown=node:node --from=api_build /home/node/app/node_modules/.prisma /home/node/app/node_modules/.prisma
-```
-
-Here's where we really take advantage of multi-stage builds by copying from the `api_build` stage.
-At this point all the building has been done. Now we can just grab the artifacts without having to lug around the dev dependencies.
-
-There's one more thing that was built: the prisma client in `node_modules/.prisma`.
-We need to grab it too.
-
-```Dockerfile
-ENV NODE_ENV=production
-
-CMD [ "node_modules/.bin/rw-server", "api" ]
-```
-
-Lastly, the default command is to start the api server using the bin from the `@redwoodjs/api-server` package.
-You can override this command if you have more specific needs.
-
-Note that the Redwood CLI isn't available anymore. (It's a dev dependency.)
-To access the server bin, we have to find its path in `node_modules`.
-Though this is somewhat discouraged in modern yarn, since we're using the `node-modules` node linker, it's in `node_modules/.bin`.
-
-### The `web_build` stage
-
-The `web_build` stage builds the web side:
-
-```Dockerfile
-FROM base as web_build
-
-COPY --chown=node:node web web
-RUN yarn rw build web --no-prerender
-```
-
-After the work we did in the base stage, building the web side amounts to copying in the web directory and running `yarn rw build web`.
-
-This stage is a bit of a simplification.
-It foregoes Redwood's prerendering (SSG) capability.
-Prerendering is a little trickier; see [the `web_prerender_build` stage](#the-web_prerender_build-stage).
-
-If you've included environment variables in your `redwood.toml`'s `web.includeEnvironmentVariables` field, you'll want to specify them as ARGs here.
-The setup command should've inlined them for you.
-
-### The `web_prerender_build` stage
-
-The `web_prerender_build` stage builds the web side with prerender.
-
-```Dockerfile
-FROM api_build as web_prerender_build
-
-COPY --chown=node:node web web
-RUN yarn rw build web
-```
-
-Building the web side with prerendering poses a challenge.
-Prerender needs the api side around to get data for your Cells and route hooks.
-The key line here is the first one—this stage uses the `api_build` stage as its base image.
-
-### The `web_serve` stage
-
-```Dockerfile
-FROM node:20-bookworm-slim as web_serve
-
-RUN corepack enable
-
-USER node
-WORKDIR /home/node/app
-
-COPY --chown=node:node .yarnrc.yml .
-COPY --chown=node:node package.json .
-COPY --chown=node:node web/package.json web/
-COPY --chown=node:node yarn.lock .
-
-RUN mkdir -p /home/node/.yarn/berry/index
-RUN mkdir -p /home/node/.cache
-
-RUN --mount=type=cache,target=/home/node/.yarn/berry/cache,uid=1000 \
- --mount=type=cache,target=/home/node/.cache,uid=1000 \
- CI=1 yarn workspaces focus web --production
-
-COPY --chown=node:node redwood.toml .
-COPY --chown=node:node graphql.config.js .
-COPY --chown=node:node .env.defaults .env.defaults
-
-COPY --chown=node:node --from=web_build /home/node/app/web/dist /home/node/app/web/dist
-
-ENV NODE_ENV=production \
- API_PROXY_TARGET=http://api:8911
-
-CMD "node_modules/.bin/rw-web-server" "--api-proxy-target" "$API_PROXY_TARGET"
-```
-
-Most of this stage is similar to the `api_serve` stage, except that we're copying from the `web_build` stage instead of the `api_build`.
-(If you're prerendering, you'll want to change the `--from=web_build` to `--from=web_prerender_build`.)
-
-The binary we're using here to serve the web side is `rw-web-server` which comes from the `@redwoodjs/web-server` package.
-While this web server will be much more fully featured in the future, right now it's mostly just to get you going.
-Ideally you want to put a web server like Nginx or Caddy in front of it.
-
-Lastly, note that we use the shell form of `CMD` here for its variable expansion.
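
As a rough sketch, a reverse proxy in front of `rw-web-server` might look like this. The port, server name, and upstream address are assumptions to adapt to your deployment (8910 is the web side's default port):

```nginx
server {
  listen 80;
  server_name example.com;

  location / {
    # Forward to rw-web-server running inside the container
    proxy_pass http://localhost:8910;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
  }
}
```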
-
-### The `console` stage
-
-The `console` stage is an optional stage for debugging:
-
-```Dockerfile
-FROM base as console
-
-# To add more packages:
-#
-# ```
-# USER root
-#
-# RUN apt-get update && apt-get install -y \
-# curl
-#
-# USER node
-# ```
-
-COPY --chown=node:node api api
-COPY --chown=node:node web web
-COPY --chown=node:node scripts scripts
-```
-
-The console stage completes the base stage by copying in the rest of your Redwood app.
-But then it pretty much leaves you to your own devices.
-The intended way to use it is to create an ephemeral container by starting a shell like `/bin/bash` in the image built by targeting this stage:
-
-```bash
-# Build the console image:
-docker build . -t console --target console
-# Start an ephemeral container from it:
-docker run --rm -it console /bin/bash
-```
-
-As the comment says, feel free to add more packages.
-We intentionally kept them to a minimum in the base stage, but you shouldn't worry about the size of the image here.
-
-## Troubleshooting
-
-### Python
-
-We tried to make the Dockerfile as lean as possible.
-In some cases, that means we excluded a dependency your project needs.
-And by far the most common is Python.
-
-During a stage's `yarn install` step (`RUN ... yarn install`), if you see an error like the following:
-
-```
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python Python is not set from command line or npm configuration
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python Python is not set from environment variable PYTHON
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python checking if "python3" can be used
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python - executable path is ""
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python - "" could not be run
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python checking if "python" can be used
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python - executable path is ""
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python - "" could not be run
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python **********************************************************
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python You need to install the latest version of Python.
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python Node-gyp should be able to find and use Python. If not,
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python you can try one of the following options:
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python - Use the switch --python="/path/to/pythonexecutable"
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python (accepted by both node-gyp and npm)
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python - Set the environment variable PYTHON
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python - Set the npm configuration variable python:
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python npm config set python "/path/to/pythonexecutable"
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python For more information consult the documentation at:
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python https://github.com/nodejs/node-gyp#installation
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python **********************************************************
-➤ YN0000: │ bufferutil@npm:4.0.8 STDERR gyp ERR! find Python
-```
-
-It's because your project depends on Python and the image doesn't provide it.
-
-It's easy to fix: just add `python3` and its dependencies (usually `make` and `gcc`):
-
-```diff
- FROM node:20-bookworm-slim as base
-
- RUN apt-get update && apt-get install -y \
- openssl \
-+ python3 make gcc \
- && rm -rf /var/lib/apt/lists/*
-```
-
-Not sure why your project depends on Python? `yarn why` is your friend.
-From the error message, we know `bufferutil` couldn't build.
-But why do we have `bufferutil`?
-
-```
-yarn why bufferutil
-└─ websocket@npm:1.0.34
- └─ bufferutil@npm:4.0.8 (via npm:^4.0.1)
-```
-
-`websocket` needs `bufferutil`. But why do we have `websocket`?
-Keep pulling the thread till you get to a top-level dependency:
-
-```
-yarn why websocket
-└─ @supabase/realtime-js@npm:2.8.4
- └─ websocket@npm:1.0.34 (via npm:^1.0.34)
-
-yarn why @supabase/realtime-js
-└─ @supabase/supabase-js@npm:2.38.4
- └─ @supabase/realtime-js@npm:2.8.4 (via npm:^2.8.4)
-
-yarn why @supabase/supabase-js
-├─ api@workspace:api
-│ └─ @supabase/supabase-js@npm:2.38.4 (via npm:^2.21.0)
-│
-└─ web@workspace:web
- └─ @supabase/supabase-js@npm:2.38.4 (via npm:^2.21.0)
-```
-
-In this case, it looks like it's ultimately because of our auth provider, `@supabase/supabase-js`.
-
-## Using the Server File
-
-Redwood v7 introduced a new entry point to Redwood's api server: the server file at `api/src/server.ts`.
-The server file was made with Docker in mind. It allows you to
-
-1. have control over how the api server starts,
-2. customize the server as much as you want, and
-3. minimize the number of dependencies needed to start the api server process (all you need is Node.js!)
-
-Get started by running the setup command:
-
-```
-yarn rw setup server-file
-```
-
-This should give you a new file at `api/src/server.ts`:
-
-```typescript title="api/src/server.ts"
-import { createServer } from '@redwoodjs/api-server'
-
-import { logger } from 'src/lib/logger'
-
-async function main() {
- const server = await createServer({
- logger,
- })
-
- await server.start()
-}
-
-main()
-```
-
-Without the server file, to start the api side, you'd use binaries provided by `@redwoodjs/api-server` such as `yarn rw-server api` (you may also see this as `./node_modules/.bin/rw-server api`).
-
-With the server file, there's no indirection. Just use `node`:
-
-```
-yarn node api/dist/server.js
-```
-
-:::info You have to build first
-
-You can't run the server file directly with Node.js; it has to be built first:
-
-```
-yarn rw build api
-```
-
-The api serve stage in the Dockerfile pulls from the api build stage, so things are already in the right order there. Similarly, for `yarn rw dev`, the dev server will build and reload the server file for you.
-
-:::
-
-That means you can swap the `CMD` instruction in the api server stage:
-
-```diff
- ENV NODE_ENV=production
-
-- CMD [ "node_modules/.bin/rw-server", "api" ]
-+ CMD [ "yarn", "node", "api/dist/server.js" ]
-```
-
-### Configuring the server
-
-There are two ways to configure the server.
-
-First, you can configure how the underlying Fastify server is instantiated via the `fastifyServerOptions` passed to the `createServer` function:
-
-```ts title="api/src/server.ts"
-const server = await createServer({
- logger,
- // highlight-start
- fastifyServerOptions: {
- // ...
- }
- // highlight-end
-})
-```
-
-For the complete list of options, see [Fastify's documentation](https://fastify.dev/docs/latest/Reference/Server/#factory).
-
-Second, you can register Fastify plugins on the server instance:
-
-```ts title="api/src/server.ts"
-const server = await createServer({
- logger,
-})
-
-// highlight-next-line
-server.register(myFastifyPlugin)
-```
-
-#### Example: Compressing Payloads and Rate Limiting
-
-Let's say that we want to compress payloads and add rate limiting.
-We want to compress payloads only if they're larger than 1KB, preferring deflate to gzip,
-and we want to limit IP addresses to 100 requests in a five minute window.
-We can leverage two Fastify ecosystem plugins, [@fastify/compress](https://github.com/fastify/fastify-compress) and [@fastify/rate-limit](https://github.com/fastify/fastify-rate-limit) respectively.
-
-First, you'll need to install these packages:
-
-```
-yarn workspace api add @fastify/compress @fastify/rate-limit
-```
-
-Then register them with the appropriate config:
-
-```ts title="api/src/server.ts"
-const server = await createServer({
- logger,
-})
-
-await server.register(import('@fastify/compress'), {
- global: true,
- threshold: 1024,
- encodings: ['deflate', 'gzip'],
-})
-
-await server.register(import('@fastify/rate-limit'), {
- max: 100,
- timeWindow: '5 minutes',
-})
-```
-
-#### Example: File Uploads
-
-If you try to POST file content, such as images or PDFs, to the api server, you may see the following error from Fastify:
-
-```json
-{
- "statusCode": 400,
- "code": "FST_ERR_CTP_INVALID_CONTENT_LENGTH",
- "error": "Bad Request",
- "message": "Request body size did not match Content-Length"
-}
-```
-
-This is because Fastify [only supports `application/json` and `text/plain` content types natively](https://www.fastify.io/docs/latest/Reference/ContentTypeParser/).
-While Redwood configures the api server to also accept `application/x-www-form-urlencoded` and `multipart/form-data`, if you want to support other content or MIME types (like images or PDFs), you'll need to configure them here in the server file.
-
-You can use Fastify's `addContentTypeParser` function to allow uploads of the content types your application needs.
-For example, to support image file uploads you'd tell Fastify to allow `/^image\/.*/` content types:
-
-```ts title="api/src/server.ts"
-const server = await createServer({
- logger,
-})
-
-server.addContentTypeParser(/^image\/.*/, (req, payload, done) => {
- payload.on('end', () => {
- done()
- })
-})
-```
-
-The regular expression (`/^image\/.*/`) above allows all image content or MIME types because [they start with "image"](https://developer.mozilla.org/en-US/docs/Web/Media/Formats/Image_types).
-
-Now, when you POST those content types to a function served by the api server, you can access the file content on `event.body`.
-
-### The `start` method
-
-Since there are a few different ways to configure the host and port the server listens at, the server instance returned by `createServer` has a special `start` method:
-
-```ts title="api/src/server.ts"
-await server.start()
-```
-
-`start` is a thin wrapper around [`listen`](https://fastify.dev/docs/latest/Reference/Server/#listen).
-It takes the same arguments as `listen`, except for host and port. It computes those in the following way, in order of precedence:
-
-1. `--apiHost` or `--apiPort` flags:
-
- ```
- yarn node api/dist/server.js --apiHost 0.0.0.0 --apiPort 8913
- ```
-
-2. `REDWOOD_API_HOST` or `REDWOOD_API_PORT` env vars:
-
- ```
- export REDWOOD_API_HOST='0.0.0.0'
- export REDWOOD_API_PORT='8913'
- yarn node api/dist/server.js
- ```
-
-3. `[api].host` and `[api].port` in `redwood.toml`:
-
- ```toml title="redwood.toml"
- [api]
- host = '0.0.0.0'
- port = 8913
- ```
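-
-The order of precedence above boils down to a simple fallback chain. Here's a sketch of the idea (a simplified illustration, not Redwood's actual implementation; the `resolveHostAndPort` helper and its `flags`/`env`/`toml` parameters are hypothetical):
-
-```js
-// Sketch: CLI flags beat env vars, which beat redwood.toml settings
-function resolveHostAndPort({ flags = {}, env = {}, toml = {} }) {
-  const host = flags.apiHost ?? env.REDWOOD_API_HOST ?? toml.host
-  const port = flags.apiPort ?? env.REDWOOD_API_PORT ?? toml.port
-  return { host, port }
-}
-
-// A flag wins over both the env var and the toml setting
-resolveHostAndPort({
-  flags: { apiPort: 8913 },
-  env: { REDWOOD_API_PORT: '8920' },
-  toml: { port: 8930 },
-})
-```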
-
-If you'd rather not have `createServer` parsing `process.argv`, you can disable it via `parseArgv`:
-
-```ts title="api/src/server.ts"
-await createServer({
- parseArgv: false,
-})
-```
-
-And if you'd rather it do none of this, just change `start` to `listen` and specify the host and port inline:
-
-```ts title="api/src/server.ts"
-await server.listen({
- host: '0.0.0.0',
- port: 8913,
-})
-```
-
-If you don't specify a host, `createServer` uses `NODE_ENV` to pick one: `'0.0.0.0'` in production and `'::'` otherwise.
-The Dockerfile sets `NODE_ENV` to production so that things work out of the box.
diff --git a/docs/versioned_docs/version-7.0/graphql.md b/docs/versioned_docs/version-7.0/graphql.md
deleted file mode 100644
index 35d5c3fb2382..000000000000
--- a/docs/versioned_docs/version-7.0/graphql.md
+++ /dev/null
@@ -1,2513 +0,0 @@
----
-description: GraphQL is a fundamental part of Redwood
----
-
-# GraphQL
-
-GraphQL is a fundamental part of Redwood. Having said that, you can get going without knowing anything about it, and can actually get quite far without ever having to read [the docs](https://graphql.org/learn/). But to master Redwood, you'll need to have more than just a vague notion of what GraphQL is. You'll have to really grok it.
-
-
-## GraphQL 101
-
-GraphQL is a query language that enhances the exchange of data between clients (in Redwood's case, a React app) and servers (a Redwood API).
-
-Unlike a REST API, a GraphQL Client performs operations that allow gathering a rich dataset in a single request.
-There are three types of GraphQL operations, but here we'll only focus on two: Queries (to read data) and Mutations (to create, update, or delete data).
-
-The following GraphQL query:
-
-```graphql
-query GetProject {
- project(name: "GraphQL") {
- id
- title
- description
- owner {
- id
- username
- }
- tags {
- id
- name
- }
- }
-}
-```
-
-returns the following JSON response:
-
-```json
-{
- "data": {
- "project": {
- "id": 1,
- "title": "My Project",
- "description": "Lorem ipsum...",
- "owner": {
- "id": 11,
-      "username": "Redwood"
- },
- "tags": [
- { "id": 22, "name": "graphql" }
- ]
- }
-  }
-}
-```
-
-Notice that the response's structure mirrors the query's. In this way, GraphQL makes fetching data descriptive and predictable.
-
-Again, unlike a REST API, a GraphQL API is built on a schema that specifies exactly which queries and mutations can be performed.
-For the `GetProject` query above, here's the schema backing it:
-
-```graphql
-type Project {
- id: ID!
- title: String
- description: String
- owner: User!
- tags: [Tag]
-}
-
-# ... User and Tag type definitions
-
-type Query {
- project(name: String!): Project
-}
-```
-
-:::info
-
-More information on GraphQL types can be found in the [official GraphQL documentation](https://graphql.org/learn/schema/).
-
-:::
-
-Finally, the GraphQL schema is associated with a resolvers map that helps resolve each requested field. For example, here's what the resolver for the owner field on the Project type may look like:
-
-```ts
-export const Project = {
- owner: (args, { root, context, info }) => {
- return db.project.findUnique({ where: { id: root.id } }).user()
- },
- // ...
-}
-```
-
-:::info
-
-You can read more about resolvers in the dedicated [Understanding Default Resolvers](#understanding-default-resolvers) section below.
-
-:::
-
-To summarize, when a GraphQL query reaches a GraphQL API, here's what happens:
-
-```
-+--------------------+ +--------------------+
-| | 1.send operation | |
-| | | GraphQL Server |
-| GraphQL Client +----------------->| | |
-| | | | 2.resolve |
-| | | | data |
-+--------------------+ | v |
- ^ | +----------------+ |
- | | | | |
- | | | Resolvers | |
- | | | | |
- | | +--------+-------+ |
- | 3. respond JSON with data | | |
- +-----------------------------+ <--------+ |
- | |
- +--------------------+
-```
-
-In contrast to most GraphQL implementations, Redwood provides a "deconstructed" way of creating a GraphQL API:
-
-- You define your SDLs (schema) in `*.sdl.js` files, which define what queries and mutations are available, and what fields can be returned
-- For each query or mutation, you write a service function with the same name. This is the resolver
-- Redwood then takes all your SDLs and Services (resolvers), combines them into a GraphQL server, and exposes it as an endpoint
-
-## RedwoodJS and GraphQL
-
-Besides taking care of the annoying stuff for you (namely, mapping your resolvers, which gets annoying fast if you do it yourself!), there aren't many gotchas with GraphQL in Redwood.
-The only Redwood-specific thing you should really be aware of is [resolver args](#redwoods-resolver-args).
-
-Since there are two parts to GraphQL in Redwood, the client and the server, we've divided this doc up that way.
-
-On the `web` side, Redwood uses [Apollo Client](https://www.apollographql.com/docs/react/) by default, though you can swap it out for something else if you want.
-
-
-The `api` side offers a GraphQL server built on [GraphQL Yoga](https://www.graphql-yoga.com) and the [Envelop plugin system](https://www.envelop.dev/docs) from [The Guild](https://the-guild.dev).
-
-Redwood's api side is "serverless first", meaning it's architected as functions which can be deployed on either serverless or traditional infrastructure, and Redwood's GraphQL endpoint is effectively "just another function" (with a whole lot more going on under the hood, but that part is handled for you, out of the box).
-One of the tenets of the Redwood philosophy is "Redwood believes that, as much as possible, you should be able to operate in a serverless mindset and deploy to a generic computational grid."
-
-### GraphQL Yoga and the Generic Computation Grid
-
-To be able to deploy to a “generic computation grid” means that, as a developer, you should be able to deploy using the provider or technology of your choosing. You should be able to deploy to Netlify, Vercel, Fly, Render, AWS Serverless, or elsewhere with ease and no vendor or platform lock in. You should be in control of the framework, what the response looks like, and how your clients consume it.
-
-The same should be true of your GraphQL Server. [GraphQL Yoga](https://www.graphql-yoga.com) from [The Guild](https://the-guild.dev) makes that possible.
-
-> The fully-featured GraphQL Server with focus on easy setup, performance and great developer experience.
-
-RedwoodJS leverages Yoga's Envelop plugins to implement custom internal plugins to help with [authentication](#authentication), [logging](#logging), [directive handling](#directives), and more.
-
-### Security Best Practices
-
-
-RedwoodJS implements GraphQL Armor from [Escape Technologies](https://escape.tech) to make your endpoint more secure by default, enforcing common GraphQL [security best practices](#security).
-
-GraphQL Armor, developed by Escape in partnership with The Guild, is a middleware for JS servers that adds a security layer to the RedwoodJS GraphQL endpoint.
-
-### Trusted Documents
-
-In addition, RedwoodJS can be set up to enforce [persisted operations](https://the-guild.dev/graphql/yoga-server/docs/features/persisted-operations) -- alternatively called [Trusted Documents](https://benjie.dev/graphql/trusted-documents).
-
-See [Configure Trusted Documents](graphql/trusted-documents#configure-trusted-documents) for more information and usage instructions.
-
-
-### Conclusion
-
-All this gets us closer to Redwood's goal of being able to deploy to a "generic computation grid". And that’s exciting!
-
-## Client-side
-
-### RedwoodApolloProvider
-
-By default, Redwood Apps come ready-to-query with the `RedwoodApolloProvider`. As you can tell from the name, this Provider wraps [ApolloProvider](https://www.apollographql.com/docs/react/api/react/hooks/#the-apolloprovider-component). Omitting a few things, this is what you'll normally see in Redwood Apps:
-
-```jsx title="web/src/App.js"
-import { RedwoodApolloProvider } from '@redwoodjs/web/apollo'
-
-// ...
-
-const App = () => (
-  <RedwoodApolloProvider>
-    <Routes />
-  </RedwoodApolloProvider>
-)
-
-// ...
-```
-
-You can use Apollo's `useQuery` and `useMutation` hooks by importing them from `@redwoodjs/web`, though if you're using `useQuery`, we recommend that you use a [Cell](cells.md):
-
-```jsx title="web/src/components/MutateButton.js"
-import { useMutation } from '@redwoodjs/web'
-
-const MUTATION = gql`
- # your mutation...
-`
-
-const MutateButton = () => {
- const [mutate] = useMutation(MUTATION)
-
- return (
-    <button onClick={() => mutate()}>Mutate</button>
- )
-}
-```
-
-Note that you're free to use any of Apollo's other hooks; you'll just have to import them from `@apollo/client` instead. In particular, these two hooks might come in handy:
-
-| Hook | Description |
-| :------------------------------------------------------------------------------------------- | :------------------------------------------------------------------- |
-| [useLazyQuery](https://www.apollographql.com/docs/react/api/react/hooks/#uselazyquery) | Execute queries in response to events other than component rendering |
-| [useApolloClient](https://www.apollographql.com/docs/react/api/react/hooks/#useapolloclient) | Access your instance of `ApolloClient` |
-
-### Customizing the Apollo Client and Cache
-
-By default, `RedwoodApolloProvider` configures an `ApolloClient` instance with 1) a default instance of `InMemoryCache` to cache responses from the GraphQL API and 2) an `authMiddleware` to sign API requests for use with [Redwood's built-in auth](authentication.md). Beyond the `cache` and `link` params, which are used to set up that functionality, you can specify additional params to be passed to `ApolloClient` using the `graphQLClientConfig` prop. The full list of available configuration options for the client are [documented here on Apollo's site](https://www.apollographql.com/docs/react/api/core/ApolloClient/#options).
-
-Depending on your use case, you may want to configure `InMemoryCache`. For example, you may need to specify a type policy to change the key by which a model is cached or to enable pagination on a query. [This article from Apollo](https://www.apollographql.com/docs/react/caching/cache-configuration/) explains in further detail why and how you might want to do this.
-
-To configure the cache when it's created, use the `cacheConfig` property on `graphQLClientConfig`. Any value you pass is passed directly to `InMemoryCache` when it's created.
-
-For example, if you have a query named `search` that supports [Apollo's offset pagination](https://www.apollographql.com/docs/react/pagination/core-api/), you could enable it by specifying:
-
-```jsx
-<RedwoodApolloProvider
-  graphQLClientConfig={{
-    cacheConfig: {
-      typePolicies: {
-        Query: {
-          fields: {
-            search: {
-              // Treat all results for this field as one list,
-              // regardless of offset/limit arguments
-              keyArgs: false,
-              // Concatenate incoming pages onto the existing list
-              merge(existing = [], incoming) {
-                return [...existing, ...incoming]
-              },
-            },
-          },
-        },
-      },
-    },
-  }}
-/>
-```
-
-
-
-### Swapping out the RedwoodApolloProvider
-
-As long as you're willing to do a bit of configuring yourself, you can swap out `RedwoodApolloProvider` with your GraphQL Client of choice. You'll just have to get to know the makeup of the [RedwoodApolloProvider](https://github.com/redwoodjs/redwood/blob/main/packages/web/src/apollo/index.tsx#L71-L84); it's actually composed of a few more Providers and hooks:
-
-- `FetchConfigProvider`
-- `useFetchConfig`
-- `GraphQLHooksProvider`
-
-For an example of configuring your own GraphQL Client, see the [redwoodjs-react-query-provider](https://www.npmjs.com/package/redwoodjs-react-query-provider). If you were thinking about using [react-query](https://react-query.tanstack.com/), you can also just go ahead and install it!
-
-Note that if you don't import `RedwoodApolloProvider`, it won't be included in your bundle, dropping your bundle size quite a lot!
-
-## Server-side
-
-### Understanding Default Resolvers
-
-According to the spec, for every field in your sdl, there has to be a resolver in your Services. But you'll usually see fewer resolvers in your Services than you technically should. And that's because if you don't define a resolver, GraphQL Yoga server will.
-
-The key question the Yoga server asks is: "Does the parent argument (in Redwood apps, the `parent` argument is named `root`—see [Redwood's Resolver Args](#redwoods-resolver-args)) have a property with this resolver's exact name?" Most of the time, especially with Prisma Client's ergonomic returns, the answer is yes.
-
-Let's walk through an example. Say our sdl looks like this:
-
-```jsx title="api/src/graphql/user.sdl.js"
-export const schema = gql`
- type User {
- id: Int!
- email: String!
- name: String
- }
-
- type Query {
- users: [User!]!
- }
-`
-```
-
-So we have a User model in our `schema.prisma` that looks like this:
-
-```jsx
-model User {
- id Int @id @default(autoincrement())
- email String @unique
- name String?
-}
-```
-
-If you create your Services for this model using Redwood's generator (`yarn rw g service user`), your Services will look like this:
-
-```jsx title="api/src/services/user/user.js"
-import { db } from 'src/lib/db'
-
-export const users = () => {
- return db.user.findMany()
-}
-```
-
-Which begs the question: where are the resolvers for the User fields—`id`, `email`, and `name`?
-All we have is the resolver for the Query field, `users`.
-
-As we just mentioned, GraphQL Yoga defines them for you. And since the `root` argument for `id`, `email`, and `name` has a property with each resolver's exact name (i.e. `root.id`, `root.email`, `root.name`), it'll return the property's value (instead of returning `undefined`, which is what Yoga would do if that weren't the case).
-
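-Conceptually, the default resolver Yoga falls back to just reads the field's name off the parent object. Here's a sketch using Redwood's argument order (a simplified illustration, not the actual implementation):
-
-```js
-// Sketch: a default resolver returns the property on `root`
-// whose name matches the field being resolved
-const defaultResolver = (fieldName) => (_args, { root }) => root?.[fieldName]
-
-const id = defaultResolver('id')
-id({}, { root: { id: 42, email: 'user@example.com' } }) // returns 42
-```
-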
-But, if you wanted to be explicit about it, this is what it would look like:
-
-```jsx title="api/src/services/user/user.js"
-import { db } from 'src/lib/db'
-
-export const users = () => {
- return db.user.findMany()
-}
-
-export const User = {
- id: (_args, { root }) => root.id,
- email: (_args, { root }) => root.email,
- name: (_args, { root }) => root.name,
-}
-```
-
-Stated formally: to create a resolver for a field on a type, export an object from the Service with the same name as the type, with a property with the same name as the field.
-
-Sometimes you'll want to do this explicitly, since it lets you do things like add completely custom fields:
-
-```jsx {5}
-export const User = {
- id: (_args, { root }) => root.id,
- email: (_args, { root }) => root.email,
- name: (_args, { root }) => root.name,
- age: (_args, { root }) => new Date().getFullYear() - root.birthDate.getFullYear()
-}
-```
-
-
-
-### Redwood's Resolver Args
-
-[According to the spec](https://graphql.org/learn/execution/#root-fields-resolvers), resolvers take four arguments: `args`, `obj`, `context`, and `info`. In Redwood, resolvers do take these four arguments, but what they're named and how they're passed to resolvers is slightly different:
-
-- `args` is passed as the first argument
-- `obj` is named `root` (all the rest keep their names)
-- `root`, `context`, and `info` are wrapped into an object, `gqlArgs`; this object is passed as the second argument
-
-Here's an example to make things clear:
-
-```js
-export const Post = {
- user: (args, gqlArgs) => db.post.findUnique({ where: { id: gqlArgs?.root.id } }).user(),
-}
-```
-
-Of the four, you'll see `args` and `root` being used a lot.
-
-| Argument | Description |
-| :-------- | :------------------------------------------------------------------------------------------- |
-| `args` | The arguments provided to the field in the GraphQL query |
-| `root` | The previous return in the resolver chain |
-| `context` | Holds important contextual information, like the currently logged in user |
-| `info` | Holds field-specific information relevant to the current query as well as the schema details |
-
-> **There's so many terms!**
->
-> Half the battle here is really just coming to terms. To keep your head from spinning, keep in mind that everybody tends to rename `obj` to something else: Redwood calls it `root`, GraphQL Yoga calls it `parent`. `obj` isn't exactly the most descriptive name in the world.
-
-### Context
-
-In Redwood, the `context` object that's passed to resolvers is actually available to all your Services, whether or not they're serving as resolvers. Just import it from `@redwoodjs/graphql-server`:
-
-```jsx
-import { context } from '@redwoodjs/graphql-server'
-```
-
-#### How to Modify the Context
-
-Because the context is read-only in your services, if you need to modify it, then you need to do so in the `createGraphQLHandler`.
-
-To populate or enrich the context on a per-request basis with additional attributes, set the `context` attribute of `createGraphQLHandler` to a custom ContextFunction that modifies the context.
-
-For example, if we want to populate a new, custom `ipAddress` attribute on the context with the information from the request's event, declare the `setIpAddress` ContextFunction as seen here:
-
-```jsx title="api/src/functions/graphql.js"
-// ...
-
-const ipAddress = ({ event }) => {
- return event?.headers?.['client-ip'] || event?.requestContext?.identity?.sourceIp || 'localhost'
-}
-
-const setIpAddress = async ({ event, context }) => {
- context.ipAddress = ipAddress({ event })
-}
-
-export const handler = createGraphQLHandler({
- getCurrentUser,
- loggerConfig: {
- logger,
- options: { operationName: true, tracing: true },
- },
- schema: makeMergedSchema({
- schemas,
- services,
- }),
- context: setIpAddress,
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-> **Note:** If you use the preview GraphQL Yoga/Envelop `graphql-server` package and a custom ContextFunction to modify the context in the createGraphQL handler, the function is provided **_only the context_** and **_not the event_**. However, the `event` information is available as an attribute of the context as `context.event`. Therefore, in the above example, one would fetch the ip address from the event this way: `ipAddress({ event: context.event })`.
-
-### The Root Schema
-
-Did you know that you can query `redwood`? Try it in the GraphQL Playground (you can find the GraphQL Playground at http://localhost:8911/graphql when your dev server is running—`yarn rw dev api`):
-
-```graphql
-query {
- redwood {
- version
- currentUser
- }
-}
-```
-
-How is this possible? Via Redwood's [root schema](https://github.com/redwoodjs/redwood/blob/main/packages/graphql-server/src/rootSchema.ts). The root schema is where things like currentUser are defined:
-
-```graphql
- scalar BigInt
- scalar Date
- scalar Time
- scalar DateTime
- scalar JSON
- scalar JSONObject
-
- type Redwood {
- version: String
- currentUser: JSON
- prismaVersion: String
- }
-
- type Query {
- redwood: Redwood
- }
-```
-
-Now that you've seen the sdl, be sure to check out [the resolvers](https://github.com/redwoodjs/redwood/blob/main/packages/graphql-server/src/rootSchema.ts):
-
-```ts
-export const resolvers: Resolvers = {
- BigInt: BigIntResolver,
- Date: DateResolver,
- Time: TimeResolver,
- DateTime: DateTimeResolver,
- JSON: JSONResolver,
- JSONObject: JSONObjectResolver,
- Query: {
- redwood: () => ({
- version: redwoodVersion,
- prismaVersion: prismaVersion,
- currentUser: (_args: any, context: GlobalContext) => {
- return context?.currentUser
- },
- }),
- },
-}
-```
-
-
-
-## CORS Configuration
-
-CORS stands for [Cross Origin Resource Sharing](https://en.wikipedia.org/wiki/Cross-origin_resource_sharing); in a nutshell, by default, browsers aren't allowed to access resources outside their own domain.
-
-Let's say you're hosting each of your Redwood app's sides on different domains: the web side on `www.example.com` and the api side (and thus, the GraphQL Server) on `api.example.com`.
-When the browser tries to fetch data from the `/graphql` function, you'll see an error that says the request was blocked due to CORS. Wording may vary, but it'll be similar to:
-
-> ⛔️ Access to fetch ... has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.
-
-To fix this, you need to "configure CORS" by adding:
-
-```
-'Access-Control-Allow-Origin': 'https://example.com'
-'Access-Control-Allow-Credentials': true
-```
-
-to the GraphQL response headers, which you can do by setting the `cors` option in `api/src/functions/graphql.{js,ts}`:
-
-```tsx
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: {} },
- directives,
- sdls,
- services,
- cors: {
- // 👈 setup your CORS configuration options
- origin: '*',
- credentials: true,
- },
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-For more in-depth discussion and configuration of CORS when it comes to using a cookie-based auth system (like [dbAuth](authentication.md#self-hosted-auth-installation-and-setup)), see the [CORS documentation](cors.md).
-
-## Health Checks
-
-You can use health checks to determine if a server is available and ready to start serving traffic.
-For example, services like [Pingdom](https://www.pingdom.com) use health checks to determine server uptime and will notify you if it becomes unavailable.
-
-Redwood's GraphQL server provides a health check endpoint at `/graphql/health` as part of its GraphQL handler.
-If the server is healthy and can accept requests, the response will contain the following headers:
-
-```
-content-type: application/json
-server: GraphQL Yoga
-x-yoga-id: yoga
-```
-
-and will return a `HTTP/1.1 200 OK` status with the body:
-
-```json
-{
- "message": "alive"
-}
-```
-
-Note the `x-yoga-id` header. The header's value defaults to `yoga` when `healthCheckId` isn't set in `createGraphQLHandler`. But you can customize it when configuring your GraphQL handler:
-
-```ts title="api/src/functions/graphql.ts"
-// ...
-
-export const handler = createGraphQLHandler({
- // This will be the value of the `x-yoga-id` header
- // highlight-next-line
- healthCheckId: 'my-redwood-graphql-server',
- getCurrentUser,
- loggerConfig: { logger, options: {} },
- directives,
- sdls,
- services,
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-If the health check fails, then the GraphQL server is unavailable and you should investigate what could be causing the downtime.
-
-#### Perform a Health Check
-
-To perform a health check, make an HTTP GET request to the `/graphql/health` endpoint.
-
-For local development, you can make the request through the proxy using `curl` from the command line:
-
-```bash
-curl "http://localhost:8910/.redwood/functions/graphql/health" -i
-```
-
-or by directly invoking the graphql function:
-
-```bash
-curl "http://localhost:8911/graphql/health" -i
-```
-
-you should get the response:
-
-```json
-{
- "message": "alive"
-}
-```
-
-For production, make a request wherever your `/graphql` function exists.
-
-> These examples use `curl` but you can perform a health check via any HTTP GET request.
-
-#### Perform a Readiness Check
-
-A readiness check confirms that your GraphQL server can accept requests and serve **your server's** traffic.
-
-It forwards a request to the health check with a header that must match your `healthCheckId` in order to succeed.
-If the `healthCheckId` doesn't match or the request fails, then your GraphQL server isn't "ready".
-
-To perform a readiness check, make an HTTP GET request to the `/graphql/readiness` endpoint with the appropriate `healthCheckId` header.
-For local development, you can make a request to the proxy:
-
-```bash
-curl "http://localhost:8910/.redwood/functions/graphql/readiness" \
- -H 'x-yoga-id: yoga' \
- -i
-```
-
-or directly invoke the graphql function:
-
-```bash
-curl "http://localhost:8911/graphql/readiness" \
- -H 'x-yoga-id: yoga' \
- -i
-```
-
-Either way, you should get a `200 OK` HTTP status if ready, or a `503 Service Unavailable` if not.
-
-For production, make a request wherever your `/graphql` function exists.
-
-> These examples use `curl` but you can perform a readiness check via any HTTP GET request with the proper headers.
-
-## Verifying GraphQL Schema
-
-In order to keep your GraphQL endpoint and services secure, you must specify one of `@requireAuth`, `@skipAuth` or a custom directive on **every** query and mutation defined in your SDL.
-
-Redwood verifies that your schema complies with this rule when:
-
-- building (or building just the api side)
-- launching the dev server
-
-If any queries or mutations fail this check, you will see:
-
-- each offending query or mutation listed in the command's error log
-- a fatal `⚠️ GraphQL server crashed` error if launching the dev server
-
-### Build-time Verification
-
-When building via the `yarn rw build` command and the SDL fails verification, you will see output that lists each query or mutation missing the directive:
-
-```bash
- ✔ Generating Prisma Client...
- ✖ Verifying graphql schema...
- → - deletePost Mutation
- Building API...
- Cleaning Web...
- Building Web...
- Prerendering Web...
-
-You must specify one of @requireAuth, @skipAuth or a custom directive for
-- contacts Query
-- posts Query
-- post Query
-- createContact Mutation
-- createPost Mutation
-- updatePost Mutation
-- deletePost Mutation
-```
-
-### Dev Server Verification
-
-When launching the dev server via the `yarn rw dev` command, you will see output that lists each query or mutation missing the directive:
-
-```bash
-
-gen | Generating TypeScript definitions and GraphQL schemas...
-gen | 37 files generated
-api | Building... Took 444 ms
-api | Starting API Server... Took 2 ms
-api | Listening on http://localhost:8911/
-api | Importing Server Functions...
-web | ...
-api | FATAL [2021-09-24 18:41:49.700 +0000]:
-api | ⚠️ GraphQL server crashed
-api |
-api | Error: You must specify one of @requireAuth, @skipAuth or a custom directive for
-api | - contacts Query
-api | - posts Query
-api | - post Query
-api | - createContact Mutation
-api | - createPost Mutation
-api | - updatePost Mutation
-api | - deletePost Mutation
-```
-
-To fix these errors, simply add `@requireAuth` to enforce authentication or `@skipAuth` to keep the operation public on each, as appropriate for your app's permission needs.
-
-## Custom Scalars
-
-GraphQL scalar types give data meaning and validate that their values make sense. Out of the box, GraphQL comes with `Int`, `Float`, `String`, `Boolean` and `ID`. While those can cover a wide variety of use cases, you may need more specific scalar types to better describe and validate your application's data.
-
-For example, say a `Person` type in your schema has an `ageInYears` field. If it really represents a person's age, it should only ever be a positive integer—never a negative one.
-Something like the [`PositiveInt` scalar](https://www.graphql-scalars.dev/docs/scalars/positive-int) provides that meaning and validation.
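-
-Conceptually, a scalar like `PositiveInt` boils down to a validation function that runs when values are parsed or serialized. Here's a simplified sketch of the kind of check it performs (the `parsePositiveInt` helper is illustrative, not the actual `graphql-scalars` implementation):
-
-```js
-// Sketch: reject anything that isn't a positive integer
-const parsePositiveInt = (value) => {
-  if (!Number.isInteger(value) || value <= 0) {
-    throw new TypeError(`Value is not a positive integer: ${value}`)
-  }
-  return value
-}
-```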
-
-### Scalars vs Service vs Directives
-
-How are custom scalars different from Service Validations or Validator Directives?
-
-[Service validations](services.md#service-validations) run when resolving the service. Because they run at the start of your Service function and throw if conditions aren't met, they're great for validating whenever you use a Service—anywhere, anytime.
-For example, they'll validate via GraphQL, Serverless Functions, webhooks, etc. Custom scalars, however, only validate via GraphQL and not anywhere else.
-
-Service validations also perform more fine-grained checks than scalars, which are more geared toward validating that data is of a specific **type**.
-
-[Validator Directives](#directives) control user **access** to data and also whether or not a user is authorized to perform certain queries and/or mutations.
-
-### How To Add a Custom Scalar
-
-Let's say that you have a `Product` type that has three fields: a name, a description, and the type of currency.
-The built-in `String` scalar should suffice for the first two, but for the third, you'd be better off with a more-specific `String` scalar that only accepts [ISO 4217](https://en.wikipedia.org/wiki/ISO_4217) currency codes, like `USD`, `EUR`, `CAD`, etc.
-Luckily there's already a [`Currency` scalar type](https://github.com/Urigo/graphql-scalars/blob/master/src/scalars/Currency.ts) that does exactly that!
-All you have to do is add it to your GraphQL schema.
-
-To add a custom scalar to your GraphQL schema:
-
-1. Add the scalar definition to one of your sdl files, such as `api/src/graphql/scalars.sdl.ts`
-
-> Note that you may have to create this file. Moreover, it's just a convention—custom scalar type definitions can be in any of your sdl files.
-
-```jsx title="api/src/graphql/scalars.sdl.ts"
-export const schema = gql`
- scalar Currency
-`
-```
-
-
-
-2. Import the scalar's definition and resolver and pass them to your GraphQLHandler via the `schemaOptions` property:
-
-```tsx {10-13} title="api/src/functions/graphql.ts"
-import { CurrencyDefinition, CurrencyResolver } from 'graphql-scalars'
-
-// ...
-
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: {} },
- directives,
- sdls,
- services,
- schemaOptions: {
- typeDefs: [CurrencyDefinition],
- resolvers: { Currency: CurrencyResolver },
- },
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-
-
-3. Use the scalar in your types
-
-```tsx {6,18,24}
-export const schema = gql`
- type Product {
- id: Int!
- name: String!
- description: String!
- currency_iso_4217: Currency! # validate on query
- createdAt: DateTime!
- }
-
- type Query {
- products: [Product!]! @requireAuth
- product(id: Int!): Product @requireAuth
- }
-
- input CreateProductInput {
- name: String!
- description: String!
- currency_iso_4217: Currency! # validate on mutation
- }
-
- input UpdateProductInput {
- name: String
- description: String
- currency_iso_4217: Currency # validate on mutation
- }
-
- type Mutation {
- createProduct(input: CreateProductInput!): Product! @requireAuth
- updateProduct(id: Int!, input: UpdateProductInput!): Product! @requireAuth
- deleteProduct(id: Int!): Product! @requireAuth
- }
-`
-```
-
-## Directives
-
-Directives supercharge your GraphQL services. They add configuration to fields, types or operations that act like "middleware" that lets you run reusable code during GraphQL execution to perform tasks like [authentication](#authentication), formatting, and more.
-
-You'll recognize a directive by the `@` character that precedes it, e.g. `@myDirective`, and by its being declared alongside a field:
-
-```tsx
-type Bar {
- name: String! @myDirective
-}
-```
-
-or a Query or Mutation:
-
-```tsx
-type Query {
- bars: [Bar!]! @myDirective
-}
-
-type Mutation {
- createBar(input: CreateBarInput!): Bar! @myDirective
-}
-```
-
-See the [Directives](directives) section for complete information on RedwoodJS Directives.
-
-## Fragments
-
-See [fragments](graphql/fragments.md)
-
-## Unions
-
-Unions are abstract GraphQL types that enable a schema field to return one of multiple object types.
-
-`union FavoriteTree = Redwood | Ginkgo | Oak`
-
-A field can have a union as its return type.
-
-```tsx
-type Query {
- searchTrees: [FavoriteTree] # This list can include Redwood, Ginkgo, or Oak objects
-}
-```
-
-All of a union's included types must be object types and do not need to share any fields.
-
-To query a union, you can take advantage of [inline fragments](https://graphql.org/learn/queries/#inline-fragments) to include subfields of multiple possible types.
-
-```tsx
-query GetFavoriteTrees {
- searchTrees {
- __typename # helpful when querying a field that returns one of multiple types
- ... on Redwood {
- name
- height
- }
- ... on Ginkgo {
- name
- medicalUse
- }
- ... on Oak {
- name
- acornType
- }
- }
-}
-```
-
-Redwood will automatically detect your union types in your `sdl` files and resolve *which* of your union's types is being returned. If the returned object does not match any of the valid types, the associated operation will produce a GraphQL error.
-
-:::note
-
-In order to use Union types web-side with your Apollo GraphQL client, you will need to [generate possible types from fragments and union types](#generate-possible-types).
-
-:::
-
-### useCache
-
-Apollo Client stores the results of your GraphQL queries in a local, normalized, in-memory cache. This enables the client to respond almost immediately to queries for already-cached data, without even sending a network request.
-
-`useCache` is a custom hook that returns the cache object and some useful methods to interact with the cache:
-
-* [evict](#evict)
-* [extract](#extract)
-* [identify](#identify)
-* [modify](#modify)
-* [resetStore](#resetStore)
-* [clearStore](#clearStore)
-
-```ts
-import { useCache } from '@redwoodjs/web/apollo'
-```
-
-#### cache
-
-Returns the normalized, in-memory cache.
-
-```ts
-import { useCache } from '@redwoodjs/web/apollo'
-
-const { cache } = useCache()
-```
-
-#### evict
-
-Either removes a normalized object from the cache or removes a specific field from a normalized object in the cache.
-
-```ts
-import { useCache, useRegisteredFragment } from '@redwoodjs/web/apollo'
-
-const Fruit = ({ id }: { id: FragmentIdentifier }) => {
- const { evict } = useCache()
- const { data: fruit, complete } = useRegisteredFragment(id)
-
- evict(fruit)
-}
-```
-
-#### extract
-
-Returns a serialized representation of the cache's current contents
-
-```ts
-import { useCache } from '@redwoodjs/web/apollo'
-
-const Fruit = ({ id }: { id: FragmentIdentifier }) => {
- const { extract } = useCache()
-
- // Logs the cache's current contents
- console.log(extract())
-}
-```
-
-#### identify
-
-Returns the unique cache identifier for a given cached object.
-
-```ts
-import { useCache, useRegisteredFragment } from '@redwoodjs/web/apollo'
-
-const Fruit = ({ id }: { id: FragmentIdentifier }) => {
- const { identify } = useCache()
- const { data: fruit, complete } = useRegisteredFragment(id)
-
- // Returns "Fruit:ownpc6co8a1w5bhfmavecko9"
- console.log(identify(fruit))
-}
-```
-
-#### modify
-
-Modifies one or more field values of a cached object. Must provide a modifier function for each field to modify. A modifier function takes a cached field's current value and returns the value that should replace it.
-
-Returns true if the cache was modified successfully and false otherwise.
-
-```ts
-import { useCache, useRegisteredFragment } from '@redwoodjs/web/apollo'
-
-const Fruit = ({ id }: { id: FragmentIdentifier }) => {
- const { modify } = useCache()
- const { data: fruit, complete } = useRegisteredFragment(id)
-
- // Modify the name of a given fruit entity to be uppercase
- modify(fruit, {
-   name(cachedName: string) {
-     return cachedName.toUpperCase()
-   },
- })
-
- // ...
-}
-```
-
-#### clearStore
-
-To reset the cache without refetching active queries, use the `clearStore` method:
-
-```ts
-import { useCache } from '@redwoodjs/web/apollo'
-
-const Fruit = ({ id }: { id: FragmentIdentifier }) => {
- const { clearStore } = useCache()
-
- clearStore()
-}
-```
-
-#### resetStore
-
-Reset the cache entirely, such as when a user logs out.
-
-```ts
-import { useCache } from '@redwoodjs/web/apollo'
-
-const Fruit = ({ id }: { id: FragmentIdentifier }) => {
- const { resetStore } = useCache()
-
- resetStore()
-}
-```
-
-## GraphQL Handler Setup
-
-Redwood's `GraphQLHandlerOptions` allows you to configure your GraphQL handler schema, context, authentication, security and more.
-
-```ts
-export interface GraphQLHandlerOptions {
- /**
- * @description The identifier used in the GraphQL health check response.
- * It verifies readiness when sent as a header in the readiness check request.
- *
- * By default, the identifier is `yoga` as seen in the HTTP response header `x-yoga-id: yoga`
- */
- healthCheckId?: string
-
- /**
- * @description Customize GraphQL Logger
- *
- * Collect resolver timings, and exposes trace data for
- * an individual request under extensions as part of the GraphQL response.
- */
- loggerConfig: LoggerConfig
-
- /**
- * @description Modify the resolver and global context.
- */
- context?: Context | ContextFunction
-
- /**
- * @description An async function that maps the auth token retrieved from the
- * request headers to an object.
- * It is executed when the `auth-provider` header contains one of the
- * supported providers.
- getCurrentUser?: GetCurrentUser
-
- /**
- * @description A callback when an unhandled exception occurs. Use this to disconnect your prisma instance.
- */
- onException?: () => void
-
- /**
- * @description Services passed from the glob import:
- * import services from 'src/services\/**\/*.{js,ts}'
- */
- services: ServicesGlobImports
-
- /**
- * @description SDLs (schema definitions) passed from the glob import:
- * import sdls from 'src/graphql\/**\/*.{js,ts}'
- */
- sdls: SdlGlobImports
-
- /**
- * @description Directives passed from the glob import:
- * import directives from 'src/directives/**\/*.{js,ts}'
- */
- directives?: DirectiveGlobImports
-
- /**
- * @description A list of options passed to [makeExecutableSchema]
- * (https://www.graphql-tools.com/docs/generate-schema/#makeexecutableschemaoptions).
- */
- schemaOptions?: Partial<IExecutableSchemaDefinition>
-
- /**
- * @description CORS configuration
- */
- cors?: CorsConfig
-
- /**
- * @description Customize GraphQL Armor plugin configuration
- *
- * @see https://escape-technologies.github.io/graphql-armor/docs/configuration/examples
- */
- armorConfig?: ArmorConfig
-
- /**
- * @description Customize the default error message used to mask errors.
- *
- * By default, the masked error message is "Something went wrong"
- *
- * @see https://github.com/dotansimha/envelop/blob/main/packages/core/docs/use-masked-errors.md
- */
- defaultError?: string
-
- /**
- * @description Only allows the specified operation types (e.g. subscription, query or mutation).
- *
- * By default, only allow query and mutation (i.e., do not allow subscriptions).
- *
- * An array of GraphQL's OperationTypeNode enums:
- * - OperationTypeNode.SUBSCRIPTION
- * - OperationTypeNode.QUERY
- * - OperationTypeNode.MUTATION
- *
- * @see https://github.com/dotansimha/envelop/tree/main/packages/plugins/filter-operation-type
- */
- allowedOperations?: AllowedOperations
-
- /**
- * @description Custom Envelop plugins
- */
- extraPlugins?: Plugin[]
-
- /**
- * @description Auth-provider specific token decoder
- */
- authDecoder?: Decoder
-
- /**
- * @description Customize the GraphiQL Endpoint that appears in the location bar of the GraphQL Playground
- *
- * Defaults to '/graphql' as this value must match the name of the `graphql` function on the api-side.
- */
- graphiQLEndpoint?: string
-
- /**
- * @description Function that returns custom headers (as string) for GraphiQL.
- *
- * Headers must set auth-provider, Authorization and (if using dbAuth) the encrypted cookie.
- */
- generateGraphiQLHeader?: GenerateGraphiQLHeader
-}
-```
-
-### Directive Setup
-
-Redwood makes it easy to code, organize, and map your directives into the GraphQL schema.
-
-You simply add them to the `directives` directory and the `createGraphQLHandler` will do all the work.
-
-```tsx title="api/src/functions/graphql.ts"
-import { createGraphQLHandler } from '@redwoodjs/graphql-server'
-
-import directives from 'src/directives/**/*.{js,ts}' // 👈 directives live here
-import sdls from 'src/graphql/**/*.sdl.{js,ts}'
-import services from 'src/services/**/*.{js,ts}'
-
-import { db } from 'src/lib/db'
-import { logger } from 'src/lib/logger'
-
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: {} },
- directives, // 👈 directives are added to the schema here
- sdls,
- services,
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-> Note: Check out the [in-depth look at Redwood Directives](directives), which explains how to generate directives so you can use them to validate access and transform the response.
-
-
-### Logging Setup
-
-For details on setting up GraphQL Logging, see [Logging](#logging).
-
-### Security Setup
-
-For details on setting up GraphQL Security, see [Security](#security).
-
-## Logging
-
-Logging is essential in production apps to be alerted about critical errors and to be able to respond effectively to support issues. In staging and development environments, logging helps you debug queries, resolvers and cell requests.
-
-We want to make logging simple when using RedwoodJS and therefore have configured the api-side GraphQL handler to log common information about your queries and mutations. Log statements can also be optionally enriched with [operation names](https://graphql.org/learn/queries/#operation-name), user agents, request ids, and performance timings to give you more visibility into your GraphQL api.
-
-By configuring the GraphQL handler to use your api side [RedwoodJS logger](logger), any errors and other log statements about the [GraphQL execution](https://graphql.org/learn/execution/) will be logged to the [destination](logger#destination-aka-where-to-log) you've set up: to standard output, file, or transport stream.
-
-You configure the logger using the `loggerConfig` that accepts a [`logger`](logger) and a set of [GraphQL Logger Options](#graphql-logger-options).
-
-### Configure the GraphQL Logger
-
-A typical GraphQLHandler `graphql.ts` is as follows:
-
-```jsx title="api/src/functions/graphql.ts"
-// ...
-
-import { logger } from 'src/lib/logger'
-
-// ...
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: {} },
- // ...
-})
-```
-
-#### Log Common Information
-
-The `loggerConfig` takes several options that log meaningful information along the GraphQL execution lifecycle.
-
-| Option | Description |
-| :------------ | :---------- |
-| data | Include response data sent to client. |
-| operationName | Include operation name. The operation name is a meaningful and explicit name for your operation. It is only required in multi-operation documents, but its use is encouraged because it is very helpful for debugging and server-side logging. When something goes wrong (you see errors either in your network logs, or in the logs of your GraphQL server) it is easier to identify a query in your codebase by name instead of trying to decipher the contents. Think of this just like a function name in your favorite programming language. See https://graphql.org/learn/queries/#operation-name |
-| requestId | Include the event's requestId, or if none, generate a uuid as an identifier. |
-| query | Include the query. This is the query or mutation (with fields) made in the request. |
-| tracing | Include the tracing and timing information. This will log various performance timings within the GraphQL event lifecycle (parsing, validating, executing, etc). |
-| userAgent | Include the browser (or client's) user agent. This can be helpful to know what type of client made the request to resolve issues when encountering errors or unexpected behavior. |
-
-Therefore, if you wish to log the GraphQL `query` made, the `data` returned, and the `operationName` used, you would configure:
-
-```jsx title="api/src/functions/graphql.ts"
-export const handler = createGraphQLHandler({
- loggerConfig: {
- logger,
- options: { data: true, operationName: true, query: true },
- },
- // ...
-})
-```
-
-#### Exclude Operations
-
-You can exclude GraphQL operations by name with `excludeOperations`.
-This is useful when you want to filter out certain operations from the log output, for example, `IntrospectionQuery` from GraphQL playground:
-
-```jsx {5} title="api/src/functions/graphql.ts"
-export const handler = createGraphQLHandler({
- loggerConfig: {
- logger,
- options: { excludeOperations: ['IntrospectionQuery'] },
- },
- directives,
- sdls,
- services,
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-> **Relevant anatomy of an operation**
->
-> In the example below, `"FilteredQuery"` is the operation's name.
-> That's what you'd pass to `excludeOperations` if you wanted it filtered out.
->
-> ```js
-> export const filteredQuery = `
-> query FilteredQuery {
-> me {
-> id
-> name
-> }
-> }
-> `
-> ```
-
-### Benefits of Logging
-
-Benefits of logging common GraphQL request information include debugging, profiling, and resolving issue reports.
-
-#### Operation Name Identifies Cells
-
-The [operation name](https://graphql.org/learn/queries/#operation-name) is a meaningful and explicit name for your operation. It is only required in multi-operation documents, but its use is encouraged because it is very helpful for debugging and server-side logging.
-
-Because your cell typically has a unique operation name, logging this can help you identify which cell made a request.
-
-```jsx title="api/src/functions/graphql.ts"
-// ...
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: { operationName: true } },
-// ...
-```
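
-For instance, a Redwood cell's `QUERY` gives its operation a name that then shows up in these logs. A hypothetical `BlogPostsCell` might declare a query like:
-
-```graphql
-query BlogPostsQuery {
-  posts {
-    id
-    title
-  }
-}
-```
-
-With `operationName: true` set, each execution of this query is logged as `BlogPostsQuery`, pointing you straight back to the cell that made the request.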
-
-#### RequestId for Support Issue Resolution
-
-Oftentimes, your deployment provider will provide a request identifier to help reconcile and track down problems at the infrastructure level. For example, AWS API Gateway and AWS Lambda (used by Netlify, for example) provide a `requestId` on the `event`.
-
-You can include the request identifier by setting the `requestId` logger option to `true`.
-
-```jsx title="api/src/functions/graphql.ts"
-// ...
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: { requestId: true } },
-// ...
-```
-
-And then, when working to resolve a support issue with your deployment provider, you can supply this request id to help them track down and investigate the problem more easily.
-
-#### No Need to Log within Services
-
-By configuring your GraphQL logger to include `data` and `query` information about each request you can keep your service implementation clean, concise and free of repeated logger statements in every resolver -- and still log the useful debugging information.
-
-```jsx title="api/src/functions/graphql.ts"
-// ...
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: { data: true, operationName: true, query: true } },
-// ...
-
-// api/src/services/posts.js
-//...
-export const post = async ({ id }) => {
- return await db.post.findUnique({
- where: { id },
- })
-}
-//...
-```
-
-The GraphQL handler will then take care of logging your query and data -- as long as your logger is set up to log at the `info` [level](logger#log-level) and above.
-
-> You can also disable these statements in production by logging at the `warn` [level](logger#log-level) or above
-
-This means that you can keep your services free of logger statements, but still see what's happening!
-
-```bash
-api | POST /graphql 200 7.754 ms - 1772
-api | DEBUG [2021-09-29 16:04:09.313 +0000] (graphql-server): GraphQL execution started: BlogPostQuery
-api | operationName: "BlogPostQuery"
-api | query: {
-api | "id": 3
-api | }
-api | DEBUG [2021-09-29 16:04:09.321 +0000] (graphql-server): GraphQL execution completed: BlogPostQuery
-api | data: {
-api | "post": {
-api | "id": 3,
-api | "body": "Meh waistcoat succulents umami asymmetrical, hoodie post-ironic paleo chillwave tote bag. Trust fund kitsch waistcoat vape, cray offal gochujang food truck cloud bread enamel pin forage. Roof party chambray ugh occupy fam stumptown. Dreamcatcher tousled snackwave, typewriter lyft unicorn pabst portland blue bottle locavore squid PBR&B tattooed.",
-api | "createdAt": "2021-09-24T16:51:06.198Z",
-api | "__typename": "Post"
-api | }
-api | }
-api | operationName: "BlogPostQuery"
-api | query: {
-api | "id": 3
-api | }
-api | POST /graphql 200 9.386 ms - 441
-```
-
-#### Send to Third-party Transports
-
-Streaming to third-party log and application monitoring services is vital to production logging in serverless environments. The logger can stream to services like [logFlare](https://logflare.app/), [Datadog](https://www.datadoghq.com/) or [LogDNA](https://www.logdna.com/).
-
-#### Supports Log Redaction
-
-Everyone has heard of reports that Company X logged emails or passwords to files or systems that may not have been secured. While RedwoodJS logging won't necessarily prevent that, it does provide you with the mechanism to ensure that won't happen.
-
-To redact sensitive information, you can supply paths to keys that hold sensitive data using the RedwoodJS logger [redact option](logger#redaction).
-
-Because this logger is used with the GraphQL handler, it will respect any redaction paths you've set up.
-
-For example, if you have chosen to log the `data` returned by each request, you may want to redact sensitive information, like email addresses, from your logs.
-
-Here is an example of an application's `/api/src/lib/logger.ts` configured to redact email addresses. Take note of the path `data.users[*].email`: this says, in the `data` attribute, redact the `email` from every `user`:
-
-```jsx title="/api/src/lib/logger.ts"
-import { createLogger, redactionsList } from '@redwoodjs/api/logger'
-
-export const logger = createLogger({
- options: {
- redact: [...redactionsList, 'email', 'data.users[*].email'],
- },
-})
-```
-
-#### Timing Traces and Metrics
-
-Often you want to measure and report how long your queries take to execute and respond. You may already be measuring these durations at the database level, but you can also measure the time it takes for the GraphQL server to parse, validate, and execute the request.
-
-You may turn on logging these metrics via the `tracing` GraphQL configuration option.
-
-```jsx title="api/src/functions/graphql.ts"
-// ...
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: { tracing: true } },
-// ...
-```
-
-Let's say we wanted to get some benchmark numbers for the "find post by id" resolver:
-
-```jsx
-return await db.post.findUnique({
- where: { id },
-})
-```
-
-We see that this request took about 521 ms (note: `duration` is reported in nanoseconds, so 521131526 ns ≈ 521 ms).
-
-For more details about the information logged and its format, see [Apollo Tracing](https://github.com/apollographql/apollo-tracing).
-
-```bash
-api | INFO [2021-07-09 14:25:52.452 +0000] (graphql-server): GraphQL willSendResponse
-api | tracing: {
-api | "version": 1,
-api | "startTime": "2021-07-09T14:25:51.931Z",
-api | "endTime": "2021-07-09T14:25:52.452Z",
-api | "duration": 521131526,
-api | "execution": {
-api | "resolvers": [
-api | {
-api | "path": [
-api | "post"
-api | ],
-api | "parentType": "Query",
-api | "fieldName": "post",
-api | "returnType": "Post!",
-api | "startOffset": 1787428,
-api | "duration": 519121497
-api | },
-api | {
-api | "path": [
-api | "post",
-api | "id"
-api | ],
-api | "parentType": "Post",
-api | "fieldName": "id",
-api | "returnType": "Int!",
-api | "startOffset": 520982888,
-api | "duration": 25140
-api | },
-... more paths follow ...
-api | ]
-api | }
-api | }
-```
-
-By logging the operation name and extracting the duration for each query, you can easily collect and benchmark query performance.
-
-## Security
-
-Parsing a GraphQL operation document is a very expensive and compute intensive operation that blocks the JavaScript event loop. If an attacker sends a very complex operation document with slight variations over and over again, they can easily degrade the performance of the GraphQL server.
-
-RedwoodJS will by default reject a variety of malicious operation documents; that is, it'll prevent attackers from making malicious queries or mutations.
-
-RedwoodJS is configured out-of-the-box with GraphQL security best practices:
-
-* Schema Directive-based Authentication including RBAC validation
-* Production Deploys disable Introspection and GraphQL Playground automatically
-* Reject Malicious Operation Documents (Max Aliases, Max Cost, Max Depth, Max Directives, Max Tokens)
-* Prevent Information Leaks (Block Field Suggestions, Mask Errors)
-
-And with the Yoga Envelop Plugin ecosystem available to you, there are options for:
-
-* CSRF Protection
-* Rate Limiting
-* and more.
-
-### Authentication
-
-By default, your GraphQL endpoint is open to the world.
-
-That means anyone can request any query and invoke any mutation.
-Whatever types and fields are defined in your SDL are data that anyone can access.
-
-Redwood [encourages being secure by default](directives) by defaulting all queries and mutations to have the `@requireAuth` directive when generating SDL or a service.
-
-When your app builds and your server starts up, Redwood checks that **all** queries and mutations have `@requireAuth`, `@skipAuth` or a custom directive applied.
-
-If not, then your build will fail:
-
-```bash
- ✖ Verifying graphql schema...
- Building API...
- Cleaning Web...
- Building Web...
- Prerendering Web...
-You must specify one of @requireAuth, @skipAuth or a custom directive for
-- contacts Query
-- posts Query
-- post Query
-- updatePost Mutation
-- deletePost Mutation
-```
-
-or your server won't start up and you'll see that "Schema validation failed":
-
-```bash
-gen | Generating TypeScript definitions and GraphQL schemas...
-gen | 47 files generated
-api | Building... Took 593 ms
-api | [GQL Server Error] - Schema validation failed
-api | ----------------------------------------
-api | You must specify one of @requireAuth, @skipAuth or a custom directive for
-api | - posts Query
-api | - createPost Mutation
-api | - updatePost Mutation
-api | - deletePost Mutation
-```
-
-To correct this, add the appropriate directive to your queries and mutations; otherwise, your build will fail and your server won't start up.
-
-#### @requireAuth
-
-To enforce authentication, simply add the `@requireAuth` directive in your GraphQL schema for any query or field you want protected.
-
-It's your responsibility to implement the `requireAuth()` function in your app's `api/src/lib/auth.{js|ts}` to check if the user is properly authenticated and/or has the expected role membership.
-
-The `@requireAuth` directive will call the `requireAuth()` function to determine if the user is authenticated or not.
-
-Here we enforce that a user must be logged in to `create`, `update`, or `delete` a `Post`.
-
-```ts
-type Post {
- id: Int!
- title: String!
- body: String!
- authorId: Int!
- author: User!
- createdAt: DateTime!
-}
-
-input CreatePostInput {
- title: String!
- body: String!
- authorId: Int!
-}
-
-input UpdatePostInput {
- title: String
- body: String
- authorId: Int
-}
-
-type Mutation {
- createPost(input: CreatePostInput!): Post! @requireAuth
- updatePost(id: Int!, input: UpdatePostInput!): Post! @requireAuth
- deletePost(id: Int!): Post! @requireAuth
-}
-```
-
-Here's the stub implementation from a newly created RedwoodJS app:
-
-```ts title="api/src/lib/auth.ts"
-// ...
-
-export const isAuthenticated = (): boolean => {
- return true // 👈 replace with the appropriate check
-}
-
-// ...
-
-export const requireAuth = ({ roles }: { roles: AllowedRoles }) => {
- if (!isAuthenticated()) {
- throw new AuthenticationError("You don't have permission to do that.")
- }
-
- if (!hasRole({ roles })) {
- throw new ForbiddenError("You don't have access to do that.")
- }
-}
-```
-
-> **Note**: The `auth.ts` file here is the stub for a new RedwoodJS app. Once you have set up auth with your provider, this will enforce a proper authentication check.
-
-##### Field-level Auth
-
-You can apply the `@requireAuth` to any field as well (not just queries or mutations):
-
-```ts
-type Post {
- id: Int!
- title: String!
- body: String! @requireAuth
- authorId: Int!
- author: User!
- createdAt: DateTime!
-}
-```
-
-##### Role-based Access Control
-
-The `@requireAuth` directive lets you define roles that are permitted to perform the operation:
-
-```ts
-type Mutation {
- createPost(input: CreatePostInput!): Post! @requireAuth(roles: ["AUTHOR", "EDITOR"])
- updatePost(id: Int!, input: UpdatePostInput!): Post! @requireAuth(roles: ["EDITOR"])
- deletePost(id: Int!): Post! @requireAuth(roles: ["ADMIN"])
-}
-```
-
-#### @skipAuth
-
-If, however, you want your query or mutation to be public, then simply use `@skipAuth`.
-
-In the example below, fetching all posts or a single post is allowed for all users, authenticated or not.
-
-```ts
-type Post {
- id: Int!
- title: String!
- body: String!
- authorId: Int!
- author: User!
- createdAt: DateTime!
-}
-
-type Query {
- posts: [Post!]! @skipAuth
- post(id: Int!): Post @skipAuth
-}
-```
-
-### Introspection and Playground Disabled in Production
-
-Because it is often useful to ask a GraphQL schema for information about what queries it supports, GraphQL allows us to do so using the [introspection](https://graphql.org/learn/introspection/) system.
-
-The [GraphQL Playground](https://www.graphql-yoga.com/docs/features/graphiql) is a way for you to interact with your schema and try out queries and mutations. It can show you the schema by inspecting it. You can find the GraphQL Playground at [http://localhost:8911/graphql](http://localhost:8911/graphql) when your dev server is running.
-
-> Because both introspection and the playground share possibly sensitive information about your data model, your data, and your queries and mutations, best practices for deploying a GraphQL server call for disabling these in production. RedwoodJS, **by default, only enables introspection and the playground when running in development**, that is, when `process.env.NODE_ENV === 'development'`.
-
-However, there may be cases where you want to enable introspection as well as the GraphQL Playground. You can enable introspection by setting the `allowIntrospection` option to `true` and enable GraphiQL by setting `allowGraphiQL` to `true`.
-
-Here is an example of the `createGraphQLHandler` function with the `allowIntrospection` and `allowGraphiQL` options set to `true`:
-
-```ts {8-9}
-export const handler = createGraphQLHandler({
- authDecoder,
- getCurrentUser,
- loggerConfig: { logger, options: {} },
- directives,
- sdls,
- services,
- allowIntrospection: true, // 👈 enable introspection in all environments
- allowGraphiQL: true, // 👈 enable GraphiQL Playground in all environments
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-:::warning
-
-Enabling introspection in production may pose a security risk, as it allows users to access information about your schema, queries, and mutations. Use this option with caution and make sure to secure your GraphQL API properly.
-
-There may be cases where one wants to allow introspection, but not GraphiQL.
-
-Or, you may want to enable GraphiQL, but not allow introspection; for example, to try out known queries, but not to share the entire set of possible operations and types.
-
-:::
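
-For example, a handler sketch (assuming the same imports as the examples above) that allows introspection but keeps the Playground disabled might look like:
-
-```ts
-export const handler = createGraphQLHandler({
-  loggerConfig: { logger, options: {} },
-  directives,
-  sdls,
-  services,
-  allowIntrospection: true, // 👈 clients may introspect the schema
-  allowGraphiQL: false, // 👈 but the GraphiQL Playground stays disabled
-  onException: () => {
-    // Disconnect from your database with an unhandled exception.
-    db.$disconnect()
-  },
-})
-```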
-
-
-### GraphQL Armor Configuration
-
-[GraphQL Armor](https://escape.tech/graphql-armor/) is a middleware that adds a security layer to the RedwoodJS GraphQL endpoint, configured with sensible defaults.
-
-You don't have to configure anything to enforce protection against alias, cost, depth, directive, and token abuse in GraphQL operations, as well as to block field suggestions or revealing error messages that might leak sensitive information.
-
-But, if you need to enable, disable, or modify the default settings, GraphQL Armor is fully configurable in a per-plugin fashion.
-
-Simply define and provide a custom GraphQL Security configuration to your `createGraphQLHandler`:
-
-```ts
-export const handler = createGraphQLHandler({
- authDecoder,
- getCurrentUser,
- loggerConfig: { logger, options: {} },
- directives,
- sdls,
- services,
- armorConfig, // 👈 custom GraphQL Security configuration
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-For example, the default max query depth limit is 6. To change that setting to 2 levels, simply provide the configuration to your handler:
-
-```ts
-export const handler = createGraphQLHandler({
- authDecoder,
- getCurrentUser,
- loggerConfig: { logger, options: {} },
- directives,
- sdls,
- services,
- armorConfig: { maxDepth: { n: 2 } },
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-#### Max Aliases
-
-This protection is enabled by default.
-
-Limit the number of aliases in a document. Defaults to 15.
-
-##### Example
-
-Aliases allow you to rename the data that is returned in a query’s results. They manipulate the structure of the query result that is fetched from your service, displaying it according to your web component's needs.
-
-This contrived example uses 11 aliases to rename a Post's `id` and `title` to various permutations of post, article, and blog, returning a different shape in the query result as `articles`:
-
-```ts
- {
- articles: posts {
- id
- articleId: id
- postId: id
- articlePostId: id
- postArticleId: id
- blogId: id
- title
- articleTitle: title
- postTitle: title
- articlePostTitle: title
- postArticleTitle: title
- blogTitle: title
- }
-}
-```
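For a rough feel of what the protection counts, here's a naive alias counter. This is an illustration only, and an assumption about the mechanism: the real plugin inspects the parsed document, not raw text, and this regex would also match argument names like `(if: true)`.

```typescript
// Naive sketch: count `alias:` occurrences in a raw GraphQL document string.
function countAliases(doc: string): number {
  const matches = doc.match(/\b[A-Za-z_][A-Za-z0-9_]*\s*:/g) ?? []
  return matches.length
}

// A trimmed-down version of the example above, with 3 aliases
const doc = `{ articles: posts { id articleId: id title blogTitle: title } }`
console.log(countAliases(doc)) // 3
```

With the default `n: 15`, the full 11-alias example above would still pass; a document with 16 or more aliases would be rejected.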
-
-##### Configuration and Defaults
-
-Limit the number of aliases in a document. Defaults to 15.
-
-You can change the default value via the `maxAliases` setting when creating your GraphQL handler.
-
-```ts
-{
- maxAliases: {
- enabled: true,
- n: 15,
- }
-}
-```
-#### Cost Limit
-
-This protection is enabled by default.
-
-It analyzes incoming GraphQL queries and applies a cost analysis algorithm to prevent resource overload by blocking too expensive requests (DoS attack attempts).
-
-The cost computation is quite simple (and naive) at the moment, but there are plans to evolve it toward an extensive plugin with many features.
-
-Defaults to an overall `maxCost` limit of 5000.
-
-##### Overview
-
-Cost is a factor of the kind of field and depth. Total Cost is a cumulative sum of each field based on its type and its depth in the query.
-
-Scalar fields -- those that return values like strings or numbers -- are worth one cost value, whereas objects are worth another.
-
-How deep they are nested in the query is a multiplier factor such that:
-
-```
-COST = FIELD_KIND_COST * (DEPTH * DEPTH_COST_FACTOR)
-TOTAL_COST = SUM(COST)
-```
-
-If the `TOTAL_COST` exceeds the `maxCost`, an error stops GraphQL execution and rejects the request.
-
-You have control over the field kind and depth costs settings, but the defaults are:
-
-```
-objectCost: 2, // cost of retrieving an object
-scalarCost: 1, // cost of retrieving a scalar
-depthCostFactor: 1.5, // multiplicative cost of depth
-```
-
-##### Example
-
-In this small example, we have one object field `me` that contains two nested scalar fields, `id` and `user`. There is an operation `profile` (which is neither a scalar nor an object and is thus ignored in the cost calculation).
-
-```ts
-{
- profile {
- me {
- id
- user
- }
- }
-}
-```
-The cost breakdown is:
-
-* two scalars `id` and `user` worth 1 each
-* they are at level 1 depth with a depth factor of 1.5
-* 2 \* ( 1 \* 1.5 ) = 2 \* 1.5 = 3
-* their parent object is `me` worth 2
-
-Therefore the total cost is 2 + 3 = 5.
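The arithmetic above can be sketched in a few lines. One assumption made explicit here: following the worked example, top-level fields (depth 0) are charged their kind cost without the depth multiplier.

```typescript
// Sketch of the naive cost model with the default settings.
const objectCost = 2 // cost of retrieving an object
const scalarCost = 1 // cost of retrieving a scalar
const depthCostFactor = 1.5 // multiplicative cost of depth

function fieldCost(kind: 'object' | 'scalar', depth: number): number {
  const kindCost = kind === 'object' ? objectCost : scalarCost
  // top-level fields (depth 0) are charged their kind cost only (assumption)
  return depth === 0 ? kindCost : kindCost * (depth * depthCostFactor)
}

// `me` is an object at depth 0; `id` and `user` are scalars at depth 1
const totalCost =
  fieldCost('object', 0) + fieldCost('scalar', 1) + fieldCost('scalar', 1)

console.log(totalCost) // 5
```

Since 5 is well under the default `maxCost` of 5000, this query would be accepted.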
-
-:::note
-The operation definition `query` and the operation name are ignored in the calculation. This is the case even if you name your query `MY_PROFILE`, like:
-
-```graphql
-query MY_PROFILE {
-  profile {
-    me {
-      id
-      user
-    }
-  }
-}
-```
-:::
-
-##### Configuration and Defaults
-
-Defaults to an overall `maxCost` limit of 5000.
-
-You can change the default value via the `costLimit` setting when creating your GraphQL handler.
-
-
-```ts
-{
- costLimit: {
- enabled: true,
- maxCost: 5000, // maximum cost of a request before it is rejected
- objectCost: 2, // cost of retrieving an object
- scalarCost: 1, // cost of retrieving a scalar
- depthCostFactor: 1.5, // multiplicative cost of depth
- }
-}
-```
-
-#### Max Depth Limit
-
-This protection is enabled by default.
-
-Limit the depth of a document. Defaults to 6 levels.
-
-Attackers often submit expensive, deeply nested queries to abuse query depth, which can overload your database or expend costly resources.
-
-Typically, these unbounded, complex, and expensive GraphQL queries are deeply nested and take advantage of an understanding of your schema (hence why schema introspection is disabled by default in production) and its data model relationships to create "cyclical" queries.
-
-##### Example
-
-An example of a cyclical query takes advantage of knowing that an author has posts and each post has an author ... that has posts ... that has an author ... and so on.
-
-This cyclical query has a depth of 8 (and could go deeper).
-
-```jsx
-// cyclical query example
-// depth: 8+
-query cyclical {
- author(id: 'jules-verne') {
- posts {
- author {
- posts {
- author {
- posts {
- author {
- ... {
- ... # more deep nesting!
- }
- }
- }
- }
- }
- }
- }
- }
-}
-```
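A depth limiter effectively performs something like the following sketch, shown here over a plain nested object standing in for the parsed selection set (an illustration — the real plugin walks the GraphQL AST):

```typescript
// Count the maximum nesting depth of a selection set.
// A `null` value marks a leaf field with no sub-selections.
type Selection = { [field: string]: Selection | null }

function maxDepth(sel: Selection | null): number {
  if (sel === null) return 0
  const childDepths = Object.values(sel).map(maxDepth)
  return 1 + (childDepths.length > 0 ? Math.max(...childDepths) : 0)
}

// author { posts { author { posts { author } } } }
const cyclical: Selection = {
  author: { posts: { author: { posts: { author: null } } } },
}

console.log(maxDepth(cyclical)) // 5
```

With the default limit of 6 levels this truncated query would pass, but each additional `posts { author { ... } }` cycle adds 2 levels and would soon be rejected.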
-##### Configuration and Defaults
-
-Defaults to 6 levels.
-
-You can change the default value via the `maxDepth` setting when creating your GraphQL handler.
-
-```ts
-{
- maxDepth: {
- enabled: true,
- n: 6,
- }
-}
-```
-
-#### Max Directives
-
-This protection is enabled by default.
-
-Limit the number of directives in a document. Defaults to 50.
-
-##### Example
-
-The following example demonstrates that, by using the `@include` and `@skip` GraphQL query directives, one can design a large request that requires computation but in fact returns the expected response ...
-
-```ts
-{
- posts {
- id @include(if:true)
- id @include(if:false)
- id @include(if:false)
- id @skip(if:true)
- id @skip(if:true)
- id @skip(if:true)
- title @include(if:true)
- title @include(if:false)
- title @include(if:false)
- title @skip(if:true)
- title @skip(if:true)
- title @skip(if:true)
- }
-}
-```
-
-... of formatted Posts, each with just a single `id` and `title`.
-
-```ts
-{
- "data": {
- "posts": [
- {
- "id": 1,
- "title": "A little more about RedwoodJS"
- },
- {
- "id": 2,
- "title": "What is GraphQL?"
- },
- {
- "id": 3,
- "title": "Welcome to the RedwoodJS Community!"
- },
- {
- "id": 4,
- "title": "10 ways to secure your GraphQL endpoint"
- }
- ]
- }
-}
-```
-
-By limiting the maximum number of directives in the document, malicious queries can be rejected.
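A rough illustration of the counting involved (an approximation — the real plugin counts directives in the parsed document, not via text matching):

```typescript
// Naive directive counter: count `@name` occurrences in a raw document string.
function countDirectives(doc: string): number {
  return (doc.match(/@[A-Za-z_][A-Za-z0-9_]*/g) ?? []).length
}

const doc = `{ posts { id @include(if: true) title @skip(if: true) } }`
console.log(countDirectives(doc)) // 2
```

The abusive example above contains 12 directives; with the default `n: 50` it would still pass, so you may want a lower limit if your schema never needs many directives per request.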
-
-##### Configuration and Defaults
-
-You can change the default value via the `maxDirectives` setting when creating your GraphQL handler.
-
-```ts
-{
- maxDirectives: {
- enabled: true,
- n: 50,
- }
-}
-```
-#### Max Tokens
-
-This protection is enabled by default.
-
-Limit the number of GraphQL tokens in a document.
-
-In computer science, lexical analysis, lexing, or tokenization is the process of converting a sequence of characters into a sequence of lexical tokens.
-
-For example, given the following GraphQL operation:
-
-```graphql
-query {
-  me {
-    id
-    user
-  }
-}
-```
-
-The tokens are `query`, `{`, `me`, `{`, `id`, `user`, `}` and `}`, for a total count of 8 tokens.
-
-##### Example
-
-Given the query with 8 tokens:
-
-```graphql
-query {
-  me {
-    id
-    user
-  }
-}
-```
-
-And a custom configuration to allow a maximum of two tokens:
-
-```
-const armorConfig = {
- maxTokens: { n: 2 },
-}
-```
-
-An error is raised:
-
-```
-'Syntax Error: Token limit of 2 exceeded, found 3.'
-```
-
-:::note
-
-When reporting the number of found tokens, the number reported is not the total token count, but the count at the point where the limit was exceeded.
-
-Therefore the number found will be n + 1.
-:::
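For illustration, a toy tokenizer that handles only names and common punctuators (an approximation — GraphQL's real lexer also handles strings, numbers, comments, variables, and more):

```typescript
// Toy lexer sketch: count names and punctuator tokens in a document string.
function countTokens(doc: string): number {
  const tokens = doc.match(/[A-Za-z_][A-Za-z0-9_]*|[{}():!\[\]$]/g) ?? []
  return tokens.length
}

console.log(countTokens('query { me { id user } }')) // 8
```

A `maxTokens` check like this can reject oversized documents before any expensive parsing or validation work happens.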
-
-##### Configuration and Defaults
-
-Defaults to 1000.
-
-You can change the default value via the `maxTokens` setting when creating your GraphQL handler.
-
-```ts
-{
- maxTokens: {
- enabled: true,
- n: 1000,
- }
-}
-```
-#### Block Field Suggestions
-
-This plugin is enabled by default.
-
-It will prevent suggesting fields in case of an erroneous request. Suggestions can lead to the leak of your schema even with disabled introspection, which can be very detrimental in case of a private API.
-
-Example of such a suggestion:
-
-`Cannot query field "sta" on type "Media". Did you mean "stats", "staff", or "status"?`
-
-##### Configuration and Defaults
-
-Enabled by default.
-
-You can change the default value via the `blockFieldSuggestion` setting when creating your GraphQL handler.
-
-```ts
-{
- blockFieldSuggestion: {
- enabled: true,
- }
-}
-```
-Enabling will hide the field suggestion:
-
-`Cannot query field "sta" on type "Media". [Suggestion hidden]?`
-
-Or, if you want a custom mask:
-
-```ts
-{
-
- blockFieldSuggestion: {
- mask: ''
- },
-}
-```
-
-`Cannot query field "sta" on type "Media". [REDACTED]?`
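Conceptually, masking a suggestion is just a rewrite of the error message, something like this sketch (an illustration, not the plugin's actual code):

```typescript
// Replace the trailing "Did you mean ...?" hint with a mask string.
function maskSuggestion(message: string, mask = '[Suggestion hidden]'): string {
  return message.replace(/Did you mean .+\?/, `${mask}?`)
}

console.log(
  maskSuggestion('Cannot query field "sta" on type "Media". Did you mean "stats"?')
)
// Cannot query field "sta" on type "Media". [Suggestion hidden]?
```

The key point is that the leading part of the message is still useful for debugging, while the schema-revealing suggestion is gone.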
-
-
-### Error Masking
-
-In many GraphQL servers, when an error is thrown, the details of that error are leaked to the outside world. The error and its message are then returned in the response and a client might reveal those errors in logs or even render the message to the user. You could potentially leak sensitive or other information about your app you don't want to share—such as database connection failures or even the presence of certain fields.
-
-Redwood is here to help!
-
-Redwood prevents leaking sensitive error-stack information out-of-the-box for unexpected errors.
-If an error is thrown that isn't one of [Redwood's GraphQL Errors](#redwood-errors) and isn't based on a `GraphQLError`:
-
-- The original error and its message will be logged using the defined GraphQL logger, so you'll know what went wrong
-- A default message "Something went wrong" will replace the error message in the response (Note: you can customize this message)
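The rule can be summarized with a small sketch (an illustration of the behavior, not Redwood's actual implementation — in Redwood, `RedwoodError` extends the GraphQL error type):

```typescript
// Errors that aren't Redwood errors have their message masked.
class RedwoodError extends Error {}

function maskedMessage(error: Error, defaultError = 'Something went wrong'): string {
  // Redwood errors pass through; everything else is replaced
  return error instanceof RedwoodError ? error.message : defaultError
}

console.log(maskedMessage(new Error('connect ECONNREFUSED 127.0.0.1:5432')))
// Something went wrong
console.log(maskedMessage(new RedwoodError('An email is required.')))
// An email is required.
```

So a database connection failure never reaches the client verbatim, while a deliberately thrown user-facing message does.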
-
-#### Customizing the Error Message
-
-But what if you still want to share an error message with the client?
-Simply use one of [Redwood's GraphQL Errors](#redwood-errors) and your custom message will be shared with your users.
-
-#### Customizing the Default Error Message
-
-You can customize the default "Something went wrong" message used when the error is masked via the `defaultError` setting on the `createGraphQLHandler`:
-
-```tsx
-export const handler = createGraphQLHandler({
- loggerConfig: { logger, options: {} },
- directives,
- sdls,
- services,
- defaultError: 'Sorry about that', // 👈 Customize the error message
- onException: () => {
- // Disconnect from your database with an unhandled exception.
- db.$disconnect()
- },
-})
-```
-
-#### Redwood Errors
-
-Redwood Errors are inspired by [Apollo Server Error codes](https://www.apollographql.com/docs/apollo-server/data/errors/#error-codes) for common use cases:
-
-To use a Redwood Error, import it from `@redwoodjs/graphql-server`.
-
-- `SyntaxError` - A syntax error occurred while parsing the GraphQL document
-- `ValidationError` - Invalid input to a service
-- `AuthenticationError` - Failed to authenticate
-- `ForbiddenError` - Unauthorized to access
-- `UserInputError` - Missing input to a service
-
-If you use one of the errors, then the message provided will not be masked and will be shared in the GraphQL response:
-
-```tsx
-import { UserInputError } from '@redwoodjs/graphql-server'
-// ...
-throw new UserInputError('An email is required.')
-```
-
-
-##### Custom Errors and Uses
-
-Need your own custom error and message?
-
-Maybe you're integrating with a third-party API and want to handle errors from that service and also want control of how that error is shared with your user client-side.
-
-Simply extend from `RedwoodError` and you're all set!
-
-```tsx
-export class MyCustomError extends RedwoodError {
-  constructor(message: string, extensions?: Record<string, unknown>) {
-    super(message, extensions)
-  }
-}
-```
-
-For example, in your service, you can create and use it to handle the error and return a friendly message:
-
-```tsx
-export class WeatherError extends RedwoodError {
-  constructor(message: string, extensions?: Record<string, unknown>) {
-    super(message, extensions)
-  }
-}
-
-export const getWeather = async ({ input }: WeatherInput) => {
-  try {
-    const weather = await weatherClient.get(input.zipCode)
-    return weather
-  } catch (error) {
-    // rate limit issue
-    if (error.statusCode === 429) {
-      throw new WeatherError('Unable to get the latest weather updates at the moment. Please try again shortly.')
-    }
-
-    // other error
-    throw new WeatherError(`We could not get the weather for ${input.zipCode}.`)
-  }
-}
-```
-
-#### CSRF Prevention
-
-If you have CORS enabled, almost all requests coming from the browser trigger a preflight request. However, some requests are deemed "simple" and don't make a preflight. One example is a good ol' GET request without any custom headers: it can be marked as "simple", and the preflight CORS check is skipped entirely.
-
-This attack can be mitigated by saying: "all GET requests must have a custom header set". This would force all clients to manipulate the headers of GET requests, marking them as "_not-_simple" and therefore always executing a preflight request.
-
-You can achieve this by using the [`@graphql-yoga/plugin-csrf-prevention` GraphQL Yoga plugin](https://the-guild.dev/graphql/yoga-server/docs/features/csrf-prevention).
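As a sketch of what wiring that plugin up looks like on a plain GraphQL Yoga server (Redwood's `createGraphQLHandler` configures Yoga for you, so the exact integration point in a Redwood app may differ):

```typescript
import { createYoga } from 'graphql-yoga'
import { useCSRFPrevention } from '@graphql-yoga/plugin-csrf-prevention'

// Requests that don't carry this header (and wouldn't otherwise trigger
// a CORS preflight) are rejected.
const yoga = createYoga({
  plugins: [
    useCSRFPrevention({
      requestHeaders: ['x-graphql-yoga-csrf'],
    }),
  ],
})
```

Clients then add the `x-graphql-yoga-csrf` header to every request, which forces the browser to run a preflight and lets your CORS policy do its job.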
-
-## Self-Documenting GraphQL API
-
-RedwoodJS helps you document your GraphQL API by generating commented SDL used for GraphiQL and the GraphQL Playground explorer -- and the output can be turned into API docs using tools like [Docusaurus](#use-in-docusaurus).
-
-If you specify the SDL generator with its `--docs` option, any comments (which the [GraphQL spec](https://spec.graphql.org/October2021/#sec-Descriptions) calls "descriptions") will be incorporated into your RedwoodJS app's `graphql.schema` file when generating types.
-
-If you comment your Prisma schema models, its fields, or enums, the SDL generator will use those comments as the documentation.
-
-If there is no Prisma comment, the SDL generator will provide a default comment that you can then edit.
-
-:::note
-If you re-generate the SDL, any custom comments will be overwritten.
-However, if you make those edits in your Prisma schema, then those will be used.
-:::
-
-### Prisma Schema Comments
-
-Your Prisma schema is documented with triple-slash comments (`///`) that precede:
-
-* Model names
-* Enum names
-* Each model field name
-
-```
-/// A blog post.
-model Post {
- /// The unique identifier of a post.
- id Int @id @default(autoincrement())
- /// The title of a post.
- title String
- /// The content of a post.
- body String
- /// When the post was created.
- createdAt DateTime @default(now())
-}
-
-/// A list of allowed colors.
-enum Color {
- RED
- GREEN
- BLUE
-}
-```
-
-### SDL Comments
-
-When used with the `--docs` option, the [SDL generator](cli-commands#generate-sdl) adds comments for:
-
-* Directives
-* Queries
-* Mutations
-* Input Types
-
-:::note
-By default, the `--docs` option to the SDL generator is false and comments are not created.
-:::
-
-Comments enclosed in `"""` or `"` (called "descriptions" in the [GraphQL spec](https://spec.graphql.org/October2021/#sec-Descriptions)) in your SDL files will be included in the generated GraphQL schema at the root of your project (`.redwood/schema.graphql`).
-
-```
-"""
-Use to check whether or not a user is authenticated and is associated
-with an optional set of roles.
-"""
-directive @requireAuth(roles: [String]) on FIELD_DEFINITION
-
-"""Use to skip authentication checks and allow public access."""
-directive @skipAuth on FIELD_DEFINITION
-
-"""
-Autogenerated input type of InputPost.
-"""
-input CreatePostInput {
- "The content of a post."
- body: String!
-
- "The title of a post."
- title: String!
-}
-
-"""
-Autogenerated input type of UpdatePost.
-"""
-input UpdatePostInput {
- "The content of a post."
- body: String
-
- "The title of a post."
- title: String
-}
-
-"""
-A blog post.
-"""
-type Post {
- "The content of a post."
- body: String!
-
- "Description for createdAt."
- createdAt: DateTime!
-
- "The unique identifier of a post."
- id: Int!
-
- "The title of a post."
- title: String!
-}
-
-"""
-About mutations
-"""
-type Mutation {
- "Creates a new Post."
- createPost(input: CreatePostInput!): Post!
-
- "Deletes an existing Post."
- deletePost(id: Int!): Post!
-
- "Updates an existing Post."
- updatePost(id: Int!, input: UpdatePostInput!): Post!
-}
-
-"""
-About queries
-"""
-type Query {
- "Fetch a Post by id."
- post(id: Int!): Post
-
- "Fetch Posts."
- posts: [Post!]!
-}
-```
-
-#### Root Schema
-
-Documentation is also generated for the Redwood Root Schema, which defines details about Redwood such as the current user and version information.
-
-```
-type Query {
- "Fetches the Redwood root schema."
- redwood: Redwood
-}
-
-"""
-The Redwood Root Schema
-
-Defines details about Redwood such as the current user and version information.
-"""
-type Redwood {
- "The current user."
- currentUser: JSON
-
- "The version of Prisma."
- prismaVersion: String
-
- "The version of Redwood."
- version: String
-}
-
-scalar BigInt
-scalar Date
-scalar DateTime
-scalar JSON
-scalar JSONObject
-scalar Time
-
-```
-
-### Preview in GraphiQL
-
-The [GraphQL Playground aka GraphiQL](https://www.graphql-yoga.com/docs/features/graphiql) is a way for you to interact with your schema and try out queries and mutations. It can show you the schema by inspecting it. You can find the GraphQL Playground at [http://localhost:8911/graphql](http://localhost:8911/graphql) when your dev server is running.
-
-The documentation generated is present when exploring the schema.
-
-#### Queries
-
-
-
-#### Mutations
-
-
-
-#### Model Types
-
-
-
-#### Input Types
-
-
-
-### Use in Docusaurus
-
-If your project uses [Docusaurus](https://docusaurus.io), the generated commented SDL can be used to publish documentation using the [graphql-markdown](https://graphql-markdown.github.io) plugin.
-
-#### Basic Setup
-
-The following is some basic setup information, but please consult the [Docusaurus](https://docusaurus.io) and [graphql-markdown](https://graphql-markdown.github.io) documentation for the latest instructions.
-
-1. Install Docusaurus (if you have not done so already)
-
-```terminal
-npx create-docusaurus@latest docs classic
-```
-
-
-Add `docs` to your `workspaces` in the project's `package.json`:
-
-```
- "workspaces": {
- "packages": [
- "docs",
- "api",
- "web",
- "packages/*"
- ]
- },
-```
-
-2. Ensure a `docs` directory exists at the root of your project
-
-```terminal
-mkdir docs # if needed
-```
-
-3. Install the GraphQL Generators Plugin
-
-```terminal
-yarn workspace docs add @edno/docusaurus2-graphql-doc-generator graphql
-```
-
-4. Ensure a directory for your generated GraphQL API documentation exists within the Docusaurus `/docs` directory structure
-
-```terminal
-# Change into the "docs" workspace
-cd docs
-
-# you should have the "docs" directory and within that a "graphql-api" directory
-mkdir docs/graphql-api # if needed
-```
-
-5. Update `docs/docusaurus.config.js` and configure the plugin and navbar
-
-```
-// docs/docusaurus.config.js
-// ...
- plugins: [
- [
- '@edno/docusaurus2-graphql-doc-generator',
- {
- schema: '../.redwood/schema.graphql',
- rootPath: './docs',
- baseURL: 'graphql-api',
- linkRoot: '../..',
- },
- ],
- ],
-// ...
-themeConfig:
- /** @type {import('@docusaurus/preset-classic').ThemeConfig} */
- ({
- navbar: {
- title: 'My Site',
- logo: {
- alt: 'My Site Logo',
- src: 'img/logo.svg',
- },
- items: [
- {
- to: '/docs/graphql-api', // adjust the location depending on your baseURL (see configuration)
- label: 'GraphQL API', // change the label with yours
- position: 'right',
- },
-//...
-```
-6. Update `docs/sidebars.js` to include the generated `graphql-api/sidebar-schema.js`
-
-```
-// docs/sidebars.js
-/**
- * Creating a sidebar enables you to:
- * - create an ordered group of docs
- * - render a sidebar for each doc of that group
- * - provide next/previous navigation
- *
- * The sidebars can be generated from the filesystem, or explicitly defined here.
- *
- * Create as many sidebars as you want.
- */
-
-// @ts-check
-
-/** @type {import('@docusaurus/plugin-content-docs').SidebarsConfig} */
-const sidebars = {
- // By default, Docusaurus generates a sidebar from the docs folder structure
- tutorialSidebar: [
- {
- type: 'autogenerated',
- dirName: '.',
- },
- ],
- ...require('./docs/graphql-api/sidebar-schema.js'),
-}
-
-module.exports = sidebars
-```
-
-7. Generate the docs
-
-`yarn docusaurus graphql-to-doc`
-
-:::tip
-You can overwrite the generated docs and bypass the plugin's `diffMethod` by using `--force`:
-
-`yarn docusaurus graphql-to-doc --force`
-:::
-
-8. Start Docusaurus
-
-```
-yarn start
-```
-
-##### Example Screens
-
-##### Schema Documentation
-![graphql-doc-example-main](/img/graphql-api-docs/schema-doc.png)
-
-##### Type Example
-![graphql-doc-example-type](/img/graphql-api-docs/contact-type.png)
-
-##### Query Example
-![graphql-doc-example-query](/img/graphql-api-docs/contact-query.png)
-
-##### Mutation Example
-![graphql-doc-example-mutation](/img/graphql-api-docs/schema-mutation.png)
-
-##### Directive Example
-![graphql-doc-example-directive](/img/graphql-api-docs/schema-directive.png)
-
-##### Scalar Example
-![graphql-doc-example-scalar](/img/graphql-api-docs/schema-scalar.png)
-
-## FAQ
-
-### Why Doesn't Redwood Use Something Like Nexus?
-
-This might be one of our most frequently asked questions of all time. Here's [Tom's response in the forum](https://community.redwoodjs.com/t/anyone-playing-around-with-nexus-js/360/5):
-
-> We started with Nexus, but ended up pulling it out because we felt like it was too much of an abstraction over the SDL. It’s so nice being able to just read the raw SDL to see what the GraphQL API is.
-
-
-
-
-## Further Reading
-
-Eager to learn more about GraphQL? Check out some of the resources below:
-- [GraphQL.wtf](https://graphql.wtf) covers most aspects of GraphQL and publishes one short video a week
-- The official GraphQL Yoga (the GraphQL server powering Redwood) [tutorial](https://www.graphql-yoga.com/tutorial/basic/00-introduction) is the best place to get your hands on GraphQL basics
-- And of course, [the official GraphQL docs](https://graphql.org/learn/) are great place to do a deep dive into exactly how GraphQL works
diff --git a/docs/versioned_docs/version-7.0/graphql/fragments.md b/docs/versioned_docs/version-7.0/graphql/fragments.md
deleted file mode 100644
index 71f693423fa4..000000000000
--- a/docs/versioned_docs/version-7.0/graphql/fragments.md
+++ /dev/null
@@ -1,310 +0,0 @@
-# Fragments
-
-[GraphQL fragments](https://graphql.org/learn/queries/#fragments) are reusable units of GraphQL queries that allow developers to define a set of fields that can be included in multiple queries. Fragments help improve code organization, reduce duplication, and make GraphQL queries more maintainable. They are particularly useful when you want to request the same set of fields on different parts of your data model or when you want to share query structures across multiple components or pages in your application.
-
-## What are Fragments?
-
-Here are some key points about GraphQL fragments:
-
-1. **Reusability**: Fragments allow you to define a set of fields once and reuse them in multiple queries. This reduces redundancy and makes your code more DRY (Don't Repeat Yourself).
-
-2. **Readability**: Fragments make queries more readable by separating the query structure from the actual query usage. This can lead to cleaner and more maintainable code.
-
-3. **Maintainability**: When you need to make changes to the requested fields, you only need to update the fragment definition in one place, and all queries using that fragment will automatically reflect the changes.
-
-## Basic Usage
-
-Here's a basic example of how you might use GraphQL fragments in developer documentation:
-
-Let's say you have a GraphQL schema representing books, and you want to create a fragment for retrieving basic book information like title, author, and publication year.
-
-
-```graphql
-# Define a GraphQL fragment for book information
-fragment BookInfo on Book {
- id
- title
- author
- publicationYear
-}
-
-# Example query using the BookInfo fragment
-query GetBookDetails($bookId: ID!) {
- book(id: $bookId) {
- ...BookInfo
- description
- # Include other fields specific to this query
- }
-}
-```
-
-In this example:
-
-- We've defined a fragment called `BookInfo` that specifies the fields we want for book information.
-- In the `GetBookDetails` query, we use the `...BookInfo` spread syntax to include the fields defined in the fragment.
-- We also include additional fields specific to this query, such as `description`.
-
-By using the `BookInfo` fragment, you can maintain a consistent set of fields for book information across different parts of your application without duplicating the field selection in every query. This improves code maintainability and reduces the chance of errors.
-
-In developer documentation, you can explain the purpose of the fragment, provide examples like the one above, and encourage developers to use fragments to organize and reuse their GraphQL queries effectively.
-
-## Using Fragments in RedwoodJS
-
-RedwoodJS makes it easy to use fragments, especially with VS Code and Apollo GraphQL Client.
-
-First, RedwoodJS instructs the VS Code GraphQL Plugin where to look for fragments by configuring the `documents` attribute of your project's `graphql.config.js`:
-
-```js
-// graphql.config.js
-
-const { getPaths } = require('@redwoodjs/internal')
-
-module.exports = {
- schema: getPaths().generated.schema,
- documents: './web/src/**/!(*.d).{ts,tsx,js,jsx}', // 👈 Tells VS Code plugin where to find fragments
-}
-```
-
-Second, RedwoodJS automatically creates the [fragmentRegistry](https://www.apollographql.com/docs/react/data/fragments/#registering-named-fragments-using-createfragmentregistry) needed for Apollo to know about the fragments in your project without needing to interpolate their declarations.
-
-Redwood exports ways to interact with fragments in the `@redwoodjs/web/apollo` package.
-
-```
-import { fragmentRegistry, registerFragment } from '@redwoodjs/web/apollo'
-```
-
-With `fragmentRegistry`, you can interact with the registry directly.
-
-With `registerFragment`, you can register a fragment with the registry and get back:
-
-```ts
-{ fragment, typename, getCacheKey, useRegisteredFragment }
-```
-
-which can then be used to work with the registered fragment.
-
-### Setup
-
-`yarn rw setup graphql fragments`
-
-See more in [cli commands - setup graphql fragments](../cli-commands.md#setup-graphql-fragments).
-
-### registerFragment
-
-To register a fragment, simply pass it to `registerFragment`:
-
-```ts
-import { registerFragment } from '@redwoodjs/web/apollo'
-
-registerFragment(
- gql`
- fragment BookInfo on Book {
- id
- title
- author
- publicationYear
- }
- `
-)
-```
-
-This makes the `BookInfo` available to use in your query:
-
-
-```ts
-import type { GetBookDetails } from 'types/graphql'
-
-import { useQuery } from '@redwoodjs/web'
-
-import BookInfo from 'src/components/BookInfo'
-
-const GET_BOOK_DETAILS = gql`
-  query GetBookDetails($bookId: ID!) {
-    book(id: $bookId) {
-      ...BookInfo
-      description
-      # Include other fields specific to this query
-    }
-  }
-`
-
-// ...
-
-const { data, loading } = useQuery(GET_BOOK_DETAILS, {
-  variables: { bookId },
-})
-```
-
-
-You can then access the book info from `data` and render:
-
-```ts
-{!loading && (
-  <>
-    <h2>Title: {data.title}</h2>
-    <p>
-      by {data.author} ({data.publicationYear})
-    </p>
-  </>
-)}
-```
-
-### fragment
-
-Access the original fragment you registered.
-
-```ts
-import { fragment } from '@redwoodjs/web/apollo'
-```
-
-### typename
-
-Access the typename of the fragment you registered.
-
-```ts
-import { typename } from '@redwoodjs/web/apollo'
-```
-
-For example, with
-
-```graphql
-# Define a GraphQL fragment for book information
-fragment BookInfo on Book {
- id
- title
- author
- publicationYear
-}
-```
-
-the `typename` is `Book`.
-
-
-
-### getCacheKey
-
-A helper function to create the cache key for the data associated with the fragment in Apollo cache.
-
-```ts
-import { getCacheKey } from '@redwoodjs/web/apollo'
-```
-
-For example, with
-
-```graphql
-# Define a GraphQL fragment for book information
-fragment BookInfo on Book {
- id
- title
- author
- publicationYear
-}
-```
-
-the `getCacheKey` is a function where `getCacheKey(42)` would return `Book:42`.
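In other words, the helper behaves like this sketch (the `Typename:id` shape follows Apollo's default cache key format; the factory function here is illustrative, not Redwood's actual implementation):

```typescript
// Build an Apollo-style cache key from a typename and an id.
function makeGetCacheKey(typename: string) {
  return (id: string | number): string => `${typename}:${id}`
}

const getCacheKey = makeGetCacheKey('Book')
console.log(getCacheKey(42)) // Book:42
```

That key is what you'd pass to Apollo cache operations (such as `cache.modify`) to target the fragment's data directly.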
-
-### useRegisteredFragment
-
-```ts
-import { registerFragment } from '@redwoodjs/web/apollo'
-
-const { useRegisteredFragment } = registerFragment(
- // ...
-)
-```
-
-A helper function that relies on Apollo's [`useFragment` hook](https://www.apollographql.com/docs/react/data/fragments/#usefragment) to read the fragment's data from the Apollo cache.
-
-The `useFragment` hook represents a lightweight live binding into the Apollo Client cache. It enables Apollo Client to broadcast specific fragment results to individual components. This hook returns an always-up-to-date view of whatever data the cache currently contains for a given fragment. `useFragment` never triggers network requests of its own.
-
-
-This means that once the Apollo Client cache has loaded the data needed for the fragment, you can render the fragment component with just its id reference.
-
-Also, anywhere the fragment component is rendered will be updated with the latest data whenever a `useQuery` that uses the fragment receives new data.
-
-```ts
-import type { Book } from 'types/graphql'
-
-import { registerFragment } from '@redwoodjs/web/apollo'
-
-const { useRegisteredFragment } = registerFragment(
- gql`
- fragment BookInfo on Book {
- id
- title
- author
- publicationYear
- }
- `
-)
-
-const Book = ({ id }: { id: string }) => {
- const { data, complete } = useRegisteredFragment(id)
-
-  return (
-    complete && (
-      <>
-        <h2>Title: {data.title}</h2>
-        <p>
-          by {data.author} ({data.publicationYear})
-        </p>
-      </>
-    )
-  )
-}
-
-export default Book
-```
-
-:::note
-In order to use [fragments](#fragments) with [unions](#unions) and interfaces in Apollo Client, you need to tell the client how to discriminate between the different types that implement or belong to a supertype.
-
-Please see how to [generate possible types from fragments and union types](#generate-possible-types).
-:::
-
-
-## Possible Types for Unions
-
-In order to use [fragments](#fragments) with [unions](#unions) and interfaces in Apollo Client, you need to tell the client how to discriminate between the different types that implement or belong to a supertype.
-
-You pass a possibleTypes option to the InMemoryCache constructor to specify these relationships in your schema.
-
-This object maps the name of an interface or union type (the supertype) to the types that implement or belong to it (the subtypes).
-
-For example (the type names here are illustrative):
-
-```ts
-// web/src/App.tsx
-
-const graphQLClientConfig = {
-  cacheConfig: {
-    possibleTypes: {
-      // supertype: [subtypes that implement or belong to it]
-      Media: ['Book', 'Movie'],
-    },
-  },
-}
-```
-
-To make this easier to maintain, RedwoodJS GraphQL CodeGen automatically generates `possibleTypes` so you can simply assign it to the `graphQLClientConfig`:
-
-
-```ts
-// web/src/App.tsx
-
-import possibleTypes from 'src/graphql/possibleTypes'
-
-// ...
-
-const graphQLClientConfig = {
- cacheConfig: {
- ...possibleTypes,
- },
-}
-
-
-```
-
-To generate the `src/graphql/possibleTypes` file, enable fragments in `redwood.toml`:
-
-```toml title=redwood.toml
-[graphql]
- fragments = true
-```
diff --git a/docs/versioned_docs/version-7.0/how-to/oauth.md b/docs/versioned_docs/version-7.0/how-to/oauth.md
deleted file mode 100644
index 4b65257ef5cf..000000000000
--- a/docs/versioned_docs/version-7.0/how-to/oauth.md
+++ /dev/null
@@ -1,835 +0,0 @@
-# OAuth
-
-If you're using an auth provider like [Auth0](/docs/auth/auth0), OAuth login to third party services (GitHub, Google, Facebook) is usually just a setting you can toggle on in your provider's dashboard. But if you're using [dbAuth](/docs/auth/dbauth), you'll only have username/password login to start. Adding one or more OAuth clients isn't hard, though. This recipe will walk you through it from scratch, adding OAuth login via GitHub.
-
-## Prerequisites
-
-This article assumes you have an app set up and are using dbAuth. We're going to make use of the dbAuth system to validate that you're who you say you are. If you just want to try this code out in a sandbox app, you can create a test blog app from scratch by checking out the [Redwood codebase](https://github.com/redwoodjs/redwood) itself and then running a couple of commands:
-
-```bash
-yarn install
-yarn build
-
-# typescript
-yarn run build:test-project ~/oauth-app
-
-# javascript
-yarn run build:test-project ~/oauth-app --javascript
-```
-
-That will create a simple blog application at `~/oauth-app`. You'll get a login and signup page, which we're going to enhance to include a **Login with GitHub** button.
-
-Speaking of GitHub, you'll also need a GitHub account so you can create an [OAuth app](https://docs.github.com/en/apps/oauth-apps/building-oauth-apps/creating-an-oauth-app).
-
-We also assume you're familiar with the basics of OAuth and the terminology surrounding it.
-
-## Login Flow
-
-Here's the logic flow we're going to implement:
-
-1. User comes to the login page and clicks a **Login with GitHub** button/link.
-2. The link directs the browser to GitHub's OAuth process at github.com.
-3. The user logs in with their GitHub credentials and approves our app.
-4. The browser is redirected back to our app, to a new function `/api/src/functions/oauth/oauth.js`.
-5. The function fetches the OAuth **access_token** with a call to GitHub, using the **code** that was included with the redirect from GitHub in the previous step.
-6. When the **access_token** is received, the function then requests the user data from GitHub via another fetch to GitHub's API.
-7. The function then checks our database for a user identified by GitHub's `id`. If no user is found, the `User` record is created using the info from the fetch in the previous step.
-8. The user data from our own database is used to create the same cookie that dbAuth creates on a successful login.
-9. The browser is redirected back to our site, and the user is now logged in.
-
-## GitHub OAuth App Setup
-
-In order to allow OAuth login with GitHub, we need to create an OAuth App. The instructions below are for creating one on your personal GitHub account, but if your app lives in a separate organization then you can perform the same steps under the org instead.
-
-First go to your [Settings](https://github.com/settings/profile) and then the [Developer settings](https://github.com/settings/apps) at the bottom left. Finally, click the [OAuth Apps](https://github.com/settings/developers) nav item at left:
-
-![OAuth app settings screenshot](https://user-images.githubusercontent.com/300/245297416-34821cb6-ace0-4a6a-9bf6-4e434d3cefc5.png)
-
-Click [**New OAuth App**](https://github.com/settings/applications/new) and fill it out something like this:
-
-![New OAuth app settings](https://user-images.githubusercontent.com/300/245298106-b35a6abe-6e8c-4ab1-8ab5-7b7e1dcc0a39.png)
-
-The important part is the **Authorization callback URL** which is where GitHub will redirect you back once authenticated (step 4 of the login flow above). This callback URL assumes you're using the default function location of `/.redwood/functions`. If you've changed that in your app be sure to change it here as well.
-
-Click **Register application** and then on the screen that follows, click the **Generate a new client secret** button:
-
-![New client secret button](https://user-images.githubusercontent.com/300/245298639-6e08a201-b0db-4df6-975f-592544bdced7.png)
-
-You may be asked to use your 2FA code to verify that you're who you say you are, but eventually you should see your new **Client secret**. Copy that, and the **Client ID** above it:
-
-![Client secret](https://user-images.githubusercontent.com/300/245298897-129b5d00-3bed-4d7e-a40e-f4c9cda8a21f.png)
-
-Add those to your app's `.env` file (or wherever you're managing your secrets). Note that it's best to have a different OAuth app on GitHub for each environment you deploy to. Consider this one the **dev** app, and you'll create a separate one with a different client ID and secret when you're ready to deploy to production:
-
-```bash title="/.env"
-GITHUB_OAUTH_CLIENT_ID=41a08ae238b5aee4121d
-GITHUB_OAUTH_CLIENT_SECRET=92e8662e9c562aca8356d45562911542d89450e1
-```
-
-We also need to denote what data we want permission to read from GitHub once someone authorizes our app. We'll want the user's public info, and probably their email address. That's only two scopes, and we can add those as another ENV var:
-
-```bash title="/.env"
-GITHUB_OAUTH_CLIENT_ID=41a08ae238b5aee4121d
-GITHUB_OAUTH_CLIENT_SECRET=92e8662e9c562aca8356d45562911542d89450e1
-# highlight-next-line
-GITHUB_OAUTH_SCOPES="read:user user:email"
-```
-
-If you wanted access to more GitHub data, you can specify [additional scopes](https://docs.github.com/en/apps/oauth-apps/building-oauth-apps/scopes-for-oauth-apps) here and they'll be listed to the user when they go to authorize your app. You can also change this list in the future, but you'll need to log the user out and the next time they click **Login with GitHub** they'll be asked to authorize your app again, with a new list of requested scopes.
-
-One more ENV var: this is the same callback URL we told GitHub about. It's used in the link behind the **Login with GitHub** button and gives GitHub another chance to verify that you're who you say you are: you're proving that you know where you're supposed to redirect back to:
-
-```bash title="/.env"
-GITHUB_OAUTH_CLIENT_ID=41a08ae238b5aee4121d
-GITHUB_OAUTH_CLIENT_SECRET=92e8662e9c562aca8356d45562911542d89450e1
-GITHUB_OAUTH_SCOPES="read:user user:email"
-# highlight-next-line
-GITHUB_OAUTH_REDIRECT_URI="http://localhost:8910/.redwood/functions/oauth/callback"
-```
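Before building the login button, it helps to see how these pieces fit together: the button will point the browser at GitHub's authorize endpoint, with the client ID, redirect URI, and scopes passed as query parameters. A quick sketch of assembling that URL with `URLSearchParams` (so the space in the scopes gets encoded correctly); the literal values here are placeholders standing in for the ENV vars:

```javascript
// Assemble GitHub's OAuth authorize URL from the same values we put in .env.
// (Placeholder values here; in the app they come from process.env.)
const clientId = '41a08ae238b5aee4121d'
const redirectUri = 'http://localhost:8910/.redwood/functions/oauth/callback'
const scopes = 'read:user user:email'

const params = new URLSearchParams({
  client_id: clientId,
  redirect_uri: redirectUri,
  scope: scopes,
})

const authorizeUrl = `https://github.com/login/oauth/authorize?${params}`

console.log(authorizeUrl)
```

When the user clicks a link pointing at this URL, GitHub takes over: it authenticates them, shows the requested scopes, and redirects back to our callback with a `code`.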
-
-## The Login Button
-
-This part is pretty easy: we're just going to add a link/button that goes directly to GitHub to begin the OAuth process:
-
-```jsx title="/web/src/pages/LoginPage/LoginPage.jsx"
-<a
-  href={`https://github.com/login/oauth/authorize?client_id=${process.env.GITHUB_OAUTH_CLIENT_ID}&redirect_uri=${process.env.GITHUB_OAUTH_REDIRECT_URI}&scope=${process.env.GITHUB_OAUTH_SCOPES}`}
->
-  Login with GitHub
-</a>
-```
-
-:::info
-This example uses Tailwind to style the link to match the rest of the default dbAuth login page
-:::
-
-You can put this same link on your signup page as well, since using the OAuth flow will be dual-purpose: it will log the user in if a local user already exists, or it will create the user and then log them in.
-
-We're using several of our new ENV vars here, and need to tell Redwood to make them available to the web side during the build process. Add them to the `includeEnvironmentVariables` key in `redwood.toml`:
-
-```toml title="/redwood.toml"
-[web]
- title = "Redwood App"
- port = "${WEB_DEV_PORT:8910}"
- apiUrl = "/.redwood/functions"
- # highlight-next-line
- includeEnvironmentVariables = ["GITHUB_OAUTH_CLIENT_ID", "GITHUB_OAUTH_REDIRECT_URI", "GITHUB_OAUTH_SCOPES"]
-[api]
- port = "${API_DEV_PORT:8911}"
-[browser]
- open = true
-[notifications]
- versionUpdates = ["latest"]
-```
-
-Restart your dev server to pick up the new TOML settings, and your link should appear:
-
-![Login button](https://user-images.githubusercontent.com/300/245899085-0b946a14-cd7c-402a-9d86-b6527fd89c7f.png)
-
-Go ahead and click it, and you should be taken to GitHub to authorize your GitHub login to work with your app. You'll see the scopes we requested listed under the **Personal User Data** heading:
-
-![GitHub Oauth Access Page](https://user-images.githubusercontent.com/300/245899872-8ddd7e69-dbfa-4544-ab6f-78fd4ff02da8.png)
-
-:::warning
-
-If you get an error here that says "The redirect_uri MUST match the registered callback URL for this application" verify that the redirect URL you entered on GitHub and the one you put into the `GITHUB_OAUTH_REDIRECT_URI` ENV var are identical!
-
-:::
-
-Click **authorize** and you should end up seeing some JSON, and an error:
-
-![/oauth function not found](https://user-images.githubusercontent.com/300/245900327-b21a178e-5539-4c6d-a5d6-9bb736100940.png)
-
-That's coming from our app because we haven't created the `oauth` function that GitHub redirects to. But you'll see a `code` in the URL, which means GitHub is happy with our flow so far. Now we need to trade that `code` for an `access_token`. We'll do that in our `/oauth` function.
-
-:::info
-This nicely formatted JSON comes from the [JSON Viewer](https://chrome.google.com/webstore/detail/json-viewer/gbmdgpbipfallnflgajpaliibnhdgobh) Chrome extension.
-:::
-
-## The `/oauth` Function
-
-We can have Redwood generate a shell of our new function for us:
-
-```bash
-yarn rw g function oauth
-```
-
-That will create the function at `/api/src/functions/oauth/oauth.js`. If we retry the **Login with GitHub** button now, we'll see the output of that function instead of the error:
-
-![Oauth function responding](https://user-images.githubusercontent.com/300/245903068-760596fa-4139-4d11-b3b3-a90edfbbf496.png)
-
-Now let's start filling out this function with the code we need to get the `access_token`.
-
-### Fetching the `access_token`
-
-We told GitHub to redirect to `/oauth/callback`, which *looks* like it would be a subdirectory, or child route, of our `oauth` function, but in reality everything after `/oauth` just gets put into an `event.path` variable that we'll need to inspect to make sure it has the proper parts (like `/callback`). We can do that in the `handler()`:
-
-```js title="/api/src/functions/oauth/oauth.js"
-export const handler = async (event, _context) => {
- switch (event.path) {
- case '/oauth/callback':
- return await callback(event)
- default:
- // Whatever this is, it's not correct, so return "Not Found"
- return {
- statusCode: 404,
- }
- }
-}
-
-const callback = async (event) => {
- return { body: 'ok' }
-}
-```
-
-The `callback()` function is where we'll define the rest of our flow. We can verify this is working by trying a couple of different URLs in the browser and seeing that `/oauth/callback` returns a 200 with "ok" in the body of the page, while anything else returns a 404.
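The path check above is simple enough to verify in isolation. Here's the same logic extracted as a pure function (a sketch for experimentation, not part of the app's code):

```javascript
// Mirror the handler's path check: only /oauth/callback is a valid route;
// anything else gets a 404.
const routeOauth = (path) => {
  switch (path) {
    case '/oauth/callback':
      return { statusCode: 200, body: 'ok' }
    default:
      return { statusCode: 404 }
  }
}

console.log(routeOauth('/oauth/callback').statusCode) // 200
console.log(routeOauth('/oauth/whatever').statusCode) // 404
```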
-
-Now we need to make a request to GitHub to trade the `code` for an `access_token`. This is handled by a `fetch`:
-
-```js title="/api/src/functions/oauth/oauth.js"
-const callback = async (event) => {
- // highlight-start
- const { code } = event.queryStringParameters
-
- const response = await fetch(`https://github.com/login/oauth/access_token`, {
- method: 'POST',
- headers: {
- Accept: 'application/json',
- 'Content-Type': 'application/json',
- },
- body: JSON.stringify({
- client_id: process.env.GITHUB_OAUTH_CLIENT_ID,
- client_secret: process.env.GITHUB_OAUTH_CLIENT_SECRET,
- redirect_uri: process.env.GITHUB_OAUTH_REDIRECT_URI,
- code,
- }),
- })
-
- const { access_token, scope, error } = JSON.parse(await response.text())
-
- if (error) {
-    return { statusCode: 400, body: error }
- }
-
- return {
- body: JSON.stringify({ access_token, scope, error })
- }
- // highlight-end
-}
-```
-
-First we get the `code` out of the query string variables, then make a POST `fetch()` to GitHub, setting the required JSON body to include several of the ENV vars we've set, as well as the `code` we got from the GitHub redirect. Then we parse the response JSON and just return it in the browser to make sure it worked. If something went wrong (`error` is not `undefined`) then we'll output the error message in the body of the page.
-
-Let's try it: go back to the login page, click the **Login with GitHub** button and see what happens:
-
-![GitHub OAuth access_token granted](https://user-images.githubusercontent.com/300/245906529-d08f9d6e-4947-4d14-9377-def3645d9c68.png)
-
-You can also verify that the error response works by, for example, removing the `code` key from the `fetch()`, and see GitHub complain:
-
-![GitHub OAuth error](https://user-images.githubusercontent.com/300/245906827-703a4a21-b279-428c-be1c-b73c559a72b3.png)
-
-Great, GitHub has authorized us and now we can get details about the actual user from GitHub.
-
-### Retrieving GitHub User Details
-
-We need some unique identifier to tie a user in GitHub to a user in our database. The `access_token` we retrieved allows us to make requests to GitHub's API and return data for the user that the `access_token` is attached to, up to the limits of the `scopes` we requested. GitHub has a unique user `id` that we can use to tie the two together. Let's request that data and dump it to the browser so we can see that the request works.
-
-To keep things straight in our heads, let's call our local user `user` and the GitHub user the `providerUser` (since GitHub is "providing" the OAuth credentials).
-
-Let's make the API call to GitHub's user info endpoint and dump the result to the browser:
-
-```js title="/api/src/functions/oauth/oauth.js"
-const callback = async (event) => {
- const { code } = event.queryStringParameters
-
- const response = await fetch(`https://github.com/login/oauth/access_token`, {
- method: 'POST',
- headers: {
- Accept: 'application/json',
- 'Content-Type': 'application/json',
- },
- body: JSON.stringify({
- client_id: process.env.GITHUB_OAUTH_CLIENT_ID,
- client_secret: process.env.GITHUB_OAUTH_CLIENT_SECRET,
- redirect_uri: process.env.GITHUB_OAUTH_REDIRECT_URI,
- code,
- }),
- })
-
- const { access_token, scope, error } = JSON.parse(await response.text())
-
- if (error) {
-    return { statusCode: 400, body: error }
- }
-
- // highlight-start
- try {
- const providerUser = await getProviderUser(access_token)
- return {
- body: JSON.stringify(providerUser)
- }
- } catch (e) {
-    return { statusCode: 500, body: e.message }
- }
- // highlight-end
-}
-
-// highlight-start
-const getProviderUser = async (token) => {
- const response = await fetch('https://api.github.com/user', {
- headers: { Authorization: `Bearer ${token}` },
- })
- const body = JSON.parse(await response.text())
-
- return body
-}
-// highlight-end
-```
-
-If all went well you should get a ton of juicy data:
-
-![GitHub user output](https://user-images.githubusercontent.com/300/245909925-c984eeb4-f172-46f6-8102-297b72e26bbd.png)
-
-If something went wrong with the fetch you should get a 500 and the error message output in the body. Try setting the `token` in the `Authorization` header to something like `foobar` to verify:
-
-![GitHub API error](https://user-images.githubusercontent.com/300/245910198-2975e90e-9af1-49b1-a41a-81b9269ff71d.png)
-
-Great, we've got the user data, now what do we do with it?
-
-### Database Updates
-
-We've got a bunch of user data that we can use to create a `User` in our own database. But we'll want to look up that same user in the future when they log back in. We have a couple of ways we can go about doing this:
-
-1. Keep our `User` model as-is and create the user in our local database. When the user logs in again, look them up by the email address stored in GitHub. **Cons:** If the user changes their email in GitHub we won't be able to find them the next time they log in, and we would create a duplicate user.
-2. Keep the `User` model as-is but create the user with the same `id` as the one we get from GitHub. **Cons:** If we keep username/password login, we would need to create new users with an `id` that won't ever clash with the ones from GitHub.
-3. Add a column to `User` like `githubId` that stores the GitHub `id` so that we can find the user again the next time they come to log in. **Cons:** If we add more providers in the future we'll need to keep adding new `*Id` columns for each.
-4. Create a new one-to-many relationship model that stores the GitHub `id` as a single row, tied to the `userId` of the `User` table, with a new row for each ID from any future providers. **Cons:** More complex data structure.
-
-Option #4 will be the most flexible going forward if we ever decide to add more OAuth providers. And if my experience is any indication, everyone always wants more login providers.
-
-So let's create a new `Identity` table that stores the name of the provider and the ID in that system. Logically it will look like this:
-
-```
-┌───────────┐ ┌────────────┐
-│ User │ │ Identity │
-├───────────┤ ├────────────┤
-│ id │•──┐ │ id │
-│ name │ └──<│ userId │
-│ email │ │ provider │
-│ ... │ │ uid │
-└───────────┘ │ ... │
- └────────────┘
-```
-
-For now `provider` will always be `github` and `uid` will be GitHub's unique ID. `uid` should be a `String`, because although GitHub's IDs are integers, not every OAuth provider is guaranteed to use ints.
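To illustrate why normalizing `uid` to a string matters: lookups against the `[provider, uid]` pair only behave consistently if every provider's ID is stored as the same type. A tiny sketch (the `identityKey` helper is hypothetical, just for demonstration):

```javascript
// A composite lookup key in the spirit of Prisma's @@unique([provider, uid]).
// Coercing uid to a string means GitHub's numeric IDs and other providers'
// string IDs compare the same way.
const identityKey = (provider, uid) => `${provider}:${String(uid)}`

console.log(identityKey('github', 123456)) // github:123456
console.log(identityKey('github', 123456) === identityKey('github', '123456')) // true
```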
-
-#### Prisma Schema Updates
-
-Here's the `Identity` model definition:
-
-```prisma title="/api/db/schema.prisma"
-model Identity {
- id Int @id @default(autoincrement())
- provider String
- uid String
- userId Int
- user User @relation(fields: [userId], references: [id])
- accessToken String?
- scope String?
- lastLoginAt DateTime @default(now())
- createdAt DateTime @default(now())
- updatedAt DateTime @updatedAt
-
- @@unique([provider, uid])
- @@index(userId)
-}
-```
-
-We're also storing the `accessToken` and `scope` that we got back the last time we retrieved them from GitHub, as well as a timestamp for the last time the user logged in. Storing the `scope` is useful because if you ever change the requested scopes, you may want to ask users who authorized the previous set to log in again so the new scopes can be authorized.
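As an example of what that stored `scope` enables, a future login check could compare it against the currently required scopes and decide whether to force re-authorization. This helper is our own sketch, not part of the recipe's code:

```javascript
// Compare a stored scope string against the currently required scopes.
// Scope strings are space-separated and order-insensitive.
const needsReauth = (storedScope, requiredScope) => {
  const stored = new Set(storedScope.split(' ').filter(Boolean))
  // Re-auth is needed if any required scope is missing from the stored set.
  return requiredScope
    .split(' ')
    .filter(Boolean)
    .some((scope) => !stored.has(scope))
}

console.log(needsReauth('read:user user:email', 'read:user user:email')) // false
console.log(needsReauth('read:user', 'read:user user:email')) // true
```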
-
-:::caution
-
-There's no GraphQL SDL tied to the Identity table, so it is not accessible via our API. But, if you ever did create an SDL and service, be sure that `accessToken` is not in the list of fields exposed publicly!
-
-:::
-
-We'll need to add an `identities` relation to the `User` model, and make the previously required `hashedPassword` and `salt` fields optional (since users may want to *only* authenticate via GitHub, they'll never get to enter a password):
-
-```prisma title="/api/db/schema.prisma"
-model User {
- id Int @id @default(autoincrement())
- email String @unique
- // highlight-start
- hashedPassword String?
- salt String?
- identities Identity[]
- // highlight-end
- ...
-}
-```
-
-Save these as a migration and apply them to the database:
-
-```bash
-yarn rw prisma migrate dev
-```
-
-Give it a name like "create identity". That's it for the database. Let's return to the `/oauth` function and start working with our new `Identity` model.
-
-### Creating Users and Identities
-
-On a successful GitHub OAuth login we'll want to first check whether a user already exists with the provider info. If so, we can go ahead and log them in. If not, we'll need to create the user first, then log them in.
-
-Let's add some code that returns the user if found, otherwise it creates the user *and* returns it, so that the rest of our code doesn't have to care.
-
-:::info
-Be sure to import `db` at the top of the file if you haven't already!
-:::
-
-```js title="/api/src/functions/oauth/oauth.js"
-// highlight-next-line
-import { db } from 'src/lib/db'
-
-const callback = async (event) => {
- const { code } = event.queryStringParameters
-
- const response = await fetch(`https://github.com/login/oauth/access_token`, {
- method: 'POST',
- headers: {
- Accept: 'application/json',
- 'Content-Type': 'application/json',
- },
- body: JSON.stringify({
- client_id: process.env.GITHUB_OAUTH_CLIENT_ID,
- client_secret: process.env.GITHUB_OAUTH_CLIENT_SECRET,
- redirect_uri: process.env.GITHUB_OAUTH_REDIRECT_URI,
- code,
- }),
- })
-
- const { access_token, scope, error } = JSON.parse(await response.text())
-
- if (error) {
-    return { statusCode: 400, body: error }
- }
-
- try {
- const providerUser = await getProviderUser(access_token)
- // highlight-start
- const user = await getUser({ providerUser, accessToken: access_token, scope })
- return {
- body: JSON.stringify(user)
- }
- // highlight-end
- } catch (e) {
-    return { statusCode: 500, body: e.message }
- }
-}
-
-// highlight-start
-const getUser = async ({ providerUser, accessToken, scope }) => {
- const { user, identity } = await findOrCreateUser(providerUser)
-
- await db.identity.update({
- where: { id: identity.id },
- data: { accessToken, scope, lastLoginAt: new Date() },
- })
-
- return user
-}
-// highlight-end
-
-// highlight-start
-const findOrCreateUser = async (providerUser) => {
- const identity = await db.identity.findFirst({
- where: { provider: 'github', uid: providerUser.id.toString() }
- })
-
- if (identity) {
- // identity exists, return the user
- const user = await db.user.findUnique({ where: { id: identity.userId }})
- return { user, identity }
- }
-
- // identity not found, need to create it and the user
- return await db.$transaction(async (tx) => {
- const user = await tx.user.create({
- data: {
- email: providerUser.email,
- fullName: providerUser.name,
- },
-      })
-
- const identity = await tx.identity.create({
- data: {
- userId: user.id,
- provider: 'github',
- uid: providerUser.id.toString()
- }
- })
-
- return { user, identity }
- })
-}
-// highlight-end
-```
-
-Let's break that down.
-
-```js
-const providerUser = await getProviderUser(access_token)
-// highlight-next-line
-const user = await getUser({ providerUser, accessToken: access_token, scope })
-return {
- body: JSON.stringify(user)
-}
-```
-
-After getting the `providerUser` we're going to find our local `user`, and then dump the user to the browser to verify.
-
-```js
-const getUser = async ({ providerUser, accessToken, scope }) => {
-  const { user, identity } = await findOrCreateUser(providerUser)
-
- await db.identity.update({
- where: { id: identity.id },
- data: { accessToken, scope, lastLoginAt: new Date() },
- })
-
- return user
-}
-```
-
-The `getUser()` function is going to return the user, whether it had to be created or not. Either way, the attached identity is updated with the current value for the `access_token` (note the case change, try not to get confused!), as well as the `scope` and `lastLoginAt` timestamp. `findOrCreateUser()` is going to do the heavy lifting:
-
-```js
-const findOrCreateUser = async (providerUser) => {
- const identity = await db.identity.findFirst({
- where: { provider: 'github', uid: providerUser.id.toString() }
- })
-
- if (identity) {
- const user = await db.user.findUnique({ where: { id: identity.userId }})
- return { user, identity }
- }
-
- // ...
-}
-```
-
-If the user already exists, great! Return it, and the attached `identity` so that we can update the details. If the user doesn't exist already:
-
-```js
-const findOrCreateUser = async (providerUser) => {
- // ...
-
- return await db.$transaction(async (tx) => {
- const user = await tx.user.create({
- data: {
- email: providerUser.email,
- fullName: providerUser.name,
- },
-      })
-
- const identity = await tx.identity.create({
- data: {
- userId: user.id,
- provider: 'github',
- uid: providerUser.id.toString()
- }
- })
-
- return { user, identity }
- })
-}
-```
-
-We create the `user` and the `identity` records inside a transaction so that if something goes wrong, neither record is created. The error would bubble up to the try/catch inside `callback()`. (The Redwood test project has a required `fullName` field that we fill with the `name` attribute from GitHub.)
-
-:::info
-Don't forget the `toString()` calls whenever we read or write the `providerUser.id` since we made the `uid` of type `String`.
-:::
-
-If everything worked then on clicking **Login with GitHub** we should now see a dump of the actual user from our local database:
-
-![User details](https://user-images.githubusercontent.com/300/245922971-caaeb3ed-9231-4edf-aac5-9ea76b488824.png)
-
-You can take a look in the database and verify that the User and Identity were created. Start up [Prisma Studio](https://www.prisma.io/studio) (which is already included with Redwood):
-
-```bash
-yarn rw prisma studio
-```
-
-![Inspecting the Identity record](https://user-images.githubusercontent.com/300/245923393-d61233cc-52d2-4568-858e-9059dfe31bfc.png)
-
-Great! But, if you go back to your homepage, you'll find that you're not actually logged in. That's because we're not setting the cookie that dbAuth expects to see to consider you logged in. Let's do that, and then our login will be complete!
-
-### Setting the Login Cookie
-
-In order to let dbAuth do the work of actually considering us logged in (and handling things like reauthentication and logout), we'll set the same cookie that dbAuth would have set on a successful username/password login.
-
-Setting a cookie in the browser is a matter of returning a `Set-Cookie` header in the response from the server. We've been responding with a dump of the user object, but now we'll do a real return, including the cookie and a `Location` header to redirect us back to the site.
-
-Don't forget the new `CryptoJS` import at the top!
-
-```js title="/api/src/functions/oauth/oauth.js"
-// highlight-next-line
-import CryptoJS from 'crypto-js'
-
-const callback = async (event) => {
- const { code } = event.queryStringParameters
-
- const response = await fetch(`https://github.com/login/oauth/access_token`, {
- method: 'POST',
- headers: {
- Accept: 'application/json',
- 'Content-Type': 'application/json',
- },
- body: JSON.stringify({
- client_id: process.env.GITHUB_OAUTH_CLIENT_ID,
- client_secret: process.env.GITHUB_OAUTH_CLIENT_SECRET,
- redirect_uri: process.env.GITHUB_OAUTH_REDIRECT_URI,
- code,
- }),
- })
-
- const { access_token, scope, error } = JSON.parse(await response.text())
-
- if (error) {
-    return { statusCode: 400, body: error }
- }
-
- try {
- const providerUser = await getProviderUser(access_token)
- const user = await getUser({
- providerUser,
- accessToken: access_token,
- scope,
- })
- // highlight-start
- const cookie = secureCookie(user)
-
- return {
- statusCode: 302,
- headers: {
- 'Set-Cookie': cookie,
- Location: '/',
- },
- }
- // highlight-end
- } catch (e) {
-    return { statusCode: 500, body: e.message }
- }
-}
-
-// highlight-start
-const secureCookie = (user) => {
- const expires = new Date()
- expires.setFullYear(expires.getFullYear() + 1)
-
- const cookieAttrs = [
- `Expires=${expires.toUTCString()}`,
- 'HttpOnly=true',
- 'Path=/',
- 'SameSite=Strict',
- `Secure=${process.env.NODE_ENV !== 'development'}`,
- ]
- const data = JSON.stringify({ id: user.id })
-
- const encrypted = CryptoJS.AES.encrypt(
- data,
- process.env.SESSION_SECRET
- ).toString()
-
- return [`session=${encrypted}`, ...cookieAttrs].join('; ')
-}
-// highlight-end
-```
-
-`secureCookie()` takes care of creating the cookie that matches the one set by dbAuth. The attributes that we're setting are actually a copy of the ones set in the `authHandler` in `/api/src/functions/auth.js` and you could remove some duplication between the two by exporting the `cookie` object from `auth.js` and then importing it and using it here. We've set the cookie to expire in a year because, let's admit it, no one likes having to log back in again.
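The attribute handling in `secureCookie()` is plain string manipulation, so it's easy to sanity-check on its own. Here's a sketch with a fake session value in place of the encrypted payload (no crypto involved; `buildCookie` is a hypothetical stand-in, and the `isDev` flag substitutes for the `NODE_ENV` check):

```javascript
// Build a cookie string of the same shape secureCookie() returns.
const buildCookie = (sessionValue, isDev) => {
  const expires = new Date()
  expires.setFullYear(expires.getFullYear() + 1)

  const cookieAttrs = [
    `Expires=${expires.toUTCString()}`,
    'HttpOnly=true',
    'Path=/',
    'SameSite=Strict',
    `Secure=${!isDev}`, // Secure only outside of development
  ]

  return [`session=${sessionValue}`, ...cookieAttrs].join('; ')
}

const cookie = buildCookie('fake-encrypted-value', true)
console.log(cookie.startsWith('session=fake-encrypted-value; ')) // true
console.log(cookie.includes('Secure=false')) // true (dev mode)
```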
-
-At the end of `callback()` we set the `Set-Cookie` and `Location` headers to send the browser back to the homepage of our app.
-
-Try it out, and as long as you have an indication on your site that a user is logged in, you should see it! In the case of the test project, you'll see "Log Out" at the right side of the top nav instead of "Log In". Try logging out and then back again to test the whole flow from scratch.
-
-## The Complete `/oauth` Function
-
-Here's the `oauth` function in its entirety:
-
-```jsx title="/api/src/functions/oauth/oauth.js"
-import CryptoJS from 'crypto-js'
-
-import { db } from 'src/lib/db'
-
-export const handler = async (event, _context) => {
- switch (event.path) {
- case '/oauth/callback':
- return await callback(event)
- default:
- // Whatever this is, it's not correct, so return "Not Found"
- return {
- statusCode: 404,
- }
- }
-}
-
-const callback = async (event) => {
- const { code } = event.queryStringParameters
-
- const response = await fetch(`https://github.com/login/oauth/access_token`, {
- method: 'POST',
- headers: {
- Accept: 'application/json',
- 'Content-Type': 'application/json',
- },
- body: JSON.stringify({
- client_id: process.env.GITHUB_OAUTH_CLIENT_ID,
- client_secret: process.env.GITHUB_OAUTH_CLIENT_SECRET,
- redirect_uri: process.env.GITHUB_OAUTH_REDIRECT_URI,
- code,
- }),
- })
-
- const { access_token, scope, error } = JSON.parse(await response.text())
-
- if (error) {
-    return { statusCode: 400, body: error }
- }
-
- try {
- const providerUser = await getProviderUser(access_token)
- const user = await getUser({
- providerUser,
- accessToken: access_token,
- scope,
- })
- const cookie = secureCookie(user)
-
- return {
- statusCode: 302,
- headers: {
- 'Set-Cookie': cookie,
- Location: '/',
- },
- }
- } catch (e) {
-    return { statusCode: 500, body: e.message }
- }
-}
-
-const secureCookie = (user) => {
- const expires = new Date()
- expires.setFullYear(expires.getFullYear() + 1)
-
- const cookieAttrs = [
- `Expires=${expires.toUTCString()}`,
- 'HttpOnly=true',
- 'Path=/',
- 'SameSite=Strict',
- `Secure=${process.env.NODE_ENV !== 'development'}`,
- ]
- const data = JSON.stringify({ id: user.id })
-
- const encrypted = CryptoJS.AES.encrypt(
- data,
- process.env.SESSION_SECRET
- ).toString()
-
- return [`session=${encrypted}`, ...cookieAttrs].join('; ')
-}
-
-const getProviderUser = async (token) => {
- const response = await fetch('https://api.github.com/user', {
- headers: { Authorization: `Bearer ${token}` },
- })
- const body = JSON.parse(await response.text())
-
- return body
-}
-
-const getUser = async ({ providerUser, accessToken, scope }) => {
- const { user, identity } = await findOrCreateUser(providerUser)
-
- await db.identity.update({
- where: { id: identity.id },
- data: { accessToken, scope, lastLoginAt: new Date() },
- })
-
- return user
-}
-
-const findOrCreateUser = async (providerUser) => {
- const identity = await db.identity.findFirst({
- where: { provider: 'github', uid: providerUser.id.toString() },
- })
-
- if (identity) {
- // identity exists, return the user
- const user = await db.user.findUnique({ where: { id: identity.userId } })
- return { user, identity }
- }
-
- // identity not found, need to create it and the user
- return await db.$transaction(async (tx) => {
- const user = await tx.user.create({
- data: {
- email: providerUser.email,
- fullName: providerUser.name,
- },
- })
-
- const identity = await tx.identity.create({
- data: {
- userId: user.id,
- provider: 'github',
- uid: providerUser.id.toString(),
- },
- })
-
- return { user, identity }
- })
-}
-```
-
-## Enhancements
-
-This is a barebones implementation of a single OAuth provider. What can we do to make it better?
-
-### More Providers
-
-We hardcoded "github" as the provider in a couple of places, as well as hardcoding GitHub's API endpoint for fetching user data. That obviously limits this implementation to only support GitHub.
-
-A more flexible version could include the provider as part of the callback URL, so our code can see which provider to use and how to fetch user details. Maybe the OAuth redirects are `/oauth/github/callback` and `/oauth/twitter/callback`. Then parse the provider out of the path and delegate to a different function altogether, or implement each provider's specifics in separate files and `import` them into the `/oauth` function, invoking each as needed.
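Extracting the provider name from such a path is worth pinning down precisely. A sketch, assuming the hypothetical `/oauth/<provider>/callback` route naming:

```javascript
// Parse "/oauth/<provider>/callback" into its provider name, or null if the
// path doesn't match that shape.
const providerFromPath = (path) => {
  const match = path.match(/^\/oauth\/(\w+)\/callback$/)
  return match ? match[1] : null
}

console.log(providerFromPath('/oauth/github/callback')) // github
console.log(providerFromPath('/oauth/twitter/callback')) // twitter
console.log(providerFromPath('/oauth/callback')) // null
```

The single-provider handler's `switch` on `event.path` could then dispatch on this result instead of a hardcoded string.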
-
-### Changing User Details
-
-Right now we just copy the user details from GitHub straight into our new `User` record. Maybe we want to give the user a chance to update those details first, or add more information, before saving to the database. One solution: create the `Identity` record, but redirect to your real Signup page with the info from GitHub, prefilling the signup fields and putting the `accessToken` in a hidden field. The user gets a chance to change or enhance the details before submitting. Then when the user submits that form, if the `accessToken` is present, get the user details from GitHub again (so we can get their GitHub `id`) and create the `Identity` and `User` records as before.
-
-### Better Error Handling
-
-Right now if an error occurs in the OAuth flow, the browser just stays on the `/oauth/callback` function and sees a plain text error message. A better experience would be to redirect the user back to the login page with the error message in a query string variable, something like `http://localhost:8910/login?error=Application+not+authorized`. Then in the LoginPage, add a `useParams()` call to pull out the query variables, and show a toast message if an error is present:
-
-```jsx
-import { useEffect } from 'react'
-
-import { useParams } from '@redwoodjs/router'
-import { toast, Toaster } from '@redwoodjs/web/toast'
-
-const LoginPage = () => {
-  const params = useParams()
-
-  useEffect(() => {
-    if (params.error) {
-      toast.error(params.error)
-    }
-  }, [params])
-
-  return (
-    <>
-      <Toaster />
-      {/* ... */}
-    </>
-  )
-}
-```
diff --git a/docs/versioned_docs/version-7.0/how-to/supabase-auth.md b/docs/versioned_docs/version-7.0/how-to/supabase-auth.md
deleted file mode 100644
index f5425b735daa..000000000000
--- a/docs/versioned_docs/version-7.0/how-to/supabase-auth.md
+++ /dev/null
@@ -1,685 +0,0 @@
-# Supabase Auth
-
-Let's call this how to a port of the [Redwood GoTrue Auth how to](gotrue-auth.md) to [Supabase](https://supabase.io/).
-I won't get original style points because I copy-pasted (and updated, for good measure) the original.
-Why? Because Supabase auth is based on [Netlify GoTrue](https://github.com/netlify/gotrue), an API service for handling user registration and authentication. The Supabase folks build on solid open-source foundations.
-
-Once I connected these dots, the Redwood GoTrue Auth how to became a handy resource as I climbed the auth learning curve (and I started from sea level). Hopefully this Supabase-specific edition will help you climb your own too.
-
-## Time to Cook
-
-In this recipe, we'll:
-
-- Configure a Redwood app with Supabase auth
-- Create a Sign Up form, a Sign In form, and a Sign Out button
-- Add auth links that display the correct buttons based on our auth state
-
-But first, some housekeeping...
-
-## Prerequisites
-
-Before getting started, there are a few steps you should complete:
-
-- [Create a Redwood app](../tutorial/chapter1/installation.md)
-- [Create a Supabase account](https://www.supabase.io/)
-- [Go through the Supabase React Quick Start](https://supabase.io/docs/guides/with-react)
-- [Go through the Supabase Redwood Quick Start](https://supabase.io/docs/guides/with-redwoodjs)
-- Fire up a dev server: `yarn redwood dev`
-
-### About the Supabase Quick Starts
-
-Why the React Quick Start before the Redwood one? I found it helpful to first interact directly with the [Supabase Client](https://github.com/supabase/supabase-js). Eventually, you'll use the [Redwood Auth wrapper](../authentication.md#supabase), which provides a level of abstraction and a clean, consistent style. But I needed a couple hours of direct client experimentation before I felt comfortable with the Redwood one.
-
-So, just this once, I hereby give you permission to fire up Create React App as you follow along with the Supabase React Quick Start. I worked through it first. Then I worked through the Supabase Redwood Quick Start, observing the slight differences. This helped me understand the details that the Redwood wrapper abstracts for us.
-
-> **Auth Alphabet Soup**
->
-> If you're like me—and I'm pretty sure I'm just human—you may find yourself spinning in jumbled auth jargon. Hang in there, you'll get your auth ducks lined up eventually.
->
-> I'm proud to tell you that I now know that the Redwood Supabase auth client wraps the Supabase GoTrueJS client, which is a fork of Netlify’s GoTrueJS client (which is different from Netlify Identity). And dbAuth is a totally separate auth option. Plus, I'll keep it simple and not use RBAC at the moment.
->
-> Ahhh! It took me a few weeks to figure this out.
-
-## Back to Redwood
-
-Armed with some knowledge and insight from going through the Supabase Quick Starts, let's head back to the Redwood app created as part of the prerequisites.
-
-Start by installing the required packages and generating boilerplate for Redwood Auth, all with this simple [CLI command](../cli-commands.md#setup-auth):
-
-```bash
-yarn redwood setup auth supabase
-```
-
-By specifying `supabase` as the provider, Redwood automatically added the necessary Supabase config to our app. Let's open up `web/src/App.[js/tsx]` and inspect. You should see:
-
-```jsx {1-2,12,17} title="web/src/App.[js/tsx]"
-import { AuthProvider } from '@redwoodjs/auth'
-import { createClient } from '@supabase/supabase-js'
-
-import { FatalErrorBoundary, RedwoodProvider } from '@redwoodjs/web'
-import { RedwoodApolloProvider } from '@redwoodjs/web/apollo'
-
-import FatalErrorPage from 'src/pages/FatalErrorPage'
-import Routes from 'src/Routes'
-
-import './index.css'
-
-const supabaseClient = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_KEY)
-
-const App = () => (
-  <FatalErrorBoundary page={FatalErrorPage}>
-    <RedwoodProvider titleTemplate="%PageTitle | %AppTitle">
-      <AuthProvider client={supabaseClient} type="supabase">
-        <RedwoodApolloProvider>
-          <Routes />
-        </RedwoodApolloProvider>
-      </AuthProvider>
-    </RedwoodProvider>
-  </FatalErrorBoundary>
-)
-
-export default App
-```
-
-Now it's time to add the Supabase URL, public API key, and JWT secret (`SUPABASE_URL`, `SUPABASE_KEY`, and `SUPABASE_JWT_SECRET`) to your `.env` file.
-You can find these items in your Supabase management console, under **Settings > API**:
-
-![Supabase console screen shot](https://user-images.githubusercontent.com/43206213/146407575-71ad2c94-8fa6-48d2-a403-d249f75569ea.png)
-
-Here's a `.env` example:
-
-```bash
-# .env (in your root project directory)
-
-SUPABASE_URL=https://replacewithyoursupabaseurl.supabase.co
-SUPABASE_KEY=eyJhb_replace_VCJ9.eyJy_with_your_wfQ.0Abb_anon_key_teLJs
-SUPABASE_JWT_SECRET=eyJh_replace_CJ9.eyJy_with_your_NTQwOTB9.MGNZN_JWT_secret_JgErqxj4
-```
-
-That's (almost) all for configuration.
-
-## Sign Up
-
-Sign Up feels like an appropriate place to start building our interface.
-Our first iteration won't include features like email confirmation or password recovery.
-To forgo email confirmation, turn off "Enable email confirmations" on your Supabase management console, found under `Authentication > Settings`:
-
-![Supabase email confirmation toggle](https://user-images.githubusercontent.com/43206213/147164458-1b6723ef-d7dd-4c7c-b228-73ca4ba7b1ff.png)
-
-_Now_ we're done with configuration. Back to our app...
-
-## The Sign Up Page
-
-Let's generate a Sign Up page:
-
-```bash
-yarn redwood generate page signup
-```
-
-This adds a Sign Up [route](../router.md) to our routes file and creates a `SignupPage` component.
-
-In the just-generated `SignupPage` component (`web/src/pages/SignupPage/SignupPage.[js/tsx]`), let's import some [Redwood Form components](../forms.md) and make a very basic form:
-
-```jsx title="web/src/pages/SignupPage/SignupPage.[js/tsx]"
-import { Form, TextField, PasswordField, Submit } from '@redwoodjs/forms'
-
-const SignupPage = () => {
-  return (
-    <>
-      <h1>Sign Up</h1>
-      <Form>
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign Up</Submit>
-      </Form>
-    </>
-  )
-}
-
-export default SignupPage
-```
-
-Did I mention it was basic? If you want to add some polish, you might find both the [Redwood Form docs](../forms.md) and the [tutorial section on forms](../tutorial/chapter3/forms.md) quite useful. For our purposes, let's just focus on the functionality.
-
-Now that we have a form interface, we're going to want to do something when the user submits it. Let's add an `onSubmit` function to our component and pass it as a prop to our Form component:
-
-```jsx {4-6,11} title="web/src/pages/SignupPage/SignupPage.[js/tsx]"
-// ...
-
-const SignupPage = () => {
-  const onSubmit = (data) => {
-    // do something here
-  }
-
-  return (
-    <>
-      <h1>Sign Up</h1>
-      <Form onSubmit={onSubmit}>
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign Up</Submit>
-      </Form>
-    </>
-  )
-}
-
-//...
-```
-
-The _something_ we need to do is—surprise!—sign up. To do this, we'll need a way to communicate with `<AuthProvider>` and the Supabase GoTrue-JS client we passed to it. Look no further than the [`useAuth` hook](../authentication.md#api), which lets us subscribe to our auth state and its properties. In our case, we'll be glad to now have access to `client` and, thus, our Supabase GoTrue-JS instance and [all of its functions](https://github.com/supabase/supabase-js).
-
-Let's import `useAuth` and destructure `client` from it in our component:
-
-```jsx {2,5} title="web/src/pages/SignupPage/SignupPage.js"
-import { Form, TextField, PasswordField, Submit } from '@redwoodjs/forms'
-import { useAuth } from '@redwoodjs/auth'
-
-const SignupPage = () => {
-  const { client } = useAuth()
-  const onSubmit = (data) => {
-    // do something here
-  }
-
-  return (
-    <>
-      <h1>Sign Up</h1>
-      <Form onSubmit={onSubmit}>
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign Up</Submit>
-      </Form>
-    </>
-  )
-}
-
-export default SignupPage
-```
-
-And now we'll attempt to create a new user in the `onSubmit` function with [`client.auth.signUp()`](https://supabase.io/docs/reference/javascript/auth-signup) by passing the `email` and `password` values that we captured from our form:
-
-```jsx {8-16} title="web/src/pages/SignupPage/SignupPage.[js/tsx]"
-import { Form, TextField, PasswordField, Submit } from '@redwoodjs/forms'
-import { useAuth } from '@redwoodjs/auth'
-
-const SignupPage = () => {
-  const { client } = useAuth()
-
-  const onSubmit = async (data) => {
-    try {
-      const response = await client.auth.signUp({
-        email: data.email,
-        password: data.password
-      })
-      console.log('response: ', response)
-    } catch (error) {
-      console.log('error: ', error)
-    }
-  }
-
-  return (
-    <>
-      <h1>Sign Up</h1>
-      <Form onSubmit={onSubmit}>
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign Up</Submit>
-      </Form>
-    </>
-  )
-}
-export default SignupPage
-```
-
-Presently, our sign up works as is, but simply console-logging the response from `client.auth.signUp()` is hardly useful behavior.
-
-Let's display errors to the user if there are any. To do this, we'll set up `React.useState()` to manage our error state and conditionally render the error message. We'll also want to reset the error state at the beginning of every submission with `setError(null)`:
-
-```jsx {6,9,16,18,26} title="web/src/pages/SignupPage/SignupPage.js"
-import { Form, TextField, PasswordField, Submit } from '@redwoodjs/forms'
-import { useAuth } from '@redwoodjs/auth'
-
-const SignupPage = () => {
-  const { client } = useAuth()
-  const [error, setError] = React.useState(null)
-
-  const onSubmit = async (data) => {
-    setError(null)
-    try {
-      const response = await client.auth.signUp({
-        email: data.email,
-        password: data.password
-      })
-      console.log('response: ', response)
-      response?.error?.message && setError(response.error.message)
-    } catch (error) {
-      setError(error.message)
-    }
-  }
-
-  return (
-    <>
-      <h1>Sign Up</h1>
-      <Form onSubmit={onSubmit}>
-        {error && <p>{error}</p>}
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign Up</Submit>
-      </Form>
-    </>
-  )
-}
-export default SignupPage
-```
-
-> Errors may be returned in two fashions:
->
-> 1. upon promise fulfillment, within the `error` property of the object returned by the promise
->
-> 2. upon promise rejection, within an error returned by the promise (you can handle this via the `catch` block)
-
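-The two styles above can be distilled into a single pattern. This `handleResponse` helper is illustrative only (the how to doesn't build it); it just shows how one `try`/`catch` covers both cases:
-
-```javascript
-// Style 1: the promise fulfills, but carries an `error` property.
-// Style 2: the promise rejects, and we land in the catch block.
-const handleResponse = async (promise) => {
-  try {
-    const response = await promise
-    return response?.error?.message ?? 'success'
-  } catch (error) {
-    return error.message
-  }
-}
-```
-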
-Now we can handle a successful submission. If we sign up without email confirmation, then successful sign up also _signs in_ the user. Once they've signed in, we'll want to redirect them back to our app.
-
-First, if you haven't already, [generate](../cli-commands.md#generate-page) a homepage:
-
-```bash
-yarn redwood generate page home /
-```
-
-Let's import `routes` and `navigate` from [Redwood Router](../router.md#navigate) and use them to redirect to the home page upon successful sign up:
-
-```jsx {3,16} title="web/src/pages/SignupPage/SignupPage.js"
-import { Form, TextField, PasswordField, Submit } from '@redwoodjs/forms'
-import { useAuth } from '@redwoodjs/auth'
-import { routes, navigate } from '@redwoodjs/router'
-
-const SignupPage = () => {
-  const { client } = useAuth()
-  const [error, setError] = React.useState(null)
-
-  const onSubmit = async (data) => {
-    setError(null)
-    try {
-      const response = await client.auth.signUp({
-        email: data.email,
-        password: data.password
-      })
-      response?.error?.message ? setError(response.error.message) : navigate(routes.home())
-    } catch (error) {
-      setError(error.message)
-    }
-  }
-
-  return (
-    <>
-      <h1>Sign Up</h1>
-      <Form onSubmit={onSubmit}>
-        {error && <p>{error}</p>}
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign Up</Submit>
-      </Form>
-    </>
-  )
-}
-export default SignupPage
-```
-
-Hoorah! We've just added a sign up page and created a sign up form. We created a function to sign up users and we redirect users to the home page upon successful submission. Let's move on to Sign In.
-
-## Sign In
-
-Let's get right to it. Start by [generating](../cli-commands.md#generate-page) a sign in page:
-
-```bash
-yarn redwood generate page signin
-```
-
-Next we'll add a basic form with `email` and `password` fields, some error reporting, and a hollow `onSubmit` function:
-
-```jsx title="web/src/pages/SigninPage/SigninPage.[js/tsx]"
-import { Form, TextField, PasswordField, Submit } from '@redwoodjs/forms'
-
-const SigninPage = () => {
-  const [error, setError] = React.useState(null)
-
-  const onSubmit = (data) => {
-    // do sign in here
-  }
-
-  return (
-    <>
-      <h1>Sign In</h1>
-      <Form onSubmit={onSubmit}>
-        {error && <p>{error}</p>}
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign In</Submit>
-      </Form>
-    </>
-  )
-}
-
-export default SigninPage
-```
-
-Then we'll need to import `useAuth` from `@redwoodjs/auth` and destructure `logIn` so that we can use it in our `onSubmit` function:
-
-```jsx {2,5} title="web/src/pages/SigninPage/SigninPage.js"
-import { Form, TextField, PasswordField, Submit } from '@redwoodjs/forms'
-import { useAuth } from '@redwoodjs/auth'
-
-const SigninPage = () => {
-  const { logIn } = useAuth()
-  const [error, setError] = React.useState(null)
-
-  const onSubmit = (data) => {
-    setError(null)
-    // do sign in here
-  }
-
-  return (
-    <>
-      <h1>Sign In</h1>
-      <Form onSubmit={onSubmit}>
-        {error && <p>{error}</p>}
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign In</Submit>
-      </Form>
-    </>
-  )
-}
-
-export default SigninPage
-```
-
-Now we'll add `logIn` to our `onSubmit` function. This time we'll pass an object to our function, since we're using Redwood Auth's `logIn` function directly (as opposed to `client`). This object takes an email and password:
-
-```jsx {10-15} title="web/src/pages/SigninPage/SigninPage.js"
-import { Form, TextField, PasswordField, Submit } from '@redwoodjs/forms'
-import { useAuth } from '@redwoodjs/auth'
-
-const SigninPage = () => {
-  const { logIn } = useAuth()
-  const [error, setError] = React.useState(null)
-
-  const onSubmit = async (data) => {
-    setError(null)
-    try {
-      const response = await logIn({ email: data.email, password: data.password })
-      // do something
-    } catch (error) {
-      setError(error.message)
-    }
-  }
-
-  return (
-    <>
-      <h1>Sign In</h1>
-      <Form onSubmit={onSubmit}>
-        {error && <p>{error}</p>}
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign In</Submit>
-      </Form>
-    </>
-  )
-}
-
-export default SigninPage
-```
-
-Let's redirect our user back to the home page upon a successful login.
-
-In our `SigninPage`, import `navigate` and `routes` from [`@redwoodjs/router`](../router.md) and add them after awaiting `logIn`:
-
-```jsx {10-16} title="web/src/pages/SigninPage/SigninPage.js"
-import { Form, TextField, PasswordField, Submit } from '@redwoodjs/forms'
-import { useAuth } from '@redwoodjs/auth'
-import { navigate, routes } from '@redwoodjs/router'
-
-const SigninPage = () => {
-  const { logIn } = useAuth()
-  const [error, setError] = React.useState(null)
-
-  const onSubmit = async (data) => {
-    setError(null)
-    try {
-      const response = await logIn({ email: data.email, password: data.password })
-      response?.error?.message ? setError(response.error.message) : navigate(routes.home())
-    } catch (error) {
-      setError(error.message)
-    }
-  }
-
-  return (
-    <>
-      <h1>Sign In</h1>
-      <Form onSubmit={onSubmit}>
-        {error && <p>{error}</p>}
-        <TextField name="email" placeholder="email" />
-        <PasswordField name="password" placeholder="password" />
-        <Submit>Sign In</Submit>
-      </Form>
-    </>
-  )
-}
-
-export default SigninPage
-```
-
-Well done! We've created a sign in page and form that successfully handles sign in.
-
-> The remainder of the how to is the same as the [Netlify GoTrue Auth](gotrue-auth.md) version. This highlights one of the fun benefits of the Redwood Auth wrappers: code specific to a certain auth implementation scheme can live in a few specific spots, as we walked through above. Then, general Redwood Auth functions can be used elsewhere in the app.
-
-## Sign Out
-
-Sign Out is by far the easiest to implement. All we need to do is call `useAuth`'s `logOut` method.
-
-Let's start by [generating a component](../cli-commands.md#generate-component) to house our Sign Out Button:
-
-```bash
-yarn redwood generate component signoutBtn
-```
-
-In the `web/src/components/SignoutBtn/SignoutBtn.js` file we just generated, let's render a button and add a click handler:
-
-```jsx title="web/src/components/SignoutBtn/SignoutBtn.[js/tsx]"
-const SignoutBtn = () => {
-  const onClick = () => {
-    // do sign out here.
-  }
-
-  return <button onClick={onClick}>Sign Out</button>
-}
-
-export default SignoutBtn
-```
-
-Now let's import `useAuth` from `@redwoodjs/auth`. We'll destructure its `logOut` method and invoke it in `onClick`:
-
-```jsx {1,4,7} title="web/src/components/SignoutBtn/SignoutBtn.[js/tsx]"
-import { useAuth } from '@redwoodjs/auth'
-
-const SignoutBtn = () => {
-  const { logOut } = useAuth()
-
-  const onClick = () => {
-    logOut()
-  }
-
-  return <button onClick={onClick}>Sign Out</button>
-}
-
-export default SignoutBtn
-```
-
-This works as is, but because the user may be in a restricted part of your app when they sign out, we should make sure to navigate them away from this page:
-
-```jsx {2,8-9} title="web/src/components/SignoutBtn/SignoutBtn.[js/tsx]"
-import { useAuth } from '@redwoodjs/auth'
-import { navigate, routes } from '@redwoodjs/router'
-
-const SignoutBtn = () => {
-  const { logOut } = useAuth()
-
-  const onClick = async () => {
-    await logOut()
-    navigate(routes.home())
-  }
-
-  return <button onClick={onClick}>Sign Out</button>
-}
-
-export default SignoutBtn
-```
-
-And that's it for Sign Out! Err, of course, we're not rendering it anywhere in our app yet. In the next section, we'll add some navigation that conditionally renders the appropriate sign up, sign in, and sign out buttons based on our authentication state.
-
-## Auth Links
-
-In this section we'll implement some auth-related navigation that conditionally renders the correct links and buttons based on the user's authentication state:
-
-- when the user's logged out, we should see **Sign Up** and **Sign In**
-- when the user's logged in, we should see **Log Out**
-
-Let's start by [generating a navigation component](../cli-commands.md#generate-component):
-
-```bash
-yarn redwood generate component navigation
-```
-
-This creates `web/src/components/Navigation/Navigation.js`. In that file, let's import [the `Link` component and the `routes` object](../router.md#link-and-named-route-functions) from `@redwoodjs/router`.
-We'll also import [`useAuth`](../authentication.md#api) since we'll need to subscribe to the auth state for our component to decide what to render:
-
-```jsx title="web/src/components/Navigation/Navigation.js"
-import { Link, routes } from '@redwoodjs/router'
-import { useAuth } from '@redwoodjs/auth'
-
-const Navigation = () => {
-  return <nav></nav>
-}
-
-export default Navigation
-```
-
-Let's destructure `isAuthenticated` from the `useAuth` hook and use it in some conditionals:
-
-```jsx {5,8-12} title="web/src/components/Navigation/Navigation.js"
-import { Link, routes } from '@redwoodjs/router'
-import { useAuth } from '@redwoodjs/auth'
-
-const Navigation = () => {
-  const { isAuthenticated } = useAuth()
-  return (
-    <nav>
-      {isAuthenticated ? (
-        <>{/* render sign out button */}</>
-      ) : (
-        <>{/* render sign up and sign in links */}</>
-      )}
-    </nav>
-  )
-}
-
-export default Navigation
-```
-
-Because Redwood Auth uses [React's Context API](https://reactjs.org/docs/context.html) to manage and broadcast the auth state, we can be confident that `isAuthenticated` will always be up-to-date, even if it changes from within another component in the tree (so long as it's a child of `<AuthProvider>`). In our case, when `isAuthenticated` changes, React will auto-magically take care of rendering the appropriate components.
-
-Now let's import our sign out button and add it, as well as sign in and sign up links, to the appropriate blocks in the conditional:
-
-```jsx {3,9-16} title="web/src/components/Navigation/Navigation.[js/tsx]"
-import { Link, routes } from '@redwoodjs/router'
-import { useAuth } from '@redwoodjs/auth'
-import SignoutBtn from 'src/components/SignoutBtn/SignoutBtn'
-
-const Navigation = () => {
-  const { isAuthenticated } = useAuth()
-  return (
-    <nav>
-      {isAuthenticated ? (
-        <SignoutBtn />
-      ) : (
-        <>
-          <Link to={routes.signup()}>Sign Up</Link>
-          <Link to={routes.signin()}>Sign In</Link>
-        </>
-      )}
-    </nav>
-  )
-}
-
-export default Navigation
-```
-
-We have a working navigation component, but we still need to render it somewhere. Let's [generate a layout](../cli-commands.md#generate-layout) called GlobalLayout:
-
-```bash
-yarn redwood generate layout global
-```
-
-Then import and render the navigation component in the newly-generated `web/src/layouts/GlobalLayout/GlobalLayout`:
-
-```jsx title="web/src/layouts/GlobalLayout/GlobalLayout.js"
-import Navigation from 'src/components/Navigation/Navigation'
-
-const GlobalLayout = ({ children }) => {
-  return (
-    <>
-      <header>
-        <Navigation />
-      </header>
-      {children}
-    </>
-  )
-}
-
-export default GlobalLayout
-```
-
-Finally, we'll wrap each of our generated pages in this `GlobalLayout` component. To do this efficiently, we'll update the routes defined in our `web/src/Routes.[js/tsx]` file with the [`Set` component](../router.md#sets-of-routes):
-
-```jsx title="web/src/Routes.[js/tsx]"
-import { Router, Route, Set } from '@redwoodjs/router'
-import GlobalLayout from 'src/layouts/GlobalLayout/GlobalLayout'
-
-const Routes = () => {
-  return (
-    <Router>
-      <Set wrap={GlobalLayout}>
-        <Route path="/" page={HomePage} name="home" />
-        <Route path="/signup" page={SignupPage} name="signup" />
-        <Route path="/signin" page={SigninPage} name="signin" />
-      </Set>
-      <Route notfound page={NotFoundPage} />
-    </Router>
-  )
-}
-
-export default Routes
-```
-
-Now we have navigation that renders the correct links and buttons based on our auth state. When the user signs in, they'll see a **Sign Out** button. When the user signs out, they'll see **Sign Up** and **Sign In** links.
-
-## Wrapping Up
-
-We've configured Supabase GoTrue Auth with Redwood Auth, created a Sign Up page, a Sign In page, and a Sign Out button, and added auth links to our layout. Nicely done!
-
-As you continue refining your app, the following resources may come in handy:
-
-- [Redwood Supabase Auth Installation & Setup](../authentication.md#supabase)
-- [Redwood Auth Playground](https://redwood-playground-auth.netlify.app/supabase)
-- [Redwood Supabase Auth Client Implementation](https://github.com/redwoodjs/redwood/blob/main/packages/auth/src/authClients/supabase.ts)
-- [Supabase GoTrue client implementation](https://github.com/supabase/gotrue-js/blob/d7b334a4283027c65814aa81715ffead262f0bfa/src/GoTrueClient.ts)
-
-Finally, keep the following features in mind (future how to's could go deep into any of these):
-
-- Authentication state changes can be observed via an event listener. The [Supabase Auth playground](https://github.com/redwoodjs/playground-auth/blob/main/web/src/lib/code-samples/supabase.md) shows an example.
-- Authentication options include...
- - Passwordless (enter email and get a magic confirmation link)
- - Third party (via GitHub, Google, etc)
- - Phone one-time password
- - Sign in with refresh token (JWTs are a critical part of the auth implementation)
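-
-For the first of those features, here's a sketch of what subscribing to auth state changes can look like. This assumes `client` is destructured from `useAuth` as earlier in this how to; check the linked playground example for the current API:
-
-```javascript
-const { client } = useAuth()
-
-// Listen for sign in / sign out events
-const { data: authListener } = client.auth.onAuthStateChange((event, session) => {
-  console.log(event, session)
-})
-
-// later, e.g. in a React useEffect cleanup:
-// authListener.unsubscribe()
-```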
-
-Thanks for tuning in!
-
-> If you spot an error or have trouble completing any part of this recipe, please feel free to open an issue on [Github](https://github.com/redwoodjs/redwoodjs.com) or create a topic on our [community forum](https://community.redwoodjs.com/).
diff --git a/docs/versioned_docs/version-7.0/introduction.md b/docs/versioned_docs/version-7.0/introduction.md
deleted file mode 100644
index 7a89099ca8d1..000000000000
--- a/docs/versioned_docs/version-7.0/introduction.md
+++ /dev/null
@@ -1,60 +0,0 @@
----
-description: Redwood is the full-stack web framework designed to help you grow from side project to startup
----
-
-# Introduction
-
-Redwood is the full-stack web framework designed to help you grow from side project to startup.
-Redwood features an end-to-end development workflow that weaves together the best parts of [React](https://reactjs.org/), [GraphQL](https://graphql.org/), [Prisma](https://www.prisma.io/), [TypeScript](https://www.typescriptlang.org/), [Jest](https://jestjs.io/), and [Storybook](https://storybook.js.org/).
-For full inspiration and vision, see Redwood's [README](https://github.com/redwoodjs/redwood/blob/main/README.md).
-
-Development on Redwood happens in the [redwoodjs/redwood repo on GitHub](https://github.com/redwoodjs/redwood).
-The docs are [there too](https://github.com/redwoodjs/redwood/tree/main/docs).
-While Redwood's [leadership and maintainers](https://github.com/redwoodjs/redwood#core-team-leadership)
-handle most of the high-priority items and the day-to-day, Redwood wouldn't be
-where it is without [all its contributors](https://github.com/redwoodjs/redwood#all-contributors)!
-Feel free to reach out to us on the [forums](https://community.redwoodjs.com) or on [Discord](https://discord.gg/redwoodjs), and follow us on [Twitter](https://twitter.com/redwoodjs) for updates.
-
-## Getting the Most out of Redwood
-
-To get the most out of Redwood, do two things:
-
-- [Start the tutorial](tutorial/foreword.md)
-- [Join the community](https://redwoodjs.com/community)
-
-The tutorial is the best way to start your Redwood adventure.
-It's readable, feature-ful, and fun.
-You'll go all the way from `git clone` to Netlify deploy!
-And by the end, you should feel comfortable enough to start that side project.
-
-After you've read the tutorial and started your side project, come say hi and tell us all about it by joining the community.
-Redwood wouldn't be where it is without the people who use and contribute to it.
-We warmly welcome you!
-
-## How these Docs are Organized
-
-As you can probably tell from the sidebar, Redwood's docs are organized into three sections:
-
-- [Tutorial](tutorial/foreword.md)
-- [Reference](index)
-- [How To](how-to/index)
-
-The order isn't arbitrary.
-This is more or less the learning journey we have in mind for you.
-
-While we expect you to read the tutorial from top to bottom (maybe even more than once?), we of course don't expect you to read the Reference and How To sections that way.
-The content in those sections is there on an as-needed basis.
-You need to know about the Router? Check out the [Router](router.md) reference.
-You need to upload files? Check out the [File Uploads](how-to/file-uploads.md) how to.
-
-That said, there are some references you should consider reading at some point in your Redwood learning journey.
-Especially if you want to become an advanced user.
-For example, [Services](services.md) are fundamental to Redwood.
-It's worth getting to know them inside and out.
-And if you're not writing [tests](testing.md) and [stories](storybook.md), you're not using Redwood to its full potential.
-
-> **We realize that the content doesn't always match the organization**
->
-> For example, half the [Testing](testing.md) reference reads like a tutorial, and half the [Logger](logger.md) reference reads like a how to.
-> Till now, we've focused on coverage, making sure we had content on all of Redwood's features somewhere, at least.
-> We'll shift our focus to organization and pay more attention to how we can curate the experience.
diff --git a/docs/versioned_docs/version-7.0/local-postgres-setup.md b/docs/versioned_docs/version-7.0/local-postgres-setup.md
deleted file mode 100644
index 5facde1d7929..000000000000
--- a/docs/versioned_docs/version-7.0/local-postgres-setup.md
+++ /dev/null
@@ -1,166 +0,0 @@
----
-description: Setup a Postgres database to develop locally
----
-
-# Local Postgres Setup
-
-RedwoodJS uses a SQLite database by default. While SQLite makes local development easy, you're
-likely going to want to run the same database you use in production locally at some point. And since the odds of that database being Postgres are high, here's how to set up Postgres.
-
-## Install Postgres
-### Mac
-If you're on a Mac, we recommend using Homebrew:
-
-```bash
-brew install postgresql@14
-```
-
-> **Install Postgres? I've messed up my Postgres installation so many times, I wish I could just uninstall everything and start over!**
->
-> We've been there before. For those of you on a Mac, [this video](https://www.youtube.com/watch?v=1aybOgni7lI) is a great resource on how to wipe the various Postgres installs off your machine so you can get back to a blank slate.
-> Obviously, a word of warning: this resource will teach you how to wipe the various Postgres installs off your machine. Please only do this if you're sure you want a clean slate!
-
-### Windows and Other Platforms
-If you're using another platform, see Prisma's [Data Guide](https://www.prisma.io/docs/guides/database-workflows/setting-up-a-database/postgresql) for detailed instructions on how to get up and running.
-
-## Creating a database
-
-If everything went well, then Postgres should be running and you should have a few commands at your disposal (namely, `psql`, `createdb`, and `dropdb`).
-
-Check that Postgres is running with `brew services` (the `$(whoami)` bit in the code block below is just where your username should appear):
-
-```bash
-$ brew services
-Name Status User Plist
-postgresql started $(whoami) /Users/$(whoami)/Library/LaunchAgents/homebrew.mxcl.postgresql.plist
-```
-
-If it's not started, start it with:
-
-```bash
-brew services start postgresql
-```
-
-Great. Now let's try running the PostgreSQL interactive terminal, `psql`:
-
-```bash
-$ psql
-```
-
-You'll probably get an error like:
-
-```bash
-psql: error: FATAL: database $(whoami) does not exist
-```
-
-This is because `psql` tries to log you into a database of the same name as your user. But if you just installed Postgres, odds are that database doesn't exist.
-
-Luckily it's super easy to create one using another of the commands you got, `createdb`:
-
-```bash
-$ createdb $(whoami)
-```
-
-Now try:
-
-```
-$ psql
-psql (13.1)
-Type "help" for help.
-
-$(whoami)=#
-```
-
-If it worked, you should see a prompt like the one above—your username followed by `=#`. You're in the PostgreSQL interactive terminal! While we won't get into `psql`, here are a few of the commands you should know:
-
-- `\q` — quit (super important!)
-- `\l` — list databases
-- `\?` — get a list of commands
-
-If you'd rather not follow any of the advice here and create another Postgres user instead of a Postgres database, follow [this](https://www.digitalocean.com/community/tutorials/how-to-install-and-use-postgresql-on-ubuntu-18-04#step-3-%E2%80%94-creating-a-new-role).
-
-## Update the Prisma Schema
-
-Tell Prisma to use a Postgres database instead of SQLite by updating the `provider` attribute in your
-`schema.prisma` file:
-
-```graphql title="api/db/schema.prisma"
-datasource db {
- provider = "postgresql"
- url = env("DATABASE_URL")
-}
-```
-> Note: If you run into a "PrismaClientInitializationError", you may need to regenerate the Prisma client by running `yarn rw prisma generate`
-
-## Connect to Postgres
-
-Add a `DATABASE_URL` to your `.env` file with the URL of the database you'd like to use locally. The
-following example uses `redwoodblog_dev` for the database. It also has `postgres` set up as a
-superuser for ease of use.
-```env
-DATABASE_URL="postgresql://postgres@localhost:5432/redwoodblog_dev?connection_limit=1"
-```
-
-Note the `connection_limit` parameter. This is [recommended by Prisma](https://www.prisma.io/docs/reference/tools-and-interfaces/prisma-client/deployment#recommended-connection-limit) when working with
-relational databases in a Serverless context. You should also append this parameter to your production
-`DATABASE_URL` when configuring your deployments.
-
-### Local Test DB
-You should also set up a test database similarly by adding `TEST_DATABASE_URL` to your `.env` file.
-```env
-TEST_DATABASE_URL="postgresql://postgres@localhost:5432/redwoodblog_test?connection_limit=1"
-```
-
-> Note: a local Postgres server needs to be started and stopped manually; unlike SQLite, this isn't handled automatically by the Redwood CLI.
-
-### Base URL and path
-
-Here is an example of the structure of the base URL and the path using placeholder values in uppercase letters:
-```bash
-postgresql://USER:PASSWORD@HOST:PORT/DATABASE
-```
-The following components make up the base URL of your database, and they are always required:
-
-| Name | Placeholder | Description |
-| ------ | ------ | ------|
-| Host | `HOST`| IP address/domain of your database server, e.g. `localhost` |
-| Port | `PORT` | Port on which your database server is running, e.g. `5432` |
-| User | `USER` | Name of your database user, e.g. `postgres` |
-| Password | `PASSWORD` | Password of your database user |
-| Database | `DATABASE` | Name of the database you want to use, e.g. `redwoodblog_dev` |
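-
-For example, filling in each placeholder with illustrative values (not real credentials) yields:
-
-```bash
-# USER: postgres, PASSWORD: mysecretpassword, HOST: localhost,
-# PORT: 5432, DATABASE: redwoodblog_dev
-postgresql://postgres:mysecretpassword@localhost:5432/redwoodblog_dev
-```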
-
-## Migrations
-Migrations are snapshots of your DB structure, which, when applied, manage the structure of both your local development DB and your production DB.
-
-To create and apply a migration to the Postgres database specified in your `.env`, run the _migrate_ command. (Did this return an error? If so, see "Migrate from SQLite..." below.):
-```bash
-yarn redwood prisma migrate dev
-```
-
-### Migrate from SQLite to Postgres
-If you've already created migrations using SQLite, e.g. you have a migrations directory at `api/db/migrations`, follow this two-step process.
-
-#### 1. Remove existing migrations
-**For Linux and Mac OS**
-From your project root directory, run:
-```bash
-rm -rf api/db/migrations
-```
-
-**For Windows OS**
-```bash
-rmdir /s api\db\migrations
-```
-
-> Note: depending on your project configuration, your migrations may instead be located in `api/prisma/migrations`
-
-#### 2. Create a new migration
-Run this command to create and apply a new migration to your local Postgres DB:
-```bash
-yarn redwood prisma migrate dev
-```
-
-## DB Management Tools
-Here are our recommendations in case you need a tool to manage your databases:
-- [TablePlus](https://tableplus.com/) (Mac, Windows)
-- [Beekeeper Studio](https://www.beekeeperstudio.io/) (Linux, Mac, Windows - Open Source)
diff --git a/docs/versioned_docs/version-7.0/project-configuration-dev-test-build.mdx b/docs/versioned_docs/version-7.0/project-configuration-dev-test-build.mdx
deleted file mode 100644
index 37df966dafeb..000000000000
--- a/docs/versioned_docs/version-7.0/project-configuration-dev-test-build.mdx
+++ /dev/null
@@ -1,238 +0,0 @@
----
-title: Project Configuration
-description: Advanced project configuration
----
-
-import ReactPlayer from 'react-player'
-
-# Project Configuration: Dev, Test, Build
-
-## Babel
-
-Out of the box Redwood configures [Babel](https://babeljs.io/) so that you can write modern JavaScript and TypeScript without needing to worry about transpilation at all.
-GraphQL tags, JSX, SVG imports—all of it's handled for you.
-
-For those well-versed in Babel config, you can find Redwood's in [@redwoodjs/internal](https://github.com/redwoodjs/redwood/tree/main/packages/internal/src/build/babel).
-
-### Configuring Babel
-
-For most projects, you won't need to configure Babel at all, but if you need to you can configure each side (web, api) individually using side-specific `babel.config.js` files.
-
-> **Heads up**
->
-> `.babelrc{.js}` files are ignored.
-> You have to put your custom config in the appropriate side's `babel.config.js`: `web/babel.config.js` for web and `api/babel.config.js` for api.
-
-Let's go over an example.
-
-#### Example: Adding Emotion
-
-Let's say we want to add the styling library [emotion](https://emotion.sh), which requires adding a Babel plugin.
-
-1. Create a `babel.config.js` file in `web`:
-```shell
-touch web/babel.config.js
-```
-
-
-2. Add the `@emotion/babel-plugin` as a dependency:
-```shell
-yarn workspace web add --dev @emotion/babel-plugin
-```
-
-
-3. Add the plugin to `web/babel.config.js`:
-```jsx title="web/babel.config.js"
-module.exports = {
- plugins: ["@emotion"] // 👈 add the emotion plugin
-}
-
-// ℹ️ Notice how we don't need the `extends` property
-```
-
-That's it!
-Now your custom web-side Babel config will be merged with Redwood's.
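-
-With the plugin configured, you can use emotion's `css` prop in your components. Here's a quick sketch (the component and styles are made up for illustration):
-
-```jsx
-import { css } from '@emotion/react'
-
-const FancyButton = ({ children }) => (
-  <button
-    css={css`
-      color: hotpink;
-    `}
-  >
-    {children}
-  </button>
-)
-```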
-
-## Jest
-
-Redwood uses [Jest](https://jestjs.io/) for testing.
-Let's take a peek at how it's all configured.
-
-At the root of your project is `jest.config.js`.
-It should look like this:
-
-```jsx title="jest.config.js"
-module.exports = {
- rootDir: '.',
-  projects: ['<rootDir>/{*,!(node_modules)/**/}/jest.config.js'],
-}
-```
-
-This just tells Jest that the actual config files sit in each side, allowing Jest to pick up the individual settings for each.
-`rootDir` also makes sure that if you're running Jest with the `--collectCoverage` flag, it'll produce the report in the root directory.
-
-#### Web Jest Config
-
-The web side's configuration sits in `./web/jest.config.js`:
-
-```jsx
-const config = {
- rootDir: '../',
- preset: '@redwoodjs/testing/config/jest/web',
- // ☝️ load the built-in Redwood Jest configuration
-}
-
-module.exports = config
-```
-
-> You can always see Redwood's latest configuration templates in the [create-redwood-app package](https://github.com/redwoodjs/redwood/blob/main/packages/create-redwood-app/templates/ts/web/jest.config.js).
-
-The preset includes all the setup required to test everything that's going on in web: rendering React components and transforming JSX, automatically mocking Cells, transpiling with Babel, mocking the Router and the GraphQL client—the list goes on!
-You can find all the details in the [source](https://github.com/redwoodjs/redwood/blob/main/packages/testing/config/jest/web/jest-preset.js).
-
-#### Api Side Config
-
-The api side is configured similarly, with the configuration sitting in `./api/jest.config.js`.
-But the api preset is slightly different in that:
-
-- it's configured to run tests serially (because Scenarios seed your test database)
-- it has setup code to make sure your database is 1) seeded before running tests and 2) reset between Scenarios
-
-You can find all the details in the [source](https://github.com/redwoodjs/redwood/blob/main/packages/testing/config/jest/api/jest-preset.js).
-
-## GraphQL Codegen
-
-You can customize the types that Redwood generates from your project too! This is documented in a bit more detail in the [Generated Types](typescript/generated-types#customising-codegen-config) doc.
-
-## Debug configurations
-
-### Dev Server
-By default, the `yarn rw dev` command opens a browser and starts a debugger on port `18911`. Your Redwood app also ships with several default configurations for debugging with VSCode.
-
-#### Customizing the configuration
-**a) Using the redwood.toml**
-
-Add or change `debugPort` under your `[api]` settings, or `open` under `[browser]`:
-
-```toml title="redwood.toml"
-[web]
- # .
-[api]
- # .
-  # highlight-next-line
- debugPort = 18911 # change me!
-[browser]
-  # highlight-next-line
- open = true # change me!
-```
-
-**b) Pass a flag to `rw dev` command**
-
-You can also pass a flag when you launch your dev servers, for example:
-
-```bash
-yarn rw dev --debugPort 18913
-```
-The flag passed in the CLI will always take precedence over your setting in `redwood.toml`.
-
-Just remember to also change the port you are attaching to in your `.vscode/launch.json`.
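-
-For reference, an attach entry in `.vscode/launch.json` typically looks something like this sketch (the `name` is arbitrary; the `port` must match your `debugPort`):
-
-```json
-{
-  "type": "node",
-  "request": "attach",
-  "name": "Attach API debugger",
-  "port": 18911
-}
-```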
-
-### API and Web Debuggers
-Simply run your dev server, then attach the debugger from the "run and debug" panel. Quick demo below:
-
-
-
-### Compound Debugger
-The compound configuration is a combination of the dev, api and web configurations.
-It allows you to start all debugging configurations at once, facilitating simultaneous debugging of server and client-side code.
-
-
-
-> **ℹ️ Tip: Can't see the debug configurations in VSCode?**
->
-> You can grab the latest `launch.json` from the Redwood template [here](https://github.com/redwoodjs/redwood/blob/main/packages/create-redwood-app/templates/ts/.vscode/launch.json). Copy the contents into your project's `.vscode/launch.json`.
-
-## Ignoring the `.yarn` folder
-
-The `.yarn` folder contains the most recent Yarn executable that Redwood supports
-which is the [recommended way](https://github.com/yarnpkg/yarn/issues/7741)
-to ensure things run smoothly for everyone. From VSCode's perspective, this of course
-is just another folder containing code, so it will
-
-1. include its contents in project-wide, full-text searches
-2. display it in the file browser
-3. watch its contents for changes
-
-… which, depending on your personal preference, is something you may not need or want.
-
-Fortunately, all these aspects are configurable via VSCode's `settings.json`. You have the
-choice of making these changes to your local Redwood project's configuration
-found in `.vscode/settings.json` or globally (so they apply to other projects as
-well). For global changes, hit F1 or Ctrl+Shift+P
-(that's ⌘+Shift+P if you're on Mac)
-and search for "Preferences: Open User Settings (JSON)".
-
-Note that the local workspace configuration always overrules your user settings.
-The VSCode website [provides an extensive explanation](https://code.visualstudio.com/docs/getstarted/settings#_settings-precedence)
-on how its config inheritance works. It also has a complete reference of
-[all available settings and their defaults](https://code.visualstudio.com/docs/getstarted/settings#_default-settings).
-
-### Excluding a folder from search results only
-
-Adding the following would exclude any `.yarn` folder encountered anywhere in
-the project (that's what the
-`**` [glob pattern](https://code.visualstudio.com/docs/editor/codebasics#_advanced-search-options)
-does) from search results:
-
-```json
- "search.exclude": {
- "**/.yarn": true
- }
-```
-
-### Excluding a folder from the file browser and searching
-
-```json
- "files.exclude": {
- "**/.yarn": true
- }
-```
-
-This setting also excludes all matching folders and files from search results,
-so there's no point in adding a `search.exclude` setting separately.
-
-Don't worry: this setting won't influence change detection in your "Source Control"
-tab—that would be managed via `.gitignore`.
-
-### Excluding a folder from watching
-
-```json
- "files.watcherExclude": {
- "**/.yarn": true
- }
-```
-
-This setting works independently of the ones above and so it needs to be added
-separately. It's important to note that files or folders matched by this
-setting will no longer immediately appear (or disappear):
-- from existing search results (but as soon as you search again or change the search term, they'll be discovered)
-- in your "Source Control" tab, unless you hit the "Refresh" button
-
-Admittedly, the `.yarn` folder won't change that often, so this may not be
-the best example. But we thought we'd share this technique with you
-so that you'd know how to apply it to any folders that you know change very often,
-and how to tell VSCode not to bother wasting any CPU cycles on them.
-
-## Trailing whitespace
-
-If you're using VS Code, or another editor that supports
-[EditorConfig](https://editorconfig.org), trailing whitespace will be trimmed
-in source files, but preserved in HTML, Markdown, and MJML files when saving.
-
-This behavior is controlled by `.vscode/settings.json` or `.editorconfig`, depending
-on your editor.
-
-In JavaScript and TypeScript files trailing whitespace has no significance,
-but for HTML, Markdown, and MJML it does. That's why the behavior is different
-for those files. If you don't like the default behavior Redwood has configured
-for you, you're free to change the settings in those two files.
diff --git a/docs/versioned_docs/version-7.0/router.md b/docs/versioned_docs/version-7.0/router.md
deleted file mode 100644
index 860327c1fcda..000000000000
--- a/docs/versioned_docs/version-7.0/router.md
+++ /dev/null
@@ -1,870 +0,0 @@
----
-description: About the built-in router for Redwood apps
----
-
-# Router
-
-This is the built-in router for Redwood apps. It takes inspiration from Ruby on Rails, React Router, and Reach Router, but is very opinionated in its own way.
-
-The router is designed to list all routes in a single file, with limited nesting. We prefer this design, as it makes it very easy to track which routes map to which pages.
-
-## Router and Route
-
-The first thing you need is a `Router`. It will contain all of your routes. The router will attempt to match the current URL to each route in turn, and only render those with a matching `path`. The only exception to this is the `notfound` route, which can be placed anywhere in the list and only matches when no other routes do.
-
-:::note The `notfound` route can't be nested in a `Set`
-
-If you want to wrap your custom notfound page in a `Layout`, then you should add the `Layout` to the page instead. See [customizing the NotFoundPage](#customizing-the-notfoundpage).
-
-:::
-
-Each route is specified with a `Route`. Our first route will tell the router what to render when no other route matches:
-
-```jsx title="Routes.js"
-import { Router, Route } from '@redwoodjs/router'
-
-const Routes = () => (
-  <Router>
-    <Route notfound page={NotFoundPage} />
-  </Router>
-)
-
-export default Routes
-```
-
-The router expects a single `Route` with a `notfound` prop. When no other route is found to match, the component in the `page` prop will be rendered.
-
-To create a route to a normal Page, you'll pass three props: `path`, `page`, and `name`:
-
-```jsx title="Routes.js"
-<Route path="/" page={HomePage} name="home" />
-```
-
-The `path` prop specifies the URL path to match, starting with the beginning slash. The `page` prop specifies the Page component to render when the path is matched. The `name` prop is used to specify the name of the _named route function_.
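
To build some intuition for what the router does with `path`, here's a rough sketch (not Redwood's actual matcher; it ignores regex escaping and the router's typed params) of how a path pattern could be tested against the current URL:

```javascript
// Rough sketch, not Redwood's implementation: match a URL against a
// route path, extracting any `{param}` segments along the way.
function matchPath(pattern, url) {
  const names = []
  const source = pattern.replace(/\{(\w+)\}/g, (_, name) => {
    names.push(name)
    return '([^/]+)' // a param matches a single path segment
  })
  const match = url.match(new RegExp(`^${source}$`))
  if (!match) return null
  // return the captured params as an object (empty for static paths)
  return Object.fromEntries(names.map((n, i) => [n, match[i + 1]]))
}

console.log(matchPath('/contacts/{id}', '/contacts/7')) // → { id: '7' }
console.log(matchPath('/contacts/{id}', '/about')) // → null
```

The real router does much more (type coercion, globs, trailing-slash handling), but the core idea is the same: the first route whose `path` matches wins, and its `page` is rendered.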
-
-## Private Routes
-
-Some pages should only be visible to authenticated users. We support this using the `PrivateSet` component. Read more [further down](#privateset).
-
-## Sets of Routes
-
-You can group Routes into sets using the `Set` component. `Set` allows you to wrap a set of Routes in another component or array of components—usually a Context, a Layout, or both:
-
-```jsx title="Routes.js"
-import { Router, Route, Set } from '@redwoodjs/router'
-import BlogContext from 'src/contexts/BlogContext'
-import BlogLayout from 'src/layouts/BlogLayout'
-
-const Routes = () => {
- return (
-    <Router>
-      <Set wrap={[BlogContext, BlogLayout]}>
-        <Route path="/" page={HomePage} name="home" />
-        <Route path="/about" page={AboutPage} name="about" />
-        <Route path="/contact" page={ContactPage} name="contact" />
-        <Route path="/blog-post/{id:Int}" page={BlogPostPage} name="blogPost" />
-      </Set>
-    </Router>
- )
-}
-
-export default Routes
-```
-
-The `wrap` prop accepts a single component or an array of components. Components are rendered in the same order they're passed, so in the example above, Set expands to:
-
-```jsx
-<BlogContext>
-  <BlogLayout>
-    <Route path="/" page={HomePage} name="home" />
-    // ...
-  </BlogLayout>
-</BlogContext>
-```
-
-Conceptually, this fits with how we think about Context and Layouts as things that wrap Pages and contain content that’s outside the scope of the Pages themselves. Crucially, since they're higher in the tree, `BlogContext` and `BlogLayout` won't rerender across Pages in the same Set.
-
-There's a lot of flexibility here. You can even nest `Sets` to great effect:
-
-```jsx title="Routes.js"
-import { Router, Route, Set } from '@redwoodjs/router'
-import BlogContext from 'src/contexts/BlogContext'
-import BlogLayout from 'src/layouts/BlogLayout'
-import BlogNavLayout from 'src/layouts/BlogNavLayout'
-
-const Routes = () => {
- return (
-    <Router>
-      <Set wrap={[BlogContext, BlogLayout]}>
-        <Route path="/" page={HomePage} name="home" />
-        <Route path="/about" page={AboutPage} name="about" />
-        <Route path="/contact" page={ContactPage} name="contact" />
-        <Set wrap={BlogNavLayout}>
-          <Route path="/blog-post/{id:Int}" page={BlogPostPage} name="blogPost" />
-        </Set>
-      </Set>
-    </Router>
- )
-}
-```
-
-### Forwarding props
-
-All props you give to `<Set>` (except for `wrap`) will be passed to the wrapper components.
-
-So this...
-
-```jsx
-<Set wrap={MainLayout} theme="dark">
-  <Route path="/" page={HomePage} name="home" />
-</Set>
-```
-
-becomes...
-
-```jsx
-<MainLayout theme="dark">
-  <Route path="/" page={HomePage} name="home" />
-</MainLayout>
-```
-
-### `PrivateSet`
-
-A `PrivateSet` makes all Routes inside that Set require authentication. When a user isn't authenticated and attempts to visit one of the Routes in the `PrivateSet`, they'll be redirected to the Route passed as the `PrivateSet`'s `unauthenticated` prop. The originally-requested Route's path is added to the query string as a `redirectTo` param. This lets you send the user to the page they originally requested once they're logged-in.
-
-Here's an example of how you'd use a `PrivateSet`:
-
-```jsx title="Routes.js"
-<Router>
-  <Route path="/" page={HomePage} name="home" />
-  <PrivateSet unauthenticated="home">
-    <Route path="/admin" page={AdminPage} name="admin" />
-  </PrivateSet>
-</Router>
-```
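
To make the redirect flow concrete, here's a sketch of how the URL is assembled and later consumed. These helpers are hypothetical — Redwood handles both steps for you internally:

```javascript
// Sketch of the redirect flow — hypothetical helpers, not Redwood internals.
// 1. An unauthenticated request for a private route is sent to the
//    `unauthenticated` route with the original path in `redirectTo`.
function unauthenticatedUrl(loginPath, requestedPath) {
  return `${loginPath}?redirectTo=${encodeURIComponent(requestedPath)}`
}

// 2. After logging in, the login page can read `redirectTo` back and
//    navigate to the originally-requested route.
function redirectTarget(url, fallback = '/') {
  const redirectTo = new URL(url, 'http://example.com').searchParams.get('redirectTo')
  return redirectTo || fallback
}

const loginUrl = unauthenticatedUrl('/login', '/admin/settings')
console.log(loginUrl) // → '/login?redirectTo=%2Fadmin%2Fsettings'
console.log(redirectTarget(loginUrl)) // → '/admin/settings'
```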
-
-For more fine-grained control, you can specify `roles` (which takes a string for a single role or an array of roles), and the router will check to see that the current user is authorized before giving them access to the Route. If they're not, they will be redirected to the page specified in the `unauthenticated` prop, such as a "forbidden" page. Read more about Role-based Access Control in Redwood [here](how-to/role-based-access-control.md).
-
-To protect private routes for access by a single role:
-
-```jsx title="Routes.js"
-<Router>
-  <Route path="/" page={HomePage} name="home" />
-  <Route path="/forbidden" page={ForbiddenPage} name="forbidden" />
-  <PrivateSet unauthenticated="forbidden" roles="admin">
-    <Route path="/admin" page={AdminPage} name="admin" />
-  </PrivateSet>
-</Router>
-```
-
-To protect private routes for access by multiple roles:
-
-```jsx title="Routes.js"
-<Router>
-  <Route path="/" page={HomePage} name="home" />
-  <Route path="/forbidden" page={ForbiddenPage} name="forbidden" />
-  <PrivateSet unauthenticated="forbidden" roles={['admin', 'editor']}>
-    <Route path="/admin" page={AdminPage} name="admin" />
-  </PrivateSet>
-</Router>
-```
-
-Redwood uses the `useAuth` hook under the hood to determine if the user is authenticated. Read more about authentication in Redwood [here](tutorial/chapter4/authentication.md).
-
-## Link and named route functions
-
-When it comes to routing, matching URLs to Pages is only half the equation. The other half is generating links to your pages. The router makes this really simple without having to hardcode URL paths. In a Page component, you can do this (only relevant bits are shown in code samples from now on):
-
-```jsx title="SomePage.js"
-import { Link, routes } from '@redwoodjs/router'
-
-// Given the route in the last section, this produces: <a href="/">Home</a>
-const SomePage = () => <Link to={routes.home()}>Home</Link>
-```
-
-You use a `Link` to generate a link to one of your routes and can access URL generators for any of your routes from the `routes` object. We call the functions on the `routes` object _named route functions_ and they are named after whatever you specify in the `name` prop of the `Route`.
-
-Named route functions simply return a string, so you can still pass in hardcoded strings to the `to` prop of the `Link` component, but using the proper named route function is easier and safer. Plus, if you ever decide to change the `path` of a route, you don't need to change any of the `Link`s to it (as long as you keep the `name` the same)!
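
As an illustration of why this works, a named route function is conceptually just a function from params to a path string. This is only a sketch — not Redwood's actual implementation:

```javascript
// Sketch only — not Redwood's implementation. A named route function
// interpolates params into its route's path and returns the string.
function makeNamedRoute(path) {
  return (params = {}) =>
    // strip a type annotation like {id:Int} down to the param name
    path.replace(/\{(\w+)(?::\w+)?\}/g, (_, name) => String(params[name]))
}

const routes = {
  home: makeNamedRoute('/'),
  blogPost: makeNamedRoute('/blog-post/{id:Int}'),
}

console.log(routes.home()) // → '/'
console.log(routes.blogPost({ id: 7 })) // → '/blog-post/7'
```

Because callers only ever reference `routes.blogPost(...)`, the underlying `path` can change freely without breaking any `Link`s.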
-
-## Active links
-
-`NavLink` is a special version of `Link` that will add an `activeClassName` to the rendered element when it matches **exactly** the current URL.
-
-```jsx title="MainMenu.js"
-import { NavLink, routes } from '@redwoodjs/router'
-
-// Will render <a href="/" className="link activeLink"> respectively when on the page
-const MainMenu = () => (
-  <NavLink className="link" activeClassName="activeLink" to={routes.home()}>
-    Home
-  </NavLink>
-)
-