
Commit

docs: copy github images to built dirhtml
Signed-off-by: Joan Fontanals Martinez <[email protected]>
JoanFM committed Aug 24, 2023
1 parent e2d39f5 commit a970924
Showing 5 changed files with 14 additions and 8 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/label-pr.yml
@@ -51,7 +51,7 @@ jobs:
git fetch origin
export NUM_RELEASES=2 # only 2 last tags to save build time
bash makedoc.sh local-only
-netlify deploy --dir=_build/dirhtml --alias=${{ env.BRANCH_NAME }} --message="Deploying docs to ${{ env.BRANCH_NAME }} branch"
+netlify deploy --dir=./docs/_build/dirhtml --alias=${{ env.BRANCH_NAME }} --message="Deploying docs to ${{ env.BRANCH_NAME }} branch"
env:
NETLIFY_AUTH_TOKEN: ${{ secrets.NETLIFY_AUTH_TOKEN }}
NETLIFY_SITE_ID: ${{ secrets.NETLIFY_SITE_ID }}
10 changes: 5 additions & 5 deletions README.md
@@ -25,7 +25,7 @@

Jina lets you build multimodal [**AI services**](#build-ai-models) and [**pipelines**](#build-a-pipeline) that communicate via gRPC, HTTP and WebSockets, then scale them up and deploy to production. You can focus on your logic and algorithms, without worrying about the infrastructure complexity.

-![](https://github.com/jina-ai/jina/.github/images/build-deploy.png)
+![](./.github/images/build-deploy.png)

Jina provides a smooth Pythonic experience for serving ML models transitioning from local deployment to advanced orchestration frameworks like Docker-Compose, Kubernetes, or Jina AI Cloud. Jina makes advanced solution engineering and cloud-native technologies accessible to every developer.

@@ -85,7 +85,7 @@ Jina has three fundamental layers:

Let's build a fast, reliable and scalable gRPC-based AI service. In Jina we call this an **[Executor](https://docs.jina.ai/concepts/serving/executor/)**. Our simple Executor will wrap the [StableLM](https://huggingface.co/stabilityai/stablelm-base-alpha-3b) LLM from Stability AI. We'll then use a **Deployment** to serve it.

-![](https://github.com/jina-ai/jina/.github/images/deployment-diagram.png)
+![](./.github/images/deployment-diagram.png)

> **Note**
> A Deployment serves just one Executor. To combine multiple Executors into a pipeline and serve that, use a [Flow](#build-a-pipeline).
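
For context on the Executor-plus-Deployment pattern this hunk's text refers to, here is a minimal, hedged sketch; the echo logic and names are placeholders rather than the README's StableLM example, and it assumes the docarray-v2 style API used elsewhere in this README:

```python
# Minimal sketch: wrap some logic in an Executor and serve it with a Deployment.
# EchoExecutor and PromptDoc are illustrative stand-ins, not repository code.
from docarray import BaseDoc, DocList
from jina import Deployment, Executor, requests


class PromptDoc(BaseDoc):
    text: str = ''


class EchoExecutor(Executor):
    @requests
    def generate(self, docs: DocList[PromptDoc], **kwargs) -> DocList[PromptDoc]:
        # A real service would call the wrapped model (e.g. StableLM) here.
        for doc in docs:
            doc.text = f'echo: {doc.text}'
        return docs


if __name__ == '__main__':
    # One Deployment serves exactly one Executor, as the note above says.
    with Deployment(uses=EchoExecutor, port=12345) as dep:
        dep.block()
```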
@@ -265,7 +265,7 @@ class TextToImage(Executor):
</table>


-![](https://github.com/jina-ai/jina/.github/images/flow-diagram.png)
+![](./.github/images/flow-diagram.png)

Build the Flow with either Python or YAML:

@@ -340,7 +340,7 @@ response = client.post(on='/', inputs=[prompt], return_type=DocList[ImageDoc])
response[0].display()
```

-![](https://github.com/jina-ai/jina/.github/images/mona-lisa.png)
+![](./.github/images/mona-lisa.png)

<!-- end build-pipelines -->
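
As a companion to the pipeline-building snippet this hunk touches, a small self-contained sketch of chaining two Executors in a Flow and querying it with a Client; the Executors and document type below are illustrative, not the README's StableLM/Stable Diffusion pair:

```python
# Illustrative Flow sketch: two placeholder Executors chained in order, then
# queried through the gateway with a Client.
from docarray import BaseDoc, DocList
from jina import Client, Executor, Flow, requests


class TextDoc(BaseDoc):
    text: str = ''


class UpperCase(Executor):
    @requests
    def upper(self, docs: DocList[TextDoc], **kwargs) -> DocList[TextDoc]:
        for d in docs:
            d.text = d.text.upper()
        return docs


class AddBang(Executor):
    @requests
    def bang(self, docs: DocList[TextDoc], **kwargs) -> DocList[TextDoc]:
        for d in docs:
            d.text += '!'
        return docs


# Executors run in order: UpperCase feeds AddBang.
flow = Flow(port=12345).add(uses=UpperCase).add(uses=AddBang)

with flow:
    client = Client(port=12345)
    docs = client.post(on='/', inputs=[TextDoc(text='hello')], return_type=DocList[TextDoc])
    print(docs[0].text)  # 'HELLO!'
```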

@@ -352,7 +352,7 @@ Increase your application's throughput with scalability features out of the box,

Let's scale a Stable Diffusion Executor deployment with replicas and dynamic batching:

-![](https://github.com/jina-ai/jina/.github/images/scaled-deployment.png)
+![](./.github/images/scaled-deployment.png)

* Create two replicas, with [a GPU assigned for each](https://docs.jina.ai/concepts/orchestration/scale-out/#replicate-on-multiple-gpus).
* Enable dynamic batching to process incoming parallel requests together with the same model inference.
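
A hedged sketch of what those two bullets can look like in code, assuming the `replicas` argument and the `dynamic_batching` decorator available in recent Jina releases; the Executor is a stand-in, and GPU-per-replica assignment is left out:

```python
# Illustrative scaling sketch: two replicas of one Executor, with dynamic
# batching so parallel requests share a single model call. FakeDiffusion is
# a placeholder, not the README's Stable Diffusion Executor.
from docarray import BaseDoc, DocList
from jina import Deployment, Executor, dynamic_batching, requests


class ImagePrompt(BaseDoc):
    text: str = ''


class FakeDiffusion(Executor):
    @requests
    @dynamic_batching(preferred_batch_size=8, timeout=1000)
    def generate(self, docs: DocList[ImagePrompt], **kwargs) -> DocList[ImagePrompt]:
        # Requests batched together arrive as one DocList, so the (omitted)
        # model inference would run once for the whole batch.
        return docs


dep = Deployment(
    uses=FakeDiffusion,
    replicas=2,  # two copies of the Executor behind the same endpoint
)

with dep:
    dep.block()
```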
2 changes: 1 addition & 1 deletion docs/jina-ai-cloud/login.md
@@ -1,6 +1,6 @@
# Login & Token Management

-To use Jina AI Cloud, you need to log in, either via a GitHub or Google account. This section describes how to log in Jina AI Cloud and manage the personal access token. You can do it via webpage, via CLI or via Python API.
+To use Jina AI Cloud, you need to log in, either via a GitHub or Google account. This section describes how to log in Jina AI Cloud and manage the personal access token. You can do it via webpage, CLI or Python API.

## via Webpage

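For the CLI and Python routes the changed sentence mentions, a heavily hedged sketch using the `jina-hubble-sdk` Python client; the exact helper names are an assumption and may differ from what the page edited by this diff documents:

```python
# Assumed usage of the jina-hubble-sdk client for login and a basic session
# check; treat these calls as illustrative, not authoritative.
import hubble

hubble.login()  # opens the browser-based GitHub/Google login flow
client = hubble.Client(jsonify=True)
print(client.get_user_info())  # sanity-check that the personal access token works
```
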
6 changes: 6 additions & 0 deletions docs/makedoc.sh
@@ -11,6 +11,12 @@ if [[ $1 == "local-only" ]]; then
ghcr.io/jina-ai/protoc-gen-doc --doc_opt=markdown,docs.md

make dirhtml
+mkdir -p _build/dirhtml/.github
+cp -r ../.github/images _build/dirhtml/.github
+ls -l
+ls -l _build/dirhtml
+ls -l _build/dirhtml/.github
+ls -l _build/dirhtml/.github/images
else
export NUM_RELEASES=${NUM_RELEASES:-10}
export DEFAULT_BRANCH='master'
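
To sanity-check this change locally (a hedged helper, not part of the commit), one can run `bash makedoc.sh local-only` inside `docs/` and then confirm the GitHub images landed inside the built site:

```python
# Hedged verification sketch: after building the docs locally, make sure the
# images referenced as ./.github/images/... exist inside the built dirhtml.
from pathlib import Path

built = Path('docs/_build/dirhtml/.github/images')
if not built.is_dir():
    raise SystemExit(f'{built} is missing - did makedoc.sh copy ../.github/images?')
print(sorted(p.name for p in built.glob('*')))
```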
2 changes: 1 addition & 1 deletion docs/requirements.txt
@@ -7,7 +7,7 @@ sphinx-autodoc-typehints==1.18.3
sphinx_copybutton
sphinx-notfound-page==0.7.1
gitpython==3.1.30
-sphinx-sitemap
+sphinx-sitemap==2.5.0
sphinxext-opengraph
furo
myst-parser
