
Quick doc fixes (#117)
chrish42 authored Jan 16, 2025
1 parent 4496a40 commit 8e30926
Showing 4 changed files with 6 additions and 6 deletions.
LICENSE (2 changes: 1 addition & 1 deletion)
@@ -178,7 +178,7 @@ The following applies to all files unless otherwise noted:
 
 END OF TERMS AND CONDITIONS
 
-Copyright 2024 ServiceNow, Inc.
+Copyright 2024-2025 ServiceNow, Inc.
 
 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.
docs/license.md (2 changes: 1 addition & 1 deletion)
@@ -5,7 +5,7 @@ title: License
 Fast-LLM is licenced under the Apache 2.0 license:
 
 ```text
-Copyright 2024 ServiceNow, Inc.
+Copyright 2024-2025 ServiceNow, Inc.
 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.
docs/quick-start.md (6 changes: 3 additions & 3 deletions)
@@ -49,7 +49,7 @@ Now, select the compute environment that matches your setup or preferred workflow.
 Install Python 3.12 (or later) if it's not already available on your system. For a Python virtual environment, run:
 
 ```bash
-python3.10 -m venv ./fast-llm-tutorial/venv
+python3.12 -m venv ./fast-llm-tutorial/venv
 source ./fast-llm-tutorial/venv/bin/activate
 pip install --upgrade pip
 ```
@@ -202,11 +202,11 @@ Choose based on your goals for this tutorial.
 
 === "Big"
 
-    For the big configuration, we'll use a Llama model with 8B parameters. We'll grab the model from the Huggingface Hub and save it to our inputs folder.
+    For the big configuration, we'll use a Llama model with 8B parameters. We'll grab the model from the HuggingFace Hub and save it to our inputs folder.
 
     !!! note "Access Required"
 
-        Meta gates access to their Llama models. You need to request access to the model from Meta before you can download it at https://huggingface.co/meta-llama/Llama-3.1-8B. You'll need to authenticate with your Hugging Face account to download the model:
+        Meta gates access to their Llama models. You need to request access to the model from Meta before you can download it at https://huggingface.co/meta-llama/Llama-3.1-8B. You'll need to authenticate with your HuggingFace account to download the model:
 
         ```bash
         pip install huggingface_hub
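The quick-start.md hunk above ends at `pip install huggingface_hub`, so the rest of the authentication step is not visible in this diff. For context, a typical flow with the `huggingface_hub` CLI looks roughly like the sketch below; the `./fast-llm-tutorial/inputs` destination is an assumption based on the tutorial's earlier directory layout, not something shown in this commit.

```bash
# Sketch only, not part of this diff: authenticate, then fetch the gated model.
pip install huggingface_hub

# Log in with a Hugging Face access token that has been granted access to the Llama repo.
huggingface-cli login

# Download the model weights into the tutorial's inputs folder (path assumed, not from this commit).
huggingface-cli download meta-llama/Llama-3.1-8B --local-dir ./fast-llm-tutorial/inputs
```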
mkdocs.yaml (2 changes: 1 addition & 1 deletion)
@@ -12,7 +12,7 @@ repo_url: https://github.com/ServiceNow/Fast-LLM
 edit_uri: edit/main/docs/
 
 # Copyright
-copyright: Copyright 2024 ServiceNow, Inc.
+copyright: Copyright 2024-2025 ServiceNow, Inc.
 
 # Configuration
 theme:
