Please describe your issue
Setting up a local development environment that uses on-device hardware for machine learning can be difficult to navigate, given the wide range of projects, the need to match hardware to software, and the dependency chains involved in various workflows.
Several team members within Mozilla Innovation have built, or are in the process of building, multi-GPU hardware setups for small model training and ML applications on local hardware (instead of within cloud environments). On-device ML can be especially compelling for exploring privacy- and security-sensitive personal computing use cases, such as referencing documents on the local file system or browser history augmentation.
Describe the solution you'd like to see
A section exploring the considerations for on-device machine learning could include: tested project and hardware combinations; rough hardware specifications for different types of work (e.g. inference, RAG, fine-tuning); an overview of working with CUDA; recommended operating system and development environment practices for configuring a system so that dependencies for individual projects are kept separate; and notes on multi-user environments.