From 206fbb879c3e3ae3418fd2e9e033883fdf42cd88 Mon Sep 17 00:00:00 2001
From: Jamie Packer
Date: Tue, 12 Mar 2024 12:06:54 +0100
Subject: [PATCH] Remove Run on Gradient links

---
 README.md | 13 ++-----------
 1 file changed, 2 insertions(+), 11 deletions(-)

diff --git a/README.md b/README.md
index c6a2816..1ec0a90 100644
--- a/README.md
+++ b/README.md
@@ -32,20 +32,11 @@ The package is under active development, to broaden its scope and applicability.
 To generate datasets based on the paper __Repurposing Density Functional Theory to Suit Deep Learning__ [Link](https://icml.cc/virtual/2023/workshop/21476#wse-detail-28485) [PDF](https://syns-ml.github.io/2023/assets/papers/17.pdf) presented at the [Syns & ML Workshop, ICML 2023](https://syns-ml.github.io/2023/), the entry point is the notebook [DFT Dataset Generation](./notebooks/DFT-dataset-generation.ipynb), and the file [density_functional_theory.py](./density_functional_theory.py).
 
-To run the notebook on Graphcore IPU hardware on Paperspace:
-
-[![Run on Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/YX0jlK)
-
 ### For DFT teaching and learning: nanoDFT
 
 We also provide a lightweight implementation of the SCF algorithm, optimized for readability and hackability, in the [nanoDFT demo](notebooks/nanoDFT-demo.ipynb) notebook and in [nanodft](pyscf_ipu/nanoDFT/README.md) folder.
 
-To run the notebook on Graphcore IPU hardware on Paperspace:
-
-[![Run on Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/ipobmC)
-
-
 Additional notebooks in [notebooks](notebooks) demonstrate other aspects of the computation.
@@ -58,7 +49,7 @@ We recommend upgrading `pip` to the latest stable release to prepare your enviro
 pip install -U pip
 ```
 
-This project is currently under active development.
+This project is currently under active development. For CPU simulations, we recommend installing `pyscf-ipu` from latest `main` branch as:
 ```bash
 pip install pyscf-ipu[cpu]@git+https://github.com/graphcore-research/pyscf-ipu
 ```
@@ -97,7 +88,7 @@ You can then start generating (locally on CPU) a dataset using the following com
 python density_functional_theory.py -generate -save -fname dataset_name -level 0 -plevel 0 -gdb 9 -backend cpu -float32
 ```
 
-You can speed up the generation by using IPUs. Please try the [DFT dataset generation notebook](https://ipu.dev/YX0jlK) [![Run on Gradient](https://assets.paperspace.io/img/gradient-badge.svg)](https://ipu.dev/YX0jlK)
+You can speed up the generation by using IPUs. Please try the [DFT dataset generation notebook](https://ipu.dev/YX0jlK)
 
 ## Training SchNet on [QM1B](qm1b/README.md)
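
One way to try this change locally is to apply the mail-formatted patch above with `git am`. The sketch below assumes it has been saved under the hypothetical filename `0001-Remove-Run-on-Gradient-links.patch` and is run from the root of a `pyscf-ipu` checkout.

```bash
# Hypothetical filename for the saved patch text; adjust to wherever it is stored.
# Run from the root of a pyscf-ipu clone; `git am` recreates the commit with the
# author, date, and message recorded in the mail header above.
git am 0001-Remove-Run-on-Gradient-links.patch
```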