Doc: Update the developer guide for the v3 #3376

Merged · 15 commits · Mar 2, 2024
199 changes: 199 additions & 0 deletions doc/development/create-a-model-pt.md
# Create a model in PyTorch

If you'd like to create a new model that isn't covered by the existing DeePMD-kit library, but want to reuse DeePMD-kit's other efficient modules, such as data processing and the trainer, you may want to read this section.

To incorporate your custom model you'll need to:

1. Register and implement new components (e.g. a descriptor) in a Python file.
2. Register new arguments for user inputs.
3. Package the new code into a Python package.
4. Test the new model.

## Design a new component

With DeePMD-kit v3, we have expanded support to include two additional backends alongside TensorFlow: the PyTorch backend and the DPModel backend. The PyTorch backend adopts a highly modularized design to provide flexibility and extensibility. It ensures a consistent experience for both training and inference, aligning with the TensorFlow backend.

The DPModel backend is implemented in pure NumPy, serving as a reference backend to ensure consistency in tests. Its design pattern closely parallels that of the PyTorch backend.

### New descriptors

When creating a new descriptor, it is essential to inherit from both the {py:class}`deepmd.pt.model.descriptor.base_descriptor.BaseDescriptor` class and the {py:class}`torch.nn.Module` class. Abstract methods, including {py:meth}`deepmd.pt.model.descriptor.base_descriptor.BaseDescriptor.forward`, must be implemented, while others remain optional. It is crucial to adhere to the original method arguments without any modifications. Once the implementation is complete, the next step is registering the component with a designated key:

```py
from typing import Optional

import torch

from deepmd.pt.model.descriptor.base_descriptor import (
    BaseDescriptor,
)


@BaseDescriptor.register("some_descrpt")
class SomeDescript(BaseDescriptor, torch.nn.Module):
    def __init__(self, arg1: bool, arg2: float) -> None:
        pass

    def get_rcut(self) -> float:
        pass

    def get_nnei(self) -> int:
        pass

    def get_ntypes(self) -> int:
        pass

    def get_dim_out(self) -> int:
        pass

    def get_dim_emb(self) -> int:
        pass

    def mixed_types(self) -> bool:
        pass

    def forward(
        self,
        coord_ext: torch.Tensor,
        atype_ext: torch.Tensor,
        nlist: torch.Tensor,
        mapping: Optional[torch.Tensor] = None,
    ):
        pass

    def serialize(self) -> dict:
        pass

    @classmethod
    def deserialize(cls, data: dict) -> "SomeDescript":
        pass

    @classmethod
    def update_sel(cls, global_jdata: dict, local_jdata: dict):
        pass
```

The `serialize` and `deserialize` methods are important for cross-backend model conversion.
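As a hedged illustration of the idea (the class `TinyDescriptor` and its fields `rcut`/`sel` are hypothetical, not part of the DeePMD-kit API), a serialize/deserialize pair round-trips every field needed to rebuild the component, so another backend can reconstruct it from the plain dictionary:

```python
# Hypothetical sketch of the serialize/deserialize contract.
class TinyDescriptor:
    def __init__(self, rcut: float, sel: int) -> None:
        self.rcut = rcut
        self.sel = sel

    def serialize(self) -> dict:
        # Everything needed to rebuild the component in any backend.
        return {"@class": "Descriptor", "type": "tiny", "rcut": self.rcut, "sel": self.sel}

    @classmethod
    def deserialize(cls, data: dict) -> "TinyDescriptor":
        data = data.copy()
        # Strip routing keys; the rest maps onto the constructor.
        data.pop("@class", None)
        data.pop("type", None)
        return cls(**data)


d = TinyDescriptor(rcut=6.0, sel=120)
d2 = TinyDescriptor.deserialize(d.serialize())
```

Because the dictionary is backend-agnostic, the same data can be fed to a PyTorch, TensorFlow, or NumPy implementation of the component.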

### New fitting nets

In many instances, there is no requirement to create a new fitting net. For fitting user-defined scalar properties, the {py:class}`deepmd.pt.model.task.ener.InvarFitting` class can be utilized. However, if there is a need for a new fitting net, one should inherit from both the {py:class}`deepmd.pt.model.task.base_fitting.BaseFitting` class and the {py:class}`torch.nn.Module` class. Alternatively, for a more straightforward approach, inheritance from the {py:class}`deepmd.pt.model.task.fitting.GeneralFitting` class is also an option.


```py
from typing import Optional

import torch

from deepmd.pt.model.task.fitting import (
    GeneralFitting,
)
from deepmd.dpmodel import (
    FittingOutputDef,
    fitting_check_output,
)


@GeneralFitting.register("some_fitting")
@fitting_check_output
class SomeFittingNet(GeneralFitting):
    def __init__(self, arg1: bool, arg2: float) -> None:
        pass

    def forward(
        self,
        descriptor: torch.Tensor,
        atype: torch.Tensor,
        gr: Optional[torch.Tensor] = None,
        g2: Optional[torch.Tensor] = None,
        h2: Optional[torch.Tensor] = None,
        fparam: Optional[torch.Tensor] = None,
        aparam: Optional[torch.Tensor] = None,
    ):
        pass

    def output_def(self) -> FittingOutputDef:
        pass
```
### New models

The PyTorch backend's model architecture is meticulously structured with multiple layers of abstraction, ensuring a high degree of flexibility. Typically, the process commences with an atomic model responsible for atom-wise property calculations. This atomic model inherits from both the {py:class}`deepmd.pt.model.atomic_model.base_atomic_model.BaseAtomicModel` class and the {py:class}`torch.nn.Module` class.

Subsequently, the `AtomicModel` is encapsulated using the `make_model(AtomicModel)` function, which leverages the `deepmd.pt.model.model.make_model.make_model` function. The purpose of the `make_model` wrapper is to translate between atomic property predictions and extended property predictions and differentiation, e.g. the reduction of atomic energy contributions and the autodiff for calculating the forces and virial. Developers usually need to implement an `AtomicModel`, not a `Model`.

Finally, the entire model is enveloped within a `DPModel`, necessitating inheritance from the {py:class}`deepmd.pt.model.model.model.BaseModel` class and the inclusion of the aforementioned `make_model(AtomicModel)`. In most cases, there is no need to reconstruct a `DPModel`; one can directly utilize the {py:class}`deepmd.pt.model.model.dp_model.DPModel` class by providing the corresponding fitting net. For models without a fitting net, like the `PairTableModel`, a new `DPModel` needs to be designed. Users seamlessly interact with a wrapper built on top of the `DPModel` to handle key translations of the returned dictionary.


```py
import torch

from deepmd.pt.model.atomic_model.base_atomic_model import (
    BaseAtomicModel,
)
from deepmd.pt.model.model.make_model import (
    make_model,
)
from deepmd.pt.model.model.model import (
    BaseModel,
)


class SomeAtomicModel(BaseAtomicModel, torch.nn.Module):
    def __init__(self, arg1: bool, arg2: float) -> None:
        pass

    def forward_atomic(self):
        pass


@BaseModel.register("some_model")
class SomeDPModel(make_model(SomeAtomicModel), BaseModel):
    pass


class SomeModel(SomeDPModel):
    pass
```

## Register new arguments

To let users use your new component in their input files, you need to create a method that returns the `Argument` definitions of your new component, and then register the new arguments. For example, the code below

```py
from typing import List

from dargs import Argument
from deepmd.utils.argcheck import descrpt_args_plugin


@descrpt_args_plugin.register("some_descrpt")
def descrpt_some_args() -> List[Argument]:
    return [
        Argument("arg1", bool, optional=False, doc="doc of arg1"),
        Argument("arg2", float, optional=True, default=6.0, doc="doc of arg2"),
    ]
```

allows one to use your new descriptor as below:

```json
"descriptor": {
    "type": "some_descrpt",
    "arg1": true,
    "arg2": 6.0
}
```

The arguments here should be consistent with the class arguments of your new component.
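To see why this consistency matters, note that after argument checking the parsed dictionary is essentially splatted into the component's constructor. A minimal sketch of that flow, using hypothetical names and no `dargs` dependency:

```python
# Hypothetical sketch: the keys accepted in the input file must match the
# constructor signature, because the parsed values end up as keyword arguments.
class SomeDescript:
    def __init__(self, arg1: bool, arg2: float = 6.0) -> None:
        self.arg1 = arg1
        self.arg2 = arg2


jdata = {"type": "some_descrpt", "arg1": True, "arg2": 6.0}
# "type" selects the registered class; the remaining keys feed the constructor.
params = {k: v for k, v in jdata.items() if k != "type"}
descrpt = SomeDescript(**params)
```

If a registered `Argument` name drifted from the constructor parameter name, this instantiation would raise a `TypeError` at runtime.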

## Package new codes

You may use `setuptools` to package the new code into a Python package. It's crucial to add your new component to `entry_points['deepmd']` in `setup.py`:

```py
entry_points={
    "deepmd": [
        "some_descrpt=deepmd_some_descrtpt:SomeDescript",
    ],
},
```

where `deepmd_some_descrtpt` is the module of your codes. It is equivalent to `from deepmd_some_descrtpt import SomeDescript`.

If you place `SomeDescript` and `descrpt_some_args` into different modules, you are also expected to add `descrpt_some_args` to `entry_points`.
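For intuition, an entry-point specification is just a `name=module:attribute` string; loading it amounts to importing the module and grabbing the attribute. A small, self-contained sketch of that parsing (this is not the actual loader DeePMD-kit uses, just an illustration of the format):

```python
def parse_entry_point(spec: str):
    """Split a "name=module:attr" entry-point string into its three parts."""
    name, _, target = spec.partition("=")
    module, _, attr = target.partition(":")
    return name.strip(), module.strip(), attr.strip()


name, module, attr = parse_entry_point(
    "some_descrpt=deepmd_some_descrtpt:SomeDescript"
)
# Loading would then amount to:
#   getattr(importlib.import_module(module), attr)
```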

After you install your new package, you can now use `dp --pt train` to run your new model.

## Unit tests

When transferring features from another backend to the PyTorch backend, it is essential to include a regression test in `/source/tests/consistent` to validate the consistency of the PyTorch backend with other backends. Presently, the regression tests cover self-consistency and cross-backend consistency among TensorFlow, PyTorch, and DPModel (NumPy) through the serialization/deserialization technique.

During the development of new components within the PyTorch backend, it is necessary to provide a DPModel (NumPy) implementation and corresponding regression tests. For PyTorch components, developers are also required to include a unit test that scripts the component with `torch.jit`.
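The cross-backend consistency pattern can be sketched without any framework: serialize from the implementation under test, deserialize into the reference one, and compare outputs element-wise. All names below are illustrative stand-ins, not DeePMD-kit classes:

```python
import math


class RefImpl:
    """Stand-in for the NumPy reference implementation."""

    def __init__(self, scale: float) -> None:
        self.scale = scale

    def call(self, xs):
        return [self.scale * x for x in xs]

    def serialize(self) -> dict:
        return {"scale": self.scale}

    @classmethod
    def deserialize(cls, data: dict) -> "RefImpl":
        return cls(**data)


class FastImpl(RefImpl):
    """Stand-in for the PyTorch implementation under test."""


pt_obj = FastImpl(scale=2.0)
# Round-trip parameters into the reference implementation.
ref_obj = RefImpl.deserialize(pt_obj.serialize())
xs = [0.5, 1.0, 1.5]
consistent = all(
    math.isclose(a, b) for a, b in zip(pt_obj.call(xs), ref_obj.call(xs))
)
```

In the real test suite, the comparison runs the two backends on identical inputs and asserts numerical agreement within a tolerance.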
# Create a model in TensorFlow

If you'd like to create a new model that isn't covered by the existing DeePMD-kit library, but want to reuse DeePMD-kit's other efficient modules, such as data processing and the trainer, you may want to read this section.

To incorporate your custom model you'll need to:

## Design a new component

When creating a new component, take descriptor as the example, one should inherit from the {py:class}`deepmd.tf.descriptor.descriptor.Descriptor` class and override several methods. Abstract methods such as {py:class}`deepmd.tf.descriptor.descriptor.Descriptor.build` must be implemented and others are not. You should keep arguments of these methods unchanged.

After implementation, you need to register the component with a key:
```py
To let users use your new component in their input file, you need to create a method that returns the `Argument` definitions of your new component, and then register the new arguments:
from typing import List

from dargs import Argument
from deepmd.utils.argcheck import descrpt_args_plugin


@descrpt_args_plugin.register("some_descrpt")
3 changes: 2 additions & 1 deletion doc/index.rst
:caption: Developer Guide

development/cmake
development/create-a-model-tf
development/create-a-model-pt
development/type-embedding
development/coding-conventions
development/cicd