Fix spellchecker action and errors
szmazurek committed Nov 21, 2024
1 parent 60eda6c commit f057ab8
Showing 4 changed files with 15 additions and 6 deletions.
11 changes: 10 additions & 1 deletion .spelling/.spelling/expect.txt
@@ -723,4 +723,13 @@ ystore
Zisserman
zsuokb
zwezggl
-zzokqk
+zzokqk
+thirdparty
+adopy
+Shohei
+crcrpar
+lrs
+autograd
+cudagraph
+kwonly
+torchscript
4 changes: 2 additions & 2 deletions GANDLF/losses/loss_interface.py
@@ -26,7 +26,7 @@ def forward(self, prediction: torch.Tensor, target: torch.Tensor) -> torch.Tensor

class AbstractSegmentationLoss(AbstractLossFunction):
"""
-Base class for loss funcions that are used for segmentation tasks.
+Base class for loss functions that are used for segmentation tasks.
"""

def __init__(self, params: dict):
@@ -43,7 +43,7 @@ def _compute_single_class_loss(

def _optional_loss_operations(self, loss: torch.Tensor) -> torch.Tensor:
"""
-Perform addtional operations on the loss value. Defaults to identity operation.
+Perform additional operations on the loss value. Defaults to identity operation.
If needed, child classes can override this method. Useful in cases where
for example, the loss value needs to log-transformed or clipped.
"""
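For illustration, a minimal sketch of how a subclass could use the `_optional_loss_operations` hook shown in this hunk. Only the method signature, the `AbstractSegmentationLoss` base class, and the module path come from the diff; the class name and the log transform itself are assumptions.

```python
import torch

from GANDLF.losses.loss_interface import AbstractSegmentationLoss


class LogTransformedSegmentationLoss(AbstractSegmentationLoss):
    """Hypothetical subclass that log-transforms the per-class loss value."""

    def _optional_loss_operations(self, loss: torch.Tensor) -> torch.Tensor:
        # Replace the identity default with a log transform; the clamp keeps
        # the argument of the logarithm strictly positive.
        return -torch.log(torch.clamp(1.0 - loss, min=1e-6))
```

A subclass like this would still need whatever concrete loss computation the base class expects; the snippet only demonstrates overriding the hook.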
2 changes: 1 addition & 1 deletion GANDLF/optimizers/README.md
@@ -8,7 +8,7 @@
- Add the relevant code under the `GANDLF.optimizers.thirdparty` submodule.
- Add a wrapper which takes in GaNDLF's `parameter` dictionary as input and creates a `torch.optim.Optimizer` object as output.
- Add the wrapper to the `GANDLF.optimizers.thirdparty.__init__.py` so that it can be called from `GANDLF.optimizers.__init__.py`.
-- See `GANDLF.optimizers.thirdparty.adopy.py` as an example.
+- See `GANDLF.optimizers.thirdparty.adopt.py` as an example.
- If a new dependency needs to be used, update GaNDLF's [`setup.py`](https://github.com/mlcommons/GaNDLF/blob/master/setup.py) with the new requirement.
- Define a new submodule under `GANDLF.optimizers` as `GANDLF.optimizers.wrap_${package_name}.py`.
- Ensure that the new algorithm is wrapped in a function which returns an object with the PyTorch optimizer type. Use any of the optimizers in `GANDLF.optimizers.wrap_torch.py` as an example.
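As a rough sketch of the wrapper pattern the README steps above ask for (this is not GaNDLF's actual wrapper: the dictionary keys are assumptions, and `torch.optim.AdamW` stands in for the third-party optimizer class so the example runs):

```python
import torch


def wrap_my_optimizer(parameters: dict) -> torch.optim.Optimizer:
    """Sketch: build an optimizer from GaNDLF's parameter dictionary.

    A real wrapper would import the third-party optimizer class from
    ``GANDLF.optimizers.thirdparty``; ``AdamW`` is used here only as a
    placeholder. The dictionary keys below are assumptions.
    """
    return torch.optim.AdamW(
        parameters["model_parameters"],
        lr=parameters.get("learning_rate", 1e-3),
        weight_decay=parameters.get("weight_decay", 0.0),
    )
```

Registering such a function in `GANDLF.optimizers.thirdparty.__init__.py`, as the list above describes, is what makes it reachable from `GANDLF.optimizers.__init__.py`.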
4 changes: 2 additions & 2 deletions docs/faq.md
@@ -53,9 +53,9 @@ Please see https://mlcommons.github.io/GaNDLF/usage/#federating-your-model-evalu

Please read the [migration guide](https://mlcommons.github.io/GaNDLF/migration_guide) to understand the changes that have been made to GaNDLF. If you have any questions, please feel free to [post a support request](https://github.com/mlcommons/GaNDLF/issues/new?assignees=&labels=&template=--questions-help-support.md&title=).

-### I am getting an error realted to version mismatch (greater or smaller) between the configuration and GaNDLF version. What should I do?
+### I am getting an error related to version mismatch (greater or smaller) between the configuration and GaNDLF version. What should I do?

-This is a safety feature to ensure a tight integartion between the configuration used to define a model and the code version used to perform the training. Ensure that you have all requirements satisfied, and then check the ``version`` key in the configration, and ensure it appropriately matches the output of ``gandlf run --version``.
+This is a safety feature to ensure a tight integration between the configuration used to define a model and the code version used to perform the training. Ensure that you have all requirements satisfied, and then check the ``version`` key in the configuration, and ensure it appropriately matches the output of ``gandlf run --version``.

### What if I have another question?

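A quick way to compare the two values mentioned in the FAQ answer above (a sketch only; the config path is an example and the exact structure of the `version` key may differ between GaNDLF releases):

```python
import subprocess

import yaml

# Load the experiment configuration (example path).
with open("config.yaml") as f:
    config = yaml.safe_load(f)

# Version reported by the installed GaNDLF CLI.
cli_output = subprocess.run(
    ["gandlf", "run", "--version"], capture_output=True, text=True
).stdout.strip()

print("`version` key in configuration:", config.get("version"))
print("`gandlf run --version` output :", cli_output)
```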
