[doc] Add autograd docstring #91
Conversation
Reviewer's Guide by Sourcery

This pull request enhances the documentation of the `brainunit/autograd` module.

Sequence diagram for autograd function calls with physical units:

```mermaid
sequenceDiagram
    participant User
    participant AutogradFunction
    participant JAX
    participant UnitHandler
    User->>AutogradFunction: Call with unit-aware input
    AutogradFunction->>UnitHandler: Strip units
    AutogradFunction->>JAX: Compute derivative
    JAX-->>AutogradFunction: Return raw result
    AutogradFunction->>UnitHandler: Apply appropriate units
    AutogradFunction-->>User: Return unit-aware result
```
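The strip-units / differentiate / reapply-units flow in the sequence diagram can be sketched in plain Python. This is an illustrative toy, not brainunit's actual implementation: the `Quantity` class, the meter-exponent bookkeeping, and the finite-difference derivative all stand in for brainunit's real unit machinery and JAX autodiff.

```python
from dataclasses import dataclass

@dataclass
class Quantity:
    value: float
    meter_exp: int  # unit tracked as an exponent of meters, e.g. 2 -> m^2

def square(q: Quantity) -> Quantity:
    # f(x) = x^2; squaring the value also squares the unit (m -> m^2)
    return Quantity(q.value ** 2, q.meter_exp * 2)

def unit_aware_grad(f):
    def grad_f(q: Quantity) -> Quantity:
        eps = 1e-6
        # Strip units: differentiate the raw value (central differences
        # stand in for JAX here)
        raw = (f(Quantity(q.value + eps, q.meter_exp)).value
               - f(Quantity(q.value - eps, q.meter_exp)).value) / (2 * eps)
        # Apply appropriate units: d(f)/dx carries unit(f) / unit(x)
        out_exp = f(q).meter_exp - q.meter_exp
        return Quantity(raw, out_exp)
    return grad_f

g = unit_aware_grad(square)(Quantity(3.0, 1))
# d(x^2)/dx at x = 3 m is 6, carrying unit m^2 / m = m
```

The key point the diagram (and this sketch) makes is that the derivative's unit is derived from the function's output and input units, not recomputed by the autodiff engine itself.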
Class diagram for autograd module functions:

```mermaid
classDiagram
    class autograd {
        +hessian(fun, argnums)
        +jacrev(fun, argnums)
        +jacfwd(fun, argnums)
        +vector_grad(fun, argnums, return_value)
    }
    note for autograd "All functions are unit-aware"
    class hessian {
        +__call__(x: Quantity) Quantity
    }
    class jacrev {
        +__call__(x: Quantity) Quantity
    }
    class jacfwd {
        +__call__(x: Quantity) Quantity
    }
    class vector_grad {
        +__call__(x: Quantity) Quantity
    }
    autograd ..> hessian
    autograd ..> jacrev
    autograd ..> jacfwd
    autograd ..> vector_grad
```
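The four wrappers mirror the JAX transforms of the same names. As a rough, purely numerical illustration of what a Jacobian computation returns — `jacrev` and `jacfwd` produce the same matrix, differing only in whether it is assembled row by row (reverse mode) or column by column (forward mode) — here is a finite-difference stand-in; none of this is the actual JAX or brainunit code:

```python
def jacobian(f, x, eps=1e-6):
    # Central-difference Jacobian of f: R^n -> R^m, built column by column
    # (the forward-mode assembly order; reverse mode would go row by row).
    n = len(x)
    m = len(f(x))
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp, xm = list(x), list(x)
        xp[j] += eps
        xm[j] -= eps
        fp, fm = f(xp), f(xm)
        for i in range(m):
            J[i][j] = (fp[i] - fm[i]) / (2 * eps)
    return J

def f(x):
    # f: R^2 -> R^2
    return [x[0] * x[1], x[0] + x[1]]

J = jacobian(f, [2.0, 3.0])
# J approximates [[3, 2], [1, 1]]
```

In the unit-aware versions documented by this PR, each Jacobian entry would additionally carry the unit of output i divided by the unit of input j.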
Hey @Routhleck - I've reviewed your changes - here's some feedback:
Overall Comments:
- Consider standardizing the example function names across docstrings (e.g., `simple_function1` vs `simple_function`) to maintain consistency.
Here's what I looked at during the review
- 🟢 General issues: all looks good
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
This pull request adds example usage to several functions in the `brainunit/autograd` module to improve documentation and clarify how to use these functions with physical unit-aware computations. The most important changes are the addition of examples for the `hessian`, `jacrev`, `jacfwd`, and `vector_grad` functions.

Documentation improvements:
- `brainunit/autograd/_hessian.py`: Added example usage for the `hessian` function to demonstrate how to compute the Hessian of a function with physical units.
- `brainunit/autograd/_jacobian.py`: Added example usage for the `jacrev` function to show how to compute the Jacobian of a function with physical units.
- `brainunit/autograd/_jacobian.py`: Added example usage for the `jacfwd` function to illustrate how to compute the forward-mode Jacobian of a function with physical units.
- `brainunit/autograd/_vector_grad.py`: Added example usage for the `vector_grad` function to explain how to compute the gradient of a vector with respect to the input and handle physical units.

Summary by Sourcery

Documentation:
- Add usage examples for the `hessian`, `jacrev`, `jacfwd`, and `vector_grad` functions with physical unit-aware computations.
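Of the four, `vector_grad` is the least standard name. A common convention for such a transform (and, as far as I can tell, the one brainunit follows — treat this as an assumption) is that for a vector-valued function it returns the gradient obtained by back-propagating a vector of ones, i.e. the derivative of `sum(f(x))` with respect to each input element. A numeric stand-in, not brainunit's implementation:

```python
def vector_grad(f, eps=1e-6):
    # Gradient of sum(f(x)) w.r.t. each element of x, via central
    # finite differences; equivalent to a VJP with a vector of ones.
    def grad_f(x):
        out = []
        for j in range(len(x)):
            xp, xm = list(x), list(x)
            xp[j] += eps
            xm[j] -= eps
            out.append((sum(f(xp)) - sum(f(xm))) / (2 * eps))
        return out
    return grad_f

g = vector_grad(lambda x: [xi ** 2 for xi in x])([1.0, 2.0])
# elementwise x^2: gradient of the summed output is 2x -> approx [2.0, 4.0]
```

For elementwise functions this coincides with the elementwise derivative, which is why it is convenient in docstring examples.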