nllLoss lib files and test files #49
base: main
Conversation
x[1,0] = -0.5; x[1,1] = -1.5; x[1,2] = -2.5;

var target = Tensor.arange(2);
ref y = target.forceRank(1);
Missing .array.
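Presumably the fix is to append .array to that line; a guess at the intended form (the exact member access is an assumption based on the comment above):

// Assumed reading of the comment: take the underlying array of the rank-1 view.
ref y = target.forceRank(1).array;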
var weight = Tensor.ones(3);

var a = nllLoss(input, target, weight);
Should be dynamicTensor.nllLoss(...)?
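For illustration, the suggested call form would look something like this (a sketch; the arguments are the ones from the test snippet above):

// Call nllLoss as a type method on dynamicTensor instead of as a free function.
var a = dynamicTensor.nllLoss(input, target, weight);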
lib/DynamicTensor.chpl
Outdated
input: dynamicTensor(2,?eltType),
target: dynamicTensor(1,eltType),
weight: dynamicTensor(1, eltType),
dynamicTensor does not have a rank argument.
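For reference, a sketch of the same formals with the rank arguments dropped (assuming dynamicTensor is parameterized only by element type, as the comment says; any other formals the real signature has are omitted):

// dynamicTensor takes just an element type, so the leading rank parameters go away.
input: dynamicTensor(?eltType),
target: dynamicTensor(eltType),
weight: dynamicTensor(eltType),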
lib/StaticTensor.chpl
Outdated
proc type staticTensor.nllLoss(
    input: staticTensor(2,?eltType),
    target: staticTensor(1,eltType),
    weight: staticTensor(1, eltType) = ones(),
staticTensor.ones(eltType), or make a function overload that doesn't have the weight argument and one that does have the weight argument, but with no default parameter.
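A rough sketch of the second option: keep the weighted overload from this PR but drop the = ones() default, and add a weight-free convenience overload that forwards an all-ones weight. The staticTensor.ones(eltType) call mirrors the comment above; its exact signature and the shape it produces are assumptions, not confirmed API:

// Convenience overload with no weight argument; delegates to the weighted
// overload, which keeps its existing body but loses the default parameter.
proc type staticTensor.nllLoss(
    input: staticTensor(2,?eltType),
    target: staticTensor(1,eltType)) {
  return staticTensor.nllLoss(input, target, staticTensor.ones(eltType));
}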
Hi Aven, please see my comments. You can also use --print-compiler-outputs
with the testing script to see where you went wrong on your local machine.
I implemented the changes. A bunch of other issues came up too, and now I'm at the point where it's returning this:
🌱 loss/nll
@@ -653,6 +653,44 @@ proc dynamicTensor.flatten(): dynamicTensor(eltType) {
    return new dynamicTensor(eltType);
}

proc type dynamicTensor.nllLoss(
    input: dynamicTensor(?eltType),
    target: dynamicTensor(int),
Should be target: dynamicTensor(eltType)
for param rankIn in 2..2 {
    if input.checkRank(rankIn) {
        for param rank in 1..1 {
            if target.checkRank(rank) && weight.checkRank(rank) {
Should be if target.checkRank(rankIn) && weight.checkRank(rank) {...
for param rankIn in 2..2 {
    if input.checkRank(rankIn) {
        for param rank in 1..1 {
            if target.checkRank(rank) {
Should be if target.checkRank(rankIn) {...
Debugged nllLoss in NDArray, Autograd, StaticTensor, and DynamicTensor. Started writing tests in the test files. Currently getting this error. Might be something wrong with the test file, but I'm not sure.
🌱 loss/nll
Failed to compile loss/nll/nll.chpl
/nfs/stak/users/sadighia/capstone/ChAI/lib/NDArray.chpl:3: warning: an implicit module named 'NDArray' is being introduced to contain file-scope code
/nfs/stak/users/sadighia/capstone/ChAI/lib/StaticTensor.chpl:61: In method 'this':
/nfs/stak/users/sadighia/capstone/ChAI/lib/StaticTensor.chpl:62: error: unresolved call 'staticTensor(1,real(32)).slice(int(64))'
/nfs/stak/users/sadighia/capstone/ChAI/lib/StaticTensor.chpl:346: note: this candidate did not match: staticTensor.slice(dom: domain(?))
/nfs/stak/users/sadighia/capstone/ChAI/lib/StaticTensor.chpl:62: note: because actual argument #1 with type 'int(64)'
/nfs/stak/users/sadighia/capstone/ChAI/lib/StaticTensor.chpl:346: note: is passed to formal 'dom: domain(?)'
/nfs/stak/users/sadighia/capstone/ChAI/lib/StaticTensor.chpl:62: note: other candidates are:
/nfs/stak/users/sadighia/capstone/ChAI/lib/StaticTensor.chpl:351: note: staticTensor.slice(rngs: range ...rank)
/nfs/stak/users/sadighia/capstone/ChAI/lib/Utilities.chpl:344: note: _tuple.slice(param low: int, param high: int)
note: and 6 other candidates, use --print-all-candidates to see them
/nfs/stak/users/sadighia/capstone/ChAI/test/correspondence/loss/nll/nll.chpl:10: called as (staticTensor(1,real(32))).this(args(0): int(64))
⛔ loss/nll
Failed Chapel compilations tests: ['nll']
Failed Python tests: []
Tests to fix: []
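One reading of the error above (an interpretation added here, not part of the original thread): nll.chpl:10 indexes a rank-1 staticTensor with a plain int, and that call only resolves through slice, whose listed candidates take a domain or ranges rather than an int. A hypothetical illustration of the mismatch, plus a form the existing range overload might accept:

// Hypothetical test-file lines; y is a rank-1 staticTensor(1, real(32)) as in the log.
// var bad = y[0];   // resolves to slice(int(64)); no matching candidate, per the error
var okay = y[0..0];  // a degenerate range should match slice(rngs: range ...rank) (untested)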