tf: add fparam/aparam support for finetune (#3313)
Fix #3256.

Signed-off-by: Jinzhe Zeng <[email protected]>
njzjz authored Feb 21, 2024
1 parent e1c0564 commit d629616
Showing 1 changed file with 16 additions and 1 deletion.
17 changes: 16 additions & 1 deletion deepmd/tf/fit/ener.py
@@ -856,7 +856,22 @@ def change_energy_bias(
                     box = test_data["box"][:numb_test]
                 else:
                     box = None
-                ret = dp.eval(coord, box, atype, mixed_type=mixed_type)
+                if dp.get_dim_fparam() > 0:
+                    fparam = test_data["fparam"][:numb_test]
+                else:
+                    fparam = None
+                if dp.get_dim_aparam() > 0:
+                    aparam = test_data["aparam"][:numb_test]
+                else:
+                    aparam = None
+                ret = dp.eval(
+                    coord,
+                    box,
+                    atype,
+                    mixed_type=mixed_type,
+                    fparam=fparam,
+                    aparam=aparam,
+                )
                 energy_predict.append(ret[0].reshape([numb_test, 1]))
         type_numbs = np.concatenate(type_numbs)
         energy_ground_truth = np.concatenate(energy_ground_truth)
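The change boils down to a reusable pattern: ask the frozen model for its fparam/aparam dimensions and forward the corresponding slices of the test data only when they are defined. Below is a minimal sketch of that pattern; the wrapper function, its signature, and the model path are illustrative and not part of the commit, while dp.get_dim_fparam(), dp.get_dim_aparam(), and the fparam/aparam keyword arguments of dp.eval() come directly from the diff above.

from deepmd.infer import DeepPot  # import path assumed; adjust to your deepmd-kit version

dp = DeepPot("frozen_model.pb")  # hypothetical path to the pretrained, frozen model


def eval_with_optional_params(dp, test_data, numb_test, coord, box, atype, mixed_type):
    # Forward frame parameters only if the pretrained model was trained with them.
    fparam = test_data["fparam"][:numb_test] if dp.get_dim_fparam() > 0 else None
    # Likewise for per-atom parameters.
    aparam = test_data["aparam"][:numb_test] if dp.get_dim_aparam() > 0 else None
    # DeepPot.eval accepts None for unused parameters, so the call stays uniform.
    return dp.eval(
        coord,
        box,
        atype,
        mixed_type=mixed_type,
        fparam=fparam,
        aparam=aparam,
    )

In the commit itself, the first element of the returned tuple (the total energy) is then appended to energy_predict exactly as before.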
