cherry-pick from (PaddlePaddle#56066): fix codestyle
ForFishes authored and wentaoyu committed Oct 26, 2023
1 parent 8e90e92 commit c57972a
Showing 2 changed files with 2 additions and 10 deletions.
2 changes: 0 additions & 2 deletions paddle/phi/api/yaml/ops.yaml
@@ -925,7 +925,6 @@
   kernel :
     func : flash_attn
     data_type : q
-  intermediate : softmax_lse, seed_offset
   backward : flash_attn_grad
 
 - op : flash_attn_unpadded
@@ -950,7 +949,6 @@
   kernel :
     func : flash_attn_v1
     data_type : q
-  intermediate : softmax_lse, seed_offset
   backward : flash_attn_v1_grad
 
 - op : flash_attn_v1_unpadded
10 changes: 2 additions & 8 deletions python/paddle/nn/functional/flash_attention.py
@@ -201,10 +201,7 @@ def flash_attention(
     if sdp_func_name == "flash_attn":
         if in_dynamic_mode():
             if g_use_flash_attn_v1:
-                (
-                    result_attention,
-                    result_softmax,
-                ) = _C_ops.flash_attn_v1(
+                (result_attention, result_softmax, _, _) = _C_ops.flash_attn_v1(
                     query,
                     key,
                     value,
@@ -214,10 +211,7 @@ def flash_attention(
                     not training,
                 )
             else:
-                (
-                    result_attention,
-                    result_softmax,
-                ) = _C_ops.flash_attn(
+                (result_attention, result_softmax, _, _) = _C_ops.flash_attn(
                     query,
                     key,
                     value,
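Both call sites now unpack four return values in one line and discard the two they do not use via `_`. A minimal sketch of that unpacking pattern, using a hypothetical stand-in function (`fake_flash_attn`, with placeholder arithmetic) rather than the real `_C_ops` binding:

```python
def fake_flash_attn(q, k, v):
    """Hypothetical stand-in for a 4-output op such as _C_ops.flash_attn."""
    out = q + k + v        # attention result (placeholder arithmetic)
    softmax = out / 2      # softmax statistics (placeholder arithmetic)
    softmax_lse = None     # extra output, discarded at the call site
    seed_offset = None     # extra output, discarded at the call site
    return out, softmax, softmax_lse, seed_offset


# Unpack only the values we need; `_` discards the rest.
(result_attention, result_softmax, _, _) = fake_flash_attn(1, 2, 3)
print(result_attention, result_softmax)  # → 6 3.0
```

Collapsing the multi-line tuple target onto one line also matches the style the commit message ("fix codestyle") refers to.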
