Add attention_bias argument in transformer block and transformer layer modules, addressing change in MCore #10004

Triggered via pull request November 18, 2024 17:30
@yaoyu-33
labeled #11289
Status Success
Total duration 33s
Artifacts

code-formatting.yml

on: pull_request_target
reformat_with_isort_and_black
25s
reformat_with_isort_and_black
Matrix: check_pylint
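Based on the trigger (`pull_request_target`) and job names shown in this run, the `code-formatting.yml` workflow might look roughly like the sketch below. The checkout/setup steps, action versions, Python version, and the `check_pylint` matrix axis are assumptions for illustration, not taken from the actual file:

```yaml
name: code-formatting

on: pull_request_target

jobs:
  reformat_with_isort_and_black:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          # pull_request_target checks out the base branch by default,
          # so the PR head is referenced explicitly (assumed here)
          ref: ${{ github.event.pull_request.head.sha }}
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - name: Reformat with isort and black
        run: |
          pip install isort black
          isort .
          black .

  check_pylint:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # hypothetical matrix axis; the real axis is not visible on this page
        file-group: [group1, group2]
    steps:
      - uses: actions/checkout@v4
      - name: Run pylint
        run: |
          pip install pylint
          pylint nemo/
```

Note that `pull_request_target` workflows run with the base repository's secrets, which is why checking out the PR head commit (rather than the default merge ref) is a deliberate, security-sensitive choice.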