Add attention_bias argument in transformer block and transformer layer modules, addressing change in MCore #9923

Triggered via pull request November 15, 2024 19:57
@yaoyu-33 labeled #11289
Status: Success
Total duration: 28s
Artifacts

code-formatting.yml

on: pull_request_target
reformat_with_isort_and_black (20s)
Matrix: check_pylint
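
The run above was produced by code-formatting.yml on the pull_request_target trigger, with a reformat job and a pylint matrix job. A minimal sketch of what such a workflow could look like, assuming standard GitHub Actions syntax; the job names match the run summary above, but every step, version, and matrix value is an assumption, not taken from the actual repository file:

```yaml
# Hypothetical sketch; only the workflow name, trigger, and job names come
# from the run summary above. Steps, versions, and matrix values are assumed.
name: code-formatting

on: pull_request_target

jobs:
  reformat_with_isort_and_black:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          # pull_request_target checks out the base branch by default;
          # check out the PR head so the proposed changes get reformatted.
          ref: ${{ github.event.pull_request.head.sha }}
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - run: pip install isort black
      - run: |
          isort .
          black .

  check_pylint:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Placeholder matrix axis; the real workflow defines its own.
        python-version: ["3.10"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install pylint
      - run: pylint --exit-zero .
```

Note that pull_request_target grants the workflow access to repository secrets, so checking out the PR head (as above) should only be combined with steps that do not execute untrusted PR code.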