Add attention_bias argument in transformer block and transformer layer modules, addressing change in MCore #6366
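
A minimal sketch of what "threading an `attention_bias` argument through the block and layer forward passes" can look like. This is not the Megatron Core or NeMo code from the PR; the class names, signatures, and the way the bias is folded into the attention call are illustrative assumptions only.

```python
# Illustrative sketch only: shows an attention_bias argument plumbed from a
# block-level forward() down into each layer's attention call.
# Class and parameter names are hypothetical, not the Megatron Core API.
from typing import Optional

import torch
import torch.nn as nn


class SketchTransformerLayer(nn.Module):
    def __init__(self, hidden_size: int, num_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(hidden_size, 4 * hidden_size),
            nn.GELU(),
            nn.Linear(4 * hidden_size, hidden_size),
        )

    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        attention_bias: Optional[torch.Tensor] = None,  # new pass-through argument
    ) -> torch.Tensor:
        # torch.nn.MultiheadAttention accepts an additive float attn_mask, so
        # for this sketch the bias is passed there when no mask is given.
        attn_mask = attention_bias if attention_mask is None else attention_mask
        attn_out, _ = self.attn(
            hidden_states, hidden_states, hidden_states, attn_mask=attn_mask
        )
        hidden_states = hidden_states + attn_out
        return hidden_states + self.mlp(hidden_states)


class SketchTransformerBlock(nn.Module):
    def __init__(self, num_layers: int, hidden_size: int, num_heads: int):
        super().__init__()
        self.layers = nn.ModuleList(
            SketchTransformerLayer(hidden_size, num_heads) for _ in range(num_layers)
        )

    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        attention_bias: Optional[torch.Tensor] = None,  # forwarded to every layer
    ) -> torch.Tensor:
        for layer in self.layers:
            hidden_states = layer(
                hidden_states,
                attention_mask=attention_mask,
                attention_bias=attention_bias,
            )
        return hidden_states
```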