
Add attention_bias argument to the transformer block and transformer layer modules, addressing a change in MCore #6366
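A minimal sketch of the kind of change the title describes: threading an optional attention_bias argument from a transformer block through each transformer layer down to its attention call. The class and method names here are illustrative stand-ins, not the actual Megatron-Core (MCore) code, and the "attention" math is reduced to a simple elementwise add so the plumbing is visible.

```python
class Attention:
    def forward(self, scores, attention_bias=None):
        # Apply the bias to the raw attention scores when one is supplied.
        if attention_bias is not None:
            return [s + b for s, b in zip(scores, attention_bias)]
        return scores


class TransformerLayer:
    def __init__(self):
        self.attention = Attention()

    def forward(self, hidden_states, attention_bias=None):
        # Forward the optional bias through to the attention module unchanged.
        return self.attention.forward(hidden_states, attention_bias)


class TransformerBlock:
    def __init__(self, num_layers):
        self.layers = [TransformerLayer() for _ in range(num_layers)]

    def forward(self, hidden_states, attention_bias=None):
        # Every layer in the block receives the same (optional) bias.
        for layer in self.layers:
            hidden_states = layer.forward(hidden_states, attention_bias)
        return hidden_states


block = TransformerBlock(num_layers=2)
print(block.forward([1.0, 2.0], attention_bias=[0.5, 0.5]))  # → [2.0, 3.0]
print(block.forward([1.0, 2.0]))  # bias omitted → [1.0, 2.0]
```

Because attention_bias defaults to None at every level, callers that predate the new argument keep working unchanged.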
