Many thanks for your effort in developing such a great library. I want to add support for the ChatGLM model (not the 3rd generation) in mlc-llm. However, it seems that attention masks are not currently supported, as documented in the following file:
```python
# 3rdparty/tvm/python/tvm/relax/frontend/nn/modules.py:924
assert attention_mask is None, "Attention mask not yet supported."
```
How can I resolve this issue? I sincerely look forward to your reply.
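For context on why a plain causal mask is not enough here: ChatGLM (first generation) uses a GLM-style prefix mask, where tokens in the prompt prefix attend to each other bidirectionally while generated tokens remain causal. The following is a minimal numpy sketch of such a mask, not mlc-llm or TVM API code; the function name and shapes are hypothetical and for illustration only.

```python
import numpy as np

def glm_attention_mask(seq_len: int, prefix_len: int) -> np.ndarray:
    """Sketch of a GLM-style prefix attention mask.

    Returns a boolean (seq_len, seq_len) matrix where entry [i, j] is True
    if position i may attend to position j:
      - all positions attend to the bidirectional prefix (columns < prefix_len)
      - positions after the prefix attend causally (lower triangle)
    """
    # Start from a standard causal (lower-triangular) mask.
    mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))
    # Make the prefix fully visible to every position (bidirectional prefix).
    mask[:, :prefix_len] = True
    return mask

# Example: sequence of 4 tokens, first 2 form the bidirectional prefix.
m = glm_attention_mask(4, 2)
# Token 0 can attend to token 1 (bidirectional within the prefix),
# but token 1 cannot attend to the future token 3.
```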
❓ General Questions