
Does xtuner support multi-GPU when converting model weight files? #715

Closed
AlittlePIE opened this issue May 24, 2024 · 8 comments

Comments

@AlittlePIE

Setup: 4× RTX 3090.
After fine-tuning qwen32b, is there a multi-GPU option for the weight conversion? The default is single-GPU, which runs out of memory (OOM). Likewise for the subsequent merge step: is there a multi-GPU option there? Merging 14b also OOMs on a single GPU (that step I can work around with the native transformers library).

@LZHgrla
Collaborator

LZHgrla commented May 24, 2024

@AlittlePIE
pth_to_hf runs on the CPU, so GPU OOM should not be involved; please double-check what is actually causing the OOM. For merge, there is a flag that enables transformers' device_map='auto': pass --device auto.

parser.add_argument(
    '--device',
    default='cuda',
    choices=('cuda', 'cpu', 'auto'),
    help='Indicate the device')
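A minimal sketch of how a flag like this could be wired to transformers' device placement (assumed wiring for illustration, not xtuner's actual code):

```python
import argparse

# Same argument definition as in the snippet above.
parser = argparse.ArgumentParser()
parser.add_argument(
    '--device',
    default='cuda',
    choices=('cuda', 'cpu', 'auto'),
    help='Indicate the device')

args = parser.parse_args(['--device', 'auto'])

# 'auto' lets accelerate shard the model across all visible GPUs
# (spilling to CPU RAM if needed), instead of pinning everything
# to one device — which is what avoids single-GPU OOM during merge.
device_map = args.device
print(device_map)  # → auto
```

With `--device auto`, the resulting `device_map` would be handed to `from_pretrained(..., device_map='auto')`.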

@AlittlePIE
Author

(screenshot: 微信图片_20240524190555)
It looks like the weights still get loaded, and then it OOMs.

@LZHgrla
Collaborator

LZHgrla commented May 24, 2024

@AlittlePIE
Author

(screenshot: 微信图片_20240524192157)
It still happens.

@AlittlePIE
Author

cfg = Config.fromfile(args.config)
In this line, args.config is a path to a Python file. Does this line execute that file? I added a print(1111) in the config file, and 1111 was indeed printed when running pth_to_hf.

@LZHgrla
Collaborator

LZHgrla commented May 24, 2024

cfg = Config.fromfile(args.config)

In this line, args.config is a path to a Python file. Does this line execute that file? I added a print(1111) in the config file, and 1111 was indeed printed when running pth_to_hf.

Yes, the config file is executed.
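This is why the print(1111) fires: Config.fromfile runs the config as Python source, so every top-level statement executes. A standard-library-only sketch mimicking that behaviour (not mmengine's actual implementation):

```python
import os
import runpy
import tempfile

# Write a toy "config" file containing a side effect and a setting.
cfg_src = "print(1111)\nmax_epochs = 3\n"
with tempfile.NamedTemporaryFile('w', suffix='.py', delete=False) as f:
    f.write(cfg_src)
    path = f.name

# Executing the file runs its top-level code (prints 1111) and
# returns the resulting module globals, from which config values
# can be collected.
ns = runpy.run_path(path)
print(ns['max_epochs'])  # → 3
os.remove(path)
```

So any GPU-touching code placed at the top level of a config file will also run during pth_to_hf.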

@LZHgrla
Collaborator

LZHgrla commented May 24, 2024

(screenshot: 微信图片_20240524192157)

It still happens.

Try adding a device_map='auto' argument to the llm inside the config's model, and see whether it takes effect.
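Concretely, the change would look something like the fragment below — a sketch assuming a typical xtuner config layout, where the llm is built via AutoModelForCausalLM.from_pretrained (other keys omitted; names as found in standard xtuner configs):

```python
model = dict(
    type=SupervisedFinetune,
    llm=dict(
        type=AutoModelForCausalLM.from_pretrained,
        pretrained_model_name_or_path=pretrained_model_name_or_path,
        device_map='auto',  # added: let accelerate shard weights across GPUs
        trust_remote_code=True))
```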

@AlittlePIE
Author

That works now, thanks!
