
No finetuning, trained for 1000 epochs; validation gives pre: 0.05, recall: 0.02, hmean: 0.03, AP: 0. What could be the problem? #98

Open
neverstoplearn opened this issue May 16, 2022 · 6 comments


@neverstoplearn

No description provided.

@jiangxiluning
Owner

jiangxiluning commented May 16, 2022 via email

Training e2e for 1000 epochs without finetuning may be a bit difficult.

@neverstoplearn
Author

> Training e2e for 1000 epochs without finetuning may be a bit difficult.

And pretraining the model takes even longer. Awkward.

@neverstoplearn
Author

Without finetuning, roughly how many epochs does it take to get decent results? I saw in your README that 1000 epochs without finetuning still looked okay.

@jiangxiluning
Owner

A model I trained recently for 1000 epochs without finetuning turned out pretty poor.
Calculated!{"precision": 0.10656316160903317, "recall": 0.07270101107366393, "hmean": 0.0864338866628506, "AP": 0}
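(Editorial note: in ICDAR-style text-detection evaluation, hmean is the harmonic mean (F1) of precision and recall, and the standard scripts report AP only when per-detection confidence scores are supplied, which may explain the AP: 0 here. A minimal sketch checking that the numbers in the log above are consistent:)

```python
def hmean(precision: float, recall: float) -> float:
    """Harmonic mean (F1) of precision and recall: 2*p*r / (p + r)."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Values from the evaluation log above.
p = 0.10656316160903317
r = 0.07270101107366393
print(hmean(p, r))  # ≈ 0.0864338866628506, matching the reported hmean
```

So the reported hmean follows directly from the reported precision and recall; all three are simply very low for a model trained from scratch.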

@neverstoplearn
Author

> A model I trained recently for 1000 epochs without finetuning turned out pretty poor. Calculated!{"precision": 0.10656316160903317, "recall": 0.07270101107366393, "hmean": 0.0864338866628506, "AP": 0}

My results are about the same. When will you release a model pretrained on SynthText800k? Also, I'd like to ask: are end-to-end algorithms like this ill-suited to Chinese OCR?

@laofeiwei

I ran a TensorFlow reimplementation of FOTS and the results were also unsatisfactory; the detection branch never trained well.
