Minor UI changes for the release #691

Closed · 7 tasks done
JulienVig opened this issue Jul 1, 2024 · 2 comments · Fixed by #692
JulienVig (Collaborator) commented Jul 1, 2024

  • On the training page, add hover explanations for "round", "batch", and "epoch" and how they relate to each other
  • Display a banner stating that Disco is currently in a demo state
  • Add a link to a feedback form
  • Add user feedback when downloading a model
  • Add an LLM training ETA -> display the total number of epochs
  • Clarify the use of "epochs" for the language modeling task (which aren't really epochs)
  • Display a message on the LLM inference page that currently only CLI inference is available
@JulienVig JulienVig added the web client (Related to the browser environment) label Jul 1, 2024
@JulienVig JulienVig self-assigned this Jul 1, 2024
@JulienVig JulienVig changed the title Release minor UI changes Minor UI changes for the release Jul 1, 2024
This was referenced Jul 1, 2024
@JulienVig JulienVig linked a pull request Jul 2, 2024 that will close this issue
@JulienVig JulienVig removed a link to a pull request Jul 2, 2024
tharvik (Collaborator) commented Jul 2, 2024

Taken from a comment on the prerelease PR:

The epoch count is reset whenever the round is incremented, is that expected? When I train wikitext I get the following sequence:

    round 0 - epoch 0 - batch 0
    round 0 - epoch 0 - batch 1
    [...]
    round 0 - epoch 0 - batch 5
    round 0 - epoch 1 - batch 0
    [...]
    round 0 - epoch 1 - batch 5
    round 0 - epoch 2 - batch 0
    round 1 - epoch 0 - batch 0
    round 1 - epoch 0 - batch 1

Shouldn't the epoch continue incrementing rather than being reset? (Clearly there needs to be more explanation in the UI regarding rounds, epochs, and batches, #691.)

I was thinking of a descending relation, translated into words as: "each round, run X epochs; each epoch, run Y batches". So every round/epoch restarts its lower layer.
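
A minimal sketch of that nesting, assuming placeholder counts (this is not Disco's actual trainer loop):

```typescript
// Illustrative sketch only: the descending relation described above,
// with placeholder counts chosen to match the wikitext log.
const ROUNDS = 2;             // assumed for illustration
const EPOCHS_PER_ROUND = 3;   // assumed for illustration
const BATCHES_PER_EPOCH = 6;  // assumed for illustration

for (let round = 0; round < ROUNDS; round++) {
  for (let epoch = 0; epoch < EPOCHS_PER_ROUND; epoch++) {
    for (let batch = 0; batch < BATCHES_PER_EPOCH; batch++) {
      // each round restarts the epoch counter; each epoch restarts the batch counter
      console.log(`round ${round} - epoch ${epoch} - batch ${batch}`);
    }
  }
}
```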

JulienVig (Collaborator, Author) commented:
Hm, as a user I would like to know how many epochs the model has been trained so far (without having to do a multiplication and remember how rounds translate into epochs).
Is it even useful to show the round number at all? Could we instead only indicate (with an animation or something similar) when communication is happening, without necessarily showing the round number?
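
A hypothetical helper sketching what the UI could display instead; the name and signature are assumptions, not Disco's API:

```typescript
// Hypothetical helper (not part of Disco): cumulative epochs trained so far,
// so the user doesn't have to multiply rounds by epochs per round.
function totalEpochsTrained(round: number, epochInRound: number, epochsPerRound: number): number {
  return round * epochsPerRound + epochInRound;
}

// e.g. round 1, epoch 0, with 3 epochs per round -> 3 epochs trained so far
console.log(totalEpochsTrained(1, 0, 3)); // 3
```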

@martinjaggi martinjaggi added this to the v3.0.0 milestone Jul 24, 2024