
Optimize webui for a great number of tokens #9

Closed
kelsos opened this issue Oct 9, 2018 · 1 comment

kelsos commented Oct 9, 2018

Currently the webui is not built to handle a large number of tokens.

The first time the user opens the webui, four web3 requests (decimals/name/symbol/balance) are made for each token.

Then, as long as the user stays on the page, the webui keeps polling for tokens every 5 seconds. Each polling cycle also includes a web3 call per token to check its balance.
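
For illustration, this is roughly what the current behaviour amounts to. It is only a minimal sketch written against the web3.js 1.x contract API; the endpoint URL, the function names and the trimmed-down ABI are assumptions for the example, not the actual webui code:

```ts
import Web3 from 'web3';
import { AbiItem } from 'web3-utils';

// Trimmed-down ERC-20 ABI covering only the four properties the webui reads.
const erc20Abi: AbiItem[] = [
  { type: 'function', name: 'name', constant: true, inputs: [], outputs: [{ name: '', type: 'string' }] },
  { type: 'function', name: 'symbol', constant: true, inputs: [], outputs: [{ name: '', type: 'string' }] },
  { type: 'function', name: 'decimals', constant: true, inputs: [], outputs: [{ name: '', type: 'uint8' }] },
  { type: 'function', name: 'balanceOf', constant: true, inputs: [{ name: 'owner', type: 'address' }], outputs: [{ name: '', type: 'uint256' }] },
];

const web3 = new Web3('http://localhost:8545'); // endpoint is illustrative

// Initial load: four separate JSON-RPC round trips per token.
async function loadToken(tokenAddress: string, account: string) {
  const token = new web3.eth.Contract(erc20Abi, tokenAddress);
  const [name, symbol, decimals, balance] = await Promise.all([
    token.methods.name().call(),
    token.methods.symbol().call(),
    token.methods.decimals().call(),
    token.methods.balanceOf(account).call(),
  ]);
  return { address: tokenAddress, name, symbol, decimals, balance };
}

// Polling: one additional balance request per token every 5 seconds.
function pollBalances(tokenAddresses: string[], account: string) {
  setInterval(() => {
    for (const address of tokenAddresses) {
      new web3.eth.Contract(erc20Abi, address).methods
        .balanceOf(account)
        .call()
        .then((balance: string) => {
          // update the UI state for `address` with the fresh balance
        });
    }
  }, 5000);
}
```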

Possible steps to resolve this

  • Update to web3.js 1.0.0, as discussed with @andrevmatos
  • Find out whether all these web3 calls can be batched, so that instead of x (number of tokens) HTTP requests a single request fetches all the information (see the sketch below)
  • Find an alternative approach to update balances, instead of checking every 5 seconds
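
One way such batching could look, assuming the RPC endpoint accepts JSON-RPC 2.0 batch arrays. This is only a sketch: `batchEthCall` and the `RpcCall` shape are hypothetical names, and `data` would be the ABI-encoded call (for example produced by `encodeABI()` in web3.js 1.x):

```ts
// Shape of one read-only call to batch.
interface RpcCall {
  to: string;   // token contract address
  data: string; // ABI-encoded function call, e.g. balanceOf(account)
}

// POST all calls as a single JSON-RPC 2.0 batch and return the raw hex results.
async function batchEthCall(rpcUrl: string, calls: RpcCall[]): Promise<string[]> {
  const payload = calls.map((call, id) => ({
    jsonrpc: '2.0',
    id,
    method: 'eth_call',
    params: [{ to: call.to, data: call.data }, 'latest'],
  }));

  const response = await fetch(rpcUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });

  const replies: Array<{ id: number; result?: string }> = await response.json();
  // Batch replies are not guaranteed to preserve order, so sort by id.
  return replies
    .sort((a, b) => a.id - b.id)
    .map((reply) => reply.result || '0x');
}
```

With this approach all token properties and balances would travel in one HTTP round trip instead of one request per call.
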
LefterisJP transferred this issue from raiden-network/raiden Nov 30, 2018

kelsos commented Nov 30, 2018

The batching mechanism should not use BatchRequest from web3.js, since that class does not provide any promise-based interface and instead works with callbacks. The initial approach that used it also required some monkey-patching to work and was unnecessarily complicated.

The new mechanism should (a rough sketch follows the list):

  • Automatically split batched requests into smaller batches (RPC calls fail for more than 800 batched requests)
  • Support optional values with default return values (otherwise an exception on an optional token property would fail the whole batch)
  • Return a Promise that resolves to an array of all the results
  • Figure out when the batch promise should fail completely
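
A minimal sketch of what such a helper could look like. The names `BatchEntry`, `runBatch` and `MAX_BATCH_SIZE` are made up here, each entry is assumed to be a thunk returning a Promise, and this is not the implementation that eventually landed:

```ts
// RPC calls fail for more than 800 batched requests, so split the work.
const MAX_BATCH_SIZE = 800;

interface BatchEntry<T> {
  request: () => Promise<T>; // thunk that performs the actual web3/RPC call
  defaultValue?: T;          // returned instead of failing when the property is optional
}

async function runBatch<T>(entries: BatchEntry<T>[]): Promise<T[]> {
  const results: T[] = [];

  for (let i = 0; i < entries.length; i += MAX_BATCH_SIZE) {
    const chunk = entries.slice(i, i + MAX_BATCH_SIZE);
    const chunkResults = await Promise.all(
      chunk.map((entry) =>
        entry.request().catch((error) => {
          // Optional token properties fall back to their default value
          // instead of failing the whole batch.
          if (entry.defaultValue !== undefined) {
            return entry.defaultValue;
          }
          // A failing required value still rejects the whole batch promise.
          throw error;
        }),
      ),
    );
    results.push(...chunkResults);
  }

  return results;
}
```

With something like this, the polling code could build one entry per token balance and get back a single promise for the whole refresh cycle.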

kelsos closed this as completed in #3 Dec 18, 2018