Implement custom DNS cache using resolved data available in onResponseStartedListener() (see the sketch after these references).
Deals with the original problem of #62. This merge closes #75.
This removes the heuristics. But we should bring them back for other types of attacks; see #76 for more details.
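A minimal sketch of the custom DNS cache idea referenced above, assuming Chrome's webRequest API, where the details object passed to an onResponseStarted listener carries the ip address the browser actually connected to. The dnsCache map, the TTL constant, and cachedIpFor are hypothetical names used for illustration, not identifiers from jsrestrictor.

```js
// Illustrative sketch only; dnsCache, DNS_CACHE_TTL_MS, and cachedIpFor are
// hypothetical names, not code from jsrestrictor.
const dnsCache = new Map();           // hostname -> { ip, timestamp }
const DNS_CACHE_TTL_MS = 60 * 1000;   // assumed expiry, tune as needed

chrome.webRequest.onResponseStarted.addListener(
  function onResponseStartedListener(details) {
    // details.ip is the address the browser actually connected to
    if (details.ip) {
      const hostname = new URL(details.url).hostname;
      dnsCache.set(hostname, { ip: details.ip, timestamp: Date.now() });
    }
  },
  { urls: ["<all_urls>"] }
);

// Later checks for the same hostname can consult the cache instead of
// resolving the name again.
function cachedIpFor(hostname) {
  const entry = dnsCache.get(hostname);
  if (entry && Date.now() - entry.timestamp < DNS_CACHE_TTL_MS) {
    return entry.ip;
  }
  return undefined;                   // unknown or stale
}
```

A cache like this mainly matters on Chrome-like browsers, which lack an extension DNS API; Firefox extensions can call browser.dns.resolve() directly.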
4367460 removes the heuristics from NBS. However, for some use cases, the heuristics can be beneficial for both Firefox-like and Chrome-like browsers.
We should go through the errors at jsrestrictor/chrome/http_shield_chrome.js, line 41 in 0ff5acc. So, for example, 404 adds 1 to the hostStatistics at jsrestrictor/chrome/http_shield_chrome.js, line 171 in 0ff5acc.
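For context, here is a minimal sketch, under assumed names and values, of how such per-host weights could accumulate. The statusWeights table, BLOCKING_THRESHOLD, and recordStatus are illustrative; the real structure in http_shield_chrome.js may differ.

```js
// Hypothetical sketch; the real hostStatistics structure and threshold live
// in http_shield_chrome.js and may differ.
const hostStatistics = {};       // hostname -> accumulated suspicion score
const BLOCKING_THRESHOLD = 10;   // assumed value, for illustration only

const statusWeights = {
  404: 1,                        // as described above, 404 adds a full point
};

function recordStatus(hostname, statusCode) {
  const weight = statusWeights[statusCode] || 0;
  hostStatistics[hostname] = (hostStatistics[hostname] || 0) + weight;
  // Returning true would mean the host has crossed the blocking threshold.
  return hostStatistics[hostname] >= BLOCKING_THRESHOLD;
}
```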
Going back to 401: it leaks that the server exists, but once that is known, there is not much more to be learned with respect to the proxy discovery attack. That is the reason we decided to omit 401. However, blocking a server that produces excessive 401 errors makes sense because it might be a distributed password cracking attack. Suppose a malicious server distributes passwords to connected clients and lets the clients do the dirty work of connecting to the victim server. This way, the victim's server logs will show the IP addresses of the browsers used as proxies and not the real attacker. If we pick a weight of 0.01 for 401, that means roughly 1000 attempts before blocking kicks in. Does it make sense to reintroduce 401 and similar errors with a smallish weight? If so, does it make sense to have something like that for Firefox as well?
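Under the sketch above, reintroducing 401 with a small weight would be a one-line change. The 0.01 weight is the value proposed above; the threshold of 10 is only inferred from the "0.01 means 1000 attempts" arithmetic and is not taken from the code.

```js
// Hypothetical extension of the illustrative statusWeights table above.
// With an assumed blocking threshold of 10, a weight of 0.01 means a host
// would be blocked only after about 1000 responses with status 401.
statusWeights[401] = 0.01;
```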
We should go through potential attacks that the heuristics can block and try to mitigate them.
The issue was originally a part of #62.