
Report duration of failed retrievals #44

Open · 1 of 3 tasks · Tracked by #54
bajtos opened this issue Nov 29, 2023 · 2 comments

bajtos commented Nov 29, 2023

At the moment, most retrievals (>99.99%) fail. We are not measuring the duration of failed retrievals, and therefore we don't know how many tasks an honest checker node can complete every round.

Let's start collecting that data.

  • Modify spark checker to report duration for failed retrievals too.

    Note: we should already be collecting this data, but apparently some measurements come with an invalid end_at value. See Report retrieval network errors #43 (comment)

    Let's ensure end_at is always set correctly.

  • Modify spark-evaluate to produce two retrieval duration stats: the duration of successful requests and the duration of all requests (see the sketch after this list).

  • Figure out how to handle measurements with end_at set to Date(0) - they are clearly invalid, but why are we receiving them? Are they produced by fraudulent nodes?
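For illustration, here is a minimal sketch (not the actual spark-evaluate code) of how the two duration stats could be derived. The measurement field names start_at, end_at, and retrievalResult are assumptions, and measurements whose end_at is the Unix epoch are filtered out as invalid:

    // Sketch only: aggregate retrieval durations from a list of measurements.
    // Field names (`start_at`, `end_at`, `retrievalResult`) are assumptions.
    const durationInMs = (m) =>
      new Date(m.end_at).getTime() - new Date(m.start_at).getTime()

    const mean = (values) =>
      values.length ? values.reduce((a, b) => a + b, 0) / values.length : null

    export function buildDurationStats (measurements) {
      // Drop measurements with an invalid end_at (e.g. Date(0)).
      const valid = measurements.filter((m) => new Date(m.end_at).getTime() > 0)
      const successful = valid.filter((m) => m.retrievalResult === 'OK')
      return {
        successfulRequestDurationMs: mean(successful.map(durationInMs)),
        allRequestDurationMs: mean(valid.map(durationInMs))
      }
    }

Reporting a stat over all requests alongside the successful-only stat is what makes it possible to estimate how many tasks a checker can complete per round even while the failure rate is above 99.99%.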


bajtos commented Nov 29, 2023

Oh! We don't have any catch block; that's why we are not setting end_at. And I am guessing that PG or node-pg converts null to Date(0).

spark/lib/spark.js, lines 97 to 99 at a609849:

    } finally {
      clearTimeout(timeout)
    }
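A minimal sketch of one possible fix, not the actual patch: add a catch block and record end_at in finally, so it is set on both the success and failure paths. The retrieve and mapErrorToResult helpers and the stats.endAt field are hypothetical names used only for illustration:

    try {
      await retrieve(stats) // hypothetical: performs the retrieval
    } catch (err) {
      // Classify the failure so the measurement can still be submitted.
      stats.retrievalResult = mapErrorToResult(err) // hypothetical helper
    } finally {
      // Always record when the attempt ended, even when it failed.
      stats.endAt = new Date()
      clearTimeout(timeout)
    }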

See also #43 (comment)


bajtos commented Nov 30, 2023

Figure out how to handle measurements with end_at set to Date(0) - they are clearly invalid, but why are we receiving them? Are they produced by fraudulent nodes?

This will be fixed by filecoin-station/spark-api#160
