
[DataCap Application] <zinc15 > <2024-06-21T05:04:25.948Z> #39

Open
martapiekarska opened this issue Jun 21, 2024 · 37 comments

@martapiekarska
Contributor

martapiekarska commented Jun 21, 2024

Version

2024-06-21T05:04:25.948Z

DataCap Applicant

@dos2un1x

Data Owner Name

zinc15

Data Owner Country/Region

Life Science / Healthcare

Website

zinc15.docking.org

Social Media Handle

https://registry.opendata.aws/zinc15/

Social Media Type

Slack

What is your role related to the dataset

Other

Total amount of DataCap being requested

5PiB

Expected size of single dataset (one copy)

989TiB

Number of replicas to store

4

Weekly allocation of DataCap requested

1PiB

On-chain address for first allocation

f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Identifier

Share a brief history of your project and organization

Welcome to ZINC15, a research tool for ligand discovery, chemical biology and pharmacology. We don't believe documentation should be necessary. Our goal is to make ZINC so blindingly obvious to use that it requires none.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

3D models for molecular docking screens.

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

If you are a data preparer. What is your location (Country/Region)

Singapore

If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?

If you are not preparing the data, who will prepare the data? (Provide name and business)

Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.

Please share a sample of the data

aws s3 ls --no-sign-request s3://zinc3d/ --recursive --human-readable --summarize | grep Total
Total Objects: 5840977
Total Size: 989.7 TiB
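As a side note, the summary lines above can be checked programmatically. A small sketch, assuming the `aws s3 ls --summarize` output format shown; `parse_s3_summary` is a hypothetical helper, not part of the application:

```python
import re

def parse_s3_summary(output: str) -> dict:
    """Extract object count and total size from `aws s3 ls --summarize` output."""
    objects = int(re.search(r"Total Objects:\s*(\d+)", output).group(1))
    size = re.search(r"Total Size:\s*([\d.]+)\s*(\w+)", output)
    return {
        "total_objects": objects,
        "total_size": float(size.group(1)),
        "size_unit": size.group(2),
    }

sample = "Total Objects: 5840977\nTotal Size: 989.7 TiB"
print(parse_s3_summary(sample))
# {'total_objects': 5840977, 'total_size': 989.7, 'size_unit': 'TiB'}
```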

Confirm that this is a public dataset that can be retrieved by anyone on the Network

Confirm

If you chose not to confirm, what was the reason

What is the expected retrieval frequency for this data

Yearly

For how long do you plan to keep this dataset stored on Filecoin

1 to 1.5 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, Africa, North America, South America, Europe, Australia (continent), Antarctica

How will you be distributing your data to storage providers

Cloud storage (i.e. S3)

How did you find your storage providers

Slack, Partners, Others

If you answered "Others" in the previous question, what is the tool or platform you used

Please list the provider IDs and location of the storage providers you will be working with.

1. f02837226 (UK)
2. f02864300 (US)
3. f02894286 (KR)
4. f02870401 (ID)

How do you plan to make deals to your storage providers

Boost client

If you answered "Others/custom tool" in the previous question, enter the details here

Can you confirm that you will follow the Fil+ guideline

Yes

Contributor

datacap-bot bot commented Jun 21, 2024

Application is waiting for allocator review

@kevzak
Contributor

kevzak commented Jun 21, 2024

Hello @dos2un1x

Can you list the SP entity names for each SP?
1. f02837226 (UK)
2. f02864300 (US)
3. f02894286 (KR)
4. f02870401 (ID)

These SPs are not prepared to provide retrievals. Please confirm or choose other SPs.

Contributor

datacap-bot bot commented Jun 21, 2024

@kevzak
Contributor

kevzak commented Jun 21, 2024

Also, we're asking you to complete a Gitcoin KYC check above, @dos2un1x.

The other KYC option is

  • Go to filplus.storage
  • Log in with your GitHub account (top right corner)
  • Click on your profile icon (next to the bell icon)
  • Choose "Confirm Identity"
  • Scroll to about the middle of the page and follow the external link to the third-party check.
  • You will need a mobile phone and an ID.

Contributor

datacap-bot bot commented Jun 25, 2024

KYC completed for client address f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy with Optimism address 0xA2011DC10c4eB5db5600ccC91AB40da093f69897 and passport score 30.

@dos2un1x

@kevzak Dear Notary, our KYC has been completed. I need your signature. Thank you!

@kevzak
Contributor

kevzak commented Jun 27, 2024

Hi @dos2un1x - KYC is confirmed.

Please reply to this request so we can begin: #39 (comment)

You need to confirm SPs that meet retrieval requirements upfront. Thanks.

@dos2un1x

> Hi @dos2un1x - KYC is confirmed.
>
> Please reply to this request so we can begin: #39 (comment)
>
> You need to confirm SPs that meet retrieval requirements upfront. Thanks.

I have discussed this issue with SPs, and they support Graphsync/HTTP retrieval mode.
We will also add new SPs later to ensure data distribution.

@kevzak
Contributor

kevzak commented Jul 1, 2024

Please provide SP entity information @dos2un1x

#39 (comment)

@dos2un1x

dos2un1x commented Jul 4, 2024

1. f02837226 (UK): Jerry, [email protected], kinghash
2. f02864300 (US): miaozi, [email protected], chainup
3. f02894286 (KR): Lee, [email protected], HS88
4. f02870401 (ID): akcd4040, [email protected], bitwind

Contributor

datacap-bot bot commented Jul 7, 2024

Datacap Request Trigger

Total DataCap requested

5PiB

Expected weekly DataCap usage rate

DataCap Amount - First Tranche

50TiB

Client address

f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy

Contributor

datacap-bot bot commented Jul 7, 2024

DataCap Allocation requested

Multisig Notary address

Client address

f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy

DataCap allocation requested

50TiB

Id

5c52eb6e-c375-4972-94de-f369ec4de17a

Contributor

datacap-bot bot commented Jul 7, 2024

Application is ready to sign

Contributor

datacap-bot bot commented Jul 7, 2024

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacedwrmmlacqd33224ykbcdggz7sum7puvfkci7y5ke4mjqkixwju44

Address

f1deu3dtpvqphtovt2rzwppla2ikkzx334lhtpfpy

Datacap Allocated

50TiB

Signer Address

f1v24knjbqv5p6qrmfjj5xmlaoddzqnon2oxkzkyq

Id

5c52eb6e-c375-4972-94de-f369ec4de17a

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedwrmmlacqd33224ykbcdggz7sum7puvfkci7y5ke4mjqkixwju44

Contributor

datacap-bot bot commented Jul 7, 2024

Application is Granted

@kevzak
Contributor

kevzak commented Aug 6, 2024

checker:manualTrigger

Contributor

datacap-bot bot commented Aug 6, 2024

Issue information change request has been approved.

Contributor

datacap-bot bot commented Aug 6, 2024

DataCap and CID Checker Report Summary [1]

Storage Provider Distribution

✔️ Storage provider distribution looks healthy.

Deal Data Replication

✔️ Data replication looks healthy.

Deal Data Shared with other Clients [2]

✔️ No CID sharing has been observed.

Full report

Click here to view the CID Checker report.

Footnotes

  1. To manually trigger this report, add a comment with text checker:manualTrigger

  2. To manually trigger this report with deals from other related addresses, add a comment with text checker:manualTrigger <other_address_1> <other_address_2> ...

@kevzak
Contributor

kevzak commented Aug 6, 2024

@dos2un1x - we've implemented some new data preparation questions for open datasets. Please take some time to answer these for us. Thanks.

1: How the data is being transformed into deals for Filecoin
What is the transformation process from the files available for download to what will be stored on Filecoin?
When we sample your deals, how will we be able to confirm that the data came from the dataset?
Given a 32GB payload, what steps can an independent entity take to confirm it comes from the relevant upstream dataset?

2: How the data is made available
When a deal is sampled for verification, how will we be able to confirm that it is part of this dataset? (How is it chunked into CAR files?)
We want to see how a client could make use of this dataset; can you share the documentation?
This could be a client script for how to iterate through / process the data.
This could be a website allowing browsing / identification of specific pieces of data from the dataset as stored.
This could be identification of clients making use of the data.
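For illustration only, and not the applicant's actual method: one way to answer the sampling question is to publish a SHA-256 manifest alongside the dataset, so an independent party can hash the files recovered from a sampled 32 GiB payload and check them against it. A minimal Python sketch; all file names and helpers here are hypothetical:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def build_manifest(files: dict) -> dict:
    """Map each file path to the SHA-256 of its contents (built from the upstream dataset)."""
    return {path: sha256_of(data) for path, data in files.items()}

def verify_payload(payload_files: dict, manifest: dict) -> bool:
    """Every file recovered from a sampled deal must appear in, and match, the manifest."""
    return all(
        path in manifest and manifest[path] == sha256_of(data)
        for path, data in payload_files.items()
    )

# Toy data standing in for files extracted from a CAR payload.
upstream = {"zinc3d/AA/AAAA.mol2": b"molecule-a", "zinc3d/AB/AABB.mol2": b"molecule-b"}
manifest = build_manifest(upstream)
print(verify_payload({"zinc3d/AA/AAAA.mol2": b"molecule-a"}, manifest))  # True
print(verify_payload({"zinc3d/AA/AAAA.mol2": b"tampered"}, manifest))    # False
```

Any scheme that lets a third party map a deal payload back to upstream files would do; the manifest approach is just one concrete possibility.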

datacap-bot bot mentioned this issue Aug 7, 2024
Contributor

datacap-bot bot commented Aug 7, 2024

Issue has been modified. Changes below:

(OLD vs NEW)

Weekly Allocation: 1PiB vs
State: ChangesRequested vs Granted

Contributor

datacap-bot bot commented Aug 9, 2024

Issue information change request has been approved.

Contributor

datacap-bot bot commented Aug 9, 2024

Client used 75% of the allocated DataCap. Consider allocating next tranche.

@dos2un1x

  1. You can use `aws s3 sync` to download the corresponding data set from https://registry.opendata.aws/zinc15/.
  2. We archive the downloaded data set and generate the CAR files with the boost tool according to reasonable rules.
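For context on what the requested "reasonable rules" might need to spell out, here is a hypothetical illustration, not the applicant's actual tooling: a deterministic packing of files, sorted by path, into size-bounded batches that would each become one CAR file. Sorting makes the layout reproducible by third parties.

```python
def pack_into_batches(file_sizes: dict, max_batch: int = 32 * 2**30) -> list:
    """Deterministically group files (sorted by path) into batches of at most
    max_batch bytes; each batch would then be packed into one CAR file."""
    batches, current, used = [], [], 0
    for path in sorted(file_sizes):
        size = file_sizes[path]
        if current and used + size > max_batch:
            batches.append(current)
            current, used = [], 0
        current.append(path)
        used += size
    if current:
        batches.append(current)
    return batches

# Toy sizes (bytes) with a 100-byte batch limit for illustration.
sizes = {"b.mol2": 60, "a.mol2": 50, "c.mol2": 30}
print(pack_into_batches(sizes, max_batch=100))
# [['a.mol2'], ['b.mol2', 'c.mol2']]
```

Publishing rules like these (sort order, batch size, archive format) is what would let anyone reconstruct which upstream files landed in which deal.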

@willscott
Contributor

The boost tool does not generate a CAR file; it makes a deal with an existing CAR file. We are asking you to explain these "reasonable rules".

@dos2un1x

Just because the generate-car parameter has been removed from the new version of boostx doesn't mean I can't use another version, right?

@willscott
Contributor

"the boost tool according to reasonable rules" does not provide enough detail for us to understand the transformation you are proposing to perform. boostx is a different tool that you are now specifying somewhat more precisely, but you still have not provided enough information for anyone else to reconstruct the original data set from the deals you make to Filecoin.

@dos2un1x

I think you're taking the term boost tool too seriously!

@willscott
Contributor

indeed - I'm asking you to actually provide the details that allow for technical replication of how the open data set will be stored to Filecoin. This was always the intention behind public data sets being stored to Filecoin, and it is now being enforced, at least by this DataCap pathway.

@dos2un1x

Why provide technical details? Why not in other issues?

@dos2un1x

Do I have to share my code with you?

@martapiekarska
Contributor Author

martapiekarska commented Sep 18, 2024

Since June/July of this year we have been trying to raise the standards on data prep for our clients. All of our clients are now asked to share details of how they prepare data so that we can spot-check it. It is important that anyone in the community is able to make use of the data set, and in order to do that, they will need to reconstruct it.

@dos2un1x

I think you can provide a standard for healthy community development.
There won't be so many disputes when everyone meets the same criteria.

@martapiekarska
Contributor Author

Hi @dos2un1x, please review our policies at https://github.com/fidlabs/Open-Data-Pathway/wiki and let me know if you have further questions.

When you are ready to share the data prep plan, or to connect us with the team doing it for you, and to ensure that your data is available for download as an open data set, please let us know.

@dampud

dampud commented Feb 5, 2025

@dos2un1x Please let us know if you intend to continue the application.
