This repository has been archived by the owner on Jul 18, 2024. It is now read-only.

[DataCap Application] <Venus Team > - <DealAccelerator 4> #1726

Closed
1 of 2 tasks
Joss-Hua opened this issue Mar 5, 2023 · 107 comments

Comments

@Joss-Hua

Joss-Hua commented Mar 5, 2023

Data Owner Name

Venus team

Data Owner Country/Region

Hong Kong

Data Owner Industry

Life Science / Healthcare

Website

https://venushub.io

Social Media

https://linktr.ee/filecoinvenus

Total amount of DataCap being requested

5PiB

Weekly allocation of DataCap requested

200TiB
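At these rates, a quick sanity check (binary units, 1 PiB = 1024 TiB) shows the 5 PiB total would be consumed in roughly six months:

```python
# Sanity check on the figures above (binary units: 1 PiB = 1024 TiB).
total_tib = 5 * 1024        # 5 PiB expressed in TiB
weekly_tib = 200            # requested weekly allocation
weeks = total_tib / weekly_tib
print(weeks)                # 25.6 weeks, roughly six months
```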

On-chain address for first allocation

f1fpup5kk6ibs2u37vyhmcfwpzkr4axs2mtc534ga

Custom multisig

  • Use Custom Multisig

Identifier

No response

Share a brief history of your project and organization

The team: Venus team

The Venus team, established in 2020 in Shanghai, China, leads the development and practice of Filecoin Venus. Today we have dozens of members from around the world focused on core development and community building.

Through core development and ecosystem activities and programs, we aim (and have already begun) to bring more storage providers, users, and enthusiasts into Filecoin, and to help them contribute more once they join.


The project: Venus Deal Accelerator (https://venushub.io/accelerator/)

Venus is committed to offering a fully functional deal-making experience for both storage clients and storage providers at scale. As the Filecoin network grows and the community shifts its focus from committed-capacity growth toward storage-deal-weighted growth, the Venus community is taking on the challenge of helping shape this vision with the Venus Deal Accelerator program.

The goal of the Venus Deal Accelerator program is to distribute as many storage deals as possible to the broader storage provider community, with a focus on seamlessly bridging the sealing experience that storage providers already know to the Filecoin deal-taking experience. The Venus Deal Accelerator program will be responsible for applying for large DataCap allocations against approved open datasets from the Fil+ program and distributing storage deals to participants running Venus storage systems.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

The data stored on Filecoin comes from publicly available datasets in biology and the life sciences.

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

How do you plan to prepare the dataset

graphsplit

If you answered "other/custom tool" in the previous question, enter the details here

No response
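The one-word answer above refers to graphsplit (filedrive-team/go-graphsplit), which chunks a dataset into CAR files sized to fit Filecoin sectors. A hedged sketch of an invocation follows; the flag names are assumed from the project's README, and the dataset path and graph name are illustrative, not from this application:

```shell
# Sketch only: flag names assumed from go-graphsplit's README; paths are illustrative.
# Target ~17 GiB slices so each CAR fits comfortably in a 32 GiB sector after padding.
SLICE_SIZE=$((17 * 1024 * 1024 * 1024))   # 18253611008 bytes
echo "graphsplit chunk \
  --slice-size=${SLICE_SIZE} \
  --parallel=4 \
  --graph-name=physionet \
  --car-dir=./out-car \
  --parent-path=./datasets ./datasets/physionet-pds"
```

Each resulting CAR file would then be proposed to a storage provider as a separate deal.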

Please share a sample of the data

https://registry.opendata.aws/vitaldb/
https://registry.opendata.aws/physionet/
s3://physionet-pds/
https://registry.opendata.aws/gatk-sv-data/
s3://gatk-sv-data-us-east-2/
https://registry.opendata.aws/hecatomb/
s3://hecatombdatabases/
https://registry.opendata.aws/nasa-psi/
s3://nasa-psi/

Confirm that this is a public dataset that can be retrieved by anyone on the Network

  • I confirm

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Monthly

For how long do you plan to keep this dataset stored on Filecoin

1.5 to 2 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, North America, Europe

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), HTTP or FTP server, Shipping hard drives, Others

How do you plan to choose storage providers

Slack, Filmine, Big data exchange, Others

If you answered "Others" in the previous question, what is the tool or platform you plan to use

We define VenusHub as a platform for community projects; the Deal Accelerator mentioned here is one of them. The Deal Accelerator is aimed at storage providers who store real data. We screen SPs through their applications and our screening rules. They come from the Filecoin community, so there is no conflict of interest.

If you already have a list of storage providers to work with, fill out their names and provider IDs below

This is a partial list; we are still expanding the pool of available real-data storage providers.
https://github.com/data-preservation-programs/filplus-checker-assets/tree/main/filecoin-project/filecoin-plus-large-datasets/issues/345
https://github.com/data-preservation-programs/filplus-checker-assets/tree/main/filecoin-project/filecoin-plus-large-datasets/issues/1444
We will focus on onboarding more new SPs (miners) and hope that more miners will provide real-data storage for clients.

How do you plan to make deals to your storage providers

Others/custom tool

If you answered "Others/custom tool" in the previous question, enter the details here

venus-market (Droplet). Its support for real-data storage is very mature.

Can you confirm that you will follow the Fil+ guideline

Yes

@large-datacap-requests

Thanks for your request!

Heads up, you’re requesting more than the typical weekly onboarding rate of DataCap!

@large-datacap-requests

Thanks for your request!
Everything looks good. 👌

A Governance Team member will review the information provided and contact you back pretty soon.

@Sunnyiscoming
Collaborator

Datacap Request Trigger

Total DataCap requested

5PiB

Expected weekly DataCap usage rate

200TiB

Client address

f1fpup5kk6ibs2u37vyhmcfwpzkr4axs2mtc534ga

@large-datacap-requests

DataCap Allocation requested

Multisig Notary address

f02049625

Client address

f1fpup5kk6ibs2u37vyhmcfwpzkr4axs2mtc534ga

DataCap allocation requested

100TiB

Id

7cbf2ceb-9171-417f-8b14-b68a6dad7eed

@newwebgroup

The Venus team has extensive prior Fil+ experience and a good track record, and since this is the first round, I am willing to support them in this round.


Request Proposed

Your Datacap Allocation Request has been proposed by the Notary

Message sent to Filecoin Network

bafy2bzacebwlfywgzvtrcbvaynu5g2dji76udvz6e3ujkcwzu44ybtba54jd6

Address

f1fpup5kk6ibs2u37vyhmcfwpzkr4axs2mtc534ga

Datacap Allocated

100.00TiB

Signer Address

f1e77zuityhvvw6u2t6tb5qlnsegy2s67qs4lbbbq

Id

7cbf2ceb-9171-417f-8b14-b68a6dad7eed

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacebwlfywgzvtrcbvaynu5g2dji76udvz6e3ujkcwzu44ybtba54jd6

@sxxfuture-official

The Venus team is a trustworthy team. After checking the disclosed information, it is a public data set, and the volume of the data meets the requirements. I will support this round.


Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacebmklplckkc2pcmmh4rlmn5qrwtxveukq46sakhe7dx6zitm5fbh6

Address

f1fpup5kk6ibs2u37vyhmcfwpzkr4axs2mtc534ga

Datacap Allocated

100.00TiB

Signer Address

f1foiomqlmoshpuxm6aie4xysffqezkjnokgwcecq

Id

7cbf2ceb-9171-417f-8b14-b68a6dad7eed

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacebmklplckkc2pcmmh4rlmn5qrwtxveukq46sakhe7dx6zitm5fbh6

@data-programs added the `kyc verified` label (User has passed KYC check) on Sep 4, 2023
@Joss-Hua
Author

Joss-Hua commented Sep 4, 2023

> @Joss-Hua Please note this. I can support this round. ⚠️ 1 storage provider sealed too much duplicate data - f07830: 22.86%

got it

@kernelogic

OP has explained to me the duplicate data is caused by deal status tracking and scheduling. Seeing this is a very large series, it can sometimes happen. I'll allow it.


Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzaceawull3rbmyqyozvx6bn3jlcuhcq63pno2wbvpofyiheu7fmlhv6w

Address

f1fpup5kk6ibs2u37vyhmcfwpzkr4axs2mtc534ga

Datacap Allocated

422.55TiB

Signer Address

f1yjhnsoga2ccnepb7t3p3ov5fzom3syhsuinxexa

Id

e4eefb4d-1039-4339-9aa1-68ab8e26cf48

You can check the status of the message here: https://filfox.info/en/message/bafy2bzaceawull3rbmyqyozvx6bn3jlcuhcq63pno2wbvpofyiheu7fmlhv6w

@Joss-Hua
Author

Joss-Hua commented Sep 5, 2023

> OP has explained to me the duplicate data is caused by deal status tracking and scheduling. Seeing this is a very large series, it can sometimes happen. I'll allow it.

Thanks for understanding

@large-datacap-requests

DataCap Allocation requested

Request number 9

Multisig Notary address

f02049625

Client address

f1fpup5kk6ibs2u37vyhmcfwpzkr4axs2mtc534ga

DataCap allocation requested

10.23GiB

Id

4e317cb5-cc98-4ef6-b2f3-5c397eb5d6b9

@large-datacap-requests

Stats & Info for DataCap Allocation

Multisig Notary address

f02049625

Client address

f1fpup5kk6ibs2u37vyhmcfwpzkr4axs2mtc534ga

Rule to calculate the allocation request amount

400% of weekly dc amount requested

DataCap allocation requested

10.23GiB

Total DataCap granted for client so far

3.8430698623415174e+96YiB

Datacap to be granted to reach the total amount requested by the client (5PiB)

3.8430698623415174e+96YiB

Stats

| Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
| ---: | ---: | ---: | ---: | ---: |
| 160845 | 33 | 422.55TiB | 7.6 | 157.44TiB |
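For reference, the allocation rule quoted above works out as follows; note that the bot's own figures in this comment (10.23GiB requested, 3.84e+96 YiB granted) appear to be an accounting glitch rather than the rule's output:

```python
# The tranche rule quoted above: 400% of the requested weekly DataCap rate.
weekly_tib = 200                 # "200TiB" weekly rate from the application
tranche_tib = weekly_tib * 4     # "400% of weekly dc amount requested"
print(tranche_tib)               # 800 TiB per tranche under the stated rule
```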

@github-actions

This application has not seen any responses in the last 10 days. It will be marked with the Stale label and closed in 4 days. Comment if you want to keep this application open.

--
Commented by Stale Bot.

@github-actions

This application has not seen any responses in the last 14 days, so for now it is being closed. Please feel free to contact the Fil+ Gov team to re-open the application if it is still being processed. Thank you!

--
Commented by Stale Bot.
