feat: support writing multiple dataframes/objects to the same pin #311
Comments
Thank you for the report! I know something like this is available on the R side (e.g., by writing a list of dataframes with the `rds` type). Does that match what you would expect in this scenario, at least partially?
I think that would work! Thanks @isabelizimm
I second the request for multi-file pins. I also have a workaround for the single-file limitation in Python. It's also worth noting that if you write a multi-file pin from R and attempt to download it from Python, you run into rough edges.
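One plausible workaround of this kind (a sketch only, not necessarily the commenter's approach; it assumes the `joblib` pin type and uses `pins.board_temp()` for illustration, with a Connect board being the real target in the scenario above) is to bundle related dataframes into a single serialized object:

```python
import pandas as pd
import pins

# A temporary board for illustration; in the Connect use case this would be
# a Connect board instead (e.g., pins.board_connect()).
board = pins.board_temp()

sales = pd.DataFrame({"region": ["east", "west"], "revenue": [100, 200]})
customers = pd.DataFrame({"id": [1, 2], "name": ["a", "b"]})

# Bundle the related dataframes into one picklable object so that a single
# pin (and therefore a single set of ACLs on Connect) covers both of them.
board.pin_write(
    {"sales": sales, "customers": customers},
    name="related-frames",
    type="joblib",
)

# Reading the pin back returns the dict, from which each dataframe is recovered.
bundle = board.pin_read("related-frames")
assert bundle["sales"].equals(sales)
```

The trade-off is that the bundled object is opaque to Connect and not readable from R, which is part of the motivation for first-class multi-file support.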
I'm going to break this out into a separate issue for tracking purposes. Thank you for the feedback on use cases and rough edges!
I spoke to some pins users at posit::conf who are interested in the ability to read/write multiple dataframes to the same pin. The primary use case for this is when using `board_connect`. The ACL controls imposed by Connect mean that if a user wants to store more than one related dataframe on Connect, they must use multiple pins. This is cumbersome because they must also maintain ACLs for multiple content items. My recommendation for now is to use groups in Connect and update group membership, but YMMV depending on the configured auth provider in Connect.

I'm not that familiar with how pins stores data, but my guess is that some of this is already possible when using the `json` storage type for Python, or the `json`/`rds` types for R, though the user would need to combine their dataframes first.

It would be nice if pins supported APIs for writing multiple dataframes/objects to the same pin. I'm envisioning something like this:
This would store 2 separate parquet files (or CSVs) under the hood, one for each dataframe.