upload images to local storage #182
Conversation
@dantownsend When you have time, can you check if this example of an S3 adapter for media upload works? It should work.

```python
# in example.py
import typing as t
from dataclasses import dataclass

import aiobotocore  # in newer aiobotocore: from aiobotocore.session import get_session


@dataclass
class S3MediaHandler:
    aws_access_key_id: str
    aws_secret_access_key: str
    region: str
    bucket: str
    key: str

    async def upload(self, request, file):
        session = aiobotocore.get_session()
        async with session.create_client(
            "s3",
            region_name=self.region,
            aws_secret_access_key=self.aws_secret_access_key,
            aws_access_key_id=self.aws_access_key_id,
        ) as client:
            await client.put_object(
                ACL="public-read", Bucket=self.bucket, Key=self.key, Body=file
            )
            return {
                "image": (
                    f"https://{self.bucket}.s3.{self.region}.amazonaws.com/{self.key}"
                ),
            }


media_handler: t.Any = S3MediaHandler(...)
```
We need to be careful with this:

```python
await client.put_object(
    ACL="public-read", Bucket=self.bucket, Key=self.key, Body=file
)
return {
    "image": (
        f"https://{self.bucket}.s3.{self.region}.amazonaws.com/{self.key}"
    ),
}
```

The URL is hardcoded to AWS, but it should be possible to use boto with any S3-compatible storage. I would just return the key, rather than the full URL. We then need an endpoint which gets the URL for the image; for S3 it needs to create a signed URL. If we just store the image in a public bucket, it won't be secure.

So in Piccolo Admin, when we load a column which has media storage enabled, we get the key for the image, and then fetch a signed URL on demand for that image. You get a signed URL doing something like this (I lifted this code from one of my projects):

```python
response = s3_client.generate_presigned_url(
    ClientMethod="get_object",
    Params={"Bucket": BUCKET_NAME, "Key": image_name},
    ExpiresIn=3600,
)
```

I hope that makes sense.
@dantownsend I agree with you, and it should be changed, but this is just an example of a media handler that the user would have written themselves (as we said before, users can write their own adapter to work with whatever storage provider they want). I'm mostly interested in whether that handler works with the
I see. Yeah, it looks like it should work. Have you ever tried this? http://docs.getmoto.org/en/latest/docs/server_mode.html#run-using-docker It's a Docker container you can run which acts as a fake S3.
@dantownsend Sorry for the slow response, but I was unable to test the
@sinisaos No worries - thanks!
@dantownsend I changed the adapter to this one.

```python
import typing as t
from dataclasses import dataclass

import boto3
from botocore.exceptions import ClientError


@dataclass
class S3MediaHandler:
    aws_access_key_id: str
    aws_secret_access_key: str
    bucket: str
    key: str
    region: str

    def upload(self, request, file):
        s3_client = boto3.client(
            "s3",
            aws_access_key_id=self.aws_access_key_id,
            aws_secret_access_key=self.aws_secret_access_key,
            region_name=self.region,
        )
        # Make sure the bucket exists.
        s3_client.create_bucket(Bucket=self.bucket)
        # Upload the file
        try:
            s3_client.upload_fileobj(
                file.file,
                self.bucket,
                self.key,
            )
        except ClientError as e:
            print(e)  # or logging
        # Generate a presigned URL for the object we just uploaded,
        # using the same key.
        response = s3_client.generate_presigned_url(
            ClientMethod="get_object",
            Params={"Bucket": self.bucket, "Key": self.key},
            ExpiresIn=3600,
        )
        return {"image": response}


media_handler: t.Any = S3MediaHandler(...)
```

I hope this works, but you'll have to try it yourself. To be honest, I've been struggling with the moto library (and I don't know if I've used it well at all) and I've used the
@sinisaos I really struggled using moto at first. It took me ages to work it out. I hoped that the Docker container would give a simpler experience: you could just point your code at it, and it would act like a local version of S3. Don't worry too much about S3 for now - we can always add it later. Just local storage will be super useful.
@dantownsend Ok. Upload to local storage (the code is in this PR) is ready and everything works well. If you have time, can you try it?
Codecov Report
```diff
@@            Coverage Diff             @@
##           master     #182      +/-   ##
==========================================
+ Coverage   96.01%   96.21%   +0.19%
==========================================
  Files           5        5
  Lines         251      264      +13
==========================================
+ Hits          241      254      +13
  Misses         10       10
```
```rst
Usage
-----

Piccolo admin uses the Piccolo ORM `Array <https://piccolo-orm.readthedocs.io/en/latest/piccolo/schema/column_types.html#array>`_
```
I'm not 100% sure about it having to be an `Array` column. I can imagine situations where someone wants an array, but I can also imagine situations where someone wants to use a `Varchar`, as they only want a single image to be uploaded.
The input type `file` is set to upload only one file, so the user can easily upload only one image. For uploading multiple files the `Array` column is ideal (we already have an `ArrayWidget` for adding and removing fields, which is very useful), so this solves uploading either a single file or multiple files in one solution.
Yeah, in some situations though you want to make sure the user can only upload one file.
Imagine you're building a user profile table, and you have a field called 'profile_image'. If there are multiple images stored, it will be confusing - which is the genuine one?
I made two videos to show what I mean.
Single file upload
single_file.mp4
Multiple files upload
multiple_file.mp4
Of course you can do it however you want, but I think this is a simple and efficient solution, because we actually upload only one file each time.
piccolo_admin/example.py (Outdated)

```diff
@@ -180,6 +189,22 @@ async def booking_endpoint(request, data):
     return "Booking complete"


 @dataclass
 class LocalMediaHandler:
```
I would make this a standard Python class, and have it inherit from a base class called something like `MediaHandler`:

```python
class MediaHandler(abc.ABC):
    @abc.abstractmethod
    def upload(self, request, file):
        pass
```

Then it's more obvious what a user needs to implement if they create their own handler.
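To make that suggestion concrete, here is a self-contained sketch of the base class with a dummy subclass. The `DummyMediaHandler` and its `media_path` parameter are purely illustrative, not part of the PR:

```python
import abc


class MediaHandler(abc.ABC):
    """Base class that custom storage handlers inherit from."""

    @abc.abstractmethod
    def upload(self, request, file):
        """Store ``file`` somewhere and return a dict describing it."""


class DummyMediaHandler(MediaHandler):
    """Illustrative subclass - builds a path instead of really storing anything."""

    def __init__(self, media_path: str):
        self.media_path = media_path

    def upload(self, request, file):
        # `file.filename` follows Starlette's UploadFile attribute -
        # an assumption here.
        return {"image": f"{self.media_path}/{file.filename}"}
```

A subclass that forgets to implement `upload` raises `TypeError` at instantiation time rather than failing on first use, which is the advantage over plain duck typing.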
This looks cool. I've got quite a few ideas about this - do you want me to make a branch and have a play around?
Of course. That would be great.
Changes were merged in via this PR: #189
Related to #181 and #109. For now, the upload goes only to a local static folder.