
upload images to local storage #182

Closed
sinisaos wants to merge 9 commits into master from image_upload

Conversation

sinisaos (Member)

Related to #181 and #109. For now, the upload is only in a local static folder.

@sinisaos (Member Author)

@dantownsend When you have time, can you check whether this example of an S3 adapter for media upload works? I think it should.

# in example.py
import typing as t
from dataclasses import dataclass

from aiobotocore.session import get_session  # aiobotocore >= 2.0

@dataclass
class S3MediaHandler:
    aws_access_key_id: str
    aws_secret_access_key: str
    region: str
    bucket: str
    key: str

    async def upload(self, request, file):
        session = get_session()
        async with session.create_client(
            "s3",
            region_name=self.region,
            aws_secret_access_key=self.aws_secret_access_key,
            aws_access_key_id=self.aws_access_key_id,
        ) as client:
            await client.put_object(
                ACL="public-read", Bucket=self.bucket, Key=self.key, Body=file
            )
            return {
                "image": (
                    f"https://{self.bucket}.s3.{self.region}.amazonaws.com/{self.key}"
                ),
            }

media_handler: t.Any = S3MediaHandler(...)

@dantownsend (Member)

We need to be careful with this:

            await client.put_object(
                ACL="public-read", Bucket=self.bucket, Key=self.key, Body=file
            )
            return {
                "image": (
                    f"https://{self.bucket}.s3.{self.region}.amazonaws.com/{self.key}"
                ),
            }

The URL is hardcoded to AWS, but it should be possible to use boto with any S3 compatible storage. I would just return the key, rather than the full URL.

We then need an endpoint which gets the URL for the image - for S3 it needs to create a signed URL. If we just store the image in a public bucket, it won't be secure. Something like:

GET /api/media/some-file-key

{
    'url': 'https://some-signed-url.com/foo/'
}

So in Piccolo Admin, when we load a column which has media storage enabled, we get the key for the image, and then fetch a signed URL on demand for that image.

You get a signed URL doing something like this (I lifted this code from one of my projects):

response = s3_client.generate_presigned_url(
    ClientMethod="get_object",
    Params={"Bucket": BUCKET_NAME, "Key": image_name},
    ExpiresIn=3600,
)

I hope that makes sense.

@sinisaos (Member Author)

@dantownsend I agree with you and it should be changed, but this is just an example of a media handler that users would write themselves (as we said before, users can write their own adapter to work with whatever storage provider they want). I'm mostly interested in whether that handler works with the store_files() method from endpoints.py.

@dantownsend (Member)

I see. Yeah, it looks like it should work.

Have you ever tried this?

http://docs.getmoto.org/en/latest/docs/server_mode.html#run-using-docker
https://hub.docker.com/r/motoserver/moto/tags

It's a Docker container you can run which acts as a fake S3.

@sinisaos (Member Author)

@dantownsend Sorry for the slow response, but I haven't been able to test the /api/media/ endpoint yet. Thanks for the suggestions. I'll try tomorrow.

@dantownsend (Member)

@sinisaos No worries - thanks!

@sinisaos (Member Author)

@dantownsend I changed the adapter to this one.

import typing as t
from dataclasses import dataclass

import boto3
from botocore.exceptions import ClientError

@dataclass
class S3MediaHandler:
    aws_access_key_id: str
    aws_secret_access_key: str
    bucket: str
    key: str
    region: str

    def upload(self, request, file):
        s3_client = boto3.client(
            "s3",
            aws_access_key_id=self.aws_access_key_id,
            aws_secret_access_key=self.aws_secret_access_key,
            region_name=self.region,
        )
        s3_client.create_bucket(Bucket=self.bucket)
        # Upload the file
        try:
            s3_client.upload_fileobj(
                file.file,
                self.bucket,
                self.key,
            )
        except ClientError as e:
            print(e)  # or logging

        # Generate a signed URL for the uploaded object. The key here must
        # match the key used for the upload above.
        url = s3_client.generate_presigned_url(
            ClientMethod="get_object",
            Params={"Bucket": self.bucket, "Key": self.key},
            ExpiresIn=3600,
        )

        return {"image": url}

media_handler: t.Any = S3MediaHandler(...)

I hope this works, but you'll have to try it yourself. To be honest, I've been struggling with the moto library (and I don't know if I'm using it correctly). I used the @mock_s3 decorator to test the /api/media/ endpoint against the moto server, but when I open the response URL in the browser I get NoSuchKey: The specified key does not exist. 1234 (1234 is the image name).

@dantownsend (Member)

@sinisaos I really struggled using moto at first. It took me ages to work it out.

I hoped that using the Docker container would give a simpler experience: you could just point your code at it, and it would act like a local version of S3.

Don't worry too much about S3 for now - we can always add it later. Just local storage will be super useful.

@sinisaos (Member Author)

@dantownsend Ok. Upload to local storage (code is in this PR) is ready and everything works well. If you have time, can you try it?

@codecov-commenter commented Jul 22, 2022

Codecov Report

Merging #182 (1b77f8c) into master (6797d11) will increase coverage by 0.19%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #182      +/-   ##
==========================================
+ Coverage   96.01%   96.21%   +0.19%     
==========================================
  Files           5        5              
  Lines         251      264      +13     
==========================================
+ Hits          241      254      +13     
  Misses         10       10              
Impacted Files Coverage Δ
piccolo_admin/endpoints.py 95.81% <100.00%> (+0.24%) ⬆️


Usage
-----

Piccolo admin uses the Piccolo ORM `Array <https://piccolo-orm.readthedocs.io/en/latest/piccolo/schema/column_types.html#array>`_
Member

I'm not 100% sure about it having to be an Array column. I can imagine situations where someone wants an array, but I can also imagine situations where someone wants to use a Varchar, as they only want a single image to be uploaded.

Member Author

The file input is set to accept a single file, so the user can easily upload just one image. For uploading multiple files, an Array column is ideal (we already have an ArrayWidget for adding and removing fields, which is very useful), so this handles both single and multiple file uploads in one solution.

Member

Yeah, in some situations though you want to make sure the user can only upload one file.

Imagine you're building a user profile table, and you have a field called 'profile_image'. If there are multiple images stored, it will be confusing - which is the genuine one?

Member Author

I made 2 videos to show what I mean.
Single file upload

single_file.mp4

Multiple files upload

multiple_file.mp4

Of course you can do it however you want, but I think this is a simple and efficient solution, because we actually upload only one file at a time.

@@ -180,6 +189,22 @@ async def booking_endpoint(request, data):
return "Booking complete"


@dataclass
class LocalMediaHandler:
dantownsend (Member) Jul 22, 2022

I would make this a standard Python class, and have it inherit from a base class, called something like MediaHandler.

import abc

class MediaHandler(abc.ABC):
    @abc.abstractmethod
    def upload(self, request, file):
        pass

Then it's more obvious what a user needs to implement if they create their own handler.
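A custom handler would then just subclass the base class. A minimal sketch (DiskMediaHandler and its /static/ URL layout are illustrative, not code from this PR; the base class is repeated so the sketch is self-contained):

```python
import abc
import shutil
from pathlib import Path


class MediaHandler(abc.ABC):
    """Base class: custom handlers must implement `upload`."""

    @abc.abstractmethod
    def upload(self, request, file):
        ...


class DiskMediaHandler(MediaHandler):
    """Illustrative handler that saves the upload into a local folder.

    `file` is assumed to be an UploadFile-like object with `.filename`
    and a file-like `.file` attribute, as elsewhere in this thread.
    """

    def __init__(self, media_dir: str):
        self.media_dir = Path(media_dir)
        self.media_dir.mkdir(parents=True, exist_ok=True)

    def upload(self, request, file):
        # Stream the upload to disk and return the URL it will be served from.
        target = self.media_dir / file.filename
        with target.open("wb") as f:
            shutil.copyfileobj(file.file, f)
        return {"image": f"/static/{file.filename}"}
```

Trying to instantiate `MediaHandler` directly raises a `TypeError`, which is exactly the hint a user needs when writing their own adapter.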

@dantownsend (Member)

This looks cool. I've got quite a few ideas about this - do you want me to make a branch and have a play around?

@sinisaos (Member Author)

Of course. That would be great

dantownsend mentioned this pull request Jul 30, 2022
@dantownsend (Member)

Changes were merged in via this PR: #189

@dantownsend dantownsend closed this Aug 5, 2022
@sinisaos sinisaos deleted the image_upload branch August 5, 2022 21:02