When putting an S3Object connecting to S3, the contentEncoding for that object is always "aws-chunked" #5769
Comments
Hi @ngudbhav, thank you for reporting the issue. I tried to reproduce this scenario but found the behavior to be consistent between AWS S3 and LocalStack.

Could you please go through the reproduction steps below and let me know of any deviation that may result in your reported behavior?

pom.xml:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>V2_ContentEncoding_5769</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <aws.sdk.version>2.29.15</aws.sdk.version>
    </properties>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>software.amazon.awssdk</groupId>
                <artifactId>bom</artifactId>
                <version>${aws.sdk.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
            <dependency>
                <groupId>org.apache.logging.log4j</groupId>
                <artifactId>log4j-bom</artifactId>
                <version>2.19.0</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <dependencies>
        <dependency>
            <groupId>software.amazon.awssdk</groupId>
            <artifactId>s3</artifactId>
        </dependency>
        <dependency>
            <groupId>software.amazon.awssdk</groupId>
            <artifactId>s3-transfer-manager</artifactId>
        </dependency>
        <dependency>
            <groupId>software.amazon.awssdk.crt</groupId>
            <artifactId>aws-crt</artifactId>
            <version>0.33.7</version>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-core</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-api</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-slf4j2-impl</artifactId>
        </dependency>
    </dependencies>
</project>
```

1. AWS S3 Behavior: Code snippet

```java
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.core.async.AsyncRequestBody;
import software.amazon.awssdk.crt.Log;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3AsyncClient;
import software.amazon.awssdk.services.s3.S3CrtAsyncClientBuilder;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.transfer.s3.S3TransferManager;
import software.amazon.awssdk.transfer.s3.model.UploadRequest;

import java.nio.charset.StandardCharsets;

public class Main {
    private static final String GZIP_ENCODING = "gzip";
    private static final String REGION = "us-east-1";

    public static void main(String[] args) {
        Log.initLoggingToFile(Log.LogLevel.Trace, "/Users/***/IdeaProjects/V2_ContentEncoding_5769/log.txt");
        String bucket = "<<bucket_name>>";
        String key = "testing-file.txt";
        String contentType = "text/plain";
        String content = "Hello Java SDK!";
        byte[] bytes = content.getBytes(StandardCharsets.UTF_8);
        S3AsyncClient s3Client = buildS3Client();
        S3TransferManager s3TransferManager = S3TransferManager.builder()
                .s3Client(s3Client)
                .build();
        PutObjectRequest putObjectRequest = PutObjectRequest.builder()
                .bucket(bucket)
                .key(key)
                .contentEncoding(GZIP_ENCODING)
                .contentType(contentType)
                .contentLength((long) bytes.length)
                .build();
        UploadRequest uploadRequest = UploadRequest.builder()
                .putObjectRequest(putObjectRequest)
                .requestBody(AsyncRequestBody.fromBytes(bytes))
                .build();
        try {
            s3TransferManager.upload(uploadRequest).completionFuture().join();
            System.out.println("Upload completed successfully");
        } catch (Exception e) {
            System.err.println("Upload failed: " + e.getMessage());
            e.printStackTrace();
        } finally {
            s3TransferManager.close();
            s3Client.close();
        }
    }

    private static S3AsyncClient buildS3Client() {
        S3CrtAsyncClientBuilder builder = S3AsyncClient.crtBuilder()
                .credentialsProvider(DefaultCredentialsProvider.create())
                .region(Region.of(REGION))
                .minimumPartSizeInBytes((long) (8 * 1024 * 1024));
        return builder.build();
    }
}
```

CRT debug log
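As a sanity check (not part of the original repro), the stored object's metadata can be read back right after the upload to see which Content-Encoding S3 actually recorded. A minimal sketch that could be dropped into the main method above; `HeadObjectRequest` and `HeadObjectResponse` come from `software.amazon.awssdk.services.s3.model`:

```java
// Hypothetical verification step: print the Content-Encoding stored with the
// object. A correct upload should report "gzip"; "aws-chunked" appearing in
// the stored metadata would match the reported problem.
HeadObjectResponse head = s3Client.headObject(HeadObjectRequest.builder()
                .bucket(bucket)
                .key(key)
                .build())
        .join();
System.out.println("Stored Content-Encoding: " + head.contentEncoding());
```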
2. LocalStack Behavior: Code snippet

```java
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.core.async.AsyncRequestBody;
import software.amazon.awssdk.crt.Log;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3AsyncClient;
import software.amazon.awssdk.services.s3.S3CrtAsyncClientBuilder;
import software.amazon.awssdk.services.s3.model.CreateBucketRequest;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.transfer.s3.S3TransferManager;
import software.amazon.awssdk.transfer.s3.model.UploadRequest;

import java.net.URI;
import java.nio.charset.StandardCharsets;
import java.util.Optional;

public class Main {
    private static final String GZIP_ENCODING = "gzip";
    private static final String REGION = "us-east-1";

    public static void main(String[] args) {
        Log.initLoggingToFile(Log.LogLevel.Trace, "/Users/bhoradc/IdeaProjects/V2_ContentEncoding_5769/log.txt");
        String bucket = "<<bucket_name>>";
        String key = "testing-file.txt";
        String contentType = "text/plain";
        String content = "Hello Java SDK!";
        byte[] bytes = content.getBytes(StandardCharsets.UTF_8);
        // S3AsyncClient s3Client = buildS3Client();
        S3AsyncClient s3Client = localstackbuildS3Client();
        S3TransferManager s3TransferManager = S3TransferManager.builder()
                .s3Client(s3Client)
                .build();
        s3Client.createBucket(CreateBucketRequest.builder()
                        .bucket(bucket)
                        .build())
                .join();
        System.out.println("Bucket created successfully: " + bucket);
        PutObjectRequest putObjectRequest = PutObjectRequest.builder()
                .bucket(bucket)
                .key(key)
                .contentEncoding(GZIP_ENCODING)
                .contentType(contentType)
                .contentLength((long) bytes.length)
                .build();
        UploadRequest uploadRequest = UploadRequest.builder()
                .putObjectRequest(putObjectRequest)
                .requestBody(AsyncRequestBody.fromBytes(bytes))
                .build();
        try {
            s3TransferManager.upload(uploadRequest).completionFuture().join();
            System.out.println("Upload completed successfully");
        } catch (Exception e) {
            System.err.println("Upload failed: " + e.getMessage());
            e.printStackTrace();
        } finally {
            s3TransferManager.close();
            s3Client.close();
        }
    }

    private static S3AsyncClient buildS3Client() {
        S3CrtAsyncClientBuilder builder = S3AsyncClient.crtBuilder()
                .credentialsProvider(DefaultCredentialsProvider.create())
                .region(Region.of(REGION))
                .minimumPartSizeInBytes((long) (8 * 1024 * 1024));
        return builder.build();
    }

    private static S3AsyncClient localstackbuildS3Client() {
        S3CrtAsyncClientBuilder builder = S3AsyncClient.crtBuilder()
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("test", "test")))
                .region(Region.of(REGION));
        Optional<String> s3Endpoint = getLocalStackEndpoint();
        s3Endpoint.ifPresent(endpoint -> {
            builder.endpointOverride(URI.create(endpoint));
            builder.forcePathStyle(true);
            builder.minimumPartSizeInBytes((long) (8 * 1024 * 1024));
        });
        return builder.build();
    }

    private static Optional<String> getLocalStackEndpoint() {
        return Optional.of("http://localhost:4566");
    }
}
```

CRT debug log
LocalStack version:

```
~ % docker inspect localstack/localstack:3.7.2 | grep -i version
"DockerVersion": "",
"PYTHON_VERSION=3.11.9",
"PYTHON_PIP_VERSION=24.0",
"PYTHON_SETUPTOOLS_VERSION=65.5.1",
"LOCALSTACK_BUILD_VERSION=3.7.2"
```

The only notable difference I see is in the networking setup: your environment is …

Regards,
Hi @bhoradc, I have tried various ways, but downloading the file requires manual decompression. As you can see in the screenshot, even Wireshark displays an error that decompression failed. I have tried using a browser and the Go AWS client, but automatic decompression does not work. However, if I explicitly write code to decompress the GZIP file, I get the expected contents back. I am not sure if the dual headers are the cause of this behaviour.
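For reference, the explicit decompression described above can be sketched as follows; this is a minimal example, assuming the object was stored as gzip-compressed bytes and reusing the `s3Client`, `bucket`, and `key` names from the earlier snippets:

```java
import software.amazon.awssdk.core.ResponseBytes;
import software.amazon.awssdk.core.async.AsyncResponseTransformer;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;

// Download the raw (still-compressed) bytes and gunzip them manually,
// since automatic decompression does not happen here.
ResponseBytes<GetObjectResponse> objectBytes = s3Client.getObject(
                GetObjectRequest.builder().bucket(bucket).key(key).build(),
                AsyncResponseTransformer.toBytes())
        .join();

try (GZIPInputStream gzip = new GZIPInputStream(
        new ByteArrayInputStream(objectBytes.asByteArray()))) {
    System.out.println(new String(gzip.readAllBytes(), StandardCharsets.UTF_8));
}
```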
Describe the bug
s3.txt
This is a re-opened thread from #4746 (comment).
I have attached the packet details of the PUT Object call from the SDK to S3 (LocalStack).

The SDK always sends `Content-Encoding` as `aws-chunked`. This causes the result to fail to decompress. I have tried to explicitly set the `Content-Length` to a sufficiently high number, but in vain. This is only reproducible with LocalStack and not with real AWS.

Regression Issue
Expected Behavior
Content-Encoding should not always be `aws-chunked`.
Current Behavior
Content-Encoding is always `aws-chunked`.
Reproduction Steps
I have used the code below to upload the file.
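(The original snippet is not shown here; the following is a hypothetical reconstruction based on the description and the CRT-based Transfer Manager setup earlier in the thread. The bucket, key, file name, and 8 MiB part size are placeholder assumptions, not values from the original report; imports are as in the snippets above, plus `java.nio.file.Paths`.)

```java
// Hypothetical reconstruction of the reporter's setup; all literal values
// below are placeholders, not the reporter's actual configuration.
S3AsyncClient s3Client = S3AsyncClient.crtBuilder()
        .credentialsProvider(DefaultCredentialsProvider.create())
        .region(Region.US_EAST_1)
        .minimumPartSizeInBytes(8L * 1024 * 1024) // value found by trial and error
        .build();

S3TransferManager transferManager = S3TransferManager.builder()
        .s3Client(s3Client)
        .build();

transferManager.upload(UploadRequest.builder()
                .putObjectRequest(PutObjectRequest.builder()
                        .bucket("my-bucket")          // placeholder
                        .key("testing-file.txt.gz")   // placeholder
                        .contentEncoding("gzip")
                        .build())
                .requestBody(AsyncRequestBody.fromFile(Paths.get("testing-file.txt.gz")))
                .build())
        .completionFuture()
        .join();
```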
The above snippet initialises the S3Client. I arrived at the `minimumPartSizeInBytes` value by trial and error. This code actually facilitates the transfer!
Possible Solution
No response
Additional Information/Context
No response
AWS Java SDK version used
2.29.15
JDK version used
17.0.13
Operating System and version
Ubuntu 22.04.5 LTS, Linux 6.10.14-linuxkit, Inside Docker 27.4.0