Error while uploading to S3 from Hadoop using s3a

I am running a Java application on Kubernetes that uploads multiple files from the local container to an S3 bucket using s3a, but I am getting the exception below in the logs and the files are not uploaded to S3.

Only partial files end up in S3.



Answer

This looks like a known bug:

"problems caused by AWS SDK classes from the localstack-utils-fat.jar overriding classes defined in the actual Lambda jar/zip itself."

A newer release contains the fix, and there was also a partial workaround:

"implemented a partial fix for this that moved the localstack-utils-fat.jar later on the classpath, but this fix only applied to lambdas being run using the docker executor."

Basically, it's not your fault. It's a dependency issue: two jars on the classpath bundle the same classes, and whichever comes first shadows the other's definitions. Upgrade to the latest localstack-utils-fat.jar and it should fix your issue.
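Since the root cause is one jar's classes shadowing another's, a quick way to confirm which jar a suspect class is actually being loaded from is to inspect its `ProtectionDomain`/`CodeSource`. This is a minimal sketch; the class name to check is an assumption, so substitute the AWS SDK class named in your own stack trace:

```java
public class ClasspathCheck {
    // Returns the jar or directory a class was loaded from,
    // or "bootstrap/unknown" for JDK bootstrap classes.
    public static String sourceOf(Class<?> c) {
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        return (src == null || src.getLocation() == null)
                ? "bootstrap/unknown"
                : src.getLocation().toString();
    }

    public static void main(String[] args) throws ClassNotFoundException {
        // Substitute the class from your stack trace, e.g. an AWS SDK class that
        // both your application jar and localstack-utils-fat.jar bundle.
        String name = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(name + " loaded from: " + sourceOf(Class.forName(name)));
    }
}
```

If the printed location is localstack-utils-fat.jar rather than your own jar (or the AWS SDK jar you expect), the shadowing described above is what is breaking the s3a upload.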

User contributions licensed under: CC BY-SA