I am running a Java application on Kubernetes that uploads multiple files from the local container to an S3 bucket using s3a, but I am getting the exception below in the logs and the files are not being uploaded to S3.
Some of the files do get uploaded to S3, but not all of them.
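For context, the upload goes through the Hadoop FileSystem API; per the trace below, Secor's FileUtil.moveToCloud calls FileSystem.moveFromLocalFile, which delegates to S3AFileSystem.copyFromLocalFile. A minimal sketch of that path, with placeholder bucket and file names rather than my real ones:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Minimal sketch of the upload path; bucket and file names are placeholders.
    public class S3aUploadSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration(); // s3a credentials/endpoint come from core-site.xml or the environment
            FileSystem fs = FileSystem.get(new URI("s3a://my-bucket/"), conf);

            Path localFile = new Path("/tmp/secor_data/example.deflate");
            Path s3Target = new Path("s3a://my-bucket/orc_secor_test/example.deflate");

            // moveFromLocalFile copies the file to S3 and then deletes the local copy;
            // this is the call that fails with the AWSClientIOException shown below.
            fs.moveFromLocalFile(localFile, s3Target);
        }
    }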
2021-11-30 12:28:44,529 614982 [secor-threadpool-1] ERROR c.pinterest.secor.uploader.Uploader - Error while uploading to S3 : java.lang.RuntimeException: org.apache.hadoop.fs.s3a.AWSClientIOException: copyFromLocalFile(/tmp/secor_data/message_logs/backup/7_22/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate, s3a://prod-dataplatform-hive/orc_secor_test/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate) on /tmp/secor_data/message_logs/backup/7_22/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate: com.amazonaws.AmazonClientException: Unable to complete transfer: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V: Unable to complete transfer: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V
java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.hadoop.fs.s3a.AWSClientIOException: copyFromLocalFile(/tmp/secor_data/message_logs/backup/7_22/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate, s3a://prod-dataplatform-hive/orc_secor_test/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate) on /tmp/secor_data/message_logs/backup/7_22/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate: com.amazonaws.AmazonClientException: Unable to complete transfer: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V: Unable to complete transfer: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:192)
    at com.pinterest.secor.uploader.FutureHandle.get(FutureHandle.java:34)
    at com.pinterest.secor.uploader.Uploader.uploadFiles(Uploader.java:117)
    at com.pinterest.secor.uploader.Uploader.checkTopicPartition(Uploader.java:244)
    at com.pinterest.secor.uploader.Uploader.applyPolicy(Uploader.java:284)
    at com.pinterest.secor.consumer.Consumer.checkUploadPolicy(Consumer.java:141)
    at com.pinterest.secor.consumer.Consumer.run(Consumer.java:133)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: org.apache.hadoop.fs.s3a.AWSClientIOException: copyFromLocalFile(/tmp/secor_data/message_logs/backup/7_22/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate, s3a://prod-dataplatform-hive/orc_secor_test/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate) on /tmp/secor_data/message_logs/backup/7_22/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate: com.amazonaws.AmazonClientException: Unable to complete transfer: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V: Unable to complete transfer: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V
    at com.pinterest.secor.uploader.HadoopS3UploadManager$1.run(HadoopS3UploadManager.java:63)
    ... 5 more
Caused by: org.apache.hadoop.fs.s3a.AWSClientIOException: copyFromLocalFile(/tmp/secor_data/message_logs/backup/7_22/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate, s3a://prod-dataplatform-hive/orc_secor_test/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate) on /tmp/secor_data/message_logs/backup/7_22/DAPI_UNICAST_BOOKING_ACTION/driverapi_transactional_ha/dapi_unicast_booking_action/dt=2021-11-29/hr=12/pk_1_1_00000000000000101414.deflate: com.amazonaws.AmazonClientException: Unable to complete transfer: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V: Unable to complete transfer: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V
    at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:144)
    at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:117)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.copyFromLocalFile(S3AFileSystem.java:2039)
    at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:2320)
    at org.apache.hadoop.fs.FileSystem.moveFromLocalFile(FileSystem.java:2307)
    at com.pinterest.secor.util.FileUtil.moveToCloud(FileUtil.java:206)
    at com.pinterest.secor.uploader.HadoopS3UploadManager$1.run(HadoopS3UploadManager.java:61)
    ... 5 more
Caused by: com.amazonaws.AmazonClientException: Unable to complete transfer: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V
    at com.amazonaws.services.s3.transfer.internal.AbstractTransfer.unwrapExecutionException(AbstractTransfer.java:286)
    at com.amazonaws.services.s3.transfer.internal.AbstractTransfer.rethrowExecutionException(AbstractTransfer.java:265)
    at com.amazonaws.services.s3.transfer.internal.UploadImpl.waitForUploadResult(UploadImpl.java:66)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.innerCopyFromLocalFile(S3AFileSystem.java:2099)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.copyFromLocalFile(S3AFileSystem.java:2037)
    ... 9 more
Caused by: java.lang.NoSuchMethodError: com.amazonaws.util.IOUtils.release(Ljava/io/Closeable;Lorg/apache/commons/logging/Log;)V
    at com.amazonaws.services.s3.model.S3DataSource$Utils.cleanupDataSource(S3DataSource.java:48)
    at com.amazonaws.services.s3.AmazonS3Client.uploadObject(AmazonS3Client.java:1863)
    at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1817)
    at com.amazonaws.services.s3.transfer.internal.UploadCallable.uploadInOneChunk(UploadCallable.java:169)
    at com.amazonaws.services.s3.transfer.internal.UploadCallable.call(UploadCallable.java:149)
    at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:115)
    at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:45)
    ... 4 more
Answer
Looks like this is a bug. The NoSuchMethodError at the bottom of the trace is the kind of problem caused by AWS SDK classes from localstack-utils-fat.jar overriding classes defined in the actual Lambda jar/zip itself: two copies of the SDK end up on the classpath, and the copy that gets loaded is missing the IOUtils.release(Closeable, Log) overload that the S3 client was compiled against.
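If you want to confirm the clash, a quick diagnostic (just a sketch; run it inside the same JVM/classpath as the uploader, and the class name here is made up) is to print which jar com.amazonaws.util.IOUtils is actually loaded from and which release(...) overloads it declares:

    import java.lang.reflect.Method;
    import java.security.CodeSource;

    // Diagnostic sketch: report which jar supplies com.amazonaws.util.IOUtils
    // and which release(...) overloads that copy of the class actually has.
    public final class WhichIoUtils {
        public static void main(String[] args) throws Exception {
            Class<?> ioUtils = Class.forName("com.amazonaws.util.IOUtils");
            CodeSource src = ioUtils.getProtectionDomain().getCodeSource();
            System.out.println("IOUtils loaded from: " + (src != null ? src.getLocation() : "<unknown>"));
            for (Method m : ioUtils.getDeclaredMethods()) {
                if (m.getName().equals("release")) {
                    System.out.println("  " + m);
                }
            }
        }
    }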
It sounds like there is a workaround: a partial fix was implemented that moves localstack-utils-fat.jar later on the classpath, but that fix only applies to Lambdas run with the Docker executor. The version you need is the one that contains this fix.
Basically, it's not your fault. It's a dependency issue where one jar's classes overwrite another's method signatures. Use the latest localstack-utils-fat.jar and it should fix your issue.
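Once you have swapped the jar, a fail-fast check at startup (again just a sketch; the class name is hypothetical) can confirm that the exact IOUtils.release(Closeable, Log) overload the S3 client calls, per the NoSuchMethodError above, is present on the classpath:

    import java.io.Closeable;
    import java.lang.reflect.Method;
    import org.apache.commons.logging.Log;

    // Fail-fast sketch: getMethod throws NoSuchMethodException at startup if the
    // release(Closeable, Log) overload from the stack trace is still missing.
    public final class SdkCompatCheck {
        public static void main(String[] args) throws Exception {
            Class<?> ioUtils = Class.forName("com.amazonaws.util.IOUtils");
            Method release = ioUtils.getMethod("release", Closeable.class, Log.class);
            System.out.println("OK, found: " + release);
        }
    }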