I’m trying to unzip a very large .gz file (around 50 MB) in Java and then transfer it to the Hadoop file system. After unzipping, the file size becomes 20 GB. The job takes more than 5 minutes. Even with buffered I/O streams, decompressing and transferring the file is very slow. Is Hadoop causing
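Since the question's own code is not shown, here is a minimal sketch of streaming gzip decompression with a large buffer, using only the JDK. In a real transfer the `ByteArrayOutputStream` in the demo would be replaced by the stream returned from Hadoop's `FileSystem.create(new Path(...))`; note also that gzip decompression is single-threaded and CPU-bound, so with a ~400:1 expansion ratio the decompressor itself, not HDFS, is often the bottleneck.

```java
import java.io.*;
import java.util.zip.*;

public class GzipToHdfs {
    // Decompress a .gz stream into any output stream using a 64 KB buffer.
    // Returns the number of decompressed bytes written.
    public static long decompress(InputStream gzIn, OutputStream out) throws IOException {
        long total = 0;
        try (GZIPInputStream in = new GZIPInputStream(new BufferedInputStream(gzIn, 1 << 16))) {
            byte[] buf = new byte[1 << 16];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
                total += n;
            }
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Round-trip demo: gzip some bytes in memory, then decompress them.
        byte[] data = "hello hdfs".getBytes("UTF-8");
        ByteArrayOutputStream gz = new ByteArrayOutputStream();
        try (GZIPOutputStream go = new GZIPOutputStream(gz)) { go.write(data); }
        ByteArrayOutputStream plain = new ByteArrayOutputStream();
        long n = decompress(new ByteArrayInputStream(gz.toByteArray()), plain);
        System.out.println(n); // 10
    }
}
```

To copy into HDFS, Hadoop's own `org.apache.hadoop.io.IOUtils.copyBytes` can replace the hand-written loop; the buffering strategy is the same.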
Tag: hdfs
How to create a JSONArray in Java
I have two Java functions: listeFilesHdfs returns a list of files stored in HDFS, for example: Note that the content of the files stored in HDFS is in JSON format, for example: I created the function below to call both functions above (one returns a list of files and the second opens a path): How can I modify my
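The question's helpers (`listeFilesHdfs` and the file contents) are elided, so as a library-free sketch, here is one way to assemble a JSON array from a list of strings using only the JDK. With the common org.json library the same result is obtained with `new JSONArray()` and repeated `put(...)` calls; the list values below are just placeholders.

```java
import java.util.*;

public class JsonArrayDemo {
    // Minimal JSON-array builder for strings (no external library).
    // Escapes backslashes and double quotes so the output is valid JSON.
    public static String toJsonArray(List<String> items) {
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < items.size(); i++) {
            if (i > 0) sb.append(",");
            String escaped = items.get(i).replace("\\", "\\\\").replace("\"", "\\\"");
            sb.append('"').append(escaped).append('"');
        }
        return sb.append("]").toString();
    }

    public static void main(String[] args) {
        System.out.println(toJsonArray(Arrays.asList("a", "b"))); // ["a","b"]
    }
}
```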
Create a file with WebHDFS
I would like to create a file in HDFS with WebHDFS. I wrote the function below. In the last print I don’t see my file… Any idea? Answer
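Since the questioner's function is not shown, here is a sketch of the two-step WebHDFS `CREATE` protocol, which is a frequent cause of "the file never appears": the first `PUT` to the namenode returns a 307 redirect, and the file data must be sent in a second `PUT` to the datanode URL from the `Location` header; a client that stops after the first request creates nothing. The host, port (9870 is the Hadoop 3 default; older clusters use 50070), path, and user name below are assumptions for illustration.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WebHdfsCreate {
    // Build the WebHDFS CREATE URL for a namenode host/port and an absolute HDFS path.
    public static String createUrl(String host, int port, String path, String user) {
        return "http://" + host + ":" + port + "/webhdfs/v1" + path
                + "?op=CREATE&overwrite=true&user.name=" + user;
    }

    // Step 1: PUT to the namenode (without following redirects) to learn the
    // datanode URL. Step 2: PUT the file data to that URL; 201 means created.
    public static void create(String nameNodeUrl, byte[] data) throws Exception {
        HttpURLConnection nn = (HttpURLConnection) new URL(nameNodeUrl).openConnection();
        nn.setRequestMethod("PUT");
        nn.setInstanceFollowRedirects(false); // we need the Location header ourselves
        String dataNodeUrl = nn.getHeaderField("Location");
        nn.disconnect();

        HttpURLConnection dn = (HttpURLConnection) new URL(dataNodeUrl).openConnection();
        dn.setRequestMethod("PUT");
        dn.setDoOutput(true);
        try (OutputStream out = dn.getOutputStream()) { out.write(data); }
        if (dn.getResponseCode() != 201) {
            throw new IllegalStateException("create failed: " + dn.getResponseCode());
        }
    }

    public static void main(String[] args) throws Exception {
        String url = createUrl("namenode.example.com", 9870, "/user/alice/test.txt", "alice");
        System.out.println(url);
        // create(url, "hello".getBytes("UTF-8")); // requires a live cluster
    }
}
```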