
/bin/bash: /bin/java: No such file or directory error in YARN apps on Mac OS X

I was trying to run a simple wordcount MapReduce program using the Java 1.7 SDK and Hadoop 2.7.1 on Mac OS X El Capitan 10.11, and I am getting the following error message in my container log “stderr”: /bin/bash: /bin/java: No such file or directory

Application log:

15/11/27 02:52:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/11/27 02:52:33 INFO client.RMProxy: Connecting to ResourceManager at /192.168.200.96:8032
15/11/27 02:52:34 INFO input.FileInputFormat: Total input paths to process : 0
15/11/27 02:52:34 INFO mapreduce.JobSubmitter: number of splits:0
15/11/27 02:52:34 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1448608832342_0003
15/11/27 02:52:34 INFO impl.YarnClientImpl: Submitted application application_1448608832342_0003
15/11/27 02:52:34 INFO mapreduce.Job: The url to track the job: http://mymac.local:8088/proxy/application_1448608832342_0003/
15/11/27 02:52:34 INFO mapreduce.Job: Running job: job_1448608832342_0003
15/11/27 02:52:38 INFO mapreduce.Job: Job job_1448608832342_0003 running in uber mode : false
15/11/27 02:52:38 INFO mapreduce.Job:  map 0% reduce 0%
15/11/27 02:52:38 INFO mapreduce.Job: Job job_1448608832342_0003 failed with state FAILED due to: Application application_1448608832342_0003 failed 2 times due to AM Container for appattempt_1448608832342_0003_000002 exited with  exitCode: 127
For more detailed output, check application tracking page:http://mymac.local:8088/cluster/app/application_1448608832342_0003Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1448608832342_0003_02_000001
Exit code: 127
Stack trace: ExitCodeException exitCode=127:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
    at org.apache.hadoop.util.Shell.run(Shell.java:456)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)


Container exited with a non-zero exit code 127
Failing this attempt. Failing the application.
15/11/27 02:52:38 INFO mapreduce.Job: Counters: 0

The command I am running:

hadoop jar wordcount.jar org.myorg.WordCount /user/gangadharkadam/input/ /user/gangadharkadam/output/

My environment variables are:

export   JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.7.1
export HADOOP_MAPRED_HOME=/usr/local/hadoop/hadoop-2.7.1
export HADOOP_COMMON_HOME=/usr/local/hadoop/hadoop-2.7.1
export HADOOP_HDFS_HOME=/usr/local/hadoop/hadoop-2.7.1
export YARN_HOME=/usr/local/hadoop/hadoop-2.7.1
export HADOOP_CONF_DIR=/usr/local/hadoop/hadoop-2.7.1/etc/hadoop
export HADOOP_CLASSPATH=$JAVA_HOME/lib/tools.jar

export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$M2_HOME/bin:$ANT_HOME/bin:$IVY_HOME/bin:$FB_HOME/bin:$MYSQL_HOME/bin:$MYSQL_HOME/lib:$SQOOP_HOME/bin
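A quick sanity check of these settings from a terminal (using the paths above):

echo "$JAVA_HOME"
"$JAVA_HOME/bin/java" -version
"$HADOOP_HOME/bin/hadoop" version
# On OS X this prints the JDK home that YARN should be using
/usr/libexec/java_home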

The problem seems to be that YARN is using a different path for the java executable than the one the OS actually has. The local log for the failed task (“stderr”) shows: /bin/bash: /bin/java: No such file or directory
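I am reading the container’s stderr straight from the NodeManager log directory; the path below assumes the default yarn.nodemanager.log-dirs under $HADOOP_HOME/logs:

cat $HADOOP_HOME/logs/userlogs/application_1448608832342_0003/container_1448608832342_0003_02_000001/stderr
# yarn logs also works, but only if log aggregation is enabled
yarn logs -applicationId application_1448608832342_0003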

I tried to create a soft link from $JAVA_HOME/bin/java to /bin/java, but El Capitan does not allow it. OS X El Capitan is rootless (System Integrity Protection), so users cannot create anything in certain restricted folders such as /bin/. Any workaround for this issue is highly appreciated. Thanks in advance.
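For reference, the attempt that fails looks like this (the exact error text may differ):

sudo ln -s "$JAVA_HOME/bin/java" /bin/java
# ln: /bin/java: Operation not permitted   <- SIP blocks writes to /bin, even as root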


Answer

This answer is applicable to Hadoop version 2.6.0 and earlier. Disabling SIP and creating a symbolic link does provide a workaround, but a better solution is to fix hadoop-config.sh so that it picks up your JAVA_HOME correctly.
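If you do want the symlink workaround, a rough sketch (at your own risk) is to disable SIP from Recovery Mode, create the link, and then re-enable SIP:

# 1. Reboot into Recovery Mode (hold Cmd+R), open Utilities > Terminal, and run:
csrutil disable
# 2. Reboot normally and create the link:
sudo ln -s "$(/usr/libexec/java_home)/bin/java" /bin/java
# 3. Boot back into Recovery Mode and re-enable SIP:
csrutil enable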

For the proper fix: in $HADOOP_HOME/libexec/hadoop-config.sh, look for the if condition below where it sets JAVA_HOME:

# Attempt to set JAVA_HOME if it is not set
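A quick way to jump to that block (the path assumes the 2.7.1 layout from the question):

grep -n -A 5 "Attempt to set JAVA_HOME" $HADOOP_HOME/libexec/hadoop-config.sh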

Remove the extra parentheses from the export JAVA_HOME lines as shown below. The parentheses turn JAVA_HOME into a bash array, and since bash cannot export arrays to child processes, the container launch script ends up expanding $JAVA_HOME/bin/java to just /bin/java. Change this

if [ -x /usr/libexec/java_home ]; then
    export JAVA_HOME=($(/usr/libexec/java_home))
else
    export JAVA_HOME=(/Library/Java/Home)
fi

to

if [ -x /usr/libexec/java_home ]; then
    # note that the extra parentheses are removed
    export JAVA_HOME=$(/usr/libexec/java_home)
else
    export JAVA_HOME=/Library/Java/Home
fi
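You can check what the corrected assignment evaluates to before touching the cluster:

# JAVA_HOME should now be a plain string, not a one-element array
export JAVA_HOME=$(/usr/libexec/java_home)
echo "$JAVA_HOME"
"$JAVA_HOME/bin/java" -version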

Restart YARN after you have made this change.
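On a single-node setup like the one in the question, that usually just means:

$HADOOP_HOME/sbin/stop-yarn.sh
$HADOOP_HOME/sbin/start-yarn.sh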

More detailed info can be found at https://issues.apache.org/jira/browse/HADOOP-8717; it appears that Hadoop 3.0.0-alpha1 is the first release with the fix.

User contributions licensed under: CC BY-SA