I’m trying to set up Hadoop 3 (alpha3) as a Single Node Cluster (pseudo-distributed), following the Apache guide. I’ve tried running the example MapReduce job (roughly as shown below), but every time the connection is refused.
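For reference, this is roughly what I’ve been running, taken from the guide (the examples jar name below matches my alpha3 download, so the exact filename may differ for other versions):

bin/hdfs dfs -mkdir -p /user/hadoop/input
bin/hdfs dfs -put etc/hadoop/*.xml input
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.0.0-alpha3.jar grep input output 'dfs[a-z.]+'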
After running sbin/start-all.sh, I’ve been seeing these exceptions in the ResourceManager log (and similar ones in the NodeManager log):
xxxx-xx-xx xx:xx:xx,xxx INFO org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Error when creating PropertyDescriptor for public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)! Ignoring this property.
xxxx-xx-xx xx:xx:xx,xxx DEBUG org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Exception is:
java.beans.IntrospectionException: bad write method arg count: public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)
    at java.desktop/java.beans.PropertyDescriptor.findPropertyType(PropertyDescriptor.java:696)
    at java.desktop/java.beans.PropertyDescriptor.setWriteMethod(PropertyDescriptor.java:356)
    at java.desktop/java.beans.PropertyDescriptor.<init>(PropertyDescriptor.java:142)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.createFluentPropertyDescritor(FluentPropertyBeanIntrospector.java:178)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.introspect(FluentPropertyBeanIntrospector.java:141)
    at org.apache.commons.beanutils.PropertyUtilsBean.fetchIntrospectionData(PropertyUtilsBean.java:2245)
    at org.apache.commons.beanutils.PropertyUtilsBean.getIntrospectionData(PropertyUtilsBean.java:2226)
    at org.apache.commons.beanutils.PropertyUtilsBean.getPropertyDescriptor(PropertyUtilsBean.java:954)
    at org.apache.commons.beanutils.PropertyUtilsBean.isWriteable(PropertyUtilsBean.java:1478)
    at org.apache.commons.configuration2.beanutils.BeanHelper.isPropertyWriteable(BeanHelper.java:521)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initProperty(BeanHelper.java:357)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBeanProperties(BeanHelper.java:273)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBean(BeanHelper.java:192)
    at org.apache.commons.configuration2.beanutils.BeanHelper$BeanCreationContextImpl.initBean(BeanHelper.java:669)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.initBeanInstance(DefaultBeanFactory.java:162)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.createBean(DefaultBeanFactory.java:116)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:459)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:479)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:492)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResultInstance(BasicConfigurationBuilder.java:447)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResult(BasicConfigurationBuilder.java:417)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.getConfiguration(BasicConfigurationBuilder.java:285)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:119)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.create(MetricsConfig.java:98)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:478)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:188)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:163)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:62)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:58)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceInit(ResourceManager.java:678)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.createAndInitActiveServices(ResourceManager.java:1129)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:315)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1407)
And then later in the file:
xxxx-xx-xx xx:xx:xx,xxx FATAL org.apache.hadoop.yarn.server.resourcemanager.ResourceManager: Error starting ResourceManager
java.lang.ExceptionInInitializerError
    at com.google.inject.internal.cglib.reflect.$FastClassEmitter.<init>(FastClassEmitter.java:67)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.generateClass(FastClass.java:72)
    at com.google.inject.internal.cglib.core.$DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
    at com.google.inject.internal.cglib.core.$AbstractClassGenerator.create(AbstractClassGenerator.java:216)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.create(FastClass.java:64)
    at com.google.inject.internal.BytecodeGen.newFastClass(BytecodeGen.java:204)
    at com.google.inject.internal.ProviderMethod$FastClassProviderMethod.<init>(ProviderMethod.java:256)
    at com.google.inject.internal.ProviderMethod.create(ProviderMethod.java:71)
    at com.google.inject.internal.ProviderMethodsModule.createProviderMethod(ProviderMethodsModule.java:275)
    at com.google.inject.internal.ProviderMethodsModule.getProviderMethods(ProviderMethodsModule.java:144)
    at com.google.inject.internal.ProviderMethodsModule.configure(ProviderMethodsModule.java:123)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:349)
    at com.google.inject.AbstractModule.install(AbstractModule.java:122)
    at com.google.inject.servlet.ServletModule.configure(ServletModule.java:52)
    at com.google.inject.AbstractModule.configure(AbstractModule.java:62)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements.getElements(Elements.java:110)
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:138)
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at com.google.inject.Guice.createInjector(Guice.java:62)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.build(WebApps.java:332)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:377)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1116)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1218)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1408)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make protected final java.lang.Class java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain) throws java.lang.ClassFormatError accessible: module java.base does not "opens java.lang" to unnamed module @173f73e7
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:337)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:281)
    at java.base/java.lang.reflect.Method.checkCanSetAccessible(Method.java:197)
    at java.base/java.lang.reflect.Method.setAccessible(Method.java:191)
    at com.google.inject.internal.cglib.core.$ReflectUtils$2.run(ReflectUtils.java:56)
    at java.base/java.security.AccessController.doPrivileged(Native Method)
    at com.google.inject.internal.cglib.core.$ReflectUtils.<clinit>(ReflectUtils.java:46)
    ... 29 more
For reference, my core-site.xml:
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
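(Side note: as far as I know, fs.default.name is the older, deprecated key for this setting; the current equivalent would be fs.defaultFS, i.e. something like:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>

but Hadoop still accepts the deprecated key, so I don't think that's the problem here.)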
hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
mapred-site.xml:
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
and yarn-site.xml:
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.env-whitelist</name>
    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
  </property>
</configuration>
I have no idea what is causing these exceptions; any help with them would be appreciated.
Edit: Added hadoop-env.sh:
export JAVA_HOME=/usr/local/jdk-9
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_OS_TYPE=${HADOOP_OS_TYPE:-$(uname -s)}
case ${HADOOP_OS_TYPE} in
  Darwin*)
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= "
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.kdc= "
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf= "
  ;;
esac
export HADOOP_ROOT_LOGGER=DEBUG,console
export HADOOP_DAEMON_ROOT_LOGGER=DEBUG,RFA
Answer
As mentioned by @tk421 in the comments, Java 9 is not compatible with Hadoop 3 (and possibly any Hadoop version) yet:
https://issues.apache.org/jira/browse/HADOOP-11123
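The fix on my side was simply pointing hadoop-env.sh at a Java 8 installation instead of the JDK 9 one (the path below is from my machine and will differ on yours):

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
# quick sanity check that the intended JDK is picked up:
$JAVA_HOME/bin/java -version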
With Java 8 (8u181) in place, both the ResourceManager and NodeManager are starting up now:
hadoop@hadoop:/usr/local/hadoop$ sbin/start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as hadoop in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [hadoop]
Starting resourcemanager
Starting nodemanagers
hadoop@hadoop:/usr/local/hadoop$ jps
8756 SecondaryNameNode
8389 NameNode
9173 NodeManager
9030 ResourceManager
8535 DataNode
9515 Jps