env: HDP 3.1.5 (Hadoop 3.1.1, Hive 3.1.0), Flink 1.12.2

Java code:
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public static void main(String[] args) {
    EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner().build();
    TableEnvironment tblEnv = TableEnvironment.create(settings);
    String name = "myhive";
    String defaultDatabase = "default";
    String hiveConfDir = "/etc/hive/conf";
    HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
    tblEnv.registerCatalog("myhive", hive);
    tblEnv.useCatalog("myhive");
    //tblEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
    tblEnv.sqlQuery("SELECT * FROM users").execute().print();
}
Dependencies:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java-bridge_2.12</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-hive_2.12</artifactId>
    <version>${flink.version}</version>
</dependency>
error 1:
org.apache.flink.util.FlinkException: JobMaster for job 35afe414e1dd861c86130ddd031312f2 failed.
    at org.apache.flink.runtime.dispatcher.Dispatcher.jobMasterFailed(Dispatcher.java:887) ~[flink-dist_2.12-1.12.2.jar:1.12.2]
    at org.apache.flink.runtime.dispatcher.Dispatcher.dispatcherJobFailed(Dispatcher.java:465) ~[flink-dist_2.12-1.12.2.jar:1.12.2]
    at org.apache.flink.runtime.dispatcher.Dispatcher.handleDispatcherJobResult(Dispatcher.java:444) ~[flink-dist_2.12-1.12.2.jar:1.12.2]
    ...
Caused by: org.apache.flink.runtime.client.JobInitializationException: Could not instantiate JobManager.
    at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:494) ~[flink-dist_2.12-1.12.2.jar:1.12.2]
    at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604) ~[?:1.8.0_292]
    ...
Caused by: org.apache.flink.runtime.JobException: Cannot instantiate the coordinator for operator Source: HiveSource-zjdev_xiangliang.users -> SinkConversionToTuple2
    at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.<init>(ExecutionJobVertex.java:231) ~[flink-dist_2.12-1.12.2.jar:1.12.2]
    at org.apache.flink.runtime.executiongraph.ExecutionGraph.attachJobGraph(ExecutionGraph.java:866) ~[flink-dist_2.12-1.12.2.jar:1.12.2]
    ...
Caused by: java.lang.NoClassDefFoundError: Lorg/apache/hadoop/mapred/JobConf;
    at java.lang.Class.getDeclaredFields0(Native Method) ~[?:1.8.0_292]
    at java.lang.Class.privateGetDeclaredFields(Class.java:2583) ~[?:1.8.0_292]
    at java.lang.Class.getDeclaredField(Class.java:2068) ~[?:1.8.0_292]
    at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1871) ~[?:1.8.0_292]
    ...
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.JobConf
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_292]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_292]
    ...
Tried adding this dependency:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
</dependency>
This produced another error (error 2):
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.cli.Option.builder(Ljava/lang/String;)Lorg/apache/commons/cli/Option$Builder;
    at org.apache.flink.runtime.entrypoint.parser.CommandLineOptions.<clinit>(CommandLineOptions.java:27)
    at org.apache.flink.runtime.entrypoint.DynamicParametersConfigurationParserFactory.options(DynamicParametersConfigurationParserFactory.java:43)
    at org.apache.flink.runtime.entrypoint.DynamicParametersConfigurationParserFactory.getOptions(DynamicParametersConfigurationParserFactory.java:50)
    at org.apache.flink.runtime.entrypoint.parser.CommandLineParser.parse(CommandLineParser.java:42)
    at org.apache.flink.runtime.entrypoint.ClusterEntrypointUtils.parseParametersOrExit(ClusterEntrypointUtils.java:63)
    at org.apache.flink.yarn.entrypoint.YarnJobClusterEntrypoint.main(YarnJobClusterEntrypoint.java:89)
Tried to resolve the version conflict between commons-cli 1.3.1 and 1.2: forcing 1.3.1 brings back error 1; forcing 1.2 brings back error 2; adding a commons-cli 1.4 dependency also leads back to error 1. A sketch of how such a version can be pinned in Maven is shown below.
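For reference, a minimal sketch of pinning commons-cli via dependencyManagement, assuming the older 1.2 jar arrives transitively (the version chosen here is illustrative):

<dependencyManagement>
    <dependencies>
        <!-- Pin commons-cli so dependency resolution cannot pick 1.2, which
             lacks the Option.builder(String) method introduced in 1.3 and
             used by Flink's command-line parser. -->
        <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
            <version>1.3.1</version>
        </dependency>
    </dependencies>
</dependencyManagement>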
Answer
1. For commons-cli, choose version 1.3.1 or 1.4.
2. Add $hadoop_home/../hadoop_mapreduce/* to yarn.application.classpath.
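A minimal sketch of what step 2 could look like, assuming it refers to the Hadoop property yarn.application.classpath in yarn-site.xml; the existing entries shown are placeholders, so keep your cluster's current value and only append the hadoop_mapreduce glob:

<!-- yarn-site.xml: append the MapReduce client jars so classes such as
     org.apache.hadoop.mapred.JobConf are visible to the containers that
     run the Flink JobManager and TaskManagers. -->
<property>
    <name>yarn.application.classpath</name>
    <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_HDFS_HOME/*,$HADOOP_YARN_HOME/*,$HADOOP_HOME/../hadoop_mapreduce/*</value>
</property>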