1. Problem Description
A Spark job takes input parameters so its run time can be configured flexibly. However, after moving to a new codebase, the job fails with the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
at scopt.OptionParser.<init>(options.scala:175)
at com.common.RichOptionParser.<init>(RichOptionParser.scala:6)
at com.task.utils.Parser$$anon$1.<init>(Parser.scala:16)
at com.task.utils.Parser$.getOpt(Parser.scala:16)
at com.task.test.AppInviteMergeSpark$.main(AppInviteMergeSpark.scala:95)
at com.task.test.AppInviteMergeSpark.main(AppInviteMergeSpark.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 18 more
2. Root Cause
The job uses the scopt library to parse command-line arguments, but the scopt artifact being pulled in was built for Scala 2.11 (`scopt_2.11`), while the code is compiled with Scala 2.12. Scala binary versions are not compatible across minor releases: Scala 2.12 changed the trait encoding (trait method bodies are now compiled to interface default methods), so the synthetic implementation class `scala.Product$class` that 2.11-compiled code references no longer exists, which produces this otherwise puzzling `NoClassDefFoundError`. Switching the dependency to the 2.12 build fixes it:
<dependency>
    <groupId>com.github.scopt</groupId>
    <artifactId>scopt_2.11</artifactId>
    <version>3.4.0</version>
</dependency>
Change it to:
<dependency>
    <groupId>com.github.scopt</groupId>
    <artifactId>scopt_2.12</artifactId>
    <version>3.5.0</version>
</dependency>
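After updating the POM, the effective scopt artifact can be confirmed with Maven's dependency tree (a sketch; the `-Dincludes` filter is a standard option of the maven-dependency-plugin `tree` goal):

```shell
# Show which scopt artifact actually ends up on the classpath;
# the artifactId suffix should read scopt_2.12 after the fix.
mvn dependency:tree -Dincludes=com.github.scopt
```

This also surfaces cases where another dependency transitively drags in a `_2.11` artifact that would need an exclusion.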
Odd errors like this are almost always caused by a Scala binary-version mismatch: make sure the `_2.x` suffix of every Scala dependency matches the Scala version the project is compiled with.
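As a side note, this class of mismatch is harder to hit when the build tool picks the suffix for you. In sbt (a sketch, assuming an sbt build rather than the Maven one above), the `%%` operator appends the Scala binary suffix automatically:

```scala
// build.sbt sketch: %% resolves "scopt" to scopt_2.12
// because it matches the declared scalaVersion.
scalaVersion := "2.12.15"

libraryDependencies += "com.github.scopt" %% "scopt" % "3.5.0"
```

With Maven, the suffix is hard-coded in the `artifactId`, so it must be kept in sync with the project's Scala version by hand.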