Finally solved it.
The error message was:
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.SparkConf$.&lt;init&gt;(SparkConf.scala:716)
at org.apache.spark.SparkConf$.&lt;clinit&gt;(SparkConf.scala)
at org.apache.spark.SparkConf$$anonfun$getOption$1.apply (SparkConf.scala:389)
at org.apache.spark.SparkConf$$anonfun$getOption$1.apply (SparkConf.scala:389)
at scala.Option.orElse (Option.scala:289)
at org.apache.spark.SparkConf.getOption (SparkConf.scala:389)
at org.apache.spark.SparkConf.get (SparkConf.scala:251)
at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendS3AndSparkHadoopConfigurations (SparkHadoopUtil.scala:463)
at org.apache.spark.deploy.SparkHadoopUtil$.newConfiguration (SparkHadoopUtil.scala:436)
at org.apache.spark.deploy.SparkSubmit$$anonfun$2.apply (SparkSubmit.scala:323)
at org.apache.spark.deploy.SparkSubmit$$anonfun$2.apply (SparkSubmit.scala:323)
at scala.Option.getOrElse (Option.scala:121)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment (SparkSubmit.scala:323)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain (SparkSubmit.scala:774)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1 (SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit (SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit (SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit (SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main (SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main (SparkSubmit.scala)
Caused by: java.net.UnknownHostException: linux-z1h9: linux-z1h9: Name or service not known
at java.net.InetAddress.getLocalHost (InetAddress.java:1506)
at org.apache.spark.util.Utils$.findLocalInetAddress (Utils.scala:946)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute (Utils.scala:939)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress (Utils.scala:939)
at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply (Utils.scala:996)
at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply (Utils.scala:996)
at scala.Option.getOrElse (Option.scala:121)
at org.apache.spark.util.Utils$.localCanonicalHostName (Utils.scala:996)
at org.apache.spark.internal.config.package$.&lt;init&gt;(package.scala:302)
at org.apache.spark.internal.config.package$.&lt;clinit&gt;(package.scala)
… 20 more
Caused by: java.net.UnknownHostException: linux-z1h9: Name or service not known
at java.net.Inet6AddressImpl.lookupAllHostAddr (Native Method)
at java.net.InetAddress$2.lookupAllHostAddr (InetAddress.java:929)
at java.net.InetAddress.getAddressesFromNameService (InetAddress.java:1324)
at java.net.InetAddress.getLocalHost (InetAddress.java:1501)
… 29 more
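The innermost exception is the real problem: `InetAddress.getLocalHost()` fails because the hostname `linux-z1h9` resolves through neither DNS nor /etc/hosts, and the trace shows Spark hitting this call during static initialization of its config package, which is why it surfaces as `ExceptionInInitializerError`. A minimal standalone check of the same JDK call (this is not Spark's code, just the lookup it ultimately performs):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class LocalHostCheck {
    public static void main(String[] args) {
        try {
            // Same call Spark's Utils.findLocalInetAddress relies on;
            // it throws UnknownHostException when the machine's hostname
            // has no DNS or /etc/hosts entry.
            InetAddress addr = InetAddress.getLocalHost();
            System.out.println("hostname resolves to: " + addr.getHostAddress());
        } catch (UnknownHostException e) {
            System.out.println("hostname does not resolve: " + e.getMessage());
        }
    }
}
```

If this small program prints the "does not resolve" branch on your machine, spark-submit will fail the same way.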
At first I thought the Java version was too high, but that turned out not to be the problem.
The OpenJDK that ships with openSUSE 15.1 works fine:
openjdk version "11.0.3" 2019-04-16
OpenJDK Runtime Environment (build 11.0.3+7-suse-lp151.2.1-x8664)
OpenJDK 64-Bit Server VM (build 11.0.3+7-suse-lp151.2.1-x8664, mixed mode)
Still, the way to handle multiple Java versions is worth sharing:
openSUSE lets you choose which version /usr/bin/java is associated with.
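On openSUSE this association is managed through the alternatives system; a sketch of the usual commands (the registered paths will differ depending on which JDK packages you have installed):

```shell
# List the JDKs registered as alternatives for the java command
update-alternatives --list java

# Interactively pick which one /usr/bin/java points to
sudo update-alternatives --config java

# Verify the selection took effect
java -version
```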
This post is quite good:
https://my.oschina.net/mylingcc/blog/217578
And this one as well:
https://cn.opensuse.org/安装_Sun_Java
Also, some posts suggest setting the following environment variables:
export JAVA_HOME=/usr/local/spark_java/jdk1.8.0_261
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
After testing, these had no effect; I deleted them all, whether from /etc/profile or from ~/.bashrc.
Here is how it was finally solved:
First, find the machine's IP address with hostname -I or ip addr show, or simply by right-clicking the NetworkManager applet.
(At some point openSUSE stopped shipping ifconfig.)
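The first address printed by hostname -I is usually the one you want; a small sketch to capture it (this assumes at least one non-loopback interface is up):

```shell
# Take the first address reported by hostname -I
LOCAL_IP=$(hostname -I | awk '{print $1}')
echo "local IP: $LOCAL_IP"
```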
Then add the following to /etc/profile:
export SPARK_LOCAL_IP=&lt;your local IP address&gt;
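A sketch of that final step, using the made-up address 192.168.1.100 as a stand-in for whatever hostname -I reported:

```shell
# 192.168.1.100 is a placeholder; substitute your own address
echo 'export SPARK_LOCAL_IP=192.168.1.100' | sudo tee -a /etc/profile

# /etc/profile is only read by login shells, so re-source it
# (or log out and back in) for the variable to take effect
source /etc/profile
echo "$SPARK_LOCAL_IP"
```

Setting SPARK_LOCAL_IP sidesteps the hostname lookup entirely, so Spark never calls the failing InetAddress.getLocalHost().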