1. Environment
CDH 5.12.3
Spark2 2.3.0
2. Goal: launch spark2-shell locally to test the environment
Error 1: Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
Fix for Error 1: configure spark-env.sh and set HADOOP_CONF_DIR, so that spark2-shell picks up the Hadoop environment when it launches.
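A minimal spark-env.sh sketch. The paths below follow common CDH defaults and are assumptions; adjust them to your installation:

```shell
# /etc/spark2/conf/spark-env.sh  (path may differ on your cluster)
# Point Spark at the Hadoop client configuration so it can find YARN and HDFS.
export HADOOP_CONF_DIR=/etc/hadoop/conf
# Often set alongside HADOOP_CONF_DIR; usually the same directory on CDH.
export YARN_CONF_DIR=/etc/hadoop/conf
```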
Error 2: Error: Cluster deploy mode is not applicable to Spark shells.
Run with --help for usage help or --verbose for debug output
Note: when spark2-shell runs on YARN, the deploy mode must be yarn-client. Specifying yarn-cluster fails with:
Error: Cluster deploy mode is not applicable to Spark shells.
This is because spark-shell is an interactive command line, so the Driver must run locally rather than on YARN.
The remaining parameters work exactly as they do when submitting a Spark application to YARN.
Correct launch command: spark2-shell --master yarn --deploy-mode client
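Since the shell accepts the same flags as a YARN application submission, resource settings can be passed at launch. A sketch with illustrative values (the memory/core numbers are assumptions, not recommendations):

```shell
# spark2-shell takes the same resource flags as spark2-submit.
# Driver runs locally (client mode); executors are requested from YARN.
spark2-shell \
  --master yarn \
  --deploy-mode client \
  --driver-memory 2g \
  --executor-memory 2g \
  --executor-cores 2 \
  --num-executors 4
```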
3. Indicator of a successful startup:
(screenshot: spark2-shell started successfully)
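Once the shell comes up, a quick end-to-end sanity check is to feed it a one-line job; a sketch, assuming the cluster is reachable from this host:

```shell
# Pipe a trivial job into the shell; sc is the SparkContext the shell creates.
# The sum of 1..100 should appear as 5050 in the shell output.
echo 'println(sc.parallelize(1 to 100).reduce(_ + _))' | \
  spark2-shell --master yarn --deploy-mode client
```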