Running ./spark-shell from the Spark cluster's bin directory shows "Unable to load native-hadoop library for your platform... using builtin-java classes where applicable"
The Spark cluster starts up normally, but running ./spark-shell produces the error below.
Configuration file: spark-env.sh
export JAVA_HOME=/usr/java/jdk1.7.0_51
export SCALA_HOME=/home/hadoop/scala-2.11.6
export SPARK_MASTER_IP=master24
export SPARK_MASTER_PORT=17077
export SPARK_MASTER_WEBUI_PORT=18080
export SPARK_WORKER_CORES=1
export SPARK_WORKER_MEMORY=30g
export SPARK_WORKER_WEBUI_PORT=18081
export SPARK_WORKER_INSTANCES=1
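(For context, with the settings above the standalone master should be reachable at spark://master24:17077, so a shell would normally be attached as in the sketch below; the --master form assumes a Spark 1.x spark-shell, and master24 must resolve from the machine running the shell.)

# Attach spark-shell to the standalone master defined by
# SPARK_MASTER_IP and SPARK_MASTER_PORT above.
./bin/spark-shell --master spark://master24:17077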
The error messages are as follows:
INFO SparkEnv: Registering BlockManagerMaster
15/03/24 18:32:03 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20150324183203-6f8e
15/03/24 18:32:03 INFO MemoryStore: MemoryStore started with capacity 294.9 MB.
15/03/24 18:32:03 INFO ConnectionManager: Bound socket to port 35104 with id = ConnectionManagerId(server2,35104)
15/03/24 18:32:03 INFO BlockManagerMaster: Trying to register BlockManager
15/03/24 18:32:03 INFO BlockManagerInfo: Registering block manager server2:35104 with 294.9 MB RAM
15/03/24 18:32:03 INFO BlockManagerMaster: Registered BlockManager
15/03/24 18:32:03 INFO HttpServer: Starting HTTP Server
15/03/24 18:32:03 INFO HttpBroadcast: Broadcast server started at http://192.168.1.24:41483
15/03/24 18:32:03 INFO HttpFileServer: HTTP File server directory is /tmp/spark-524059df-53c2-4df8-a2a0-c76c878a3d94
15/03/24 18:32:03 INFO HttpServer: Starting HTTP Server
15/03/24 18:32:03 INFO SparkUI: Started SparkUI at http://server12:4040
15/03/24 18:32:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
java.lang.IllegalArgumentException: java.net.UnknownHostException: cluster1
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:418)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:231)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:139)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:510)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:453)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2433)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.UnknownHostException: cluster1
    ... 61 more
Spark context available as sc.
scala>
What is the cause?
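For reference: the NativeCodeLoader warning in the title only means the optional native Hadoop libraries are missing and is generally harmless. The actual failure is java.lang.IllegalArgumentException: java.net.UnknownHostException: cluster1, and the stack frame NameNodeProxies.createNonHAProxy shows the HDFS client treating cluster1 as a plain hostname. That usually means cluster1 is an HDFS HA nameservice (a logical name defined in hdfs-site.xml) but Spark cannot see the Hadoop client configuration, so it falls back to DNS lookup and fails. A minimal sketch of the usual remedy, assuming the Hadoop configs live under /etc/hadoop/conf (that path is an assumption, not from the original post):

# In spark-env.sh: put the Hadoop client configuration on Spark's
# classpath so the cluster1 nameservice from hdfs-site.xml resolves.
# /etc/hadoop/conf is an assumed path; use your actual Hadoop conf dir.
export HADOOP_CONF_DIR=/etc/hadoop/conf

If cluster1 is not meant to be an HA nameservice, check fs.defaultFS in core-site.xml instead, and make sure the host it names resolves via DNS or /etc/hosts on the machine running spark-shell.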