Hadoop Runtime Errors (1) - Error: org.apache.hadoop.hdfs.BlockMissingException

15/03/18 09:59:21 INFO mapreduce.Job: Task Id : attempt_1426641074924_0002_m_000000_2, Status : FAILED
Error: org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-35642051-192.168.199.91-1419581604721:blk_1073743091_2267 file=/filein/file_128M.txt
at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:882)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:563)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:793)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:840)
at java.io.DataInputStream.readFully(DataInputStream.java:195)
at java.io.DataInputStream.readFully(DataInputStream.java:169)
at com.mr.AESEn.DataRecordReader.nextKeyValue(DataRecordReader.java:94)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:533)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
 
The above Error: org.apache.hadoop.hdfs.BlockMissingException was thrown while a MapReduce job was running. A quick search online suggests two likely causes: either one or more DataNodes have gone down, or there is a communication problem between the DataNodes. With these causes in mind, I began troubleshooting.
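A first check can be done from the master without logging into each node (a sketch using the standard HDFS tools; fsck shows which blocks of the affected file are missing, and dfsadmin -report lists live and dead DataNodes):

[hadoop@cMaster hadoop-2.5.2]$ bin/hdfs fsck /filein/file_128M.txt -files -blocks -locations
[hadoop@cMaster hadoop-2.5.2]$ bin/hdfs dfsadmin -report

I then logged into each slave and checked the running daemons with jps: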
 
[hadoop@cMaster hadoop-2.5.2]$ ssh cSlave00
Last login: Tue Mar 17 08:38:10 2015 from missie-pc.lan
[hadoop@cSlave00 ~]$ jps
3952 Jps
2910 NodeManager
 
[hadoop@cMaster hadoop-2.5.2]$ ssh cSlave01
Last login: Tue Mar 17 08:38:13 2015 from missie-pc.lan
[hadoop@cSlave01 ~]$ jps
3051 NodeManager
2714 DataNode
4562 Jps
 
[hadoop@cMaster hadoop-2.5.2]$ ssh cSlave02
Last login: Tue Mar 17 08:38:15 2015 from missie-pc.lan
[hadoop@cSlave02 ~]$ jps
4154 Jps
2921 NodeManager
 
The jps output shows no DataNode process on either cSlave00 or cSlave02: the DataNodes on both nodes have crashed.
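Before restarting anything, it is worth checking the DataNode log on a failed node for the reason it died (a sketch; the log file name follows the same naming pattern as the .out files visible in the startup output further below):

[hadoop@cSlave00 ~]$ tail -n 50 /home/hadoop/hadoop-2.5.2/logs/hadoop-hadoop-datanode-cSlave00.log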
 
So I restarted the cluster:
1. Stop YARN and DFS
[hadoop@cMaster hadoop-2.5.2]$ sbin/stop-yarn.sh 
[hadoop@cMaster hadoop-2.5.2]$ sbin/stop-dfs.sh
 
2. Restart DFS and YARN
[hadoop@cMaster hadoop-2.5.2]$ sbin/start-dfs.sh
[hadoop@cMaster hadoop-2.5.2]$ sbin/start-yarn.sh
 
15/03/18 10:04:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [cMaster]
cMaster: starting namenode, logging to /home/hadoop/hadoop-2.5.2/logs/hadoop-hadoop-namenode-cMaster.out
cSlave00: starting datanode, logging to /home/hadoop/hadoop-2.5.2/logs/hadoop-hadoop-datanode-cSlave00.out
cSlave02: starting datanode, logging to /home/hadoop/hadoop-2.5.2/logs/hadoop-hadoop-datanode-cSlave02.out
cSlave01: starting datanode, logging to /home/hadoop/hadoop-2.5.2/logs/hadoop-hadoop-datanode-cSlave01.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/hadoop/hadoop-2.5.2/logs/hadoop-hadoop-secondarynamenode-cMaster.out
15/03/18 10:04:35 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
 
 
starting yarn daemons
starting resourcemanager, logging to /home/hadoop/hadoop-2.5.2/logs/yarn-hadoop-resourcemanager-cMaster.out
cSlave01: starting nodemanager, logging to /home/hadoop/hadoop-2.5.2/logs/yarn-hadoop-nodemanager-cSlave01.out
cSlave02: starting nodemanager, logging to /home/hadoop/hadoop-2.5.2/logs/yarn-hadoop-nodemanager-cSlave02.out
cSlave00: starting nodemanager, logging to /home/hadoop/hadoop-2.5.2/logs/yarn-hadoop-nodemanager-cSlave00.out
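With the daemons back up, a quick report from the NameNode confirms that all three DataNodes have rejoined the cluster (a sketch; running jps on each slave again works equally well):

[hadoop@cMaster hadoop-2.5.2]$ bin/hdfs dfsadmin -report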
 
 
This resolved the Error: org.apache.hadoop.hdfs.BlockMissingException described above.
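As a side note, a full stop/start of the whole cluster is not strictly required: the crashed daemon can also be brought back individually on each affected node (a sketch; run on cSlave00 and cSlave02). If the DataNodes keep dying after a restart, the cause recorded in their logs has to be fixed first.

[hadoop@cSlave00 ~]$ /home/hadoop/hadoop-2.5.2/sbin/hadoop-daemon.sh start datanode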
 
Reference: http://blog.csdn.net/yonghutwo/article/details/39937185
