There are 0 datanode(s) running and no node(s) are excluded in this operation.

Date: 2022-03-30 15:21:20

While importing files into Hadoop, the following error is reported:

....

There are 0 datanode(s) running and no node(s) are excluded in this operation.

....

Check the configuration

$hadoop_home/hadoop/etc/hdfs-site.xml

<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:/home/sparkuser/myspark/hadoop/hdfs/name</value>
</property>
<property>
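<!-- The original snippet breaks off after this second <property>. On a typical
     single-node setup it continues with dfs.datanode.data.dir; the path below is
     an assumption mirroring the name dir above, not taken from the original file. -->
  <name>dfs.datanode.data.dir</name>
  <value>file:/home/sparkuser/myspark/hadoop/hdfs/data</value>
</property>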

Solution

Delete all files under the hdfs directory configured above (here /home/sparkuser/myspark/hadoop/hdfs).
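A minimal sketch of that cleanup, assuming the name and data directories both live under /home/sparkuser/myspark/hadoop/hdfs as in the hdfs-site.xml above; the format and restart steps are the usual follow-up to wiping the NameNode directory, not something spelled out in the original note.

$HADOOP_HOME/sbin/stop-dfs.sh                   # stop HDFS before touching the directories
rm -rf /home/sparkuser/myspark/hadoop/hdfs/*    # wipe name/ and data/ -- this destroys all HDFS data
hdfs namenode -format                           # re-initialize the NameNode metadata
$HADOOP_HOME/sbin/start-dfs.sh                  # restart HDFS; the DataNodes should register again

This usually works because the underlying cause is a clusterID mismatch between the NameNode and DataNode directories left over from an earlier format, so the DataNodes start but never register with the NameNode.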

1. Check whether the NameNode's port 9000 is open (this is the port configured under fs.default.name in core-site.xml), because every DataNode has to connect to the NameNode through this port.
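A quick way to run this check (Master is the NameNode's hostname as it appears in the hosts file in step 3; substitute your own):

jps                              # the NameNode process should appear in the list
netstat -tlnp | grep 9000        # port 9000 should be LISTENing; if it is bound to 127.0.0.1, remote DataNodes cannot reach it (see step 3)
telnet Master 9000               # run from a DataNode; a successful connection means the port is reachable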

2. Turn off the firewall, because it may be blocking connections from the other machines. Use the following commands:

**Check and stop the firewall**

service iptables status      # check whether iptables is currently running
service iptables stop        # stop it for the current session
chkconfig iptables off       # keep it from starting again at boot
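The service/chkconfig commands above apply to older SysV-init systems. On distributions that use systemd with firewalld (CentOS 7 and later, for example), the equivalent steps would be:

systemctl status firewalld       # check whether firewalld is running
systemctl stop firewalld         # stop it now
systemctl disable firewalld      # prevent it from starting at boot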

3. In the /etc/hosts file, comment out the following lines:

127.0.0.1 localhost
::1 localhost6 Master
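After the edit, the hosts file should no longer map the Master hostname to a loopback address; a sketch of the result, where 192.168.1.10 stands in for the NameNode's real IP (an assumption, adjust to your network):

#127.0.0.1 localhost
#::1 localhost6 Master
192.168.1.10   Master            # placeholder address: replace with the NameNode's actual IP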