Problem 1

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Solution

To suppress the warning, edit hadoop/etc/hadoop/log4j.properties (e.g. with vim) and add:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
Note that this only silences the message; the native library is still not loaded, and Hadoop keeps using the built-in Java classes.
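Before (or instead of) hiding the warning, you can check which native libraries Hadoop can actually load. A minimal sketch, assuming the hadoop command is on PATH:

```shell
# List every native library Hadoop tries to load and whether it succeeded
hadoop checknative -a
# Each line reports a library (hadoop, zlib, snappy, ...) as true/false;
# "hadoop: false" is the condition that triggers the NativeCodeLoader warning.
```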

Problem 2

Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try. (Nodes: current=[DatanodeInfoWithStorage[192.168.1.125:50010,DS-c41f5f60-fb7e-4afd-814e-d4ee05623630,DISK], DatanodeInfoWithStorage[192.168.1.1
Solution

Modify hdfs-site.xml:

<property>
    <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
    <value>NEVER</value>
</property>
<property>
    <name>dfs.client.block.write.replace-datanode-on-failure.enable</name>
    <value>true</value>
</property>
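The same client-side settings can also be passed per command through Hadoop's generic -D options, without editing hdfs-site.xml. A sketch; the file paths are placeholders:

```shell
# Apply the replace-datanode-on-failure settings for a single upload
# (localfile.txt and /user/hadoop/ are placeholder paths)
hdfs dfs -D dfs.client.block.write.replace-datanode-on-failure.policy=NEVER \
         -D dfs.client.block.write.replace-datanode-on-failure.enable=true \
         -put localfile.txt /user/hadoop/
```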

Problem 3

Download error 1: calling the Hadoop API from Java fails with java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.

Solution

Extract the corresponding Hadoop .tar.gz package on Windows, set the environment variable HADOOP_HOME to E:\hadoop-3.2.0\hadoop-3.2.0, and restart Eclipse.
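The variable can also be set from a Windows command prompt instead of the system dialog. A sketch using the path from this article; note that setx only affects newly started programs, which is why Eclipse must be restarted:

```shell
:: Persistently set HADOOP_HOME for the current user (Windows cmd)
setx HADOOP_HOME "E:\hadoop-3.2.0\hadoop-3.2.0"
:: Add the bin directory to PATH so tools such as winutils.exe can be found
setx PATH "%PATH%;E:\hadoop-3.2.0\hadoop-3.2.0\bin"
```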

Problem 4

Download error 2: java.io.FileNotFoundException: Could not locate Hadoop executable: E:\hadoop-3.2.0\hadoop-3.2.0\bin\winutils.exe
Solution

Extract a Windows build of Hadoop, copy winutils.exe into the bin directory of the distribution referenced by HADOOP_HOME, and run the program again.

When running a MapReduce job, it fails with:
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

Solution

Run hadoop classpath on the command line to get the classpath, then add it to yarn-site.xml:

<property>
    <name>yarn.application.classpath</name>
    <value>/usr/local/webserver/hadoop-3.2.0/etc/hadoop,/usr/local/webserver/hadoop-3.2.0/share/hadoop/common/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/common/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/hdfs,/usr/local/webserver/hadoop-3.2.0/share/hadoop/hdfs/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/hdfs/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/mapreduce/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/mapreduce/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/yarn,/usr/local/webserver/hadoop-3.2.0/share/hadoop/yarn/lib/*,/usr/local/webserver/hadoop-3.2.0/share/hadoop/yarn/*</value>
</property>
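The value above is essentially the output of hadoop classpath with the separators converted, since yarn.application.classpath takes comma-separated entries. A sketch, assuming the hadoop command is on PATH on the cluster node:

```shell
# Print the classpath the Hadoop scripts compute (colon-separated)
hadoop classpath
# Convert it to the comma-separated form yarn.application.classpath expects
hadoop classpath | tr ':' ','
```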

Problem 5

The error org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete means HDFS is in safe mode, so files cannot be deleted.

Solution
hadoop dfsadmin -safemode leave
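In Hadoop 3.x the hadoop dfsadmin form still works but prints a deprecation notice; hdfs dfsadmin is the current equivalent. A sketch, assuming an HDFS client configured for the cluster:

```shell
# Check whether the NameNode is currently in safe mode
hdfs dfsadmin -safemode get
# Leave safe mode explicitly (same effect as the deprecated hadoop dfsadmin form)
hdfs dfsadmin -safemode leave
```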
Last modification: September 17, 2019