mm111222 posted on 2016-12-4 09:50:19

hadoop error: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException

  Error message:
  org.apache.hadoop.hdfs.DFSClient: Failed to close file
  org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException)
  Fix:
  Raise the Linux open-file limits

# Raise the system-wide file handle limit
echo "fs.file-max = 65535" >> /etc/sysctl.conf
# Raise the per-user open-file limit (applies to sessions started after this edit)
echo "* - nofile 65535" >> /etc/security/limits.conf
# Apply the sysctl change immediately
sysctl -p
# Verify the per-process limit in the current shell
ulimit -n
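A quick sketch for checking that the new limits actually took effect, assuming a Linux host (the `/proc` paths below are Linux-specific, and the PID substitution is only an illustration):

```shell
# System-wide cap on open file handles (set via fs.file-max)
cat /proc/sys/fs/file-max

# Per-process limit for the current shell; note that limits.conf
# changes only apply to sessions started after the edit
ulimit -n

# Example: count the file descriptors currently open in this shell ($$);
# substitute the DataNode's PID here to inspect the daemon itself
ls /proc/$$/fd | wc -l
```

If `ulimit -n` still reports the old value, log out and back in (or restart the Hadoop daemons) so the new limits.conf entry is picked up.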
  Edit the Hadoop configuration:
vi hdfs-site.xml

<property>
<name>dfs.datanode.max.xcievers</name>
<value>8192</value>
</property>
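Note that `dfs.datanode.max.xcievers` (the misspelling is in Hadoop itself) is the legacy property name; in Hadoop 2.x and later it was renamed to `dfs.datanode.max.transfer.threads`. A sketch of the equivalent entry for a newer hdfs-site.xml, assuming the same value of 8192:

```xml
<property>
<name>dfs.datanode.max.transfer.threads</name>
<value>8192</value>
</property>
```

Restart the DataNodes after changing this setting so the new thread limit is applied.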
 
 
 