Seeking advice on this Hadoop error: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/testfiles/testfiles/file1.txt could only be replicated to 0 nodes, instead of 1. This has been troubling me for a long time. I have tried everything I could think of, including checking permissions, starting the daemons in order, and reformatting HDFS, but it still fails no matter how many times I reconfigure. Has anyone run into this?
1 Answer

largeQ
Contributed 2039 experience points · earned 8+ upvotes
Judging from the code, you probably have not started any datanodes, or all of your datanodes have lost their connection to the namenode. Check the namenode's web UI to confirm how many live datanodes it sees.
The chooseTarget method is what selects n datanodes to store your file.
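As a quick sanity check from the shell (assuming the Hadoop bin directory is on your PATH; exact command names differ slightly between Hadoop 1.x and 2.x, and the log path below is the default layout):

```shell
# Check that a DataNode JVM is actually running on each slave node
jps | grep DataNode

# Ask the namenode how many live datanodes it sees
# (on Hadoop 1.x the command is `hadoop dfsadmin -report`)
hdfs dfsadmin -report

# If no datanode stays up, check its log for the reason --
# a common one after reformatting HDFS is a namespaceID mismatch
tail -n 50 $HADOOP_HOME/logs/hadoop-*-datanode-*.log
```

If the report shows 0 live datanodes, the write cannot place even a single replica, which is exactly the "replicated to 0 nodes, instead of 1" error.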
/**
 * Choose target datanodes according to the replication policy.
 *
 * @throws IOException
 *           if the number of targets < minimum replication.
 * @see BlockPlacementPolicy#chooseTarget(String, int, DatanodeDescriptor,
 *      List, boolean, HashMap, long)
 */
public DatanodeDescriptor[] chooseTarget(final String src,
    final int numOfReplicas,
    final DatanodeDescriptor client,
    final HashMap<Node, Node> excludedNodes,
    final long blocksize) throws IOException {
  // choose targets for the new block to be allocated.
  final DatanodeDescriptor targets[] = blockplacement.chooseTarget(src,
      numOfReplicas, client, new ArrayList<DatanodeDescriptor>(), false,
      excludedNodes, blocksize);
  if (targets.length < minReplication) {
    throw new IOException("File " + src + " could only be replicated to "
        + targets.length + " nodes instead of minReplication (="
        + minReplication + "). There are "
        + getDatanodeManager().getNetworkTopology().getNumOfLeaves()
        + " datanode(s) running and "
        + (excludedNodes == null? "no": excludedNodes.size())
        + " node(s) are excluded in this operation.");
  }
  return targets;
}
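The minReplication value in that check comes from the cluster configuration. As an illustrative fragment (property names as in Hadoop 2.x; adjust for your version), a single-node hdfs-site.xml would typically set:

```xml
<configuration>
  <!-- One copy per block is enough on a single-node cluster -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <!-- A block write succeeds once this many replicas are placed -->
  <property>
    <name>dfs.namenode.replication.min</name>
    <value>1</value>
  </property>
</configuration>
```

Even with these set to 1, the exception above is thrown whenever zero datanodes are available to accept the block, so fixing the datanode side comes first.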