Java client connection to HDFS deployed on Docker

  docker, question

The HDFS application has been successfully deployed on Docker, and the jps command shows that the NameNode and DataNode start normally. The browser displays the host-IP:50070 page.
I use Java in Eclipse to connect to HDFS with the following code:

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

Configuration conf = new Configuration();
try {
    FileSystem hdfs = FileSystem.get(new URI("hdfs://10.8.2.11:9999"), conf);
    // copyFromLocalFile(delSrc, overwrite, src, dst): local source first, HDFS destination second
    Path localSrc = new Path("E:/AB4/hdfs/worksapce");
    Path hdfsDst = new Path("/abcloud");
    hdfs.copyFromLocalFile(false, true, localSrc, hdfsDst);
} catch (IOException | URISyntaxException e) {
    e.printStackTrace();
}

Running result:

org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /abcloud/test.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
 at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1559)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3245)
 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:663)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:482)
 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:975)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2036)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2034)
 
 at org.apache.hadoop.ipc.Client.call(Client.java:1411)
 at org.apache.hadoop.ipc.Client.call(Client.java:1364)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
 at com.sun.proxy.$Proxy7.addBlock(Unknown Source)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
 at java.lang.reflect.Method.invoke(Unknown Source)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
 at com.sun.proxy.$Proxy7.addBlock(Unknown Source)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:368)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1449)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1270)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:526)

Has anyone successfully deployed HDFS on Docker and used it through the Java API? Please share your experience.

Update: after connecting with the eclipse-hadoop plug-in, the file does appear in the file system, but it has no content.

Hello, I have the same problem. Uploading from the master's command line inside Docker works fine. But after mapping port 9000 to port 9000 on the host and uploading through the Java API at hdfs://localhost:9000, I hit the same problem as you: the file is created, but its size is 0, and Eclipse reports the same error as yours. Have you solved your problem yet? It has been a long time.
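Not confirmed by this thread, but a common cause of this exact symptom (file created with size 0, "could only be replicated to 0 nodes") in Docker deployments is that the NameNode hands the client the DataNode's container-internal IP, which is unreachable from outside the container. A sketch of a client-side workaround, assuming a Hadoop 2.x client and that the DataNode's hostname and data-transfer port (50010 by default) are reachable from the client machine:

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

// Sketch, not a verified fix for this thread: tell the client to contact
// DataNodes by hostname instead of the IP the NameNode reports. The hostname
// must then resolve to the Docker host (e.g. via an /etc/hosts entry), and
// the DataNode port must be published by the container.
Configuration conf = new Configuration();
conf.set("dfs.client.use.datanode.hostname", "true");
FileSystem hdfs = FileSystem.get(new URI("hdfs://localhost:9000"), conf);
```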