Hadoop cross-host Docker cluster configuration: uploading to HDFS fails, but a single physical machine reports no error

  docker, question

First, let me describe my setup. I have one physical machine as the master node, hereinafter referred to as master, plus two other servers, hereinafter referred to as node1 and node2. Docker containers slave1-slave10 are configured on node1 and slave11-slave20 on node2.

I have already made sure that master can ssh into every slave node. The firewalls (iptables) on master and on both nodes are turned off, the NAT table rules have been cleared, and the FORWARD chain policy is set to ACCEPT (without this setting, traffic is not forwarded). The Docker containers have no firewall installed, so there is nothing to turn off there.
When I connect only the 10 containers of either one of the servers, the distributed system runs normally. However, when I connect both servers at the same time, the following error appears when uploading files to HDFS:
18/07/31 15:42:20 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.123.1:8032
 18/07/31 15:42:21 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.123.1:8032
 18/07/31 15:42:22 INFO mapred.FileInputFormat: Total input paths to process : 1
 18/07/31 15:42:25 INFO hdfs.DFSClient: Exception in createBlockOutputStream
 java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.123.24:50010
 at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1482)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
 18/07/31 15:42:25 INFO hdfs.DFSClient: Abandoning BP-557839422-192.168.123.1-1533022646989:blk_1073741831_1007
 18/07/31 15:42:25 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.123.24:50010,DS-7062ca95-5971-4c80-87f7-5ea1a2f9f448,DISK]
 18/07/31 15:42:30 INFO hdfs.DFSClient: Exception in createBlockOutputStream
 java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.123.19:50010
 at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1482)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
 18/07/31 15:42:30 INFO hdfs.DFSClient: Abandoning BP-557839422-192.168.123.1-1533022646989:blk_1073741832_1008
 18/07/31 15:42:30 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.123.19:50010,DS-9f25c91c-4b25-4dc3-9581-581ba2d4d79c,DISK]
 18/07/31 15:42:41 INFO hdfs.DFSClient: Exception in createBlockOutputStream
 java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.123.22:50010
 at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1482)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
 18/07/31 15:42:41 INFO hdfs.DFSClient: Abandoning BP-557839422-192.168.123.1-1533022646989:blk_1073741833_1009
 18/07/31 15:42:41 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.123.22:50010,DS-45f819cc-a3b5-44a9-8a98-75f9442d5dd4,DISK]
 18/07/31 15:42:45 INFO hdfs.DFSClient: Exception in createBlockOutputStream
 java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.123.17:50010
 at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1482)
 at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
 at org.apache.hadoop.hdfs.DFSOutp
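
For what it's worth, my understanding of the log above is that firstBadLink names the first datanode in the write pipeline that the previous node (or the client) failed to reach, so the failure points back at cross-host connectivity rather than at HDFS itself. Below is a minimal reachability sketch, not a definitive test: the IP addresses and port 50010 are simply taken from the firstBadLink lines above, and the assumption is that it is run both from master and from a container on each server.

# Minimal sketch: probe the datanode data-transfer port from different hosts.
# Assumptions: the IPs come from the firstBadLink log lines above; 50010 is
# the default dfs.datanode.address port that appears in those lines.
import socket

DATANODES = ["192.168.123.17", "192.168.123.19", "192.168.123.22", "192.168.123.24"]
PORT = 50010
TIMEOUT_SECONDS = 3

for ip in DATANODES:
    try:
        # A plain TCP connect shows whether forwarding/firewall rules between
        # the two servers let the HDFS write pipeline through at all.
        with socket.create_connection((ip, PORT), timeout=TIMEOUT_SECONDS):
            print(f"{ip}:{PORT} reachable")
    except OSError as exc:
        print(f"{ip}:{PORT} NOT reachable: {exc}")

If every address is reachable from master but some are not reachable from containers on the other server, that would match the symptom of the cluster working on either server alone but failing when both are used together.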

From what I found online, most answers say the cause is a firewall that was not turned off, but on my master and nodes both iptables and ufw are completely off, and there is no firewall inside the Docker containers at all; I even forced a firewall to be installed inside the containers just so I could turn it off.
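Since forwarding keeps coming up, here is a small check, just a sketch under the assumption that node1 and node2 are ordinary Linux hosts, that reads the standard kernel switches cross-host Docker traffic depends on; it is independent of whether iptables/ufw are installed:

# Sketch: confirm kernel forwarding settings on node1/node2 (assumption: plain Linux hosts).
def read_proc(path):
    with open(path) as f:
        return f.read().strip()

# "1" means IPv4 forwarding is enabled; Docker normally turns this on itself.
print("ip_forward:", read_proc("/proc/sys/net/ipv4/ip_forward"))

# If this is "1", bridged container traffic still passes through iptables,
# so a leftover FORWARD rule can drop it even with ufw disabled.
try:
    print("bridge-nf-call-iptables:", read_proc("/proc/sys/net/bridge/bridge-nf-call-iptables"))
except FileNotFoundError:
    print("bridge-nf-call-iptables: br_netfilter module not loaded")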
Thank you all ~

Hello, has your problem been solved? I have run into the same problem. Could you tell me how you solved it? Many thanks. qq 1738127840