Hadoop: java.net.ConnectException: to 0.0.0.0:10020 failed on connection

Date: 2021-11-27 05:45:01
 
 
This problem usually appears on Hadoop 2.x: when a MapReduce job finishes, the client needs to contact the MapReduce JobHistory Server to fetch the final job status and counters. If mapreduce.jobhistory.address has never been set, it defaults to 0.0.0.0:10020 and the connection is refused. Point it at the real master host in mapred-site.xml:

<property>
    <name>mapreduce.jobhistory.address</name>
    <!-- Set this to the actual master hostname and port (itcast01 in this cluster) -->
    <value>itcast01:10020</value>
</property>

<property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <!-- Set this to the actual master hostname and port of the history server web UI -->
    <value>itcast01:19888</value>
</property>

Finally, don't forget to start the JobHistory Server:

 
$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver
 
 
********************************************************************
a. Contents of C:\words.txt:
hello alamp s
hello qq
 
hello xx
 
hello aa
 
hello swk
 
hello zbj
 
hello blm
 
blm   xixi
zbj  hehe
swk   haha
b. The Java code:
 
@Test
public void testUpload() throws IllegalArgumentException, IOException {
    // Stream the local file into HDFS; the last argument to copyBytes
    // closes both streams once the copy finishes.
    FSDataOutputStream out = fs.create(new Path("hdfs://itcast01:9000/upload2"));
    FileInputStream in = new FileInputStream(new File("c:/words.txt"));
    IOUtils.copyBytes(in, out, 2048, true);
}
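
The fs handle used in testUpload() is not initialized in the snippet above. Below is a minimal sketch of the surrounding test class; the class and package names are made up for illustration, and the NameNode address hdfs://itcast01:9000 and the user root are assumptions taken from the cluster shown elsewhere in this post:

package cn.itcast.hadoop.hdfs;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.junit.Before;

public class HdfsUploadTest {

    private FileSystem fs;

    @Before
    public void setUp() throws Exception {
        // Connect to the NameNode as user "root" before each test runs
        // (hostname and user are assumptions matching this cluster).
        fs = FileSystem.get(new URI("hdfs://itcast01:9000"), new Configuration(), "root");
    }

    // testUpload() from above goes here.
}
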
The WordCount job driver (it reads /upload2 and writes its result to /usr/mapperReduce):

package cn.itcast.hadoop.mr;
 
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
 
 
public class WordCount {
 
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.setInt("mapreduce.client.submit.file.replication", 20);
        Job job = Job.getInstance(conf);
 
        // tell the framework which jar contains the job classes
        job.setJarByClass(WordCount.class);
 
        // set the mapper's properties
        job.setMapperClass(WCMapper.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(LongWritable.class);
        FileInputFormat.setInputPaths(job, new Path("/upload2/"));
 
        // set the reducer's properties
        job.setReducerClass(WCReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileOutputFormat.setOutputPath(job, new Path("/usr/mapperReduce"));
 
        // submit the job and wait for it to finish
        job.waitForCompletion(true);
    }
 
}
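
WCMapper and WCReducer are referenced by the driver but not included in the original post. The sketches below are illustrations only, written to match the Text/LongWritable key-value types set on the job; splitting each line on a single space would also explain the empty-word count of 11 that appears in the output further down.

// WCMapper.java
package cn.itcast.hadoop.mr;

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Emits (word, 1) for every token of each input line.
public class WCMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        for (String word : value.toString().split(" ")) {
            context.write(new Text(word), new LongWritable(1));
        }
    }
}

// WCReducer.java
package cn.itcast.hadoop.mr;

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Sums the counts emitted for each word.
public class WCReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
    @Override
    protected void reduce(Text key, Iterable<LongWritable> values, Context context)
            throws IOException, InterruptedException {
        long sum = 0;
        for (LongWritable v : values) {
            sum += v.get();
        }
        context.write(key, new LongWritable(sum));
    }
}
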
c. Package the job as a jar and run it on the cluster:
      
[root@itcast01 usr]# hadoop jar mr.jar
17/05/20 16:44:09 INFO client.RMProxy: Connecting to ResourceManager at itcast01/192.168.233.128:8032
17/05/20 16:44:10 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
17/05/20 16:44:11 INFO input.FileInputFormat: Total input paths to process : 1
17/05/20 16:44:11 INFO mapreduce.JobSubmitter: number of splits:1
17/05/20 16:44:11 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1495288787912_0006
17/05/20 16:44:12 INFO impl.YarnClientImpl: Submitted application application_1495288787912_0006
17/05/20 16:44:12 INFO mapreduce.Job: The url to track the job: http://itcast01:8088/proxy/application_1495288787912_0006/
17/05/20 16:44:12 INFO mapreduce.Job: Running job: job_1495288787912_0006
17/05/20 16:44:55 INFO mapreduce.Job: Job job_1495288787912_0006 running in uber mode : false
17/05/20 16:44:55 INFO mapreduce.Job:  map 0% reduce 0%
17/05/20 16:46:24 INFO mapreduce.Job:  map 100% reduce 0%
17/05/20 16:47:02 INFO mapreduce.Job:  map 100% reduce 100%
17/05/20 16:47:03 INFO mapreduce.Job: Job job_1495288787912_0006 completed successfully
17/05/20 16:47:06 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=435
                FILE: Number of bytes written=186471
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=210
                HDFS: Number of bytes written=78
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=65092
                Total time spent by all reduces in occupied slots (ms)=32649
                Total time spent by all map tasks (ms)=65092
                Total time spent by all reduce tasks (ms)=32649
                Total vcore-seconds taken by all map tasks=65092
                Total vcore-seconds taken by all reduce tasks=32649
                Total megabyte-seconds taken by all map tasks=66654208
                Total megabyte-seconds taken by all reduce tasks=33432576
        Map-Reduce Framework
                Map input records=17
                Map output records=32
                Map output bytes=365
                Map output materialized bytes=435
                Input split bytes=93
                Combine input records=0
                Combine output records=0
                Reduce input groups=13
                Reduce shuffle bytes=435
                Reduce input records=32
                Reduce output records=13
                Spilled Records=64
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=290
                CPU time spent (ms)=5530
                Physical memory (bytes) snapshot=284258304
                Virtual memory (bytes) snapshot=1685770240
                Total committed heap usage (bytes)=136515584
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=117
        File Output Format Counters
                Bytes Written=78
 
[root@itcast01 usr]# hadoop fs -ls /
Found 9 items
-rw-r--r--   1 root supergroup  153512879 2017-05-17 05:34 /jdk
-rw-r--r--   1 root supergroup         62 2017-05-20 13:54 /score_in
drwx------   - root supergroup          0 2017-05-17 06:15 /tmp
-rw-r--r--   3 root supergroup         32 2017-05-17 14:47 /upload
-rw-r--r--   3 root supergroup        117 2017-05-20 16:35 /upload2
drwxr-xr-x   - root supergroup          0 2017-05-20 16:44 /usr
drwxr-xr-x   - root supergroup          0 2017-05-17 06:18 /wcout
-rw-r--r--   1 root supergroup         70 2017-05-17 06:12 /words
drwxr-xr-x   - root supergroup          0 2017-05-20 15:23 /xx
[root@itcast01 usr]# hadoop fs -cat /usr/upload2/part-r-00000
cat: `/usr/upload2/part-r-00000': No such file or directory

The output is not under /usr/upload2; the job wrote it to the output path passed to FileOutputFormat, /usr/mapperReduce:

[root@itcast01 usr]# hadoop fs -ls /usr/
Found 4 items
drwxr-xr-x   - root supergroup          0 2017-05-17 14:54 /usr/local
drwxr-xr-x   - root supergroup          0 2017-05-20 16:47 /usr/mapperReduce
drwxr-xr-x   - root supergroup          0 2017-05-20 16:08 /usr/swk
drwxr-xr-x   - root supergroup          0 2017-05-17 11:25 /usr/test
[root@itcast01 usr]# hadoop fs -ls /usr/mapperReduce
Found 2 items
-rw-r--r--   1 root supergroup          0 2017-05-20 16:47 /usr/mapperReduce/_SUCCESS
-rw-r--r--   1 root supergroup         78 2017-05-20 16:46 /usr/mapperReduce/part-r-00000
[root@itcast01 usr]# hadoop fs -cat /usr/mapperReduce/part-r-00000
        11
aa      1
alamp   1
blm     2
haha    1
hehe    1
hello   7
qq      1
s       1
swk     2
xixi    1
xx      1
zbj     2
[root@itcast01 usr]#
