Posted by zycchen on 2017-12-18 12:35:00

Linux Consolidation Notes (5): Compiling Your Own Code and Running a MapReduce Program on Hadoop 2.7.4

This note walks through compiling the WordCount example by hand with javac, packaging it into a jar, and submitting it to a Hadoop 2.7.4 cluster running YARN.

Start with an empty /home/classes directory that will hold the compiled class files:

[root@master ~]# tree /home/classes
  /home/classes

  0 directories, 0 files

The WordCount source sits in /home/javaFile/:

[root@master ~]# tree /home/javaFile/
  /home/javaFile/
  └── WordCount.java
  0 directories, 1 file
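
The source file itself is not reproduced in the transcript. The class names seen in the compiled output below (WordCount, TokenizerMapper, IntSumReducer) match the stock WordCount example from the Hadoop 2.7 MapReduce tutorial, so WordCount.java presumably looks roughly like this:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: split each input line into tokens and emit (word, 1)
  public static class TokenizerMapper
       extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer (also used as combiner): sum the counts for each word
  public static class IntSumReducer
       extends Reducer<Text, IntWritable, Text, IntWritable> {

    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);   // the combiner explains the Combine input/output counters in the job log
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // input path, e.g. /hdfs-input.txt
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output dir, e.g. /result-self-compile
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Note that there is no package declaration, which matches the bare class name WordCount used when the job is submitted later.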

Compiling against the MapReduce API needs a few Hadoop jars on the classpath; they have been copied into /home/jars/:

[root@master ~]# tree /home/jars/
  /home/jars/
  ├── commons-cli-1.4.jar
  ├── hadoop-common-2.7.4.jar
  └── hadoop-mapreduce-client-core-2.7.4.jar
  0 directories, 3 files
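
The compile command itself did not survive in the transcript. Assuming the paths shown above, an invocation along these lines produces the class files listed next:

  javac -classpath /home/jars/hadoop-common-2.7.4.jar:/home/jars/hadoop-mapreduce-client-core-2.7.4.jar:/home/jars/commons-cli-1.4.jar \
        -d /home/classes /home/javaFile/WordCount.java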

After compiling, /home/classes contains the main class plus the two inner classes:

[root@master classes]# tree .
  .
  ├── WordCount.class
  ├── WordCount$IntSumReducer.class
  └── WordCount$TokenizerMapper.class
  0 directories, 3 files
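
Next the class files are packaged into wordc.jar. The exact command was not captured either; something like the following, run from /home/classes, produces the jar output shown below (the glob also picks up the $-suffixed inner-class files):

  jar cvf wordc.jar WordCount*.class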

  added manifest
  adding: WordCount.class(in = 1907) (out= 1040)(deflated 45%)
  adding: WordCount$IntSumReducer.class(in = 1739) (out= 742)(deflated 57%)
  adding: WordCount$TokenizerMapper.class(in = 1736) (out= 753)(deflated 56%)

The jar now sits alongside the class files:

[root@master classes]# tree .
  .
  ├── wordc.jar
  ├── WordCount.class
  ├── WordCount$IntSumReducer.class
  └── WordCount$TokenizerMapper.class
  0 directories, 4 files
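
With the jar built, the job is submitted to YARN. The input file /hdfs-input.txt was already uploaded to HDFS in an earlier step (it shows up in the listing at the end), and /result-self-compile is the output directory. The submit command was also lost from the transcript; based on those paths it would look like:

  hadoop jar wordc.jar WordCount /hdfs-input.txt /result-self-compile

The job log that follows is the output of that run.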

  17/09/02 02:11:45 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.0.80:8032
  17/09/02 02:11:47 INFO input.FileInputFormat: Total input paths to process : 1
  17/09/02 02:11:47 INFO mapreduce.JobSubmitter: number of splits:1
  17/09/02 02:11:47 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1504320356950_0010
  17/09/02 02:11:47 INFO impl.YarnClientImpl: Submitted application application_1504320356950_0010
  17/09/02 02:11:47 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1504320356950_0010/
  17/09/02 02:11:47 INFO mapreduce.Job: Running job: job_1504320356950_0010
  17/09/02 02:11:56 INFO mapreduce.Job: Job job_1504320356950_0010 running in uber mode : false
  17/09/02 02:11:56 INFO mapreduce.Job: map 0% reduce 0%
  17/09/02 02:12:02 INFO mapreduce.Job: map 100% reduce 0%
  17/09/02 02:12:09 INFO mapreduce.Job: map 100% reduce 100%
  17/09/02 02:12:09 INFO mapreduce.Job: Job job_1504320356950_0010 completed successfully
  17/09/02 02:12:10 INFO mapreduce.Job: Counters: 49
  File System Counters
      FILE: Number of bytes read=118
      FILE: Number of bytes written=241697
      FILE: Number of read operations=0
      FILE: Number of large read operations=0
      FILE: Number of write operations=0
      HDFS: Number of bytes read=174
      HDFS: Number of bytes written=76
      HDFS: Number of read operations=6
      HDFS: Number of large read operations=0
      HDFS: Number of write operations=2
  Job Counters
      Launched map tasks=1
      Launched reduce tasks=1
      Data-local map tasks=1
      Total time spent by all maps in occupied slots (ms)=3745
      Total time spent by all reduces in occupied slots (ms)=4081
      Total time spent by all map tasks (ms)=3745
      Total time spent by all reduce tasks (ms)=4081
      Total vcore-milliseconds taken by all map tasks=3745
      Total vcore-milliseconds taken by all reduce tasks=4081
      Total megabyte-milliseconds taken by all map tasks=3834880
      Total megabyte-milliseconds taken by all reduce tasks=4178944
  Map-Reduce Framework
      Map input records=6
      Map output records=12
      Map output bytes=118
      Map output materialized bytes=118
      Input split bytes=98
      Combine input records=12
      Combine output records=9
      Reduce input groups=9
      Reduce shuffle bytes=118
      Reduce input records=9
      Reduce output records=9
      Spilled Records=18
      Shuffled Maps =1
      Failed Shuffles=0
      Merged Map outputs=1
      GC time elapsed (ms)=155
      CPU time spent (ms)=1430
      Physical memory (bytes) snapshot=299466752
      Virtual memory (bytes) snapshot=4159479808
      Total committed heap usage (bytes)=141385728
  Shuffle Errors
      BAD_ID=0
      CONNECTION=0
      IO_ERROR=0
      WRONG_LENGTH=0
      WRONG_MAP=0
      WRONG_REDUCE=0
  File Input Format Counters
      Bytes Read=76
  File Output Format Counters
      Bytes Written=76

Finally, list the HDFS root: the input file is still there and the job has created the /result-self-compile output directory:

[root@master classes]# hadoop fs -ls /
  Found 3 items
  -rw-r--r--   2 root supergroup         76 2017-09-02 00:57 /hdfs-input.txt
  drwxr-xr-x   - root supergroup          0 2017-09-02 02:12 /result-self-compile
  drwx------   - root supergroup          0 2017-09-02 02:11 /tmp
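
To read the actual word counts, cat the reducer output file. With a single reducer the result lands in the standard part-r-00000 file (file name assumed, it is not shown in the transcript):

  hadoop fs -cat /result-self-compile/part-r-00000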
