Posted by santaclaus on 2018-10-29 09:33:49

Installing a Spark 2.1 + Hadoop 2.7.3 cluster on Ubuntu

hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar wordcount /testin/str.txt testout  
17/02/24 11:20:59 INFO client.RMProxy: Connecting to ResourceManager at spark1/192.168.100.25:8032
17/02/24 11:21:01 INFO input.FileInputFormat: Total input paths to process : 1
17/02/24 11:21:01 INFO mapreduce.JobSubmitter: number of splits:1
17/02/24 11:21:02 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1487839487040_0002
17/02/24 11:21:06 INFO impl.YarnClientImpl: Submitted application application_1487839487040_0002
17/02/24 11:21:06 INFO mapreduce.Job: The url to track the job: http://spark1:8088/proxy/application_1487839487040_0002/
17/02/24 11:21:06 INFO mapreduce.Job: Running job: job_1487839487040_0002
17/02/24 11:21:28 INFO mapreduce.Job: Job job_1487839487040_0002 running in uber mode : false
17/02/24 11:21:28 INFO mapreduce.Job:  map 0% reduce 0%
17/02/24 11:22:00 INFO mapreduce.Job:  map 100% reduce 0%
17/02/24 11:22:15 INFO mapreduce.Job:  map 100% reduce 100%
17/02/24 11:22:17 INFO mapreduce.Job: Job job_1487839487040_0002 completed successfully
17/02/24 11:22:17 INFO mapreduce.Job: Counters: 49
      File System Counters
                FILE: Number of bytes read=212115
                FILE: Number of bytes written=661449
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=377966
                HDFS: Number of bytes written=154893
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
      Job Counters
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=23275
                Total time spent by all reduces in occupied slots (ms)=11670
                Total time spent by all map tasks (ms)=23275
                Total time spent by all reduce tasks (ms)=11670
                Total vcore-milliseconds taken by all map tasks=23275
                Total vcore-milliseconds taken by all reduce tasks=11670
                Total megabyte-milliseconds taken by all map tasks=23833600
                Total megabyte-milliseconds taken by all reduce tasks=11950080
      Map-Reduce Framework
                Map input records=1635
                Map output records=63958
                Map output bytes=633105
                Map output materialized bytes=212115
                Input split bytes=98
                Combine input records=63958
                Combine output records=14478
                Reduce input groups=14478
                Reduce shuffle bytes=212115
                Reduce input records=14478
                Reduce output records=14478
                Spilled Records=28956
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=429
                CPU time spent (ms)=10770
                Physical memory (bytes) snapshot=455565312
                Virtual memory (bytes) snapshot=1391718400
                Total committed heap usage (bytes)=277348352
      Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
      File Input Format Counters
                Bytes Read=377868
      File Output Format Counters
                Bytes Written=154893
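The counters above summarize what the job did: 1635 input records produced 63958 map output records, which the combiner collapsed to 14478 distinct words before the single reducer wrote 14478 output records. The core logic that the hadoop-mapreduce-examples WordCount job performs can be sketched locally in plain Python (an illustration of the computation only, not the Hadoop code itself; the sample text is made up):

```python
from collections import Counter

def wordcount(text: str) -> Counter:
    # Map stage: emit each whitespace-separated word with count 1.
    # Combine/reduce stage: sum the counts per word.
    # Counter does both in one step for a local, single-process sketch.
    return Counter(text.split())

# Hypothetical sample input standing in for /testin/str.txt.
sample = "spark hadoop spark yarn hadoop spark"
counts = wordcount(sample)
print(counts["spark"])   # 3
print(counts["hadoop"])  # 2
```

In the real job, "Map output records" corresponds to the total number of emitted (word, 1) pairs, while "Combine output records" and "Reduce output records" correspond to the number of distinct words.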

