1. Create the text file to count
cd /test
vi wordcount.txt

File contents:

hello word
hello kafka
hello hello
spark hadoop
kafka spark
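If you prefer not to type the lines into vi by hand, the same file can be created non-interactively with a heredoc (a small sketch; `mkdir -p` is added here in case /test does not exist yet):

```shell
# create the directory if it does not exist yet, then write the sample file
mkdir -p /test
cd /test
cat > wordcount.txt <<'EOF'
hello word
hello kafka
hello hello
spark hadoop
kafka spark
EOF
```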
2. Console operations
[root@cmaster sbin]# spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/07/22 00:09:28 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/07/22 00:10:06 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
19/07/22 00:10:06 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
19/07/22 00:10:09 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.43.116:4040
Spark context available as 'sc' (master = local[*], app id = local-1563779372587).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_141)
Type in expressions to have them evaluated.
Type :help for more information.

scala> val file=sc.textFile("file:///test/wordcount.txt");
file: org.apache.spark.rdd.RDD[String] = file:///test/wordcount.txt MapPartitionsRDD[1] at textFile at <console>:24

scala> val words=file.flatMap(_.split(" "));
words: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[2] at flatMap at <console>:26

scala> val kv=words.map((_,1));
kv: org.apache.spark.rdd.RDD[(String, Int)] = MapPartitionsRDD[3] at map at <console>:28

scala> val res=kv.reduceByKey(_+_);
res: org.apache.spark.rdd.RDD[(String, Int)] = ShuffledRDD[4] at reduceByKey at <console>:30

scala> res.foreach(println _);
(spark,2)
(hadoop,1)
(word,1)
(hello,4)
(kafka,2)

scala> res.saveAsTextFile("file:///test/output");

Note: saveAsTextFile fails with a FileAlreadyExistsException if the target directory already exists, so the results are saved to a new directory (file:///test/output) rather than to /test itself.
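The same four-step pipeline can be checked on plain Scala collections, without a running Spark cluster. This is only an illustrative sketch (the object name and the inlined `lines` are made up for the example); `reduceByKey` is approximated here with `groupBy` plus a sum, which gives the same per-word totals:

```scala
object WordCountSketch {
  def main(args: Array[String]): Unit = {
    // The file's lines, inlined instead of read from disk
    val lines = Seq("hello word", "hello kafka", "hello hello", "spark hadoop", "kafka spark")

    val words = lines.flatMap(_.split(" "))   // one element per word
    val kv    = words.map((_, 1))             // pair each word with a count of 1
    // reduceByKey on an RDD ~ groupBy + sum on a local collection
    val res   = kv.groupBy(_._1).map { case (w, pairs) => (w, pairs.map(_._2).sum) }

    res.toSeq.sortBy(-_._2).foreach(println)  // highest counts first
  }
}
```

Running this prints hello with a count of 4 first, matching the (hello,4), (spark,2), (kafka,2), (hadoop,1), (word,1) totals the shell session produced.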
3. Distinguishing flatMap from map
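The diagram from the original post is not reproduced here, but the difference is easy to show on plain Scala collections (a small illustrative sketch): `map` produces exactly one output element per input element, while `flatMap` flattens each per-element result into a single sequence.

```scala
object MapVsFlatMap {
  def main(args: Array[String]): Unit = {
    val lines = Seq("hello word", "hello kafka")

    // map: one result per input line, so we get a sequence of Arrays
    val mapped = lines.map(_.split(" "))
    println(mapped.map(_.mkString("[", ", ", "]")).mkString(" "))
    // prints: [hello, word] [hello, kafka]

    // flatMap: the per-line Arrays are flattened into one sequence of words
    val flatMapped = lines.flatMap(_.split(" "))
    println(flatMapped.mkString(" "))
    // prints: hello word hello kafka
  }
}
```

This is why the word-count pipeline above uses `flatMap` for the split step: the counting stage needs a flat stream of words, not a nested structure of one array per line.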