Hi,
Please help me with the error below.
I am trying to save my RDD word-count result using the code below:
val rdd1 = newrdd.count()
rdd1.saveAsTextFile("path")
Error: saveAsTextFile is not a member of Long
Please advise.
saveAsTextFile is a method on RDD.
count returns a Long.
You will need to wrap the Long in an RDD in order to call that method,
or just use something like java.nio.file.Files.write.
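A minimal sketch of both options, assuming a running Spark shell with a SparkContext `sc` and an existing RDD named `newrdd` (names taken from the thread; the output paths are placeholders):

```scala
// count() returns a plain Long, not an RDD
val total: Long = newrdd.count()

// Option 1: wrap the Long in a one-element RDD so saveAsTextFile is available
sc.parallelize(Seq(total)).saveAsTextFile("/tmp/wordcount-total")

// Option 2: skip Spark and write the number with plain Java I/O
import java.nio.file.{Files, Paths}
Files.write(Paths.get("/tmp/wordcount-total.txt"), total.toString.getBytes)
```

Option 1 produces a directory of part files (the usual Spark output layout); option 2 produces a single ordinary file, which is usually what you want for one number.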
newRdd.count()
doesn't return the word count; it returns the number of elements (lines) in the RDD as a Long. You may want to read the documentation more closely.
It pretty much depends on what you are trying to do. Even if count returned a DataFrame/Dataset (which it does not), the resulting file would contain just a single number. Is that what you're trying to achieve?
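To illustrate the distinction on a plain Scala collection (no Spark needed), using the sample line from the thread's abc.txt: counting elements is not the same as counting words.

```scala
// Local sketch of the difference (plain Scala, no Spark).
val lines = Seq("Hi Friends Welcome to learning platform", "")

val lineCount = lines.size                                     // elements in the collection
val wordCount = lines.flatMap(_.split(" ")).count(_.nonEmpty)  // non-empty tokens

println(s"lines = $lineCount, words = $wordCount")  // lines = 2, words = 6
```

This is what count() on the RDD corresponds to: the element count (2), not the word count (6).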
I tried the below but got an error. Please advise.
The RDD is created as below:
val newrdd = spark.read.textFile("/home/aaaa/tutorial/abc.txt")
scala> val res = newrdd.collect()
res: Array[String] = Array(Hi Friends Welcome to learning platform, "")
Now, count the words. The RDD is ready and contains a set of words.
Count the total number of words in the RDD.
scala> val splitdata = newrdd.flatMap(line => line.split(" "));
splitdata: org.apache.spark.sql.Dataset[String] = [value: string]
scala> val splitdata = res.flatMap(line => line.split(" "));
splitdata: Array[String] = Array(Hi, Friends, Welcome, to, learning, platform, "")
scala> val mapdata = splitdata.map(word => (word,1));
mapdata: Array[(String, Int)] = Array((Hi,1), (Friends,1), (Welcome,1), (to,1), (Play,1), (the,1), (digital,1), (learning,1), (platform,1), ("",1))
scala> val reducedata = mapdata.reduceByKey(_ + _);
:31: error: value reduceByKey is not a member of Array[(String, Int)]
       val reducedata = mapdata.reduceByKey(_ + _);
The aim is to:
Save the result RDD.
The word count is ready,
but I am getting the error "saveAsTextFile is not a member of Long"
and also "saveAsTextFile is not a member of Array[String]".
Please advise.
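One way the session above could be corrected, as a sketch: reduceByKey is only defined on an RDD of pairs, not on a local Array (which is what collect returns), and saveAsTextFile is likewise only defined on RDDs. So keep the data in an RDD end to end. This assumes a running Spark shell with `sc` available; the input path is taken from the thread and the output path is a placeholder.

```scala
// Sketch of a corrected word count, assuming a Spark shell with `sc`.
val lines = sc.textFile("/home/aaaa/tutorial/abc.txt")  // RDD[String], not a local Array

val counts = lines
  .flatMap(line => line.split(" "))  // split each line into words
  .filter(_.nonEmpty)                // drop the empty tokens seen in the transcript
  .map(word => (word, 1))            // pair each word with 1
  .reduceByKey(_ + _)                // valid: this is an RDD[(String, Int)]

// saveAsTextFile works here because `counts` is still an RDD
counts.saveAsTextFile("/home/aaaa/tutorial/wordcount-output")
```

The key difference from the session above is that nothing is collected to the driver before reducing, so both reduceByKey and saveAsTextFile remain available.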