Cannot convert RDD of BigDecimal into dataframe

Cannot convert RDD of BigDecimal into dataframe

Jose Rivera-Rubio
Hi guys,

I have a list of BigDecimal obtained through SparkSQL from some Parquet files.

list: List[BigDecimal] = List(1015.00, 580.00, 290.00, 1160.00)

When I try to convert them to a DataFrame to visualize them using the Zeppelin context:

val df = sc.parallelize(list).toDF("list_of_numbers")

I get the following error:

error: value toDF is not a member of org.apache.spark.rdd.RDD[BigDecimal]

If I map the values to Double I get the same error.

Any ideas?

Thanks!
Re: Cannot convert RDD of BigDecimal into dataframe

moon
Administrator
I think you can define a case class and map the list of BigDecimal to a list of your case class.

If you parallelize this list, then toDF will work.
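A minimal sketch of that suggestion, assuming a Spark 1.x shell or Zeppelin paragraph where `sc` and `sqlContext` are already in scope (the case class name `Amount` is just illustrative):

```scala
// The toDF implicit comes from here; without this import, toDF is
// "not a member of" any RDD.
import sqlContext.implicits._

// Wrapping each value in a case class gives Spark a Product type it
// can derive a schema from; a bare RDD[BigDecimal] does not qualify.
case class Amount(list_of_numbers: BigDecimal)

val list: List[BigDecimal] = List(1015.00, 580.00, 290.00, 1160.00)

// Map to the case class before (or after) parallelizing; toDF now resolves.
val df = sc.parallelize(list.map(Amount(_))).toDF()
df.printSchema()
```

The key point is that the implicit conversion backing `toDF` is defined for RDDs of `Product` subtypes (case classes, tuples), which is why mapping to `Double` alone does not help.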

Thanks,
moon
On Thu, Oct 8, 2015 at 4:40 PM Jose Rivera-Rubio <[hidden email]> wrote: