[Question] Converting a pandas dataframe to a spark dataframe produces null values

Original poster: zeus83157 (zeus83157)   2018-07-14 19:53:10
from pyspark import SparkContext
from pyspark.sql import SQLContext
.
.
.
sc = SparkContext()
sqlContext = SQLContext(sc)
.
.
.
people1 = sqlContext.createDataFrame(people1)
.
.
.
where people1 is a pandas DataFrame
https://imgur.com/a/ebzMO7Z
In the screenshot, the left side is the Spark DataFrame and the right side is the pandas DataFrame.
As you can see at the yellow circle, the pandas DataFrame clearly has values there, but after converting to Spark they disappear (become null).
Has anyone run into this before?
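
For reference, a minimal standalone sketch of the same conversion (with made-up data, not the actual DataFrame). It assumes the affected column mixes types (e.g. numbers and strings in one column), which is one common reason values turn into null after createDataFrame, since Spark infers a single type per column from a sample of rows; casting to one dtype in pandas and passing an explicit schema are possible workarounds, not a confirmed fix:

import pandas as pd
from pyspark import SparkContext
from pyspark.sql import SQLContext
from pyspark.sql.types import StructType, StructField, StringType

sc = SparkContext()
sqlContext = SQLContext(sc)

# Hypothetical data: the "id" column mixes an int and a string.
pdf = pd.DataFrame({"name": ["a", "b"], "id": [1, "x2"]})

# Possible workaround 1 (assumption): force the mixed column to a single dtype first.
pdf["id"] = pdf["id"].astype(str)

# Possible workaround 2 (assumption): pass an explicit schema instead of
# letting Spark infer the column types from a sample of rows.
schema = StructType([
    StructField("name", StringType(), True),
    StructField("id", StringType(), True),
])

people1 = sqlContext.createDataFrame(pdf, schema=schema)
people1.show()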
