For Spark 2.2+:

```scala
import spark.implicits._

val jsonStr = """{ "metadata": { "key": 84896, "value": 54 }}"""
val df = spark.read.json(Seq(jsonStr).toDS)
```

For Spark 2.1.x:
```scala
val events = sc.parallelize("""{"action":"create","timestamp":"2016-01-07T00:01:17Z"}""" :: Nil)
val df = sqlContext.read.json(events)
```

Note: this uses the `sqlContext.read.json(jsonRDD: RDD[String])` overload. There is also `sqlContext.read.json(path: String)`, which reads a JSON file directly.
For older versions:

```scala
val jsonStr = """{ "metadata": { "key": 84896, "value": 54 }}"""
val rdd = sc.parallelize(Seq(jsonStr))
val df = sqlContext.read.json(rdd)
```


