Spark Streaming error: Caused by: org.apache.spark.SparkException: This RDD lacks a SparkContext. It could happen in the following cases
The code that triggers the error (completed and cleaned up; the original snippet was cut off mid-statement):

val result1 = words.transform((rdd, rddTime) => {
  println("---------batch triggered--------top-----")
  println("running the 5s batch rdd transform")
  println(
    LocalDateTime
      .ofEpochSecond(rddTime.milliseconds / 1000, 0, ZoneOffset.ofHours(8))
      .format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"))
  )
  println("--end---")
  rdd.foreach(str => {
    // BUG: this line references rdd.sparkContext inside the rdd.foreach closure.
    // The closure is serialized and run on the executors, where the RDD's
    // SparkContext reference is null, so Spark throws "This RDD lacks a
    // SparkContext". Fetch the accumulator on the driver, outside
    // rdd.foreach { ... }, instead.
    val droppedWordsCounter = AccumulatorIfDataOut.getInstance(rdd.sparkContext)
    if (str._1 == "yy") {
      println("a \"yy\" arrived in the small window")
      droppedWordsCounter.add(1L)
    }
  })
  rdd
})
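A minimal sketch of the fix, assuming AccumulatorIfDataOut is a lazily instantiated singleton around a LongAccumulator (modeled on Spark's RecoverableNetworkWordCount example; the exact definition is not shown in the original post). The key change is fetching the accumulator once on the driver, before entering the rdd.foreach closure, so the closure captures only the serializable accumulator and never touches SparkContext:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.util.LongAccumulator

// Assumed definition: a lazily instantiated singleton accumulator.
object AccumulatorIfDataOut {
  @volatile private var instance: LongAccumulator = _

  def getInstance(sc: SparkContext): LongAccumulator = {
    if (instance == null) {
      synchronized {
        if (instance == null) {
          instance = sc.longAccumulator("AccumulatorIfDataOut")
        }
      }
    }
    instance
  }
}

val result1 = words.transform((rdd, rddTime) => {
  // Driver side: rdd.sparkContext is valid here.
  val droppedWordsCounter = AccumulatorIfDataOut.getInstance(rdd.sparkContext)
  rdd.foreach { str =>
    // Executor side: the closure captures only droppedWordsCounter,
    // which is serializable; SparkContext is never referenced here.
    if (str._1 == "yy") {
      droppedWordsCounter.add(1L)
    }
  }
  rdd // transform must return an RDD for the downstream DStream
})
```

The same rule applies to broadcast variables: obtain them on the driver (inside transform/foreachRDD but outside any executor-side closure) and let the closure capture only the handle.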


