
A NoClassDefFoundError: StaticLoggerBinder logging problem encountered while building a recommender system


Contents
  • Problem
  • Analysis
  • Solution


Problem

While building a recommender system with Spark, the program failed at startup with an error saying a logging class could not be found:

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:222)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:127)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.SparkContext.initializeLogIfNecessary(SparkContext.scala:80)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:102)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:101)
	at org.apache.spark.SparkContext.initializeLogIfNecessary(SparkContext.scala:80)
	at org.apache.spark.internal.Logging.log(Logging.scala:49)
	at org.apache.spark.internal.Logging.log$(Logging.scala:47)
	at org.apache.spark.SparkContext.log(SparkContext.scala:80)
	at org.apache.spark.internal.Logging.logInfo(Logging.scala:57)
	at org.apache.spark.internal.Logging.logInfo$(Logging.scala:56)
	at org.apache.spark.SparkContext.logInfo(SparkContext.scala:80)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:186)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
	at com.study.statistics.StatisticsRecommender$.main(StatisticsRecommender.scala:25)
	at com.study.statistics.StatisticsRecommender.main(StatisticsRecommender.scala)
Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 21 more

Process finished with exit code 1

Analysis

Following the stack trace step by step into the source code confirmed that the class really is missing from the logging jars on the classpath.
Online answers suggested a missing dependency, but after checking the Maven dependencies, both the logging package and the binding package were imported, which briefly made me suspect that a log4j upgrade had removed this class.
Then it occurred to me that the program connects to a MySQL instance on Linux, and that MySQL serves as Hive's metastore storage, so Spark may be going through Hive to reach MySQL. (This is only my guess.)

Solution

As an experiment, I added the Hive dependency to the pom:


	<dependency>
	    <groupId>org.apache.hive</groupId>
	    <artifactId>hive-exec</artifactId>
	    <version>3.1.2</version>
	</dependency>

Then I ran the program again:

Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
2021-11-27 00:17:20 [driver-heartbeater] WARN  ProcfsMetricsGetter:69 - Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped

Process finished with exit code 0

The build and run now succeed, which shows that importing the Hive dependency is enough.
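A likely explanation (not verified here) is that hive-exec transitively pulls in log4j and an SLF4J binding jar, which supplies the missing org.slf4j.impl.StaticLoggerBinder class. If you do not otherwise need Hive, a more targeted alternative would be to declare an SLF4J binding directly; a sketch, where the version number is an assumption and should be kept in line with the SLF4J API version Spark already uses:

```xml
<!-- slf4j-log4j12 contains org.slf4j.impl.StaticLoggerBinder,
     the class the stack trace reports as missing.
     The version below is an assumption; match it to the
     slf4j-api version on your classpath. -->
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.30</version>
</dependency>
```

Either way, running `mvn dependency:tree -Dincludes=org.slf4j` shows which SLF4J artifacts actually end up on the classpath, and can confirm where the binding came from.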
