
spark-submit job submission: cannot connect to the MySQL database, and ClassNotFound for the class to execute

Scala + MySQL: "The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server."

Connection failure when submitting the job

The jar was built locally in IDEA and copied to the virtual machine; submitting the job there produced the following error. Here is the relevant part of the stack trace.

Exception in thread "main" java.sql.SQLNonTransientConnectionException: Could not create connection to database server. Attempted reconnect 3 times. Giving up.
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:110)
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:89)
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:63)
        at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:73)
        at com.mysql.cj.jdbc.ConnectionImpl.connectWithRetries(ConnectionImpl.java:906)
        at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:831)
        at com.mysql.cj.jdbc.ConnectionImpl.&lt;init&gt;(ConnectionImpl.java:456)
        at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:246)
        at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:199)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$createConnectionFactory$1(JdbcUtils.scala:64)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:56)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:226)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:339)
        at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:279)
        at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:268)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:268)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:203)
        at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:294)
        at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:336)
        at summ.ReqGenerateProdSrc_7$.main(ReqGenerateProdSrc_7.scala:50)
        at summ.ReqGenerateProdSrc_7.main(ReqGenerateProdSrc_7.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: com.mysql.cj.exceptions.CJCommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:61)
        at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:105)
        at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:151)
        at com.mysql.cj.exceptions.ExceptionFactory.createCommunicationsException(ExceptionFactory.java:167)
        at com.mysql.cj.protocol.a.NativeSocketConnection.connect(NativeSocketConnection.java:91)
        at com.mysql.cj.NativeSession.connect(NativeSession.java:144)
        at com.mysql.cj.jdbc.ConnectionImpl.connectWithRetries(ConnectionImpl.java:850)
        ... 30 more
Caused by: java.net.ConnectException: 拒绝连接 (Connection refused)
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:589)
        at com.mysql.cj.protocol.StandardSocketFactory.connect(StandardSocketFactory.java:155)
        at com.mysql.cj.protocol.a.NativeSocketConnection.connect(NativeSocketConnection.java:65)
        ... 32 more
Solution

I searched for the problem on Baidu, and many bloggers have posted fixes, but none of them quite worked for me. Below is the solution that did work, along with the places where things can go wrong.

    In the data-source connection code written in IDEA, enable automatic reconnection: the parameters autoReconnect=true and failOverReadOnly=false must be appended to the JDBC URL. A sample snippet:
val url = "jdbc:mysql://172.1.3.111:3306/alm_result_0?useUnicode=true&characterEncoding=UTF-8&autoReconnect=true&failOverReadOnly=false&serverTimezone=UTC"
My setup connects the virtual machine to my local host: MySQL runs on my local machine, so the IP in the URL also has to be changed to my local machine's IP. A minimal sketch of how this URL is used is shown below.
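For context, here is a minimal sketch of reading a MySQL table through that URL with Spark's JDBC data source. The table name, user and password below are placeholders invented for illustration; they are not from the original code.

import java.util.Properties
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("ReqGenerateProdSrc_7").getOrCreate()

// Placeholder credentials -- replace with your own.
val props = new Properties()
props.setProperty("user", "root")
props.setProperty("password", "your_password")
props.setProperty("driver", "com.mysql.cj.jdbc.Driver")

val url = "jdbc:mysql://172.1.3.111:3306/alm_result_0?useUnicode=true&characterEncoding=UTF-8&autoReconnect=true&failOverReadOnly=false&serverTimezone=UTC"

// "some_table" is a hypothetical table name used only for this example.
val df = spark.read.jdbc(url, "some_table", props)
df.show(10)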
    Comment out setMaster("local[*]") and .master("local[*]") so the master is not hard-coded; when the job is submitted with spark-submit, the --master flag decides where it runs.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

val conf: SparkConf = new SparkConf()
//  .setMaster("local[*]")
    .setAppName("ReqGenerateProdSrc_7")
val sc = new SparkContext(conf)
val sparkSession: SparkSession = SparkSession.builder()
//  .master("local[*]")
    .config(conf)
    .getOrCreate()
    When submitting the job I kept getting a ClassNotFound error. Having verified that it was not the database connection, the next suspect was the jar itself: when building the jar in IDEA's Artifacts settings, remove the extra bundled dependency jars and keep only your own code; if a dependency is later reported as missing, add it back then. (An sbt equivalent is sketched below.)
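If the project is built with sbt rather than through IDEA Artifacts, the same idea can be expressed by marking the Spark dependencies as provided so they are not bundled into the application jar. The versions below are assumptions for illustration only; the post does not state them.

// build.sbt -- illustrative sketch, versions are assumed
name := "batch"
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "3.1.2" % "provided",  // supplied by the cluster at runtime, not bundled
  "mysql" % "mysql-connector-java" % "8.0.28"                // the JDBC driver still has to be available to the job
)

Alternatively, the MySQL driver jar can be handed to spark-submit with --jars instead of being packed into the application jar.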
The spark-submit command is attached below; note that all spark-submit options must come before the application jar, otherwise they are passed to the application as arguments.
bin/spark-submit \
--class summ.ReqGenerateProdSrc_7 \
--master yarn \
--deploy-mode client \
--driver-memory 2g \
--executor-memory 2g \
--num-executors 2 \
--executor-cores 24 \
/opt/module/jar/batch.jar

Did you manage to slack off today~ feel free to leave a comment.
