
Spark Learning Notes 001 [spark-mysql + spark-hive]


Launching the spark-sql shell in local mode
./spark-sql --master local[4] \
  --jars /home/hadoop/software/mysql-connector-java-5.1.27-bin.jar \
  --driver-class-path /home/hadoop/software/mysql-connector-java-5.1.27-bin.jar
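Once inside the shell, SQL statements run interactively; a single statement can also be executed with the `-e` flag inherited from the Hive CLI (a sketch; the statement here is just an example, not from the original note):

```shell
# Run one statement without entering the interactive shell
./spark-sql --master local[4] \
  --jars /home/hadoop/software/mysql-connector-java-5.1.27-bin.jar \
  --driver-class-path /home/hadoop/software/mysql-connector-java-5.1.27-bin.jar \
  -e "show databases;"
```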
Submitting a Spark job
./bin/spark-submit \
  --class test001 \
  --master local \
  /home/hadoop/jars/com.xx.bigdata-2.0.jar \
  /home/hadoop/data/84-0.txt /home/hadoop/data/result
  
A Spark starter example (word count)
import org.apache.spark.{SparkConf, SparkContext}

object test001 {

  def main(args: Array[String]): Unit = {
    // args(0) = input file, args(1) = output directory
    val input_path = args(0)
    val output_path = args(1)

    val conf = new SparkConf().setAppName(this.getClass.getSimpleName).setMaster("local[4]")
    val sc = new SparkContext(conf)

    // Word count: split on spaces, count each word, sort by count descending
    val data = sc.textFile(input_path)
    val result = data.flatMap(_.split(" "))
      .map(x => (x, 1))
      .reduceByKey(_ + _)
      .sortBy(_._2, ascending = false)

    result.saveAsTextFile(output_path)

    sc.stop()
  }
}
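The RDD chain above can be hard to picture without a cluster; here is the same word-count logic on a plain Scala collection (a local sketch using the standard library, not the Spark API):

```scala
// Word count on a plain Scala collection, mirroring the RDD pipeline:
// flatMap/split -> count per word -> sort by count descending.
object WordCountLocal {
  def wordCount(lines: Seq[String]): Seq[(String, Int)] =
    lines
      .flatMap(_.split(" "))                // tokenize on spaces
      .groupBy(identity)                    // group equal words
      .map { case (w, ws) => (w, ws.size) } // count each group
      .toSeq
      .sortBy(-_._2)                        // highest count first

  def main(args: Array[String]): Unit = {
    println(wordCount(Seq("a b a", "b a"))) // "a" occurs 3 times, "b" twice
  }
}
```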
Reading from and writing to MySQL with Spark
Maven: add the required dependencies

        <dependency>
            <groupId>com.typesafe</groupId>
            <artifactId>config</artifactId>
            <version>1.3.3</version>
        </dependency>

        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.47</version>
        </dependency>
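The two artifacts above cover only the config library and the JDBC driver; the project also needs the Spark modules themselves. A hedged fragment (the `2.11` Scala suffix and `2.4.0` version are assumptions, not from the original note, and should match the cluster's build):

```xml
<!-- versions below are assumptions; align with the installed Spark/Scala -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
```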
Reading and writing MySQL over JDBC
import org.apache.spark.sql.{SaveMode, SparkSession}
import java.util.Properties

object session_source_jdbc {
  def main(args: Array[String]): Unit = {

    val session = SparkSession.builder().master("local[4]").appName("read_jdbc").getOrCreate()

    // Read the TBLS table of the Hive metastore database over JDBC
    val url = "jdbc:mysql://192.168.2.123:3306/hadoop_hive?useUnicode=true&characterEncoding=UTF-8"
    val table = "TBLS"
    val reader = session.read.format("jdbc")
      .option("url", url)
      .option("dbtable", table)
      .option("driver", "com.mysql.jdbc.Driver")
      .option("user", "root")
      .option("password", "root")
    val frame = reader.load()

    // Register as a temp view so it can be queried with SQL
    frame.createOrReplaceTempView("temp1")

    val frame1 = session.sql("select TBL_ID, CREATE_TIME, OWNER from temp1 where SD_ID <= 8")
    frame1.show()

    // Write the query result back to a local MySQL database
    val url_local = "jdbc:mysql://localhost:3306/mysql001?useUnicode=true&characterEncoding=UTF-8"
    val prop = new Properties()
    prop.setProperty("user", "root")
    prop.setProperty("password", "123456")
    println("MySQL connection configured")

//    frame1.write.mode(SaveMode.Append).jdbc(url_local, "spark2myql", prop)

    session.stop()
  }
}
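The note's title also mentions spark-hive. A minimal sketch of querying Hive directly through SparkSession, assuming a Spark build with Hive support and a hive-site.xml on the classpath (this example is not in the original note):

```scala
import org.apache.spark.sql.SparkSession

object session_hive {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() needs Hive classes on the classpath and a
    // reachable metastore (e.g. the hadoop_hive MySQL database above)
    val session = SparkSession.builder()
      .master("local[4]")
      .appName("read_hive")
      .enableHiveSupport()
      .getOrCreate()

    // List Hive databases; any HiveQL statement can be run the same way
    session.sql("show databases").show()

    session.stop()
  }
}
```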