Similar questions:
- Using the GeoSpark library with Spark/Java
- From a ResultSet to a Spark DataFrame using Java
- GeoSpark with Spark/Java
- Undefined function: "ST_GeomFromText" with Spark/Java
I think you did not fully follow the GeoSparkSQL-Overview/#quick-start guide.
Per the quick start, you need to add geospark-core and geospark-sql to your project's POM.xml or build.sbt:
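If you use sbt instead of Maven, a build.sbt sketch with the same coordinates and versions as the Maven dependencies below (the artifact/version pairing here is my assumption from those coordinates, not taken verbatim from the quick start):

```scala
// build.sbt sketch: same org.datasyslab 1.3.1 artifacts and JTS 1.13 as the POM
libraryDependencies ++= Seq(
  "org.datasyslab"     % "geospark"         % "1.3.1",
  "org.datasyslab"     % "geospark-sql_2.3" % "1.3.1",
  "org.datasyslab"     % "geospark-viz_2.3" % "1.3.1",
  "com.vividsolutions" % "jts"              % "1.13"
)
```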
<!-- GeoSpark lib doc - https://datasystemslab.github.io/GeoSpark/api/sql/GeoSparkSQL-Overview/#quick-start -->
<dependency>
    <groupId>org.datasyslab</groupId>
    <artifactId>geospark-sql_2.3</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>com.vividsolutions</groupId>
    <artifactId>jts</artifactId>
    <version>1.13</version>
</dependency>
<dependency>
    <groupId>org.datasyslab</groupId>
    <artifactId>geospark-viz_2.3</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.datasyslab</groupId>
    <artifactId>geospark</artifactId>
    <version>1.3.1</version>
</dependency>

Declare your Spark session:
SparkSession sparkSession = SparkSession.builder()
        .config("spark.serializer", KryoSerializer.class.getName())
        .config("spark.kryo.registrator", GeoSparkKryoRegistrator.class.getName())
        .master("local[*]")
        .appName("myGeoSparkSQLdemo")
        .getOrCreate();

Register all functions from geospark-sql_2.3 with the sparkSession so they can be used directly in Spark SQL:

// register all functions from geospark-sql_2.3 to sparkSession
GeoSparkSQLRegistrator.registerAll(sparkSession);
Now, here is the working example:
SparkSession sparkSession = SparkSession.builder()
        .config("spark.serializer", KryoSerializer.class.getName())
        .config("spark.kryo.registrator", GeoSparkKryoRegistrator.class.getName())
        .master("local[*]")
        .appName("myGeoSparkSQLdemo")
        .getOrCreate();

// register all functions from geospark-sql_2.3 to sparkSession
GeoSparkSQLRegistrator.registerAll(sparkSession);

try {
    System.out.println(sparkSession.catalog().getFunction("ST_Geomfromtext"));
    // Function[name='ST_GeomFromText', className='org.apache.spark.sql.geosparksql.expressions.ST_GeomFromText$', isTemporary='true']
} catch (Exception e) {
    e.printStackTrace();
}

// https://datasystemslab.github.io/GeoSpark/api/sql/GeoSparkSQL-Function/
Dataset<Row> dataframe = sparkSession.sql("select ST_GeomFromText('POINT(-7.07378166 33.826661)')");
dataframe.show(false);
dataframe.printSchema();

// using the longitude and latitude columns of an existing dataframe
Dataset<Row> df = sparkSession.sql("select -7.07378166 as longitude, 33.826661 as latitude");
df.withColumn("ST_Geomfromtext",
        expr("ST_GeomFromText(CONCAT('POINT(',longitude,' ',latitude,')'))"))
  .show(false);
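To see why the `CONCAT(...)` expression above works, note that ST_GeomFromText expects a WKT string with longitude first and latitude second. A plain-Java sketch of the string that expression builds (the `toWktPoint` helper is hypothetical, just for illustration):

```java
// Builds the same WKT POINT string as the SQL expression
// CONCAT('POINT(',longitude,' ',latitude,')') used above.
public class WktPointDemo {
    // hypothetical helper: WKT puts longitude (x) before latitude (y)
    static String toWktPoint(double longitude, double latitude) {
        return "POINT(" + longitude + " " + latitude + ")";
    }

    public static void main(String[] args) {
        System.out.println(toWktPoint(-7.07378166, 33.826661));
        // POINT(-7.07378166 33.826661)
    }
}
```

Exact decimal formatting may differ slightly between Java's `Double.toString` and Spark's SQL string conversion, but the POINT(x y) shape is what matters to ST_GeomFromText.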


