bdgeise/geomesa-geospark

 
 

Combine geospatial tools on Spark

I need to combine GeoMesa and GeoSpark on Spark; see apache/sedona#253.

To execute, use:

make run

This fails with:

ClassCastException: org.apache.spark.sql.catalyst.expressions.UnsafeArrayData cannot be cast to org.apache.spark.sql.catalyst.InternalRow

when the two libraries' registrators are not kept separate. When registering GeoSpark's functions under a prefix, as suggested in apache/sedona#253:

Catalog.expressions.foreach(f => FunctionRegistry.builtin.registerFunction("geospark_" + f.getClass.getSimpleName.dropRight(1), f))
Catalog.aggregateExpressions.foreach(f => sparkSession.udf.register("geospark_" + f.getClass.getSimpleName, f))

the exception goes away, but the GeoMesa implementations are still the ones used. When renaming calls to geospark_ST_Point(x, y), the functions no longer seem to be defined.
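One possible explanation (an assumption, not verified here): FunctionRegistry.builtin is the template registry that Spark clones when a session is created, so functions registered into it after the SparkSession already exists may never reach sparkSession.sessionState.functionRegistry, which is what SQL name resolution actually consults. A sketch of registering the prefixed builders into the session registry instead, assuming (as the snippet above implies) that Catalog.expressions yields Spark FunctionBuilder values and a Spark 2.x API:

```scala
// Sketch, not a verified fix: register GeoSpark's expression builders
// into the session-local registry rather than the shared builtin one.
Catalog.expressions.foreach { f =>
  val name = "geospark_" + f.getClass.getSimpleName.dropRight(1)
  // createOrReplaceTempFunction targets sessionState.functionRegistry,
  // which SQL lookup consults for the current session.
  sparkSession.sessionState.functionRegistry
    .createOrReplaceTempFunction(name, f)
}
```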

I can't find them in:

FunctionRegistry.functionSet.foreach(println)
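FunctionRegistry.functionSet only reflects the builtin registry, so it may simply not show session-level registrations. As a cross-check (standard Spark APIs, though untested against this particular setup), the session's own view of registered functions can be listed:

```scala
// List functions as the current session's resolver sees them.
sparkSession.catalog.listFunctions()
  .filter(f => f.name.startsWith("geospark_"))
  .show(false)

// Or via SQL:
sparkSession.sql("SHOW FUNCTIONS LIKE 'geospark_*'").show(false)
```

If the prefixed names appear here but calls still fail, the problem is more likely in how the names are resolved at parse time than in the registration itself.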

About

Integration of geomesa and geospark