Custom registrator provided for registering classes specific to the Spark OJAI connector. This registrator should be used when Kryo serialization is enabled for the Spark application.
sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
         .set("spark.kryo.registrator", "com.mapr.db.spark.OJAIKryoRegistrator")
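A minimal end-to-end sketch of enabling Kryo with the connector's registrator; the application name and master URL are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Enable Kryo serialization and register the connector's classes.
// "local[*]" and the app name are illustrative placeholders.
val conf = new SparkConf()
  .setAppName("ojai-kryo-example")
  .setMaster("local[*]")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "com.mapr.db.spark.OJAIKryoRegistrator")

val sc = new SparkContext(conf)
```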
The field class provides the functionality to represent query conditions.
the name of the field in the MapRDB table.
An equality condition can be represented by field("a.c.d") === 10. Similarly, a greater-than-or-equal condition can be represented by field("a.c.d") >= 10.
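As a sketch of how such conditions are used in practice, a condition built from field can be pushed down when loading a table. The `where` filter and the table path are assumptions for illustration, not taken from this page:

```scala
import com.mapr.db.spark._
import com.mapr.db.spark.field

// `sc` is an existing SparkContext; the table path is a placeholder.
// The `where` method, which pushes the condition down to MapRDB,
// is assumed from the connector API.
val filtered = sc.loadMapRDBTable("/tables/user_profiles")
                 .where(field("a.c.d") >= 10)
```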
MapRDBSpark is a static class which contains factory methods to create Scala's OJAI document and partitioner objects.
Factory functions to help create Scala's OJAI documents:

  val doc = MapRDBSpark.newDocument(jsonString)
  val doc = MapRDBSpark.newDocument(document: org.ojai.Document)

Factory functions to help create partitioners:

  val partitioner = MapRDBSpark.newPartitioner(tableName)

This creates a partitioner using the splits specified in tableName.

  val partitioner = MapRDBSpark.newPartitioner(Seq("AA", "CC"))

This creates a partitioner using the splits provided in the sequence. Here, three splits will be created: (null, "AA"), ("AA", "CC") and ("CC", null). Note that this call assumes that the user supplies the splits in sorted order.
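A short sketch tying the two factories together; the JSON values and table path are placeholders, and the repartition step shown is a common Spark pattern assumed here rather than taken from this page:

```scala
import com.mapr.db.spark._

// Create an OJAIDocument from a JSON string (values are placeholders).
val doc = MapRDBSpark.newDocument("""{"_id": "id1", "name": "sample"}""")

// A partitioner built from explicit, sorted split points can be used to
// pre-partition a pair RDD before saving it to a table.
val partitioner = MapRDBSpark.newPartitioner[String](Seq("AA", "CC"))
val sorted = sc.parallelize(Seq(("id1", doc)))
               .repartitionAndSortWithinPartitions(partitioner)
```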
Spark MapRDB connector specific functions to save either an RDD[OJAIDocument] or an RDD of any object.
the RDD on which this function is called.
docs.saveToMapRDB("tableName")
It may throw a DecodingException if the elements of the RDD cannot be converted to a document.
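A minimal sketch of saving an RDD of OJAIDocuments; the table path and field values are placeholders, and the `createTable` flag (creating the table if it does not exist) is an assumption about the connector's save API:

```scala
import com.mapr.db.spark._

// Build a small RDD of OJAIDocuments and persist it to a table.
val docs = sc.parallelize(Seq(
  MapRDBSpark.newDocument("""{"_id": "id1", "name": "a"}"""),
  MapRDBSpark.newDocument("""{"_id": "id2", "name": "b"}""")
))
docs.saveToMapRDB("/tables/sample", createTable = true)
```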
Spark MapRDB connector specific functions to join an external RDD with a MapRDB table.
the RDD on which this function is called.
docs.joinWithMapRDB("tableName")
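A sketch of joining an RDD of keys against a table; the case class, table path, and the assumption that the join matches on the _id field are all illustrative, not taken from this page:

```scala
import com.mapr.db.spark._

// An RDD of ids to look up in the table; the join is assumed to
// match each element's _id against the table's row keys.
case class UserId(_id: String)
val ids = sc.parallelize(Seq(UserId("id1"), UserId("id2")))
val joined = ids.joinWithMapRDB("/tables/user_profiles")
```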
Spark MapRDB connector specific functions to save either an RDD[(String, OJAIDocument)] or an RDD[(String, anyObject)].
the RDD on which this function is called.
docs.saveToMapRDB("tableName")
It may throw a DecodingException if the elements of the RDD cannot be converted to a document.
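For the pair-RDD variant, a minimal sketch; the assumption that the key becomes the document's _id, plus the table path and values, are illustrative:

```scala
import com.mapr.db.spark._

// Save a pair RDD of (key, document); the key is assumed to be used
// as the document _id. Table path and values are placeholders.
val pairs = sc.parallelize(Seq(
  ("id1", MapRDBSpark.newDocument("""{"name": "a"}""")),
  ("id2", MapRDBSpark.newDocument("""{"name": "b"}"""))
))
pairs.saveToMapRDB("/tables/sample", createTable = true)
```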
Spark MapRDB connector specific functions to load JSON tables as an RDD[OJAIDocument].
the SparkContext on which this function is called.
val docs = sc.loadMapRDBTable("tableName")
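A slightly fuller sketch of loading and inspecting a table; the table path is a placeholder and asJsonString is assumed from the underlying org.ojai.Document API:

```scala
import com.mapr.db.spark._

// Load a JSON table as an RDD[OJAIDocument] and print a few documents.
val docs = sc.loadMapRDBTable("/tables/user_profiles")
docs.take(2).foreach(d => println(d.asJsonString()))
```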