package spark
Type Members
class OJAIKryoRegistrator extends KryoRegistrator

Custom registrator provided for registering classes specific to the Spark OJAI connector. This registrator should be used when Kryo serialization is enabled for the Spark application.

Example:
sparkconf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
         .set("spark.kryo.registrator", "com.mapr.db.spark.OJAIKryoRegistrator")
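A minimal sketch of wiring the registrator into an application follows; the application name and the SparkSession setup are illustrative, not part of the connector:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Enable Kryo serialization and register the connector's classes.
val conf = new SparkConf()
  .setAppName("ojai-kryo-example") // illustrative name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "com.mapr.db.spark.OJAIKryoRegistrator")

val spark = SparkSession.builder().config(conf).getOrCreate()
```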
case class SparkContextFunctions(sc: SparkContext, bufferWrites: Boolean = true, hintUsingIndex: Option[String] = None, queryOptions: Map[String, String] = Map[String, String]()) extends Serializable with Product
case class field(fieldPath: String) extends Product with Serializable

field class provides the functionality to represent query conditions.

fieldPath
  name of the field in the MapRDB table.

Example:
An equality condition can be represented by field("a.c.d") === 10. Similarly, a greater-than-or-equal condition can be represented by field("a.c.d") >= 10.
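As a sketch, conditions on nested fields can be built and combined; the `and` combinator and the field names below are assumptions for illustration, not confirmed by this page:

```scala
import com.mapr.db.spark._

// === and >= come from the field class; `and` is assumed to be the
// connector's condition combinator.
val isAdultInOslo = field("address.city") === "Oslo" and field("age") >= 18
```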
case class sizeOf(field: field) extends Product with Serializable
Value Members
- implicit val ojaiByteBufferOrdering: Ordering[ByteBuffer]
- implicit val ojaiDBBinaryKeyOrdering: Ordering[DBBinaryValue]
- implicit val ojaiStringKeyOrdering: Ordering[String]
implicit def toDocumentRDDFunctions[D](rdd: RDD[D])(implicit arg0: OJAIValue[D]): OJAIDocumentRDDFunctions[D]

Spark MapRDB connector specific functions to save either an RDD[OJAIDocument] or an RDD of any object.

rdd
  the RDD on which this function is called

Example:
docs.saveToMapRDB("tablePath")
It may throw a DecodingException if the RDD or the object cannot be converted to a document.
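A sketch of saving an RDD of ordinary objects through this implicit conversion; the Person class, the table path, and the createTable flag are assumptions for illustration, and a reachable MapR-DB cluster is required:

```scala
import com.mapr.db.spark._
import org.apache.spark.SparkContext

case class Person(_id: String, name: String, age: Int)

// The implicit conversion gives any RDD whose element type has an
// OJAIValue instance a saveToMapRDB method.
def save(sc: SparkContext): Unit = {
  val people = sc.parallelize(Seq(Person("u1", "Ada", 36), Person("u2", "Alan", 41)))
  people.saveToMapRDB("/tables/people", createTable = true)
}
```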
implicit def toFilterRDDFunctions[K](rdd: RDD[K])(implicit arg0: OJAIKey[K], arg1: ClassTag[K]): FilterRDDFunctions[K]

Spark MapRDB connector specific functions to join an external RDD with a MapRDB table.

rdd
  the RDD on which this function is called

Example:
docs.joinWithMapRDB("tablePath")
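A sketch of joining an RDD of keys with a table through this conversion; it assumes the keys are the table's _id values, and the table path is illustrative (a MapR-DB cluster is required):

```scala
import com.mapr.db.spark._
import org.apache.spark.SparkContext

// Look up the documents whose _id appears in the key RDD.
def lookup(sc: SparkContext) = {
  val ids = sc.parallelize(Seq("u1", "u2"))
  ids.joinWithMapRDB("/tables/people") // documents for the matching _ids
}
```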
implicit def toPairedRDDFunctions[K, V](rdd: RDD[(K, V)])(implicit arg0: OJAIKey[K], arg1: OJAIValue[V]): PairedDocumentRDDFunctions[K, V]

Spark MapRDB connector specific functions to save either an RDD[(String, OJAIDocument)] or an RDD[(String, anyObject)].

rdd
  the RDD on which this function is called

Example:
docs.saveToMapRDB("tablePath")
It may throw a DecodingException if the RDD or the object cannot be converted to a document.
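A sketch of saving a paired RDD; the assumption that the key becomes the document's _id, the Profile class, the table path, and the createTable flag are all illustrative (a MapR-DB cluster is required):

```scala
import com.mapr.db.spark._
import org.apache.spark.SparkContext

case class Profile(name: String, age: Int)

// An RDD of (key, value) pairs can be persisted directly; the key is
// assumed to become the saved document's _id.
def savePairs(sc: SparkContext): Unit = {
  val pairs = sc.parallelize(Seq("u1" -> Profile("Ada", 36), "u2" -> Profile("Alan", 41)))
  pairs.saveToMapRDB("/tables/profiles", createTable = true)
}
```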
implicit def toSparkContextFunctions(sc: SparkContext): SparkContextFunctions

Spark MapRDB connector specific functions to load JSON tables as RDD[OJAIDocument].

sc
  the SparkContext

Example:
val docs = sc.loadMapRDBTable("tablePath")
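A sketch of loading a table and projecting one field, following the example above; the getString accessor and the table path are assumptions for illustration (a MapR-DB cluster is required):

```scala
import com.mapr.db.spark._
import org.apache.spark.SparkContext

// Load the JSON table as OJAI documents and extract a single field.
def names(sc: SparkContext): Array[String] =
  sc.loadMapRDBTable("/tables/people")
    .map(doc => doc.getString("name")) // accessor assumed from the OJAI Document API
    .collect()
```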
object MapRDBSpark

MapRDBSpark is a static object that contains factory methods to create Scala OJAI documents and partitioner objects.

Example:
Factory functions to create Scala OJAI documents:
val doc = MapRDBSpark.newDocument(jsonString)
val doc = MapRDBSpark.newDocument(document: org.ojai.Document)

val partitioner = MapRDBSpark.newPartitioner(tableName)
creates a partitioner using the splits specified in tableName.

val partitioner = MapRDBSpark.newPartitioner(Seq("AA","CC"))
creates a partitioner using the splits provided in the sequence. Here three splits will be created: (null, "AA"), ("AA", "CC") and ("CC", null). Note that this call assumes the user supplies the splits in sorted order.
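A sketch of the factory methods above; the JSON string, table path, and split points are illustrative:

```scala
import com.mapr.db.spark._

// Build a document from a JSON string.
val doc = MapRDBSpark.newDocument("""{"_id": "u1", "name": "Ada"}""")

// Partitioner from an existing table's splits (path is an assumption).
val fromTable = MapRDBSpark.newPartitioner("/tables/people")

// Partitioner from explicit, pre-sorted split points:
// ranges (null,"AA"), ("AA","CC"), ("CC",null).
val fromSplits = MapRDBSpark.newPartitioner(Seq("AA", "CC"))
```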
object field extends Serializable