
com.mapr.db

spark

package spark

Linear Supertypes
AnyRef, Any

Type Members

  1. class OJAIKryoRegistrator extends KryoRegistrator

    Custom registrator provided for registering classes specific to the Spark OJAI connector. This registrator should be used when Kryo serialization is enabled for the Spark application.

    Example:
    1. sparkconf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
         .set("spark.kryo.registrator", "com.mapr.db.spark.OJAIKryoRegistrator")
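    The configuration above can be sketched end to end. This is a minimal sketch, assuming a standalone application; the application name is hypothetical:

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    // Enable Kryo serialization and register the connector-specific
    // classes. OJAIKryoRegistrator registers the OJAI document and value
    // classes so they can be serialized efficiently across the cluster.
    val conf = new SparkConf()
      .setAppName("ojai-kryo-example")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrator", "com.mapr.db.spark.OJAIKryoRegistrator")

    val sc = new SparkContext(conf)
    ```

    Without the registrator, Kryo falls back to writing full class names for unregistered classes, which is less compact.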

  2. case class SparkContextFunctions(sc: SparkContext, bufferWrites: Boolean = true, hintUsingIndex: Option[String] = None, queryOptions: Map[String, String] = Map[String, String]()) extends Serializable with Product
  3. case class field(fieldPath: String) extends Product with Serializable

    The field class provides the functionality to represent query conditions.

    fieldPath

    the name of the field in the MapR-DB table.

    Example:
    1. An equality condition can be represented by field("a.c.d") === 10. Similarly, a greater-than-or-equal condition can be represented by field("a.c.d") >= 10.
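    A sketch of composing such conditions and pushing them down into a table scan. The `and` combinator and the `where` call are assumptions based on the connector's condition DSL, `sc` is an existing SparkContext, and "/apps/user_profiles" is a hypothetical table path:

    ```scala
    import com.mapr.db.spark._
    import com.mapr.db.spark.field

    // Combine an equality condition and a range condition; the `and`
    // combinator is an assumption inferred from the field DSL.
    val condition = field("address.city") === "Chicago" and field("age") >= 21

    // Push the combined condition down into the table scan (assumed API).
    val adults = sc.loadMapRDBTable("/apps/user_profiles").where(condition)
    ```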

  4. case class sizeOf(field: field) extends Product with Serializable
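    sizeOf carries no description here; from its signature it appears to express a condition on the size of an array (or map) field. A hedged sketch, treating the comparison operator as an assumption:

    ```scala
    import com.mapr.db.spark.{field, sizeOf}

    // Condition matching documents whose "tags" array has exactly three
    // elements. The === comparison on sizeOf is an assumption inferred
    // from the field condition DSL; "tags" is a hypothetical field.
    val cond = sizeOf(field("tags")) === 3
    ```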

Value Members

  1. implicit val ojaiByteBufferOrdering: Ordering[ByteBuffer]
  2. implicit val ojaiDBBinaryKeyOrdering: Ordering[DBBinaryValue]
  3. implicit val ojaiStringKeyOrdering: Ordering[String]
  4. implicit def toDocumentRDDFunctions[D](rdd: RDD[D])(implicit arg0: OJAIValue[D]): OJAIDocumentRDDFunctions[D]

    Spark MapR-DB connector-specific functions to save either an RDD[OJAIDocument] or an RDD of any object.

    rdd

    the RDD on which this function is called.

    Example:
    1. docs.saveToMapRDB("tablePath"). This might throw a DecodingException if an element of the RDD cannot be converted to a document.
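    The save can be sketched as follows, assuming an existing SparkContext `sc`; the JSON content and the "/apps/people" table path are hypothetical:

    ```scala
    import com.mapr.db.spark._

    // Build OJAI documents from JSON strings and save them. Each document
    // carries an _id field, which serves as the row key in the table.
    val docs = sc.parallelize(Seq(
      """{"_id": "user001", "name": "Ada"}""",
      """{"_id": "user002", "name": "Grace"}"""
    )).map(json => MapRDBSpark.newDocument(json))

    docs.saveToMapRDB("/apps/people")
    ```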

  5. implicit def toFilterRDDFunctions[K](rdd: RDD[K])(implicit arg0: OJAIKey[K], arg1: quotes[K]): FilterRDDFunctions[K]

    Spark MapR-DB connector-specific functions to join an external RDD with a MapR-DB table.

    rdd

    the RDD on which this function is called.

    Example:
    1. docs.joinWithMapRDB("tablePath")
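    A sketch of the join, assuming an existing SparkContext `sc` and an RDD of row keys; the key values and the "/apps/people" table path are hypothetical:

    ```scala
    import com.mapr.db.spark._

    // Join an RDD of row keys with a MapR-DB table, returning the
    // documents whose _id matches a key in the RDD.
    val keys = sc.parallelize(Seq("user001", "user002"))

    val joined = keys.joinWithMapRDB("/apps/people")
    ```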

  6. implicit def toPairedRDDFunctions[K, V](rdd: RDD[(K, V)])(implicit arg0: OJAIKey[K], arg1: OJAIValue[V]): PairedDocumentRDDFunctions[K, V]

    Spark MapR-DB connector-specific functions to save either an RDD[(String, OJAIDocument)] or an RDD of (String, any object) pairs.

    rdd

    the RDD on which this function is called.

    Example:
    1. docs.saveToMapRDB("tablePath"). This might throw a DecodingException if an element of the RDD cannot be converted to a document.
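    The paired variant can be sketched as follows, assuming an existing SparkContext `sc` and that the first tuple element becomes the row _id; the data and "/apps/people" are hypothetical:

    ```scala
    import com.mapr.db.spark._

    // Save (key, document) pairs: the key supplies the row _id, so the
    // documents themselves need not carry one.
    val pairs = sc.parallelize(Seq(
      ("user001", MapRDBSpark.newDocument("""{"name": "Ada"}""")),
      ("user002", MapRDBSpark.newDocument("""{"name": "Grace"}"""))
    ))

    pairs.saveToMapRDB("/apps/people")
    ```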

  7. implicit def toSparkContextFunctions(sc: SparkContext): SparkContextFunctions

    Spark MapR-DB connector-specific functions to load JSON tables as RDD[OJAIDocument].

    sc

    the SparkContext on which this function is called.

    Example:
    1. val docs = sc.loadMapRDBTable("tablePath")
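    A slightly fuller sketch, assuming an existing SparkContext `sc`; the "/apps/people" table path and the "name" field are hypothetical, and getString access on the loaded documents is an assumption based on the underlying org.ojai.Document interface:

    ```scala
    import com.mapr.db.spark._

    // Load a JSON table as an RDD[OJAIDocument] and project one field
    // from each document.
    val docs = sc.loadMapRDBTable("/apps/people")

    val names = docs.map(doc => doc.getString("name"))
    ```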

  8. object MapRDBSpark

    MapRDBSpark is a singleton object containing factory methods to create Scala OJAI documents and partitioner objects.

    Example:
    1. Factory functions to create Scala OJAI documents:
       val doc = MapRDBSpark.newDocument(jsonString)
       val doc = MapRDBSpark.newDocument(document: org.ojai.Document)
       Factory functions to create partitioners:
       val partitioner = MapRDBSpark.newPartitioner(tableName) creates a partitioner using the splits specified for tableName.
       val partitioner = MapRDBSpark.newPartitioner(Seq("AA", "CC")) creates a partitioner using the splits provided in the sequence. Here three splits will be created: (null, "AA"), ("AA", "CC") and ("CC", null). Note that this call assumes the user supplies the splits in sorted order.
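    The factory methods above can be sketched together; the JSON content is hypothetical, and the String type parameter on newPartitioner is an assumption:

    ```scala
    import com.mapr.db.spark._

    // Create a Scala OJAI document from a JSON string.
    val doc = MapRDBSpark.newDocument("""{"_id": "1", "name": "Ada"}""")

    // Create a partitioner from split points supplied in sorted order;
    // two split points yield three key ranges:
    // (null, "AA"), ("AA", "CC"), ("CC", null).
    val partitioner = MapRDBSpark.newPartitioner[String](Seq("AA", "CC"))
    ```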

  9. object field extends Serializable
