Package com.mapr.db.spark

package spark

Linear Supertypes
AnyRef, Any

Type Members

  1. class OJAIKryoRegistrator extends KryoRegistrator


    Custom registrator provided for registering classes specific to the Spark OJAI connector. This registrator should be used when Kryo serialization is enabled for the Spark application.

    Example:
    1. sparkconf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
         .set("spark.kryo.registrator", "com.mapr.db.spark.OJAIKryoRegistrator")

  2. case class SparkContextFunctions(sc: SparkContext) extends Serializable with Product

  3. case class field(fieldPath: String) extends Product with Serializable


    field class provides the functionality to represent query conditions.

    fieldPath
        name of the field in the MapR-DB table.

    Example:
    1. An equality condition can be represented by field("a.c.d") === 10. Similarly, a greater-than-or-equal condition can be represented by field("a.c.d") >= 10.

  4. case class sizeOf(field: field) extends Product with Serializable

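The type members above are typically used together when configuring a Spark application for the connector. The following is a hedged sketch based only on the signatures and examples on this page; the application name and field paths are placeholders, and the sizeOf usage is an assumption about its intended role in building conditions:

```scala
import org.apache.spark.SparkConf
import com.mapr.db.spark.{field, sizeOf}

// Enable Kryo serialization and register the connector's classes
// via OJAIKryoRegistrator, as described above.
val conf = new SparkConf()
  .setAppName("maprdb-example") // placeholder name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "com.mapr.db.spark.OJAIKryoRegistrator")

// Query conditions expressed with the field case class:
val equality    = field("a.c.d") === 10 // equality condition
val greaterOrEq = field("a.c.d") >= 10  // greater-than-or-equal condition

// Assumption: sizeOf wraps a field to build size-based conditions.
val sizeCond = sizeOf(field("a.b")) === 2
```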

Value Members

  1. object MapRDBSpark


    MapRDBSpark is a static object that contains factory methods to create Scala OJAI documents and partitioner objects.

    Example:
    1. Factory functions to create Scala OJAI documents:
         val doc = MapRDBSpark.newDocument(jsonString)
         val doc = MapRDBSpark.newDocument(document: org.ojai.Document)
       Factory functions to create partitioners:
         val partitioner = MapRDBSpark.newPartitioner(tableName)
       creates a partitioner using the splits specified in tableName.
         val partitioner = MapRDBSpark.newPartitioner(Seq("AA", "CC"))
       creates a partitioner using the splits provided in the sequence; three splits will be created: (null, "AA"), ("AA", "CC") and ("CC", null). Note that this call assumes that the user supplies the splits in sorted order.

  2. package RDD
  3. package api
  4. package codec
  5. package condition
  6. package configuration
  7. package dbclient
  8. package documentTypeUtils
  9. package documentUtils
  10. package exceptions
  11. object field extends Serializable
  12. package impl
  13. implicit val ojaiByteBufferOrdering: Ordering[ByteBuffer]
  14. implicit val ojaiDBBinaryKeyOrdering: Ordering[DBBinaryValue]
  15. implicit val ojaiStringKeyOrdering: Ordering[String]
  16. package serializers
  17. package sql
  18. package streaming
  19. implicit def toDocumentRDDFunctions[D](rdd: RDD[D])(implicit arg0: OJAIValue[D]): OJAIDocumentRDDFunctions[D]


    Spark MapR-DB connector specific functions to save either an RDD[OJAIDocument] or an RDD of any object.

    rdd
        the RDD on which this function is called

    Example:
    1. docs.saveToMapRDB("tablePath")
       It might throw a DecodingException if the RDD elements cannot be converted to a document.

  20. implicit def toFilterRDDFunctions[K](rdd: RDD[K])(implicit arg0: OJAIKey[K], arg1: quotes[K]): FilterRDDFunctions[K]


    Spark MapR-DB connector specific functions to join an external RDD with a MapR-DB table.

    rdd
        the RDD on which this function is called

    Example:
    1. docs.joinWithMapRDB("tablePath")

  21. implicit def toPairedRDDFunctions[K, V](rdd: RDD[(K, V)])(implicit arg0: OJAIKey[K], arg1: OJAIValue[V]): PairedDocumentRDDFunctions[K, V]


    Spark MapR-DB connector specific functions to save either an RDD[(String, OJAIDocument)] or an RDD[(String, anyObject)].

    rdd
        the RDD on which this function is called

    Example:
    1. docs.saveToMapRDB("tablePath")
       It might throw a DecodingException if the RDD elements cannot be converted to a document.

  22. implicit def toSparkContextFunctions(sc: SparkContext): SparkContextFunctions


    Spark MapR-DB connector specific functions to load JSON tables as RDD[OJAIDocument].

    sc
        the SparkContext

    Example:
    1. val docs = sc.loadMapRDBTable("tablePath")

  23. package types
  24. package utils
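The MapRDBSpark factory methods described above can be sketched as follows. This is a hedged example based on the signatures on this page; the JSON content and table path are hypothetical:

```scala
import com.mapr.db.spark.MapRDBSpark

// Build an OJAI document from a JSON string.
val jsonString = """{"_id": "id1", "a": {"c": {"d": 10}}}""" // hypothetical content
val doc = MapRDBSpark.newDocument(jsonString)

// Partitioner from the splits of an existing table (path is a placeholder).
val tablePartitioner = MapRDBSpark.newPartitioner("/tables/users")

// Partitioner from explicit split points, which must be supplied in sorted order;
// this yields the three ranges (null, "AA"), ("AA", "CC") and ("CC", null).
val splitPartitioner = MapRDBSpark.newPartitioner(Seq("AA", "CC"))
```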
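The implicit conversions listed above (toSparkContextFunctions, toDocumentRDDFunctions, toPairedRDDFunctions, and toFilterRDDFunctions) come into scope with a single package import. A hedged end-to-end sketch, with all table paths and keys as placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import com.mapr.db.spark._ // brings the implicit conversions on this page into scope

val sc = new SparkContext(new SparkConf().setAppName("maprdb-io"))

// toSparkContextFunctions: load a JSON table as RDD[OJAIDocument].
val docs = sc.loadMapRDBTable("/tables/src") // placeholder path

// toDocumentRDDFunctions: save the documents; may throw a DecodingException
// if an element cannot be converted to a document.
docs.saveToMapRDB("/tables/dst")

// toFilterRDDFunctions: join an RDD of keys with a MapR-DB table.
val keys = sc.parallelize(Seq("id1", "id2")) // placeholder keys
val joined = keys.joinWithMapRDB("/tables/src")
```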
