org.apache.spark.streaming.kafka09

object KafkaUtils extends Logging

:: Experimental :: Object for constructing Kafka streams and RDDs.

Annotations
@Experimental()
Linear Supertypes
Logging, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  2. final def ##(): Int
     Definition Classes: AnyRef → Any
  3. final def ==(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  4. final def asInstanceOf[T0]: T0
     Definition Classes: Any
  5. def clone(): AnyRef
     Attributes: protected[java.lang]
     Definition Classes: AnyRef
     Annotations: @throws( ... )
  6. def createDirectStream[K, V](jssc: JavaStreamingContext, locationStrategy: LocationStrategy, consumerStrategy: ConsumerStrategy[K, V], perPartitionConfig: PerPartitionConfig): JavaInputDStream[ConsumerRecord[K, V]]

     :: Experimental :: Java constructor for a DStream where each given Kafka topic/partition corresponds to an RDD partition.

     K: type of Kafka message key
     V: type of Kafka message value
     locationStrategy: In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.
     consumerStrategy: In most cases, pass in ConsumerStrategies.Subscribe; see ConsumerStrategies for more details.
     perPartitionConfig: Configuration of settings such as max rate on a per-partition basis; see PerPartitionConfig for more details.

     Annotations: @Experimental()
  7. def createDirectStream[K, V](jssc: JavaStreamingContext, locationStrategy: LocationStrategy, consumerStrategy: ConsumerStrategy[K, V]): JavaInputDStream[ConsumerRecord[K, V]]

     :: Experimental :: Java constructor for a DStream where each given Kafka topic/partition corresponds to an RDD partition.

     K: type of Kafka message key
     V: type of Kafka message value
     locationStrategy: In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.
     consumerStrategy: In most cases, pass in ConsumerStrategies.Subscribe; see ConsumerStrategies for more details.

     Annotations: @Experimental()
  8. def createDirectStream[K, V](ssc: StreamingContext, locationStrategy: LocationStrategy, consumerStrategy: ConsumerStrategy[K, V], perPartitionConfig: PerPartitionConfig): InputDStream[ConsumerRecord[K, V]]

     :: Experimental :: Scala constructor for a DStream where each given Kafka topic/partition corresponds to an RDD partition.

     K: type of Kafka message key
     V: type of Kafka message value
     locationStrategy: In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.
     consumerStrategy: In most cases, pass in ConsumerStrategies.Subscribe; see ConsumerStrategies for more details.
     perPartitionConfig: Configuration of settings such as max rate on a per-partition basis; see PerPartitionConfig for more details.

     Annotations: @Experimental()
  9. def createDirectStream[K, V](ssc: StreamingContext, locationStrategy: LocationStrategy, consumerStrategy: ConsumerStrategy[K, V]): InputDStream[ConsumerRecord[K, V]]

     :: Experimental :: Scala constructor for a DStream where each given Kafka topic/partition corresponds to an RDD partition. The Spark configuration spark.streaming.kafka.maxRatePerPartition gives the maximum number of messages per second that each partition will accept.

     K: type of Kafka message key
     V: type of Kafka message value
     locationStrategy: In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.
     consumerStrategy: In most cases, pass in ConsumerStrategies.Subscribe; see ConsumerStrategies for more details.

     Annotations: @Experimental()
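     A sketch of how the Scala createDirectStream variant above is typically wired up. Broker addresses, the topic name, and the group id below are placeholders, sparkConf is assumed to be an existing SparkConf, and the ConsumerStrategies.Subscribe signature is assumed to mirror the one in Spark's kafka010 integration:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka09.{ConsumerStrategies, KafkaUtils, LocationStrategies}

// Consumer settings; all values here are placeholders for a real deployment.
val kafkaParams = Map[String, Object](
  "bootstrap.servers"  -> "host1:9092,host2:9092",
  "key.deserializer"   -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"           -> "example-group",
  "auto.offset.reset"  -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val ssc = new StreamingContext(sparkConf, Seconds(5)) // sparkConf: an existing SparkConf

// Each Kafka topic/partition maps to one RDD partition in the resulting stream.
val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Subscribe[String, String](Seq("topicA"), kafkaParams)
)

stream.map(record => (record.key, record.value)).print()
ssc.start()
ssc.awaitTermination()
```

     Without a PerPartitionConfig argument, the per-partition ingestion rate can still be capped through the spark.streaming.kafka.maxRatePerPartition setting noted above.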
  10. def createRDD[K, V](jsc: JavaSparkContext, kafkaParams: Map[String, AnyRef], offsetRanges: Array[OffsetRange], locationStrategy: LocationStrategy): JavaRDD[ConsumerRecord[K, V]]

      :: Experimental :: Java constructor for a batch-oriented interface for consuming from Kafka. Starting and ending offsets are specified in advance, so that you can control exactly-once semantics.

      K: type of Kafka message key
      V: type of Kafka message value
      kafkaParams: Kafka configuration parameters. Requires "bootstrap.servers" to be set with Kafka broker(s) specified in host1:port1,host2:port2 form.
      offsetRanges: offset ranges that define the Kafka data belonging to this RDD
      locationStrategy: In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.

      Annotations: @Experimental()
  11. def createRDD[K, V](sc: SparkContext, kafkaParams: Map[String, AnyRef], offsetRanges: Array[OffsetRange], locationStrategy: LocationStrategy): RDD[ConsumerRecord[K, V]]

      :: Experimental :: Scala constructor for a batch-oriented interface for consuming from Kafka. Starting and ending offsets are specified in advance, so that you can control exactly-once semantics.

      K: type of Kafka message key
      V: type of Kafka message value
      kafkaParams: Kafka configuration parameters. Requires "bootstrap.servers" to be set with Kafka broker(s) specified in host1:port1,host2:port2 form.
      offsetRanges: offset ranges that define the Kafka data belonging to this RDD
      locationStrategy: In most cases, pass in LocationStrategies.PreferConsistent; see LocationStrategies for more details.

      Annotations: @Experimental()
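      A minimal sketch of batch consumption with the Scala createRDD variant above. The topic name, offsets, and broker addresses are placeholders, sc is assumed to be an existing SparkContext, and the OffsetRange factory is assumed to match the one in Spark's kafka010 integration:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka09.{KafkaUtils, LocationStrategies, OffsetRange}

// Kafka consumer settings; placeholder values throughout.
val kafkaParams = Map[String, AnyRef](
  "bootstrap.servers"  -> "host1:9092,host2:9092",
  "key.deserializer"   -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"           -> "example-group"
)

// Fixing offsets up front pins down exactly which records belong to this RDD,
// which is what enables exactly-once semantics for batch jobs.
val offsetRanges = Array(
  OffsetRange("topicA", 0, fromOffset = 0L, untilOffset = 100L),
  OffsetRange("topicA", 1, fromOffset = 0L, untilOffset = 100L)
)

val rdd = KafkaUtils.createRDD[String, String](
  sc, // an existing SparkContext
  kafkaParams,
  offsetRanges,
  LocationStrategies.PreferConsistent
)

rdd.map(record => record.value).collect().foreach(println)
```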
  12. val eofOffset: Int
  13. final def eq(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  14. def equals(arg0: Any): Boolean
      Definition Classes: AnyRef → Any
  15. def finalize(): Unit
      Attributes: protected[java.lang]
      Definition Classes: AnyRef
      Annotations: @throws( classOf[java.lang.Throwable] )
  16. final def getClass(): Class[_]
      Definition Classes: AnyRef → Any
  17. def hashCode(): Int
      Definition Classes: AnyRef → Any
  18. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
      Attributes: protected
      Definition Classes: Logging
  19. final def isInstanceOf[T0]: Boolean
      Definition Classes: Any
  20. def isTraceEnabled(): Boolean
      Attributes: protected
      Definition Classes: Logging
  21. def log: Logger
      Attributes: protected
      Definition Classes: Logging
  22. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
      Attributes: protected
      Definition Classes: Logging
  23. def logDebug(msg: ⇒ String): Unit
      Attributes: protected
      Definition Classes: Logging
  24. def logError(msg: ⇒ String, throwable: Throwable): Unit
      Attributes: protected
      Definition Classes: Logging
  25. def logError(msg: ⇒ String): Unit
      Attributes: protected
      Definition Classes: Logging
  26. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
      Attributes: protected
      Definition Classes: Logging
  27. def logInfo(msg: ⇒ String): Unit
      Attributes: protected
      Definition Classes: Logging
  28. def logName: String
      Attributes: protected
      Definition Classes: Logging
  29. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
      Attributes: protected
      Definition Classes: Logging
  30. def logTrace(msg: ⇒ String): Unit
      Attributes: protected
      Definition Classes: Logging
  31. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
      Attributes: protected
      Definition Classes: Logging
  32. def logWarning(msg: ⇒ String): Unit
      Attributes: protected
      Definition Classes: Logging
  33. final def ne(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  34. final def notify(): Unit
      Definition Classes: AnyRef
  35. final def notifyAll(): Unit
      Definition Classes: AnyRef
  36. final def synchronized[T0](arg0: ⇒ T0): T0
      Definition Classes: AnyRef
  37. def toString(): String
      Definition Classes: AnyRef → Any
  38. final def wait(): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  39. final def wait(arg0: Long, arg1: Int): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  40. final def wait(arg0: Long): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  41. def waitForConsumerAssignment[K, V](consumer: KafkaConsumer[K, V]): Unit
