object TrampolineUtil

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def addShutdownHook(priority: Int, runnable: Runnable): AnyRef

    Add a shutdown hook with the given priority

  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def asNullable(dt: DataType): DataType
  7. def bytesToString(size: Long): String

    Get a human-readable string, e.g. "4.0 MiB", for a value in bytes.
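
    A minimal sketch of this kind of formatting in plain Scala (a hypothetical helper, not the actual Spark implementation):

```scala
// Hypothetical sketch: format a byte count as a human-readable string.
// Not the real TrampolineUtil/Spark implementation.
object BytesToStringSketch {
  private val units = Seq("B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB")

  def bytesToString(size: Long): String = {
    var value = size.toDouble
    var i = 0
    // Divide down by 1024 until the value fits the current unit.
    while (value >= 1024.0 && i < units.length - 1) {
      value /= 1024.0
      i += 1
    }
    // Locale-pinned formatting so "4.0" never becomes "4,0".
    "%.1f %s".formatLocal(java.util.Locale.US, value, units(i))
  }
}
```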

  8. def cleanupAnyExistingSession(): Unit

    Shuts down and cleans up any existing Spark session

  9. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  10. def dataTypeExistsRecursively(dt: DataType, f: (DataType) ⇒ Boolean): Boolean

    Return true if the provided predicate function returns true for any type node within the datatype tree.
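
    The recursion can be illustrated over a toy type tree (a hypothetical stand-in for Spark's DataType, used here only to show the traversal):

```scala
// Toy stand-in for Spark's DataType tree; illustrates the recursive predicate only.
sealed trait ToyType
case object ToyInt extends ToyType
case object ToyString extends ToyType
case class ToyArray(element: ToyType) extends ToyType
case class ToyStruct(fields: Seq[ToyType]) extends ToyType

object ToyTypeOps {
  // True if f holds for any node in the type tree, including the root.
  def existsRecursively(dt: ToyType, f: ToyType => Boolean): Boolean =
    f(dt) || (dt match {
      case ToyArray(e)   => existsRecursively(e, f)
      case ToyStruct(fs) => fs.exists(existsRecursively(_, f))
      case _             => false
    })
}
```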

  11. def doExecuteBroadcast[T](child: SparkPlan): Broadcast[T]
  12. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  14. def fromAttributes(attrs: Seq[Attribute]): StructType
  15. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  16. def getFSBytesReadOnThreadCallback(): () ⇒ Long

    Returns a function that can be called to find Hadoop FileSystem bytes read. If getFSBytesReadOnThreadCallback is called from thread r at time t, the returned callback will return the bytes read on r since t.
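
    The "bytes read since time t" behavior is a snapshot-and-delta pattern, sketched here with a hypothetical counter standing in for the Hadoop FileSystem statistics:

```scala
// Sketch of the "bytes read since callback creation" pattern. The readSoFar
// parameter is a hypothetical counter source, not the real Hadoop statistics.
object DeltaCallbackSketch {
  def bytesReadCallback(readSoFar: () => Long): () => Long = {
    val baseline = readSoFar()   // snapshot at creation time t
    () => readSoFar() - baseline // later calls report bytes read since t
  }
}
```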

  17. def getSimpleName(cls: Class[_]): String

    Get the simple name of a class with fixup for any Scala internal errors

  18. def getTaskMemoryManager(): TaskMemoryManager
  19. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  20. def incBytesRead(inputMetrics: InputMetrics, bytesRead: Long): Unit

    Set the bytes read task input metric

  21. def incInputRecordsRows(inputMetrics: InputMetrics, rows: Long): Unit
  22. def incTaskMetricsDiskBytesSpilled(amountSpilled: Long): Unit

    Increment the task's disk bytes spilled metric. If the current thread does not correspond to a Spark task then this call does nothing.

    amountSpilled

    amount of data spilled to disk, in bytes

  23. def incTaskMetricsMemoryBytesSpilled(amountSpilled: Long): Unit

    Increment the task's memory bytes spilled metric. If the current thread does not correspond to a Spark task then this call does nothing.

    amountSpilled

    amount of memory spilled in bytes

  24. def isDriver(sparkConf: SparkConf): Boolean
  25. def isDriver(env: SparkEnv): Boolean

    Returns true if called from code running on the Spark driver.

  26. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  27. def isSupportedRelation(mode: BroadcastMode): Boolean
  28. def jsonValue(dataType: DataType): JValue
  29. def makeSparkUpgradeException(version: String, message: String, cause: Throwable): SparkUpgradeException
  30. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  31. def newBlockManagerId(execId: String, host: String, port: Int, topologyInfo: Option[String] = None): BlockManagerId

    Create a BlockManagerId instance

  32. def newInputMetrics(): InputMetrics

    Return a new InputMetrics instance

  33. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  34. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  35. def sameType(a: DataType, b: DataType): Boolean

    Check if a and b are the same data type when ignoring nullability (StructField.nullable, ArrayType.containsNull, and MapType.valueContainsNull).
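
    Conceptually this is "normalize every nullability flag, then compare", much like applying asNullable (member 6) to both sides. A toy sketch, with a hypothetical type standing in for Spark's ArrayType/containsNull:

```scala
// Toy sketch of sameType as "normalize nullability flags, then compare".
// NArray is a hypothetical stand-in for Spark's ArrayType; not the real API.
sealed trait NToy
case object NInt extends NToy
case class NArray(element: NToy, containsNull: Boolean) extends NToy

object NToyOps {
  // Analogue of asNullable: force every nullability flag to true, recursively.
  def asNullable(t: NToy): NToy = t match {
    case NArray(e, _) => NArray(asNullable(e), containsNull = true)
    case other        => other
  }

  // Equal types modulo nullability compare equal after normalization.
  def sameType(a: NToy, b: NToy): Boolean = asNullable(a) == asNullable(b)
}
```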

  36. def setTaskContext(tc: TaskContext): Unit

    Set the task context for the current thread

  37. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  38. def throwAnalysisException(msg: String): Nothing

    Throw a Spark analysis exception

  39. def toAttributes(structType: StructType): Seq[Attribute]
  40. def toString(): String
    Definition Classes
    AnyRef → Any
  41. def unionLikeMerge(left: DataType, right: DataType): DataType
  42. def unsetTaskContext(): Unit

    Remove the task context for the current thread
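
    The set/unset pair (members 36 and 42) suggests a try/finally discipline so a thread is never left with a stale context. A sketch of that pattern with a plain ThreadLocal (hypothetical; the real methods install a Spark TaskContext):

```scala
// Sketch of the setTaskContext/unsetTaskContext pairing using a plain
// ThreadLocal. String stands in for Spark's TaskContext here.
object TaskContextSketch {
  private val current = new ThreadLocal[String]()

  def setTaskContext(tc: String): Unit = current.set(tc)
  def unsetTaskContext(): Unit = current.remove()

  // Typical usage: always unset in a finally block so the thread is left clean.
  def withTaskContext[A](tc: String)(body: => A): A = {
    setTaskContext(tc)
    try body
    finally unsetTaskContext()
  }

  def get: Option[String] = Option(current.get())
}
```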

  43. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  44. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  45. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated
