com.nvidia.spark.rapids

SparkShims

trait SparkShims extends AnyRef

Abstract Value Members

  1. abstract def ansiCastRule: ExprRule[_ <: Expression]

    Return the replacement rule for AnsiCast. AnsiCast was removed in Spark 3.4.0, so it needs to be handled separately.

  2. abstract def aqeShuffleReaderExec: ExecRule[_ <: SparkPlan]
  3. abstract def attachTreeIfSupported[TreeType <: TreeNode[_], A](tree: TreeType, msg: String = "")(f: ⇒ A): A

    Spark's attachTree helper was dropped by SPARK-34234; this method attaches the tree only on versions that still provide it.

  4. abstract def avroRebaseReadKey: String
  5. abstract def avroRebaseWriteKey: String
  6. abstract def broadcastModeTransform(mode: BroadcastMode, toArray: Array[InternalRow]): Any
  7. abstract def columnarAdaptivePlan(a: AdaptiveSparkPlanExec, goal: CoalesceSizeGoal): SparkPlan
  8. abstract def filesFromFileIndex(fileCatalog: PartitioningAwareFileIndex): Seq[FileStatus]
  9. abstract def findOperators(plan: SparkPlan, predicate: (SparkPlan) ⇒ Boolean): Seq[SparkPlan]

    Walk the plan recursively and return a list of operators that match the predicate.
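    A minimal sketch of the recursive walk, assuming a simple tree shape; MockPlan is an illustrative stand-in for SparkPlan, not the real type:

```scala
// MockPlan is a hypothetical stand-in for SparkPlan.
case class MockPlan(name: String, children: Seq[MockPlan] = Seq.empty)

// Return every node in the tree (including the root) matching the predicate.
def findOperators(plan: MockPlan, predicate: MockPlan => Boolean): Seq[MockPlan] = {
  val self = if (predicate(plan)) Seq(plan) else Seq.empty
  self ++ plan.children.flatMap(findOperators(_, predicate))
}

val plan = MockPlan("Project", Seq(
  MockPlan("ShuffleExchange", Seq(MockPlan("Scan")))))
val shuffles = findOperators(plan, _.name.startsWith("Shuffle"))
```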

  10. abstract def getAdaptiveInputPlan(adaptivePlan: AdaptiveSparkPlanExec): SparkPlan
  11. abstract def getDateFormatter(): DateFormatter
  12. abstract def getExecs: Map[Class[_ <: SparkPlan], ExecRule[_ <: SparkPlan]]
  13. abstract def getExprs: Map[Class[_ <: Expression], ExprRule[_ <: Expression]]
  14. abstract def getFileScanRDD(sparkSession: SparkSession, readFunction: (PartitionedFile) ⇒ Iterator[InternalRow], filePartitions: Seq[FilePartition], readDataSchema: StructType, metadataColumns: Seq[AttributeReference] = Seq.empty): RDD[InternalRow]
  15. abstract def getParquetFilters(schema: MessageType, pushDownDate: Boolean, pushDownTimestamp: Boolean, pushDownDecimal: Boolean, pushDownStartWith: Boolean, pushDownInFilterThreshold: Int, caseSensitive: Boolean, lookupFileMeta: (String) ⇒ String, dateTimeRebaseModeFromConf: String): ParquetFilters
  16. abstract def getScans: Map[Class[_ <: Scan], ScanRule[_ <: Scan]]
  17. abstract def getSparkShimVersion: ShimVersion
  18. abstract def hasAliasQuoteFix: Boolean
  19. abstract def hasCastFloatTimestampUpcast: Boolean
  20. abstract def int96ParquetRebaseRead(conf: SQLConf): String
  21. abstract def int96ParquetRebaseReadKey: String
  22. abstract def int96ParquetRebaseWrite(conf: SQLConf): String
  23. abstract def int96ParquetRebaseWriteKey: String
  24. abstract def isAqePlan(p: SparkPlan): Boolean
  25. abstract def isCustomReaderExec(x: SparkPlan): Boolean
  26. abstract def isEmptyRelation(relation: Any): Boolean
  27. abstract def isExchangeOp(plan: SparkPlanMeta[_]): Boolean
  28. abstract def isWindowFunctionExec(plan: SparkPlan): Boolean
  29. abstract def leafNodeDefaultParallelism(ss: SparkSession): Int
  30. abstract def neverReplaceShowCurrentNamespaceCommand: ExecRule[_ <: SparkPlan]
  31. abstract def newBroadcastQueryStageExec(old: BroadcastQueryStageExec, newPlan: SparkPlan): BroadcastQueryStageExec
  32. abstract def parquetRebaseRead(conf: SQLConf): String
  33. abstract def parquetRebaseReadKey: String
  34. abstract def parquetRebaseWrite(conf: SQLConf): String
  35. abstract def parquetRebaseWriteKey: String
  36. abstract def reusedExchangeExecPfn: PartialFunction[SparkPlan, ReusedExchangeExec]
  37. abstract def sessionFromPlan(plan: SparkPlan): SparkSession
  38. abstract def shouldFailDivOverflow: Boolean
  39. abstract def skipAssertIsOnTheGpu(plan: SparkPlan): Boolean

    By default, our tests check that all operators run on the GPU, but some operators are never translated to GPU plans, so we need a way to bypass the check for those.
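    A hedged sketch of how a test harness might consult such a skip predicate; Op and assertAllOnGpu are illustrative names, not the plugin's actual test code:

```scala
// Op is a hypothetical stand-in for a plan node with a GPU flag.
case class Op(name: String, onGpu: Boolean, children: Seq[Op] = Seq.empty)

// Assert every operator runs on the GPU unless the skip predicate allows it.
def assertAllOnGpu(plan: Op, skip: Op => Boolean): Unit = {
  if (!skip(plan)) assert(plan.onGpu, s"${plan.name} did not run on the GPU")
  plan.children.foreach(assertAllOnGpu(_, skip))
}

// A driver-side command that never translates to the GPU is skipped.
val opPlan = Op("GpuProject", onGpu = true,
  Seq(Op("ShowTablesCommand", onGpu = false)))
assertAllOnGpu(opPlan, op => op.name.endsWith("Command"))
```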

  40. abstract def supportsColumnarAdaptivePlans: Boolean

    Determine if the Spark version allows the supportsColumnar flag to be overridden in AdaptiveSparkPlanExec. This feature was introduced in Spark 3.2 as part of SPARK-35881.
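    In practice each per-version shim would simply hardcode true or false; this helper derives the answer from a "major.minor" version string purely for illustration:

```scala
// Illustrative only: SPARK-35881 landed in Spark 3.2, so the flag is
// overridable from 3.2 onward.
def supportsColumnarAdaptivePlans(sparkVersion: String): Boolean = {
  val parts = sparkVersion.split("\\.")
  val (major, minor) = (parts(0).toInt, parts(1).toInt)
  major > 3 || (major == 3 && minor >= 2)
}
```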

  41. abstract def tryTransformIfEmptyRelation(mode: BroadcastMode): Option[Any]

    This call can produce an EmptyHashedRelation or an empty array, allowing the AQE rule EliminateJoinToEmptyRelation in Spark 3.1.x to optimize certain joins.

    In Spark 3.2.0, the optimization is still performed (under AQEPropagateEmptyRelation), but the AQE optimizer is looking at the metrics for the query stage to determine if numRows == 0, and if so it can eliminate certain joins.

    The call is implemented only for Spark 3.1.x+. It is disabled in Databricks because it requires a task context to perform the BroadcastMode.transform call, but we'd like to call this from the driver.
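    The Option-based fallback pattern this describes can be sketched with stand-in types (Broadcast, EmptyRelation, and buildBroadcast are hypothetical, not the real Spark types):

```scala
// Hypothetical stand-ins for the broadcast results described above.
sealed trait Broadcast
case object EmptyRelation extends Broadcast
case class BuiltRelation(numRows: Long) extends Broadcast

// Shims that support the optimization return Some(...) for empty input;
// unsupported shims (e.g. Databricks) return None.
def tryTransformIfEmptyRelation(numRows: Long, supported: Boolean): Option[Broadcast] =
  if (supported && numRows == 0) Some(EmptyRelation) else None

// The caller falls back to the normal broadcast build on None.
def buildBroadcast(numRows: Long, supported: Boolean): Broadcast =
  tryTransformIfEmptyRelation(numRows, supported).getOrElse(BuiltRelation(numRows))
```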

  42. abstract def v1RepairTableCommand(tableName: TableIdentifier): RunnableCommand
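Several of the members above (getExecs, getExprs, getScans) share the same class-keyed rule-lookup pattern. A hedged sketch with stand-in types (Expr, Rule, and the rule descriptions are illustrative, not the plugin's actual types):

```scala
// Hypothetical stand-ins for Expression and ExprRule.
sealed trait Expr
case class Add(l: Expr, r: Expr) extends Expr
case class Lit(v: Int) extends Expr
case class Rule(desc: String)

// A shim exposes a map keyed by the exact node class, so the planner can
// look up a replacement rule for each node it visits.
val exprRules: Map[Class[_ <: Expr], Rule] = Map(
  classOf[Add] -> Rule("replace with GPU add"),
  classOf[Lit] -> Rule("replace with GPU literal")
)

def ruleFor(e: Expr): Option[Rule] = exprRules.get(e.getClass)
```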

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  8. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  9. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  10. def isCastingStringToNegDecimalScaleSupported: Boolean
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  14. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  15. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  16. def toString(): String
    Definition Classes
    AnyRef → Any
  17. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  18. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  19. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated
