
case class TextTable(name: String, sparkSession: SparkSession, options: CaseInsensitiveStringMap, paths: Seq[String], userSpecifiedSchema: Option[StructType], fallbackFileFormat: Class[_ <: FileFormat]) extends FileTable with Product with Serializable

Linear Supertypes
Serializable, Serializable, Product, Equals, FileTable, SupportsWrite, SupportsRead, Table, AnyRef, Any

Instance Constructors

  1. new TextTable(name: String, sparkSession: SparkSession, options: CaseInsensitiveStringMap, paths: Seq[String], userSpecifiedSchema: Option[StructType], fallbackFileFormat: Class[_ <: FileFormat])
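TextTable is normally instantiated by Spark itself when the "text" source is resolved, but as a rough sketch of direct construction (the session settings and path below are assumptions, not part of this API):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.text.TextFileFormat
import org.apache.spark.sql.util.CaseInsensitiveStringMap

// Hypothetical direct construction; in practice Spark builds this table
// internally while resolving the "text" data source.
val spark = SparkSession.builder().master("local[*]").getOrCreate()
val options = new CaseInsensitiveStringMap(java.util.Collections.emptyMap())
val table = TextTable(
  name = "text_table",
  sparkSession = spark,
  options = options,
  paths = Seq("/tmp/data.txt"),   // assumed example path
  userSpecifiedSchema = None,     // let inferSchema provide the schema
  fallbackFileFormat = classOf[TextFileFormat]
)
```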

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def capabilities(): Set[TableCapability]
    Definition Classes
    FileTable → Table
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  7. lazy val dataSchema: StructType
    Definition Classes
    FileTable
  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. val fallbackFileFormat: Class[_ <: FileFormat]

    Returns a V1 FileFormat class of the same file data source.

    Returns a V1 FileFormat class of the same file data source. This is a solution for the following cases:

      1. File data source V2 implementations cause regression. Users can disable the problematic data source via SQL configuration and fall back to FileFormat.
      2. Catalog support is required, which is still under development for data source V2.
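    The fallback in case 1 can be triggered through configuration; a minimal sketch (the exact fallback semantics may differ across Spark versions):

    ```scala
    // Ask Spark to resolve the "text" source through the V1 FileFormat
    // path instead of this V2 table. "spark.sql.sources.useV1SourceList"
    // is the SQL configuration governing the V1 fallback; in some
    // versions it must be set before the source is first resolved.
    spark.conf.set("spark.sql.sources.useV1SourceList", "text")
    val df = spark.read.text("/tmp/data.txt")  // assumed example path
    ```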

    Definition Classes
    TextTable → FileTable
  10. lazy val fileIndex: PartitioningAwareFileIndex
    Definition Classes
    FileTable
  11. def formatName: String

    The string that represents the format that this data source provider uses.

    The string that represents the format that this data source provider uses. This is overridden by children to provide a nice alias for the data source. For example:

    override def formatName: String = "ORC"
    Definition Classes
    TextTable → FileTable
  12. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  13. def inferSchema(files: Seq[FileStatus]): Option[StructType]

    When possible, this method should return the schema of the given files.

    When possible, this method should return the schema of the given files. When the format does not support inference, or no valid files are given, this method should return None. In these cases, Spark will require the user to specify the schema manually.
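    For a plain-text source, inference does not need to look at the file contents at all; a sketch of such an implementation (the fixed "value" column mirrors what spark.read.text produces):

    ```scala
    import org.apache.hadoop.fs.FileStatus
    import org.apache.spark.sql.types.{StringType, StructField, StructType}

    // Sketch: a text-like source has a fixed schema of one string
    // column, so the given files never need to be inspected.
    def inferSchema(files: Seq[FileStatus]): Option[StructType] =
      Some(StructType(Seq(StructField("value", StringType))))
    ```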

    Definition Classes
    TextTable → FileTable
  14. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  15. val name: String
    Definition Classes
    TextTable → Table
  16. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  17. def newScanBuilder(options: CaseInsensitiveStringMap): TextScanBuilder
    Definition Classes
    TextTable → SupportsRead
  18. def newWriteBuilder(info: LogicalWriteInfo): WriteBuilder
    Definition Classes
    TextTable → SupportsWrite
  19. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  20. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  21. val options: CaseInsensitiveStringMap
  22. def partitioning(): Array[Transform]
    Definition Classes
    FileTable → Table
  23. val paths: Seq[String]
  24. def properties(): Map[String, String]
    Definition Classes
    FileTable → Table
  25. lazy val schema: StructType
    Definition Classes
    FileTable → Table
  26. val sparkSession: SparkSession
  27. def supportsDataType(dataType: DataType): Boolean

    Returns whether this format supports the given DataType in read/write path.

    Returns whether this format supports the given DataType in the read/write path. By default, all data types are supported.
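    A plain-text source plausibly restricts this to strings only; a minimal sketch of such an override:

    ```scala
    import org.apache.spark.sql.types.{DataType, StringType}

    // Sketch: a text source can only carry string data, so every other
    // DataType is rejected in both the read and write paths.
    def supportsDataType(dataType: DataType): Boolean =
      dataType == StringType
    ```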

    Definition Classes
    TextTable → FileTable
  28. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  29. val userSpecifiedSchema: Option[StructType]
  30. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  32. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
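Reads and writes against this table are normally driven through the DataFrame API, which ends up calling newScanBuilder and newWriteBuilder above; a minimal end-to-end sketch (the paths and session settings are assumptions):

```scala
import org.apache.spark.sql.SparkSession

// End-to-end sketch: spark.read.text and DataFrameWriter.text exercise
// newScanBuilder / newWriteBuilder on the underlying text table.
val spark = SparkSession.builder().master("local[*]").getOrCreate()
val df = spark.read.text("/tmp/in.txt")       // assumed input path
df.write.mode("overwrite").text("/tmp/out")   // assumed output path
```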

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated

Inherited from Serializable

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from FileTable

Inherited from SupportsWrite

Inherited from SupportsRead

Inherited from Table

Inherited from AnyRef

Inherited from Any
