abstract class TableValuedFunction extends AnyRef
Interface for invoking table-valued functions in Spark SQL.
- Source
- TableValuedFunction.scala
- Since
4.0.0
Instance Constructors
- new TableValuedFunction()
Abstract Value Members
- abstract def collations(): Dataset[Row]
Gets all of the Spark SQL string collations.
- Since
4.0.0
- abstract def explode(collection: Column): Dataset[Row]
Creates a DataFrame containing a new row for each element in the given array or map column. Uses the default column name `col` for elements in the array and `key` and `value` for elements in the map unless specified otherwise.
- Since
4.0.0
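A minimal sketch of a call, assuming a running SparkSession named `spark` whose `tvf` accessor exposes this interface:

```scala
import org.apache.spark.sql.functions.{array, lit}

// Explode a three-element array literal into three rows.
// The output column uses the default name `col`.
val df = spark.tvf.explode(array(lit(1), lit(2), lit(3)))
df.show()
// +---+
// |col|
// +---+
// |  1|
// |  2|
// |  3|
// +---+
```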
- abstract def explode_outer(collection: Column): Dataset[Row]
Creates a DataFrame containing a new row for each element in the given array or map column. Uses the default column name `col` for elements in the array and `key` and `value` for elements in the map unless specified otherwise. Unlike explode, if the array/map is null or empty then null is produced.
- Since
4.0.0
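A sketch of the null/empty behavior, again assuming a SparkSession `spark` with the `tvf` accessor:

```scala
import org.apache.spark.sql.functions.lit

// A null array yields one row with a null `col`;
// plain explode would yield no rows at all.
val df = spark.tvf.explode_outer(lit(null).cast("array<int>"))
df.show()
// +----+
// | col|
// +----+
// |NULL|
// +----+
```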
- abstract def inline(input: Column): Dataset[Row]
Creates a DataFrame containing a new row for each element in the given array of structs.
- Since
4.0.0
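A hedged sketch (assuming `spark.tvf` exposes this interface): each struct in the array becomes one row, and the struct fields become columns.

```scala
import org.apache.spark.sql.functions.{array, lit, named_struct}

val df = spark.tvf.inline(
  array(
    named_struct(lit("a"), lit(1), lit("b"), lit("x")),
    named_struct(lit("a"), lit(2), lit("b"), lit("y"))))
df.show()
// +---+---+
// |  a|  b|
// +---+---+
// |  1|  x|
// |  2|  y|
// +---+---+
```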
- abstract def inline_outer(input: Column): Dataset[Row]
Creates a DataFrame containing a new row for each element in the given array of structs. Unlike inline, if the array is null or empty then null is produced for each nested column.
- Since
4.0.0
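A sketch of the contrast with inline (same `spark.tvf` assumption):

```scala
import org.apache.spark.sql.functions.lit

// A null array of structs yields one row of nulls for each nested
// column (a, b); plain inline would yield no rows.
val df = spark.tvf.inline_outer(
  lit(null).cast("array<struct<a: int, b: string>>"))
```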
- abstract def json_tuple(input: Column, fields: Column*): Dataset[Row]
Creates a DataFrame containing a new row for a JSON column according to the given field names.
- Annotations
- @varargs()
- Since
4.0.0
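A minimal sketch, assuming a SparkSession `spark` whose `tvf` accessor exposes this interface; json_tuple names its output columns c0, c1, ... by default:

```scala
import org.apache.spark.sql.functions.lit

// Extract the fields "a" and "b" from a JSON string literal.
val df = spark.tvf.json_tuple(
  lit("""{"a": 1, "b": "hello"}"""),
  lit("a"), lit("b"))
df.show()
// +---+-----+
// | c0|   c1|
// +---+-----+
// |  1|hello|
// +---+-----+
```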
- abstract def posexplode(collection: Column): Dataset[Row]
Creates a DataFrame containing a new row for each element with position in the given array or map column. Uses the default column name `pos` for position, and `col` for elements in the array and `key` and `value` for elements in the map unless specified otherwise.
- Since
4.0.0
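A sketch with a map column (assuming `spark.tvf` exposes this interface); for maps the output columns are `pos`, `key`, and `value`:

```scala
import org.apache.spark.sql.functions.{lit, map}

val df = spark.tvf.posexplode(map(lit("a"), lit(1), lit("b"), lit(2)))
df.show()
// +---+---+-----+
// |pos|key|value|
// +---+---+-----+
// |  0|  a|    1|
// |  1|  b|    2|
// +---+---+-----+
```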
- abstract def posexplode_outer(collection: Column): Dataset[Row]
Creates a DataFrame containing a new row for each element with position in the given array or map column. Uses the default column name `pos` for position, and `col` for elements in the array and `key` and `value` for elements in the map unless specified otherwise. Unlike posexplode, if the array/map is null or empty then the row (null, null) is produced.
- Since
4.0.0
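A sketch of the null behavior (same `spark.tvf` assumption):

```scala
import org.apache.spark.sql.functions.lit

// A null array yields the single row (null, null) for (pos, col);
// plain posexplode would yield no rows.
val df = spark.tvf.posexplode_outer(lit(null).cast("array<int>"))
```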
- abstract def range(start: Long, end: Long, step: Long, numPartitions: Int): Dataset[Long]
Creates a Dataset with a single `LongType` column named `id`, containing elements in a range from `start` to `end` (exclusive) with the given step value, using the specified number of partitions.
- Since
4.0.0
- abstract def range(start: Long, end: Long, step: Long): Dataset[Long]
Creates a Dataset with a single `LongType` column named `id`, containing elements in a range from `start` to `end` (exclusive) with the given step value.
- Since
4.0.0
- abstract def range(start: Long, end: Long): Dataset[Long]
Creates a Dataset with a single `LongType` column named `id`, containing elements in a range from `start` to `end` (exclusive) with step value 1.
- Since
4.0.0
- abstract def range(end: Long): Dataset[Long]
Creates a Dataset with a single `LongType` column named `id`, containing elements in a range from 0 to `end` (exclusive) with step value 1.
- Since
4.0.0
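The four `range` overloads above can be sketched together, assuming a SparkSession `spark` whose `tvf` accessor exposes this interface:

```scala
spark.tvf.range(3)            // id: 0, 1, 2
spark.tvf.range(1, 5)         // id: 1, 2, 3, 4
spark.tvf.range(0, 10, 3)     // id: 0, 3, 6, 9
spark.tvf.range(0, 10, 3, 2)  // same rows, distributed over 2 partitions
```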
- abstract def sql_keywords(): Dataset[Row]
Gets Spark SQL keywords.
- Since
4.0.0
- abstract def stack(n: Column, fields: Column*): Dataset[Row]
Separates `col1`, ..., `colk` into `n` rows. Uses column names `col0`, `col1`, etc. by default unless specified otherwise.
- Annotations
- @varargs()
- Since
4.0.0
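A minimal sketch (assuming `spark.tvf` exposes this interface): four values reshaped into 2 rows of 2 columns.

```scala
import org.apache.spark.sql.functions.lit

val df = spark.tvf.stack(lit(2), lit(1), lit(2), lit(3), lit(4))
df.show()
// +----+----+
// |col0|col1|
// +----+----+
// |   1|   2|
// |   3|   4|
// +----+----+
```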
- abstract def variant_explode(input: Column): Dataset[Row]
Separates a variant object/array into multiple rows containing its fields/elements. Its result schema is `struct<pos int, key string, value variant>`. `pos` is the position of the field/element in its parent object/array, and `value` is the field/element value. `key` is the field name when exploding a variant object, or is NULL when exploding a variant array. It ignores any input that is not a variant array/object, including SQL NULL, variant null, and any other variant values.
- Since
4.0.0
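A sketch with a variant array (assuming `spark.tvf` exposes this interface); `key` is NULL because the input is an array, not an object:

```scala
import org.apache.spark.sql.functions.{lit, parse_json}

val df = spark.tvf.variant_explode(parse_json(lit("""["a", "b"]""")))
// Rows: pos=0, key=NULL, value="a" and pos=1, key=NULL, value="b"
```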
- abstract def variant_explode_outer(input: Column): Dataset[Row]
Separates a variant object/array into multiple rows containing its fields/elements. Its result schema is `struct<pos int, key string, value variant>`. `pos` is the position of the field/element in its parent object/array, and `value` is the field/element value. `key` is the field name when exploding a variant object, or is NULL when exploding a variant array. Unlike variant_explode, if the given variant is not a variant array/object (including SQL NULL, variant null, and any other variant values), then NULL is produced.
- Since
4.0.0
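A sketch of the contrast with variant_explode (same `spark.tvf` assumption):

```scala
import org.apache.spark.sql.functions.{lit, parse_json}

// A scalar variant is not an array/object, so one all-NULL row is
// produced; plain variant_explode would yield no rows.
val df = spark.tvf.variant_explode_outer(parse_json(lit("1")))
```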
Concrete Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable]) @Deprecated
- Deprecated
(Since version 9)