class DefaultSource extends RelationProvider with SchemaRelationProvider with CreatableRelationProvider with DataSourceRegister
Snowflake Source implementation for Spark SQL. A minimal usage sketch follows the TODO list below.
Major TODO points:
- Add support for compression Snowflake->Spark
- Add support for using Snowflake Stage files, so the user doesn't need to provide AWS passwords
- Add support for VARIANT
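The source is not instantiated directly; Spark resolves it from the format name during read/write planning. A minimal read sketch, assuming the connector's usual sfURL / sfUser / sfPassword / sfDatabase / sfSchema / sfWarehouse / dbtable options and placeholder credentials:

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: all connection values are placeholders, and the option names
// assume the connector's standard sfURL / sfUser / ... parameters.
val spark = SparkSession.builder().appName("snowflake-source-example").getOrCreate()

val sfOptions = Map(
  "sfURL"       -> "myaccount.snowflakecomputing.com",
  "sfUser"      -> "USERNAME",
  "sfPassword"  -> "PASSWORD",
  "sfDatabase"  -> "MY_DB",
  "sfSchema"    -> "PUBLIC",
  "sfWarehouse" -> "MY_WH",
  "dbtable"     -> "MY_TABLE"
)

// The fully-qualified package name resolves to this DefaultSource.
val df = spark.read
  .format("net.snowflake.spark.snowflake")
  .options(sfOptions)
  .load()
```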
Linear Supertypes
- DataSourceRegister
- CreatableRelationProvider
- SchemaRelationProvider
- RelationProvider
- AnyRef
- Any
Instance Constructors
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native() @HotSpotIntrinsicCandidate()
- def createRelation(sqlContext: SQLContext, saveMode: SaveMode, parameters: Map[String, String], data: DataFrame): BaseRelation
Creates a Relation instance by first writing the contents of the given DataFrame to Snowflake (see the write sketch after this entry).
- Definition Classes
- DefaultSource → CreatableRelationProvider
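A hedged sketch of the write path that ends in this overload; spark and sfOptions are the session and option map from the class-level sketch above, and SaveMode.Overwrite is just one possible mode:

```scala
import org.apache.spark.sql.SaveMode

// Writing routes through createRelation(sqlContext, saveMode, parameters, data):
// the DataFrame contents are written to Snowflake first, then a BaseRelation
// over the target table is handed back to Spark.
val toWrite = spark.range(0, 10).toDF("ID")

toWrite.write
  .format("net.snowflake.spark.snowflake")
  .options(sfOptions)        // must include the target dbtable
  .mode(SaveMode.Overwrite)  // becomes the saveMode argument
  .save()
```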
- def createRelation(sqlContext: SQLContext, parameters: Map[String, String], schema: StructType): BaseRelation
Load a SnowflakeRelation using the user-provided schema, so no inference over JDBC will be used (see the sketch after this entry).
- Definition Classes
- DefaultSource → SchemaRelationProvider
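A sketch of a read that supplies its own schema, reusing spark and sfOptions from the class-level sketch; the column names are illustrative, and calling .schema(...) is what selects this overload:

```scala
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

// An explicit schema skips JDBC schema inference entirely.
val userSchema = StructType(Seq(
  StructField("ID",   LongType),
  StructField("NAME", StringType)
))

val dfWithSchema = spark.read
  .format("net.snowflake.spark.snowflake")
  .schema(userSchema)
  .options(sfOptions)
  .load()
```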
- def createRelation(sqlContext: SQLContext, parameters: Map[String, String]): BaseRelation
Create a new SnowflakeRelation instance using parameters from Spark SQL DDL. Resolves the schema using a JDBC connection over the provided URL, which must contain credentials (see the sketch after this entry).
- Definition Classes
- DefaultSource → RelationProvider
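A sketch of the plain read path, again reusing spark and sfOptions; with no explicit schema this overload is chosen and the schema is resolved over JDBC, so the options must carry working credentials:

```scala
// No .schema(...) call, so the schema is fetched from Snowflake metadata
// over a JDBC connection built from the provided options.
val inferred = spark.read
  .format("net.snowflake.spark.snowflake")
  .options(sfOptions)
  .load()

inferred.printSchema()  // columns and types came from JDBC inference
```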
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native() @HotSpotIntrinsicCandidate()
- def shortName(): String
- Definition Classes
- DefaultSource → DataSourceRegister
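DataSourceRegister lets the source be referenced by a short alias rather than the fully-qualified package name. A sketch assuming the alias returned by shortName() is "snowflake" (verify against your build):

```scala
// If shortName() returns "snowflake", the format can be addressed by alias.
val viaAlias = spark.read
  .format("snowflake")
  .options(sfOptions)
  .load()
```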
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable]) @Deprecated