Package com.ryft.spark.connector

package sql

Provides classes for DataFrame and Spark SQL support.

Overview

These classes are not intended to be used directly; access them via com.ryft.spark.connector.RyftDataFrameReader.

For example:

import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types._
import org.apache.spark.{SparkConf, SparkContext}
import com.ryft.spark.connector._

// Set up the contexts the example needs (the original snippet assumed
// an existing sqlContext)
val sparkConf = new SparkConf().setAppName("RyftSQLExample")
val sc = new SparkContext(sparkConf)
val sqlContext = new SQLContext(sc)

// Schema for the DataFrame, including Ryft's `_index` metadata struct
val schema = StructType(Seq(
  StructField("Arrest", BooleanType), StructField("Beat", IntegerType),
  StructField("Block", StringType), StructField("CaseNumber", StringType),
  StructField("_index", StructType(Seq(
    StructField("file", StringType), StructField("offset", StringType),
    StructField("length", IntegerType), StructField("fuzziness", ByteType))))
))

sqlContext.read.ryft(schema, "*.crimestat", "crime_table")
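Once the relation is set up, it can be queried with Spark SQL. The sketch below assumes the third argument to ryft registers a temporary table named crime_table; the column names come from the schema above, and filter pushdown is handled by RyftRelation's PrunedFilteredScan implementation:

// Query the Ryft-backed relation with Spark SQL.
// Assumes the `sqlContext` and `crime_table` registration from the
// example above.
val arrests = sqlContext.sql(
  "SELECT Block, CaseNumber FROM crime_table WHERE Arrest = true")
arrests.show()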
Linear Supertypes: AnyRef, Any

Type Members

  1. class DefaultSource extends SchemaRelationProvider with Logging
  2. class RyftRelation extends BaseRelation with PrunedFilteredScan with Serializable with Logging
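Because DefaultSource lives in this package and implements SchemaRelationProvider, the relation can in principle also be loaded through Spark's generic data source API. This is only a sketch: the option keys "files" and "tableName" are hypothetical placeholders, not documented connector options — prefer the ryft(...) helper shown above.

// Generic data-source loading resolves DefaultSource from the
// com.ryft.spark.connector.sql package; schema is required because
// DefaultSource is a SchemaRelationProvider.
// NOTE: option keys below are hypothetical, for illustration only.
val df = sqlContext.read
  .format("com.ryft.spark.connector.sql")
  .schema(schema)
  .options(Map("files" -> "*.crimestat", "tableName" -> "crime_table"))
  .load()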
