com.ryft.spark.connector
Provides classes for DataFrames and SparkSQL support.
These classes are not intended to be used directly; they should be accessed via com.ryft.spark.connector.RyftDataFrameReader.
For example:
```scala
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types._
import org.apache.spark.{SparkConf, SparkContext}
import com.ryft.spark.connector._

// Definition for the DataFrame setup
val schema = StructType(Seq(
  StructField("Arrest", BooleanType),
  StructField("Beat", IntegerType),
  StructField("Block", StringType),
  StructField("CaseNumber", StringType),
  StructField("_index", StructType(Seq(
    StructField("file", StringType),
    StructField("offset", StringType),
    StructField("length", IntegerType),
    StructField("fuzziness", ByteType))))
))

sqlContext.read.ryft(schema, "*.crimestat", "crime_table")
```
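Once the DataFrame has been set up this way, it can be queried through SparkSQL like any other table. A minimal sketch, assuming the third argument above ("crime_table") is the name under which the reader registers the table; the column filter shown is a hypothetical query against the schema defined above:

```scala
import org.apache.spark.sql.DataFrame

// Query the Ryft-backed table through SparkSQL.
// "crime_table" is assumed to be the table name registered by the
// sqlContext.read.ryft(...) call above.
val arrests: DataFrame = sqlContext.sql(
  "SELECT Block, CaseNumber FROM crime_table WHERE Arrest = true")
arrests.show()
```

The search is pushed down to the Ryft appliance, so only matching records reach Spark.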