Builder for configuring and creating a MongoSpark. It requires a SparkSession or the SparkContext.
Create a builder for configuring the MongoSpark
a MongoSpark Builder
The default source string for creating DataFrames from MongoDB
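As a sketch of how the default source string is used, a DataFrame can be created through Spark's generic reader API. This assumes a running MongoDB deployment and a SparkSession whose conf sets `spark.mongodb.input.uri`:

```scala
import org.apache.spark.sql.SparkSession

// Assumes a local MongoDB deployment; the URI names the database and collection.
val sparkSession = SparkSession.builder()
  .master("local")
  .config("spark.mongodb.input.uri", "mongodb://localhost/test.characters")
  .getOrCreate()

// "com.mongodb.spark.sql.DefaultSource" is the connector's default source string.
val df = sparkSession.read.format("com.mongodb.spark.sql.DefaultSource").load()
```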
Load data from MongoDB
The bean class defining the schema for the data
the SparkSession containing the MongoDB connection configuration
the custom readConfig
the class of the data representing the Dataset
the Dataset
Load data from MongoDB
The bean class defining the schema for the data
the SparkSession containing the MongoDB connection configuration
the class of the data representing the Dataset
the Dataset
2.1
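A minimal sketch of the bean-class overload, assuming a hypothetical `Character` Java bean (getters and setters for `name` and `age`) and a configured SparkSession:

```scala
import com.mongodb.spark.MongoSpark

// Character is a hypothetical Java bean; its properties define the schema.
val dataset = MongoSpark.load[Character](sparkSession, classOf[Character])
dataset.filter(dataset("age") > 100).show()
```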
Load data from MongoDB
the type of the data in the RDD
the Spark context containing the MongoDB connection configuration
the class of the data contained in the RDD
a MongoRDD
Load data from MongoDB
the type of the data in the RDD
the Spark context containing the MongoDB connection configuration
the class of the data contained in the RDD
a MongoRDD
Load data from MongoDB
the Spark context containing the MongoDB connection configuration
a MongoRDD
Load data from MongoDB
the Spark context containing the MongoDB connection configuration
a MongoRDD
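The simplest form takes only the Spark context. A sketch, assuming `spark.mongodb.input.uri` is set on the SparkContext's SparkConf:

```scala
import com.mongodb.spark.MongoSpark

// Returns a MongoRDD of org.bson.Document, backed by the configured collection.
val rdd = MongoSpark.load(sc)
println(rdd.count())
println(rdd.first.toJson)
```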
Load data from MongoDB
The optional class defining the schema for the data
the SparkSession containing the MongoDB connection configuration
the custom readConfig
a MongoRDD
Load data from MongoDB
The optional class defining the schema for the data
the SparkSession containing the MongoDB connection configuration
a MongoRDD
Load data from MongoDB
the Spark context containing the MongoDB connection configuration
the custom readConfig
a MongoRDD
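A custom readConfig can override individual settings while inheriting the rest from the SparkConf. A sketch, assuming a `spark` collection exists in the configured database:

```scala
import com.mongodb.spark.MongoSpark
import com.mongodb.spark.config.ReadConfig

// Override the collection and read preference; defaults come from ReadConfig(sc).
val readConfig = ReadConfig(
  Map("collection" -> "spark", "readPreference.name" -> "secondaryPreferred"),
  Some(ReadConfig(sc)))
val customRdd = MongoSpark.load(sc, readConfig)
```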
Load data from MongoDB
the Spark context containing the MongoDB connection configuration
a MongoRDD
Load data from MongoDB
the SparkSession containing the MongoDB connection configuration
the custom readConfig
the Dataset
2.1
Load data from MongoDB
the SparkSession containing the MongoDB connection configuration
the Dataset
2.1
Creates a DataFrameReader with MongoDB as the source
the SparkSession
the DataFrameReader
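A sketch of the reader helper, which is equivalent to calling `sparkSession.read.format(...)` with MongoDB preconfigured as the source:

```scala
import com.mongodb.spark.MongoSpark

// The returned DataFrameReader already has MongoDB set as its format.
val df = MongoSpark.read(sparkSession).load()
df.printSchema()
```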
Save data to MongoDB
Uses the writeConfig for the database information
Requires a codec for the data type
the type of the data in the RDD
the RDD data to save to MongoDB
the class of the data contained in the RDD
the javaRDD
Save data to MongoDB
Uses the SparkConf for the database information
the RDD data to save to MongoDB
the javaRDD
Save data to MongoDB
Uses the SparkConf for the database and collection information
Requires a codec for the data type
the type of the data in the RDD
the RDD data to save to MongoDB
the class of the data contained in the RDD
the javaRDD
Save data to MongoDB
Uses the SparkConf for the database and collection information
the RDD data to save to MongoDB
the javaRDD
Save data to MongoDB
the DataFrameWriter to save to MongoDB
the writeConfig
Save data to MongoDB
Uses the SparkConf for the database and collection information
the DataFrameWriter to save to MongoDB
Save data to MongoDB
Note: If the dataFrame contains an _id field, the data will be upserted, replacing any existing documents in the collection.
the dataset to save to MongoDB
the writeConfig
1.1.0
Save data to MongoDB
Uses the SparkConf for the database and collection information
Note: If the dataFrame contains an _id field, the data will be upserted, replacing any existing documents in the collection.
the dataset to save to MongoDB
1.1.0
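A sketch of saving a Dataset, assuming `spark.mongodb.output.uri` is set on the SparkConf and `df` is an existing DataFrame:

```scala
import com.mongodb.spark.MongoSpark

// Writes to the collection named by spark.mongodb.output.uri; rows carrying
// an _id field replace (upsert) any existing documents with the same _id.
MongoSpark.save(df)
```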
Save data to MongoDB
the type of the data in the RDD
the RDD data to save to MongoDB
the writeConfig
Save data to MongoDB
Uses the SparkConf for the database and collection information
Requires a codec for the data type
the type of the data in the RDD
the RDD data to save to MongoDB
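A sketch of saving an RDD of Documents, assuming `spark.mongodb.output.uri` is set on the SparkConf:

```scala
import com.mongodb.spark.MongoSpark
import org.bson.Document

// Build ten documents and write them to the configured output collection.
val documents = sc.parallelize((1 to 10).map(i => Document.parse(s"{test: $i}")))
MongoSpark.save(documents)
```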
Creates a DataFrameWriter with MongoDB as the underlying output data source.
the Dataset to convert into a DataFrameWriter
the DataFrameWriter
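A sketch of the writer helper; the returned DataFrameWriter is preconfigured with MongoDB as the output source, and options such as the target collection (here the hypothetical "copies") can still be set before saving:

```scala
import com.mongodb.spark.MongoSpark

// Equivalent to df.write.format(...) with MongoDB already set as the format.
MongoSpark.write(df).option("collection", "copies").mode("overwrite").save()
```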
Load data from MongoDB
The bean class defining the schema for the data
the SQL context containing the MongoDB connection configuration
the class of the data contained in the RDD
a MongoRDD
(Since version 2.0.0) As of Spark 2.0 SQLContext was replaced by SparkSession. Use the SparkSession method instead
Load data from MongoDB
The optional class defining the schema for the data
the SQLContext containing the MongoDB connection configuration
a MongoRDD
(Since version 2.0.0) As of Spark 2.0 SQLContext was replaced by SparkSession. Use the SparkSession method instead
Load data from MongoDB
The optional class defining the schema for the data
the SQLContext containing the MongoDB connection configuration
a MongoRDD
(Since version 2.0.0) As of Spark 2.0 SQLContext was replaced by SparkSession. Use the SparkSession method instead
Creates a DataFrameReader with MongoDB as the source
the SQLContext
the DataFrameReader
(Since version 2.0.0) As of Spark 2.0 SQLContext was replaced by SparkSession. Use the SparkSession method instead
The MongoSpark helper allows easy creation of RDDs, DataFrames or Datasets from MongoDB.
1.0