Package com.mongodb.spark.config

package config

Type Members

  1. case class AggregationConfig(collationString: Option[String] = None, hintString: Option[String] = None, pipelineString: Option[String] = None, allowDiskUse: Boolean = ...) extends MongoClassConfig with Product with Serializable

    The aggregation configuration.

    collationString: the optional collation config
    hintString: the optional hint document in extended JSON format
    pipelineString: the optional aggregation pipeline, either a list of documents in JSON syntax or a single document in JSON syntax
    allowDiskUse: enables writing to temporary files
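A minimal sketch of constructing this configuration directly from the case-class signature above, using named arguments; the hint and pipeline values shown are illustrative:

```scala
import com.mongodb.spark.config.AggregationConfig

// Supply a hint and a pipeline as (extended) JSON strings, and allow the
// server to spill to temporary files for large aggregations.
val aggConfig = AggregationConfig(
  hintString = Some("""{"age": 1}"""),
  pipelineString = Some("""[{"$match": {"status": "active"}}, {"$project": {"name": 1}}]"""),
  allowDiskUse = true
)
```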

  2. trait MongoClassConfig extends Serializable

    Mongo Spark configurations.

    Defines helper methods for transforming or updating configurations.

    Since: 1.0

    See also: ReadConfig

  3. trait MongoCollectionConfig extends Serializable

    Configurations for connecting to a specific collection in a database.

    Since: 1.0

  4. trait MongoCompanionConfig extends Serializable

    The Mongo configuration base trait.

    Defines companion object helper methods for creating MongoConfig instances.

    Since: 1.0

  5. trait MongoInputConfig extends MongoCompanionConfig

    Mongo input configurations.

    Configurations used when reading from MongoDB.

    Configuration Properties

    The prefix when using sparkConf is spark.mongodb.input. followed by the property name.

    Since: 1.0

    See also: com.mongodb.spark.config.ReadConfig$
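For example, read-side settings can be supplied on the SparkConf under the spark.mongodb.input. prefix; the property names used here (uri, database, collection) follow the connector's documented options:

```scala
import org.apache.spark.SparkConf

// Read-side settings carry the spark.mongodb.input. prefix.
val conf = new SparkConf()
  .setAppName("mongo-read-example")
  .set("spark.mongodb.input.uri", "mongodb://127.0.0.1/")
  .set("spark.mongodb.input.database", "test")
  .set("spark.mongodb.input.collection", "characters")
```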

  6. trait MongoOutputConfig extends MongoCompanionConfig

    Mongo output configurations.

    Configurations used when writing data from Spark into MongoDB.

    Configuration Properties

    The prefix when using sparkConf is spark.mongodb.output. followed by the property name.

    Since: 1.0

    See also: WriteConfig
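Analogously to the input side, write-side settings can be supplied on the SparkConf under the spark.mongodb.output. prefix; the writeConcern.w property shown is taken from the connector's documented output options:

```scala
import org.apache.spark.SparkConf

// Write-side settings carry the spark.mongodb.output. prefix.
val conf = new SparkConf()
  .setAppName("mongo-write-example")
  .set("spark.mongodb.output.uri", "mongodb://127.0.0.1/")
  .set("spark.mongodb.output.database", "test")
  .set("spark.mongodb.output.collection", "characters")
  .set("spark.mongodb.output.writeConcern.w", "majority")
```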

  7. case class ReadConcernConfig(readConcernLevel: Option[String] = None) extends MongoClassConfig with Product with Serializable

    The ReadConcern configuration used by the ReadConfig.

    readConcernLevel: the optional read concern level. If None, the server's default level will be used.

    Since: 1.0

  8. case class ReadConfig(databaseName: String, collectionName: String, connectionString: Option[String] = None, sampleSize: Int = ReadConfig.DefaultSampleSize, partitioner: MongoPartitioner = ReadConfig.DefaultPartitioner, partitionerOptions: Map[String, String] = ..., localThreshold: Int = ..., readPreferenceConfig: ReadPreferenceConfig = ReadPreferenceConfig(), readConcernConfig: ReadConcernConfig = ReadConcernConfig(), aggregationConfig: AggregationConfig = AggregationConfig(), registerSQLHelperFunctions: Boolean = ..., inferSchemaMapTypesEnabled: Boolean = ..., inferSchemaMapTypesMinimumKeys: Int = ..., pipelineIncludeNullFilters: Boolean = ..., pipelineIncludeFiltersAndProjections: Boolean = ..., samplePoolSize: Int = ReadConfig.DefaultSamplePoolSize) extends MongoCollectionConfig with MongoClassConfig with Product with Serializable

    Read configuration used when reading data from MongoDB.

    databaseName: the database name
    collectionName: the collection name
    connectionString: the optional connection string used in the creation of this configuration
    sampleSize: a positive integer sample size to draw from the collection when inferring the schema
    partitioner: the class name of the partitioner to use to create partitions
    partitionerOptions: the configuration options for the partitioner
    localThreshold: the local threshold in milliseconds used when choosing among multiple MongoDB servers to send a request. Only servers whose ping time is less than or equal to the server with the fastest ping time plus the local threshold will be chosen.
    readPreferenceConfig: the ReadPreference configuration
    readConcernConfig: the ReadConcern configuration
    aggregationConfig: the aggregation configuration
    registerSQLHelperFunctions: true to register SQL helper functions
    inferSchemaMapTypesEnabled: true to detect MapTypes when inferring the schema
    inferSchemaMapTypesMinimumKeys: the minimum number of keys before a document can be inferred as a MapType
    pipelineIncludeNullFilters: true to include and push down null and exists filters into the pipeline when using SQL
    pipelineIncludeFiltersAndProjections: true to push down filters and projections into the pipeline when using SQL
    samplePoolSize: the size of the pool to draw a sample from, used when there is no $sample support or when an aggregation is pushed down

    Since: 1.0
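A minimal sketch of building a ReadConfig from the case-class signature above; only databaseName and collectionName are required, and every other field falls back to its default. The connection string and read concern level shown are illustrative:

```scala
import com.mongodb.spark.config.{ReadConfig, ReadConcernConfig}

val readConfig = ReadConfig(
  databaseName = "test",
  collectionName = "characters",
  connectionString = Some("mongodb://127.0.0.1/"),
  readConcernConfig = ReadConcernConfig(readConcernLevel = Some("majority"))
)
```

The companion object also exposes helpers for deriving a config from a SparkContext or SparkConf and overlaying per-job options on top of it (see the ReadConfig companion object below).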

  9. case class ReadPreferenceConfig(name: String = "primary", tagSets: Option[String] = None) extends MongoClassConfig with Product with Serializable

    The ReadPreference configuration used by the ReadConfig.

    name: the read preference name
    tagSets: the optional string of tag sets

    Since: 1.0
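A short sketch of constructing a ReadPreference configuration from the case-class signature above. The tagSets format shown, a JSON array of tag documents with an empty document as a fallback, is assumed from the MongoDB driver's tag-set syntax:

```scala
import com.mongodb.spark.config.ReadPreferenceConfig

// Prefer secondaries tagged for analytics workloads, falling back to any member.
val readPreference = ReadPreferenceConfig(
  name = "secondaryPreferred",
  tagSets = Some("""[{"workload": "analytics"}, {}]""")
)
```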

  10. case class WriteConcernConfig(w: Option[Int] = None, wName: Option[String] = None, journal: Option[Boolean] = None, wTimeout: Option[Duration] = None) extends MongoClassConfig with Product with Serializable

    The WriteConcern configuration used by the WriteConfig.

    w: the optional w integer value
    wName: the optional w string value
    journal: the optional journal value
    wTimeout: the optional timeout value

    Since: 1.0

  11. case class WriteConfig(databaseName: String, collectionName: String, connectionString: Option[String] = None, replaceDocument: Boolean = WriteConfig.DefaultReplaceDocument, maxBatchSize: Int = WriteConfig.DefaultMaxBatchSize, localThreshold: Int = ..., writeConcernConfig: WriteConcernConfig = WriteConcernConfig.Default, shardKey: Option[String] = None, forceInsert: Boolean = WriteConfig.DefaultForceInsert, ordered: Boolean = WriteConfig.DefautOrdered) extends MongoCollectionConfig with MongoClassConfig with Product with Serializable

    Write configuration used when writing data to MongoDB.

    databaseName: the database name
    collectionName: the collection name
    connectionString: the optional connection string used in the creation of this configuration
    replaceDocument: if true, replaces the whole document when saving a Dataset that contains an _id field; if false, only updates/sets the fields declared in the Dataset
    maxBatchSize: the maximum batch size when performing a bulk update/insert. Defaults to 512.
    localThreshold: the local threshold in milliseconds used when choosing among multiple MongoDB servers to send a request. Only servers whose ping time is less than or equal to the server with the fastest ping time plus the local threshold will be chosen.
    writeConcernConfig: the write concern configuration
    shardKey: an optional shard key in extended form: "{key: 1, key2: 1}". Used when upserting Datasets in sharded clusters.
    forceInsert: if true, forces the writes to be inserts even if a Dataset contains an _id field. Defaults to false.
    ordered: configures the bulk operation's ordered property. Defaults to true.

    Since: 1.0
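A minimal sketch of building a WriteConfig from the case-class signature above, combined with a WriteConcernConfig; the connection string and the particular option values are illustrative:

```scala
import com.mongodb.spark.config.{WriteConfig, WriteConcernConfig}

// Journaled, majority-acknowledged writes, in smaller batches, that update
// only the fields declared in the Dataset when an _id match is found.
val writeConfig = WriteConfig(
  databaseName = "test",
  collectionName = "characters",
  connectionString = Some("mongodb://127.0.0.1/"),
  replaceDocument = false,
  maxBatchSize = 256,
  writeConcernConfig = WriteConcernConfig(wName = Some("majority"), journal = Some(true))
)
```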

Value Members

  1. object AggregationConfig extends MongoInputConfig

    The AggregationConfig companion object.

    Since: 2.3

  2. object ReadConcernConfig extends MongoInputConfig

    The ReadConcernConfig companion object.

    Since: 1.0

  3. object ReadConfig extends MongoInputConfig with LoggingTrait

    The ReadConfig companion object.

    Configuration Properties

    The prefix when using sparkConf is spark.mongodb.input. followed by the property name.

    Since: 1.0

  4. object ReadPreferenceConfig extends MongoInputConfig

    The ReadPreferenceConfig companion object.

    Since: 1.0

  5. object WriteConcernConfig extends MongoOutputConfig

    The WriteConcernConfig companion object.

    Since: 1.0

  6. object WriteConfig extends MongoOutputConfig

    The WriteConfig companion object.

    Since: 1.0
