com.mongodb.spark.config

object ReadPreferenceConfig extends MongoInputConfig

The ReadPreferenceConfig companion object

Since

1.0
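
For example, a minimal sketch of building a ReadPreferenceConfig either from a driver ReadPreference instance or from a plain options map (the unprefixed "readPreference.name" key shown below is an assumption based on the connector's documented configuration keys):

    import com.mongodb.ReadPreference
    import com.mongodb.spark.config.ReadPreferenceConfig

    // From a driver ReadPreference instance
    val fromInstance: ReadPreferenceConfig =
      ReadPreferenceConfig(ReadPreference.secondaryPreferred())

    // From a plain options map; keys do not need the configPrefix
    val fromOptions: ReadPreferenceConfig =
      ReadPreferenceConfig(Map("readPreference.name" -> "secondaryPreferred"))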

Linear Supertypes
MongoInputConfig, MongoCompanionConfig, Serializable, Serializable, AnyRef, Any

Type Members

  1. type Self = ReadPreferenceConfig

    The type of the MongoConfig

    Definition Classes
    ReadPreferenceConfig → MongoCompanionConfig

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. val allowDiskUseProperty: String

    The allow disk use property

    Enables writing to temporary files

    Definition Classes
    MongoInputConfig
    Since

    2.3.1

  5. def apply(options: Map[String, String], default: Option[ReadPreferenceConfig]): ReadPreferenceConfig

    Create a configuration from the values in the Map, using the optional default configuration for any default values.

    Note: Values in the map do not need to be prefixed with the configPrefix.

    options

    a map of properties and their string values

    default

    the optional default configuration, used for determining the default values for the properties

    returns

    the configuration

    Definition Classes
    ReadPreferenceConfig → MongoCompanionConfig
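
A minimal sketch of layering a map of options over an existing configuration; any property missing from the map falls back to the default (the unprefixed key name below is an assumption):

    val base = ReadPreferenceConfig(ReadPreference.primary())
    val layered = ReadPreferenceConfig(
      Map("readPreference.name" -> "secondaryPreferred"),
      Some(base)
    )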
  6. def apply(readPreference: ReadPreference): ReadPreferenceConfig

    Creates a ReadPreferenceConfig from a ReadPreference instance

    readPreference

    the read preference

    returns

    the configuration
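
For example, a sketch using a tagged read preference built with the Java driver's TagSet API (assuming the driver 3.x Tag and TagSet classes):

    import com.mongodb.{ ReadPreference, Tag, TagSet }

    // Prefer secondaries tagged dc=east
    val tagged = ReadPreferenceConfig(
      ReadPreference.secondary(new TagSet(new Tag("dc", "east")))
    )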

  7. def apply(options: Map[String, String]): Self

    Create a configuration from the values in the Map

    Note: Values in the map do not need to be prefixed with the configPrefix.

    options

    a map of properties and their string values

    returns

    the configuration

    Definition Classes
    MongoCompanionConfig
  8. def apply(sparkConf: SparkConf, options: Map[String, String]): Self

    Create a configuration from the sparkConf

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    sparkConf

    the spark configuration

    options

    additional options that take precedence over the corresponding properties in the sparkConf

    returns

    the configuration

    Definition Classes
    MongoCompanionConfig
    See also

    configPrefix

  9. def apply(sparkConf: SparkConf): Self

    Create a configuration from the sparkConf

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    sparkConf

    the spark configuration

    returns

    the configuration

    Definition Classes
    MongoCompanionConfig
    See also

    configPrefix
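
A sketch of building the configuration from a SparkConf; the "spark.mongodb.input." prefix shown here is an assumption based on the connector's documented configuration keys:

    import org.apache.spark.SparkConf

    val sparkConf = new SparkConf()
      .set("spark.mongodb.input.uri", "mongodb://host:27017/db.coll")
      .set("spark.mongodb.input.readPreference.name", "secondaryPreferred")

    val fromConf = ReadPreferenceConfig(sparkConf)
    // The (sparkConf, options) overload layers extra options on top of the SparkConf values
    val overridden = ReadPreferenceConfig(sparkConf, Map("readPreference.name" -> "primaryPreferred"))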

  10. def apply(sparkSession: SparkSession): Self

    Create a configuration from the SparkSession

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    sparkSession

    the SparkSession

    returns

    the configuration

    Definition Classes
    MongoCompanionConfig
    See also

    configPrefix
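
A sketch reading the prefixed properties from a SparkSession's configuration (the key names are assumptions based on the connector's documented keys):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .config("spark.mongodb.input.uri", "mongodb://host:27017/db.coll")
      .config("spark.mongodb.input.readPreference.name", "secondary")
      .getOrCreate()

    val fromSession = ReadPreferenceConfig(spark)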

  11. def apply(sparkContext: SparkContext): Self

    Create a configuration from the sparkContext

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    sparkContext

    the spark context

    returns

    the configuration

    Definition Classes
    MongoCompanionConfig
    See also

    configPrefix

  12. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  13. val batchSizeProperty: String

    The batch size property

    The size of batches used by the underlying cursor. Smaller batches will result in more round trips to MongoDB.

    Default: the server's default

    Definition Classes
    MongoInputConfig
    Since

    2.4.1

  14. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  15. val collationProperty: String

    The collation property

    The json representation of a Collation. Created via Collation.asDocument.toJson.

    Definition Classes
    MongoInputConfig
    Since

    2.3

  16. def collectionName(collectionNameProperty: String, options: Map[String, String], default: Option[String] = None): String

    Attributes
    protected
    Definition Classes
    MongoCompanionConfig
  17. val collectionNameProperty: String

    The collection name property

    Definition Classes
    MongoInputConfig
  18. val configPrefix: String

    The configuration prefix string for the current configuration scope

    Definition Classes
    MongoInputConfig → MongoCompanionConfig
  19. def connectionString(options: Map[String, String]): ConnectionString

    Attributes
    protected
    Definition Classes
    MongoCompanionConfig
  20. def create(sparkSession: SparkSession): ReadPreferenceConfig

    Create a configuration easily from the Java API using the SparkSession

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    sparkSession

    the SparkSession

    returns

    the configuration

    Definition Classes
    ReadPreferenceConfig → MongoCompanionConfig
    See also

    configPrefix

  21. def create(sparkConf: SparkConf, options: Map[String, String]): ReadPreferenceConfig

    Create a configuration from the sparkConf

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    sparkConf

    the spark configuration

    options

    additional options that take precedence over the corresponding properties in the sparkConf

    returns

    the configuration

    Definition Classes
    ReadPreferenceConfig → MongoCompanionConfig
    See also

    configPrefix

  22. def create(options: Map[String, String], default: ReadPreferenceConfig): ReadPreferenceConfig

    Create a configuration easily from the Java API using the values in the Map, using the optional default configuration for any default values.

    Note: Values in the map do not need to be prefixed with the configPrefix.

    options

    a map of properties and their string values

    default

    the optional default configuration, used for determining the default values for the properties

    returns

    the configuration

    Definition Classes
    ReadPreferenceConfig → MongoCompanionConfig
  23. def create(options: Map[String, String]): ReadPreferenceConfig

    Create a configuration easily from the Java API using the values in the Map

    Note: Values in the map do not need to be prefixed with the configPrefix.

    options

    a map of properties and their string values

    returns

    the configuration

    Definition Classes
    ReadPreferenceConfig → MongoCompanionConfig
  24. def create(sparkConf: SparkConf): ReadPreferenceConfig

    Create a configuration easily from the Java API using the sparkConf

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    sparkConf

    the spark configuration

    returns

    the configuration

    Definition Classes
    ReadPreferenceConfig → MongoCompanionConfig
    See also

    configPrefix

  25. def create(javaSparkContext: JavaSparkContext): ReadPreferenceConfig

    Create a configuration easily from the Java API using the JavaSparkContext

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    javaSparkContext

    the java spark context

    returns

    the configuration

    Definition Classes
    ReadPreferenceConfig → MongoCompanionConfig
    See also

    configPrefix
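
A sketch of the Java-friendly factory, shown in Scala for consistency with the rest of this page (assumes an existing SparkContext named sc with the prefixed properties already set):

    import org.apache.spark.api.java.JavaSparkContext

    val javaSparkContext = new JavaSparkContext(sc)
    val fromJavaContext = ReadPreferenceConfig.create(javaSparkContext)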

  26. def create(readPreference: ReadPreference): ReadPreferenceConfig

    Creates a ReadPreferenceConfig from a ReadPreference instance

    readPreference

    the read preference

    returns

    the configuration

  27. def create(): ReadPreferenceConfig

    Default configuration

    returns

    the configuration
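
For example, the no-argument factory simply returns the default configuration (primary read preference, per the documented default of readPreferenceNameProperty):

    val defaultConfig: ReadPreferenceConfig = ReadPreferenceConfig.create()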

  28. def databaseName(databaseNameProperty: String, options: Map[String, String], default: Option[String] = None): String

    Attributes
    protected
    Definition Classes
    MongoCompanionConfig
  29. val databaseNameProperty: String

    The database name property

    Definition Classes
    MongoInputConfig
  30. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  31. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  32. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  33. def getBoolean(newValue: Option[String], existingValue: Option[Boolean] = None, defaultValue: Boolean): Boolean

    Attributes
    protected
    Definition Classes
    MongoCompanionConfig
  34. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  35. def getInt(newValue: Option[String], existingValue: Option[Int] = None, defaultValue: Int): Int

    Attributes
    protected
    Definition Classes
    MongoCompanionConfig
  36. def getOptionsFromConf(sparkConf: SparkConf): Map[String, String]

    Gets an options map from the SparkConf

    sparkConf

    the SparkConf

    returns

    the options

    Definition Classes
    MongoCompanionConfig
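
A sketch of extracting the connector's options from an existing SparkConf (reusing the sparkConf from the earlier SparkConf example; the returned keys are assumed to be the property names without the configPrefix):

    val options = ReadPreferenceConfig.getOptionsFromConf(sparkConf)
    // e.g. options.get("readpreference.name") -- key name assumed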
  37. def getString(newValue: Option[String], existingValue: Option[String] = None, defaultValue: String): String

    Attributes
    protected
    Definition Classes
    MongoCompanionConfig
  38. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  39. val hintProperty: String

    The hint property

    The json representation of a hint document

    Definition Classes
    MongoInputConfig
    Since

    2.3

  40. val inferSchemaMapTypeEnabledProperty: String

    The infer schema MapType enabled property

    A boolean flag to enable or disable MapType inference. If this flag is enabled, large compatible struct types will be inferred as a MapType instead.

    Default: true

    Definition Classes
    MongoInputConfig
    Since

    2.3

  41. val inferSchemaMapTypeMinimumKeysProperty: String

    The infer schema MapType minimum keys property

    The minimum keys property controls how large a struct must be before a MapType should be inferred.

    Default: 250

    Definition Classes
    MongoInputConfig
    Since

    2.3

  42. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  43. val localThresholdProperty: String

    The localThreshold property

    The local threshold in milliseconds is used when choosing among multiple MongoDB servers to send a request. Only servers whose ping time is less than or equal to the server with the fastest ping time *plus* the local threshold will be chosen.

    For example, when choosing which mongos to send a request through, a localThreshold of 0 would pick the mongos with the fastest ping time.

    Default: 15 ms

    Definition Classes
    MongoInputConfig
  44. val mongoURIProperty: String

    The mongo URI property

    Represents a connection string.

    Any values set in the connection string will override any default values for the configuration.

    Definition Classes
    MongoCompanionConfig
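
Since a read preference can also be supplied in the connection string, a sketch (the unprefixed "uri" key is an assumption):

    // Values in the connection string override the configuration defaults
    val fromUri = ReadPreferenceConfig(
      Map("uri" -> "mongodb://host:27017/db.coll?readPreference=secondaryPreferred")
    )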
  45. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  46. final def notify(): Unit

    Definition Classes
    AnyRef
  47. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  48. val partitionerOptionsProperty: String

    The partitioner options property

    Represents a map of options for customising the configuration of a partitioner. Default: Map.empty[String, String]

    Definition Classes
    MongoInputConfig
  49. val partitionerProperty: String

    The partitioner property

    Represents the name of the partitioner to use when partitioning the data in the collection. Default: MongoDefaultPartitioner

    Definition Classes
    MongoInputConfig
  50. val pipelineIncludeFiltersAndProjectionsProperty: String

    The sql include pipeline filters and projections property

    A boolean flag to enable or disable pushing down filters and projections into MongoDB when using Spark SQL. A false value will be expensive, as all data will be sent to Spark and filtered there.

    Default: true

    Definition Classes
    MongoInputConfig
    Since

    2.3

  51. val pipelineIncludeNullFiltersProperty: String

    The sql include null filters in the pipeline property

    A boolean flag to enable or disable pushing null value checks into MongoDB when using Spark SQL. These checks ensure that the value exists and is not null for each non-nullable field.

    Default: true

    Definition Classes
    MongoInputConfig
    Since

    2.3

  52. val pipelineProperty: String

    The pipeline property

    Enables custom aggregation pipelines to be applied to the collection before sending data to Spark. The configured value should either be an extended json representation of a list of documents:

    """[{"$match": {"closed": false}}, {"$project": {"status": 1, "name": 1, "description": 1}}]"""

    Or the extended json syntax of a single document:

    """{"$match": {"closed": false}}"""

    Note: Custom aggregation pipelines must work with the partitioner strategy. Some stages such as $group may not work as expected.

    Definition Classes
    MongoInputConfig
    Since

    2.3.1

  53. val readConcernLevelProperty: String

    The ReadConcern level property

    Default: DEFAULT

    Definition Classes
    MongoInputConfig
    See also

    ReadConcernConfig

  54. val readPreferenceNameProperty: String

    The ReadPreference name property

    Default: primary

    Definition Classes
    MongoInputConfig
    See also

    ReadPreferenceConfig
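
Rather than hard-coding key strings, the property constants can be used as map keys; a sketch (assuming the constants hold the unprefixed key names):

    val byProperty = ReadPreferenceConfig(Map(
      ReadPreferenceConfig.readPreferenceNameProperty -> "secondaryPreferred"
    ))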

  55. val readPreferenceTagSetsProperty: String

    The ReadPreference tags property

    Definition Classes
    MongoInputConfig
    See also

    ReadPreferenceConfig

  56. val registerSQLHelperFunctions: String

    Definition Classes
    MongoInputConfig
  57. val registerSQLHelperFunctionsProperty: String

    Register SQL Helper functions

    The SQL helper functions allow easy querying of Bson types inside SQL queries

    Definition Classes
    MongoInputConfig
    Since

    1.1

  58. val samplePoolSizeProperty: String

    The sample pool size property

    The size of the pool to take a sample from, used when there is no $sample support or if there is a pushed down aggregation. Can be used to significantly reduce the costs of inferring the schema. A negative value disables limiting when using $sample and will sample from the whole collection.

    Default: 10000

    Definition Classes
    MongoInputConfig
    Since

    2.3.1

  59. val sampleSizeProperty: String

    The sample size property

    Used when sampling data from MongoDB to determine the Schema. Should be less than or equal to the sample pool size. Default: 1000

    Definition Classes
    MongoInputConfig
  60. def stripPrefix(options: Map[String, String]): Map[String, String]

    Strip the prefix from options

    options

    options that may contain the prefix

    returns

    the options with the prefix stripped

    Definition Classes
    MongoCompanionConfig
  61. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  62. def toString(): String

    Definition Classes
    AnyRef → Any
  63. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  64. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  65. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def apply(sqlContext: SQLContext): Self

    Create a configuration from the sqlContext

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    sqlContext

    the SQL context

    returns

    the configuration

    Definition Classes
    MongoCompanionConfig
    Annotations
    @deprecated
    Deprecated

    (Since version 2.0.0) As of Spark 2.0 SQLContext was replaced by SparkSession. Use the SparkSession method instead

    See also

    configPrefix

  2. def create(sqlContext: SQLContext): ReadPreferenceConfig

    Create a configuration easily from the Java API using the SQLContext

    Uses the prefixed properties that are set in the Spark configuration to create the config.

    sqlContext

    the SQL context

    returns

    the configuration

    Definition Classes
    ReadPreferenceConfig → MongoCompanionConfig
    Annotations
    @deprecated
    Deprecated

    (Since version 2.0.0) As of Spark 2.0 SQLContext was replaced by SparkSession. Use the SparkSession method instead

    See also

    configPrefix

Inherited from MongoInputConfig

Inherited from MongoCompanionConfig

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
