org.apache.spark.sql.execution

CacheManager

class CacheManager extends Logging

Provides support in a SQLContext for caching query results and automatically using these cached results when subsequent queries are executed. Data is cached using byte buffers stored in an InMemoryRelation. This relation is automatically substituted into query plans that return the sameResult as the originally cached query.

Internal to Spark SQL.
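
Because CacheManager is internal, applications normally drive it indirectly through the public Dataset and Catalog APIs, which delegate to it. A minimal sketch of that public path, assuming a local SparkSession; the app name, DataFrame, and column below are illustrative only and not part of this class:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.storage.StorageLevel

    val spark = SparkSession.builder().appName("cache-demo").master("local[*]").getOrCreate()
    val df = spark.range(0, 1000).toDF("id")

    df.persist(StorageLevel.MEMORY_AND_DISK) // handled by CacheManager.cacheQuery
    df.count()                               // first action materializes the InMemoryRelation

    // Subsequent plans that contain the cached plan are rewritten to read from the cached relation.
    df.filter("id > 10").count()

    df.unpersist()                           // handled by CacheManager.uncacheQuery
    spark.catalog.clearCache()               // handled by CacheManager.clearCache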

Linear Supertypes
Logging, AnyRef, Any

Instance Constructors

  1. new CacheManager()

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def cacheQuery(query: Dataset[_], tableName: Option[String] = None, storageLevel: StorageLevel = MEMORY_AND_DISK): Unit

    Caches the data produced by the logical representation of the given Dataset. Unlike RDD.cache(), the default storage level is set to MEMORY_AND_DISK because recomputing the in-memory columnar representation of the underlying table is expensive. (See the usage sketch after this member list.)

  8. def clearCache(): Unit

    Clears all cached tables.

  9. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  12. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  15. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  16. def isEmpty: Boolean

    Checks if the cache is empty.

  17. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  18. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  19. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  20. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  21. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  22. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  23. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  24. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  25. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  26. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  27. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  28. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  29. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  30. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  31. def lookupCachedData(plan: LogicalPlan): Option[CachedData]

    Optionally returns cached data for the given LogicalPlan.

  32. def lookupCachedData(query: Dataset[_]): Option[CachedData]

    Optionally returns cached data for the given Dataset.

  33. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  34. final def notify(): Unit

    Definition Classes
    AnyRef
  35. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  36. def recacheByPath(spark: SparkSession, resourcePath: String): Unit

    Tries to re-cache all the cache entries that contain resourcePath in one or more HadoopFsRelation node(s) as part of their logical plans.

  37. def recacheByPlan(spark: SparkSession, plan: LogicalPlan): Unit

    Tries to re-cache all the cache entries that refer to the given plan.

  38. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  39. def toString(): String

    Definition Classes
    AnyRef → Any
  40. def uncacheQuery(spark: SparkSession, plan: LogicalPlan, blocking: Boolean): Unit

    Un-cache all the cache entries that refer to the given plan.

  41. def uncacheQuery(query: Dataset[_], blocking: Boolean = true): Unit

    Un-cache all the cache entries that refer to the logical plan of the given Dataset.

  42. def useCachedData(plan: LogicalPlan): LogicalPlan

    Replaces segments of the given logical plan with cached versions where possible.

  43. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  44. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  45. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
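
The sketch below strings the CacheManager-specific members above into one flow, as referenced from the cacheQuery entry. It is a hypothetical illustration rather than Spark code: it assumes it is compiled inside the org.apache.spark.sql package tree (the class is internal to Spark SQL), that the manager is reached via spark.sharedState.cacheManager, and the table name "events" is made up.

    import org.apache.spark.sql.{Dataset, SparkSession}
    import org.apache.spark.storage.StorageLevel

    def cacheAndReuse(spark: SparkSession, df: Dataset[_]): Unit = {
      val cacheManager = spark.sharedState.cacheManager

      // Cache the Dataset's logical plan; MEMORY_AND_DISK is the documented default.
      cacheManager.cacheQuery(df, tableName = Some("events"), storageLevel = StorageLevel.MEMORY_AND_DISK)

      // The entry can be looked up by Dataset or by LogicalPlan, and useCachedData
      // substitutes the cached InMemoryRelation into matching segments of a plan.
      assert(cacheManager.lookupCachedData(df).isDefined)
      val rewritten = cacheManager.useCachedData(df.queryExecution.analyzed)

      // Drop this entry, then everything; blocking = true waits for the unpersist to finish.
      cacheManager.uncacheQuery(df, blocking = true)
      cacheManager.clearCache()
      assert(cacheManager.isEmpty)
    }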
