public class CreateSparkAppRequest extends AbstractModel
| Constructor and Description |
|---|
| `CreateSparkAppRequest()` |
| `CreateSparkAppRequest(CreateSparkAppRequest source)` NOTE: Any ambiguous key set via `.set("AnyKey", "value")` will be a shallow copy, and any explicit key, i.e. `Foo`, set via `.setFoo("value")` will be a deep copy. |
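The shallow/deep copy note above can be illustrated with a small, self-contained sketch. This is not the SDK source; `Request`, `appName`, and `any` are stand-in names for the pattern described:

```java
import java.util.HashMap;

// Illustrative sketch of the copy-constructor semantics described in the
// NOTE: explicit, typed fields behave like deep copies (immutable values
// are duplicated per request), while ambiguous keys stored in a generic
// map are copied shallowly (the map reference is shared).
public class CopySemanticsDemo {
    static class Request {
        // Explicit key (like Foo set via setFoo): independent per copy.
        String appName;
        // Ambiguous keys (like .set("AnyKey", "value")): backing map shared.
        HashMap<String, Object> any = new HashMap<>();

        Request() {}

        Request(Request source) {
            this.appName = source.appName; // String is immutable: effectively deep
            this.any = source.any;         // shallow: both requests share this map
        }
    }

    public static void main(String[] args) {
        Request a = new Request();
        a.appName = "job-1";
        a.any.put("AnyKey", "value");

        Request b = new Request(a);
        a.any.put("AnyKey", "changed"); // visible through b (shallow copy)
        a.appName = "job-2";            // not visible through b (deep copy)

        System.out.println(b.any.get("AnyKey")); // prints "changed"
        System.out.println(b.appName);           // prints "job-1"
    }
}
```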
| Modifier and Type | Method and Description |
|---|---|
| `String` | `getAppArchives()` Get Archives: Dependency resources |
| `String` | `getAppConf()` Get Spark configurations separated by line breaks |
| `String` | `getAppDriverSize()` Get Driver resource specification of the Spark job. |
| `Long` | `getAppExecutorMaxNumbers()` Get The specified executor count (max), which defaults to 1. |
| `Long` | `getAppExecutorNums()` Get Number of Spark job executors |
| `String` | `getAppExecutorSize()` Get Executor resource specification of the Spark job. |
| `String` | `getAppFile()` Get Execution entry of the Spark application |
| `String` | `getAppFiles()` Get Dependency resources of the Spark job, separated by commas |
| `String` | `getAppJars()` Get Dependency JAR packages of the Spark JAR job, separated by commas |
| `String` | `getAppName()` Get Spark application name |
| `String` | `getAppPythonFiles()` Get PySpark: Python dependency, which can be in .py, .zip, or .egg format. |
| `Long` | `getAppType()` Get 1: Spark JAR application; 2: Spark streaming application |
| `String` | `getCmdArgs()` Get Command line parameters of the Spark job |
| `String` | `getDataEngine()` Get The data engine executing the Spark job |
| `String` | `getDataSource()` Get Data source name |
| `String` | `getEni()` Get This field is deprecated. |
| `String` | `getIsLocal()` Get Whether it is uploaded locally. |
| `String` | `getIsLocalArchives()` Get Archives: Dependency upload method. |
| `String` | `getIsLocalFiles()` Get Whether it is uploaded locally. |
| `String` | `getIsLocalJars()` Get Whether it is uploaded locally. |
| `String` | `getIsLocalPythonFiles()` Get PySpark: Dependency upload method. |
| `String` | `getMainClass()` Get Main class of the Spark JAR job during execution |
| `Long` | `getMaxRetries()` Get This parameter takes effect only for Spark flow tasks. |
| `Long` | `getRoleArn()` Get Execution role ID of the Spark job |
| `String` | `getSessionId()` Get The ID of the associated Data Lake Compute query script. |
| `String` | `getSparkImage()` Get The Spark image version. |
| `String` | `getSparkImageVersion()` Get The Spark image version name. |
| `void` | `setAppArchives(String AppArchives)` Set Archives: Dependency resources |
| `void` | `setAppConf(String AppConf)` Set Spark configurations separated by line breaks |
| `void` | `setAppDriverSize(String AppDriverSize)` Set Driver resource specification of the Spark job. |
| `void` | `setAppExecutorMaxNumbers(Long AppExecutorMaxNumbers)` Set The specified executor count (max), which defaults to 1. |
| `void` | `setAppExecutorNums(Long AppExecutorNums)` Set Number of Spark job executors |
| `void` | `setAppExecutorSize(String AppExecutorSize)` Set Executor resource specification of the Spark job. |
| `void` | `setAppFile(String AppFile)` Set Execution entry of the Spark application |
| `void` | `setAppFiles(String AppFiles)` Set Dependency resources of the Spark job, separated by commas |
| `void` | `setAppJars(String AppJars)` Set Dependency JAR packages of the Spark JAR job, separated by commas |
| `void` | `setAppName(String AppName)` Set Spark application name |
| `void` | `setAppPythonFiles(String AppPythonFiles)` Set PySpark: Python dependency, which can be in .py, .zip, or .egg format. |
| `void` | `setAppType(Long AppType)` Set 1: Spark JAR application; 2: Spark streaming application |
| `void` | `setCmdArgs(String CmdArgs)` Set Command line parameters of the Spark job |
| `void` | `setDataEngine(String DataEngine)` Set The data engine executing the Spark job |
| `void` | `setDataSource(String DataSource)` Set Data source name |
| `void` | `setEni(String Eni)` Set This field is deprecated. |
| `void` | `setIsLocal(String IsLocal)` Set Whether it is uploaded locally. |
| `void` | `setIsLocalArchives(String IsLocalArchives)` Set Archives: Dependency upload method. |
| `void` | `setIsLocalFiles(String IsLocalFiles)` Set Whether it is uploaded locally. |
| `void` | `setIsLocalJars(String IsLocalJars)` Set Whether it is uploaded locally. |
| `void` | `setIsLocalPythonFiles(String IsLocalPythonFiles)` Set PySpark: Dependency upload method. |
| `void` | `setMainClass(String MainClass)` Set Main class of the Spark JAR job during execution |
| `void` | `setMaxRetries(Long MaxRetries)` Set This parameter takes effect only for Spark flow tasks. |
| `void` | `setRoleArn(Long RoleArn)` Set Execution role ID of the Spark job |
| `void` | `setSessionId(String SessionId)` Set The ID of the associated Data Lake Compute query script. |
| `void` | `setSparkImage(String SparkImage)` Set The Spark image version. |
| `void` | `setSparkImageVersion(String SparkImageVersion)` Set The Spark image version name. |
| `void` | `toMap(HashMap<String,String> map, String prefix)` Internal implementation; normal users should not use it. |
Methods inherited from class AbstractModel:
`any`, `fromJsonString`, `getBinaryParams`, `getMultipartRequestParams`, `set`, `setParamArrayObj`, `setParamArraySimple`, `setParamObj`, `setParamSimple`, `toJsonString`
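The documented `toMap(HashMap<String,String> map, String prefix)` method flattens the request into flat key/value pairs for the API wire format. A minimal sketch of that pattern, assuming a `setParamSimple`-style helper; the field subset and helper body here are illustrative assumptions, not the SDK implementation:

```java
import java.util.HashMap;

// Illustrative sketch (not SDK source): how a toMap(map, prefix)-style
// method writes each non-null field into the flat map under
// prefix + field name, as the request serialization expects.
public class ToMapSketch {
    // Assumed helper: skip null fields, stringify everything else.
    static void setParamSimple(HashMap<String, String> map, String key, Object value) {
        if (value != null) {
            map.put(key, String.valueOf(value));
        }
    }

    // Mirrors the documented signature, for three example fields.
    static void toMap(HashMap<String, String> map, String prefix,
                      String appName, Long appType, String dataEngine) {
        setParamSimple(map, prefix + "AppName", appName);
        setParamSimple(map, prefix + "AppType", appType);
        setParamSimple(map, prefix + "DataEngine", dataEngine);
    }

    public static void main(String[] args) {
        HashMap<String, String> params = new HashMap<>();
        toMap(params, "", "spark-demo", 1L, "engine-1");
        // Null fields are omitted; set fields appear under their plain names.
        System.out.println(params.get("AppType")); // prints "1"
    }
}
```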
public CreateSparkAppRequest()

public CreateSparkAppRequest(CreateSparkAppRequest source)

public String getAppName()
public void setAppName(String AppName)
AppName - Spark application name

public Long getAppType()
public void setAppType(Long AppType)
AppType - 1: Spark JAR application; 2: Spark streaming application

public String getDataEngine()
public void setDataEngine(String DataEngine)
DataEngine - The data engine executing the Spark job

public String getAppFile()
public void setAppFile(String AppFile)
AppFile - Execution entry of the Spark application

public Long getRoleArn()
public void setRoleArn(Long RoleArn)
RoleArn - Execution role ID of the Spark job

public String getAppDriverSize()
public void setAppDriverSize(String AppDriverSize)
AppDriverSize - Driver resource specification of the Spark job. Valid values: `small`, `medium`, `large`, `xlarge`.

public String getAppExecutorSize()
public void setAppExecutorSize(String AppExecutorSize)
AppExecutorSize - Executor resource specification of the Spark job. Valid values: `small`, `medium`, `large`, `xlarge`.

public Long getAppExecutorNums()
public void setAppExecutorNums(Long AppExecutorNums)
AppExecutorNums - Number of Spark job executors

public String getEni()
public void setEni(String Eni)
Eni - This field is deprecated. Use the `Datasource` field instead.

public String getIsLocal()
public void setIsLocal(String IsLocal)
IsLocal - Whether it is uploaded locally. Valid values: `cos`, `lakefs`.

public String getMainClass()
public void setMainClass(String MainClass)
MainClass - Main class of the Spark JAR job during execution

public String getAppConf()
public void setAppConf(String AppConf)
AppConf - Spark configurations separated by line breaks

public String getIsLocalJars()
public void setIsLocalJars(String IsLocalJars)
IsLocalJars - Whether it is uploaded locally. Valid values: `cos`, `lakefs`.

public String getAppJars()
public void setAppJars(String AppJars)
AppJars - Dependency JAR packages of the Spark JAR job, separated by commas

public String getIsLocalFiles()
public void setIsLocalFiles(String IsLocalFiles)
IsLocalFiles - Whether it is uploaded locally. Valid values: `cos`, `lakefs`.

public String getAppFiles()
public void setAppFiles(String AppFiles)
AppFiles - Dependency resources of the Spark job, separated by commas

public String getCmdArgs()
public void setCmdArgs(String CmdArgs)
CmdArgs - Command line parameters of the Spark job

public Long getMaxRetries()
public void setMaxRetries(Long MaxRetries)
MaxRetries - This parameter takes effect only for Spark flow tasks.

public String getDataSource()
public void setDataSource(String DataSource)
DataSource - Data source name

public String getIsLocalPythonFiles()
public void setIsLocalPythonFiles(String IsLocalPythonFiles)
IsLocalPythonFiles - PySpark: Dependency upload method. 1: cos; 2: lakefs (this method needs to be used in the console but cannot be called through APIs).

public String getAppPythonFiles()
public void setAppPythonFiles(String AppPythonFiles)
AppPythonFiles - PySpark: Python dependency, which can be in .py, .zip, or .egg format. Multiple files should be separated by commas.

public String getIsLocalArchives()
public void setIsLocalArchives(String IsLocalArchives)
IsLocalArchives - Archives: Dependency upload method. 1: cos; 2: lakefs (this method needs to be used in the console but cannot be called through APIs).

public String getAppArchives()
public void setAppArchives(String AppArchives)
AppArchives - Archives: Dependency resources

public String getSparkImage()
public void setSparkImage(String SparkImage)
SparkImage - The Spark image version.

public String getSparkImageVersion()
public void setSparkImageVersion(String SparkImageVersion)
SparkImageVersion - The Spark image version name.

public Long getAppExecutorMaxNumbers()
public void setAppExecutorMaxNumbers(Long AppExecutorMaxNumbers)
AppExecutorMaxNumbers - The specified executor count (max), which defaults to 1. This parameter applies if the "Dynamic" mode is selected. If the "Dynamic" mode is not selected, the executor count is equal to `AppExecutorNums`.

public String getSessionId()
public void setSessionId(String SessionId)
SessionId - The ID of the associated Data Lake Compute query script.

Copyright © 2023. All rights reserved.
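As a worked example of the value formats described above (line-break-separated `AppConf`, comma-separated `AppFiles`), here is how such strings can be assembled before being passed to the setters. The configuration keys and `cosn://` paths are purely illustrative:

```java
// Illustrative only: builds values in the shapes the AppConf and AppFiles
// parameters expect. The Spark properties and object-store paths below are
// made-up examples, not values required by the API.
public class ParamFormatting {
    public static void main(String[] args) {
        // AppConf: one Spark configuration per line.
        String appConf = String.join("\n",
            "spark.sql.shuffle.partitions=200",
            "spark.executor.memoryOverhead=1g");

        // AppFiles: dependency resources separated by commas.
        String appFiles = String.join(",",
            "cosn://bucket/deps/config.json",
            "cosn://bucket/deps/lookup.csv");

        System.out.println(appConf);
        System.out.println(appFiles);
    }
}
```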