public class ModifySparkAppBatchRequest extends AbstractModel
Fields inherited from class com.tencentcloudapi.common.AbstractModel: header, skipSign
Constructor | Description
---|---
ModifySparkAppBatchRequest() |
ModifySparkAppBatchRequest(ModifySparkAppBatchRequest source) | NOTE: Any ambiguous key set via .set("AnyKey", "value") will be a shallow copy, and any explicit key, i.e. Foo, set via .setFoo("value") will be a deep copy.
Modifier and Type | Method | Description
---|---|---
String | getAppDriverSize() | Get the driver size.
Long | getAppExecutorMaxNumbers() | Get the maximum executor count (in dynamic configuration scenarios).
Long | getAppExecutorNums() | Get the executor count.
String | getAppExecutorSize() | Get the executor size.
String | getDataEngine() | Get the engine ID.
Long | getIsInherit() | Get whether to inherit the task resource configuration from the cluster template.
String[] | getSparkAppId() | Get the list of the IDs of the Spark job tasks to be modified in batches.
void | setAppDriverSize(String AppDriverSize) | Set the driver size.
void | setAppExecutorMaxNumbers(Long AppExecutorMaxNumbers) | Set the maximum executor count (in dynamic configuration scenarios).
void | setAppExecutorNums(Long AppExecutorNums) | Set the executor count.
void | setAppExecutorSize(String AppExecutorSize) | Set the executor size.
void | setDataEngine(String DataEngine) | Set the engine ID.
void | setIsInherit(Long IsInherit) | Set whether to inherit the task resource configuration from the cluster template.
void | setSparkAppId(String[] SparkAppId) | Set the list of the IDs of the Spark job tasks to be modified in batches.
void | toMap(HashMap<String,String> map, String prefix) | Internal implementation; normal users should not use it.
Methods inherited from class com.tencentcloudapi.common.AbstractModel: any, fromJsonString, getBinaryParams, GetHeader, getMultipartRequestParams, getSkipSign, set, SetHeader, setParamArrayObj, setParamArraySimple, setParamObj, setParamSimple, setSkipSign, toJsonString
public ModifySparkAppBatchRequest()
public ModifySparkAppBatchRequest(ModifySparkAppBatchRequest source)
public String[] getSparkAppId()

public void setSparkAppId(String[] SparkAppId)
Parameters:
SparkAppId - The list of the IDs of the Spark job tasks to be modified in batches.

public String getDataEngine()

public void setDataEngine(String DataEngine)
Parameters:
DataEngine - The engine ID.

public String getAppDriverSize()

public void setAppDriverSize(String AppDriverSize)
Parameters:
AppDriverSize - The driver size.
Valid values for the standard resource type: `small`, `medium`, `large`, and `xlarge`.
Valid values for the memory resource type: `m.small`, `m.medium`, `m.large`, and `m.xlarge`.

public String getAppExecutorSize()

public void setAppExecutorSize(String AppExecutorSize)
Parameters:
AppExecutorSize - The executor size.
Valid values for the standard resource type: `small`, `medium`, `large`, and `xlarge`.
Valid values for the memory resource type: `m.small`, `m.medium`, `m.large`, and `m.xlarge`.

public Long getAppExecutorNums()

public void setAppExecutorNums(Long AppExecutorNums)
Parameters:
AppExecutorNums - The executor count. The minimum value is 1 and the maximum value is less than the cluster specification.

public Long getAppExecutorMaxNumbers()

public void setAppExecutorMaxNumbers(Long AppExecutorMaxNumbers)
Parameters:
AppExecutorMaxNumbers - The maximum executor count (in dynamic configuration scenarios). The minimum value is 1 and the maximum value is less than the cluster specification. If you set `ExecutorMaxNumbers` to a value smaller than that of `ExecutorNums`, the value of `ExecutorMaxNumbers` is automatically changed to that of `ExecutorNums`.

public Long getIsInherit()

public void setIsInherit(Long IsInherit)
Parameters:
IsInherit - Whether to inherit the task resource configuration from the cluster template. Valid values: `0` (default): No; `1`: Yes.
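To show how the documented setters fit together, here is a minimal sketch of configuring a batch-modification request. The stand-in class below only mirrors the documented setter and getter signatures so the example runs without the SDK; in real code you would import `ModifySparkAppBatchRequest` from the Tencent Cloud SDK instead, and the task IDs and engine ID used here are hypothetical placeholders.

```java
import java.util.Arrays;

// Stand-in with the same field names and accessor signatures as the
// documented class; replace with the real SDK import in production code.
class ModifySparkAppBatchRequest {
    private String[] SparkAppId;
    private String DataEngine;
    private String AppDriverSize;
    private String AppExecutorSize;
    private Long AppExecutorNums;
    private Long AppExecutorMaxNumbers;
    private Long IsInherit;

    public void setSparkAppId(String[] v) { this.SparkAppId = v; }
    public String[] getSparkAppId() { return SparkAppId; }
    public void setDataEngine(String v) { this.DataEngine = v; }
    public String getDataEngine() { return DataEngine; }
    public void setAppDriverSize(String v) { this.AppDriverSize = v; }
    public String getAppDriverSize() { return AppDriverSize; }
    public void setAppExecutorSize(String v) { this.AppExecutorSize = v; }
    public String getAppExecutorSize() { return AppExecutorSize; }
    public void setAppExecutorNums(Long v) { this.AppExecutorNums = v; }
    public Long getAppExecutorNums() { return AppExecutorNums; }
    public void setAppExecutorMaxNumbers(Long v) { this.AppExecutorMaxNumbers = v; }
    public Long getAppExecutorMaxNumbers() { return AppExecutorMaxNumbers; }
    public void setIsInherit(Long v) { this.IsInherit = v; }
    public Long getIsInherit() { return IsInherit; }
}

class ModifySparkAppBatchExample {
    static ModifySparkAppBatchRequest buildRequest() {
        ModifySparkAppBatchRequest req = new ModifySparkAppBatchRequest();
        // Hypothetical task IDs and engine ID, for illustration only.
        req.setSparkAppId(new String[]{"batch_task_1", "batch_task_2"});
        req.setDataEngine("my-engine-id");
        req.setAppDriverSize("small");      // standard resource type
        req.setAppExecutorSize("m.medium"); // memory resource type
        req.setAppExecutorNums(2L);
        // Must not be smaller than AppExecutorNums, per the docs above;
        // otherwise it is automatically raised to match it.
        req.setAppExecutorMaxNumbers(4L);
        req.setIsInherit(0L); // 0: do not inherit the cluster template config
        return req;
    }

    public static void main(String[] args) {
        ModifySparkAppBatchRequest req = buildRequest();
        System.out.println(Arrays.toString(req.getSparkAppId()));
        System.out.println(req.getAppExecutorNums() + "/" + req.getAppExecutorMaxNumbers());
    }
}
```

With the real SDK, the populated request would then be passed to the client's `ModifySparkAppBatch` call rather than printed.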