`@Generated(value="com.amazonaws:aws-java-sdk-code-generator")`
`public class DetectModerationLabelsResult extends AmazonWebServiceResult<ResponseMetadata> implements Serializable, Cloneable`
| Constructor and Description |
| --- |
| `DetectModerationLabelsResult()` |
| Modifier and Type | Method and Description |
| --- | --- |
| `DetectModerationLabelsResult` | `clone()` |
| `boolean` | `equals(Object obj)` |
| `HumanLoopActivationOutput` | `getHumanLoopActivationOutput()` Shows the results of the human in the loop evaluation. |
| `List<ModerationLabel>` | `getModerationLabels()` Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected. |
| `String` | `getModerationModelVersion()` Version number of the moderation detection model that was used to detect unsafe content. |
| `int` | `hashCode()` |
| `void` | `setHumanLoopActivationOutput(HumanLoopActivationOutput humanLoopActivationOutput)` Shows the results of the human in the loop evaluation. |
| `void` | `setModerationLabels(Collection<ModerationLabel> moderationLabels)` Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected. |
| `void` | `setModerationModelVersion(String moderationModelVersion)` Version number of the moderation detection model that was used to detect unsafe content. |
| `String` | `toString()` Returns a string representation of this object. |
| `DetectModerationLabelsResult` | `withHumanLoopActivationOutput(HumanLoopActivationOutput humanLoopActivationOutput)` Shows the results of the human in the loop evaluation. |
| `DetectModerationLabelsResult` | `withModerationLabels(Collection<ModerationLabel> moderationLabels)` Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected. |
| `DetectModerationLabelsResult` | `withModerationLabels(ModerationLabel... moderationLabels)` Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected. |
| `DetectModerationLabelsResult` | `withModerationModelVersion(String moderationModelVersion)` Version number of the moderation detection model that was used to detect unsafe content. |
Methods inherited from class `AmazonWebServiceResult`: `getSdkHttpMetadata`, `getSdkResponseMetadata`, `setSdkHttpMetadata`, `setSdkResponseMetadata`
`public List<ModerationLabel> getModerationLabels()`

Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected.

`public void setModerationLabels(Collection<ModerationLabel> moderationLabels)`

Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected.

Parameters:
`moderationLabels` - Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected.

`public DetectModerationLabelsResult withModerationLabels(ModerationLabel... moderationLabels)`

Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected.

NOTE: This method appends the values to the existing list (if any). Use `setModerationLabels(java.util.Collection)` or `withModerationLabels(java.util.Collection)` if you want to override the existing values.

Parameters:
`moderationLabels` - Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected.

`public DetectModerationLabelsResult withModerationLabels(Collection<ModerationLabel> moderationLabels)`

Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected.

Parameters:
`moderationLabels` - Array of detected Moderation labels and the time, in milliseconds from the start of the video, they were detected.

`public void setModerationModelVersion(String moderationModelVersion)`

Version number of the moderation detection model that was used to detect unsafe content.

Parameters:
`moderationModelVersion` - Version number of the moderation detection model that was used to detect unsafe content.

`public String getModerationModelVersion()`

Version number of the moderation detection model that was used to detect unsafe content.

`public DetectModerationLabelsResult withModerationModelVersion(String moderationModelVersion)`

Version number of the moderation detection model that was used to detect unsafe content.

Parameters:
`moderationModelVersion` - Version number of the moderation detection model that was used to detect unsafe content.

`public void setHumanLoopActivationOutput(HumanLoopActivationOutput humanLoopActivationOutput)`

Shows the results of the human in the loop evaluation.

Parameters:
`humanLoopActivationOutput` - Shows the results of the human in the loop evaluation.

`public HumanLoopActivationOutput getHumanLoopActivationOutput()`

Shows the results of the human in the loop evaluation.

`public DetectModerationLabelsResult withHumanLoopActivationOutput(HumanLoopActivationOutput humanLoopActivationOutput)`

Shows the results of the human in the loop evaluation.

Parameters:
`humanLoopActivationOutput` - Shows the results of the human in the loop evaluation.

`public String toString()`

Returns a string representation of this object.

Overrides:
`toString` in class `Object`

See Also:
`Object.toString()`

`public DetectModerationLabelsResult clone()`
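As a sketch of how this result class is typically consumed (assuming the AWS SDK for Java 1.x Rekognition module is on the classpath and credentials are configured; the bucket and object names below are hypothetical placeholders):

```java
import com.amazonaws.services.rekognition.AmazonRekognition;
import com.amazonaws.services.rekognition.AmazonRekognitionClientBuilder;
import com.amazonaws.services.rekognition.model.DetectModerationLabelsRequest;
import com.amazonaws.services.rekognition.model.DetectModerationLabelsResult;
import com.amazonaws.services.rekognition.model.Image;
import com.amazonaws.services.rekognition.model.ModerationLabel;
import com.amazonaws.services.rekognition.model.S3Object;

public class DetectModerationLabelsExample {
    public static void main(String[] args) {
        // Uses the default credential/region provider chain.
        AmazonRekognition rekognition = AmazonRekognitionClientBuilder.defaultClient();

        // "my-bucket" and "photo.jpg" are hypothetical; substitute your own S3 object.
        DetectModerationLabelsRequest request = new DetectModerationLabelsRequest()
                .withImage(new Image().withS3Object(
                        new S3Object().withBucket("my-bucket").withName("photo.jpg")))
                .withMinConfidence(60F);

        DetectModerationLabelsResult result = rekognition.detectModerationLabels(request);

        // Iterate the detected labels and print name, parent category, and confidence.
        for (ModerationLabel label : result.getModerationLabels()) {
            System.out.printf("%s (parent: %s), confidence %.1f%n",
                    label.getName(), label.getParentName(), label.getConfidence());
        }
        System.out.println("Model version: " + result.getModerationModelVersion());
    }
}
```

Note the fluent-setter semantics described above: `withModerationLabels(ModerationLabel...)` appends to any existing list on the result, while `setModerationLabels(Collection)` replaces it outright.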