public class CollectAllelicCountsSpark extends LocusWalkerSpark

The Spark version of CollectAllelicCounts. This behaves the same as CollectAllelicCounts, except that it supports Spark.

Nested classes/interfaces inherited from class GATKSparkTool: GATKSparkTool.ReadInputMergingPolicy
Fields inherited from class LocusWalkerSpark: maxDepthPerSample, readShardSize, shuffle

Fields inherited from class GATKSparkTool: addOutputVCFCommandLine, BAM_PARTITION_SIZE_LONG_NAME, bamPartitionSplitSize, CREATE_OUTPUT_BAM_SPLITTING_INDEX_LONG_NAME, createOutputBamIndex, createOutputBamSplittingIndex, createOutputVariantIndex, features, intervalArgumentCollection, NUM_REDUCERS_LONG_NAME, numReducers, OUTPUT_SHARD_DIR_LONG_NAME, readArguments, referenceArguments, sequenceDictionaryValidationArguments, SHARDED_OUTPUT_LONG_NAME, shardedOutput, shardedPartsDir, USE_NIO, useNio

Fields inherited from class SparkCommandLineProgram: programName, SPARK_PROGRAM_NAME_LONG_NAME, sparkArgs

Fields inherited from class CommandLineProgram: GATK_CONFIG_FILE, logger, NIO_MAX_REOPENS, NIO_PROJECT_FOR_REQUESTER_PAYS, QUIET, specialArgumentsCollection, tmpDir, useJdkDeflater, useJdkInflater, VERBOSITY
Constructor and Description |
---|
CollectAllelicCountsSpark() |
Modifier and Type | Method and Description |
---|---|
boolean | emitEmptyLoci() — Does this tool emit information for uncovered loci? Tools that do should override to return true. |
java.util.List<ReadFilter> | getDefaultReadFilters() — Returns the default list of ReadFilters that are used for this tool. |
protected void | processAlignments(org.apache.spark.api.java.JavaRDD<LocusWalkerContext> rdd, org.apache.spark.api.java.JavaSparkContext ctx) — Process the alignments and write output. |
boolean | requiresIntervals() — Does this tool require intervals? Tools that do should override to return true. |
boolean | requiresReference() — Does this tool require reference data? Tools that do should override to return true. |
Methods inherited from class LocusWalkerSpark: defaultMaxDepthPerSample, getAlignments, getDownsamplingInfo, requiresReads, runTool

Methods inherited from class GATKSparkTool: addReferenceFilesForSpark, addVCFsForSpark, editIntervals, getBestAvailableSequenceDictionary, getDefaultToolVCFHeaderLines, getDefaultVariantAnnotationGroups, getDefaultVariantAnnotations, getGatkReadJavaRDD, getHeaderForReads, getIntervals, getPluginDescriptors, getReadInputMergingPolicy, getReads, getReadSourceHeaderMap, getReadSourceName, getRecommendedNumReducers, getReference, getReferenceSequenceDictionary, getReferenceWindowFunction, getSequenceDictionaryValidationArgumentCollection, getTargetPartitionSize, getUnfilteredReads, hasReads, hasReference, hasUserSuppliedIntervals, makeReadFilter, makeVariantAnnotations, runPipeline, useVariantAnnotations, validateSequenceDictionaries, writeReads

Methods inherited from class SparkCommandLineProgram: afterPipeline, doWork, getProgramName

Methods inherited from class CommandLineProgram: customCommandLineValidation, getCommandLine, getCommandLineParser, getDefaultHeaders, getMetricsFile, getSupportInformation, getToolkitName, getToolkitShortName, getToolStatusWarning, getUsage, getVersion, instanceMain, instanceMainPostParseArgs, isBetaFeature, isExperimentalFeature, onShutdown, onStartup, parseArgs, printLibraryVersions, printSettings, printStartupMessage, runTool, setDefaultHeaders, warnOnToolStatus
public boolean emitEmptyLoci()

Description copied from class: LocusWalkerSpark
Does this tool emit information for uncovered loci? Tools that do should override to return true.

NOTE: Typically, this should only be used when intervals are specified.
NOTE: If MappedReadFilter is removed, then emitting empty loci will fail.
NOTE: If there is no available sequence dictionary and this is set to true, there will be a failure. Please consider requiring reads and/or references for all tools that wish to set this to true.

Overrides: emitEmptyLoci in class LocusWalkerSpark
Returns: true if this tool requires uncovered loci information to be emitted, false otherwise

public boolean requiresReference()

Description copied from class: GATKSparkTool
Does this tool require reference data? Tools that do should override to return true.

Overrides: requiresReference in class GATKSparkTool

public boolean requiresIntervals()

Description copied from class: GATKSparkTool
Does this tool require intervals? Tools that do should override to return true.

Overrides: requiresIntervals in class GATKSparkTool
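Taken together, these three boolean overrides follow one pattern: the base class supplies a conservative default and a concrete tool opts in. The sketch below is a minimal, self-contained model of that pattern; `LocusToolBase` and `AllelicCountsLikeTool` are hypothetical stand-ins for the real GATK classes, not the actual implementation.

```java
// Hypothetical stand-in for LocusWalkerSpark/GATKSparkTool defaults:
// by default a tool emits nothing at uncovered loci and requires
// neither intervals nor a reference.
abstract class LocusToolBase {
    public boolean emitEmptyLoci()      { return false; }
    public boolean requiresIntervals()  { return false; }
    public boolean requiresReference()  { return false; }
}

// A CollectAllelicCountsSpark-style tool opts in to all three: it needs
// a reference and intervals, and reports counts even at uncovered sites
// inside those intervals.
class AllelicCountsLikeTool extends LocusToolBase {
    @Override public boolean emitEmptyLoci()     { return true; }
    @Override public boolean requiresIntervals() { return true; }
    @Override public boolean requiresReference() { return true; }
}
```

Note how `emitEmptyLoci() == true` pairs naturally with `requiresIntervals() == true`, matching the note above that empty loci should typically only be emitted when intervals are specified.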
public java.util.List<ReadFilter> getDefaultReadFilters()

Description copied from class: GATKSparkTool
Returns the default list of ReadFilters that are used for this tool. The default implementation uses a WellformedReadFilter filter with all default options. Subclasses can override to provide alternative filters.

Note: this method is called before command line parsing begins, and thus before a SAMFileHeader is available through GATKSparkTool.getHeaderForReads(). The actual SAMFileHeader is propagated to the read filters by GATKSparkTool.makeReadFilter() after the filters have been merged with command line arguments.

Overrides: getDefaultReadFilters in class GATKSparkTool
protected void processAlignments(org.apache.spark.api.java.JavaRDD<LocusWalkerContext> rdd, org.apache.spark.api.java.JavaSparkContext ctx)

Description copied from class: LocusWalkerSpark
Process the alignments and write output.

Specified by: processAlignments in class LocusWalkerSpark
Parameters:
rdd - a distributed collection of LocusWalkerContext
ctx - our Spark context
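Spark itself is too heavy for a short example, so the following plain-Java sketch only models the per-locus work that processAlignments would distribute over the JavaRDD<LocusWalkerContext>: tallying reference versus alternate bases at each covered locus. The `Observation` record and the counting logic are illustrative assumptions, not the real GATK implementation.

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Plain-Java model of per-locus allelic counting. Each Observation is a
// stand-in for one pileup base that a LocusWalkerContext would expose.
class AllelicCountSketch {

    /** One base observed at a locus (hypothetical simplification). */
    record Observation(String contig, int position, char base) {}

    /** REF/ALT tallies for one locus. */
    static final class Counts {
        int refCount, altCount;
    }

    /**
     * Tally ref vs. alt bases per locus, given the reference base at each
     * site. In the real tool this aggregation runs distributed on Spark.
     */
    static Map<String, Counts> collectAllelicCounts(List<Observation> pileup,
                                                    Map<String, Character> refBases) {
        Map<String, Counts> counts = new TreeMap<>();
        for (Observation obs : pileup) {
            String locus = obs.contig() + ":" + obs.position();
            Counts c = counts.computeIfAbsent(locus, k -> new Counts());
            char ref = refBases.getOrDefault(locus, 'N');
            if (obs.base() == ref) c.refCount++; else c.altCount++;
        }
        return counts;
    }
}
```

In Spark terms, this loop corresponds to a map from each context to its locus key followed by an aggregation per key; the rdd parameter supplies the contexts and ctx the cluster handle.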