org.apache.accumulo.core.file.map
Class MySequenceFile.Sorter

java.lang.Object
    org.apache.accumulo.core.file.map.MySequenceFile.Sorter

public static class MySequenceFile.Sorter

Sorts key/value pairs in a sequence-format file.

For best performance, applications should make sure that the
Writable.readFields(DataInput) implementation of their keys is very
efficient. In particular, it should avoid allocating memory.
Nested Class Summary

static interface MySequenceFile.Sorter.RawKeyValueIterator
    The interface to iterate over raw keys/values of SequenceFiles.

class MySequenceFile.Sorter.SegmentDescriptor
    This class defines a merge segment.
Constructor Summary

MySequenceFile.Sorter(org.apache.hadoop.fs.FileSystem fs,
                      Class<? extends org.apache.hadoop.io.WritableComparable> keyClass,
                      Class valClass,
                      org.apache.hadoop.conf.Configuration conf)
    Sort and merge files containing the named classes.

MySequenceFile.Sorter(org.apache.hadoop.fs.FileSystem fs,
                      org.apache.hadoop.io.RawComparator comparator,
                      Class keyClass,
                      Class valClass,
                      org.apache.hadoop.conf.Configuration conf)
    Sort and merge using an arbitrary RawComparator.
Method Summary

MySequenceFile.Writer cloneFileAttributes(org.apache.hadoop.fs.Path inputFile,
                                          org.apache.hadoop.fs.Path outputFile,
                                          org.apache.hadoop.util.Progressable prog)
    Clones the attributes (like compression) of the input file and creates a
    corresponding Writer.

int getFactor()
    Get the number of streams to merge at once.

int getMemory()
    Get the total amount of buffer memory, in bytes.

MySequenceFile.Sorter.RawKeyValueIterator merge(List<MySequenceFile.Sorter.SegmentDescriptor> segments,
                                                org.apache.hadoop.fs.Path tmpDir)
    Merges the list of segments of type SegmentDescriptor.

MySequenceFile.Sorter.RawKeyValueIterator merge(org.apache.hadoop.fs.Path[] inNames,
                                                boolean deleteInputs,
                                                int factor,
                                                org.apache.hadoop.fs.Path tmpDir)
    Merges the contents of files passed in Path[].

MySequenceFile.Sorter.RawKeyValueIterator merge(org.apache.hadoop.fs.Path[] inNames,
                                                boolean deleteInputs,
                                                org.apache.hadoop.fs.Path tmpDir)
    Merges the contents of files passed in Path[] using a max factor value
    that is already set.

void merge(org.apache.hadoop.fs.Path[] inFiles,
           org.apache.hadoop.fs.Path outFile)
    Merge the provided files.

MySequenceFile.Sorter.RawKeyValueIterator merge(org.apache.hadoop.fs.Path[] inNames,
                                                org.apache.hadoop.fs.Path tempDir,
                                                boolean deleteInputs)
    Merges the contents of files passed in Path[].

void setFactor(int factor)
    Set the number of streams to merge at once.

void setMemory(int memory)
    Set the total amount of buffer memory, in bytes.

void setProgressable(org.apache.hadoop.util.Progressable progressable)
    Set the progressable object in order to report progress.

void sort(org.apache.hadoop.fs.Path[] inFiles,
          org.apache.hadoop.fs.Path outFile,
          boolean deleteInput)
    Perform a file sort from a set of input files into an output file.

void sort(org.apache.hadoop.fs.Path inFile,
          org.apache.hadoop.fs.Path outFile)
    The backwards compatible interface to sort.

MySequenceFile.Sorter.RawKeyValueIterator sortAndIterate(org.apache.hadoop.fs.Path[] inFiles,
                                                         org.apache.hadoop.fs.Path tempDir,
                                                         boolean deleteInput)
    Perform a file sort from a set of input files and return an iterator.

void writeFile(MySequenceFile.Sorter.RawKeyValueIterator records,
               MySequenceFile.Writer writer)
    Writes records from RawKeyValueIterator into a file represented by the
    passed writer.
Methods inherited from class java.lang.Object

clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Constructor Detail

public MySequenceFile.Sorter(org.apache.hadoop.fs.FileSystem fs,
                             Class<? extends org.apache.hadoop.io.WritableComparable> keyClass,
                             Class valClass,
                             org.apache.hadoop.conf.Configuration conf)
    Sort and merge files containing the named classes.

public MySequenceFile.Sorter(org.apache.hadoop.fs.FileSystem fs,
                             org.apache.hadoop.io.RawComparator comparator,
                             Class keyClass,
                             Class valClass,
                             org.apache.hadoop.conf.Configuration conf)
    Sort and merge using an arbitrary RawComparator.
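The second constructor accepts a RawComparator, which orders keys by their serialized bytes without deserializing them. As a rough illustration of that idea (this sketch is not part of MySequenceFile; the class name `RawCompareSketch` is invented for the example), the comparison contract matches `RawComparator.compare(b1, s1, l1, b2, s2, l2)`: an unsigned, lexicographic comparison of two byte ranges.

```java
// Illustrative sketch only: what a "raw" comparator buys you -- ordering
// keys by their serialized bytes, with no object allocation or readFields().
public class RawCompareSketch {

    // Unsigned, lexicographic comparison of two byte ranges, the same shape
    // as org.apache.hadoop.io.RawComparator.compare(b1, s1, l1, b2, s2, l2).
    public static int compareBytes(byte[] b1, int s1, int l1,
                                   byte[] b2, int s2, int l2) {
        int n = Math.min(l1, l2);
        for (int i = 0; i < n; i++) {
            int a = b1[s1 + i] & 0xff;   // treat bytes as unsigned
            int b = b2[s2 + i] & 0xff;
            if (a != b) return a - b;
        }
        return l1 - l2;                  // a shorter prefix sorts first
    }

    public static void main(String[] args) {
        byte[] x = "apple".getBytes();
        byte[] y = "apples".getBytes();
        // "apple" sorts before "apples": equal prefix, shorter length
        System.out.println(compareBytes(x, 0, x.length, y, 0, y.length) < 0);
    }
}
```

Because the comparison never materializes key objects, it sidesteps exactly the allocation cost the class comment above warns about.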
Method Detail
public void setFactor(int factor)
    Set the number of streams to merge at once.

public int getFactor()
    Get the number of streams to merge at once.

public void setMemory(int memory)
    Set the total amount of buffer memory, in bytes.

public int getMemory()
    Get the total amount of buffer memory, in bytes.

public void setProgressable(org.apache.hadoop.util.Progressable progressable)
    Set the progressable object in order to report progress.
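The factor is the merge fan-in: with n sorted runs and a factor of f, a multi-pass merge needs roughly ceil(log_f n) passes over the data, so a larger factor trades open file handles for fewer passes. A back-of-envelope sketch of that arithmetic (this is not the Sorter's internal code; `MergeFactorSketch` and `passes` are names invented for the example):

```java
public class MergeFactorSketch {
    // Number of merge passes needed to reduce `runs` sorted runs to one,
    // merging at most `factor` runs at a time: ceil(log_factor(runs)).
    public static int passes(int runs, int factor) {
        int p = 0;
        while (runs > 1) {
            // one pass replaces each group of up to `factor` runs with one run
            runs = (runs + factor - 1) / factor;
            p++;
        }
        return p;
    }

    public static void main(String[] args) {
        System.out.println(passes(100, 10));   // 2 passes: 100 -> 10 -> 1
        System.out.println(passes(100, 100));  // 1 pass: 100 -> 1
    }
}
```

This is why raising the factor with setFactor can reduce total I/O when many spill files are produced, at the cost of more streams held open at once.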
public void sort(org.apache.hadoop.fs.Path[] inFiles,
                 org.apache.hadoop.fs.Path outFile,
                 boolean deleteInput)
          throws IOException
    Perform a file sort from a set of input files into an output file.
    Parameters:
        inFiles - the files to be sorted
        outFile - the sorted output file
        deleteInput - should the input files be deleted as they are read?
    Throws:
        IOException
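A file sort of this kind generally follows the classic external-sort pattern: fill a buffer up to the memory limit, sort it, spill it as a sorted run, then merge the runs. A minimal sketch of that pattern in plain Java (assumption: in-memory lists stand in for the on-disk spill files, and a fan-in of two stands in for the configurable merge factor; `ExternalSortSketch` is a name invented for the example):

```java
import java.util.*;

public class ExternalSortSketch {
    // Sketch of the sort-then-merge pattern: sort buffer-sized chunks,
    // "spill" each as a sorted run, then merge runs until one remains.
    public static List<String> externalSort(List<String> input, int memoryLimit) {
        Deque<List<String>> runs = new ArrayDeque<>();
        for (int i = 0; i < input.size(); i += memoryLimit) {
            List<String> run = new ArrayList<>(
                input.subList(i, Math.min(i + memoryLimit, input.size())));
            Collections.sort(run);   // in-memory sort of one buffer-full
            runs.add(run);           // "spill" the sorted run
        }
        // Repeatedly merge two runs until one remains (fan-in of 2 here;
        // the real Sorter merges up to getFactor() runs at a time).
        while (runs.size() > 1)
            runs.add(mergeTwo(runs.poll(), runs.poll()));
        return runs.isEmpty() ? new ArrayList<>() : runs.poll();
    }

    static List<String> mergeTwo(List<String> a, List<String> b) {
        List<String> out = new ArrayList<>();
        int i = 0, j = 0;
        while (i < a.size() && j < b.size())
            out.add(a.get(i).compareTo(b.get(j)) <= 0 ? a.get(i++) : b.get(j++));
        while (i < a.size()) out.add(a.get(i++));
        while (j < b.size()) out.add(b.get(j++));
        return out;
    }

    public static void main(String[] args) {
        System.out.println(externalSort(
            Arrays.asList("pear", "apple", "fig", "date", "cherry"), 2));
        // [apple, cherry, date, fig, pear]
    }
}
```

The memoryLimit parameter plays the role of setMemory's buffer size: smaller buffers mean more runs, and therefore more merging.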
public MySequenceFile.Sorter.RawKeyValueIterator sortAndIterate(org.apache.hadoop.fs.Path[] inFiles,
                                                                org.apache.hadoop.fs.Path tempDir,
                                                                boolean deleteInput)
          throws IOException
    Perform a file sort from a set of input files and return an iterator.
    Parameters:
        inFiles - the files to be sorted
        tempDir - the directory where temp files are created during sort
        deleteInput - should the input files be deleted as they are read?
    Throws:
        IOException
public void sort(org.apache.hadoop.fs.Path inFile,
                 org.apache.hadoop.fs.Path outFile)
          throws IOException
    The backwards compatible interface to sort.
    Parameters:
        inFile - the input file to sort
        outFile - the sorted output file
    Throws:
        IOException
public MySequenceFile.Sorter.RawKeyValueIterator merge(List<MySequenceFile.Sorter.SegmentDescriptor> segments,
                                                       org.apache.hadoop.fs.Path tmpDir)
          throws IOException
    Merges the list of segments of type SegmentDescriptor.
    Parameters:
        segments - the list of SegmentDescriptors
        tmpDir - the directory to write temporary files into
    Throws:
        IOException
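Conceptually, the iterator returned by the merge methods is a single sorted stream over several already-sorted segments, produced lazily with a min-heap over the segments' current heads. A sketch of that shape in plain Java (assumption: iterators over lists stand in for RawKeyValueIterators over segments; `KWayMergeSketch` and `PeekingIt` are names invented for the example):

```java
import java.util.*;

public class KWayMergeSketch {
    // Lazily merge k sorted segments into one sorted stream using a
    // priority queue keyed on each segment's current smallest element.
    public static Iterator<String> merge(List<List<String>> segments) {
        PriorityQueue<PeekingIt> heap = new PriorityQueue<>(
            Comparator.comparing((PeekingIt p) -> p.head));
        for (List<String> seg : segments) {
            Iterator<String> it = seg.iterator();
            if (it.hasNext()) heap.add(new PeekingIt(it));
        }
        return new Iterator<String>() {
            public boolean hasNext() { return !heap.isEmpty(); }
            public String next() {
                PeekingIt p = heap.poll();   // segment with the smallest head
                String out = p.head;
                if (p.it.hasNext()) { p.head = p.it.next(); heap.add(p); }
                return out;
            }
        };
    }

    // A segment's iterator plus its buffered next element, so the heap
    // can compare segments without consuming them.
    static class PeekingIt {
        final Iterator<String> it;
        String head;
        PeekingIt(Iterator<String> it) { this.it = it; this.head = it.next(); }
    }

    public static void main(String[] args) {
        Iterator<String> m = merge(Arrays.asList(
            Arrays.asList("a", "d"), Arrays.asList("b", "e"), Arrays.asList("c")));
        StringBuilder sb = new StringBuilder();
        while (m.hasNext()) sb.append(m.next());
        System.out.println(sb); // abcde
    }
}
```

Because elements are pulled on demand, nothing is materialized until the caller advances the iterator, which is what makes it practical to feed the result straight into writeFile.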
public MySequenceFile.Sorter.RawKeyValueIterator merge(org.apache.hadoop.fs.Path[] inNames,
                                                       boolean deleteInputs,
                                                       org.apache.hadoop.fs.Path tmpDir)
          throws IOException
    Merges the contents of files passed in Path[] using a max factor value
    that is already set.
    Parameters:
        inNames - the array of path names
        deleteInputs - true if the input files should be deleted when unnecessary
        tmpDir - the directory to write temporary files into
    Throws:
        IOException
public MySequenceFile.Sorter.RawKeyValueIterator merge(org.apache.hadoop.fs.Path[] inNames,
                                                       boolean deleteInputs,
                                                       int factor,
                                                       org.apache.hadoop.fs.Path tmpDir)
          throws IOException
    Merges the contents of files passed in Path[].
    Parameters:
        inNames - the array of path names
        deleteInputs - true if the input files should be deleted when unnecessary
        factor - the factor that will be used as the maximum merge fan-in
        tmpDir - the directory to write temporary files into
    Throws:
        IOException
public MySequenceFile.Sorter.RawKeyValueIterator merge(org.apache.hadoop.fs.Path[] inNames,
                                                       org.apache.hadoop.fs.Path tempDir,
                                                       boolean deleteInputs)
          throws IOException
    Merges the contents of files passed in Path[].
    Parameters:
        inNames - the array of path names
        tempDir - the directory for creating temp files during merge
        deleteInputs - true if the input files should be deleted when unnecessary
    Throws:
        IOException
public MySequenceFile.Writer cloneFileAttributes(org.apache.hadoop.fs.Path inputFile,
                                                 org.apache.hadoop.fs.Path outputFile,
                                                 org.apache.hadoop.util.Progressable prog)
          throws IOException
    Clones the attributes (like compression) of the input file and creates a
    corresponding Writer.
    Parameters:
        inputFile - the path of the input file whose attributes should be cloned
        outputFile - the path of the output file
        prog - the Progressable to report status during the file write
    Throws:
        IOException
public void writeFile(MySequenceFile.Sorter.RawKeyValueIterator records,
                      MySequenceFile.Writer writer)
          throws IOException
    Writes records from RawKeyValueIterator into a file represented by the
    passed writer.
    Parameters:
        records - the RawKeyValueIterator
        writer - the Writer created earlier
    Throws:
        IOException
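The contract here is a simple drain loop: copy every record the iterator yields to the writer, in iterator order. A minimal sketch of that pattern (assumption: a Map.Entry iterator and a StringBuilder stand in for the RawKeyValueIterator and the MySequenceFile.Writer; `WriteFileSketch` and `drain` are names invented for the example):

```java
import java.util.*;

public class WriteFileSketch {
    // Drain every key/value the iterator yields into the "writer",
    // preserving the iterator's order.
    public static String drain(Iterator<Map.Entry<String, String>> records) {
        StringBuilder writer = new StringBuilder();
        while (records.hasNext()) {
            Map.Entry<String, String> e = records.next();
            writer.append(e.getKey()).append('=').append(e.getValue()).append('\n');
        }
        return writer.toString();
    }

    public static void main(String[] args) {
        // A TreeMap iterates in sorted key order, like a merge result.
        Map<String, String> sorted = new TreeMap<>();
        sorted.put("b", "2");
        sorted.put("a", "1");
        System.out.print(drain(sorted.entrySet().iterator())); // a=1 then b=2
    }
}
```

Pairing this with the merge methods above is the typical flow: merge returns the sorted iterator, cloneFileAttributes supplies a matching Writer, and writeFile drains one into the other.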
public void merge(org.apache.hadoop.fs.Path[] inFiles,
                  org.apache.hadoop.fs.Path outFile)
          throws IOException
    Merge the provided files.
    Parameters:
        inFiles - the array of input path names
        outFile - the final output file
    Throws:
        IOException