com.twitter.elephantbird.mapreduce.input
Class RCFileThriftInputFormat.ThriftReader
java.lang.Object
  org.apache.hadoop.mapreduce.RecordReader<K,V>
    com.twitter.elephantbird.mapreduce.input.FilterRecordReader<org.apache.hadoop.io.LongWritable,org.apache.hadoop.io.Writable>
      com.twitter.elephantbird.mapreduce.input.RCFileThriftInputFormat.ThriftReader
- All Implemented Interfaces:
- Closeable
- Direct Known Subclasses:
- RCFileThriftTupleInputFormat.TupleReader
- Enclosing class:
- RCFileThriftInputFormat
public class RCFileThriftInputFormat.ThriftReader
extends FilterRecordReader<org.apache.hadoop.io.LongWritable,org.apache.hadoop.io.Writable>
Constructor Summary

RCFileThriftInputFormat.ThriftReader(org.apache.hadoop.mapreduce.RecordReader reader)
    The reader is expected to be a RecordReader<LongWritable, BytesRefArrayWritable>.
Method Summary

org.apache.hadoop.hive.serde2.columnar.BytesRefArrayWritable getCurrentBytesRefArrayWritable()
    Returns super.getCurrentValue().

org.apache.thrift.TBase<?,?> getCurrentThriftValue()
    Builds a Thrift object from the raw bytes returned by the RCFile reader.

org.apache.hadoop.io.Writable getCurrentValue()

void initialize(org.apache.hadoop.mapreduce.InputSplit split, org.apache.hadoop.mapreduce.TaskAttemptContext ctx)

boolean isReadingUnknonwsColumn()
    Valid only after initialize() has been called.
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Field Detail

tDesc
protected TStructDescriptor tDesc

readUnknownsColumn
protected boolean readUnknownsColumn

knownRequiredFields
protected List<TStructDescriptor.Field> knownRequiredFields

columnsBeingRead
protected ArrayList<Integer> columnsBeingRead

memTransport
protected org.apache.thrift.transport.TMemoryInputTransport memTransport

tProto
protected org.apache.thrift.protocol.TBinaryProtocol tProto

thriftWritable
protected ThriftWritable<org.apache.thrift.TBase<?,?>> thriftWritable
Constructor Detail

RCFileThriftInputFormat.ThriftReader
public RCFileThriftInputFormat.ThriftReader(org.apache.hadoop.mapreduce.RecordReader reader)
- The reader is expected to be a RecordReader<LongWritable, BytesRefArrayWritable>.
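As a hedged sketch of what this contract looks like in use (the helper method, parameter names, and processing are illustrative assumptions; only the ThriftReader constructor, initialize(), and isReadingUnknonwsColumn() come from this page):

```java
// Sketch, assuming the framework hands us a base reader over an RCFile split.
// baseReader must be a RecordReader<LongWritable, BytesRefArrayWritable>.
void openThriftReader(org.apache.hadoop.mapreduce.RecordReader baseReader,
                      org.apache.hadoop.mapreduce.InputSplit split,
                      org.apache.hadoop.mapreduce.TaskAttemptContext ctx) throws Exception {
  RCFileThriftInputFormat.ThriftReader reader =
      new RCFileThriftInputFormat.ThriftReader(baseReader);
  reader.initialize(split, ctx);  // must run before isReadingUnknonwsColumn()
  boolean unknowns = reader.isReadingUnknonwsColumn();
}
```

In practice this wrapping is normally done by the enclosing RCFileThriftInputFormat rather than by user code.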
Method Detail

isReadingUnknonwsColumn
public boolean isReadingUnknonwsColumn()
- Valid only after initialize() has been called.
initialize
public void initialize(org.apache.hadoop.mapreduce.InputSplit split,
org.apache.hadoop.mapreduce.TaskAttemptContext ctx)
throws IOException,
InterruptedException
- Overrides:
initialize in class FilterRecordReader<org.apache.hadoop.io.LongWritable,org.apache.hadoop.io.Writable>
- Throws:
IOException
InterruptedException
getCurrentValue
public org.apache.hadoop.io.Writable getCurrentValue()
throws IOException,
InterruptedException
- Overrides:
getCurrentValue in class FilterRecordReader<org.apache.hadoop.io.LongWritable,org.apache.hadoop.io.Writable>
- Throws:
IOException
InterruptedException
getCurrentBytesRefArrayWritable
public org.apache.hadoop.hive.serde2.columnar.BytesRefArrayWritable getCurrentBytesRefArrayWritable()
throws IOException,
InterruptedException
- Returns:
super.getCurrentValue()
- Throws:
IOException
InterruptedException
getCurrentThriftValue
public org.apache.thrift.TBase<?,?> getCurrentThriftValue()
throws IOException,
InterruptedException,
org.apache.thrift.TException
- Builds a Thrift object from the raw bytes returned by the RCFile reader.
- Throws:
org.apache.thrift.TException
IOException
InterruptedException
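Putting the accessors together, a read loop might look like the following sketch (the helper method and the null guard are assumptions, not part of this API; nextKeyValue() and close() are inherited from RecordReader):

```java
// Sketch: iterate a split and deserialize each row into a Thrift object.
void drain(RCFileThriftInputFormat.ThriftReader reader) throws Exception {
  while (reader.nextKeyValue()) {                        // inherited from RecordReader
    org.apache.thrift.TBase<?, ?> row = reader.getCurrentThriftValue();
    if (row != null) {                                   // defensive guard (assumption)
      System.out.println(row);                           // replace with real processing
    }
  }
  reader.close();                                        // RecordReader implements Closeable
}
```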
Copyright © 2015 Twitter. All Rights Reserved.