Modifier and Type | Method and Description
---|---
HiveStoragePredicateHandler.DecomposedPredicate | AccumuloStorageHandler.decomposePredicate(org.apache.hadoop.mapred.JobConf conf, Deserializer deserializer, ExprNodeDesc desc)
Modifier and Type | Class and Description
---|---
class | AccumuloSerDe: Deserialization from Accumulo to LazyAccumuloRow for Hive.
Modifier and Type | Class and Description
---|---
class | MultiDelimitSerDe: This SerDe allows the user to use multiple characters as the field delimiter for a table.
class | TypedBytesSerDe: Uses typed bytes to serialize and deserialize.
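The core idea behind a multi-character field delimiter can be sketched in plain Java; note that `String.split` would treat the delimiter as a regex, so a literal scan is used instead. The class and method names below are illustrative, not part of the Hive API.

```java
import java.util.ArrayList;
import java.util.List;

public class MultiDelimSplit {
    // Split 'row' on the literal multi-character delimiter 'delim'.
    static List<String> split(String row, String delim) {
        List<String> fields = new ArrayList<>();
        int start = 0, idx;
        while ((idx = row.indexOf(delim, start)) >= 0) {
            fields.add(row.substring(start, idx));
            start = idx + delim.length();
        }
        fields.add(row.substring(start)); // last field (or whole row if no delimiter)
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(split("1||alice||30", "||")); // [1, alice, 30]
    }
}
```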
Modifier and Type | Class and Description
---|---
class | S3LogDeserializer
Modifier and Type | Class and Description
---|---
class | DruidSerDe: Used to deserialize objects from a Druid data source.
Modifier and Type | Class and Description
---|---
class | HBaseSerDe: Can be used to serialize objects into an HBase table and deserialize objects from an HBase table.
Modifier and Type | Method and Description
---|---
HiveStoragePredicateHandler.DecomposedPredicate | AbstractHBaseKeyFactory.decomposePredicate(org.apache.hadoop.mapred.JobConf jobConf, Deserializer deserializer, ExprNodeDesc predicate)
HiveStoragePredicateHandler.DecomposedPredicate | HBaseStorageHandler.decomposePredicate(org.apache.hadoop.mapred.JobConf jobConf, Deserializer deserializer, ExprNodeDesc predicate)
Modifier and Type | Method and Description
---|---
org.apache.hadoop.mapred.InputFormat<org.apache.hadoop.io.NullWritable,T> | LlapIo.getInputFormat(org.apache.hadoop.mapred.InputFormat<?,?> sourceInputFormat, Deserializer serde)
Modifier and Type | Method and Description
---|---
org.apache.hadoop.mapred.InputFormat<org.apache.hadoop.io.NullWritable,org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch> | LlapIoImpl.getInputFormat(org.apache.hadoop.mapred.InputFormat<?,?> sourceInputFormat, Deserializer sourceSerDe)
Modifier and Type | Method and Description
---|---
ReadPipeline | OrcColumnVectorProducer.createReadPipeline(Consumer<ColumnVectorBatch> consumer, org.apache.hadoop.mapred.FileSplit split, ColumnVectorProducer.Includes includes, org.apache.hadoop.hive.ql.io.sarg.SearchArgument sarg, QueryFragmentCounters counters, ColumnVectorProducer.SchemaEvolutionFactory sef, org.apache.hadoop.mapred.InputFormat<?,?> unused0, Deserializer unused1, org.apache.hadoop.mapred.Reporter reporter, org.apache.hadoop.mapred.JobConf job, Map<org.apache.hadoop.fs.Path,PartitionDesc> unused2)
ReadPipeline | ColumnVectorProducer.createReadPipeline(Consumer<ColumnVectorBatch> consumer, org.apache.hadoop.mapred.FileSplit split, ColumnVectorProducer.Includes includes, org.apache.hadoop.hive.ql.io.sarg.SearchArgument sarg, QueryFragmentCounters counters, ColumnVectorProducer.SchemaEvolutionFactory sef, org.apache.hadoop.mapred.InputFormat<?,?> sourceInputFormat, Deserializer sourceSerDe, org.apache.hadoop.mapred.Reporter reporter, org.apache.hadoop.mapred.JobConf job, Map<org.apache.hadoop.fs.Path,PartitionDesc> parts)
ReadPipeline | GenericColumnVectorProducer.createReadPipeline(Consumer<ColumnVectorBatch> consumer, org.apache.hadoop.mapred.FileSplit split, ColumnVectorProducer.Includes includes, org.apache.hadoop.hive.ql.io.sarg.SearchArgument sarg, QueryFragmentCounters counters, ColumnVectorProducer.SchemaEvolutionFactory sef, org.apache.hadoop.mapred.InputFormat<?,?> sourceInputFormat, Deserializer sourceSerDe, org.apache.hadoop.mapred.Reporter reporter, org.apache.hadoop.mapred.JobConf job, Map<org.apache.hadoop.fs.Path,PartitionDesc> parts)
Constructor and Description
---
SerDeEncodedDataReader(SerDeLowLevelCacheImpl cache, BufferUsageManager bufferManager, org.apache.hadoop.conf.Configuration daemonConf, org.apache.hadoop.mapred.FileSplit split, List<Integer> columnIds, OrcEncodedDataConsumer consumer, org.apache.hadoop.mapred.JobConf jobConf, org.apache.hadoop.mapred.Reporter reporter, org.apache.hadoop.mapred.InputFormat<?,?> sourceInputFormat, Deserializer sourceSerDe, QueryFragmentCounters counters, org.apache.orc.TypeDescription schema, Map<org.apache.hadoop.fs.Path,PartitionDesc> parts)
Modifier and Type | Method and Description
---|---
static Deserializer | HiveMetaStoreUtils.getDeserializer(org.apache.hadoop.conf.Configuration conf, Partition part, Table table): Get the Deserializer for a partition.
static Deserializer | HiveMetaStoreUtils.getDeserializer(org.apache.hadoop.conf.Configuration conf, Table table, boolean skipConfError): Get the Deserializer for a table.
static Deserializer | HiveMetaStoreUtils.getDeserializer(org.apache.hadoop.conf.Configuration conf, Table table, boolean skipConfError, String lib)
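Conceptually, getDeserializer works by loading the class named in the table's storage descriptor reflectively and instantiating it. A minimal sketch of that mechanism, using a hypothetical stand-in interface rather than the real Hive types:

```java
// Hedged sketch: the Demo interface and CsvDemo class are illustrative
// stand-ins for a Deserializer implementation named in table metadata.
public class SerDeLoader {
    interface Demo { String kind(); }

    public static class CsvDemo implements Demo {
        public String kind() { return "csv"; }
    }

    // Instantiate the class named by 'lib', analogous to how Hive resolves
    // the class named in a table's serde library property.
    static Demo load(String lib) throws Exception {
        return (Demo) Class.forName(lib).getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Demo d = load("SerDeLoader$CsvDemo");
        System.out.println(d.kind()); // csv
    }
}
```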
Modifier and Type | Method and Description
---|---
static Class<? extends Deserializer> | HiveMetaStoreUtils.getDeserializerClass(org.apache.hadoop.conf.Configuration conf, Table table)
Modifier and Type | Method and Description
---|---
static List<FieldSchema> | HiveMetaStoreUtils.getFieldsFromDeserializer(String tableName, Deserializer deserializer)
Modifier and Type | Method and Description
---|---
Deserializer | MapOperator.getCurrentDeserializer()
abstract Deserializer | AbstractMapOperator.getCurrentDeserializer()
Modifier and Type | Method and Description
---|---
static Object[] | MapOperator.populateVirtualColumnValues(ExecMapperContext ctx, List<VirtualColumn> vcs, Object[] vcValues, Deserializer deserializer)
Constructor and Description
---
KeyValueInputMerger(List<org.apache.tez.runtime.library.api.KeyValueReader> multiMRInputs, Deserializer deserializer, ObjectInspector[] inputObjInspectors, List<String> sortCols)
Modifier and Type | Class and Description
---|---
class | VectorizedSerde: SerDes that support vectorized processing via VectorizedRowBatch must implement this interface.
Modifier and Type | Method and Description
---|---
Deserializer | VectorMapOperator.getCurrentDeserializer()
Deserializer | VectorMapOperator.RowDeserializePartitionContext.getPartDeserializer()
Modifier and Type | Class and Description
---|---
class | ArrowColumnarBatchSerDe: Converts Apache Hive rows to Apache Arrow columns.
Modifier and Type | Class and Description
---|---
class | OrcSerde: A serde class for ORC.
class | VectorizedOrcSerde: A serde class for ORC.
Modifier and Type | Class and Description
---|---
class | ParquetHiveSerDe: A Parquet SerDe for Hive (using the deprecated mapred package).
Modifier and Type | Method and Description
---|---
Deserializer | Partition.getDeserializer()
Deserializer | Table.getDeserializer()
Deserializer | Table.getDeserializer(boolean skipConfError)
Deserializer | Table.getDeserializerFromMetaStore(boolean skipConfError)
Modifier and Type | Method and Description
---|---
Class<? extends Deserializer> | Table.getDeserializerClass()
Modifier and Type | Method and Description
---|---
HiveStoragePredicateHandler.DecomposedPredicate | HiveStoragePredicateHandler.decomposePredicate(org.apache.hadoop.mapred.JobConf jobConf, Deserializer deserializer, ExprNodeDesc predicate): Gives the storage handler a chance to decompose a predicate.
static List<FieldSchema> | Hive.getFieldsFromDeserializer(String name, Deserializer serde)
static List<FieldSchema> | Hive.getFieldsFromDeserializerForMsStorage(Table tbl, Deserializer deserializer)
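What decomposePredicate accomplishes can be sketched without the Hive types: a conjunctive predicate is split into the part the storage handler can evaluate (pushed down to the scan) and the residual part Hive must still apply. The Conjunct record and the "pushable columns" set below are illustrative assumptions, not the Hive API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

public class PredicateSplit {
    public record Conjunct(String column, String op, String value) {}
    public record Decomposed(List<Conjunct> pushed, List<Conjunct> residual) {}

    // Partition a conjunctive predicate into pushed-down and residual parts.
    static Decomposed decompose(List<Conjunct> predicate, Set<String> pushable) {
        List<Conjunct> pushed = new ArrayList<>(), residual = new ArrayList<>();
        for (Conjunct c : predicate) {
            // In this sketch, only equality on a pushable column goes to the storage layer.
            if (pushable.contains(c.column()) && c.op().equals("=")) {
                pushed.add(c);
            } else {
                residual.add(c);
            }
        }
        return new Decomposed(pushed, residual);
    }

    public static void main(String[] args) {
        Decomposed d = decompose(
            List.of(new Conjunct("rowkey", "=", "k1"), new Conjunct("price", ">", "10")),
            Set.of("rowkey"));
        System.out.println(d.pushed().size() + " pushed, " + d.residual().size() + " residual");
    }
}
```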
Modifier and Type | Method and Description
---|---
Deserializer | TableDesc.getDeserializer()
Deserializer | TableDesc.getDeserializer(org.apache.hadoop.conf.Configuration conf): Return a deserializer object corresponding to the tableDesc.
Deserializer | PartitionDesc.getDeserializer(org.apache.hadoop.conf.Configuration conf): Return a deserializer object corresponding to the partitionDesc.
Deserializer | TableDesc.getDeserializer(org.apache.hadoop.conf.Configuration conf, boolean ignoreError)
Modifier and Type | Method and Description
---|---
Class<? extends Deserializer> | TableDesc.getDeserializerClass()
Modifier and Type | Method and Description
---|---
static TableDesc | PlanUtils.getDefaultQueryOutputTableDesc(String cols, String colTypes, String fileFormat, Class<? extends Deserializer> serdeClass)
static TableDesc | PlanUtils.getTableDesc(Class<? extends Deserializer> serdeClass, String separatorCode, String columns): Generate the table descriptor of the given serde with the separatorCode and column names (comma-separated string).
static TableDesc | PlanUtils.getTableDesc(Class<? extends Deserializer> serdeClass, String separatorCode, String columns, boolean lastColumnTakesRestOfTheLine): Generate the table descriptor of the specified serde with the separatorCode and column names (comma-separated string), and whether the last column should take the rest of the line.
static TableDesc | PlanUtils.getTableDesc(Class<? extends Deserializer> serdeClass, String separatorCode, String columns, String columnTypes, boolean lastColumnTakesRestOfTheLine)
static TableDesc | PlanUtils.getTableDesc(Class<? extends Deserializer> serdeClass, String separatorCode, String columns, String columnTypes, boolean lastColumnTakesRestOfTheLine, boolean useDelimitedJSON)
static TableDesc | PlanUtils.getTableDesc(Class<? extends Deserializer> serdeClass, String separatorCode, String columns, String columnTypes, boolean lastColumnTakesRestOfTheLine, boolean useDelimitedJSON, String fileFormat)
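The lastColumnTakesRestOfTheLine flag in the overloads above can be sketched with a split limit: when enabled, splitting stops after n-1 separators, so the final column keeps any further separator characters. The method name below is illustrative.

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;

public class LastColumnRest {
    // Parse a delimited row into at most 'numColumns' fields.
    static List<String> parse(String row, char sep, int numColumns, boolean lastTakesRest) {
        // With a positive limit, split leaves the remainder of the line in the last token.
        int limit = lastTakesRest ? numColumns : -1;
        return Arrays.asList(row.split(Pattern.quote(String.valueOf(sep)), limit));
    }

    public static void main(String[] args) {
        System.out.println(parse("a,b,c,d", ',', 3, true));  // [a, b, c,d]
        System.out.println(parse("a,b,c,d", ',', 3, false)); // [a, b, c, d]
    }
}
```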
Modifier and Type | Class and Description
---|---
class | AbstractDeserializer: Abstract class for implementing Deserializer.
class | AbstractEncodingAwareSerDe: Reads the encoding from table properties, transforms data from the specified charset to UTF-8 during serialization, and from UTF-8 back to the specified charset during deserialization.
class | AbstractSerDe: Abstract class for implementing SerDe.
class | ByteStreamTypedSerDe
class | DelimitedJSONSerDe
class | JsonSerDe
class | MetadataTypedColumnsetSerDe
class | NullStructSerDe: Placeholder SerDe for cases where neither serialization nor deserialization is needed.
class | OpenCSVSerde: Uses OpenCSV to deserialize data in CSV format.
class | RegexSerDe: Uses a regular expression (regex) to deserialize data.
class | TypedSerDe
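The idea behind regex-based deserialization, as in RegexSerDe above, is that each capturing group of a row-matching pattern becomes one column value, and a non-matching row yields no columns. The pattern and method below are illustrative assumptions, not the Hive implementation.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexRows {
    // Map each capturing group of 'rowPattern' to one column value.
    static List<String> deserialize(Pattern rowPattern, String line) {
        Matcher m = rowPattern.matcher(line);
        if (!m.matches()) {
            return null; // non-matching rows produce no column values in this sketch
        }
        List<String> cols = new ArrayList<>();
        for (int g = 1; g <= m.groupCount(); g++) {
            cols.add(m.group(g));
        }
        return cols;
    }

    public static void main(String[] args) {
        Pattern p = Pattern.compile("(\\S+) (\\S+) (\\d+)");
        System.out.println(deserialize(p, "GET /index.html 200")); // [GET, /index.html, 200]
    }
}
```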
Modifier and Type | Method and Description
---|---
static void | SerDeUtils.initializeSerDe(Deserializer deserializer, org.apache.hadoop.conf.Configuration conf, Properties tblProps, Properties partProps): Initializes a SerDe.
static void | SerDeUtils.initializeSerDeWithoutErrorCheck(Deserializer deserializer, org.apache.hadoop.conf.Configuration conf, Properties tblProps, Properties partProps): Initializes a SerDe.
Modifier and Type | Class and Description
---|---
class | AvroSerDe: Read or write Avro data from Hive.
Modifier and Type | Class and Description
---|---
class | BinarySortableSerDe: Can be used to write data in a way that the data can be compared byte-by-byte in the same order.
class | BinarySortableSerDeWithEndPrefix
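One well-known trick underlying byte-comparable encodings like BinarySortableSerDe's: flipping the sign bit of a two's-complement integer before writing it big-endian makes unsigned byte-by-byte comparison agree with numeric order. The sketch below shows that single trick for ints; the method names are illustrative.

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class SortableInt {
    // XOR the sign bit so negative values sort below positive ones.
    static byte[] encode(int v) {
        return ByteBuffer.allocate(4).putInt(v ^ Integer.MIN_VALUE).array();
    }

    // Unsigned lexicographic comparison, as a byte-oriented store would perform.
    static int compareBytes(byte[] a, byte[] b) {
        return Arrays.compareUnsigned(a, b);
    }

    public static void main(String[] args) {
        boolean ordered = compareBytes(encode(-5), encode(3)) < 0
                       && compareBytes(encode(3), encode(7)) < 0;
        System.out.println(ordered); // true
    }
}
```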
Modifier and Type | Class and Description
---|---
class | ColumnarSerDe: Used for the columnar-based storage supported by RCFile.
class | ColumnarSerDeBase
class | LazyBinaryColumnarSerDe
Modifier and Type | Class and Description
---|---
class | DynamicSerDe
Modifier and Type | Class and Description
---|---
class | LazySimpleSerDe: Can be used to read the same data format as MetadataTypedColumnsetSerDe and TCTLSeparatedProtocol.
Modifier and Type | Class and Description
---|---
class | LazyBinarySerDe: Combines the lazy property of LazySimpleSerDe with the binary property of BinarySortable.
class | LazyBinarySerDe2: Subclass of LazyBinarySerDe with faster serialization, initializing a serializer based on the row columns rather than checking the ObjectInspector category/primitiveType for every value.
Modifier and Type | Class and Description
---|---
class | ThriftByteStreamTypedSerDe
class | ThriftDeserializer
class | ThriftJDBCBinarySerDe: Used to serialize the final output to thrift-able objects directly in the SerDe.
Modifier and Type | Class and Description
---|---
class | HCatRecordSerDe: SerDe class for serializing to and from HCatRecord.
Modifier and Type | Class and Description
---|---
class | JdbcSerDe
Copyright © 2022 The Apache Software Foundation. All rights reserved.