Modifier and Type | Field and Description
---|---
protected TypeInfo | HiveColumn.columnType

Modifier and Type | Method and Description
---|---
TypeInfo | HiveColumn.getColumnType() The Hive type of this column.

Modifier and Type | Method and Description
---|---
static ColumnMapping | ColumnMappingFactory.get(String columnSpec, ColumnEncoding defaultEncoding, String columnName, TypeInfo columnType) Generate the proper instance of a ColumnMapping.
static ColumnMapping | ColumnMappingFactory.getMap(String columnSpec, ColumnEncoding keyEncoding, ColumnEncoding valueEncoding, String columnName, TypeInfo columnType)

Constructor and Description
---
ColumnMapping(String mappingSpec, ColumnEncoding encoding, String columnName, TypeInfo columnType)
HiveColumn(String columnName, TypeInfo columnType)

Constructor and Description
---
ColumnMapper(String serializedColumnMappings, String defaultStorageType, List<String> columnNames, List<TypeInfo> columnTypes) Create a mapping from Hive columns (rowID and column) to Accumulo columns (column family and qualifier).

Modifier and Type | Method and Description
---|---
TypeInfo | AccumuloSerDeParameters.getTypeForHiveColumn(String hiveColumn)

Modifier and Type | Method and Description
---|---
List<TypeInfo> | AccumuloSerDeParameters.getHiveColumnTypes()

Modifier and Type | Method and Description
---|---
ObjectInspector | AccumuloRowIdFactory.createRowIdObjectInspector(TypeInfo type) Create a custom ObjectInspector for the Accumulo rowId.
ObjectInspector | DefaultAccumuloRowIdFactory.createRowIdObjectInspector(TypeInfo type)

Modifier and Type | Method and Description
---|---
protected ArrayList<ObjectInspector> | AccumuloSerDe.getColumnObjectInspectors(List<TypeInfo> columnTypes, LazySerDeParameters serDeParams, List<ColumnMapping> mappings, AccumuloRowIdFactory factory)

Modifier and Type | Method and Description
---|---
static io.druid.java.util.common.Pair<List<io.druid.data.input.impl.DimensionSchema>,io.druid.query.aggregation.AggregatorFactory[]> | DruidStorageHandlerUtils.getDimensionsAndAggregates(org.apache.hadoop.conf.Configuration jc, List<String> columnNames, List<TypeInfo> columnTypes)

Modifier and Type | Method and Description
---|---
TypeInfo | ColumnMappings.ColumnMapping.getColumnType()
TypeInfo | HBaseSerDeParameters.getTypeForName(String columnName)

Modifier and Type | Method and Description
---|---
List<TypeInfo> | HBaseSerDeParameters.getColumnTypes()

Modifier and Type | Method and Description
---|---
ObjectInspector | HBaseKeyFactory.createKeyObjectInspector(TypeInfo type) Create a custom ObjectInspector for the HBase key.
ObjectInspector | DefaultHBaseKeyFactory.createKeyObjectInspector(TypeInfo type)

Modifier and Type | Method and Description
---|---
ObjectInspector | AvroHBaseValueFactory.createValueObjectInspector(TypeInfo type)
ObjectInspector | DefaultHBaseValueFactory.createValueObjectInspector(TypeInfo type)
ObjectInspector | HBaseValueFactory.createValueObjectInspector(TypeInfo type) Create a custom ObjectInspector for the value.

Modifier and Type | Method and Description
---|---
TypeInfo | FieldDesc.getTypeInfo()

Constructor and Description
---
FieldDesc(String name, TypeInfo typeInfo)

Modifier and Type | Method and Description
---|---
static FieldSchema | HiveMetaStoreUtils.getFieldSchemaFromTypeInfo(String fieldName, TypeInfo typeInfo) Convert TypeInfo to FieldSchema.

Modifier and Type | Method and Description
---|---
static TypeInfo | FunctionRegistry.getCommonClass(TypeInfo a, TypeInfo b) Find a common class that objects of both TypeInfo a and TypeInfo b can convert to.
static TypeInfo | FunctionRegistry.getCommonClassForComparison(TypeInfo a, TypeInfo b) Find a common class that objects of both TypeInfo a and TypeInfo b can convert to.
static TypeInfo | FunctionRegistry.getCommonClassForStruct(StructTypeInfo a, StructTypeInfo b) Find a common class that objects of both StructTypeInfo a and StructTypeInfo b can convert to.
static TypeInfo | FunctionRegistry.getCommonClassForUnionAll(TypeInfo a, TypeInfo b) Find a common type for the union-all operator.
TypeInfo | ColumnInfo.getType()
static TypeInfo | FunctionRegistry.getTypeInfoForPrimitiveCategory(PrimitiveTypeInfo a, PrimitiveTypeInfo b, PrimitiveObjectInspector.PrimitiveCategory typeCategory) Given two TypeInfo types and the PrimitiveCategory selected as the common class between them, return a TypeInfo corresponding to the common PrimitiveCategory, with type qualifiers (if applicable) that match the two TypeInfo types.

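The common-type lookups above take two TypeInfo values and return a type both can convert to. As a self-contained illustration of that idea (this sketch does not use Hive classes; the type ladder and method below are hypothetical stand-ins, not Hive's actual coercion rules):

```java
import java.util.Arrays;
import java.util.List;

public class CommonType {
    // Hypothetical widening order for numeric type names, loosely analogous
    // to Hive's implicit-conversion ladder.
    static final List<String> LADDER = Arrays.asList(
        "tinyint", "smallint", "int", "bigint", "float", "double");

    // Stand-in for FunctionRegistry.getCommonClassForComparison: returns the
    // wider of two numeric type names, or null when no common type exists.
    static String commonClassForComparison(String a, String b) {
        int ia = LADDER.indexOf(a), ib = LADDER.indexOf(b);
        if (ia < 0 || ib < 0) return null;
        return LADDER.get(Math.max(ia, ib));
    }

    public static void main(String[] args) {
        System.out.println(commonClassForComparison("int", "double"));   // prints "double"
        System.out.println(commonClassForComparison("tinyint", "bigint")); // prints "bigint"
    }
}
```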
Modifier and Type | Method and Description
---|---
static TypeInfo | FunctionRegistry.getCommonClass(TypeInfo a, TypeInfo b) Find a common class that objects of both TypeInfo a and TypeInfo b can convert to.
static TypeInfo | FunctionRegistry.getCommonClassForComparison(TypeInfo a, TypeInfo b) Find a common class that objects of both TypeInfo a and TypeInfo b can convert to.
static TypeInfo | FunctionRegistry.getCommonClassForUnionAll(TypeInfo a, TypeInfo b) Find a common type for the union-all operator.
static PrimitiveObjectInspector.PrimitiveCategory | FunctionRegistry.getPrimitiveCommonCategory(TypeInfo a, TypeInfo b)
static int | FunctionRegistry.matchCost(TypeInfo argumentPassed, TypeInfo argumentAccepted, boolean exact) Returns -1 if the passed argument does not match the accepted one.
void | ColumnInfo.setType(TypeInfo type)

Modifier and Type | Method and Description
---|---
Method | DefaultUDFMethodResolver.getEvalMethod(List<TypeInfo> argClasses) Gets the evaluate method for the UDF given the parameter types.
Method | NumericOpMethodResolver.getEvalMethod(List<TypeInfo> argTypeInfos)
Method | UDFMethodResolver.getEvalMethod(List<TypeInfo> argClasses) Deprecated. Gets the evaluate method for the UDF given the parameter types.
Method | ComparisonOpMethodResolver.getEvalMethod(List<TypeInfo> argTypeInfos)
Class<? extends UDAFEvaluator> | UDAFEvaluatorResolver.getEvaluatorClass(List<TypeInfo> argClasses) Gets the evaluator class corresponding to the passed parameter list.
Class<? extends UDAFEvaluator> | NumericUDAFEvaluatorResolver.getEvaluatorClass(List<TypeInfo> argTypeInfos)
Class<? extends UDAFEvaluator> | DefaultUDAFEvaluatorResolver.getEvaluatorClass(List<TypeInfo> argClasses) Gets the evaluator class for the UDAF given the parameter types.
static Method | FunctionRegistry.getMethodInternal(Class<?> udfClass, List<Method> mlist, boolean exact, List<TypeInfo> argumentsPassed) Gets the closest matching method corresponding to the argument list from a list of methods.
static <T> Method | FunctionRegistry.getMethodInternal(Class<? extends T> udfClass, String methodName, boolean exact, List<TypeInfo> argumentClasses) This method is shared between UDFRegistry and UDAFRegistry.
FunctionInfo | Registry.registerMacro(String macroName, ExprNodeDesc body, List<String> colNames, List<TypeInfo> colTypes)
FunctionInfo | Registry.registerMacro(String macroName, ExprNodeDesc body, List<String> colNames, List<TypeInfo> colTypes, FunctionInfo.FunctionResource... resources)
static void | FunctionRegistry.registerTemporaryMacro(String macroName, ExprNodeDesc body, List<String> colNames, List<TypeInfo> colTypes) Registers the appropriate kind of temporary function based on a class's type.

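getMethodInternal picks the closest matching overload by scoring each candidate signature against the passed argument types, where a negative cost means "does not match". A simplified, Hive-free sketch of that selection (types are plain strings here; the cost rules are hypothetical, not Hive's):

```java
import java.util.Arrays;
import java.util.List;

public class MatchSketch {
    // Hypothetical matchCost: 0 for an exact match, 1 for a single allowed
    // widening, -1 for no match at all.
    static int matchCost(String passed, String accepted) {
        if (passed.equals(accepted)) return 0;
        if (passed.equals("int") && accepted.equals("bigint")) return 1;
        return -1;
    }

    // Keep the candidate signature with the lowest total cost, in the spirit
    // of FunctionRegistry.getMethodInternal scanning a UDF's overloads.
    static List<String> closest(List<List<String>> candidates, List<String> passed) {
        List<String> best = null;
        int bestCost = Integer.MAX_VALUE;
        for (List<String> cand : candidates) {
            if (cand.size() != passed.size()) continue;
            int cost = 0;
            boolean ok = true;
            for (int i = 0; i < cand.size(); i++) {
                int c = matchCost(passed.get(i), cand.get(i));
                if (c < 0) { ok = false; break; }
                cost += c;
            }
            if (ok && cost < bestCost) { bestCost = cost; best = cand; }
        }
        return best;
    }

    public static void main(String[] args) {
        List<List<String>> overloads = Arrays.asList(
            Arrays.asList("bigint"), Arrays.asList("int"));
        // Exact match beats widening:
        System.out.println(closest(overloads, Arrays.asList("int"))); // prints "[int]"
    }
}
```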
Constructor and Description
---
ColumnInfo(String internalName, TypeInfo type, String tabAlias, boolean isVirtualCol)
ColumnInfo(String internalName, TypeInfo type, String tabAlias, boolean isVirtualCol, boolean isHiddenVirtualCol)

Constructor and Description
---
AmbiguousMethodException(Class<?> funcClass, List<TypeInfo> argTypeInfos, List<Method> methods) Constructor.
NoMatchingMethodException(Class<?> funcClass, List<TypeInfo> argTypeInfos, List<Method> methods) Constructor.
UDFArgumentException(String message, Class<?> funcClass, List<TypeInfo> argTypeInfos, List<Method> methods) Constructor.

Modifier and Type | Method and Description
---|---
static boolean | MapJoinKey.isSupportedField(TypeInfo typeInfo)

Modifier and Type | Field and Description
---|---
protected TypeInfo[] | VectorColumnSetInfo.typeInfos
protected TypeInfo[] | VectorColumnMapping.typeInfos

Modifier and Type | Method and Description
---|---
TypeInfo[] | VectorizationContext.getAllTypeInfos()
TypeInfo[] | VectorizationContext.getInitialTypeInfos()
TypeInfo | VectorAggregationDesc.getInputTypeInfo()
TypeInfo | VectorAggregationDesc.getOutputTypeInfo()
static TypeInfo[] | VectorMapJoinBaseOperator.getOutputTypeInfos(MapJoinDesc desc)
TypeInfo[] | VectorizedRowBatchCtx.getRowColumnTypeInfos()
TypeInfo | VectorizationContext.getTypeInfo(int columnNum)
TypeInfo[] | VectorColumnOrderedMap.Mapping.getTypeInfos()
TypeInfo[] | VectorColumnMapping.getTypeInfos()
static TypeInfo[] | VectorizedBatchUtil.typeInfosFromStructObjectInspector(StructObjectInspector structObjectInspector)
static TypeInfo[] | VectorizedBatchUtil.typeInfosFromTypeNames(String[] typeNames)

Modifier and Type | Method and Description
---|---
void | VectorColumnOrderedMap.add(int orderedColumn, int valueColumn, TypeInfo typeInfo)
void | VectorColumnOutputMapping.add(int sourceColumn, int outputColumn, TypeInfo typeInfo)
abstract void | VectorColumnMapping.add(int sourceColumn, int outputColumn, TypeInfo typeInfo)
void | VectorColumnSourceMapping.add(int sourceColumn, int outputColumn, TypeInfo typeInfo)
protected void | VectorColumnSetInfo.addKey(TypeInfo typeInfo)
int | VectorizationContext.allocateScratchColumn(TypeInfo typeInfo)
static VectorHashKeyWrapperBatch | VectorHashKeyWrapperBatch.compileKeyWrapperBatch(VectorExpression[] keyExpressions, TypeInfo[] typeInfos) Prepares a VectorHashKeyWrapperBatch to work for a specific set of keys.
static org.apache.hadoop.hive.ql.exec.vector.ColumnVector | VectorizedBatchUtil.createColumnVector(TypeInfo typeInfo)
static org.apache.hadoop.hive.ql.exec.vector.ColumnVector | VectorizedBatchUtil.createColumnVector(TypeInfo typeInfo, org.apache.hadoop.hive.common.type.DataTypePhysicalVariation dataTypePhysicalVariation)
Object | VectorExtractRow.extractRowColumn(org.apache.hadoop.hive.ql.exec.vector.ColumnVector colVector, TypeInfo typeInfo, ObjectInspector objectInspector, int batchIndex)
static org.apache.hadoop.hive.ql.exec.vector.ColumnVector.Type | VectorizationContext.getColumnVectorTypeFromTypeInfo(TypeInfo typeInfo)
static org.apache.hadoop.hive.ql.exec.vector.ColumnVector.Type | VectorizationContext.getColumnVectorTypeFromTypeInfo(TypeInfo typeInfo, org.apache.hadoop.hive.common.type.DataTypePhysicalVariation dataTypePhysicalVariation)
static GenericUDF | VectorizationContext.getGenericUDFForCast(TypeInfo castType)
static org.apache.hadoop.io.Writable | VectorizedBatchUtil.getPrimitiveWritable(TypeInfo typeInfo)
boolean | VectorizationContext.haveCandidateForDecimal64VectorExpression(int numChildren, List<ExprNodeDesc> childExpr, TypeInfo returnType)
void | VectorSerializeRow.init(TypeInfo[] typeInfos)
void | VectorExtractRow.init(TypeInfo[] typeInfos)
void | VectorSerializeRow.init(TypeInfo[] typeInfos, int[] columnMap)
void | VectorExtractRow.init(TypeInfo[] typeInfos, int[] projectedColumns)
void | VectorAssignRow.init(TypeInfo typeInfo, int outputColumnNum)
void | VectorDeserializeRow.initConversion(TypeInfo[] targetTypeInfos, boolean[] columnsToIncludeTruncated) Initialize for converting the source data types, which are read through the DeserializeRead interface passed to the constructor, into the target data types desired in the VectorizedRowBatch.
int | VectorAssignRow.initConversion(TypeInfo[] sourceTypeInfos, TypeInfo[] targetTypeInfos, boolean[] columnsToIncludeTruncated) Initialize for conversion from the provided (source) data types to the target data types desired in the VectorizedRowBatch.
VectorExpression | VectorizationContext.instantiateExpression(Class<?> vclass, TypeInfo returnTypeInfo, org.apache.hadoop.hive.common.type.DataTypePhysicalVariation returnDataTypePhysicalVariation, Object... args)

Modifier and Type | Method and Description
---|---
void | VectorizationContext.setInitialTypeInfos(List<TypeInfo> initialTypeInfos)

Constructor and Description
---
VectorAggregationDesc(AggregationDesc aggrDesc, GenericUDAFEvaluator evaluator, TypeInfo inputTypeInfo, org.apache.hadoop.hive.ql.exec.vector.ColumnVector.Type inputColVectorType, VectorExpression inputExpression, TypeInfo outputTypeInfo, org.apache.hadoop.hive.ql.exec.vector.ColumnVector.Type outputColVectorType, Class<? extends VectorAggregateExpression> vecAggrClass)
VectorizedRowBatchCtx(String[] rowColumnNames, TypeInfo[] rowColumnTypeInfos, org.apache.hadoop.hive.common.type.DataTypePhysicalVariation[] rowDataTypePhysicalVariations, int[] dataColumnNums, int partitionColumnCount, int virtualColumnCount, VirtualColumn[] neededVirtualColumns, String[] scratchColumnTypeNames, org.apache.hadoop.hive.common.type.DataTypePhysicalVariation[] scratchDataTypePhysicalVariations)

Constructor and Description
---
VectorizationContext(String contextName, List<String> initialColumnNames, List<TypeInfo> initialTypeInfos, List<org.apache.hadoop.hive.common.type.DataTypePhysicalVariation> initialDataTypePhysicalVariations, HiveConf hiveConf)

Modifier and Type | Field and Description
---|---
protected TypeInfo[] | VectorExpression.inputTypeInfos All input parameter type information is here, including that for (non-computed) columns and scalar values.
protected TypeInfo | VectorExpression.outputTypeInfo

Modifier and Type | Method and Description
---|---
TypeInfo[] | VectorExpression.getInputTypeInfos()
TypeInfo | VectorExpression.getOutputTypeInfo() Returns the type of the output column.

Modifier and Type | Method and Description
---|---
static void | OverflowUtils.accountForOverflowDouble(TypeInfo outputTypeInfo, org.apache.hadoop.hive.ql.exec.vector.DoubleColumnVector v, boolean selectedInUse, int[] sel, int n)
static void | OverflowUtils.accountForOverflowLong(TypeInfo outputTypeInfo, org.apache.hadoop.hive.ql.exec.vector.LongColumnVector v, boolean selectedInUse, int[] sel, int n)
static String | VectorExpression.getTypeName(TypeInfo typeInfo, org.apache.hadoop.hive.common.type.DataTypePhysicalVariation dataTypePhysicalVariation)
void | VectorExpression.setInputTypeInfos(TypeInfo... inputTypeInfos)
void | PosModDoubleToDouble.setOutputTypeInfo(TypeInfo outputTypeInfo) Set the type of the output column and the flag that determines whether a cast to float is needed while evaluating the PosMod expression.
void | VectorExpression.setOutputTypeInfo(TypeInfo outputTypeInfo) Set the type of the output column.
void | PosModLongToLong.setOutputTypeInfo(TypeInfo outputTypeInfo)

Constructor and Description
---
ConstantVectorExpression(int outputColumnNum, byte[] value, TypeInfo outputTypeInfo)
ConstantVectorExpression(int outputColumnNum, double value, TypeInfo outputTypeInfo)
ConstantVectorExpression(int outputColumnNum, HiveChar value, TypeInfo outputTypeInfo)
ConstantVectorExpression(int outputColumnNum, org.apache.hadoop.hive.common.type.HiveDecimal value, TypeInfo outputTypeInfo)
ConstantVectorExpression(int outputColumnNum, org.apache.hadoop.hive.common.type.HiveIntervalDayTime value, TypeInfo outputTypeInfo)
ConstantVectorExpression(int outputColumnNum, HiveVarchar value, TypeInfo outputTypeInfo)
ConstantVectorExpression(int outputColumnNum, long value, TypeInfo outputTypeInfo)
ConstantVectorExpression(int outputColumnNum, Timestamp value, TypeInfo outputTypeInfo)
ConstantVectorExpression(int outputColumnNum, TypeInfo outputTypeInfo, boolean isNull)
DynamicValueVectorExpression(int outputColumnNum, TypeInfo typeInfo, DynamicValue dynamicValue)

Modifier and Type | Field and Description
---|---
protected TypeInfo | VectorAggregateExpression.inputTypeInfo
protected TypeInfo | VectorAggregateExpression.outputTypeInfo

Modifier and Type | Method and Description
---|---
TypeInfo | VectorAggregateExpression.getOutputTypeInfo()

Modifier and Type | Method and Description
---|---
void | VectorKeySeriesMultiSerialized.init(TypeInfo[] typeInfos, int[] columnNums)

Modifier and Type | Field and Description
---|---
protected TypeInfo[] | VectorMapJoinCommonOperator.bigTableKeyTypeInfos
protected TypeInfo[] | VectorMapJoinCommonOperator.bigTableValueTypeInfos
protected TypeInfo[] | VectorMapJoinCommonOperator.outputTypeInfos

Modifier and Type | Method and Description
---|---
void | VectorPTFGroupBatches.init(TypeInfo[] reducerBatchTypeInfos, VectorPTFEvaluatorBase[] evaluators, int[] outputProjectionColumnMap, TypeInfo[] outputTypeInfos, int[] keyInputColumnMap, int[] nonKeyInputColumnMap, int[] streamingEvaluatorNums, org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch overflowBatch)

Modifier and Type | Field and Description
---|---
protected TypeInfo[] | VectorReduceSinkObjectHashOperator.reduceSinkBucketTypeInfos
protected TypeInfo[] | VectorReduceSinkCommonOperator.reduceSinkKeyTypeInfos
protected TypeInfo[] | VectorReduceSinkObjectHashOperator.reduceSinkPartitionTypeInfos
protected TypeInfo[] | VectorReduceSinkCommonOperator.reduceSinkValueTypeInfos

Modifier and Type | Field and Description
---|---
TypeInfo | RecordIdentifier.Field.fieldType
static TypeInfo | RecordIdentifier.StructInfo.typeInfo

Modifier and Type | Method and Description
---|---
protected abstract StructType | BatchToRowReader.createStructObject(Object previous, List<TypeInfo> childrenTypes)
protected abstract UnionType | BatchToRowReader.createUnionObject(List<TypeInfo> childrenTypes, Object previous)

Modifier and Type | Method and Description
---|---
static org.apache.orc.TypeDescription | OrcInputFormat.convertTypeInfo(TypeInfo info)
static ObjectInspector | OrcStruct.createObjectInspector(TypeInfo info)

Modifier and Type | Method and Description
---|---
protected OrcStruct | OrcOiBatchToRowReader.createStructObject(Object previous, List<TypeInfo> childrenTypes)
protected org.apache.hadoop.hive.ql.io.orc.OrcUnion | OrcOiBatchToRowReader.createUnionObject(List<TypeInfo> childrenTypes, Object previous)

Modifier and Type | Method and Description
---|---
static HiveGroupConverter | HiveCollectionConverter.forList(org.apache.parquet.schema.GroupType listType, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, int index, TypeInfo hiveTypeInfo)
static HiveGroupConverter | HiveCollectionConverter.forMap(org.apache.parquet.schema.GroupType mapType, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, int index, TypeInfo hiveTypeInfo)
protected static HiveGroupConverter | HiveGroupConverter.getConverterFromDescription(org.apache.parquet.schema.GroupType type, int index, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, TypeInfo hiveTypeInfo)
protected static org.apache.parquet.io.api.PrimitiveConverter | HiveGroupConverter.getConverterFromDescription(org.apache.parquet.schema.PrimitiveType type, int index, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, TypeInfo hiveTypeInfo)
protected static org.apache.parquet.io.api.Converter | HiveGroupConverter.getConverterFromDescription(org.apache.parquet.schema.Type type, int index, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, TypeInfo hiveTypeInfo)
static org.apache.parquet.io.api.PrimitiveConverter | ETypeConverter.getNewConverter(org.apache.parquet.schema.PrimitiveType type, int index, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, TypeInfo hiveTypeInfo)

Modifier and Type | Method and Description
---|---
static org.apache.parquet.schema.MessageType | HiveSchemaConverter.convert(List<String> columnNames, List<TypeInfo> columnTypes)

Constructor and Description
---
DataWritableRecordConverter(org.apache.parquet.schema.GroupType requestedSchema, Map<String,String> metadata, TypeInfo hiveTypeInfo)
HiveStructConverter(org.apache.parquet.schema.GroupType selectedGroupType, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, int index, org.apache.parquet.schema.GroupType containingGroupType, TypeInfo hiveTypeInfo)
HiveStructConverter(org.apache.parquet.schema.GroupType groupType, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, int index, TypeInfo hiveTypeInfo)
HiveStructConverter(org.apache.parquet.schema.GroupType requestedSchema, org.apache.parquet.schema.GroupType tableSchema, Map<String,String> metadata, TypeInfo hiveTypeInfo)
RepeatedGroupConverter(org.apache.parquet.schema.GroupType groupType, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, int index, TypeInfo hiveTypeInfo)
RepeatedPrimitiveConverter(org.apache.parquet.schema.PrimitiveType primitiveType, org.apache.hadoop.hive.ql.io.parquet.convert.ConverterParent parent, int index, TypeInfo hiveTypeInfo)

Modifier and Type | Method and Description
---|---
static List<TypeInfo> | DataWritableReadSupport.getColumnTypes(String types) Returns a list of TypeInfo objects from a string that contains column type strings.

Modifier and Type | Method and Description
---|---
static org.apache.parquet.schema.MessageType | DataWritableReadSupport.getRequestedSchema(boolean indexAccess, List<String> columnNamesList, List<TypeInfo> columnTypesList, org.apache.parquet.schema.MessageType fileSchema, org.apache.hadoop.conf.Configuration configuration) Used by the vectorized code path.
static org.apache.parquet.schema.MessageType | DataWritableReadSupport.getSchemaByName(org.apache.parquet.schema.MessageType schema, List<String> colNames, List<TypeInfo> colTypes) Searches columns by name on a given Parquet message schema and returns its projected Parquet schema types.

Modifier and Type | Field and Description
---|---
protected TypeInfo | BaseVectorizedColumnReader.hiveType

Modifier and Type | Method and Description
---|---
static ParquetDataColumnReader | ParquetDataColumnReaderFactory.getDataColumnReaderByType(org.apache.parquet.schema.PrimitiveType parquetType, TypeInfo hiveType, org.apache.parquet.column.values.ValuesReader realReader, boolean skipTimestampConversion, ZoneId writerTimezone)
static ParquetDataColumnReader | ParquetDataColumnReaderFactory.getDataColumnReaderByTypeOnDictionary(org.apache.parquet.schema.PrimitiveType parquetType, TypeInfo hiveType, org.apache.parquet.column.Dictionary realReader, boolean skipTimestampConversion, ZoneId writerTimezone)
void | VectorizedMapColumnReader.readBatch(int total, org.apache.hadoop.hive.ql.exec.vector.ColumnVector column, TypeInfo columnType)
void | VectorizedListColumnReader.readBatch(int total, org.apache.hadoop.hive.ql.exec.vector.ColumnVector column, TypeInfo columnType)
void | VectorizedColumnReader.readBatch(int total, org.apache.hadoop.hive.ql.exec.vector.ColumnVector column, TypeInfo columnType) Read records of the specified size and type into the ColumnVector.
void | VectorizedPrimitiveColumnReader.readBatch(int total, org.apache.hadoop.hive.ql.exec.vector.ColumnVector column, TypeInfo columnType)
void | VectorizedStructColumnReader.readBatch(int total, org.apache.hadoop.hive.ql.exec.vector.ColumnVector column, TypeInfo columnType)
void | VectorizedDummyColumnReader.readBatch(int total, org.apache.hadoop.hive.ql.exec.vector.ColumnVector column, TypeInfo columnType)

Constructor and Description
---
BaseVectorizedColumnReader(org.apache.parquet.column.ColumnDescriptor descriptor, org.apache.parquet.column.page.PageReader pageReader, boolean skipTimestampConversion, ZoneId writerTimezone, org.apache.parquet.schema.Type parquetType, TypeInfo hiveType)
VectorizedListColumnReader(org.apache.parquet.column.ColumnDescriptor descriptor, org.apache.parquet.column.page.PageReader pageReader, boolean skipTimestampConversion, ZoneId writerTimezone, org.apache.parquet.schema.Type type, TypeInfo hiveType)
VectorizedPrimitiveColumnReader(org.apache.parquet.column.ColumnDescriptor descriptor, org.apache.parquet.column.page.PageReader pageReader, boolean skipTimestampConversion, ZoneId writerTimezone, org.apache.parquet.schema.Type type, TypeInfo hiveType)

Modifier and Type | Method and Description
---|---
TypeInfo | VirtualColumn.getTypeInfo()

Modifier and Type | Method and Description
---|---
static List<TypeInfo> | VirtualColumn.removeVirtualColumnTypes(List<String> columnNames, List<TypeInfo> columnTypes)

Modifier and Type | Method and Description
---|---
static TypeInfo | TypeConverter.convert(org.apache.calcite.rel.type.RelDataType rType)
static TypeInfo | TypeConverter.convertListType(org.apache.calcite.rel.type.RelDataType rType)
static TypeInfo | TypeConverter.convertMapType(org.apache.calcite.rel.type.RelDataType rType)
static TypeInfo | TypeConverter.convertPrimitiveType(org.apache.calcite.rel.type.RelDataType rType)
static TypeInfo | TypeConverter.convertStructType(org.apache.calcite.rel.type.RelDataType rType)

Modifier and Type | Method and Description
---|---
static org.apache.calcite.rel.type.RelDataType | TypeConverter.convert(TypeInfo type, org.apache.calcite.rel.type.RelDataTypeFactory dtFactory)

Modifier and Type | Field and Description
---|---
TypeInfo | SemanticAnalyzer.GenericUDAFInfo.returnType

Modifier and Type | Method and Description
---|---
ArrayList<TypeInfo> | InputSignature.getTypeArray()

Modifier and Type | Method and Description
---|---
void | InputSignature.add(TypeInfo paramType)

Constructor and Description
---
InputSignature(String name, TypeInfo... classList)

Modifier and Type | Field and Description
---|---
protected TypeInfo | ExprNodeDesc.typeInfo

Modifier and Type | Method and Description
---|---
TypeInfo[] | VectorMapJoinInfo.getBigTableKeyTypeInfos()
TypeInfo[] | VectorMapJoinInfo.getBigTableValueTypeInfos()
TypeInfo[] | VectorPartitionDesc.getDataTypeInfos()
TypeInfo[] | VectorPTFDesc.getOutputTypeInfos()
TypeInfo[] | VectorTableScanDesc.getProjectedColumnTypeInfos()
TypeInfo[] | VectorPTFDesc.getReducerBatchTypeInfos()
TypeInfo[] | VectorReduceSinkInfo.getReduceSinkBucketTypeInfos()
TypeInfo[] | VectorReduceSinkInfo.getReduceSinkKeyTypeInfos()
TypeInfo[] | VectorReduceSinkInfo.getReduceSinkPartitionTypeInfos()
TypeInfo[] | VectorReduceSinkInfo.getReduceSinkValueTypeInfos()
TypeInfo | ExprNodeColumnListDesc.getTypeInfo()
TypeInfo | ExprNodeDesc.getTypeInfo()
TypeInfo | DynamicValue.getTypeInfo()

Modifier and Type | Method and Description
---|---
List<TypeInfo> | CreateMacroDesc.getColTypes()

Modifier and Type | Method and Description
---|---
static PrimitiveTypeInfo | ExprNodeDescUtils.deriveMinArgumentCast(ExprNodeDesc childExpr, TypeInfo targetType)
static List<String> | BaseWork.BaseExplainVectorization.getColumnAndTypes(int[] projectionColumns, String[] columnNames, TypeInfo[] typeInfos, org.apache.hadoop.hive.common.type.DataTypePhysicalVariation[] dataTypePhysicalVariations)
static boolean | VectorPartitionConversion.isImplicitVectorColumnConversion(TypeInfo fromTypeInfo, TypeInfo toTypeInfo)
void | VectorMapJoinInfo.setBigTableKeyTypeInfos(TypeInfo[] bigTableKeyTypeInfos)
void | VectorMapJoinInfo.setBigTableValueTypeInfos(TypeInfo[] bigTableValueTypeInfos)
void | VectorPTFDesc.setOutputTypeInfos(TypeInfo[] outputTypeInfos)
void | VectorTableScanDesc.setProjectedColumnTypeInfos(TypeInfo[] projectedColumnTypeInfos)
void | VectorPTFDesc.setReducerBatchTypeInfos(TypeInfo[] reducerBatchTypeInfos)
void | VectorReduceSinkInfo.setReduceSinkBucketTypeInfos(TypeInfo[] reduceSinkBucketTypeInfos)
void | VectorReduceSinkInfo.setReduceSinkKeyTypeInfos(TypeInfo[] reduceSinkKeyTypeInfos)
void | VectorReduceSinkInfo.setReduceSinkPartitionTypeInfos(TypeInfo[] reduceSinkPartitionTypeInfos)
void | VectorReduceSinkInfo.setReduceSinkValueTypeInfos(TypeInfo[] reduceSinkValueTypeInfos)
void | ExprNodeColumnListDesc.setTypeInfo(TypeInfo typeInfo)
void | ExprNodeDesc.setTypeInfo(TypeInfo typeInfo)
void | DynamicValue.setTypeInfo(TypeInfo typeInfo)

Modifier and Type | Method and Description
---|---
void | VectorPartitionDesc.setDataTypeInfos(List<TypeInfo> dataTypeInfoList)

Constructor and Description
---
DynamicValue(String id, TypeInfo typeInfo)
ExprNodeColumnDesc(TypeInfo typeInfo, String column, String tabAlias, boolean isPartitionColOrVirtualCol)
ExprNodeColumnDesc(TypeInfo typeInfo, String column, String tabAlias, boolean isPartitionColOrVirtualCol, boolean isSkewedCol)
ExprNodeColumnDesc(TypeInfo typeInfo, String column, String tabAlias, boolean isPartitionColOrVirtualCol, boolean isSkewedCol, boolean isGenerated)
ExprNodeConstantDesc(TypeInfo typeInfo, Object value)
ExprNodeDesc(TypeInfo typeInfo)
ExprNodeDynamicListDesc(TypeInfo typeInfo, Operator<? extends OperatorDesc> source, int keyIndex)
ExprNodeFieldDesc(TypeInfo typeInfo, ExprNodeDesc desc, String fieldName, Boolean isList)
ExprNodeGenericFuncDesc(TypeInfo typeInfo, GenericUDF genericUDF, List<ExprNodeDesc> children)
ExprNodeGenericFuncDesc(TypeInfo typeInfo, GenericUDF genericUDF, String funcText, List<ExprNodeDesc> children)
ExprNodeSubQueryDesc(TypeInfo typeInfo, org.apache.calcite.rel.RelNode subQuery, ExprNodeSubQueryDesc.SubqueryType type)
ExprNodeSubQueryDesc(TypeInfo typeInfo, org.apache.calcite.rel.RelNode subQuery, ExprNodeSubQueryDesc.SubqueryType type, ExprNodeDesc lhs)

Constructor and Description
---
CreateMacroDesc(String macroName, List<String> colNames, List<TypeInfo> colTypes, ExprNodeDesc body)

Modifier and Type | Method and Description
---|---
TypeInfo | SettableUDF.getTypeInfo()

Modifier and Type | Method and Description
---|---
void | SettableUDF.setTypeInfo(TypeInfo typeInfo) Add data to the UDF prior to initialization.

Modifier and Type | Method and Description
---|---
static TypeInfo | GenericUDFUtils.deriveInType(List<ExprNodeDesc> children)
TypeInfo[] | SimpleGenericUDAFParameterInfo.getParameters() Deprecated.
TypeInfo[] | GenericUDAFParameterInfo.getParameters() Deprecated.
TypeInfo | GenericUDFToChar.getTypeInfo()
TypeInfo | GenericUDFToTimestampLocalTZ.getTypeInfo()
TypeInfo | GenericUDFToVarchar.getTypeInfo()
protected static TypeInfo | GenericUDFUtils.updateCommonTypeForDecimal(TypeInfo commonTypeInfo, TypeInfo ti, TypeInfo returnType)

Modifier and Type | Method and Description
---|---
List<TypeInfo> | GenericUDFMacro.getColTypes()

Modifier and Type | Method and Description
---|---
GenericUDAFEvaluator | GenericUDAFFirstValue.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFCollectSet.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFHistogramNumeric.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFCovariance.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFCollectList.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFStd.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFContextNGrams.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFComputeStats.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFSum.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFSumEmptyIsZero.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFStdSample.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFMin.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFRowNumber.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFVariance.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFVarianceSample.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFNTile.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFCount.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFBloomFilter.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFCorrelation.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFAverage.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFBridge.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFMax.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFResolver.getEvaluator(TypeInfo[] parameters) Deprecated. Get the evaluator for the parameter types.
GenericUDAFEvaluator | GenericUDAFRank.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFCovarianceSample.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | AbstractGenericUDAFResolver.getEvaluator(TypeInfo[] info) Deprecated.
GenericUDAFEvaluator | GenericUDAFBinarySetFunctions.RegrCount.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFBinarySetFunctions.RegrSXX.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator | GenericUDAFBinarySetFunctions.RegrSYY.getEvaluator(TypeInfo[] parameters)
GenericUDAFEvaluator |
GenericUDAFBinarySetFunctions.RegrAvgX.getEvaluator(TypeInfo[] parameters) |
GenericUDAFEvaluator |
GenericUDAFBinarySetFunctions.RegrAvgY.getEvaluator(TypeInfo[] parameters) |
GenericUDAFEvaluator |
GenericUDAFBinarySetFunctions.RegrSlope.getEvaluator(TypeInfo[] parameters) |
GenericUDAFEvaluator |
GenericUDAFBinarySetFunctions.RegrR2.getEvaluator(TypeInfo[] parameters) |
GenericUDAFEvaluator |
GenericUDAFBinarySetFunctions.RegrSXY.getEvaluator(TypeInfo[] parameters) |
GenericUDAFEvaluator |
GenericUDAFBinarySetFunctions.RegrIntercept.getEvaluator(TypeInfo[] parameters) |
GenericUDAFEvaluator |
GenericUDAFLastValue.getEvaluator(TypeInfo[] parameters) |
GenericUDAFEvaluator |
GenericUDAFnGrams.getEvaluator(TypeInfo[] parameters) |
static PrimitiveObjectInspector.PrimitiveCategory |
GenericUDAFSum.getReturnType(TypeInfo type) |
void |
GenericUDFToChar.setTypeInfo(TypeInfo typeInfo)
Provide char type parameters for the output object inspector.
|
void |
GenericUDFToTimestampLocalTZ.setTypeInfo(TypeInfo typeInfo) |
void |
GenericUDFToVarchar.setTypeInfo(TypeInfo typeInfo)
Provide varchar type parameters for the output object inspector.
|
void |
GenericUDFToDecimal.setTypeInfo(TypeInfo typeInfo) |
protected static TypeInfo |
GenericUDFUtils.updateCommonTypeForDecimal(TypeInfo commonTypeInfo,
TypeInfo ti,
TypeInfo returnType) |
Modifier and Type | Method and Description |
---|---|
void |
GenericUDFMacro.setColTypes(List<TypeInfo> colTypes) |
Constructor and Description |
---|
GenericUDFMacro(String macroName,
ExprNodeDesc bodyDesc,
List<String> colNames,
List<TypeInfo> colTypes) |
Modifier and Type | Method and Description |
---|---|
List<TypeInfo> |
AvroObjectInspectorGenerator.getColumnTypes() |
Modifier and Type | Method and Description |
---|---|
org.apache.avro.Schema |
TypeInfoToSchema.convert(List<String> columnNames,
List<TypeInfo> columnTypes,
List<String> columnComments,
String namespace,
String name,
String doc)
Converts a Hive schema to an Avro schema.
|
static org.apache.avro.Schema |
AvroSerDe.getSchemaFromCols(Properties properties,
List<String> columnNames,
List<TypeInfo> columnTypes,
String columnCommentProperty) |
Constructor and Description |
---|
BinarySortableDeserializeRead(TypeInfo[] typeInfos,
boolean useExternalBuffer) |
BinarySortableDeserializeRead(TypeInfo[] typeInfos,
boolean useExternalBuffer,
boolean[] columnSortOrderIsDesc,
byte[] columnNullMarker,
byte[] columnNotNullMarker) |
Modifier and Type | Field and Description |
---|---|
protected TypeInfo[] |
DeserializeRead.typeInfos |
Modifier and Type | Method and Description |
---|---|
TypeInfo[] |
DeserializeRead.typeInfos() |
Constructor and Description |
---|
DeserializeRead(TypeInfo[] typeInfos,
boolean useExternalBuffer) |
DeserializeRead(TypeInfo[] typeInfos,
org.apache.hadoop.hive.common.type.DataTypePhysicalVariation[] dataTypePhysicalVariations,
boolean useExternalBuffer)
Constructor.
|
Modifier and Type | Method and Description |
---|---|
TypeInfo |
LazySerDeParameters.getRowTypeInfo() |
Modifier and Type | Method and Description |
---|---|
List<TypeInfo> |
LazySerDeParameters.getColumnTypes() |
Modifier and Type | Method and Description |
---|---|
static ObjectInspector |
LazyFactory.createLazyObjectInspector(TypeInfo typeInfo,
byte[] separators,
int separatorIndex,
org.apache.hadoop.io.Text nullSequence,
boolean escaped,
byte escapeChar)
Deprecated.
|
static ObjectInspector |
LazyFactory.createLazyObjectInspector(TypeInfo typeInfo,
byte[] separators,
int separatorIndex,
org.apache.hadoop.io.Text nullSequence,
boolean escaped,
byte escapeChar,
boolean extendedBooleanLiteral)
Deprecated.
|
static ObjectInspector |
LazyFactory.createLazyObjectInspector(TypeInfo typeInfo,
byte[] separators,
int separatorIndex,
org.apache.hadoop.io.Text nullSequence,
boolean escaped,
byte escapeChar,
boolean extendedBooleanLiteral,
ObjectInspectorFactory.ObjectInspectorOptions option)
Deprecated.
|
static ObjectInspector |
LazyFactory.createLazyObjectInspector(TypeInfo typeInfo,
byte[] separators,
int separatorIndex,
org.apache.hadoop.io.Text nullSequence,
boolean escaped,
byte escapeChar,
ObjectInspectorFactory.ObjectInspectorOptions option)
Deprecated.
|
static ObjectInspector |
LazyFactory.createLazyObjectInspector(TypeInfo typeInfo,
int separatorIndex,
LazyObjectInspectorParameters lazyParams,
ObjectInspectorFactory.ObjectInspectorOptions option)
Create a hierarchical ObjectInspector for LazyObject with the given typeInfo.
|
static boolean |
VerifyLazy.lazyCompare(TypeInfo typeInfo,
Object lazyObject,
Object expectedObject) |
Modifier and Type | Method and Description |
---|---|
static ObjectInspector |
LazyFactory.createColumnarStructInspector(List<String> columnNames,
List<TypeInfo> columnTypes,
byte[] separators,
org.apache.hadoop.io.Text nullSequence,
boolean escaped,
byte escapeChar)
Deprecated.
|
static ObjectInspector |
LazyFactory.createColumnarStructInspector(List<String> columnNames,
List<TypeInfo> columnTypes,
LazyObjectInspectorParameters lazyParams)
Create a hierarchical ObjectInspector for ColumnarStruct with the given
columnNames and columnTypeInfos.
|
static ObjectInspector |
LazyFactory.createLazyStructInspector(List<String> columnNames,
List<TypeInfo> typeInfos,
byte[] separators,
org.apache.hadoop.io.Text nullSequence,
boolean lastColumnTakesRest,
boolean escaped,
byte escapeChar)
Deprecated.
|
static ObjectInspector |
LazyFactory.createLazyStructInspector(List<String> columnNames,
List<TypeInfo> typeInfos,
byte[] separators,
org.apache.hadoop.io.Text nullSequence,
boolean lastColumnTakesRest,
boolean escaped,
byte escapeChar,
boolean extendedBooleanLiteral)
Deprecated.
|
static ObjectInspector |
LazyFactory.createLazyStructInspector(List<String> columnNames,
List<TypeInfo> typeInfos,
LazyObjectInspectorParameters lazyParams)
Create a hierarchical ObjectInspector for LazyStruct with the given
columnNames and columnTypeInfos.
|
Constructor and Description |
---|
LazySimpleDeserializeRead(TypeInfo[] typeInfos,
boolean useExternalBuffer,
LazySerDeParameters lazyParams) |
LazySimpleDeserializeRead(TypeInfo[] typeInfos,
org.apache.hadoop.hive.common.type.DataTypePhysicalVariation[] dataTypePhysicalVariations,
boolean useExternalBuffer,
LazySerDeParameters lazyParams) |
Modifier and Type | Method and Description |
---|---|
static ObjectInspector |
LazyBinaryUtils.getLazyBinaryObjectInspectorFromTypeInfo(TypeInfo typeInfo)
Returns the lazy binary object inspector that can be used to inspect a
lazy binary object of that typeInfo.
For primitive types, the standard writable object inspector is used.
|
Modifier and Type | Method and Description |
---|---|
static ObjectInspector |
LazyBinaryFactory.createColumnarStructInspector(List<String> columnNames,
List<TypeInfo> columnTypes) |
Constructor and Description |
---|
LazyBinaryDeserializeRead(TypeInfo[] typeInfos,
boolean useExternalBuffer) |
Modifier and Type | Method and Description |
---|---|
static Type |
Type.getType(TypeInfo typeInfo)
Converts a TypeInfo to the appropriate Type.
|
Modifier and Type | Class and Description |
---|---|
class |
BaseCharTypeInfo |
class |
CharTypeInfo |
class |
DecimalTypeInfo |
class |
ListTypeInfo
A List Type has homogeneous elements.
|
class |
MapTypeInfo
A Map Type has homogeneous keys and homogeneous values.
|
class |
PrimitiveTypeInfo
There are a limited number of Primitive Types.
|
class |
StructTypeInfo
StructTypeInfo represents the TypeInfo of a struct.
|
class |
TimestampLocalTZTypeInfo |
class |
UnionTypeInfo
UnionTypeInfo represents the TypeInfo of a union.
|
class |
VarcharTypeInfo |
Modifier and Type | Method and Description |
---|---|
static TypeInfo |
HiveDecimalUtils.getDecimalTypeForPrimitiveCategories(PrimitiveTypeInfo a,
PrimitiveTypeInfo b) |
TypeInfo |
ListTypeInfo.getListElementTypeInfo() |
static TypeInfo |
TypeInfoFactory.getListTypeInfo(TypeInfo elementTypeInfo) |
TypeInfo |
MapTypeInfo.getMapKeyTypeInfo() |
static TypeInfo |
TypeInfoFactory.getMapTypeInfo(TypeInfo keyTypeInfo,
TypeInfo valueTypeInfo) |
TypeInfo |
MapTypeInfo.getMapValueTypeInfo() |
static TypeInfo |
TypeInfoFactory.getPrimitiveTypeInfoFromJavaPrimitive(Class<?> clazz) |
static TypeInfo |
TypeInfoFactory.getPrimitiveTypeInfoFromPrimitiveWritable(Class<?> clazz) |
TypeInfo |
StructTypeInfo.getStructFieldTypeInfo(String field) |
static TypeInfo |
TypeInfoFactory.getStructTypeInfo(List<String> names,
List<TypeInfo> typeInfos) |
static TypeInfo |
TypeInfoUtils.getTypeInfoFromObjectInspector(ObjectInspector oi)
Gets the TypeInfo object from an ObjectInspector by recursively
descending into the ObjectInspector structure.
|
static TypeInfo |
TypeInfoUtils.getTypeInfoFromTypeString(String typeString) |
static TypeInfo |
TypeInfoFactory.getUnionTypeInfo(List<TypeInfo> typeInfos) |
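A minimal sketch of how the TypeInfoUtils parsing and TypeInfoFactory construction methods listed above fit together, assuming hive-serde is on the classpath; the type string and variable names are illustrative only:

```java
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils;

public class TypeInfoExample {
    public static void main(String[] args) {
        // Parse a type string, as a SerDe would when reading table properties.
        TypeInfo parsed = TypeInfoUtils.getTypeInfoFromTypeString("map<string,int>");
        System.out.println(parsed.getTypeName()); // map<string,int>

        // Build the equivalent TypeInfo programmatically from the
        // factory's cached primitive instances.
        TypeInfo built = TypeInfoFactory.getMapTypeInfo(
            TypeInfoFactory.stringTypeInfo, TypeInfoFactory.intTypeInfo);

        // Both routes yield the same (interned) TypeInfo.
        System.out.println(parsed.equals(built));
    }
}
```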
Modifier and Type | Method and Description |
---|---|
ArrayList<TypeInfo> |
StructTypeInfo.getAllStructFieldTypeInfos() |
List<TypeInfo> |
UnionTypeInfo.getAllUnionObjectTypeInfos() |
static List<TypeInfo> |
TypeInfoUtils.getParameterTypeInfos(Method m,
int size)
Get the parameter TypeInfo for a method.
|
static ArrayList<TypeInfo> |
TypeInfoUtils.getTypeInfosFromTypeString(String typeString) |
static ArrayList<TypeInfo> |
TypeInfoUtils.typeInfosFromStructObjectInspector(StructObjectInspector structObjectInspector) |
static ArrayList<TypeInfo> |
TypeInfoUtils.typeInfosFromTypeNames(List<String> typeNames) |
Modifier and Type | Method and Description |
---|---|
boolean |
DecimalTypeInfo.accept(TypeInfo other) |
boolean |
TypeInfo.accept(TypeInfo other) |
static boolean |
TypeInfoUtils.doPrimitiveCategoriesMatch(TypeInfo ti1,
TypeInfo ti2)
Returns true if both TypeInfos are of primitive type and their primitive categories match.
|
static TypeInfo |
TypeInfoFactory.getListTypeInfo(TypeInfo elementTypeInfo) |
static TypeInfo |
TypeInfoFactory.getMapTypeInfo(TypeInfo keyTypeInfo,
TypeInfo valueTypeInfo) |
static ObjectInspector |
TypeInfoUtils.getStandardJavaObjectInspectorFromTypeInfo(TypeInfo typeInfo)
Returns the standard object inspector that can be used to translate an
object of that typeInfo to a standard object type.
|
static ObjectInspector |
TypeInfoUtils.getStandardWritableObjectInspectorFromTypeInfo(TypeInfo typeInfo)
Returns the standard object inspector that can be used to translate an
object of that typeInfo to a standard object type.
|
static boolean |
TypeInfoUtils.implicitConvertible(TypeInfo from,
TypeInfo to)
Returns whether an object of the type from can be implicitly
converted to the type to.
|
static boolean |
TypeInfoUtils.isConversionRequiredForComparison(TypeInfo typeA,
TypeInfo typeB)
Given two types, determines whether a conversion is needed before the two can be compared.
|
void |
ListTypeInfo.setListElementTypeInfo(TypeInfo listElementTypeInfo)
For Java serialization use only.
|
void |
MapTypeInfo.setMapKeyTypeInfo(TypeInfo mapKeyTypeInfo)
For Java serialization use only.
|
void |
MapTypeInfo.setMapValueTypeInfo(TypeInfo mapValueTypeInfo)
For Java serialization use only.
|
Modifier and Type | Method and Description |
---|---|
static TypeInfo |
TypeInfoFactory.getStructTypeInfo(List<String> names,
List<TypeInfo> typeInfos) |
static List<String> |
TypeInfoUtils.getTypeStringsFromTypeInfo(List<TypeInfo> typeInfos) |
static TypeInfo |
TypeInfoFactory.getUnionTypeInfo(List<TypeInfo> typeInfos) |
void |
StructTypeInfo.setAllStructFieldTypeInfos(ArrayList<TypeInfo> allStructFieldTypeInfos)
For Java serialization use only.
|
void |
UnionTypeInfo.setAllUnionObjectTypeInfos(List<TypeInfo> allUnionObjectTypeInfos)
For Java serialization use only.
|
Modifier and Type | Method and Description |
---|---|
static ObjectInspector |
HCatRecordObjectInspectorFactory.getStandardObjectInspectorFromTypeInfo(TypeInfo typeInfo) |
Modifier and Type | Method and Description |
---|---|
static HCatSchema |
HCatSchemaUtils.getHCatSchema(TypeInfo typeInfo) |
Copyright © 2022 The Apache Software Foundation. All rights reserved.