PREHOOK: query: create table dynamic_part_table(intcol string) partitioned by (partcol1 string, partcol2 string)
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:default
PREHOOK: Output: default@dynamic_part_table
POSTHOOK: query: create table dynamic_part_table(intcol string) partitioned by (partcol1 string, partcol2 string)
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:default
POSTHOOK: Output: default@dynamic_part_table
PREHOOK: query: insert into table dynamic_part_table partition(partcol1, partcol2) select '1', '1', '1' from src where key=150
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@dynamic_part_table
POSTHOOK: query: insert into table dynamic_part_table partition(partcol1, partcol2) select '1', '1', '1' from src where key=150
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@dynamic_part_table@partcol1=1/partcol2=1
POSTHOOK: Lineage: dynamic_part_table PARTITION(partcol1=1,partcol2=1).intcol SIMPLE []
PREHOOK: query: insert into table dynamic_part_table partition(partcol1, partcol2) select '1', NULL, '1' from src where key=150
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@dynamic_part_table
POSTHOOK: query: insert into table dynamic_part_table partition(partcol1, partcol2) select '1', NULL, '1' from src where key=150
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@dynamic_part_table@partcol1=__HIVE_DEFAULT_PARTITION__/partcol2=1
POSTHOOK: Lineage: dynamic_part_table PARTITION(partcol1=__HIVE_DEFAULT_PARTITION__,partcol2=1).intcol SIMPLE []
PREHOOK: query: insert into table dynamic_part_table partition(partcol1, partcol2) select '1', '1', NULL from src where key=150
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@dynamic_part_table
POSTHOOK: query: insert into table dynamic_part_table partition(partcol1, partcol2) select '1', '1', NULL from src where key=150
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@dynamic_part_table@partcol1=1/partcol2=__HIVE_DEFAULT_PARTITION__
POSTHOOK: Lineage: dynamic_part_table PARTITION(partcol1=1,partcol2=__HIVE_DEFAULT_PARTITION__).intcol SIMPLE []
PREHOOK: query: insert into table dynamic_part_table partition(partcol1, partcol2) select '1', NULL, NULL from src where key=150
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@dynamic_part_table
POSTHOOK: query: insert into table dynamic_part_table partition(partcol1, partcol2) select '1', NULL, NULL from src where key=150
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@dynamic_part_table@partcol1=__HIVE_DEFAULT_PARTITION__/partcol2=__HIVE_DEFAULT_PARTITION__
POSTHOOK: Lineage: dynamic_part_table PARTITION(partcol1=__HIVE_DEFAULT_PARTITION__,partcol2=__HIVE_DEFAULT_PARTITION__).intcol SIMPLE []
PREHOOK: query: explain extended select intcol from dynamic_part_table where partcol1='1' and partcol2='1'
PREHOOK: type: QUERY
POSTHOOK: query: explain extended select intcol from dynamic_part_table where partcol1='1' and partcol2='1'
POSTHOOK: type: QUERY
ABSTRACT SYNTAX TREE: 

TOK_QUERY
   TOK_FROM
      TOK_TABREF
         TOK_TABNAME
            dynamic_part_table
   TOK_INSERT
      TOK_DESTINATION
         TOK_DIR
            TOK_TMP_FILE
      TOK_SELECT
         TOK_SELEXPR
            TOK_TABLE_OR_COL
               intcol
      TOK_WHERE
         and
            =
               TOK_TABLE_OR_COL
                  partcol1
               '1'
            =
               TOK_TABLE_OR_COL
                  partcol2
               '1'


STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 depends on stages: Stage-1

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: dynamic_part_table
            Statistics: Num rows: 1 Data size: 1 Basic stats: COMPLETE Column stats: NONE
            GatherStats: false
            Select Operator
              expressions: intcol (type: string)
              outputColumnNames: _col0
              Statistics: Num rows: 1 Data size: 1 Basic stats: COMPLETE Column stats: NONE
              File Output Operator
                compressed: false
                GlobalTableId: 0
#### A masked pattern was here ####
                NumFilesPerFileSink: 1
                Statistics: Num rows: 1 Data size: 1 Basic stats: COMPLETE Column stats: NONE
#### A masked pattern was here ####
                table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                    properties:
                      columns _col0
                      columns.types string
                      escape.delim \
                      hive.serialization.extend.nesting.levels true
                      serialization.format 1
                      serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                    serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                TotalFiles: 1
                GatherStats: false
                MultiFileSpray: false
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: partcol2=1
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            partition values:
              partcol1 1
              partcol2 1
            properties:
              COLUMN_STATS_ACCURATE true
              bucket_count -1
              columns intcol
              columns.comments 
              columns.types string
#### A masked pattern was here ####
              name default.dynamic_part_table
              numFiles 1
              numRows 1
              partition_columns partcol1/partcol2
              partition_columns.types string:string
              rawDataSize 1
              serialization.ddl struct dynamic_part_table { string intcol}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 2
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
          
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns intcol
                columns.comments 
                columns.types string
#### A masked pattern was here ####
                name default.dynamic_part_table
                partition_columns partcol1/partcol2
                partition_columns.types string:string
                serialization.ddl struct dynamic_part_table { string intcol}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.dynamic_part_table
            name: default.dynamic_part_table
      Truncated Path -> Alias:
        /dynamic_part_table/partcol1=1/partcol2=1 [dynamic_part_table]

  Stage: Stage-0
    Fetch Operator
      limit: -1
      Processor Tree:
        ListSink

PREHOOK: query: explain extended select intcol from dynamic_part_table where partcol1='1' and partcol2='1'
PREHOOK: type: QUERY
POSTHOOK: query: explain extended select intcol from dynamic_part_table where partcol1='1' and partcol2='1'
POSTHOOK: type: QUERY
ABSTRACT SYNTAX TREE: 

TOK_QUERY
   TOK_FROM
      TOK_TABREF
         TOK_TABNAME
            dynamic_part_table
   TOK_INSERT
      TOK_DESTINATION
         TOK_DIR
            TOK_TMP_FILE
      TOK_SELECT
         TOK_SELEXPR
            TOK_TABLE_OR_COL
               intcol
      TOK_WHERE
         and
            =
               TOK_TABLE_OR_COL
                  partcol1
               '1'
            =
               TOK_TABLE_OR_COL
                  partcol2
               '1'


STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 depends on stages: Stage-1

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: dynamic_part_table
            Statistics: Num rows: 1 Data size: 1 Basic stats: COMPLETE Column stats: NONE
            GatherStats: false
            Select Operator
              expressions: intcol (type: string)
              outputColumnNames: _col0
              Statistics: Num rows: 1 Data size: 1 Basic stats: COMPLETE Column stats: NONE
              File Output Operator
                compressed: false
                GlobalTableId: 0
#### A masked pattern was here ####
                NumFilesPerFileSink: 1
                Statistics: Num rows: 1 Data size: 1 Basic stats: COMPLETE Column stats: NONE
#### A masked pattern was here ####
                table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                    properties:
                      columns _col0
                      columns.types string
                      escape.delim \
                      hive.serialization.extend.nesting.levels true
                      serialization.format 1
                      serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                    serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                TotalFiles: 1
                GatherStats: false
                MultiFileSpray: false
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: partcol2=1
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            partition values:
              partcol1 1
              partcol2 1
            properties:
              COLUMN_STATS_ACCURATE true
              bucket_count -1
              columns intcol
              columns.comments 
              columns.types string
#### A masked pattern was here ####
              name default.dynamic_part_table
              numFiles 1
              numRows 1
              partition_columns partcol1/partcol2
              partition_columns.types string:string
              rawDataSize 1
              serialization.ddl struct dynamic_part_table { string intcol}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 2
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
          
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns intcol
                columns.comments 
                columns.types string
#### A masked pattern was here ####
                name default.dynamic_part_table
                partition_columns partcol1/partcol2
                partition_columns.types string:string
                serialization.ddl struct dynamic_part_table { string intcol}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.dynamic_part_table
            name: default.dynamic_part_table
      Truncated Path -> Alias:
        /dynamic_part_table/partcol1=1/partcol2=1 [dynamic_part_table]

  Stage: Stage-0
    Fetch Operator
      limit: -1
      Processor Tree:
        ListSink

PREHOOK: query: explain extended select intcol from dynamic_part_table where (partcol1='1' and partcol2='1')or (partcol1='1' and partcol2='__HIVE_DEFAULT_PARTITION__')
PREHOOK: type: QUERY
POSTHOOK: query: explain extended select intcol from dynamic_part_table where (partcol1='1' and partcol2='1')or (partcol1='1' and partcol2='__HIVE_DEFAULT_PARTITION__')
POSTHOOK: type: QUERY
ABSTRACT SYNTAX TREE: 

TOK_QUERY
   TOK_FROM
      TOK_TABREF
         TOK_TABNAME
            dynamic_part_table
   TOK_INSERT
      TOK_DESTINATION
         TOK_DIR
            TOK_TMP_FILE
      TOK_SELECT
         TOK_SELEXPR
            TOK_TABLE_OR_COL
               intcol
      TOK_WHERE
         or
            and
               =
                  TOK_TABLE_OR_COL
                     partcol1
                  '1'
               =
                  TOK_TABLE_OR_COL
                     partcol2
                  '1'
            and
               =
                  TOK_TABLE_OR_COL
                     partcol1
                  '1'
               =
                  TOK_TABLE_OR_COL
                     partcol2
                  '__HIVE_DEFAULT_PARTITION__'


STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 depends on stages: Stage-1

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: dynamic_part_table
            Statistics: Num rows: 2 Data size: 2 Basic stats: COMPLETE Column stats: NONE
            GatherStats: false
            Select Operator
              expressions: intcol (type: string)
              outputColumnNames: _col0
              Statistics: Num rows: 2 Data size: 2 Basic stats: COMPLETE Column stats: NONE
              File Output Operator
                compressed: false
                GlobalTableId: 0
#### A masked pattern was here ####
                NumFilesPerFileSink: 1
                Statistics: Num rows: 2 Data size: 2 Basic stats: COMPLETE Column stats: NONE
#### A masked pattern was here ####
                table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                    properties:
                      columns _col0
                      columns.types string
                      escape.delim \
                      hive.serialization.extend.nesting.levels true
                      serialization.format 1
                      serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                    serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                TotalFiles: 1
                GatherStats: false
                MultiFileSpray: false
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: partcol2=1
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            partition values:
              partcol1 1
              partcol2 1
            properties:
              COLUMN_STATS_ACCURATE true
              bucket_count -1
              columns intcol
              columns.comments 
              columns.types string
#### A masked pattern was here ####
              name default.dynamic_part_table
              numFiles 1
              numRows 1
              partition_columns partcol1/partcol2
              partition_columns.types string:string
              rawDataSize 1
              serialization.ddl struct dynamic_part_table { string intcol}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 2
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
          
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns intcol
                columns.comments 
                columns.types string
#### A masked pattern was here ####
                name default.dynamic_part_table
                partition_columns partcol1/partcol2
                partition_columns.types string:string
                serialization.ddl struct dynamic_part_table { string intcol}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.dynamic_part_table
            name: default.dynamic_part_table
#### A masked pattern was here ####
          Partition
            base file name: partcol2=__HIVE_DEFAULT_PARTITION__
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            partition values:
              partcol1 1
              partcol2 __HIVE_DEFAULT_PARTITION__
            properties:
              COLUMN_STATS_ACCURATE true
              bucket_count -1
              columns intcol
              columns.comments 
              columns.types string
#### A masked pattern was here ####
              name default.dynamic_part_table
              numFiles 1
              numRows 1
              partition_columns partcol1/partcol2
              partition_columns.types string:string
              rawDataSize 1
              serialization.ddl struct dynamic_part_table { string intcol}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 2
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
          
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns intcol
                columns.comments 
                columns.types string
#### A masked pattern was here ####
                name default.dynamic_part_table
                partition_columns partcol1/partcol2
                partition_columns.types string:string
                serialization.ddl struct dynamic_part_table { string intcol}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.dynamic_part_table
            name: default.dynamic_part_table
      Truncated Path -> Alias:
        /dynamic_part_table/partcol1=1/partcol2=1 [dynamic_part_table]
        /dynamic_part_table/partcol1=1/partcol2=__HIVE_DEFAULT_PARTITION__ [dynamic_part_table]

  Stage: Stage-0
    Fetch Operator
      limit: -1
      Processor Tree:
        ListSink