PREHOOK: query: create table src2 as select key, count(1) as count from src group by key
PREHOOK: type: CREATETABLE_AS_SELECT
PREHOOK: Input: default@src
POSTHOOK: query: create table src2 as select key, count(1) as count from src group by key
POSTHOOK: type: CREATETABLE_AS_SELECT
POSTHOOK: Input: default@src
POSTHOOK: Output: default@src2
PREHOOK: query: create table src3 as select * from src2
PREHOOK: type: CREATETABLE_AS_SELECT
PREHOOK: Input: default@src2
POSTHOOK: query: create table src3 as select * from src2
POSTHOOK: type: CREATETABLE_AS_SELECT
POSTHOOK: Input: default@src2
POSTHOOK: Output: default@src3
PREHOOK: query: create table src4 as select * from src2
PREHOOK: type: CREATETABLE_AS_SELECT
PREHOOK: Input: default@src2
POSTHOOK: query: create table src4 as select * from src2
POSTHOOK: type: CREATETABLE_AS_SELECT
POSTHOOK: Input: default@src2
POSTHOOK: Output: default@src4
PREHOOK: query: create table src5 as select * from src2
PREHOOK: type: CREATETABLE_AS_SELECT
PREHOOK: Input: default@src2
POSTHOOK: query: create table src5 as select * from src2
POSTHOOK: type: CREATETABLE_AS_SELECT
POSTHOOK: Input: default@src2
POSTHOOK: Output: default@src5
PREHOOK: query: explain extended select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select key, count from src4 where key < 10 union all select key, count(1) as count from src5 where key < 10 group by key )s order by s.key ASC, s.count ASC
PREHOOK: type: QUERY
POSTHOOK: query: explain extended select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select key, count from src4 where key < 10 union all select key, count(1) as count from src5 where key < 10 group by key )s order by s.key ASC, s.count ASC
POSTHOOK: type: QUERY
ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_SUBQUERY (TOK_UNION (TOK_UNION (TOK_UNION (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src2))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL count))) (TOK_WHERE (< (TOK_TABLE_OR_COL key) 10)))) (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src3))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL count))) (TOK_WHERE (< (TOK_TABLE_OR_COL key) 10))))) (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src4))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL count))) (TOK_WHERE (< (TOK_TABLE_OR_COL key) 10))))) (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src5))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_FUNCTION count 1) count)) (TOK_WHERE (< (TOK_TABLE_OR_COL key) 10)) (TOK_GROUPBY (TOK_TABLE_OR_COL key))))) s)) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (. (TOK_TABLE_OR_COL s) key)) (TOK_SELEXPR (. (TOK_TABLE_OR_COL s) count))) (TOK_ORDERBY (TOK_TABSORTCOLNAMEASC (. (TOK_TABLE_OR_COL s) key)) (TOK_TABSORTCOLNAMEASC (. (TOK_TABLE_OR_COL s) count)))))

STAGE DEPENDENCIES:
  Stage-3 is a root stage
  Stage-2 depends on stages: Stage-3
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-3
    Map Reduce
      Alias -> Map Operator Tree:
        null-subquery2:s-subquery2:src5 
          TableScan
            alias: src5
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Select Operator
                expressions:
                      expr: key
                      type: string
                outputColumnNames: key
                Group By Operator
                  aggregations:
                        expr: count(1)
                  bucketGroup: false
                  keys:
                        expr: key
                        type: string
                  mode: hash
                  outputColumnNames: _col0, _col1
                  Reduce Output Operator
                    key expressions:
                          expr: _col0
                          type: string
                    sort order: +
                    Map-reduce partition columns:
                          expr: _col0
                          type: string
                    tag: -1
                    value expressions:
                          expr: _col1
                          type: bigint
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: src5
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src5
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src5 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src5
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src5 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src5
            name: default.src5
      Truncated Path -> Alias:
        /src5 [null-subquery2:s-subquery2:src5]
      Needs Tagging: false
      Reduce Operator Tree:
        Group By Operator
          aggregations:
                expr: count(VALUE._col0)
          bucketGroup: false
          keys:
                expr: KEY._col0
                type: string
          mode: mergepartial
          outputColumnNames: _col0, _col1
          Select Operator
            expressions:
                  expr: _col0
                  type: string
                  expr: _col1
                  type: bigint
            outputColumnNames: _col0, _col1
            File Output Operator
              compressed: false
              GlobalTableId: 0
#### A masked pattern was here ####
              NumFilesPerFileSink: 1
              table:
                  input format: org.apache.hadoop.mapred.SequenceFileInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
                  properties:
                    columns _col0,_col1
                    columns.types string,bigint
                    escape.delim \
              TotalFiles: 1
              GatherStats: false
              MultiFileSpray: false

  Stage: Stage-2
    Map Reduce
      Alias -> Map Operator Tree:
#### A masked pattern was here ####
          TableScan
            GatherStats: false
            Union
              Select Operator
                expressions:
                      expr: _col0
                      type: string
                      expr: _col1
                      type: bigint
                outputColumnNames: _col0, _col1
                Reduce Output Operator
                  key expressions:
                        expr: _col0
                        type: string
                        expr: _col1
                        type: bigint
                  sort order: ++
                  tag: -1
                  value expressions:
                        expr: _col0
                        type: string
                        expr: _col1
                        type: bigint
        null-subquery1-subquery1-subquery1:s-subquery1-subquery1-subquery1:src2 
          TableScan
            alias: src2
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Select Operator
                expressions:
                      expr: key
                      type: string
                      expr: count
                      type: bigint
                outputColumnNames: _col0, _col1
                Union
                  Select Operator
                    expressions:
                          expr: _col0
                          type: string
                          expr: _col1
                          type: bigint
                    outputColumnNames: _col0, _col1
                    Reduce Output Operator
                      key expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
                      sort order: ++
                      tag: -1
                      value expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
        null-subquery1-subquery1-subquery2:s-subquery1-subquery1-subquery2:src3 
          TableScan
            alias: src3
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Select Operator
                expressions:
                      expr: key
                      type: string
                      expr: count
                      type: bigint
                outputColumnNames: _col0, _col1
                Union
                  Select Operator
                    expressions:
                          expr: _col0
                          type: string
                          expr: _col1
                          type: bigint
                    outputColumnNames: _col0, _col1
                    Reduce Output Operator
                      key expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
                      sort order: ++
                      tag: -1
                      value expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
        null-subquery1-subquery2:s-subquery1-subquery2:src4 
          TableScan
            alias: src4
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Select Operator
                expressions:
                      expr: key
                      type: string
                      expr: count
                      type: bigint
                outputColumnNames: _col0, _col1
                Union
                  Select Operator
                    expressions:
                          expr: _col0
                          type: string
                          expr: _col1
                          type: bigint
                    outputColumnNames: _col0, _col1
                    Reduce Output Operator
                      key expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
                      sort order: ++
                      tag: -1
                      value expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: -mr-10002
            input format: org.apache.hadoop.mapred.SequenceFileInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
            properties:
              columns _col0,_col1
              columns.types string,bigint
              escape.delim \

              input format: org.apache.hadoop.mapred.SequenceFileInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
              properties:
                columns _col0,_col1
                columns.types string,bigint
                escape.delim \
#### A masked pattern was here ####
          Partition
            base file name: src2
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src2
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src2 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src2
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src2 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src2
            name: default.src2
#### A masked pattern was here ####
          Partition
            base file name: src3
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src3
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src3 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src3
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src3 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src3
            name: default.src3
#### A masked pattern was here ####
          Partition
            base file name: src4
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src4
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src4 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src4
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src4 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src4
            name: default.src4
      Truncated Path -> Alias:
        /src2 [null-subquery1-subquery1-subquery1:s-subquery1-subquery1-subquery1:src2]
        /src3 [null-subquery1-subquery1-subquery2:s-subquery1-subquery1-subquery2:src3]
        /src4 [null-subquery1-subquery2:s-subquery1-subquery2:src4]
#### A masked pattern was here ####
      Needs Tagging: false
      Reduce Operator Tree:
        Extract
          File Output Operator
            compressed: false
            GlobalTableId: 0
#### A masked pattern was here ####
            NumFilesPerFileSink: 1
#### A masked pattern was here ####
            table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                properties:
                  columns _col0,_col1
                  columns.types string:bigint
                  escape.delim \
                  hive.serialization.extend.nesting.levels true
                  serialization.format 1
            TotalFiles: 1
            GatherStats: false
            MultiFileSpray: false

  Stage: Stage-0
    Fetch Operator
      limit: -1

PREHOOK: query: select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select key, count from src4 where key < 10 union all select key, count(1) as count from src5 where key < 10 group by key )s order by s.key ASC, s.count ASC
PREHOOK: type: QUERY
PREHOOK: Input: default@src2
PREHOOK: Input: default@src3
PREHOOK: Input: default@src4
PREHOOK: Input: default@src5
#### A masked pattern was here ####
POSTHOOK: query: select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select key, count from src4 where key < 10 union all select key, count(1) as count from src5 where key < 10 group by key )s order by s.key ASC, s.count ASC
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src2
POSTHOOK: Input: default@src3
POSTHOOK: Input: default@src4
POSTHOOK: Input: default@src5
#### A masked pattern was here ####
0	1
0	3
0	3
0	3
2	1
2	1
2	1
2	1
4	1
4	1
4	1
4	1
5	1
5	3
5	3
5	3
8	1
8	1
8	1
8	1
9	1
9	1
9	1
9	1
PREHOOK: query: explain extended select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select a.key as key, b.count as count from src4 a join src5 b on a.key=b.key where a.key < 10 )s order by s.key ASC, s.count ASC
PREHOOK: type: QUERY
POSTHOOK: query: explain extended select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select a.key as key, b.count as count from src4 a join src5 b on a.key=b.key where a.key < 10 )s order by s.key ASC, s.count ASC
POSTHOOK: type: QUERY
ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_SUBQUERY (TOK_UNION (TOK_UNION (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src2))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL count))) (TOK_WHERE (< (TOK_TABLE_OR_COL key) 10)))) (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src3))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL count))) (TOK_WHERE (< (TOK_TABLE_OR_COL key) 10))))) (TOK_QUERY (TOK_FROM (TOK_JOIN (TOK_TABREF (TOK_TABNAME src4) a) (TOK_TABREF (TOK_TABNAME src5) b) (= (. (TOK_TABLE_OR_COL a) key) (. (TOK_TABLE_OR_COL b) key)))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (. (TOK_TABLE_OR_COL a) key) key) (TOK_SELEXPR (. (TOK_TABLE_OR_COL b) count) count)) (TOK_WHERE (< (. (TOK_TABLE_OR_COL a) key) 10))))) s)) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (. (TOK_TABLE_OR_COL s) key)) (TOK_SELEXPR (. (TOK_TABLE_OR_COL s) count))) (TOK_ORDERBY (TOK_TABSORTCOLNAMEASC (. (TOK_TABLE_OR_COL s) key)) (TOK_TABSORTCOLNAMEASC (. (TOK_TABLE_OR_COL s) count)))))

STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-2 depends on stages: Stage-1
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        null-subquery2:s-subquery2:a 
          TableScan
            alias: a
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Reduce Output Operator
                key expressions:
                      expr: key
                      type: string
                sort order: +
                Map-reduce partition columns:
                      expr: key
                      type: string
                tag: 0
                value expressions:
                      expr: key
                      type: string
        null-subquery2:s-subquery2:b 
          TableScan
            alias: b
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Reduce Output Operator
                key expressions:
                      expr: key
                      type: string
                sort order: +
                Map-reduce partition columns:
                      expr: key
                      type: string
                tag: 1
                value expressions:
                      expr: count
                      type: bigint
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: src4
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src4
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src4 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src4
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src4 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src4
            name: default.src4
#### A masked pattern was here ####
          Partition
            base file name: src5
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src5
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src5 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src5
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src5 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src5
            name: default.src5
      Truncated Path -> Alias:
        /src4 [null-subquery2:s-subquery2:a]
        /src5 [null-subquery2:s-subquery2:b]
      Needs Tagging: true
      Reduce Operator Tree:
        Join Operator
          condition map:
               Inner Join 0 to 1
          condition expressions:
            0 {VALUE._col0}
            1 {VALUE._col1}
          handleSkewJoin: false
          outputColumnNames: _col0, _col5
          Select Operator
            expressions:
                  expr: _col0
                  type: string
                  expr: _col5
                  type: bigint
            outputColumnNames: _col0, _col1
            File Output Operator
              compressed: false
              GlobalTableId: 0
#### A masked pattern was here ####
              NumFilesPerFileSink: 1
              table:
                  input format: org.apache.hadoop.mapred.SequenceFileInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
                  properties:
                    columns _col0,_col1
                    columns.types string,bigint
                    escape.delim \
              TotalFiles: 1
              GatherStats: false
              MultiFileSpray: false

  Stage: Stage-2
    Map Reduce
      Alias -> Map Operator Tree:
#### A masked pattern was here ####
          TableScan
            GatherStats: false
            Union
              Select Operator
                expressions:
                      expr: _col0
                      type: string
                      expr: _col1
                      type: bigint
                outputColumnNames: _col0, _col1
                Reduce Output Operator
                  key expressions:
                        expr: _col0
                        type: string
                        expr: _col1
                        type: bigint
                  sort order: ++
                  tag: -1
                  value expressions:
                        expr: _col0
                        type: string
                        expr: _col1
                        type: bigint
        null-subquery1-subquery1:s-subquery1-subquery1:src2 
          TableScan
            alias: src2
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Select Operator
                expressions:
                      expr: key
                      type: string
                      expr: count
                      type: bigint
                outputColumnNames: _col0, _col1
                Union
                  Select Operator
                    expressions:
                          expr: _col0
                          type: string
                          expr: _col1
                          type: bigint
                    outputColumnNames: _col0, _col1
                    Reduce Output Operator
                      key expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
                      sort order: ++
                      tag: -1
                      value expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
        null-subquery1-subquery2:s-subquery1-subquery2:src3 
          TableScan
            alias: src3
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Select Operator
                expressions:
                      expr: key
                      type: string
                      expr: count
                      type: bigint
                outputColumnNames: _col0, _col1
                Union
                  Select Operator
                    expressions:
                          expr: _col0
                          type: string
                          expr: _col1
                          type: bigint
                    outputColumnNames: _col0, _col1
                    Reduce Output Operator
                      key expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
                      sort order: ++
                      tag: -1
                      value expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: -mr-10002
            input format: org.apache.hadoop.mapred.SequenceFileInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
            properties:
              columns _col0,_col1
              columns.types string,bigint
              escape.delim \

              input format: org.apache.hadoop.mapred.SequenceFileInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
              properties:
                columns _col0,_col1
                columns.types string,bigint
                escape.delim \
#### A masked pattern was here ####
          Partition
            base file name: src2
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src2
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src2 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src2
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src2 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src2
            name: default.src2
#### A masked pattern was here ####
          Partition
            base file name: src3
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src3
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src3 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src3
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src3 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src3
            name: default.src3
      Truncated Path -> Alias:
        /src2 [null-subquery1-subquery1:s-subquery1-subquery1:src2]
        /src3 [null-subquery1-subquery2:s-subquery1-subquery2:src3]
#### A masked pattern was here ####
      Needs Tagging: false
      Reduce Operator Tree:
        Extract
          File Output Operator
            compressed: false
            GlobalTableId: 0
#### A masked pattern was here ####
            NumFilesPerFileSink: 1
#### A masked pattern was here ####
            table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                properties:
                  columns _col0,_col1
                  columns.types string:bigint
                  escape.delim \
                  hive.serialization.extend.nesting.levels true
                  serialization.format 1
            TotalFiles: 1
            GatherStats: false
            MultiFileSpray: false

  Stage: Stage-0
    Fetch Operator
      limit: -1

PREHOOK: query: select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select a.key as key, b.count as count from src4 a join src5 b on a.key=b.key where a.key < 10 )s order by s.key ASC, s.count ASC
PREHOOK: type: QUERY
PREHOOK: Input: default@src2
PREHOOK: Input: default@src3
PREHOOK: Input: default@src4
PREHOOK: Input: default@src5
#### A masked pattern was here ####
POSTHOOK: query: select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select a.key as key, b.count as count from src4 a join src5 b on a.key=b.key where a.key < 10 )s order by s.key ASC, s.count ASC
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src2
POSTHOOK: Input: default@src3
POSTHOOK: Input: default@src4
POSTHOOK: Input: default@src5
#### A masked pattern was here ####
0	3
0	3
0	3
2	1
2	1
2	1
4	1
4	1
4	1
5	3
5	3
5	3
8	1
8	1
8	1
9	1
9	1
9	1
PREHOOK: query: explain extended select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select a.key as key, count(1) as count from src4 a join src5 b on a.key=b.key where a.key < 10 group by a.key )s order by s.key ASC, s.count ASC
PREHOOK: type: QUERY
POSTHOOK: query: explain extended select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select a.key as key, count(1) as count from src4 a join src5 b on a.key=b.key where a.key < 10 group by a.key )s order by s.key ASC, s.count ASC
POSTHOOK: type: QUERY
ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_SUBQUERY (TOK_UNION (TOK_UNION (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src2))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL count))) (TOK_WHERE (< (TOK_TABLE_OR_COL key) 10)))) (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src3))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL count))) (TOK_WHERE (< (TOK_TABLE_OR_COL key) 10))))) (TOK_QUERY (TOK_FROM (TOK_JOIN (TOK_TABREF (TOK_TABNAME src4) a) (TOK_TABREF (TOK_TABNAME src5) b) (= (. (TOK_TABLE_OR_COL a) key) (. (TOK_TABLE_OR_COL b) key)))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (. (TOK_TABLE_OR_COL a) key) key) (TOK_SELEXPR (TOK_FUNCTION count 1) count)) (TOK_WHERE (< (. (TOK_TABLE_OR_COL a) key) 10)) (TOK_GROUPBY (. (TOK_TABLE_OR_COL a) key))))) s)) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (. (TOK_TABLE_OR_COL s) key)) (TOK_SELEXPR (. (TOK_TABLE_OR_COL s) count))) (TOK_ORDERBY (TOK_TABSORTCOLNAMEASC (. (TOK_TABLE_OR_COL s) key)) (TOK_TABSORTCOLNAMEASC (. (TOK_TABLE_OR_COL s) count)))))

STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-2 depends on stages: Stage-1
  Stage-3 depends on stages: Stage-2
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        null-subquery2:s-subquery2:a 
          TableScan
            alias: a
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Reduce Output Operator
                key expressions:
                      expr: key
                      type: string
                sort order: +
                Map-reduce partition columns:
                      expr: key
                      type: string
                tag: 0
                value expressions:
                      expr: key
                      type: string
        null-subquery2:s-subquery2:b 
          TableScan
            alias: b
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Reduce Output Operator
                key expressions:
                      expr: key
                      type: string
                sort order: +
                Map-reduce partition columns:
                      expr: key
                      type: string
                tag: 1
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: src4
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src4
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src4 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src4
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src4 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src4
            name: default.src4
#### A masked pattern was here ####
          Partition
            base file name: src5
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src5
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src5 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src5
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src5 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src5
            name: default.src5
      Truncated Path -> Alias:
        /src4 [null-subquery2:s-subquery2:a]
        /src5 [null-subquery2:s-subquery2:b]
      Needs Tagging: true
      Reduce Operator Tree:
        Join Operator
          condition map:
               Inner Join 0 to 1
          condition expressions:
            0 {VALUE._col0}
            1
          handleSkewJoin: false
          outputColumnNames: _col0
          Select Operator
            expressions:
                  expr: _col0
                  type: string
            outputColumnNames: _col0
            Group By Operator
              aggregations:
                    expr: count(1)
              bucketGroup: false
              keys:
                    expr: _col0
                    type: string
              mode: hash
              outputColumnNames: _col0, _col1
              File Output Operator
                compressed: false
                GlobalTableId: 0
#### A masked pattern was here ####
                NumFilesPerFileSink: 1
                table:
                    input format: org.apache.hadoop.mapred.SequenceFileInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
                    properties:
                      columns _col0,_col1
                      columns.types string,bigint
                      escape.delim \
                TotalFiles: 1
                GatherStats: false
                MultiFileSpray: false

  Stage: Stage-2
    Map Reduce
      Alias -> Map Operator Tree:
#### A masked pattern was here ####
            Reduce Output Operator
              key expressions:
                    expr: _col0
                    type: string
              sort order: +
              Map-reduce partition columns:
                    expr: _col0
                    type: string
              tag: -1
              value expressions:
                    expr: _col1
                    type: bigint
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: -mr-10002
            input format: org.apache.hadoop.mapred.SequenceFileInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
            properties:
              columns _col0,_col1
              columns.types string,bigint
              escape.delim \

              input format: org.apache.hadoop.mapred.SequenceFileInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
              properties:
                columns _col0,_col1
                columns.types string,bigint
                escape.delim \
      Truncated Path -> Alias:
#### A masked pattern was here ####
      Needs Tagging: false
      Reduce Operator Tree:
        Group By Operator
          aggregations:
                expr: count(VALUE._col0)
          bucketGroup: false
          keys:
                expr: KEY._col0
                type: string
          mode: mergepartial
          outputColumnNames: _col0, _col1
          Select Operator
            expressions:
                  expr: _col0
                  type: string
                  expr: _col1
                  type: bigint
            outputColumnNames: _col0, _col1
            File Output Operator
              compressed: false
              GlobalTableId: 0
#### A masked pattern was here ####
              NumFilesPerFileSink: 1
              table:
                  input format: org.apache.hadoop.mapred.SequenceFileInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
                  properties:
                    columns _col0,_col1
                    columns.types string,bigint
                    escape.delim \
              TotalFiles: 1
              GatherStats: false
              MultiFileSpray: false

  Stage: Stage-3
    Map Reduce
      Alias -> Map Operator Tree:
#### A masked pattern was here ####
          TableScan
            GatherStats: false
            Union
              Select Operator
                expressions:
                      expr: _col0
                      type: string
                      expr: _col1
                      type: bigint
                outputColumnNames: _col0, _col1
                Reduce Output Operator
                  key expressions:
                        expr: _col0
                        type: string
                        expr: _col1
                        type: bigint
                  sort order: ++
                  tag: -1
                  value expressions:
                        expr: _col0
                        type: string
                        expr: _col1
                        type: bigint
        null-subquery1-subquery1:s-subquery1-subquery1:src2 
          TableScan
            alias: src2
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Select Operator
                expressions:
                      expr: key
                      type: string
                      expr: count
                      type: bigint
                outputColumnNames: _col0, _col1
                Union
                  Select Operator
                    expressions:
                          expr: _col0
                          type: string
                          expr: _col1
                          type: bigint
                    outputColumnNames: _col0, _col1
                    Reduce Output Operator
                      key expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
                      sort order: ++
                      tag: -1
                      value expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
        null-subquery1-subquery2:s-subquery1-subquery2:src3 
          TableScan
            alias: src3
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (key < 10)
                  type: boolean
              Select Operator
                expressions:
                      expr: key
                      type: string
                      expr: count
                      type: bigint
                outputColumnNames: _col0, _col1
                Union
                  Select Operator
                    expressions:
                          expr: _col0
                          type: string
                          expr: _col1
                          type: bigint
                    outputColumnNames: _col0, _col1
                    Reduce Output Operator
                      key expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
                      sort order: ++
                      tag: -1
                      value expressions:
                            expr: _col0
                            type: string
                            expr: _col1
                            type: bigint
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: -mr-10003
            input format: org.apache.hadoop.mapred.SequenceFileInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
            properties:
              columns _col0,_col1
              columns.types string,bigint
              escape.delim \

              input format: org.apache.hadoop.mapred.SequenceFileInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
              properties:
                columns _col0,_col1
                columns.types string,bigint
                escape.delim \
#### A masked pattern was here ####
          Partition
            base file name: src2
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src2
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src2 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src2
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src2 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src2
            name: default.src2
#### A masked pattern was here ####
          Partition
            base file name: src3
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns key,count
              columns.types string:bigint
#### A masked pattern was here ####
              name default.src3
              numFiles 1
              numPartitions 0
              numRows 309
              rawDataSize 1482
              serialization.ddl struct src3 { string key, i64 count}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 1791
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns key,count
                columns.types string:bigint
#### A masked pattern was here ####
                name default.src3
                numFiles 1
                numPartitions 0
                numRows 309
                rawDataSize 1482
                serialization.ddl struct src3 { string key, i64 count}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 1791
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src3
            name: default.src3
      Truncated Path -> Alias:
        /src2 [null-subquery1-subquery1:s-subquery1-subquery1:src2]
        /src3 [null-subquery1-subquery2:s-subquery1-subquery2:src3]
#### A masked pattern was here ####
      Needs Tagging: false
      Reduce Operator Tree:
        Extract
          File Output Operator
            compressed: false
            GlobalTableId: 0
#### A masked pattern was here ####
            NumFilesPerFileSink: 1
#### A masked pattern was here ####
            table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                properties:
                  columns _col0,_col1
                  columns.types string:bigint
                  escape.delim \
                  hive.serialization.extend.nesting.levels true
                  serialization.format 1
            TotalFiles: 1
            GatherStats: false
            MultiFileSpray: false

  Stage: Stage-0
    Fetch Operator
      limit: -1

PREHOOK: query: select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select a.key as key, count(1) as count from src4 a join src5 b on a.key=b.key where a.key < 10 group by a.key )s order by s.key ASC, s.count ASC
PREHOOK: type: QUERY
PREHOOK: Input: default@src2
PREHOOK: Input: default@src3
PREHOOK: Input: default@src4
PREHOOK: Input: default@src5
#### A masked pattern was here ####
POSTHOOK: query: select s.key, s.count from ( select key, count from src2 where key < 10 union all select key, count from src3 where key < 10 union all select a.key as key, count(1) as count from src4 a join src5 b on a.key=b.key where a.key < 10 group by a.key )s order by s.key ASC, s.count ASC
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src2
POSTHOOK: Input: default@src3
POSTHOOK: Input: default@src4
POSTHOOK: Input: default@src5
#### A masked pattern was here ####
0	1
0	3
0	3
2	1
2	1
2	1
4	1
4	1
4	1
5	1
5	3
5	3
8	1
8	1
8	1
9	1
9	1
9	1