PREHOOK: query: CREATE TABLE src_union_1 (key int, value string) PARTITIONED BY (ds string)
PREHOOK: type: CREATETABLE
POSTHOOK: query: CREATE TABLE src_union_1 (key int, value string) PARTITIONED BY (ds string)
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: default@src_union_1
PREHOOK: query: CREATE INDEX src_union_1_key_idx ON TABLE src_union_1(key) AS 'COMPACT' WITH DEFERRED REBUILD
PREHOOK: type: CREATEINDEX
POSTHOOK: query: CREATE INDEX src_union_1_key_idx ON TABLE src_union_1(key) AS 'COMPACT' WITH DEFERRED REBUILD
POSTHOOK: type: CREATEINDEX
POSTHOOK: Output: default@default__src_union_1_src_union_1_key_idx__
PREHOOK: query: CREATE TABLE src_union_2 (key int, value string) PARTITIONED BY (ds string, part_1 string)
PREHOOK: type: CREATETABLE
POSTHOOK: query: CREATE TABLE src_union_2 (key int, value string) PARTITIONED BY (ds string, part_1 string)
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: default@src_union_2
PREHOOK: query: CREATE INDEX src_union_2_key_idx ON TABLE src_union_2(key) AS 'COMPACT' WITH DEFERRED REBUILD
PREHOOK: type: CREATEINDEX
POSTHOOK: query: CREATE INDEX src_union_2_key_idx ON TABLE src_union_2(key) AS 'COMPACT' WITH DEFERRED REBUILD
POSTHOOK: type: CREATEINDEX
POSTHOOK: Output: default@default__src_union_2_src_union_2_key_idx__
PREHOOK: query: CREATE TABLE src_union_3(key int, value string) PARTITIONED BY (ds string, part_1 string, part_2 string)
PREHOOK: type: CREATETABLE
POSTHOOK: query: CREATE TABLE src_union_3(key int, value string) PARTITIONED BY (ds string, part_1 string, part_2 string)
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: default@src_union_3
PREHOOK: query: CREATE INDEX src_union_3_key_idx ON TABLE src_union_3(key) AS 'COMPACT' WITH DEFERRED REBUILD
PREHOOK: type: CREATEINDEX
POSTHOOK: query: CREATE INDEX src_union_3_key_idx ON TABLE src_union_3(key) AS 'COMPACT' WITH DEFERRED REBUILD
POSTHOOK: type: CREATEINDEX
POSTHOOK: Output: default@default__src_union_3_src_union_3_key_idx__
ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_1))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL value)) (TOK_SELEXPR (TOK_TABLE_OR_COL ds))) (TOK_WHERE (and (= (TOK_TABLE_OR_COL key) 86) (= (TOK_TABLE_OR_COL ds) '1')))))

STAGE DEPENDENCIES:
  Stage-3 is a root stage
  Stage-2 depends on stages: Stage-3
  Stage-1 depends on stages: Stage-2
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-3
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_1_src_union_1_key_idx__
          TableScan
            alias: default__src_union_1_src_union_1_key_idx__
            filterExpr: expr: ((key = 86) and (ds = '1')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-2
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: ((key = 86) and (ds = '1')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                File Output Operator
                  compressed: false
                  GlobalTableId: 0
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_2))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL value)) (TOK_SELEXPR (TOK_TABLE_OR_COL ds))) (TOK_WHERE (and (= (TOK_TABLE_OR_COL key) 86) (= (TOK_TABLE_OR_COL ds) '2')))))

STAGE DEPENDENCIES:
  Stage-3 is a root stage
  Stage-2 depends on stages: Stage-3
  Stage-1 depends on stages: Stage-2
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-3
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_2_src_union_2_key_idx__
          TableScan
            alias: default__src_union_2_src_union_2_key_idx__
            filterExpr: expr: ((key = 86) and (ds = '2')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-2
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: ((key = 86) and (ds = '2')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                File Output Operator
                  compressed: false
                  GlobalTableId: 0
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_3))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL value)) (TOK_SELEXPR (TOK_TABLE_OR_COL ds))) (TOK_WHERE (and (= (TOK_TABLE_OR_COL key) 86) (= (TOK_TABLE_OR_COL ds) '3')))))

STAGE DEPENDENCIES:
  Stage-3 is a root stage
  Stage-2 depends on stages: Stage-3
  Stage-1 depends on stages: Stage-2
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-3
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_3_src_union_3_key_idx__
          TableScan
            alias: default__src_union_3_src_union_3_key_idx__
            filterExpr: expr: ((key = 86) and (ds = '3')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-2
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: ((key = 86) and (ds = '3')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                File Output Operator
                  compressed: false
                  GlobalTableId: 0
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
  Stage: Stage-0
    Fetch Operator
      limit: -1
86 val_86 1
86 val_86 2
86 val_86 2
86 val_86 3
86 val_86 3

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_1))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_FUNCTION count 1))) (TOK_WHERE (= (TOK_TABLE_OR_COL ds) '1'))))

STAGE DEPENDENCIES:
  Stage-3 is a root stage
  Stage-2 depends on stages: Stage-3
  Stage-1 depends on stages: Stage-2
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-3
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_1_src_union_1_key_idx__
          TableScan
            alias: default__src_union_1_src_union_1_key_idx__
            filterExpr: expr: (ds = '1') type: boolean
            Select Operator
              expressions:
                expr: _bucketname type: string
                expr: _offsets type: array
              outputColumnNames: _col0, _col1
              File Output Operator
                compressed: false
                GlobalTableId: 1
                table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-2
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: (ds = '1') type: boolean
            Select Operator
              Group By Operator
                aggregations: expr: count(1)
                bucketGroup: false
                mode: hash
                outputColumnNames: _col0
                Reduce Output Operator
                  sort order:
                  tag: -1
                  value expressions: expr: _col0 type: bigint
      Reduce Operator Tree:
        Group By Operator
          aggregations: expr: count(VALUE._col0)
          bucketGroup: false
          mode: mergepartial
          outputColumnNames: _col0
          Select Operator
            expressions: expr: _col0 type: bigint
            outputColumnNames: _col0
            File Output Operator
              compressed: false
              GlobalTableId: 0
              table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_2))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_FUNCTION count 1))) (TOK_WHERE (= (TOK_TABLE_OR_COL ds) '2'))))

STAGE DEPENDENCIES:
  Stage-3 is a root stage
  Stage-2 depends on stages: Stage-3
  Stage-1 depends on stages: Stage-2
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-3
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_2_src_union_2_key_idx__
          TableScan
            alias: default__src_union_2_src_union_2_key_idx__
            filterExpr: expr: (ds = '2') type: boolean
            Select Operator
              expressions:
                expr: _bucketname type: string
                expr: _offsets type: array
              outputColumnNames: _col0, _col1
              File Output Operator
                compressed: false
                GlobalTableId: 1
                table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-2
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: (ds = '2') type: boolean
            Select Operator
              Group By Operator
                aggregations: expr: count(1)
                bucketGroup: false
                mode: hash
                outputColumnNames: _col0
                Reduce Output Operator
                  sort order:
                  tag: -1
                  value expressions: expr: _col0 type: bigint
      Reduce Operator Tree:
        Group By Operator
          aggregations: expr: count(VALUE._col0)
          bucketGroup: false
          mode: mergepartial
          outputColumnNames: _col0
          Select Operator
            expressions: expr: _col0 type: bigint
            outputColumnNames: _col0
            File Output Operator
              compressed: false
              GlobalTableId: 0
              table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_3))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_FUNCTION count 1))) (TOK_WHERE (= (TOK_TABLE_OR_COL ds) '3'))))

STAGE DEPENDENCIES:
  Stage-3 is a root stage
  Stage-2 depends on stages: Stage-3
  Stage-1 depends on stages: Stage-2
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-3
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_3_src_union_3_key_idx__
          TableScan
            alias: default__src_union_3_src_union_3_key_idx__
            filterExpr: expr: (ds = '3') type: boolean
            Select Operator
              expressions:
                expr: _bucketname type: string
                expr: _offsets type: array
              outputColumnNames: _col0, _col1
              File Output Operator
                compressed: false
                GlobalTableId: 1
                table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-2
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: (ds = '3') type: boolean
            Select Operator
              Group By Operator
                aggregations: expr: count(1)
                bucketGroup: false
                mode: hash
                outputColumnNames: _col0
                Reduce Output Operator
                  sort order:
                  tag: -1
                  value expressions: expr: _col0 type: bigint
      Reduce Operator Tree:
        Group By Operator
          aggregations: expr: count(VALUE._col0)
          bucketGroup: false
          mode: mergepartial
          outputColumnNames: _col0
          Select Operator
            expressions: expr: _col0 type: bigint
            outputColumnNames: _col0
            File Output Operator
              compressed: false
              GlobalTableId: 0
              table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1
500
1000
1000

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_view))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL value)) (TOK_SELEXPR (TOK_TABLE_OR_COL ds))) (TOK_WHERE (AND (= (TOK_TABLE_OR_COL key) 86) (= (TOK_TABLE_OR_COL ds) '1')))))

STAGE DEPENDENCIES:
  Stage-5 is a root stage
  Stage-4 depends on stages: Stage-5
  Stage-1 depends on stages: Stage-4
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-5
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_1_src_union_1_key_idx__
          TableScan
            alias: default__src_union_1_src_union_1_key_idx__
            filterExpr: expr: ((key = 86) and (ds = '1')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-4
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_view-subquery1-subquery1:subq-subquery1-subquery1:src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: ((key = 86) and (ds = '1')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
        src_union_view-subquery1-subquery2:subq-subquery1-subquery2:src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: ((key = 86) and (ds = '1')) type: boolean
            Filter Operator
              predicate: expr: ((key = 86) and (ds = '1')) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
        src_union_view-subquery2:subq-subquery2:src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: ((key = 86) and (ds = '1')) type: boolean
            Filter Operator
              predicate: expr: ((key = 86) and (ds = '1')) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_view))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL value)) (TOK_SELEXPR (TOK_TABLE_OR_COL ds))) (TOK_WHERE (AND (= (TOK_TABLE_OR_COL key) 86) (= (TOK_TABLE_OR_COL ds) '2')))))

STAGE DEPENDENCIES:
  Stage-5 is a root stage
  Stage-4 depends on stages: Stage-5
  Stage-1 depends on stages: Stage-4
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-5
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_2_src_union_2_key_idx__
          TableScan
            alias: default__src_union_2_src_union_2_key_idx__
            filterExpr: expr: ((key = 86) and (ds = '2')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-4
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_view-subquery1-subquery1:subq-subquery1-subquery1:src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: ((key = 86) and (ds = '2')) type: boolean
            Filter Operator
              predicate: expr: ((key = 86) and (ds = '2')) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
        src_union_view-subquery1-subquery2:subq-subquery1-subquery2:src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: ((key = 86) and (ds = '2')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
        src_union_view-subquery2:subq-subquery2:src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: ((key = 86) and (ds = '2')) type: boolean
            Filter Operator
              predicate: expr: ((key = 86) and (ds = '2')) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_view))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL value)) (TOK_SELEXPR (TOK_TABLE_OR_COL ds))) (TOK_WHERE (AND (= (TOK_TABLE_OR_COL key) 86) (= (TOK_TABLE_OR_COL ds) '3')))))

STAGE DEPENDENCIES:
  Stage-5 is a root stage
  Stage-4 depends on stages: Stage-5
  Stage-1 depends on stages: Stage-4
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-5
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_3_src_union_3_key_idx__
          TableScan
            alias: default__src_union_3_src_union_3_key_idx__
            filterExpr: expr: ((key = 86) and (ds = '3')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-4
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_view-subquery1-subquery1:subq-subquery1-subquery1:src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: ((key = 86) and (ds = '3')) type: boolean
            Filter Operator
              predicate: expr: ((key = 86) and (ds = '3')) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
        src_union_view-subquery1-subquery2:subq-subquery1-subquery2:src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: ((key = 86) and (ds = '3')) type: boolean
            Filter Operator
              predicate: expr: ((key = 86) and (ds = '3')) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
        src_union_view-subquery2:subq-subquery2:src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: ((key = 86) and (ds = '3')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_view))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL value)) (TOK_SELEXPR (TOK_TABLE_OR_COL ds))) (TOK_WHERE (AND (= (TOK_TABLE_OR_COL key) 86) (TOK_FUNCTION TOK_ISNOTNULL (TOK_TABLE_OR_COL ds)))) (TOK_ORDERBY (TOK_TABSORTCOLNAMEASC (TOK_TABLE_OR_COL ds)))))

STAGE DEPENDENCIES:
  Stage-5 is a root stage
  Stage-4 depends on stages: Stage-5
  Stage-1 depends on stages: Stage-4, Stage-6, Stage-8
  Stage-7 is a root stage
  Stage-6 depends on stages: Stage-7
  Stage-9 is a root stage
  Stage-8 depends on stages: Stage-9
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-5
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_2_src_union_2_key_idx__
          TableScan
            alias: default__src_union_2_src_union_2_key_idx__
            filterExpr: expr: (key = 86) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-4
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_view-subquery1-subquery1:subq-subquery1-subquery1:src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: ((key = 86) and ds is not null) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    Reduce Output Operator
                      key expressions: expr: _col2 type: string
                      sort order: +
                      tag: -1
                      value expressions:
                        expr: _col0 type: int
                        expr: _col1 type: string
                        expr: _col2 type: string
        src_union_view-subquery1-subquery2:subq-subquery1-subquery2:src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: ((key = 86) and ds is not null) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    Reduce Output Operator
                      key expressions: expr: _col2 type: string
                      sort order: +
                      tag: -1
                      value expressions:
                        expr: _col0 type: int
                        expr: _col1 type: string
                        expr: _col2 type: string
        src_union_view-subquery2:subq-subquery2:src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: ((key = 86) and ds is not null) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    Reduce Output Operator
                      key expressions: expr: _col2 type: string
                      sort order: +
                      tag: -1
                      value expressions:
                        expr: _col0 type: int
                        expr: _col1 type: string
                        expr: _col2 type: string
      Reduce Operator Tree:
        Extract
          File Output Operator
            compressed: false
            GlobalTableId: 0
            table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-7
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_1_src_union_1_key_idx__
          TableScan
            alias: default__src_union_1_src_union_1_key_idx__
            filterExpr: expr: (key = 86) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-6
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-9
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_3_src_union_3_key_idx__
          TableScan
            alias: default__src_union_3_src_union_3_key_idx__
            filterExpr: expr: (key = 86) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-8
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-0
    Fetch Operator
      limit: -1
86 val_86 1
86 val_86 2
86 val_86 2
86 val_86 3
86 val_86 3
86 val_86 1
86 val_86 2
86 val_86 2
86 val_86 3
86 val_86 3

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_view))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_FUNCTION count 1))) (TOK_WHERE (= (TOK_TABLE_OR_COL ds) '1'))))

STAGE DEPENDENCIES:
  Stage-5 is a root stage
  Stage-4 depends on stages: Stage-5
  Stage-1 depends on stages: Stage-4
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-5
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_1_src_union_1_key_idx__
          TableScan
            alias: default__src_union_1_src_union_1_key_idx__
            filterExpr: expr: (ds = '1') type: boolean
            Select Operator
              expressions:
                expr: _bucketname type: string
                expr: _offsets type: array
              outputColumnNames: _col0, _col1
              File Output Operator
                compressed: false
                GlobalTableId: 1
                table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-4
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_view-subquery1-subquery1:subq-subquery1-subquery1:src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: (ds = '1') type: boolean
            Select Operator
              expressions:
                expr: key type: int
                expr: value type: string
                expr: ds type: string
              outputColumnNames: _col0, _col1, _col2
              Union
                Select Operator
                  Group By Operator
                    aggregations: expr: count(1)
                    bucketGroup: false
                    mode: hash
                    outputColumnNames: _col0
                    Reduce Output Operator
                      sort order:
                      tag: -1
                      value expressions: expr: _col0 type: bigint
        src_union_view-subquery1-subquery2:subq-subquery1-subquery2:src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: (ds = '1') type: boolean
            Filter Operator
              predicate: expr: (ds = '1') type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    Group By Operator
                      aggregations: expr: count(1)
                      bucketGroup: false
                      mode: hash
                      outputColumnNames: _col0
                      Reduce Output Operator
                        sort order:
                        tag: -1
                        value expressions: expr: _col0 type: bigint
        src_union_view-subquery2:subq-subquery2:src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: (ds = '1') type: boolean
            Filter Operator
              predicate: expr: (ds = '1') type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    Group By Operator
                      aggregations: expr: count(1)
                      bucketGroup: false
                      mode: hash
                      outputColumnNames: _col0
                      Reduce Output Operator
                        sort order:
                        tag: -1
                        value expressions: expr: _col0 type: bigint
      Reduce Operator Tree:
        Group By Operator
          aggregations: expr: count(VALUE._col0)
          bucketGroup: false
          mode: mergepartial
          outputColumnNames: _col0
          Select Operator
            expressions: expr: _col0 type: bigint
            outputColumnNames: _col0
            File Output Operator
              compressed: false
              GlobalTableId: 0
              table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_view))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_FUNCTION count 1))) (TOK_WHERE (= (TOK_TABLE_OR_COL ds) '2'))))

STAGE DEPENDENCIES:
  Stage-5 is a root stage
  Stage-4 depends on stages: Stage-5
  Stage-1 depends on stages: Stage-4
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-5
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_2_src_union_2_key_idx__
          TableScan
            alias: default__src_union_2_src_union_2_key_idx__
            filterExpr: expr: (ds = '2') type: boolean
            Select Operator
              expressions:
                expr: _bucketname type: string
                expr: _offsets type: array
              outputColumnNames: _col0, _col1
              File Output Operator
                compressed: false
                GlobalTableId: 1
                table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-4
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_view-subquery1-subquery1:subq-subquery1-subquery1:src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: (ds = '2') type: boolean
            Filter Operator
              predicate: expr: (ds = '2') type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    Group By Operator
                      aggregations: expr: count(1)
                      bucketGroup: false
                      mode: hash
                      outputColumnNames: _col0
                      Reduce Output Operator
                        sort order:
                        tag: -1
                        value expressions: expr: _col0 type: bigint
        src_union_view-subquery1-subquery2:subq-subquery1-subquery2:src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: (ds = '2') type: boolean
            Select Operator
              expressions:
                expr: key type: int
                expr: value type: string
                expr: ds type: string
              outputColumnNames: _col0, _col1, _col2
              Union
                Select Operator
                  Group By Operator
                    aggregations: expr: count(1)
                    bucketGroup: false
                    mode: hash
                    outputColumnNames: _col0
                    Reduce Output Operator
                      sort order:
                      tag: -1
                      value expressions: expr: _col0 type: bigint
        src_union_view-subquery2:subq-subquery2:src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: (ds = '2') type: boolean
            Filter Operator
              predicate: expr: (ds = '2') type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    Group By Operator
                      aggregations: expr: count(1)
                      bucketGroup: false
                      mode: hash
                      outputColumnNames: _col0
                      Reduce Output Operator
                        sort order:
                        tag: -1
                        value expressions: expr: _col0 type: bigint
      Reduce Operator Tree:
        Group By Operator
          aggregations: expr: count(VALUE._col0)
          bucketGroup: false
          mode: mergepartial
          outputColumnNames: _col0
          Select Operator
            expressions: expr: _col0 type: bigint
            outputColumnNames: _col0
            File Output Operator
              compressed: false
              GlobalTableId: 0
              table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_view))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_FUNCTION count 1))) (TOK_WHERE (= (TOK_TABLE_OR_COL ds) '3'))))

STAGE DEPENDENCIES:
  Stage-5 is a root stage
  Stage-4 depends on stages: Stage-5
  Stage-1 depends on stages: Stage-4
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-5
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_3_src_union_3_key_idx__
          TableScan
            alias: default__src_union_3_src_union_3_key_idx__
            filterExpr: expr: (ds = '3') type: boolean
            Select Operator
              expressions:
                expr: _bucketname type: string
                expr: _offsets type: array
              outputColumnNames: _col0, _col1
              File Output Operator
                compressed: false
                GlobalTableId: 1
                table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-4
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_view-subquery1-subquery1:subq-subquery1-subquery1:src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: (ds = '3') type: boolean
            Filter Operator
              predicate: expr: (ds = '3') type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    Group By Operator
                      aggregations: expr: count(1)
                      bucketGroup: false
                      mode: hash
                      outputColumnNames: _col0
                      Reduce Output Operator
                        sort order:
                        tag: -1
                        value expressions: expr: _col0 type: bigint
        src_union_view-subquery1-subquery2:subq-subquery1-subquery2:src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: (ds = '3') type: boolean
            Filter Operator
              predicate: expr: (ds = '3') type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    Group By Operator
                      aggregations: expr: count(1)
                      bucketGroup: false
                      mode: hash
                      outputColumnNames: _col0
                      Reduce Output Operator
                        sort order:
                        tag: -1
                        value expressions: expr: _col0 type: bigint
        src_union_view-subquery2:subq-subquery2:src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: (ds = '3') type: boolean
            Select Operator
              expressions:
                expr: key type: int
                expr: value type: string
                expr: ds type: string
              outputColumnNames: _col0, _col1, _col2
              Union
                Select Operator
                  Group By Operator
                    aggregations: expr: count(1)
                    bucketGroup: false
                    mode: hash
                    outputColumnNames: _col0
                    Reduce Output Operator
                      sort order:
                      tag: -1
                      value expressions: expr: _col0 type: bigint
      Reduce Operator Tree:
        Group By Operator
          aggregations: expr: count(VALUE._col0)
          bucketGroup: false
          mode: mergepartial
          outputColumnNames: _col0
          Select Operator
            expressions: expr: _col0 type: bigint
            outputColumnNames: _col0
            File Output Operator
              compressed: false
              GlobalTableId: 0
              table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1
500
1000
1000

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_view))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_TABLE_OR_COL key)) (TOK_SELEXPR (TOK_TABLE_OR_COL value)) (TOK_SELEXPR (TOK_TABLE_OR_COL ds))) (TOK_WHERE (AND (= (TOK_TABLE_OR_COL key) 86) (= (TOK_TABLE_OR_COL ds) '4')))))

STAGE DEPENDENCIES:
  Stage-5 is a root stage
  Stage-4 depends on stages: Stage-5
  Stage-1 depends on stages: Stage-4
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-5
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_3_src_union_3_key_idx__
          TableScan
            alias: default__src_union_3_src_union_3_key_idx__
            filterExpr: expr: ((key = 86) and (ds = '4')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: _bucketname type: string
                  expr: _offsets type: array
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 1
                  table:
                    input format: org.apache.hadoop.mapred.TextInputFormat
                    output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-4
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_view-subquery1-subquery1:subq-subquery1-subquery1:src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: ((key = 86) and (ds = '4')) type: boolean
            Filter Operator
              predicate: expr: ((key = 86) and (ds = '4')) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
        src_union_view-subquery1-subquery2:subq-subquery1-subquery2:src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: ((key = 86) and (ds = '4')) type: boolean
            Filter Operator
              predicate: expr: ((key = 86) and (ds = '4')) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
        src_union_view-subquery2:subq-subquery2:src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: ((key = 86) and (ds = '4')) type: boolean
            Filter Operator
              predicate: expr: (key = 86) type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    expressions:
                      expr: _col0 type: int
                      expr: _col1 type: string
                      expr: _col2 type: string
                    outputColumnNames: _col0, _col1, _col2
                    File Output Operator
                      compressed: false
                      GlobalTableId: 0
                      table:
                        input format: org.apache.hadoop.mapred.TextInputFormat
                        output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1
86 val_86 4

ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME src_union_view))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_FUNCTION count 1))) (TOK_WHERE (= (TOK_TABLE_OR_COL ds) '4'))))

STAGE DEPENDENCIES:
  Stage-5 is a root stage
  Stage-4 depends on stages: Stage-5
  Stage-1 depends on stages: Stage-4
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-5
    Map Reduce
      Alias -> Map Operator Tree:
        default__src_union_3_src_union_3_key_idx__
          TableScan
            alias: default__src_union_3_src_union_3_key_idx__
            filterExpr: expr: (ds = '4') type: boolean
            Select Operator
              expressions:
                expr: _bucketname type: string
                expr: _offsets type: array
              outputColumnNames: _col0, _col1
              File Output Operator
                compressed: false
                GlobalTableId: 1
                table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-4
    Move Operator
      files:
        hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        src_union_view-subquery1-subquery1:subq-subquery1-subquery1:src_union_1
          TableScan
            alias: src_union_1
            filterExpr: expr: (ds = '4') type: boolean
            Filter Operator
              predicate: expr: (ds = '4') type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    Group By Operator
                      aggregations: expr: count(1)
                      bucketGroup: false
                      mode: hash
                      outputColumnNames: _col0
                      Reduce Output Operator
                        sort order:
                        tag: -1
                        value expressions: expr: _col0 type: bigint
        src_union_view-subquery1-subquery2:subq-subquery1-subquery2:src_union_2
          TableScan
            alias: src_union_2
            filterExpr: expr: (ds = '4') type: boolean
            Filter Operator
              predicate: expr: (ds = '4') type: boolean
              Select Operator
                expressions:
                  expr: key type: int
                  expr: value type: string
                  expr: ds type: string
                outputColumnNames: _col0, _col1, _col2
                Union
                  Select Operator
                    Group By Operator
                      aggregations: expr: count(1)
                      bucketGroup: false
                      mode: hash
                      outputColumnNames: _col0
                      Reduce Output Operator
                        sort order:
                        tag: -1
                        value expressions: expr: _col0 type: bigint
        src_union_view-subquery2:subq-subquery2:src_union_3
          TableScan
            alias: src_union_3
            filterExpr: expr: (ds = '4') type: boolean
            Select Operator
              expressions:
                expr: key type: int
                expr: value type: string
                expr: ds type: string
              outputColumnNames: _col0, _col1, _col2
              Union
                Select Operator
                  Group By Operator
                    aggregations: expr: count(1)
                    bucketGroup: false
                    mode: hash
                    outputColumnNames: _col0
                    Reduce Output Operator
                      sort order:
                      tag: -1
                      value expressions: expr: _col0 type: bigint
      Reduce Operator Tree:
        Group By Operator
          aggregations: expr: count(VALUE._col0)
          bucketGroup: false
          mode: mergepartial
          outputColumnNames: _col0
          Select Operator
            expressions: expr: _col0 type: bigint
            outputColumnNames: _col0
            File Output Operator
              compressed: false
              GlobalTableId: 0
              table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat

  Stage: Stage-0
    Fetch Operator
      limit: -1
500