PREHOOK: query: -- SORT_QUERY_RESULTS
create table insert1(key int, value string) stored as textfile
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:default
PREHOOK: Output: default@insert1
POSTHOOK: query: -- SORT_QUERY_RESULTS
create table insert1(key int, value string) stored as textfile
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:default
POSTHOOK: Output: default@insert1
PREHOOK: query: create table insert2(key int, value string) stored as textfile
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:default
PREHOOK: Output: default@insert2
POSTHOOK: query: create table insert2(key int, value string) stored as textfile
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:default
POSTHOOK: Output: default@insert2
PREHOOK: query: insert overwrite table insert1 select a.key, a.value from insert2 a WHERE (a.key=-1)
PREHOOK: type: QUERY
PREHOOK: Input: default@insert2
PREHOOK: Output: default@insert1
POSTHOOK: query: insert overwrite table insert1 select a.key, a.value from insert2 a WHERE (a.key=-1)
POSTHOOK: type: QUERY
POSTHOOK: Input: default@insert2
POSTHOOK: Output: default@insert1
POSTHOOK: Lineage: insert1.key SIMPLE [(insert2)a.FieldSchema(name:key, type:int, comment:null), ]
POSTHOOK: Lineage: insert1.value SIMPLE [(insert2)a.FieldSchema(name:value, type:string, comment:null), ]
PREHOOK: query: explain insert into table insert1 select a.key, a.value from insert2 a WHERE (a.key=-1)
PREHOOK: type: QUERY
POSTHOOK: query: explain insert into table insert1 select a.key, a.value from insert2 a WHERE (a.key=-1)
POSTHOOK: type: QUERY
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-7 depends on stages: Stage-1 , consists of Stage-4, Stage-3, Stage-5
  Stage-4
  Stage-0 depends on stages: Stage-4, Stage-3, Stage-6
  Stage-2 depends on stages: Stage-0
  Stage-3
  Stage-5
  Stage-6 depends on stages: Stage-5

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: a
            Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
            Filter Operator
              predicate: (key = -1) (type: boolean)
              Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
              Select Operator
                expressions: -1 (type: int), value (type: string)
                outputColumnNames: _col0, _col1
                Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                File Output Operator
                  compressed: false
                  Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                  table:
                      input format: org.apache.hadoop.mapred.TextInputFormat
                      output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                      serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                      name: default.insert1

  Stage: Stage-7
    Conditional Operator

  Stage: Stage-4
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-0
    Move Operator
      tables:
          replace: false
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.insert1

  Stage: Stage-2
    Stats-Aggr Operator

  Stage: Stage-3
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: default.insert1

  Stage: Stage-5
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: default.insert1

  Stage: Stage-6
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

PREHOOK: query: explain insert into table INSERT1 select a.key, a.value from insert2 a WHERE (a.key=-1)
PREHOOK: type: QUERY
POSTHOOK: query: explain insert into table INSERT1 select a.key, a.value from insert2 a WHERE (a.key=-1)
POSTHOOK: type: QUERY
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-7 depends on stages: Stage-1 , consists of Stage-4, Stage-3, Stage-5
  Stage-4
  Stage-0 depends on stages: Stage-4, Stage-3, Stage-6
  Stage-2 depends on stages: Stage-0
  Stage-3
  Stage-5
  Stage-6 depends on stages: Stage-5

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: a
            Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
            Filter Operator
              predicate: (key = -1) (type: boolean)
              Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
              Select Operator
                expressions: -1 (type: int), value (type: string)
                outputColumnNames: _col0, _col1
                Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                File Output Operator
                  compressed: false
                  Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                  table:
                      input format: org.apache.hadoop.mapred.TextInputFormat
                      output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                      serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                      name: default.insert1

  Stage: Stage-7
    Conditional Operator

  Stage: Stage-4
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-0
    Move Operator
      tables:
          replace: false
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.insert1

  Stage: Stage-2
    Stats-Aggr Operator

  Stage: Stage-3
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: default.insert1

  Stage: Stage-5
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: default.insert1

  Stage: Stage-6
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

PREHOOK: query: -- HIVE-3465
create database x
PREHOOK: type: CREATEDATABASE
PREHOOK: Output: database:x
POSTHOOK: query: -- HIVE-3465
create database x
POSTHOOK: type: CREATEDATABASE
POSTHOOK: Output: database:x
PREHOOK: query: create table x.insert1(key int, value string) stored as textfile
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:x
PREHOOK: Output: x@insert1
POSTHOOK: query: create table x.insert1(key int, value string) stored as textfile
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:x
POSTHOOK: Output: x@insert1
PREHOOK: query: explain insert into table x.INSERT1 select a.key, a.value from insert2 a WHERE (a.key=-1)
PREHOOK: type: QUERY
POSTHOOK: query: explain insert into table x.INSERT1 select a.key, a.value from insert2 a WHERE (a.key=-1)
POSTHOOK: type: QUERY
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-7 depends on stages: Stage-1 , consists of Stage-4, Stage-3, Stage-5
  Stage-4
  Stage-0 depends on stages: Stage-4, Stage-3, Stage-6
  Stage-2 depends on stages: Stage-0
  Stage-3
  Stage-5
  Stage-6 depends on stages: Stage-5

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: a
            Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
            Filter Operator
              predicate: (key = -1) (type: boolean)
              Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
              Select Operator
                expressions: -1 (type: int), value (type: string)
                outputColumnNames: _col0, _col1
                Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                File Output Operator
                  compressed: false
                  Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                  table:
                      input format: org.apache.hadoop.mapred.TextInputFormat
                      output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                      serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                      name: x.insert1

  Stage: Stage-7
    Conditional Operator

  Stage: Stage-4
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-0
    Move Operator
      tables:
          replace: false
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: x.insert1

  Stage: Stage-2
    Stats-Aggr Operator

  Stage: Stage-3
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: x.insert1

  Stage: Stage-5
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: x.insert1

  Stage: Stage-6
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

PREHOOK: query: explain insert into table default.INSERT1 select a.key, a.value from insert2 a WHERE (a.key=-1)
PREHOOK: type: QUERY
POSTHOOK: query: explain insert into table default.INSERT1 select a.key, a.value from insert2 a WHERE (a.key=-1)
POSTHOOK: type: QUERY
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-7 depends on stages: Stage-1 , consists of Stage-4, Stage-3, Stage-5
  Stage-4
  Stage-0 depends on stages: Stage-4, Stage-3, Stage-6
  Stage-2 depends on stages: Stage-0
  Stage-3
  Stage-5
  Stage-6 depends on stages: Stage-5

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: a
            Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
            Filter Operator
              predicate: (key = -1) (type: boolean)
              Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
              Select Operator
                expressions: -1 (type: int), value (type: string)
                outputColumnNames: _col0, _col1
                Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                File Output Operator
                  compressed: false
                  Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                  table:
                      input format: org.apache.hadoop.mapred.TextInputFormat
                      output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                      serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                      name: default.insert1

  Stage: Stage-7
    Conditional Operator

  Stage: Stage-4
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-0
    Move Operator
      tables:
          replace: false
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.insert1

  Stage: Stage-2
    Stats-Aggr Operator

  Stage: Stage-3
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: default.insert1

  Stage: Stage-5
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: default.insert1

  Stage: Stage-6
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

PREHOOK: query: explain from insert2 insert into table insert1 select * where key < 10 insert overwrite table x.insert1 select * where key > 10 and key < 20
PREHOOK: type: QUERY
POSTHOOK: query: explain from insert2 insert into table insert1 select * where key < 10 insert overwrite table x.insert1 select * where key > 10 and key < 20
POSTHOOK: type: QUERY
STAGE DEPENDENCIES:
  Stage-2 is a root stage
  Stage-8 depends on stages: Stage-2 , consists of Stage-5, Stage-4, Stage-6
  Stage-5
  Stage-0 depends on stages: Stage-5, Stage-4, Stage-7
  Stage-3 depends on stages: Stage-0
  Stage-4
  Stage-6
  Stage-7 depends on stages: Stage-6
  Stage-14 depends on stages: Stage-2 , consists of Stage-11, Stage-10, Stage-12
  Stage-11
  Stage-1 depends on stages: Stage-11, Stage-10, Stage-13
  Stage-9 depends on stages: Stage-1
  Stage-10
  Stage-12
  Stage-13 depends on stages: Stage-12

STAGE PLANS:
  Stage: Stage-2
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: insert2
            Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
            Filter Operator
              predicate: (key < 10) (type: boolean)
              Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
              Select Operator
                expressions: key (type: int), value (type: string)
                outputColumnNames: _col0, _col1
                Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                File Output Operator
                  compressed: false
                  Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                  table:
                      input format: org.apache.hadoop.mapred.TextInputFormat
                      output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                      serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                      name: default.insert1
            Filter Operator
              predicate: ((key > 10) and (key < 20)) (type: boolean)
              Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
              Select Operator
                expressions: key (type: int), value (type: string)
                outputColumnNames: _col0, _col1
                Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                File Output Operator
                  compressed: false
                  Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
                  table:
                      input format: org.apache.hadoop.mapred.TextInputFormat
                      output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                      serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                      name: x.insert1

  Stage: Stage-8
    Conditional Operator

  Stage: Stage-5
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-0
    Move Operator
      tables:
          replace: false
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.insert1

  Stage: Stage-3
    Stats-Aggr Operator

  Stage: Stage-4
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: default.insert1

  Stage: Stage-6
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: default.insert1

  Stage: Stage-7
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-14
    Conditional Operator

  Stage: Stage-11
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-1
    Move Operator
      tables:
          replace: true
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: x.insert1

  Stage: Stage-9
    Stats-Aggr Operator

  Stage: Stage-10
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: x.insert1

  Stage: Stage-12
    Map Reduce
      Map Operator Tree:
          TableScan
            File Output Operator
              compressed: false
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  name: x.insert1

  Stage: Stage-13
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

PREHOOK: query: -- HIVE-3676
CREATE DATABASE db2
PREHOOK: type: CREATEDATABASE
PREHOOK: Output: database:db2
POSTHOOK: query: -- HIVE-3676
CREATE DATABASE db2
POSTHOOK: type: CREATEDATABASE
POSTHOOK: Output: database:db2
PREHOOK: query: USE db2
PREHOOK: type: SWITCHDATABASE
PREHOOK: Input: database:db2
POSTHOOK: query: USE db2
POSTHOOK: type: SWITCHDATABASE
POSTHOOK: Input: database:db2
PREHOOK: query: CREATE TABLE result(col1 STRING)
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:db2
PREHOOK: Output: db2@result
POSTHOOK: query: CREATE TABLE result(col1 STRING)
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:db2
POSTHOOK: Output: db2@result
PREHOOK: query: INSERT OVERWRITE TABLE result SELECT 'db2_insert1' FROM default.src LIMIT 1
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: db2@result
POSTHOOK: query: INSERT OVERWRITE TABLE result SELECT 'db2_insert1' FROM default.src LIMIT 1
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: db2@result
POSTHOOK: Lineage: result.col1 SIMPLE []
PREHOOK: query: INSERT INTO TABLE result SELECT 'db2_insert2' FROM default.src LIMIT 1
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: db2@result
POSTHOOK: query: INSERT INTO TABLE result SELECT 'db2_insert2' FROM default.src LIMIT 1
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: db2@result
POSTHOOK: Lineage: result.col1 SIMPLE []
PREHOOK: query: SELECT * FROM result
PREHOOK: type: QUERY
PREHOOK: Input: db2@result
#### A masked pattern was here ####
POSTHOOK: query: SELECT * FROM result
POSTHOOK: type: QUERY
POSTHOOK: Input: db2@result
#### A masked pattern was here ####
db2_insert1
db2_insert2
PREHOOK: query: USE default
PREHOOK: type: SWITCHDATABASE
PREHOOK: Input: database:default
POSTHOOK: query: USE default
POSTHOOK: type: SWITCHDATABASE
POSTHOOK: Input: database:default
PREHOOK: query: CREATE DATABASE db1
PREHOOK: type: CREATEDATABASE
PREHOOK: Output: database:db1
POSTHOOK: query: CREATE DATABASE db1
POSTHOOK: type: CREATEDATABASE
POSTHOOK: Output: database:db1
PREHOOK: query: CREATE TABLE db1.result(col1 STRING)
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:db1
PREHOOK: Output: db1@result
POSTHOOK: query: CREATE TABLE db1.result(col1 STRING)
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:db1
POSTHOOK: Output: db1@result
PREHOOK: query: INSERT OVERWRITE TABLE db1.result SELECT 'db1_insert1' FROM src LIMIT 1
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: db1@result
POSTHOOK: query: INSERT OVERWRITE TABLE db1.result SELECT 'db1_insert1' FROM src LIMIT 1
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: db1@result
POSTHOOK: Lineage: result.col1 SIMPLE []
PREHOOK: query: INSERT INTO TABLE db1.result SELECT 'db1_insert2' FROM src LIMIT 1
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: db1@result
POSTHOOK: query: INSERT INTO TABLE db1.result SELECT 'db1_insert2' FROM src LIMIT 1
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: db1@result
POSTHOOK: Lineage: result.col1 SIMPLE []
PREHOOK: query: SELECT * FROM db1.result
PREHOOK: type: QUERY
PREHOOK: Input: db1@result
#### A masked pattern was here ####
POSTHOOK: query: SELECT * FROM db1.result
POSTHOOK: type: QUERY
POSTHOOK: Input: db1@result
#### A masked pattern was here ####
db1_insert1
db1_insert2