PREHOOK: query: DROP TABLE orc_createas1a
PREHOOK: type: DROPTABLE
POSTHOOK: query: DROP TABLE orc_createas1a
POSTHOOK: type: DROPTABLE
PREHOOK: query: DROP TABLE orc_createas1b
PREHOOK: type: DROPTABLE
POSTHOOK: query: DROP TABLE orc_createas1b
POSTHOOK: type: DROPTABLE
PREHOOK: query: DROP TABLE orc_createas1c
PREHOOK: type: DROPTABLE
POSTHOOK: query: DROP TABLE orc_createas1c
POSTHOOK: type: DROPTABLE
PREHOOK: query: CREATE TABLE orc_createas1a (key INT, value STRING) PARTITIONED BY (ds string)
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:default
PREHOOK: Output: default@orc_createas1a
POSTHOOK: query: CREATE TABLE orc_createas1a (key INT, value STRING) PARTITIONED BY (ds string)
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:default
POSTHOOK: Output: default@orc_createas1a
PREHOOK: query: INSERT OVERWRITE TABLE orc_createas1a PARTITION (ds='1') SELECT * FROM src
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@orc_createas1a@ds=1
POSTHOOK: query: INSERT OVERWRITE TABLE orc_createas1a PARTITION (ds='1') SELECT * FROM src
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@orc_createas1a@ds=1
POSTHOOK: Lineage: orc_createas1a PARTITION(ds=1).key EXPRESSION [(src)src.FieldSchema(name:key, type:string, comment:default), ]
POSTHOOK: Lineage: orc_createas1a PARTITION(ds=1).value SIMPLE [(src)src.FieldSchema(name:value, type:string, comment:default), ]
PREHOOK: query: INSERT OVERWRITE TABLE orc_createas1a PARTITION (ds='2') SELECT * FROM src
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@orc_createas1a@ds=2
POSTHOOK: query: INSERT OVERWRITE TABLE orc_createas1a PARTITION (ds='2') SELECT * FROM src
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@orc_createas1a@ds=2
POSTHOOK: Lineage: orc_createas1a PARTITION(ds=2).key EXPRESSION [(src)src.FieldSchema(name:key, type:string, comment:default), ]
POSTHOOK: Lineage: orc_createas1a PARTITION(ds=2).value SIMPLE [(src)src.FieldSchema(name:value, type:string, comment:default), ]
PREHOOK: query: EXPLAIN CREATE TABLE orc_createas1b STORED AS ORC AS SELECT * FROM src
PREHOOK: type: CREATETABLE_AS_SELECT
POSTHOOK: query: EXPLAIN CREATE TABLE orc_createas1b STORED AS ORC AS SELECT * FROM src
POSTHOOK: type: CREATETABLE_AS_SELECT
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-7 depends on stages: Stage-1 , consists of Stage-4, Stage-3, Stage-5
  Stage-4
  Stage-0 depends on stages: Stage-4, Stage-3, Stage-6
  Stage-8 depends on stages: Stage-0
  Stage-2 depends on stages: Stage-8
  Stage-3
  Stage-5
  Stage-6 depends on stages: Stage-5

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: src
            Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
            Select Operator
              expressions: key (type: string), value (type: string)
              outputColumnNames: _col0, _col1
              Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
              File Output Operator
                compressed: false
                Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
                table:
                    input format: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
                    output format: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
                    serde: org.apache.hadoop.hive.ql.io.orc.OrcSerde
                    name: default.orc_createas1b

  Stage: Stage-7
    Conditional Operator

  Stage: Stage-4
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-0
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-8
      Create Table Operator:
        Create Table
          columns: key string, value string
          input format: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
          output format: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
          serde name: org.apache.hadoop.hive.ql.io.orc.OrcSerde
          name: default.orc_createas1b

  Stage: Stage-2
    Stats-Aggr Operator

  Stage: Stage-3
    Merge File Operator
      Map Operator Tree:
          ORC File Merge Operator
      merge level: stripe
      input format: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat

  Stage: Stage-5
    Merge File Operator
      Map Operator Tree:
          ORC File Merge Operator
      merge level: stripe
      input format: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat

  Stage: Stage-6
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

PREHOOK: query: CREATE TABLE orc_createas1b STORED AS ORC AS SELECT * FROM src
PREHOOK: type: CREATETABLE_AS_SELECT
PREHOOK: Input: default@src
PREHOOK: Output: database:default
PREHOOK: Output: default@orc_createas1b
POSTHOOK: query: CREATE TABLE orc_createas1b STORED AS ORC AS SELECT * FROM src
POSTHOOK: type: CREATETABLE_AS_SELECT
POSTHOOK: Input: default@src
POSTHOOK: Output: database:default
POSTHOOK: Output: default@orc_createas1b
PREHOOK: query: EXPLAIN SELECT * FROM orc_createas1b ORDER BY key LIMIT 5
PREHOOK: type: QUERY
POSTHOOK: query: EXPLAIN SELECT * FROM orc_createas1b ORDER BY key LIMIT 5
POSTHOOK: type: QUERY
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 depends on stages: Stage-1

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: orc_createas1b
            Statistics: Num rows: 500 Data size: 88000 Basic stats: COMPLETE Column stats: NONE
            Select Operator
              expressions: key (type: string), value (type: string)
              outputColumnNames: _col0, _col1
              Statistics: Num rows: 500 Data size: 88000 Basic stats: COMPLETE Column stats: NONE
              Reduce Output Operator
                key expressions: _col0 (type: string)
                sort order: +
                Statistics: Num rows: 500 Data size: 88000 Basic stats: COMPLETE Column stats: NONE
                value expressions: _col1 (type: string)
      Reduce Operator Tree:
        Select Operator
          expressions: KEY.reducesinkkey0 (type: string), VALUE._col0 (type: string)
          outputColumnNames: _col0, _col1
          Statistics: Num rows: 500 Data size: 88000 Basic stats: COMPLETE Column stats: NONE
          Limit
            Number of rows: 5
            Statistics: Num rows: 5 Data size: 880 Basic stats: COMPLETE Column stats: NONE
            File Output Operator
              compressed: false
              Statistics: Num rows: 5 Data size: 880 Basic stats: COMPLETE Column stats: NONE
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

  Stage: Stage-0
    Fetch Operator
      limit: 5
      Processor Tree:
        ListSink

PREHOOK: query: SELECT * FROM orc_createas1b ORDER BY key LIMIT 5
PREHOOK: type: QUERY
PREHOOK: Input: default@orc_createas1b
#### A masked pattern was here ####
POSTHOOK: query: SELECT * FROM orc_createas1b ORDER BY key LIMIT 5
POSTHOOK: type: QUERY
POSTHOOK: Input: default@orc_createas1b
#### A masked pattern was here ####
0	val_0
0	val_0
0	val_0
10	val_10
100	val_100
PREHOOK: query: EXPLAIN CREATE TABLE orc_createas1c STORED AS ORC AS SELECT key, value, PMOD(HASH(key), 50) as part FROM orc_createas1a
PREHOOK: type: CREATETABLE_AS_SELECT
POSTHOOK: query: EXPLAIN CREATE TABLE orc_createas1c STORED AS ORC AS SELECT key, value, PMOD(HASH(key), 50) as part FROM orc_createas1a
POSTHOOK: type: CREATETABLE_AS_SELECT
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-7 depends on stages: Stage-1 , consists of Stage-4, Stage-3, Stage-5
  Stage-4
  Stage-0 depends on stages: Stage-4, Stage-3, Stage-6
  Stage-8 depends on stages: Stage-0
  Stage-2 depends on stages: Stage-8
  Stage-3
  Stage-5
  Stage-6 depends on stages: Stage-5

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: orc_createas1a
            Statistics: Num rows: 1000 Data size: 10624 Basic stats: COMPLETE Column stats: NONE
            Select Operator
              expressions: key (type: int), value (type: string), (hash(key) pmod 50) (type: int)
              outputColumnNames: _col0, _col1, _col2
              Statistics: Num rows: 1000 Data size: 10624 Basic stats: COMPLETE Column stats: NONE
              File Output Operator
                compressed: false
                Statistics: Num rows: 1000 Data size: 10624 Basic stats: COMPLETE Column stats: NONE
                table:
                    input format: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
                    output format: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
                    serde: org.apache.hadoop.hive.ql.io.orc.OrcSerde
                    name: default.orc_createas1c

  Stage: Stage-7
    Conditional Operator

  Stage: Stage-4
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-0
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

  Stage: Stage-8
      Create Table Operator:
        Create Table
          columns: key int, value string, part int
          input format: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
          output format: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
          serde name: org.apache.hadoop.hive.ql.io.orc.OrcSerde
          name: default.orc_createas1c

  Stage: Stage-2
    Stats-Aggr Operator

  Stage: Stage-3
    Merge File Operator
      Map Operator Tree:
          ORC File Merge Operator
      merge level: stripe
      input format: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat

  Stage: Stage-5
    Merge File Operator
      Map Operator Tree:
          ORC File Merge Operator
      merge level: stripe
      input format: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat

  Stage: Stage-6
    Move Operator
      files:
          hdfs directory: true
#### A masked pattern was here ####

PREHOOK: query: CREATE TABLE orc_createas1c STORED AS ORC AS SELECT key, value, PMOD(HASH(key), 50) as part FROM orc_createas1a
PREHOOK: type: CREATETABLE_AS_SELECT
PREHOOK: Input: default@orc_createas1a
PREHOOK: Input: default@orc_createas1a@ds=1
PREHOOK: Input: default@orc_createas1a@ds=2
PREHOOK: Output: database:default
PREHOOK: Output: default@orc_createas1c
POSTHOOK: query: CREATE TABLE orc_createas1c STORED AS ORC AS SELECT key, value, PMOD(HASH(key), 50) as part FROM orc_createas1a
POSTHOOK: type: CREATETABLE_AS_SELECT
POSTHOOK: Input: default@orc_createas1a
POSTHOOK: Input: default@orc_createas1a@ds=1
POSTHOOK: Input: default@orc_createas1a@ds=2
POSTHOOK: Output: database:default
POSTHOOK: Output: default@orc_createas1c
PREHOOK: query: SELECT SUM(HASH(c)) FROM ( SELECT TRANSFORM(key, value) USING 'tr \t _' AS (c) FROM orc_createas1a ) t
PREHOOK: type: QUERY
PREHOOK: Input: default@orc_createas1a
PREHOOK: Input: default@orc_createas1a@ds=1
PREHOOK: Input: default@orc_createas1a@ds=2
#### A masked pattern was here ####
POSTHOOK: query: SELECT SUM(HASH(c)) FROM ( SELECT TRANSFORM(key, value) USING 'tr \t _' AS (c) FROM orc_createas1a ) t
POSTHOOK: type: QUERY
POSTHOOK: Input: default@orc_createas1a
POSTHOOK: Input: default@orc_createas1a@ds=1
POSTHOOK: Input: default@orc_createas1a@ds=2
#### A masked pattern was here ####
14412220296
PREHOOK: query: SELECT SUM(HASH(c)) FROM ( SELECT TRANSFORM(key, value) USING 'tr \t _' AS (c) FROM orc_createas1c ) t
PREHOOK: type: QUERY
PREHOOK: Input: default@orc_createas1c
#### A masked pattern was here ####
POSTHOOK: query: SELECT SUM(HASH(c)) FROM ( SELECT TRANSFORM(key, value) USING 'tr \t _' AS (c) FROM orc_createas1c ) t
POSTHOOK: type: QUERY
POSTHOOK: Input: default@orc_createas1c
#### A masked pattern was here ####
14412220296
PREHOOK: query: DROP TABLE orc_createas1a
PREHOOK: type: DROPTABLE
PREHOOK: Input: default@orc_createas1a
PREHOOK: Output: default@orc_createas1a
POSTHOOK: query: DROP TABLE orc_createas1a
POSTHOOK: type: DROPTABLE
POSTHOOK: Input: default@orc_createas1a
POSTHOOK: Output: default@orc_createas1a
PREHOOK: query: DROP TABLE orc_createas1b
PREHOOK: type: DROPTABLE
PREHOOK: Input: default@orc_createas1b
PREHOOK: Output: default@orc_createas1b
POSTHOOK: query: DROP TABLE orc_createas1b
POSTHOOK: type: DROPTABLE
POSTHOOK: Input: default@orc_createas1b
POSTHOOK: Output: default@orc_createas1b
PREHOOK: query: DROP TABLE orc_createas1c
PREHOOK: type: DROPTABLE
PREHOOK: Input: default@orc_createas1c
PREHOOK: Output: default@orc_createas1c
POSTHOOK: query: DROP TABLE orc_createas1c
POSTHOOK: type: DROPTABLE
POSTHOOK: Input: default@orc_createas1c
POSTHOOK: Output: default@orc_createas1c