PREHOOK: query: CREATE TABLE bucket2_1(key int, value string) CLUSTERED BY (key) INTO 2 BUCKETS
PREHOOK: type: CREATETABLE
PREHOOK: Output: database:default
PREHOOK: Output: default@bucket2_1
POSTHOOK: query: CREATE TABLE bucket2_1(key int, value string) CLUSTERED BY (key) INTO 2 BUCKETS
POSTHOOK: type: CREATETABLE
POSTHOOK: Output: database:default
POSTHOOK: Output: default@bucket2_1
PREHOOK: query: explain extended insert overwrite table bucket2_1 select * from src
PREHOOK: type: QUERY
POSTHOOK: query: explain extended insert overwrite table bucket2_1 select * from src
POSTHOOK: type: QUERY
ABSTRACT SYNTAX TREE: 

TOK_QUERY
   TOK_FROM
      TOK_TABREF
         TOK_TABNAME
            src
   TOK_INSERT
      TOK_DESTINATION
         TOK_TAB
            TOK_TABNAME
               bucket2_1
      TOK_SELECT
         TOK_SELEXPR
            TOK_ALLCOLREF


STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 depends on stages: Stage-1
  Stage-2 depends on stages: Stage-0

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: src
            Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
            GatherStats: false
            Select Operator
              expressions: key (type: string), value (type: string)
              outputColumnNames: _col0, _col1
              Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
              Reduce Output Operator
                sort order: 
                Map-reduce partition columns: UDFToInteger(_col0) (type: int)
                Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
                tag: -1
                value expressions: _col0 (type: string), _col1 (type: string)
                auto parallelism: false
      Path -> Alias:
#### A masked pattern was here ####
      Path -> Partition:
#### A masked pattern was here ####
          Partition
            base file name: src
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              COLUMN_STATS_ACCURATE true
              bucket_count -1
              columns key,value
              columns.comments 'default','default'
              columns.types string:string
#### A masked pattern was here ####
              name default.src
              numFiles 1
              numRows 500
              rawDataSize 5312
              serialization.ddl struct src { string key, string value}
              serialization.format 1
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              totalSize 5812
#### A masked pattern was here ####
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
          
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                COLUMN_STATS_ACCURATE true
                bucket_count -1
                columns key,value
                columns.comments 'default','default'
                columns.types string:string
#### A masked pattern was here ####
                name default.src
                numFiles 1
                numRows 500
                rawDataSize 5312
                serialization.ddl struct src { string key, string value}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                totalSize 5812
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.src
            name: default.src
      Truncated Path -> Alias:
        /src [src]
      Needs Tagging: false
      Reduce Operator Tree:
        Select Operator
          expressions: UDFToInteger(VALUE._col0) (type: int), VALUE._col1 (type: string)
          outputColumnNames: _col0, _col1
          Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
          File Output Operator
            compressed: false
            GlobalTableId: 1
#### A masked pattern was here ####
            NumFilesPerFileSink: 2
            Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
#### A masked pattern was here ####
            table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                properties:
                  bucket_count 2
                  bucket_field_name key
                  columns key,value
                  columns.comments 
                  columns.types int:string
#### A masked pattern was here ####
                  name default.bucket2_1
                  serialization.ddl struct bucket2_1 { i32 key, string value}
                  serialization.format 1
                  serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
#### A masked pattern was here ####
                serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                name: default.bucket2_1
            TotalFiles: 2
            GatherStats: true
            MultiFileSpray: true

  Stage: Stage-0
    Move Operator
      tables:
          replace: true
#### A masked pattern was here ####
          table:
              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count 2
                bucket_field_name key
                columns key,value
                columns.comments 
                columns.types int:string
#### A masked pattern was here ####
                name default.bucket2_1
                serialization.ddl struct bucket2_1 { i32 key, string value}
                serialization.format 1
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
#### A masked pattern was here ####
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.bucket2_1

  Stage: Stage-2
    Stats-Aggr Operator
#### A masked pattern was here ####
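The plan above shows how Hive enforces the bucketing declared on bucket2_1: the Reduce Output Operator partitions rows on UDFToInteger(_col0), the clustering column, and the File Output Operator reports TotalFiles: 2 with MultiFileSpray: true, so the sink distributes rows into the two bucket files by that same hash. A minimal sketch of the statements this output corresponds to is given below; it assumes a populated default.src(key string, value string) table as shipped with the Hive test data, and hive.enforce.bucketing is the pre-Hive-2.0 flag (later releases always enforce bucketing).

-- sketch only; assumes default.src already exists and is loaded
set hive.enforce.bucketing=true;
CREATE TABLE bucket2_1(key int, value string) CLUSTERED BY (key) INTO 2 BUCKETS;
explain extended insert overwrite table bucket2_1 select * from src;
insert overwrite table bucket2_1 select * from src;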
PREHOOK: query: insert overwrite table bucket2_1 select * from src
PREHOOK: type: QUERY
PREHOOK: Input: default@src
PREHOOK: Output: default@bucket2_1
POSTHOOK: query: insert overwrite table bucket2_1 select * from src
POSTHOOK: type: QUERY
POSTHOOK: Input: default@src
POSTHOOK: Output: default@bucket2_1
POSTHOOK: Lineage: bucket2_1.key EXPRESSION [(src)src.FieldSchema(name:key, type:string, comment:default), ]
POSTHOOK: Lineage: bucket2_1.value SIMPLE [(src)src.FieldSchema(name:value, type:string, comment:default), ]
PREHOOK: query: explain select * from bucket2_1 tablesample (bucket 1 out of 2) s order by key
PREHOOK: type: QUERY
POSTHOOK: query: explain select * from bucket2_1 tablesample (bucket 1 out of 2) s order by key
POSTHOOK: type: QUERY
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 depends on stages: Stage-1

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: s
            Statistics: Num rows: 55 Data size: 5812 Basic stats: COMPLETE Column stats: NONE
            Filter Operator
              predicate: (((hash(key) & 2147483647) % 2) = 0) (type: boolean)
              Statistics: Num rows: 27 Data size: 2853 Basic stats: COMPLETE Column stats: NONE
              Select Operator
                expressions: key (type: int), value (type: string)
                outputColumnNames: _col0, _col1
                Statistics: Num rows: 27 Data size: 2853 Basic stats: COMPLETE Column stats: NONE
                Reduce Output Operator
                  key expressions: _col0 (type: int)
                  sort order: +
                  Statistics: Num rows: 27 Data size: 2853 Basic stats: COMPLETE Column stats: NONE
                  value expressions: _col1 (type: string)
      Reduce Operator Tree:
        Select Operator
          expressions: KEY.reducesinkkey0 (type: int), VALUE._col0 (type: string)
          outputColumnNames: _col0, _col1
          Statistics: Num rows: 27 Data size: 2853 Basic stats: COMPLETE Column stats: NONE
          File Output Operator
            compressed: false
            Statistics: Num rows: 27 Data size: 2853 Basic stats: COMPLETE Column stats: NONE
            table:
                input format: org.apache.hadoop.mapred.TextInputFormat
                output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

  Stage: Stage-0
    Fetch Operator
      limit: -1
      Processor Tree:
        ListSink
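In the TABLESAMPLE plan above, bucket 1 out of 2 is compiled into the Filter Operator predicate ((hash(key) & 2147483647) % 2) = 0: the row's bucket number is the hash of the clustering column, masked with Integer.MAX_VALUE (2147483647) to keep it non-negative, taken modulo the bucket count, and bucket n in the 1-based TABLESAMPLE syntax maps to remainder n - 1. As a sketch of that equivalence, using the tables from this file and the built-in hash() UDF:

-- sampled form, as run below
select * from bucket2_1 tablesample (bucket 1 out of 2) s order by key;
-- the explicit filter the planner generates for a table clustered into 2 buckets on key
select * from bucket2_1 where (hash(key) & 2147483647) % 2 = 0 order by key;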
PREHOOK: query: select * from bucket2_1 tablesample (bucket 1 out of 2) s order by key
PREHOOK: type: QUERY
PREHOOK: Input: default@bucket2_1
#### A masked pattern was here ####
POSTHOOK: query: select * from bucket2_1 tablesample (bucket 1 out of 2) s order by key
POSTHOOK: type: QUERY
POSTHOOK: Input: default@bucket2_1
#### A masked pattern was here ####
0 val_0
0 val_0
0 val_0
2 val_2
4 val_4
8 val_8
10 val_10
12 val_12
12 val_12
18 val_18
18 val_18
20 val_20
24 val_24
24 val_24
26 val_26
26 val_26
28 val_28
30 val_30
34 val_34
42 val_42
42 val_42
44 val_44
54 val_54
58 val_58
58 val_58
64 val_64
66 val_66
70 val_70
70 val_70
70 val_70
72 val_72
72 val_72
74 val_74
76 val_76
76 val_76
78 val_78
80 val_80
82 val_82
84 val_84
84 val_84
86 val_86
90 val_90
90 val_90
90 val_90
92 val_92
96 val_96
98 val_98
98 val_98
100 val_100
100 val_100
104 val_104
104 val_104
114 val_114
116 val_116
118 val_118
118 val_118
120 val_120
120 val_120
126 val_126
128 val_128
128 val_128
128 val_128
134 val_134
134 val_134
136 val_136
138 val_138
138 val_138
138 val_138
138 val_138
146 val_146
146 val_146
150 val_150
152 val_152
152 val_152
156 val_156
158 val_158
160 val_160
162 val_162
164 val_164
164 val_164
166 val_166
168 val_168
170 val_170
172 val_172
172 val_172
174 val_174
174 val_174
176 val_176
176 val_176
178 val_178
180 val_180
186 val_186
190 val_190
192 val_192
194 val_194
196 val_196
200 val_200
200 val_200
202 val_202
208 val_208
208 val_208
208 val_208
214 val_214
216 val_216
216 val_216
218 val_218
222 val_222
224 val_224
224 val_224
226 val_226
228 val_228
230 val_230
230 val_230
230 val_230
230 val_230
230 val_230
238 val_238
238 val_238
242 val_242
242 val_242
244 val_244
248 val_248
252 val_252
256 val_256
256 val_256
258 val_258
260 val_260
262 val_262
266 val_266
272 val_272
272 val_272
274 val_274
278 val_278
278 val_278
280 val_280
280 val_280
282 val_282
282 val_282
284 val_284
286 val_286
288 val_288
288 val_288
292 val_292
296 val_296
298 val_298
298 val_298
298 val_298
302 val_302
306 val_306
308 val_308
310 val_310
316 val_316
316 val_316
316 val_316
318 val_318
318 val_318
318 val_318
322 val_322
322 val_322
332 val_332
336 val_336
338 val_338
342 val_342
342 val_342
344 val_344
344 val_344
348 val_348
348 val_348
348 val_348
348 val_348
348 val_348
356 val_356
360 val_360
362 val_362
364 val_364
366 val_366
368 val_368
374 val_374
378 val_378
382 val_382
382 val_382
384 val_384
384 val_384
384 val_384
386 val_386
392 val_392
394 val_394
396 val_396
396 val_396
396 val_396
400 val_400
402 val_402
404 val_404
404 val_404
406 val_406
406 val_406
406 val_406
406 val_406
414 val_414
414 val_414
418 val_418
424 val_424
424 val_424
430 val_430
430 val_430
430 val_430
432 val_432
436 val_436
438 val_438
438 val_438
438 val_438
444 val_444
446 val_446
448 val_448
452 val_452
454 val_454
454 val_454
454 val_454
458 val_458
458 val_458
460 val_460
462 val_462
462 val_462
466 val_466
466 val_466
466 val_466
468 val_468
468 val_468
468 val_468
468 val_468
470 val_470
472 val_472
478 val_478
478 val_478
480 val_480
480 val_480
480 val_480
482 val_482
484 val_484
490 val_490
492 val_492
492 val_492
494 val_494
496 val_496
498 val_498
498 val_498
498 val_498
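Every key in the sampled output above is even, which is what the predicate predicts: for an int column, Hive's hash() appears to return the integer value itself, so (hash(key) & 2147483647) % 2 = 0 keeps exactly the even keys, i.e. the contents of bucket 1. A hypothetical sanity check, not part of the original test, would be:

-- expected result: 0 (no sampled row falls outside bucket 1's remainder class)
select count(*) from bucket2_1 tablesample (bucket 1 out of 2) s
where (hash(key) & 2147483647) % 2 <> 0;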