public interface HiveSparkClient extends Serializable, Closeable
| Modifier and Type | Method and Description |
|---|---|
| `SparkJobRef` | `execute(DriverContext driverContext, SparkWork sparkWork)` Generates a Spark RDD graph from the given sparkWork and driverContext, and submits the RDD graph to the Spark cluster. |
| `int` | `getDefaultParallelism()` In standalone mode, this can be used to get the total number of cores. |
| `int` | `getExecutorCount()` |
| `org.apache.spark.SparkConf` | `getSparkConf()` |
`SparkJobRef execute(DriverContext driverContext, SparkWork sparkWork) throws Exception`

Parameters: `driverContext`, `sparkWork`
Throws: `Exception`

`org.apache.spark.SparkConf getSparkConf()`

`int getExecutorCount() throws Exception`

Throws: `Exception`
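The lifecycle implied by this interface (create a client, submit work, query cluster resources, close) can be sketched as follows. This is a minimal, self-contained illustration, not the actual Hive implementation: `DriverContext`, `SparkWork`, and `SparkJobRef` are simplified stand-ins for the real Hive and Spark types, the stub's return values are arbitrary, and `getSparkConf()` is omitted to avoid a Spark dependency.

```java
import java.io.Closeable;
import java.io.Serializable;

public class HiveSparkClientSketch {

    // Simplified stand-ins for the real Hive/Spark types (assumptions).
    static class DriverContext {}
    static class SparkWork {}
    interface SparkJobRef { int getJobId(); }

    // Mirrors the HiveSparkClient contract described above.
    interface HiveSparkClient extends Serializable, Closeable {
        SparkJobRef execute(DriverContext driverContext, SparkWork sparkWork) throws Exception;
        int getDefaultParallelism() throws Exception;
        int getExecutorCount() throws Exception;
    }

    // Toy implementation: "submits" a job and reports fixed resources.
    static class StubSparkClient implements HiveSparkClient {
        @Override
        public SparkJobRef execute(DriverContext ctx, SparkWork work) {
            return () -> 42; // pretend job id assigned by the cluster
        }
        @Override
        public int getDefaultParallelism() { return 8; }
        @Override
        public int getExecutorCount() { return 2; }
        @Override
        public void close() {}
    }

    public static void main(String[] args) throws Exception {
        // Closeable means the client fits try-with-resources.
        try (HiveSparkClient client = new StubSparkClient()) {
            SparkJobRef ref = client.execute(new DriverContext(), new SparkWork());
            System.out.println("job=" + ref.getJobId()
                    + " cores=" + client.getDefaultParallelism()
                    + " executors=" + client.getExecutorCount());
        }
    }
}
```

Because the interface extends `Closeable`, callers are expected to release the underlying Spark context via `close()`, which try-with-resources handles automatically.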
Copyright © 2021 The Apache Software Foundation. All rights reserved.