- A - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for 'A'
- abort(String, Throwable) - Method in interface org.apache.hadoop.hbase.client.Admin
-
- abortProcedure(long, boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Abort a procedure.
- abortProcedureAsync(long, boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Abort a procedure but do not block and wait for it to be completely removed.
- abortTask(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputCommitter
-
- AccessControlClient - Class in org.apache.hadoop.hbase.security.access
-
Utility client for doing access control admin operations.
- AccessControlClient() - Constructor for class org.apache.hadoop.hbase.security.access.AccessControlClient
-
- AccessControlConstants - Interface in org.apache.hadoop.hbase.security.access
-
- AccessDeniedException - Exception in org.apache.hadoop.hbase.security
-
Exception thrown by access-related methods.
- AccessDeniedException() - Constructor for exception org.apache.hadoop.hbase.security.AccessDeniedException
-
- AccessDeniedException(Class<?>, String) - Constructor for exception org.apache.hadoop.hbase.security.AccessDeniedException
-
- AccessDeniedException(String) - Constructor for exception org.apache.hadoop.hbase.security.AccessDeniedException
-
- AccessDeniedException(Throwable) - Constructor for exception org.apache.hadoop.hbase.security.AccessDeniedException
-
- ACL_TABLE_NAME - Static variable in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
- ACTION_BY_CODE - Static variable in class org.apache.hadoop.hbase.security.access.Permission
-
- actions - Variable in class org.apache.hadoop.hbase.security.access.Permission
-
- add(byte[], byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Append
-
Add the specified column and value to this Append operation.
- add(Cell) - Method in class org.apache.hadoop.hbase.client.Append
-
Add column and value to this Append operation.
- add(Double, Double) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- add(Cell) - Method in class org.apache.hadoop.hbase.client.Increment
-
Add the specified KeyValue to this operation.
- add(Cell) - Method in class org.apache.hadoop.hbase.client.Put
-
Add the specified KeyValue to this Put operation.
- add(Put) - Method in class org.apache.hadoop.hbase.client.RowMutations
-
Add a Put operation to the list of mutations.
- add(Delete) - Method in class org.apache.hadoop.hbase.client.RowMutations
-
Add a Delete operation to the list of mutations.
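For orientation, a minimal sketch (not part of the Javadoc itself) of grouping a Put and a Delete into one atomic update of a single row via RowMutations; it assumes an open Table named table, and the row, family, and qualifier names are placeholders:
  byte[] row = Bytes.toBytes("row1");
  RowMutations mutations = new RowMutations(row);
  Put put = new Put(row);
  put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("a"), Bytes.toBytes("value"));
  mutations.add(put);
  Delete delete = new Delete(row);
  delete.addColumns(Bytes.toBytes("cf"), Bytes.toBytes("b"));
  mutations.add(delete);
  table.mutateRow(mutations); // applies both mutations to the row atomically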
- add(String) - Method in class org.apache.hadoop.hbase.rest.client.Cluster
-
Add a node to the cluster
- add(String, int) - Method in class org.apache.hadoop.hbase.rest.client.Cluster
-
Add a node to the cluster
- add(DataType<?>) - Method in class org.apache.hadoop.hbase.types.StructBuilder
-
Append field to the sequence of accumulated fields.
- add(byte[], byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- add(byte[], byte[], byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- add(byte[][]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- add(long) - Method in class org.apache.hadoop.hbase.util.Counter
-
- add(long, long) - Method in class org.apache.hadoop.hbase.util.FastLongHistogram
-
Adds a value to the histogram.
- addClientPort(int) - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
Add a client port to the list.
- addColumn(TableName, HColumnDescriptor) - Method in interface org.apache.hadoop.hbase.client.Admin
-
- addColumn(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Delete
-
Delete the latest version of the specified column.
- addColumn(byte[], byte[], long) - Method in class org.apache.hadoop.hbase.client.Delete
-
Delete the specified version of the specified column.
- addColumn(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Get
-
Get the column from the specific family with the specified qualifier.
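A brief, illustrative sketch of restricting a Get to a single column; an open Table named table is assumed and the names are placeholders:
  Get get = new Get(Bytes.toBytes("row1"));
  get.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q")); // only fetch cf:q
  Result result = table.get(get);
  byte[] value = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("q"));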
- addColumn(byte[], byte[], long) - Method in class org.apache.hadoop.hbase.client.Increment
-
Increment the column from the specific family with the specified qualifier
by the specified amount.
- addColumn(byte[], byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Put
-
Add the specified column and value to this Put operation.
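A minimal usage sketch, assuming an open Table named table and placeholder names:
  Put put = new Put(Bytes.toBytes("row1"));
  put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
  table.put(put); // writes one cell, versioned with the server-assigned timestamp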
- addColumn(byte[], byte[], long, byte[]) - Method in class org.apache.hadoop.hbase.client.Put
-
Add the specified column and value, with the specified timestamp as
its version to this Put operation.
- addColumn(byte[], ByteBuffer, long, ByteBuffer) - Method in class org.apache.hadoop.hbase.client.Put
-
Add the specified column and value, with the specified timestamp as
its version to this Put operation.
- addColumn(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Scan
-
Get the column from the specified family with the specified qualifier.
- addColumnFamily(TableName, HColumnDescriptor) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Add a column family to an existing table.
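A hedged sketch of adding a family to an existing table through Admin; conf is an existing Configuration, and the table and family names are placeholders:
  try (Connection connection = ConnectionFactory.createConnection(conf);
       Admin admin = connection.getAdmin()) {
    admin.addColumnFamily(TableName.valueOf("t1"), new HColumnDescriptor("cf2"));
  }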
- addColumns(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Delete
-
Delete all versions of the specified column.
- addColumns(byte[], byte[], long) - Method in class org.apache.hadoop.hbase.client.Delete
-
Delete all versions of the specified column with a timestamp less than
or equal to the specified timestamp.
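A small sketch contrasting the two addColumns overloads, assuming an open Table named table, a long timestamp ts, and placeholder names:
  Delete delete = new Delete(Bytes.toBytes("row1"));
  delete.addColumns(Bytes.toBytes("cf"), Bytes.toBytes("a"));     // all versions of cf:a
  delete.addColumns(Bytes.toBytes("cf"), Bytes.toBytes("b"), ts); // versions of cf:b up to and including ts
  table.delete(delete);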
- addColumns(Scan, byte[][]) - Static method in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Adds an array of columns specified using old format, family:qualifier.
- addConfiguration(Map<String, String>) - Method in class org.apache.hadoop.hbase.NamespaceDescriptor.Builder
-
- addConfiguration(String, String) - Method in class org.apache.hadoop.hbase.NamespaceDescriptor.Builder
-
- addCoprocessor(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Add a table coprocessor to this table.
- addCoprocessor(String, Path, int, Map<String, String>) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Add a table coprocessor to this table.
- addCoprocessorWithSpec(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Add a table coprocessor to this table.
- addDeleteMarker(Cell) - Method in class org.apache.hadoop.hbase.client.Delete
-
Advanced use only.
- addDependencyJars(JobConf) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
- addDependencyJars(Job) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Add the HBase dependency jars as well as jars for any of the configured
job classes to the job configuration, so that JobClient will ship them
to the cluster and add them to the DistributedCache.
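A short sketch of wiring this into job setup; conf is an existing Configuration, and the job name and MyDriver class are hypothetical:
  Job job = Job.getInstance(conf, "my-hbase-job");
  job.setJarByClass(MyDriver.class);         // hypothetical driver class
  TableMapReduceUtil.addDependencyJars(job); // ship HBase and job-class jars via the DistributedCache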
- addDependencyJars(Configuration, Class<?>...) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Add the jars containing the given classes to the job's configuration
such that JobClient will ship them to the cluster and add them to
the DistributedCache.
- addExtraHeader(String, String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Add extra headers.
- addFamily(byte[]) - Method in class org.apache.hadoop.hbase.client.Delete
-
Delete all versions of all columns of the specified family.
- addFamily(byte[], long) - Method in class org.apache.hadoop.hbase.client.Delete
-
Delete all columns of the specified family with a timestamp less than
or equal to the specified timestamp.
- addFamily(byte[]) - Method in class org.apache.hadoop.hbase.client.Get
-
Get all columns from the specified family.
- addFamily(byte[]) - Method in class org.apache.hadoop.hbase.client.Scan
-
Get all columns from the specified family.
- addFamily(HColumnDescriptor) - Method in class org.apache.hadoop.hbase.client.UnmodifyableHTableDescriptor
-
Does NOT add a column family.
- addFamily(HColumnDescriptor) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Adds a column family.
- addFamilyVersion(byte[], long) - Method in class org.apache.hadoop.hbase.client.Delete
-
Delete all columns of the specified family with a timestamp equal to
the specified timestamp.
- addFilter(Filter) - Method in class org.apache.hadoop.hbase.filter.FilterList
-
Add a filter.
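An illustrative sketch of combining two filters so a scan only returns columns that pass both; the prefix and limits are placeholders:
  FilterList filters = new FilterList(FilterList.Operator.MUST_PASS_ALL);
  filters.addFilter(new ColumnPrefixFilter(Bytes.toBytes("pref")));
  filters.addFilter(new ColumnPaginationFilter(10, 0)); // limit 10, offset 0
  Scan scan = new Scan();
  scan.setFilter(filters);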
- addFilterAndArguments(Configuration, Class<? extends Filter>, List<String>) - Static method in class org.apache.hadoop.hbase.mapreduce.Import
-
Add a Filter to be instantiated on import
- addHBaseDependencyJars(Configuration) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Add HBase and its dependencies (only) to the job configuration.
- addHbaseResources(Configuration) - Static method in class org.apache.hadoop.hbase.HBaseConfiguration
-
- addImmutable(byte[], byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Put
-
- addImmutable(byte[], byte[], byte[], Tag[]) - Method in class org.apache.hadoop.hbase.client.Put
-
This expects that the underlying arrays won't change.
- addImmutable(byte[], byte[], long, byte[]) - Method in class org.apache.hadoop.hbase.client.Put
-
- addImmutable(byte[], byte[], long, byte[], Tag[]) - Method in class org.apache.hadoop.hbase.client.Put
-
This expects that the underlying arrays won't change.
- addImmutable(byte[], ByteBuffer, long, ByteBuffer, Tag[]) - Method in class org.apache.hadoop.hbase.client.Put
-
This expects that the underlying arrays won't change.
- addImmutable(byte[], ByteBuffer, long, ByteBuffer) - Method in class org.apache.hadoop.hbase.client.Put
-
- addLabel(Configuration, String) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
- addLabel(Connection, String) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
Utility method for adding a label to the system.
- addLabels(Configuration, String[]) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
- addLabels(Connection, String[]) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
Utility method for adding labels to the system.
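A hedged sketch of defining labels and tagging a cell with one; connection is an open Connection, the label and column names are placeholders, and note that VisibilityClient methods are declared to throw Throwable in this API:
  try {
    VisibilityClient.addLabels(connection, new String[] { "secret", "public" });
  } catch (Throwable t) {
    throw new IOException("adding visibility labels failed", t);
  }
  Put put = new Put(Bytes.toBytes("row1"));
  put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
  put.setCellVisibility(new CellVisibility("secret")); // only readers authorized for "secret" see this cell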
- addLabels(List<byte[]>) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
Adds the set of labels into the system.
- addMaster() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- addMaster(Configuration, int) - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- addMaster(Configuration, int, User) - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- addPeer(String, String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Deprecated.
Use addPeer(String, ReplicationPeerConfig, Map) instead.
- addPeer(String, String, String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Deprecated.
- addPeer(String, ReplicationPeerConfig, Map<TableName, ? extends Collection<String>>) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Add a new remote slave cluster for replication.
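A rough sketch, assuming an existing Configuration conf; the peer id and ZooKeeper cluster key are placeholders, and passing a null table-CF map is the conventional way to replicate all tables (verify against the version in use):
  ReplicationAdmin replicationAdmin = new ReplicationAdmin(conf);
  ReplicationPeerConfig peerConfig = new ReplicationPeerConfig();
  peerConfig.setClusterKey("zk1,zk2,zk3:2181:/hbase"); // placeholder slave cluster key
  replicationAdmin.addPeer("1", peerConfig, null);     // null table-CF map: replicate everything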
- addRegionServer() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- addRegionServer(Configuration, int) - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- addRegionServer(Configuration, int, User) - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- addResults(ClientProtos.RegionLoadStats) - Method in class org.apache.hadoop.hbase.client.Result
-
- addToCounter(String, long) - Method in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
- addToken(Token<? extends TokenIdentifier>) - Method in class org.apache.hadoop.hbase.security.User
-
Adds the given Token to the user's credentials.
- addTokenForJob(Connection, JobConf, User) - Static method in class org.apache.hadoop.hbase.security.token.TokenUtil
-
Checks for an authentication token for the given user, obtaining a new token if necessary,
and adds it to the credentials for the given map reduce job.
- addTokenForJob(Connection, User, Job) - Static method in class org.apache.hadoop.hbase.security.token.TokenUtil
-
Checks for an authentication token for the given user, obtaining a new token if necessary,
and adds it to the credentials for the given map reduce job.
- addTokenIfMissing(Connection, User) - Static method in class org.apache.hadoop.hbase.security.token.TokenUtil
-
Checks if an authentication token exists for the connected cluster,
obtaining one if needed and adding it to the user's credentials.
- addTypeFilter(QuotaType) - Method in class org.apache.hadoop.hbase.quotas.QuotaFilter
-
Add a type to the filter list
- Admin - Interface in org.apache.hadoop.hbase.client
-
The administrative API for HBase.
- Admin.CompactType - Enum in org.apache.hadoop.hbase.client
-
Currently, there are only two compact types: NORMAL means do store files compaction; MOB means do mob files compaction.
- ADMIN_QOS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- advance() - Method in class org.apache.hadoop.hbase.client.Result
-
- ALL_VERSIONS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Define for 'return-all-versions'.
- AND - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
AND Byte Array
- AND_ARRAY - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
AND Array
- AND_BUFFER - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
- Append - Class in org.apache.hadoop.hbase.client
-
Performs Append operations on a single row.
- Append(byte[]) - Constructor for class org.apache.hadoop.hbase.client.Append
-
Create an Append operation for the specified row.
- Append(Append) - Constructor for class org.apache.hadoop.hbase.client.Append
-
Copy constructor
- Append(byte[], int, int) - Constructor for class org.apache.hadoop.hbase.client.Append
-
Create an Append operation for the specified row.
- append(Append) - Method in interface org.apache.hadoop.hbase.client.Table
-
Appends values to one or more columns within a single row.
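A minimal sketch, assuming an open Table named table and placeholder names:
  Append append = new Append(Bytes.toBytes("row1"));
  append.add(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("-suffix"));
  Result result = table.append(append); // returns the cell values after the append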
- append(Append) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- appendPeerTableCFs(String, String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Append the replicable table-cf config of the specified peer
- appendPeerTableCFs(String, Map<TableName, ? extends Collection<String>>) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Append the replicable table-cf config of the specified peer
- apply(byte) - Method in enum org.apache.hadoop.hbase.util.Order
-
Apply order to the byte val.
- apply(byte[]) - Method in enum org.apache.hadoop.hbase.util.Order
-
Apply order to the byte array val.
- apply(byte[], int, int) - Method in enum org.apache.hadoop.hbase.util.Order
-
Apply order to a range within the byte array val.
- areAdjacent(HRegionInfo, HRegionInfo) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Check whether two regions are adjacent
- arePartsEqual(ByteBuffer, int, int, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Check whether two parts in the same buffer are equal.
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedBlob
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedBlobVar
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedFloat32
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedFloat64
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedInt16
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedInt32
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedInt64
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedInt8
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedNumeric
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedString
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.RawBytes
-
- ASCENDING - Static variable in class org.apache.hadoop.hbase.types.RawString
-
- assign(byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
- ATTRIBUTE_SEPERATOR_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- Attributes - Interface in org.apache.hadoop.hbase.client
-
- Authorizations - Class in org.apache.hadoop.hbase.security.visibility
-
This class contains visibility labels associated with a Scan/Get, deciding which labeled data
the current scan/get can access.
- Authorizations(String...) - Constructor for class org.apache.hadoop.hbase.security.visibility.Authorizations
-
- Authorizations(List<String>) - Constructor for class org.apache.hadoop.hbase.security.visibility.Authorizations
-
- CACHE_BLOOMS_ON_WRITE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- CACHE_DATA_IN_L1 - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Key for cache data into L1 if cache is set up with more than one tier.
- CACHE_DATA_ON_WRITE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- CACHE_INDEX_ON_WRITE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- call(T) - Method in interface org.apache.hadoop.hbase.client.coprocessor.Batch.Call
-
- callBlockingMethod(Descriptors.MethodDescriptor, RpcController, Message, Message) - Method in class org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel
-
- CallerDisconnectedException - Exception in org.apache.hadoop.hbase.ipc
-
Exception indicating that the remote host making this IPC lost its
IPC connection.
- CallerDisconnectedException(String) - Constructor for exception org.apache.hadoop.hbase.ipc.CallerDisconnectedException
-
- callExecService(Descriptors.MethodDescriptor, Message, Message) - Method in class org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel
-
- callMethod(Descriptors.MethodDescriptor, RpcController, Message, Message, RpcCallback<Message>) - Method in class org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel
-
- CallTimeoutException - Exception in org.apache.hadoop.hbase.ipc
-
Client-side call timeout
- CallTimeoutException(String) - Constructor for exception org.apache.hadoop.hbase.ipc.CallTimeoutException
-
- callWithoutRetries(RetryingCallable<T>, int) - Method in interface org.apache.hadoop.hbase.client.RpcRetryingCaller
-
Call the server once only.
- callWithRetries(RetryingCallable<T>, int) - Method in interface org.apache.hadoop.hbase.client.RpcRetryingCaller
-
Retries if invocation fails.
- cancel() - Method in interface org.apache.hadoop.hbase.client.RpcRetryingCaller
-
- castToCellType(Double) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- castToReturnType(Double) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- CATALOG_FAMILY - Static variable in class org.apache.hadoop.hbase.HConstants
-
The catalog family
- CATALOG_FAMILY_STR - Static variable in class org.apache.hadoop.hbase.HConstants
-
The catalog family as a string
- Cell - Interface in org.apache.hadoop.hbase
-
The unit of storage in HBase consisting of the following fields:
- CellCounter - Class in org.apache.hadoop.hbase.mapreduce
-
A job with a map and reduce phase to count cells in a table.
- CellCounter() - Constructor for class org.apache.hadoop.hbase.mapreduce.CellCounter
-
- CellCreator - Class in org.apache.hadoop.hbase.mapreduce
-
Facade to create Cells for HFileOutputFormat.
- CellCreator(Configuration) - Constructor for class org.apache.hadoop.hbase.mapreduce.CellCreator
-
- cellScanner() - Method in class org.apache.hadoop.hbase.client.Mutation
-
- cellScanner() - Method in class org.apache.hadoop.hbase.client.Result
-
- CellUtil - Class in org.apache.hadoop.hbase
-
Utility methods helpful for slinging Cell instances.
- CellVisibility - Class in org.apache.hadoop.hbase.security.visibility
-
This contains a visibility expression which can be associated with a cell.
- CellVisibility(String) - Constructor for class org.apache.hadoop.hbase.security.visibility.CellVisibility
-
- cellVisibilityExpr - Variable in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- CF_ATTRIBUTE_EARLY_OUT - Static variable in interface org.apache.hadoop.hbase.security.access.AccessControlConstants
-
Configuration or CF schema option for early termination of access checks
if table or CF permissions grant access.
- CF_RENAME_PROP - Static variable in class org.apache.hadoop.hbase.mapreduce.Import
-
- CFNAME - Static variable in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
- chance - Variable in class org.apache.hadoop.hbase.filter.RandomRowFilter
-
- checkAndDelete(byte[], byte[], byte[], byte[], Delete) - Method in interface org.apache.hadoop.hbase.client.Table
-
Atomically checks if a row/family/qualifier value matches the expected
value.
- checkAndDelete(byte[], byte[], byte[], CompareFilter.CompareOp, byte[], Delete) - Method in interface org.apache.hadoop.hbase.client.Table
-
Atomically checks if a row/family/qualifier value matches the expected
value.
- checkAndDelete(byte[], byte[], byte[], byte[], Delete) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- checkAndDelete(byte[], byte[], byte[], CompareFilter.CompareOp, byte[], Delete) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- checkAndMutate(byte[], byte[], byte[], CompareFilter.CompareOp, byte[], RowMutations) - Method in interface org.apache.hadoop.hbase.client.Table
-
Atomically checks if a row/family/qualifier value matches the expected value.
- checkAndMutate(byte[], byte[], byte[], CompareFilter.CompareOp, byte[], RowMutations) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- checkAndPut(byte[], byte[], byte[], byte[], Put) - Method in interface org.apache.hadoop.hbase.client.Table
-
Atomically checks if a row/family/qualifier value matches the expected
value.
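A compact sketch of this compare-and-set style update, assuming an open Table named table and byte[] variables row, family, and qualifier already defined:
  Put put = new Put(row);
  put.addColumn(family, qualifier, Bytes.toBytes("new-value"));
  // The Put is applied only if the current cell value equals "old-value".
  boolean applied = table.checkAndPut(row, family, qualifier, Bytes.toBytes("old-value"), put);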
- checkAndPut(byte[], byte[], byte[], CompareFilter.CompareOp, byte[], Put) - Method in interface org.apache.hadoop.hbase.client.Table
-
Atomically checks if a row/family/qualifier value matches the expected
value.
- checkAndPut(byte[], byte[], byte[], byte[], Put) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- checkAndPut(byte[], byte[], byte[], CompareFilter.CompareOp, byte[], Put) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- checkForAnd(byte[], int) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Checks if the current index of the filter string we are on is the beginning of the keyword 'AND'
- checkForOr(byte[], int) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Checks if the current index of the filter string we are on is the beginning of the keyword 'OR'
- checkForSkip(byte[], int) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Checks if the current index of the filter string we are on is the beginning of the keyword 'SKIP'
- checkForWhile(byte[], int) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Checks if the current index of the filter string we are on is the beginning of the keyword 'WHILE'
- checkOutputSpecs(FileSystem, JobConf) - Method in class org.apache.hadoop.hbase.mapred.TableOutputFormat
-
- checkOutputSpecs(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat
-
- checkOutputSpecs(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputFormat
-
Checks if the output target exists.
- CHECKSUM_TYPE_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
The name of the configuration parameter that specifies
the name of an algorithm that is used to compute checksums
for newly created blocks.
- Cipher - Class in org.apache.hadoop.hbase.io.crypto
-
A common interface for a cryptographic algorithm.
- Cipher(CipherProvider) - Constructor for class org.apache.hadoop.hbase.io.crypto.Cipher
-
- CIPHER_AES - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default cipher for encryption
- CipherProvider - Interface in org.apache.hadoop.hbase.io.crypto
-
A CipherProvider contributes support for various cryptographic
Ciphers.
- classifyExs(List<Throwable>) - Static method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- cleanupJob(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputCommitter
-
- clearAuths(Configuration, String[], String) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
- clearAuths(Connection, String[], String) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
Removes given labels from user's globally authorized list of labels.
- clearAuths(byte[], List<byte[]>) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
Removes given labels from user's globally authorized list of labels.
- clearCaches(ServerName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- clearRegionCache() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- clearRegionCache(TableName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- clearRegionCache(byte[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- Client - Class in org.apache.hadoop.hbase.rest.client
-
A wrapper around HttpClient which provides some useful functions and
semantics for interacting with the REST gateway.
- Client() - Constructor for class org.apache.hadoop.hbase.rest.client.Client
-
Default Constructor
- Client(Cluster) - Constructor for class org.apache.hadoop.hbase.rest.client.Client
-
Constructor
- Client(Cluster, boolean) - Constructor for class org.apache.hadoop.hbase.rest.client.Client
-
Constructor
- CLIENT_PORT_STR - Static variable in class org.apache.hadoop.hbase.HConstants
-
The ZK client port key in the ZK properties map.
- ClientBackoffPolicy - Interface in org.apache.hadoop.hbase.client.backoff
-
Configurable policy for the amount of time a client should wait for a new request to the
server when given the server load statistics.
- ClockOutOfSyncException - Exception in org.apache.hadoop.hbase
-
This exception is thrown by the master when a region server's clock skew is
too high.
- ClockOutOfSyncException(String) - Constructor for exception org.apache.hadoop.hbase.ClockOutOfSyncException
-
- clone() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- cloneFamily(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- cloneQualifier(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- cloneRow(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
get individual arrays for tests
- cloneSnapshot(byte[], TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Create a new table by cloning the snapshot content.
- cloneSnapshot(String, TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Create a new table by cloning the snapshot content.
- cloneTags(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- cloneValue(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- close() - Method in interface org.apache.hadoop.hbase.client.Admin
-
- close() - Method in interface org.apache.hadoop.hbase.client.BufferedMutator
-
- close() - Method in interface org.apache.hadoop.hbase.client.Connection
-
- close() - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
- close() - Method in interface org.apache.hadoop.hbase.client.ResultScanner
-
Closes the scanner and releases any resources it has allocated
- close() - Method in interface org.apache.hadoop.hbase.client.Table
-
Releases any resources held or pending changes in internal buffers.
- close() - Method in class org.apache.hadoop.hbase.client.TableSnapshotScanner
-
- close() - Method in class org.apache.hadoop.hbase.io.ByteBufferOutputStream
-
- close() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- close() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- close() - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
Closes the split.
- close() - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
Closes the split.
- close() - Method in class org.apache.hadoop.hbase.quotas.QuotaRetriever
-
- close() - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- close() - Method in class org.apache.hadoop.hbase.util.Base64.Base64OutputStream
-
Flushes and closes (I think, in the superclass) the stream.
- closeRegion(String, String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Close a region.
- closeRegion(byte[], String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Close a region.
- closeRegion(ServerName, HRegionInfo) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Close a region.
- closeRegionWithEncodedRegionName(String, String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
For expert-admins.
- closeTable() - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
- closeTable() - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
- Cluster - Class in org.apache.hadoop.hbase.rest.client
-
A list of 'host:port' addresses of HTTP servers operating as a single
entity, for example multiple redundant web service gateways.
- Cluster() - Constructor for class org.apache.hadoop.hbase.rest.client.Cluster
-
Constructor
- Cluster(List<String>) - Constructor for class org.apache.hadoop.hbase.rest.client.Cluster
-
Constructor
- CLUSTER_DISTRIBUTED - Static variable in class org.apache.hadoop.hbase.HConstants
-
Cluster is in distributed mode or not
- CLUSTER_ID_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default value for cluster ID
- CLUSTER_ID_FILE_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
name of the file for unique cluster ID
- CLUSTER_IS_DISTRIBUTED - Static variable in class org.apache.hadoop.hbase.HConstants
-
Cluster is fully-distributed
- CLUSTER_IS_LOCAL - Static variable in class org.apache.hadoop.hbase.HConstants
-
Cluster is standalone or pseudo-distributed
- ClusterStatus - Class in org.apache.hadoop.hbase
-
Status information on the HBase cluster.
- ClusterStatus(String, String, Map<ServerName, ServerLoad>, Collection<ServerName>, ServerName, Collection<ServerName>, Map<String, RegionState>, String[], Boolean) - Constructor for class org.apache.hadoop.hbase.ClusterStatus
-
- cmp(int) - Method in enum org.apache.hadoop.hbase.util.Order
-
Returns the adjusted trichotomous value according to the ordering imposed by this Order.
- code() - Method in enum org.apache.hadoop.hbase.security.access.Permission.Action
-
- COLON - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for colon (:)
- COLUMN_LIST - Static variable in class org.apache.hadoop.hbase.mapred.TableInputFormat
-
space delimited list of columns
- ColumnCountGetFilter - Class in org.apache.hadoop.hbase.filter
-
Simple filter that returns first N columns on row only.
- ColumnCountGetFilter(int) - Constructor for class org.apache.hadoop.hbase.filter.ColumnCountGetFilter
-
- columnFamily - Variable in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- columnFamily - Variable in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- ColumnPaginationFilter - Class in org.apache.hadoop.hbase.filter
-
A filter, based on the ColumnCountGetFilter, that takes two arguments: limit and offset.
- ColumnPaginationFilter(int, int) - Constructor for class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
Initializes filter with an integer offset and limit.
- ColumnPaginationFilter(int, byte[]) - Constructor for class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
Initializes filter with a string/bookmark based offset and limit.
- ColumnPrefixFilter - Class in org.apache.hadoop.hbase.filter
-
This filter is used for selecting only those keys with columns that match
a particular prefix.
- ColumnPrefixFilter(byte[]) - Constructor for class org.apache.hadoop.hbase.filter.ColumnPrefixFilter
-
- columnQualifier - Variable in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- columnQualifier - Variable in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- ColumnRangeFilter - Class in org.apache.hadoop.hbase.filter
-
This filter is used for selecting only those keys with columns that are
between minColumn and maxColumn.
- ColumnRangeFilter(byte[], boolean, byte[], boolean) - Constructor for class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
Create a filter to select those keys with columns that are between minColumn
and maxColumn.
- columns - Variable in class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
- columns - Variable in class org.apache.hadoop.hbase.mapreduce.GroupingTableMapper
-
The grouping columns.
- COLUMNS_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- com.google.protobuf - package com.google.protobuf
-
- COMMA - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for a comma
- commitTask(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputCommitter
-
- compact(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Compact a table.
- compact(TableName, byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Compact a column family within a table.
- compact(TableName, Admin.CompactType) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Compact a table.
- compact(TableName, byte[], Admin.CompactType) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Compact a column family within a table.
- COMPACTION_ENABLED - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Used by HBase Shell interface to access this metadata
attribute which denotes if the table is compaction enabled
- COMPACTION_KV_MAX - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for the maximum batch of KVs to be used in flushes and compactions
- COMPACTION_KV_MAX_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- compactRegion(byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Compact an individual region.
- compactRegion(byte[], byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Compact a column family within a region.
- compactRegionServer(ServerName, boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Compact all regions on the region server
- comparator - Variable in class org.apache.hadoop.hbase.filter.CompareFilter
-
- comparator - Variable in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- compare(Double, Double) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- compare(byte[], int, int, byte[], int, int) - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable.Comparator
-
- compare(long) - Method in class org.apache.hadoop.hbase.io.TimeRange
-
Compare the timestamp to timerange
- compare(byte[], byte[]) - Method in class org.apache.hadoop.hbase.util.Bytes.ByteArrayComparator
-
- compare(byte[], int, int, byte[], int, int) - Method in class org.apache.hadoop.hbase.util.Bytes.ByteArrayComparator
-
- compare(byte[], byte[]) - Method in class org.apache.hadoop.hbase.util.Bytes.RowEndKeyComparator
-
- compare(byte[], int, int, byte[], int, int) - Method in class org.apache.hadoop.hbase.util.Bytes.RowEndKeyComparator
-
- compareFamily(CompareFilter.CompareOp, ByteArrayComparable, Cell) - Method in class org.apache.hadoop.hbase.filter.CompareFilter
-
- CompareFilter - Class in org.apache.hadoop.hbase.filter
-
This is a generic filter to be used to filter by comparison.
- CompareFilter(CompareFilter.CompareOp, ByteArrayComparable) - Constructor for class org.apache.hadoop.hbase.filter.CompareFilter
-
Constructor.
- CompareFilter.CompareOp - Enum in org.apache.hadoop.hbase.filter
-
Comparison operators.
- compareOp - Variable in class org.apache.hadoop.hbase.filter.CompareFilter
-
- compareOp - Variable in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- compareQualifier(CompareFilter.CompareOp, ByteArrayComparable, Cell) - Method in class org.apache.hadoop.hbase.filter.CompareFilter
-
- compareResults(Result, Result) - Static method in class org.apache.hadoop.hbase.client.Result
-
Does a deep comparison of two Results, down to the byte arrays.
- compareRow(CompareFilter.CompareOp, ByteArrayComparable, Cell) - Method in class org.apache.hadoop.hbase.filter.CompareFilter
-
- compareTo(Row) - Method in class org.apache.hadoop.hbase.client.Get
-
- compareTo(Row) - Method in class org.apache.hadoop.hbase.client.Increment
-
- compareTo(Row) - Method in class org.apache.hadoop.hbase.client.Mutation
-
- compareTo(Row) - Method in class org.apache.hadoop.hbase.client.RowMutations
-
- compareTo(byte[], int, int) - Method in class org.apache.hadoop.hbase.filter.BinaryComparator
-
- compareTo(ByteBuffer, int, int) - Method in class org.apache.hadoop.hbase.filter.BinaryComparator
-
- compareTo(byte[], int, int) - Method in class org.apache.hadoop.hbase.filter.BinaryPrefixComparator
-
- compareTo(ByteBuffer, int, int) - Method in class org.apache.hadoop.hbase.filter.BinaryPrefixComparator
-
- compareTo(byte[], int, int) - Method in class org.apache.hadoop.hbase.filter.BitComparator
-
- compareTo(ByteBuffer, int, int) - Method in class org.apache.hadoop.hbase.filter.BitComparator
-
- compareTo(byte[]) - Method in class org.apache.hadoop.hbase.filter.ByteArrayComparable
-
- compareTo(byte[], int, int) - Method in class org.apache.hadoop.hbase.filter.ByteArrayComparable
-
Special compareTo method for subclasses, to avoid
copying byte[] unnecessarily.
- compareTo(ByteBuffer, int, int) - Method in class org.apache.hadoop.hbase.filter.ByteArrayComparable
-
Special compareTo method for subclasses, to avoid copying bytes unnecessarily.
- compareTo(byte[], int, int) - Method in class org.apache.hadoop.hbase.filter.LongComparator
-
- compareTo(ByteBuffer, int, int) - Method in class org.apache.hadoop.hbase.filter.LongComparator
-
- compareTo(MultiRowRangeFilter.RowRange) - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
- compareTo(byte[]) - Method in class org.apache.hadoop.hbase.filter.NullComparator
-
- compareTo(byte[], int, int) - Method in class org.apache.hadoop.hbase.filter.NullComparator
-
- compareTo(ByteBuffer, int, int) - Method in class org.apache.hadoop.hbase.filter.NullComparator
-
- compareTo(byte[], int, int) - Method in class org.apache.hadoop.hbase.filter.RegexStringComparator
-
- compareTo(byte[], int, int) - Method in class org.apache.hadoop.hbase.filter.SubstringComparator
-
- compareTo(HColumnDescriptor) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- compareTo(HRegionInfo) - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- compareTo(HRegionLocation) - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- compareTo(HTableDescriptor) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Compares the descriptor with another descriptor which is passed as a parameter.
- compareTo(ImmutableBytesWritable) - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
Define the sort order of the BytesWritable.
- compareTo(byte[]) - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
Compares the bytes in this object to the specified byte array
- compareTo(TableSplit) - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- compareTo(TableSplit) - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Compares this split against the given one.
- compareTo(ServerName) - Method in class org.apache.hadoop.hbase.ServerName
-
- compareTo(TableName) - Method in class org.apache.hadoop.hbase.TableName
-
For performance reasons, the ordering is not lexicographic.
- compareTo(ByteBuffer, int, int, ByteBuffer, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- compareTo(ByteBuffer, int, int, byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- compareTo(Bytes) - Method in class org.apache.hadoop.hbase.util.Bytes
-
Define the sort order of the Bytes.
- compareTo(byte[]) - Method in class org.apache.hadoop.hbase.util.Bytes
-
Compares the bytes in this object to the specified byte array
- compareTo(byte[], byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- compareTo(byte[], int, int, byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Lexicographically compare two arrays.
- compareValue(CompareFilter.CompareOp, ByteArrayComparable, Cell) - Method in class org.apache.hadoop.hbase.filter.CompareFilter
-
- COMPRESS_TAGS - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- COMPRESSION - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- Compression.Algorithm - Enum in org.apache.hadoop.hbase.io.compress
-
Compression algorithms.
- COMPRESSION_COMPACT - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- conf - Variable in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- CONF_SKIP_TMP - Static variable in class org.apache.hadoop.hbase.snapshot.ExportSnapshot
-
- CONFIG - Static variable in class org.apache.hadoop.hbase.HBaseInterfaceAudience
-
Denotes class names that appear in user facing configuration files.
- CONFIGURATION - Static variable in class org.apache.hadoop.hbase.HConstants
-
- ConfigurationUtil - Class in org.apache.hadoop.hbase.util
-
Utilities for storing more complex collection types in Configuration instances.
- configure(JobConf) - Method in class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
- configure(JobConf) - Method in class org.apache.hadoop.hbase.mapred.HRegionPartitioner
-
- configure(JobConf) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormat
-
- configureCfRenaming(Configuration, Map<String, String>) - Static method in class org.apache.hadoop.hbase.mapreduce.Import
-
Sets a configuration property with key Import.CF_RENAME_PROP in conf that tells
the mapper how to rename column families.
- configureIncrementalLoad(Job, HTable) - Static method in class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat
-
Deprecated.
Configure a MapReduce Job to perform an incremental load into the given
table.
- configureIncrementalLoad(Job, HTable) - Static method in class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2
-
- configureIncrementalLoad(Job, Table, RegionLocator) - Static method in class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2
-
Configure a MapReduce Job to perform an incremental load into the given
table.
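A hedged sketch of preparing a bulk-load job with this method; conf and tableName are assumed to exist, and the job name is a placeholder:
  try (Connection connection = ConnectionFactory.createConnection(conf);
       Table table = connection.getTable(tableName);
       RegionLocator regionLocator = connection.getRegionLocator(tableName)) {
    Job job = Job.getInstance(conf, "prepare-hfiles");
    HFileOutputFormat2.configureIncrementalLoad(job, table, regionLocator);
    // set the job's mapper and input before submitting; the output side is configured above
  }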
- configureIncrementalLoad(Job, HTableDescriptor, RegionLocator) - Static method in class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2
-
Configure a MapReduce Job to perform an incremental load into the given
table.
- configureIncrementalLoadMap(Job, HTableDescriptor) - Static method in class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2
-
- configureSplitTable(Job, TableName) - Static method in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Sets split table in map-reduce job.
- Connection - Interface in org.apache.hadoop.hbase.client
-
A cluster connection encapsulating lower level individual connections to actual servers and
a connection to zookeeper.
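For context, a brief sketch of how such a connection is typically obtained and used; the table name is a placeholder:
  Configuration conf = HBaseConfiguration.create();
  try (Connection connection = ConnectionFactory.createConnection(conf);
       Table table = connection.getTable(TableName.valueOf("t1"))) {
    // Table instances are lightweight; the Connection is the heavyweight, shared object.
    table.get(new Get(Bytes.toBytes("row1")));
  }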
- ConnectionClosingException - Exception in org.apache.hadoop.hbase.exceptions
-
Thrown when the server the client is trying to communicate with has
been repeatedly unresponsive for a while.
- ConnectionClosingException(String) - Constructor for exception org.apache.hadoop.hbase.exceptions.ConnectionClosingException
-
- ConnectionFactory - Class in org.apache.hadoop.hbase.client
-
A non-instantiable class that manages creation of Connections.
- ConnectionFactory() - Constructor for class org.apache.hadoop.hbase.client.ConnectionFactory
-
No public c.tors
- Consistency - Enum in org.apache.hadoop.hbase.client
-
Consistency defines the expected consistency level for an operation.
- consistency - Variable in class org.apache.hadoop.hbase.client.Query
-
- Constants - Interface in org.apache.hadoop.hbase.rest
-
Common constants for org.apache.hadoop.hbase.rest
- contains(byte[]) - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
- contains(byte[], int, int) - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
- contains(byte[], byte) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- contains(byte[], byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- containsColumn(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Result
-
Checks for existence of a value for the specified column (empty or not).
- containsColumn(byte[], int, int, byte[], int, int) - Method in class org.apache.hadoop.hbase.client.Result
-
Checks for existence of a value for the specified column (empty or not).
- containsEmptyColumn(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Result
-
Checks if the specified column contains an empty value (a zero-length byte array).
- containsEmptyColumn(byte[], int, int, byte[], int, int) - Method in class org.apache.hadoop.hbase.client.Result
-
Checks if the specified column contains an empty value (a zero-length byte array).
- containsNonEmptyColumn(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Result
-
Checks if the specified column contains a non-empty value (not a zero-length byte array).
- containsNonEmptyColumn(byte[], int, int, byte[], int, int) - Method in class org.apache.hadoop.hbase.client.Result
-
Checks if the specified column contains a non-empty value (not a zero-length byte array).
- containsRange(byte[], byte[]) - Method in class org.apache.hadoop.hbase.HRegionInfo
-
Returns true if the given inclusive range of rows is fully contained
by this region.
- containsRow(byte[]) - Method in class org.apache.hadoop.hbase.HRegionInfo
-
Return true if the given row falls in this region.
- Context - Class in org.apache.hadoop.hbase.io.crypto
-
Crypto context.
- convert(HBaseProtos.TableState.State) - Static method in enum org.apache.hadoop.hbase.client.TableState.State
-
Convert from PB version of State
- convert() - Method in enum org.apache.hadoop.hbase.client.TableState.State
-
Convert to PB version of State
- convert() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
Convert a ClusterStatus to a protobuf ClusterStatus
- convert(ClusterStatusProtos.ClusterStatus) - Static method in class org.apache.hadoop.hbase.ClusterStatus
-
Convert a protobuf ClusterStatus to a ClusterStatus
- convert(HBaseProtos.ColumnFamilySchema) - Static method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- convert() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- convert(HRegionInfo) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Convert an HRegionInfo to a RegionInfo
- convert(HBaseProtos.RegionInfo) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Convert a RegionInfo to an HRegionInfo
- convert() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- convert(HBaseProtos.TableSchema) - Static method in class org.apache.hadoop.hbase.HTableDescriptor
-
- convert(ProcedureProtos.Procedure) - Static method in class org.apache.hadoop.hbase.ProcedureInfo
-
Helper to convert the protobuf object.
- convertByteArrayToBoolean(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Converts a boolean expressed in a byte array to an actual boolean
- convertByteArrayToInt(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Converts an int expressed in a byte array to an actual int
- convertByteArrayToLong(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Converts a long expressed in a byte array to an actual long
- convertToProcedureProto(ProcedureInfo) - Static method in class org.apache.hadoop.hbase.ProcedureInfo
-
- COPROC - Static variable in class org.apache.hadoop.hbase.HBaseInterfaceAudience
-
- CoprocessorException - Exception in org.apache.hadoop.hbase.coprocessor
-
Thrown if a coprocessor encounters any exception.
- CoprocessorException() - Constructor for exception org.apache.hadoop.hbase.coprocessor.CoprocessorException
-
Default Constructor
- CoprocessorException(Class<?>, String) - Constructor for exception org.apache.hadoop.hbase.coprocessor.CoprocessorException
-
Constructor with a Class object and exception message.
- CoprocessorException(String) - Constructor for exception org.apache.hadoop.hbase.coprocessor.CoprocessorException
-
Constructs the exception and supplies a string as the message
- CoprocessorRpcChannel - Class in org.apache.hadoop.hbase.ipc
-
Base class which provides clients with an RPC connection to
call coprocessor endpoint Services.
- CoprocessorRpcChannel() - Constructor for class org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel
-
- coprocessorService() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Creates and returns a RpcChannel instance connected to the active master.
- coprocessorService(ServerName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Creates and returns a RpcChannel instance connected to the passed region server.
- coprocessorService(byte[]) - Method in interface org.apache.hadoop.hbase.client.Table
-
Creates and returns a RpcChannel instance connected to the table region containing the specified row.
- coprocessorService(Class<T>, byte[], byte[], Batch.Call<T, R>) - Method in interface org.apache.hadoop.hbase.client.Table
-
Creates an instance of the given Service subclass for each table region spanning the range
from the startKey row to the endKey row (inclusive), and invokes the passed Batch.Call.call(T)
method with each Service instance.
- coprocessorService(Class<T>, byte[], byte[], Batch.Call<T, R>, Batch.Callback<R>) - Method in interface org.apache.hadoop.hbase.client.Table
-
Creates an instance of the given Service subclass for each table region spanning the range
from the startKey row to the endKey row (inclusive), and invokes the passed Batch.Call.call(T)
method with each Service instance.
- coprocessorService(byte[]) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- coprocessorService(Class<T>, byte[], byte[], Batch.Call<T, R>) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- coprocessorService(Class<T>, byte[], byte[], Batch.Call<T, R>, Batch.Callback<R>) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- copy(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Copy the byte array given in parameter and return an instance
of a new byte array with the same length and the same content.
- copy(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Copy the byte array given in parameter and return an instance
of a new byte array with the same length and the same content.
- copyBufferToStream(OutputStream, ByteBuffer, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copy data from a buffer to an output stream.
- copyBytes() - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
Returns a copy of the bytes referred to by this writable
- copyBytes() - Method in class org.apache.hadoop.hbase.util.Bytes
-
Returns a copy of the bytes referred to by this writable
- copyFamilyTo(Cell, byte[], int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- copyFrom(Result) - Method in class org.apache.hadoop.hbase.client.Result
-
Copy another Result into this one.
- copyFromArrayToBuffer(ByteBuffer, byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copies the bytes from given array's offset to length part into the given buffer.
- copyFromBufferToArray(byte[], ByteBuffer, int, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copies specified number of bytes from given offset of 'in' ByteBuffer to
the array.
- copyFromBufferToBuffer(ByteBuffer, ByteBuffer) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copy one buffer's whole data to another.
- copyFromBufferToBuffer(ByteBuffer, ByteBuffer, int, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copy from one buffer to another from given offset.
- copyFromBufferToBuffer(ByteBuffer, ByteBuffer, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copy from one buffer to another from given offset.
- copyFromStreamToBuffer(ByteBuffer, DataInputStream, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copy the given number of bytes from the given stream and put it at the
current position of the given buffer, updating the position in the buffer.
- copyQualifierTo(Cell, byte[], int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- copyRowTo(Cell, byte[], int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
copyTo
- CopyTable - Class in org.apache.hadoop.hbase.mapreduce
-
Tool used to copy a table to another one which can be on a different setup.
- CopyTable() - Constructor for class org.apache.hadoop.hbase.mapreduce.CopyTable
-
- copyTagTo(Cell, byte[], int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Copies the tags info into the tag portion of the cell
- copyToNewArrays(Collection<ByteRange>) - Static method in class org.apache.hadoop.hbase.util.ByteRangeUtils
-
- copyValueTo(Cell, byte[], int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- CORRUPT_DIR_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
- CorruptedSnapshotException - Exception in org.apache.hadoop.hbase.snapshot
-
Exception thrown when the snapshot info found on the filesystem is not valid.
- CorruptedSnapshotException(String, Exception) - Constructor for exception org.apache.hadoop.hbase.snapshot.CorruptedSnapshotException
-
- CorruptedSnapshotException(String, HBaseProtos.SnapshotDescription) - Constructor for exception org.apache.hadoop.hbase.snapshot.CorruptedSnapshotException
-
Snapshot was corrupt for some reason
- CorruptedSnapshotException(String) - Constructor for exception org.apache.hadoop.hbase.snapshot.CorruptedSnapshotException
-
- COUNT_OF_ROWS_FILTERED_KEY - Static variable in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
- COUNT_OF_ROWS_SCANNED_KEY - Static variable in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
- Counter - Class in org.apache.hadoop.hbase.util
-
Highly scalable counter.
- Counter() - Constructor for class org.apache.hadoop.hbase.util.Counter
-
- Counter(long) - Constructor for class org.apache.hadoop.hbase.util.Counter
-
- countOfBytesInRemoteResults - Variable in class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
number of bytes in Result objects from remote region servers
- countOfBytesInResults - Variable in class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
number of bytes in Result objects from region servers
- countOfNSRE - Variable in class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
number of NotServingRegionException caught
- countOfRegions - Variable in class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
number of regions
- countOfRemoteRPCcalls - Variable in class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
number of remote RPC calls
- countOfRemoteRPCRetries - Variable in class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
number of remote RPC retries
- countOfRowsFiltered - Variable in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
number of rows filtered during scan RPC
- countOfRowsScanned - Variable in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
number of rows scanned during scan RPC.
- countOfRPCcalls - Variable in class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
number of RPC calls
- countOfRPCRetries - Variable in class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
number of RPC retries
- CP_HTD_ATTR_KEY_PATTERN - Static variable in class org.apache.hadoop.hbase.HConstants
-
- CP_HTD_ATTR_VALUE_PARAM_KEY_PATTERN - Static variable in class org.apache.hadoop.hbase.HConstants
-
- CP_HTD_ATTR_VALUE_PARAM_PATTERN - Static variable in class org.apache.hadoop.hbase.HConstants
-
- CP_HTD_ATTR_VALUE_PARAM_VALUE_PATTERN - Static variable in class org.apache.hadoop.hbase.HConstants
-
- CP_HTD_ATTR_VALUE_PATTERN - Static variable in class org.apache.hadoop.hbase.HConstants
-
Pattern that matches a coprocessor specification.
- create(List<Cell>) - Static method in class org.apache.hadoop.hbase.client.Result
-
Instantiate a Result with the specified List of KeyValues.
- create(List<Cell>, Boolean) - Static method in class org.apache.hadoop.hbase.client.Result
-
- create(List<Cell>, Boolean, boolean) - Static method in class org.apache.hadoop.hbase.client.Result
-
- create(List<Cell>, Boolean, boolean, boolean) - Static method in class org.apache.hadoop.hbase.client.Result
-
- create(Cell[]) - Static method in class org.apache.hadoop.hbase.client.Result
-
Instantiate a Result with the specified array of KeyValues.
- create(Cell[], Boolean, boolean) - Static method in class org.apache.hadoop.hbase.client.Result
-
- create(Cell[], Boolean, boolean, boolean) - Static method in class org.apache.hadoop.hbase.client.Result
-
- create() - Static method in class org.apache.hadoop.hbase.HBaseConfiguration
-
Creates a Configuration with HBase resources
- create(Configuration) - Static method in class org.apache.hadoop.hbase.HBaseConfiguration
-
- create(byte[], int, int, byte[], int, int, byte[], int, int, long, byte[], int, int) - Method in class org.apache.hadoop.hbase.mapreduce.CellCreator
-
- create(byte[], int, int, byte[], int, int, byte[], int, int, long, byte[], int, int, String) - Method in class org.apache.hadoop.hbase.mapreduce.CellCreator
-
Deprecated.
- create(byte[], int, int, byte[], int, int, byte[], int, int, long, byte[], int, int, List<Tag>) - Method in class org.apache.hadoop.hbase.mapreduce.CellCreator
-
- create(String) - Static method in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- create(NamespaceDescriptor) - Static method in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- create(UserGroupInformation) - Static method in class org.apache.hadoop.hbase.security.User
-
Wraps an underlying UserGroupInformation
instance.
- CREATE_TABLE_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- CREATE_TABLE_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
- createCell(byte[], byte[], byte[], long, byte, byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- createCell(byte[], int, int, byte[], int, int, byte[], int, int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- createCell(byte[], byte[], byte[], long, byte, byte[], long) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Marked as audience Private as of 1.2.0.
- createCell(byte[], byte[], byte[], long, byte, byte[], byte[], long) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Marked as audience Private as of 1.2.0.
- createCell(byte[], byte[], byte[], long, KeyValue.Type, byte[], byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Marked as audience Private as of 1.2.0.
- createCell(byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Cell with specific row.
- createCell(byte[], byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Cell with specific row and value.
- createCell(byte[], byte[], byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Cell with specific row.
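Taken together, the createCell overloads build in-memory Cell instances without going through a Put. A minimal sketch, assuming hypothetical row, family, and qualifier values:

    import org.apache.hadoop.hbase.Cell;
    import org.apache.hadoop.hbase.CellUtil;
    import org.apache.hadoop.hbase.util.Bytes;

    public class CreateCellExample {
      public static void main(String[] args) {
        // Build a Cell from a (hypothetical) row, family and qualifier.
        Cell cell = CellUtil.createCell(
            Bytes.toBytes("row1"), Bytes.toBytes("cf"), Bytes.toBytes("q"));
        // The clone*/copy*To helpers in CellUtil read the pieces back out.
        System.out.println(Bytes.toString(CellUtil.cloneRow(cell)));
      }
    }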
- createCellScanner(List<? extends CellScannable>) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- createCellScanner(Iterable<Cell>) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- createCellScanner(Iterator<Cell>) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- createCellScanner(Cell[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- createCellScanner(NavigableMap<byte[], List<Cell>>) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Flatten the map of cells out under the CellScanner
- createComparator(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Parses a comparator of the form comparatorType:comparatorValue and returns a comparator
- createCompareOp(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Takes a compareOperator symbol as a byte array and returns the corresponding CompareOperator
- createCompleteResult(List<Result>) - Static method in class org.apache.hadoop.hbase.client.Result
-
Forms a single result from the partial results in the partialResults list.
- createCompressionStream(OutputStream, Compressor, int) - Method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
- createConnection() - Static method in class org.apache.hadoop.hbase.client.ConnectionFactory
-
Create a new Connection instance using default HBaseConfiguration.
- createConnection(Configuration) - Static method in class org.apache.hadoop.hbase.client.ConnectionFactory
-
Create a new Connection instance using the passed conf
instance.
- createConnection(Configuration, ExecutorService) - Static method in class org.apache.hadoop.hbase.client.ConnectionFactory
-
Create a new Connection instance using the passed conf
instance.
- createConnection(Configuration, User) - Static method in class org.apache.hadoop.hbase.client.ConnectionFactory
-
Create a new Connection instance using the passed conf
instance.
- createConnection(Configuration, ExecutorService, User) - Static method in class org.apache.hadoop.hbase.client.ConnectionFactory
-
Create a new Connection instance using the passed conf
instance.
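The createConnection entries above are the standard entry point to the client API. A minimal usage sketch against an existing table; the table name, family, qualifier, and value below are hypothetical placeholders, not part of this index:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ConnectionExample {
      public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();   // reads hbase-site.xml from the classpath
        // Connections are heavyweight; create one, share it, and close it when done.
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("my_table"))) {   // hypothetical table
          Put put = new Put(Bytes.toBytes("row1"));
          put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
          table.put(put);
        }
      }
    }

The overloads taking an ExecutorService or User change only who executes the requests and as whom; the lifecycle shown here is the same.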
- createCounter(String) - Method in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
Create a new counter with the specified name
- createDecompressionStream(InputStream, Decompressor, int) - Method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
- createDecryptionStream(InputStream, Context, byte[]) - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Create a decrypting input stream given a context and IV
- createDecryptionStream(InputStream, Decryptor) - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Create a decrypting input stream given an initialized decryptor
- createDecryptionStream(InputStream) - Method in interface org.apache.hadoop.hbase.io.crypto.Decryptor
-
Create a stream for decryption
- createEncoder(String) - Static method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
- createEncryptionStream(OutputStream, Context, byte[]) - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Create an encrypting output stream given a context and IV
- createEncryptionStream(OutputStream, Encryptor) - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Create an encrypting output stream given an initialized encryptor
- createEncryptionStream(OutputStream) - Method in interface org.apache.hadoop.hbase.io.crypto.Encryptor
-
Create a stream for encryption
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.ColumnCountGetFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.ColumnPrefixFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.FamilyFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.InclusiveStopFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.KeyOnlyFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.PageFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.PrefixFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.QualifierFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.RowFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.SingleColumnValueExcludeFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.TimestampsFilter
-
- createFilterFromArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.ValueFilter
-
- createFirstDeleteFamilyCellOnRow(byte[], byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Delete Family Cell for the specified row and family that would
be smaller than all other possible Delete Family KeyValues that have the
same row and family.
- createFirstOnNextRow(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Cell that is smaller than all other possible Cells for the given Cell row's next row.
- createFirstOnRow(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Cell that is smaller than all other possible Cells for the given Cell's row.
- createFirstOnRowCol(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Cell that is smaller than all other possible Cells for the given Cell's row.
- createFirstOnRowCol(Cell, byte[], int, int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Cell that is smaller than all other possible Cells for the given Cell's rk:cf and
passed qualifier.
- createFirstOnRowColTS(Cell, long) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Creates the first cell with the row/family/qualifier of this cell and the given timestamp.
- createGroupKey(byte[][]) - Method in class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
Create a key by concatenating multiple column values.
- createGroupKey(byte[][]) - Method in class org.apache.hadoop.hbase.mapreduce.GroupingTableMapper
-
Create a key by concatenating multiple column values.
- createKey() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- createKey() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- createLastOnRow(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Cell that is larger than all other possible Cells for the given Cell's row.
- createLastOnRowCol(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Create a Cell that is larger than all other possible Cells for the given Cell's rk:cf:q.
- createMaxByteArray(int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Create a max byte array with the specified max byte count
- createNamespace(NamespaceDescriptor) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Create a new namespace
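This pairs with the NamespaceDescriptor.create entries earlier in the index, which return a builder. A minimal sketch; the namespace name is a hypothetical placeholder:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.NamespaceDescriptor;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;

    public class CreateNamespaceExample {
      public static void main(String[] args) throws Exception {
        try (Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Admin admin = connection.getAdmin()) {
          // NamespaceDescriptor.create(String) returns a builder; build() yields the descriptor.
          admin.createNamespace(NamespaceDescriptor.create("analytics").build()); // hypothetical namespace
        }
      }
    }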
- createPlainCompressionStream(OutputStream, Compressor) - Method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
Creates a compression stream without any additional wrapping into
buffering streams.
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.HLogInputFormat
-
Deprecated.
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormatBase
-
Builds a TableRecordReader.
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormat
-
- createRecordReader(InputSplit, TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.WALInputFormat
-
- createRegionName(TableName, byte[], long, boolean) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Make a region name of passed parameters.
- createRegionName(TableName, byte[], String, boolean) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Make a region name of passed parameters.
- createRegionName(TableName, byte[], long, int, boolean) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Make a region name of passed parameters.
- createRegionName(TableName, byte[], byte[], boolean) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Make a region name of passed parameters.
- createRegionName(TableName, byte[], byte[], int, boolean) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Make a region name of passed parameters.
- createSubmittableJob(String[]) - Method in class org.apache.hadoop.hbase.mapred.RowCounter
-
- createSubmittableJob(Configuration, String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.CellCounter
-
Sets up the actual job.
- createSubmittableJob(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.CopyTable
-
Sets up the actual job.
- createSubmittableJob(Configuration, String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.Export
-
Sets up the actual job.
- createSubmittableJob(Configuration, String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.Import
-
Sets up the actual job.
- createSubmittableJob(Configuration, String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
Sets up the actual job.
- createSubmittableJob(Configuration, String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.RowCounter
-
Sets up the actual job.
- createSubmittableJob(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.WALPlayer
-
Sets up the actual job.
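The createSubmittableJob variants above configure a MapReduce Job that the caller then submits. A sketch using the RowCounter variant; the table name is hypothetical, and the argument layout is assumed to mirror the tool's command line (table name first, optional columns after):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.mapreduce.RowCounter;
    import org.apache.hadoop.mapreduce.Job;

    public class RowCountExample {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Build the counting job, then submit it and wait for completion.
        Job job = RowCounter.createSubmittableJob(conf, new String[] { "my_table" }); // hypothetical table
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }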
- createTable(HTableDescriptor) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Creates a new table.
- createTable(HTableDescriptor, byte[], byte[], int) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Creates a new table with the specified number of regions.
- createTable(HTableDescriptor, byte[][]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Creates a new table with an initial set of empty regions defined by the specified split keys.
- createTable(HTableDescriptor) - Method in class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
Creates a new table.
- createTableAsync(HTableDescriptor, byte[][]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Creates a new table but does not block and wait for it to come online.
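A minimal sketch of the blocking createTable(HTableDescriptor) call, using the HTableDescriptor and HColumnDescriptor types listed elsewhere in this index; the table and family names are hypothetical:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.HTableDescriptor;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;

    public class CreateTableExample {
      public static void main(String[] args) throws Exception {
        try (Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Admin admin = connection.getAdmin()) {
          HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("my_table")); // hypothetical
          desc.addFamily(new HColumnDescriptor("cf"));
          // Blocking create; the split-key overloads pre-split the table,
          // and createTableAsync returns without waiting for it to come online.
          admin.createTable(desc);
        }
      }
    }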
- createTreeSet() - Method in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- createUnescapdArgument(byte[], int, int) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Removes the single quote that escapes a single quote, returning the unescaped argument
- createUserForTesting(Configuration, String, String[]) - Static method in class org.apache.hadoop.hbase.security.User
-
Generates a new User
instance specifically for use in test code.
- createValue() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- createValue() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- createVisibilityExpTags(String) - Method in interface org.apache.hadoop.hbase.mapreduce.VisibilityExpressionResolver
-
Convert visibility expression into tags to be serialized.
- createVisibilityExpTags(String, boolean, boolean) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
Creates tags corresponding to given visibility expression.
- CREDENTIALS_LOCATION - Static variable in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- CRLF - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- CRYPTO_ALTERNATE_KEY_ALGORITHM_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the name of the alternate cipher algorithm for the cluster, a string
- CRYPTO_CIPHERPROVIDER_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the crypto algorithm provider, a class name
- CRYPTO_KEY_ALGORITHM_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the algorithm used for creating a JKS key, a string
- CRYPTO_KEYPROVIDER_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the crypto key provider, a class name
- CRYPTO_KEYPROVIDER_PARAMETERS_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the crypto key provider parameters
- CRYPTO_MASTERKEY_ALTERNATE_NAME_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the name of the alternate master key for the cluster, a string
- CRYPTO_MASTERKEY_NAME_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the name of the master key for the cluster, a string
- CRYPTO_WAL_ALGORITHM_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the algorithm to use when encrypting the WAL, a string
- CRYPTO_WAL_KEY_NAME_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the name of the master WAL encryption key for the cluster, a string
- current() - Method in class org.apache.hadoop.hbase.client.Result
-
- CUSTOM_FILTERS - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- D - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for 'D'
- DATA_BLOCK_ENCODING - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- DATA_FILE_UMASK_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
File permission umask to use when creating hbase data files
- DATABLOCK_ENCODING_OVERRIDE_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat
-
Deprecated.
- DATABLOCK_ENCODING_OVERRIDE_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2
-
- DataBlockEncoding - Enum in org.apache.hadoop.hbase.io.encoding
-
Provide access to all data block encoding algorithms.
- DataType<T> - Interface in org.apache.hadoop.hbase.types
-
DataType
is the base interface for all HBase data types.
- DAY_IN_SECONDS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Seconds in a day, hour and minute
- decode(PositionedByteRange) - Method in interface org.apache.hadoop.hbase.types.DataType
-
Read an instance of T
from the buffer src
.
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedBlob
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedBlobVar
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedFloat32
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedFloat64
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedInt16
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedInt32
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedInt64
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedInt8
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedNumeric
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedString
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawByte
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawBytes
-
- decode(PositionedByteRange, int) - Method in class org.apache.hadoop.hbase.types.RawBytes
-
Read a byte[]
from the buffer src
.
- decode(PositionedByteRange, int) - Method in class org.apache.hadoop.hbase.types.RawBytesFixedLength
-
Read a byte[]
from the buffer src
.
- decode(PositionedByteRange, int) - Method in class org.apache.hadoop.hbase.types.RawBytesTerminated
-
Read a byte[]
from the buffer src
.
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawDouble
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawFloat
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawInteger
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawLong
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawShort
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawString
-
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.Struct
-
- decode(PositionedByteRange, int) - Method in class org.apache.hadoop.hbase.types.Struct
-
Read the field at index
.
- decode(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
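The decode methods above pair with the corresponding encode methods listed further down this index. A minimal round-trip sketch using one of the order-preserving types; the buffer capacity and value are arbitrary:

    import org.apache.hadoop.hbase.types.OrderedInt32;
    import org.apache.hadoop.hbase.util.PositionedByteRange;
    import org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange;

    public class DataTypeRoundTripExample {
      public static void main(String[] args) {
        PositionedByteRange buf = new SimplePositionedMutableByteRange(20);
        // Encode an int in the order-preserving ascending encoding...
        OrderedInt32.ASCENDING.encodeInt(buf, 42);
        // ...rewind, and decode it back out.
        buf.setPosition(0);
        System.out.println(OrderedInt32.ASCENDING.decodeInt(buf)); // prints 42
      }
    }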
- DECODE - Static variable in class org.apache.hadoop.hbase.util.Base64
-
Specify decoding.
- decode(byte[], int, int, int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Very low-level access to decoding ASCII characters in the form of a byte
array.
- decode(String) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Decodes data from Base64 notation, automatically detecting gzip-compressed
data and decompressing it.
- decode(String, int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Decodes data from Base64 notation, automatically detecting gzip-compressed
data and decompressing it.
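A small sketch of the String-based decode; the literal below is simply "hello" in Base64, and gzip-compressed payloads would be detected and decompressed automatically:

    import org.apache.hadoop.hbase.util.Base64;

    public class Base64DecodeExample {
      public static void main(String[] args) {
        byte[] raw = Base64.decode("aGVsbG8=");
        System.out.println(new String(raw)); // prints "hello"
      }
    }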
- decode4to3(byte[], int, byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Decodes four bytes from array source and writes the resulting
bytes (up to three of them) to destination.
- decodeA(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.Union2
-
Read an instance of the first type parameter from buffer src
.
- decodeB(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.Union2
-
Read an instance of the second type parameter from buffer src
.
- decodeBlobCopy(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode a Blob value, byte-for-byte copy.
- decodeBlobVar(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode a blob value that was encoded using BlobVar encoding.
- decodeByte(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedInt8
-
Read a byte
value from the buffer src
.
- decodeByte(byte[], int) - Method in class org.apache.hadoop.hbase.types.RawByte
-
Read a byte
value from the buffer buff
.
- decodeC(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.Union3
-
Read an instance of the third type parameter from buffer src
.
- decodeD(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.Union4
-
Read an instance of the fourth type parameter from buffer src
.
- decodeDouble(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedFloat64
-
Read a double
value from the buffer src
.
- decodeDouble(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedNumeric
-
Read a double
value from the buffer src
.
- decodeDouble(byte[], int) - Method in class org.apache.hadoop.hbase.types.RawDouble
-
Read a double
value from the buffer buff
.
- decodeFileToFile(String, String) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Reads infile and decodes it to outfile.
- decodeFloat(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedFloat32
-
Read a float
value from the buffer src
.
- decodeFloat(byte[], int) - Method in class org.apache.hadoop.hbase.types.RawFloat
-
Read a float
value from the buffer buff
.
- decodeFloat32(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode a 32-bit floating point value using the fixed-length encoding.
- decodeFloat64(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode a 64-bit floating point value using the fixed-length encoding.
- decodeFromFile(String) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Convenience method for reading a base64-encoded file and decoding it.
- decodeInt(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedInt32
-
Read an int
value from the buffer src
.
- decodeInt(byte[], int) - Method in class org.apache.hadoop.hbase.types.RawInteger
-
Read an int
value from the buffer buff
.
- decodeInt16(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode an int16
value.
- decodeInt32(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode an int32
value.
- decodeInt64(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode an int64
value.
- decodeInt8(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode an int8
value.
- decodeLong(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedInt64
-
Read a long
value from the buffer src
.
- decodeLong(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedNumeric
-
Read a long
value from the buffer src
.
- decodeLong(byte[], int) - Method in class org.apache.hadoop.hbase.types.RawLong
-
Read a long
value from the buffer buff
.
- decodeNumericAsBigDecimal(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode a
BigDecimal
value from the variable-length encoding.
- decodeNumericAsDouble(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode a primitive double
value from the Numeric encoding.
- decodeNumericAsLong(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode a primitive long
value from the Numeric encoding.
- decodeShort(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedInt16
-
Read a short
value from the buffer src
.
- decodeShort(byte[], int) - Method in class org.apache.hadoop.hbase.types.RawShort
-
Read a short
value from the buffer buff
.
- decodeString(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Decode a String value.
- decodeToFile(String, String) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Convenience method for decoding data to a file.
- decodeToObject(String) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Attempts to decode Base64 data and deserialize a Java Object within.
- decrement() - Method in class org.apache.hadoop.hbase.util.Counter
-
- decrypt(byte[], int, InputStream, int, Decryptor) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Decrypt a block of ciphertext read in from a stream with the given
cipher and context
- decrypt(byte[], int, InputStream, int, Encryption.Context, byte[]) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Decrypt a block of ciphertext from a stream given a context and IV
- decrypt(OutputStream, InputStream, int, Decryptor) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Decrypt a stream of ciphertext given a decryptor
- decrypt(OutputStream, InputStream, int, Encryption.Context, byte[]) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Decrypt a stream of ciphertext given a context and IV
- Decryptor - Interface in org.apache.hadoop.hbase.io.crypto
-
Decryptors apply a cipher to an InputStream to recover plaintext.
- decryptWithSubjectKey(OutputStream, InputStream, int, String, Configuration, Cipher, byte[]) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Decrypts a block of ciphertext with the symmetric key resolved for the given subject
- deepCopy() - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Create a new ByteRange
with new backing byte[] containing a copy
of the content from this
range's window.
- deepCopy() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- deepCopy() - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- deepCopy() - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- deepCopy() - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- deepCopy() - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- deepCopySubRangeTo(int, int, byte[], int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Wrapper for System.arraycopy.
- deepCopyTo(byte[], int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Wrapper for System.arraycopy.
- deepCopyToNewArray() - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Instantiate a new byte[] with exact length, which is at least 24 bytes +
length.
- DEFAULT_ATTRIBUTE_EARLY_OUT - Static variable in interface org.apache.hadoop.hbase.security.access.AccessControlConstants
-
Default setting for hbase.security.access.early_out
- DEFAULT_BLOCKCACHE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default setting for whether to use a block cache or not.
- DEFAULT_BLOCKSIZE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default size of blocks in files stored to the filesystem (hfiles).
- DEFAULT_BLOCKSIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default block size for an HFile.
- DEFAULT_BLOOMFILTER - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default setting for whether or not to use bloomfilters.
- DEFAULT_CACHE_BLOOMS_ON_WRITE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default setting for whether to cache bloom filter blocks on write if block
caching is enabled.
- DEFAULT_CACHE_DATA_IN_L1 - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default setting for whether to cache data blocks in L1 tier.
- DEFAULT_CACHE_DATA_ON_WRITE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default setting for whether to cache data blocks on write if block caching
is enabled.
- DEFAULT_CACHE_INDEX_ON_WRITE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default setting for whether to cache index blocks on write if block
caching is enabled.
- DEFAULT_CLUSTER_DISTRIBUTED - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default value for cluster distributed mode
- DEFAULT_CLUSTER_ID - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default cluster ID, cannot be used to identify a cluster so a key with
this value means it wasn't meant for replication.
- DEFAULT_COMPACTION_ENABLED - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
Constant that denotes whether the table is compaction enabled by default
- DEFAULT_COMPRESS_TAGS - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default compress tags along with any type of DataBlockEncoding.
- DEFAULT_COMPRESSION - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default compression type.
- DEFAULT_DATA_BLOCK_ENCODING - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default data block encoding algorithm.
- DEFAULT_DFS_REPLICATION - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- DEFAULT_DISALLOW_WRITES_IN_RECOVERING_CONFIG - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_DISTRIBUTED_LOG_REPLAY_CONFIG - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_ENABLE_CLIENT_BACKPRESSURE - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_ENCODE_ON_DISK - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default value of the flag that enables data block encoding on disk, as
opposed to encoding in cache only.
- DEFAULT_EVICT_BLOCKS_ON_CLOSE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default setting for whether to evict cached blocks from the blockcache on
close.
- DEFAULT_EVICT_REMAIN_RATIO - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_EXEC_PERMISSION_CHECKS - Static variable in interface org.apache.hadoop.hbase.security.access.AccessControlConstants
-
Default setting for hbase.security.exec.permission.checks; false
- DEFAULT_HBASE_CLIENT_MAX_PERREGION_TASKS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_CLIENT_MAX_PERSERVER_TASKS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_CLIENT_MAX_TOTAL_TASKS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_CLIENT_OPERATION_TIMEOUT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default HBase client operation timeout, which is tantamount to a blocking call
- DEFAULT_HBASE_CLIENT_PAUSE - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_CLIENT_RETRIES_NUMBER - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_CLIENT_SCANNER_ASYNC_PREFETCH - Static variable in class org.apache.hadoop.hbase.client.Scan
-
- DEFAULT_HBASE_CLIENT_SCANNER_CACHING - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_CLIENT_SCANNER_MAX_RESULT_SIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Maximum number of bytes returned when calling a scanner's next method.
- DEFAULT_HBASE_CLIENT_SCANNER_TIMEOUT_PERIOD - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_META_BLOCK_SIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_META_SCANNER_CACHING - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_META_VERSIONS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_RPC_SHORTOPERATION_TIMEOUT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_RPC_TIMEOUT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_SERVER_PAUSE - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HBASE_SERVER_SCANNER_MAX_RESULT_SIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Maximum number of bytes returned when calling a scanner's next method.
- DEFAULT_HEALTH_FAILURE_THRESHOLD - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HEALTH_SCRIPT_TIMEOUT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HEAP_OCCUPANCY_HIGH_WATERMARK - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HEAP_OCCUPANCY_LOW_WATERMARK - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HOST - Static variable in class org.apache.hadoop.hbase.HConstants
-
default host address
- DEFAULT_HREGION_EDITS_REPLAY_SKIP_ERRORS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_HREGION_MEMSTORE_BLOCK_MULTIPLIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default value for hbase.hregion.memstore.block.multiplier
- DEFAULT_HSTORE_OPEN_AND_CLOSE_THREADS_MAX - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default for the maximum number of threads used for opening and
closing stores or store files in parallel
- DEFAULT_IN_MEMORY - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default setting for whether to try and serve this column family from memory or not.
- DEFAULT_KEEP_DELETED - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default setting for preventing deleted cells from being collected immediately.
- DEFAULT_LISTEN_PORT - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- DEFAULT_MASTER_HANLDER_COUNT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_MASTER_INFOPORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
default port for master web api
- DEFAULT_MASTER_PORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
default port that the master listens on
- DEFAULT_MASTER_TYPE_BACKUP - Static variable in class org.apache.hadoop.hbase.HConstants
-
by default every master is a possible primary master unless the conf explicitly overrides it
- DEFAULT_MATH_CONTEXT - Static variable in class org.apache.hadoop.hbase.util.OrderedBytes
-
- DEFAULT_MAX_AGE - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- DEFAULT_MAX_BACKOFF - Static variable in class org.apache.hadoop.hbase.client.backoff.ExponentialClientBackoffPolicy
-
- DEFAULT_MAX_FILE_SIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default maximum file size
- DEFAULT_MEMSTORE_FLUSH_SIZE - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
Constant that denotes the maximum default size of the memstore after which
the contents are flushed to the store files
- DEFAULT_META_REPLICA_NUM - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_MIN_VERSIONS - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default is not to keep a minimum of versions.
- DEFAULT_MOB_CACHE_EVICT_PERIOD - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_MOB_CLEANER_PERIOD - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_MOB_COMPACTION_BATCH_SIZE - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_MOB_COMPACTION_CHORE_PERIOD - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_MOB_COMPACTION_MERGEABLE_THRESHOLD - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_MOB_COMPACTION_THREADS_MAX - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_MOB_DELFILE_MAX_COUNT - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_MOB_FILE_CACHE_SIZE - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_MOB_SWEEP_TOOL_COMPACTION_MEMSTORE_FLUSH_SIZE - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_MOB_THRESHOLD - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- DEFAULT_NAMESPACE - Static variable in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- DEFAULT_NAMESPACE_NAME - Static variable in class org.apache.hadoop.hbase.NamespaceDescriptor
-
Default namespace name.
- DEFAULT_NAMESPACE_NAME_STR - Static variable in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- DEFAULT_NORMALIZATION_ENABLED - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
Constant that denotes whether the table is normalized by default.
- DEFAULT_PREFETCH_BLOCKS_ON_OPEN - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- DEFAULT_READONLY - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
Constant that denotes whether the table is READONLY by default and is false
- DEFAULT_REGION_MEMSTORE_REPLICATION - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
- DEFAULT_REGION_REPLICATION - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
- DEFAULT_REGION_SERVER_HANDLER_ABORT_ON_ERROR_PERCENT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_REGION_SERVER_HANDLER_COUNT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_REGION_SERVER_HIGH_PRIORITY_HANDLER_COUNT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_REGION_SERVER_REPLICATION_HANDLER_COUNT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_REGIONSERVER_INFOPORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
default port for region server web api
- DEFAULT_REGIONSERVER_METRICS_PERIOD - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_REGIONSERVER_PORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default port region server listens on.
- DEFAULT_REPLICA_ID - Static variable in class org.apache.hadoop.hbase.HRegionInfo
-
- DEFAULT_REPLICATION_SCOPE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default scope.
- DEFAULT_STATUS_MULTICAST_ADDRESS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_STATUS_MULTICAST_BIND_ADDRESS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_STATUS_MULTICAST_PORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_SWEEP_TOOL_MOB_COMPACTION_MERGEABLE_SIZE - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_SWEEP_TOOL_MOB_COMPACTION_RATIO - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- DEFAULT_THREAD_WAKE_FREQUENCY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default value for thread wake frequency
- DEFAULT_TTL - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default time to live of cell contents.
- DEFAULT_USE_META_REPLICAS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_VERSION_FILE_WRITE_ATTEMPTS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default number of times to try to write a version file before failing
- DEFAULT_VERSIONS - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Default number of versions of a record to keep.
- DEFAULT_WAL_STORAGE_POLICY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_ZK_SESSION_TIMEOUT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default value for ZooKeeper session timeout
- DEFAULT_ZOOKEEPER_ZNODE_PARENT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DEFAULT_ZOOKEPER_CLIENT_PORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default client port that the zookeeper listens on
- DEFAULT_ZOOKEPER_MAX_CLIENT_CNXNS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default limit on concurrent client-side zookeeper connections
- DEFAULT_ZOOKEPER_RECOVERABLE_WAITIME - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default wait time for the recoverable zookeeper
- DefaultCipherProvider - Class in org.apache.hadoop.hbase.io.crypto
-
The default cipher provider.
- DEFERRED_LOG_FLUSH - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
- Delete - Class in org.apache.hadoop.hbase.client
-
Used to perform Delete operations on a single row.
- Delete(byte[]) - Constructor for class org.apache.hadoop.hbase.client.Delete
-
Create a Delete operation for the specified row.
- Delete(byte[], long) - Constructor for class org.apache.hadoop.hbase.client.Delete
-
Create a Delete operation for the specified row and timestamp.
- Delete(byte[], int, int) - Constructor for class org.apache.hadoop.hbase.client.Delete
-
Create a Delete operation for the specified row and timestamp.
- Delete(byte[], int, int, long) - Constructor for class org.apache.hadoop.hbase.client.Delete
-
Create a Delete operation for the specified row and timestamp.
- Delete(Delete) - Constructor for class org.apache.hadoop.hbase.client.Delete
-
- delete(Delete) - Method in interface org.apache.hadoop.hbase.client.Table
-
Deletes the specified cells/row.
- delete(List<Delete>) - Method in interface org.apache.hadoop.hbase.client.Table
-
Deletes the specified cells/rows in bulk.
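A minimal sketch of a single-row delete through the Table interface; the table and row names are hypothetical:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Delete;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class DeleteExample {
      public static void main(String[] args) throws Exception {
        try (Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Table table = connection.getTable(TableName.valueOf("my_table"))) { // hypothetical table
          // Delete(byte[]) targets the whole row; Delete(byte[], long) bounds it by timestamp.
          table.delete(new Delete(Bytes.toBytes("row1")));
        }
      }
    }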
- delete(String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a DELETE request
- delete(Cluster, String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a DELETE request
- delete(Delete) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- delete(List<Delete>) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- deleteCachedRegionLocation(HRegionLocation) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- deleteColumn(TableName, byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
- deleteColumnFamily(TableName, byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Delete a column family from a table.
- deleteNamespace(String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Delete an existing namespace.
- deleteSnapshot(byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Delete an existing snapshot.
- deleteSnapshot(String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Delete an existing snapshot.
- deleteSnapshots(String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Delete existing snapshots whose names match the pattern passed.
- deleteSnapshots(Pattern) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Delete existing snapshots whose names match the pattern passed.
- deleteTable(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Deletes a table.
- deleteTable(String) - Method in class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
Deletes a table.
- deleteTable(byte[]) - Method in class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
Deletes a table.
- deleteTableAsync(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Deletes the table but does not block and wait for it to be completely removed.
- deleteTables(String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Deletes tables matching the passed in pattern and wait on completion.
- deleteTables(Pattern) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Delete tables matching the passed in pattern and wait on completion.
- deleteTableSnapshots(String, String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Delete all existing snapshots matching the given table name regular expression and snapshot
name regular expression.
- deleteTableSnapshots(Pattern, Pattern) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Delete all existing snapshots matching the given table name regular expression and snapshot
name regular expression.
- DELIMITER - Static variable in class org.apache.hadoop.hbase.HConstants
-
delimiter used between portions of a region name
- DependentColumnFilter - Class in org.apache.hadoop.hbase.filter
-
A filter for adding inter-column timestamp matching.
Only cells with a correspondingly timestamped entry in
the target column will be retained.
Not compatible with Scan.setBatch, as operations need
full rows for correct filtering.
- DependentColumnFilter(byte[], byte[], boolean, CompareFilter.CompareOp, ByteArrayComparable) - Constructor for class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
Build a dependent column filter with value checking; the dependent column's values
will be compared using the supplied compareOp and comparator, for usage of which
refer to
CompareFilter
- DependentColumnFilter(byte[], byte[]) - Constructor for class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
Constructor for DependentColumn filter.
- DependentColumnFilter(byte[], byte[], boolean) - Constructor for class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
Constructor for DependentColumn filter.
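A minimal sketch of wiring this filter into a scan; the family and qualifier are hypothetical, and the boolean drops the target (dependent) column from the results:

    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.DependentColumnFilter;
    import org.apache.hadoop.hbase.util.Bytes;

    public class DependentColumnFilterExample {
      public static void main(String[] args) {
        // Keep only cells whose timestamp also appears in the target column cf:q.
        DependentColumnFilter filter =
            new DependentColumnFilter(Bytes.toBytes("cf"), Bytes.toBytes("q"), true); // hypothetical names
        Scan scan = new Scan();
        scan.setFilter(filter);
      }
    }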
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedBlob
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedBlobVar
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedFloat32
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedFloat64
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedInt16
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedInt32
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedInt64
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedInt8
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedNumeric
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.OrderedString
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.RawBytes
-
- DESCENDING - Static variable in class org.apache.hadoop.hbase.types.RawString
-
- deserialize(byte[]) - Static method in exception org.apache.hadoop.hbase.errorhandling.ForeignException
-
Takes a series of bytes and tries to generate a ForeignException instance for it.
- DFS_REPLICATION - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- disablePeer(String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Stop the replication stream to the specified peer.
- disableTable(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Disable table and wait on completion.
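Dropping a table combines this call with the deleteTable entry above, since a table must be disabled before it can be deleted. A minimal sketch; the table name is hypothetical:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;

    public class DropTableExample {
      public static void main(String[] args) throws Exception {
        try (Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Admin admin = connection.getAdmin()) {
          TableName tn = TableName.valueOf("my_table"); // hypothetical table
          admin.disableTable(tn);  // must be disabled before deletion
          admin.deleteTable(tn);
        }
      }
    }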
- disableTableAsync(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Disable the table but does not block and wait for it to be completely disabled.
- disableTableRep(TableName) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Disable a table's replication switch.
- disableTables(String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Disable tables matching the passed in pattern and wait on completion.
- disableTables(Pattern) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Disable tables matching the passed in pattern and wait on completion.
- DISALLOW_WRITES_IN_RECOVERING - Static variable in class org.apache.hadoop.hbase.HConstants
-
- DISTRIBUTED_LOG_REPLAY_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Conf key that enables replaying unflushed WAL edits directly to region servers
- divideForAvg(Double, Long) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- doBulkLoad(Path, HTable) - Method in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
Perform a bulk load of the given directory into the given
pre-existing table.
- doBulkLoad(Path, Admin, Table, RegionLocator) - Method in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
Perform a bulk load of the given directory into the given
pre-existing table.
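A minimal sketch of the non-deprecated overload; the HFile directory and table name are hypothetical, and the directory is assumed to follow the HFileOutputFormat2 layout (one subdirectory per column family):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.RegionLocator;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

    public class BulkLoadExample {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        TableName tn = TableName.valueOf("my_table"); // hypothetical table
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin();
             Table table = connection.getTable(tn);
             RegionLocator locator = connection.getRegionLocator(tn)) {
          new LoadIncrementalHFiles(conf)
              .doBulkLoad(new Path("/tmp/hfiles"), admin, table, locator); // hypothetical path
        }
      }
    }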
- doLoadColumnFamiliesOnDemand() - Method in class org.apache.hadoop.hbase.client.Scan
-
Get the logical value indicating whether on-demand CF loading should be allowed.
- DoNotRetryIOException - Exception in org.apache.hadoop.hbase
-
Subclass if exception is not meant to be retried: e.g.
- DoNotRetryIOException() - Constructor for exception org.apache.hadoop.hbase.DoNotRetryIOException
-
default constructor
- DoNotRetryIOException(String) - Constructor for exception org.apache.hadoop.hbase.DoNotRetryIOException
-
- DoNotRetryIOException(String, Throwable) - Constructor for exception org.apache.hadoop.hbase.DoNotRetryIOException
-
- DoNotRetryIOException(Throwable) - Constructor for exception org.apache.hadoop.hbase.DoNotRetryIOException
-
- DoNotRetryRegionException - Exception in org.apache.hadoop.hbase.client
-
Similar to RegionException, but disables retries.
- DoNotRetryRegionException() - Constructor for exception org.apache.hadoop.hbase.client.DoNotRetryRegionException
-
- DoNotRetryRegionException(String) - Constructor for exception org.apache.hadoop.hbase.client.DoNotRetryRegionException
-
- DONT_BREAK_LINES - Static variable in class org.apache.hadoop.hbase.util.Base64
-
Don't break lines when encoding (violates strict Base64 specification)
- doSetup(Reducer<ImmutableBytesWritable, Text, ImmutableBytesWritable, KeyValue>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.TextSortReducer
-
Handles common parameter initialization that a subclass might want to leverage.
- doSetup(Mapper<LongWritable, Text, ImmutableBytesWritable, Put>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
Handles common parameter initialization that a subclass might want to leverage.
- doSetup(Mapper<LongWritable, Text, ImmutableBytesWritable, Text>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterTextMapper
-
Handles common parameter initialization that a subclass might want to leverage.
- DoubleColumnInterpreter - Class in org.apache.hadoop.hbase.client.coprocessor
-
A concrete column interpreter implementation for double values.
- DoubleColumnInterpreter() - Constructor for class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- drainInputStreamToBuffer(InputStream) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copy from the InputStream to a new heap ByteBuffer until the InputStream is exhausted.
- dropDependentColumn - Variable in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- dropDependentColumn() - Method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- DroppedSnapshotException - Exception in org.apache.hadoop.hbase
-
Thrown during flush if there is a possibility that snapshot content was not
properly persisted into store files.
- DroppedSnapshotException(String) - Constructor for exception org.apache.hadoop.hbase.DroppedSnapshotException
-
- DroppedSnapshotException() - Constructor for exception org.apache.hadoop.hbase.DroppedSnapshotException
-
default constructor
- DRY_RUN_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- Durability - Enum in org.apache.hadoop.hbase.client
-
Enum describing the durability guarantees for tables and
Mutation
s
Note that the items must be sorted in order of increasing durability
- durability - Variable in class org.apache.hadoop.hbase.client.Mutation
-
- DURABILITY - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
- E - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for 'E'
- EMPTY_BYTE_ARRAY - Static variable in class org.apache.hadoop.hbase.HConstants
-
An empty instance.
- EMPTY_BYTE_BUFFER - Static variable in class org.apache.hadoop.hbase.HConstants
-
- EMPTY_END_ROW - Static variable in class org.apache.hadoop.hbase.HConstants
-
Last row in a table.
- EMPTY_HEADER_ARRAY - Static variable in class org.apache.hadoop.hbase.rest.client.Client
-
- EMPTY_RESULT - Static variable in class org.apache.hadoop.hbase.client.Result
-
- EMPTY_SERVER_LIST - Static variable in class org.apache.hadoop.hbase.ServerName
-
- EMPTY_SERVERLOAD - Static variable in class org.apache.hadoop.hbase.ServerLoad
-
- EMPTY_START_ROW - Static variable in class org.apache.hadoop.hbase.HConstants
-
Used by scanners, etc when they want to start at the beginning of a region
- EMPTY_STRING - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- EMPTY_VALUE_ON_MOBCELL_MISS - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- ENABLE_CLIENT_BACKPRESSURE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Config key for whether the server should send backpressure and whether the client should listen to
that backpressure from the server
- ENABLE_DATA_FILE_UMASK - Static variable in class org.apache.hadoop.hbase.HConstants
-
Enable file permission modification from standard hbase
- ENABLE_WAL_COMPRESSION - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration name of WAL Compression
- ENABLE_WAL_ENCRYPTION - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for enabling WAL encryption, a boolean
- enableCatalogJanitor(boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Enable/Disable the catalog janitor
- enablePeer(String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Restart the replication stream to the specified peer.
- enableTable(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Enable a table.
- enableTableAsync(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Enable the table but does not block and wait for it to be completely enabled.
- enableTableRep(TableName) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Enable a table's replication switch.
- enableTables(String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Enable tables matching the passed in pattern and wait on completion.
- enableTables(Pattern) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Enable tables matching the passed in pattern and wait on completion.
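The enableTable/enableTableAsync pair differs only in blocking behavior, as the entries above note. A minimal sketch, assuming a reachable cluster and a placeholder table name "my_table":

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;

    public class EnableTableExample {
      public static void main(String[] args) throws Exception {
        TableName table = TableName.valueOf("my_table"); // placeholder table name
        try (Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Admin admin = connection.getAdmin()) {
          if (admin.isTableDisabled(table)) {
            admin.enableTable(table); // blocks until the table is fully enabled
            // admin.enableTableAsync(table) would return immediately instead
          }
          System.out.println("enabled: " + admin.isTableEnabled(table));
        }
      }
    }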
- encode(PositionedByteRange, T) - Method in interface org.apache.hadoop.hbase.types.DataType
-
Write instance val into buffer dst.
- encode(PositionedByteRange, T) - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
- encode(PositionedByteRange, byte[]) - Method in class org.apache.hadoop.hbase.types.OrderedBlob
-
- encode(PositionedByteRange, byte[], int, int) - Method in class org.apache.hadoop.hbase.types.OrderedBlob
-
Write a subset of val to dst.
- encode(PositionedByteRange, byte[]) - Method in class org.apache.hadoop.hbase.types.OrderedBlobVar
-
- encode(PositionedByteRange, byte[], int, int) - Method in class org.apache.hadoop.hbase.types.OrderedBlobVar
-
Write a subset of val to buff.
- encode(PositionedByteRange, Float) - Method in class org.apache.hadoop.hbase.types.OrderedFloat32
-
- encode(PositionedByteRange, Double) - Method in class org.apache.hadoop.hbase.types.OrderedFloat64
-
- encode(PositionedByteRange, Short) - Method in class org.apache.hadoop.hbase.types.OrderedInt16
-
- encode(PositionedByteRange, Integer) - Method in class org.apache.hadoop.hbase.types.OrderedInt32
-
- encode(PositionedByteRange, Long) - Method in class org.apache.hadoop.hbase.types.OrderedInt64
-
- encode(PositionedByteRange, Byte) - Method in class org.apache.hadoop.hbase.types.OrderedInt8
-
- encode(PositionedByteRange, Number) - Method in class org.apache.hadoop.hbase.types.OrderedNumeric
-
- encode(PositionedByteRange, String) - Method in class org.apache.hadoop.hbase.types.OrderedString
-
- encode(PositionedByteRange, Byte) - Method in class org.apache.hadoop.hbase.types.RawByte
-
- encode(PositionedByteRange, byte[]) - Method in class org.apache.hadoop.hbase.types.RawBytes
-
- encode(PositionedByteRange, byte[], int, int) - Method in class org.apache.hadoop.hbase.types.RawBytes
-
Write val into dst, respecting voff and vlen.
- encode(PositionedByteRange, byte[], int, int) - Method in class org.apache.hadoop.hbase.types.RawBytesFixedLength
-
Write val into buff, respecting offset and length.
- encode(PositionedByteRange, byte[], int, int) - Method in class org.apache.hadoop.hbase.types.RawBytesTerminated
-
Write val into dst, respecting offset and length.
- encode(PositionedByteRange, Double) - Method in class org.apache.hadoop.hbase.types.RawDouble
-
- encode(PositionedByteRange, Float) - Method in class org.apache.hadoop.hbase.types.RawFloat
-
- encode(PositionedByteRange, Integer) - Method in class org.apache.hadoop.hbase.types.RawInteger
-
- encode(PositionedByteRange, Long) - Method in class org.apache.hadoop.hbase.types.RawLong
-
- encode(PositionedByteRange, Short) - Method in class org.apache.hadoop.hbase.types.RawShort
-
- encode(PositionedByteRange, String) - Method in class org.apache.hadoop.hbase.types.RawString
-
- encode(PositionedByteRange, Object[]) - Method in class org.apache.hadoop.hbase.types.Struct
-
- encode(PositionedByteRange, T) - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
Write instance val into buffer dst.
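The various encode(PositionedByteRange, T) implementations above all follow the same pattern: write into a positioned range, then rewind to decode. A minimal round-trip sketch using OrderedInt32; the literal value 42 is arbitrary:

    import org.apache.hadoop.hbase.types.OrderedInt32;
    import org.apache.hadoop.hbase.util.PositionedByteRange;
    import org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange;

    public class DataTypeExample {
      public static void main(String[] args) {
        OrderedInt32 type = OrderedInt32.ASCENDING;
        // Allocate exactly as many bytes as one encoded value needs.
        PositionedByteRange dst = new SimplePositionedMutableByteRange(type.encodedLength(42));
        type.encode(dst, 42);                 // write the value; the position advances
        dst.setPosition(0);                   // rewind before reading it back
        System.out.println(type.decode(dst)); // prints 42
      }
    }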
- ENCODE - Static variable in class org.apache.hadoop.hbase.util.Base64
-
Specify encoding.
- encode3to4(byte[], byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Encodes up to the first three bytes of array threeBytes and
returns a four-byte array in Base64 notation.
- encode3to4(byte[], int, int, byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Encodes up to three bytes of the array source and writes the
resulting four Base64 bytes to destination.
- ENCODE_ON_DISK - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- encodeBlobCopy(PositionedByteRange, byte[], int, int, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a Blob value as a byte-for-byte copy.
- encodeBlobCopy(PositionedByteRange, byte[], Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a Blob value as a byte-for-byte copy.
- encodeBlobVar(PositionedByteRange, byte[], int, int, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a Blob value using a modified varint encoding scheme.
- encodeBlobVar(PositionedByteRange, byte[], Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a blob value using a modified varint encoding scheme.
- encodeByte(PositionedByteRange, byte) - Method in class org.apache.hadoop.hbase.types.OrderedInt8
-
Write instance val into buffer dst.
- encodeByte(byte[], int, byte) - Method in class org.apache.hadoop.hbase.types.RawByte
-
Write instance val into buffer buff.
- encodeBytes(byte[]) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Encodes a byte array into Base64 notation.
- encodeBytes(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Encodes a byte array into Base64 notation.
- encodeBytes(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Encodes a byte array into Base64 notation.
- encodeBytes(byte[], int, int, int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Encodes a byte array into Base64 notation.
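A minimal sketch of the encodeBytes/decode round trip in org.apache.hadoop.hbase.util.Base64; the payload string is arbitrary:

    import org.apache.hadoop.hbase.util.Base64;
    import org.apache.hadoop.hbase.util.Bytes;

    public class Base64Example {
      public static void main(String[] args) {
        byte[] raw = Bytes.toBytes("hello hbase");    // arbitrary payload
        String encoded = Base64.encodeBytes(raw);     // Base64 text form
        byte[] roundTripped = Base64.decode(encoded); // back to the original bytes
        System.out.println(encoded);
        System.out.println(Bytes.toString(roundTripped));
      }
    }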
- ENCODED_REGION_NAME_REGEX - Static variable in class org.apache.hadoop.hbase.HRegionInfo
-
A non-capture group so that this can be embedded.
- encodedClass() - Method in interface org.apache.hadoop.hbase.types.DataType
-
Inform consumers over what type this DataType operates.
- encodedClass() - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedBlob
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedBlobVar
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedFloat32
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedFloat64
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedInt16
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedInt32
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedInt64
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedInt8
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedNumeric
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.OrderedString
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.RawByte
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.RawBytes
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.RawDouble
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.RawFloat
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.RawInteger
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.RawLong
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.RawShort
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.RawString
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.Struct
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
- encodedClass() - Method in class org.apache.hadoop.hbase.types.Union2
-
- encodedLength(T) - Method in interface org.apache.hadoop.hbase.types.DataType
-
Inform consumers how long the encoded byte[] will be.
- encodedLength(T) - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
- encodedLength(byte[]) - Method in class org.apache.hadoop.hbase.types.OrderedBlob
-
- encodedLength(byte[]) - Method in class org.apache.hadoop.hbase.types.OrderedBlobVar
-
- encodedLength(Float) - Method in class org.apache.hadoop.hbase.types.OrderedFloat32
-
- encodedLength(Double) - Method in class org.apache.hadoop.hbase.types.OrderedFloat64
-
- encodedLength(Short) - Method in class org.apache.hadoop.hbase.types.OrderedInt16
-
- encodedLength(Integer) - Method in class org.apache.hadoop.hbase.types.OrderedInt32
-
- encodedLength(Long) - Method in class org.apache.hadoop.hbase.types.OrderedInt64
-
- encodedLength(Byte) - Method in class org.apache.hadoop.hbase.types.OrderedInt8
-
- encodedLength(Number) - Method in class org.apache.hadoop.hbase.types.OrderedNumeric
-
- encodedLength(String) - Method in class org.apache.hadoop.hbase.types.OrderedString
-
- encodedLength(T) - Method in class org.apache.hadoop.hbase.types.PBType
-
- encodedLength(Byte) - Method in class org.apache.hadoop.hbase.types.RawByte
-
- encodedLength(byte[]) - Method in class org.apache.hadoop.hbase.types.RawBytes
-
- encodedLength(Double) - Method in class org.apache.hadoop.hbase.types.RawDouble
-
- encodedLength(Float) - Method in class org.apache.hadoop.hbase.types.RawFloat
-
- encodedLength(Integer) - Method in class org.apache.hadoop.hbase.types.RawInteger
-
- encodedLength(Long) - Method in class org.apache.hadoop.hbase.types.RawLong
-
- encodedLength(Short) - Method in class org.apache.hadoop.hbase.types.RawShort
-
- encodedLength(String) - Method in class org.apache.hadoop.hbase.types.RawString
-
- encodedLength(Object[]) - Method in class org.apache.hadoop.hbase.types.Struct
-
- encodedLength(T) - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
- encodeDouble(PositionedByteRange, double) - Method in class org.apache.hadoop.hbase.types.OrderedFloat64
-
Write instance val into buffer dst.
- encodeDouble(PositionedByteRange, double) - Method in class org.apache.hadoop.hbase.types.OrderedNumeric
-
Write instance val into buffer dst.
- encodeDouble(byte[], int, double) - Method in class org.apache.hadoop.hbase.types.RawDouble
-
Write instance val into buffer buff.
- encodeFileToFile(String, String) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Reads infile and encodes it to outfile.
- encodeFloat(PositionedByteRange, float) - Method in class org.apache.hadoop.hbase.types.OrderedFloat32
-
Write instance val into buffer buff.
- encodeFloat(byte[], int, float) - Method in class org.apache.hadoop.hbase.types.RawFloat
-
Write instance val into buffer buff.
- encodeFloat32(PositionedByteRange, float, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a 32-bit floating point value using the fixed-length encoding.
- encodeFloat64(PositionedByteRange, double, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a 64-bit floating point value using the fixed-length encoding.
- encodeFromFile(String) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Convenience method for reading a binary file and base64-encoding it.
- encodeInt(PositionedByteRange, int) - Method in class org.apache.hadoop.hbase.types.OrderedInt32
-
Write instance val into buffer dst.
- encodeInt(byte[], int, int) - Method in class org.apache.hadoop.hbase.types.RawInteger
-
Write instance val into buffer buff.
- encodeInt16(PositionedByteRange, short, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode an int16 value using the fixed-length encoding.
- encodeInt32(PositionedByteRange, int, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode an int32 value using the fixed-length encoding.
- encodeInt64(PositionedByteRange, long, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode an int64 value using the fixed-length encoding.
- encodeInt8(PositionedByteRange, byte, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode an int8 value using the fixed-length encoding.
- encodeLong(PositionedByteRange, long) - Method in class org.apache.hadoop.hbase.types.OrderedInt64
-
Write instance val into buffer dst.
- encodeLong(PositionedByteRange, long) - Method in class org.apache.hadoop.hbase.types.OrderedNumeric
-
Write instance val into buffer dst.
- encodeLong(byte[], int, long) - Method in class org.apache.hadoop.hbase.types.RawLong
-
Write instance val into buffer buff.
- encodeNull(PositionedByteRange, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a null value.
- encodeNumeric(PositionedByteRange, long, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a numerical value using the variable-length encoding.
- encodeNumeric(PositionedByteRange, double, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a numerical value using the variable-length encoding.
- encodeNumeric(PositionedByteRange, BigDecimal, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a numerical value using the variable-length encoding.
- encodeObject(Serializable) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Serializes an object and returns the Base64-encoded version of that
serialized object.
- encodeObject(Serializable, int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Serializes an object and returns the Base64-encoded version of that
serialized object.
- encodeRegionName(byte[]) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
- encodeShort(PositionedByteRange, short) - Method in class org.apache.hadoop.hbase.types.OrderedInt16
-
Write instance val into buffer dst.
- encodeShort(byte[], int, short) - Method in class org.apache.hadoop.hbase.types.RawShort
-
Write instance val into buffer buff.
- encodeString(PositionedByteRange, String, Order) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Encode a String value.
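The static OrderedBytes encoders listed above write order-preserving encodings into a positioned range. A minimal sketch, assuming a 32-byte scratch buffer is enough for the two small placeholder values used here:

    import org.apache.hadoop.hbase.util.Order;
    import org.apache.hadoop.hbase.util.OrderedBytes;
    import org.apache.hadoop.hbase.util.PositionedByteRange;
    import org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange;

    public class OrderedBytesExample {
      public static void main(String[] args) {
        PositionedByteRange dst = new SimplePositionedMutableByteRange(32); // ample scratch space
        OrderedBytes.encodeString(dst, "foo", Order.ASCENDING); // order-preserving encodings
        OrderedBytes.encodeInt32(dst, 42, Order.ASCENDING);
        dst.setPosition(0);                                     // rewind to decode
        System.out.println(OrderedBytes.decodeString(dst));     // foo
        System.out.println(OrderedBytes.decodeInt32(dst));      // 42
      }
    }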
- encodeToFile(byte[], String) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Convenience method for encoding data to a file.
- encodeVisibilityForReplication(List<Tag>, Byte) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
Provides a way to modify the visibility tags of type TagType.VISIBILITY_TAG_TYPE, that are part of the cell created from the WALEdits that are prepared for replication while calling ReplicationEndpoint.replicate().
- encrypt(OutputStream, byte[], int, int, Encryptor) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Encrypt a block of plaintext
- encrypt(OutputStream, byte[], int, int, Encryption.Context, byte[]) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Encrypt a block of plaintext
- encrypt(OutputStream, InputStream, Encryptor) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Encrypt a stream of plaintext given an encryptor
- encrypt(OutputStream, InputStream, Encryption.Context, byte[]) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Encrypt a stream of plaintext given a context and IV
- ENCRYPTION - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- Encryption - Class in org.apache.hadoop.hbase.io.crypto
-
A facade for encryption algorithms and related support.
- Encryption.Context - Class in org.apache.hadoop.hbase.io.crypto
-
Crypto context
- ENCRYPTION_KEY - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- EncryptionTest - Class in org.apache.hadoop.hbase.util
-
- Encryptor - Interface in org.apache.hadoop.hbase.io.crypto
-
Encryptors apply a cipher to an OutputStream to produce ciphertext.
- encryptWithSubjectKey(OutputStream, InputStream, String, Configuration, Cipher, byte[]) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Encrypts a block of plaintext with the symmetric key resolved for the given subject
- END - Static variable in class org.apache.hadoop.hbase.mapreduce.SimpleTotalOrderPartitioner
-
Deprecated.
- END_TIME_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.HLogInputFormat
-
Deprecated.
- END_TIME_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.WALInputFormat
-
- ENSEMBLE_TABLE_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
The name of the ensemble table
- EQUAL_TO - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for equal to (=)
- EQUAL_TO_ARRAY - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
EQUAL_TO Array
- EQUAL_TO_BUFFER - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
- equals(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
equals
- equals(Object) - Method in class org.apache.hadoop.hbase.client.Get
-
- equals(Object) - Method in class org.apache.hadoop.hbase.client.Increment
-
- equals(Object) - Method in class org.apache.hadoop.hbase.client.RowMutations
-
- equals(Object) - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- equals(Object) - Method in class org.apache.hadoop.hbase.filter.NullComparator
-
- equals(Object) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- equals(Object) - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- equals(Object) - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- equals(Object) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Compare the contents of the descriptor with another one passed as a parameter.
- equals(Object) - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- equals(Object) - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- equals(Object) - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
- equals(Object) - Method in class org.apache.hadoop.hbase.security.access.Permission
-
- equals(Object) - Method in class org.apache.hadoop.hbase.security.User
-
- equals(Object) - Method in class org.apache.hadoop.hbase.ServerName
-
- equals(Object) - Method in class org.apache.hadoop.hbase.TableName
-
- equals(ByteBuffer, int, int, ByteBuffer, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- equals(ByteBuffer, int, int, byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- equals(Object) - Method in class org.apache.hadoop.hbase.util.Bytes
-
- equals(byte[], byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- equals(byte[], int, int, byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- equals(byte[], ByteBuffer) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- equals(List<byte[]>, List<byte[]>) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- equals(Object) - Method in class org.apache.hadoop.hbase.util.Pair
-
- equals(Object) - Method in class org.apache.hadoop.hbase.util.PairOfSameType
-
- equals(Object) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- equals(Object) - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- equalsIgnoreMvccVersion(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
special case for Cell.equals
- ESTIMATED_HEAP_TAX - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Estimate of size cost to pay beyond payload in jvm for instance of byte [].
- estimatedHeapSizeOf(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
This is an estimate of the heap space occupied by a cell.
- estimatedSerializedSizeOf(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- estimatedSerializedSizeOfKey(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- evaluate(Cell) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityExpEvaluator
-
Evaluates whether the passed cell passes Scan/Get Authorization.
- EVICT_BLOCKS_ON_CLOSE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- EXEC_PERMISSION_CHECKS_KEY - Static variable in interface org.apache.hadoop.hbase.security.access.AccessControlConstants
-
Configuration option that toggles whether EXEC permission checking is
performed during coprocessor endpoint invocations.
- execProcedure(String, String, Map<String, String>) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Execute a distributed procedure on a cluster.
- execProcedureWithRet(String, String, Map<String, String>) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Execute a distributed procedure on a cluster.
- execute(Cluster, HttpMethod, Header[], String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Execute a transaction method.
- executePathOnly(Cluster, HttpMethod, Header[], String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Execute a transaction method given only the path.
- executeURI(HttpMethod, Header[], String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Execute a transaction method given a complete URI.
- executionTime() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- exists(Get) - Method in interface org.apache.hadoop.hbase.client.Table
-
Test for the existence of columns in the table, as specified by the Get.
- exists(Get) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- exists(List<Get>) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
Deprecated.
- existsAll(List<Get>) - Method in interface org.apache.hadoop.hbase.client.Table
-
Test for the existence of columns in the table, as specified by the Gets.
- existsAll(List<Get>) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
exists(List) is really a list of get() calls.
- ExponentialClientBackoffPolicy - Class in org.apache.hadoop.hbase.client.backoff
-
Simple exponential backoff policy for the client that uses percent^4 times the
max backoff to generate the backoff time.
- ExponentialClientBackoffPolicy(Configuration) - Constructor for class org.apache.hadoop.hbase.client.backoff.ExponentialClientBackoffPolicy
-
- Export - Class in org.apache.hadoop.hbase.mapreduce
-
Export an HBase table.
- Export() - Constructor for class org.apache.hadoop.hbase.mapreduce.Export
-
- ExportSnapshot - Class in org.apache.hadoop.hbase.snapshot
-
Export the specified snapshot to a given FileSystem.
- ExportSnapshot() - Constructor for class org.apache.hadoop.hbase.snapshot.ExportSnapshot
-
- ExportSnapshotException - Exception in org.apache.hadoop.hbase.snapshot
-
Thrown when a snapshot could not be exported due to an error during the operation.
- ExportSnapshotException(String) - Constructor for exception org.apache.hadoop.hbase.snapshot.ExportSnapshotException
-
- ExportSnapshotException(String, Exception) - Constructor for exception org.apache.hadoop.hbase.snapshot.ExportSnapshotException
-
- extendLimit(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- extractArguments(ArrayList<byte[]>) - Static method in class org.apache.hadoop.hbase.filter.CompareFilter
-
- extractFilterSimpleExpression(byte[], int) - Method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Extracts a simple filter expression from the filter string given by the user
- extractKeyValues(Result) - Method in class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
Extract column values from the current record.
- extractKeyValues(Result) - Method in class org.apache.hadoop.hbase.mapreduce.GroupingTableMapper
-
Extract column values from the current record.
- extraHeapSize() - Method in class org.apache.hadoop.hbase.client.Increment
-
- extraHeapSize() - Method in class org.apache.hadoop.hbase.client.Mutation
-
Subclasses should override this method to add the heap size of their own fields.
- Get - Class in org.apache.hadoop.hbase.client
-
Used to perform Get operations on a single row.
- Get(byte[]) - Constructor for class org.apache.hadoop.hbase.client.Get
-
Create a Get operation for the specified row.
- Get(Get) - Constructor for class org.apache.hadoop.hbase.client.Get
-
Copy-constructor
- get(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Put
-
Returns a list of all KeyValue objects with matching column family and qualifier.
- get(Get) - Method in interface org.apache.hadoop.hbase.client.Table
-
Extracts certain cells from a given row.
- get(List<Get>) - Method in interface org.apache.hadoop.hbase.client.Table
-
Extracts certain cells from the given rows, in batch.
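Table.exists(Get) and Table.get(Get) are typically used together: check for a row cheaply, then fetch its cells. A minimal sketch, assuming a reachable cluster; the table name "my_table", row key, family, and qualifier are placeholders:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class GetExample {
      public static void main(String[] args) throws Exception {
        try (Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Table table = connection.getTable(TableName.valueOf("my_table"))) { // placeholder name
          Get get = new Get(Bytes.toBytes("row-1"));
          get.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q")); // restrict to one column
          if (table.exists(get)) {        // existence check without shipping the value
            Result result = table.get(get);
            byte[] value = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("q"));
            System.out.println(Bytes.toString(value));
          }
        }
      }
    }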
- get() - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
Get the data from the BytesWritable.
- get(String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a GET request
- get(Cluster, String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a GET request
- get(String, String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a GET request
- get(Cluster, String, String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a GET request
- get(String, Header[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a GET request
- get(Cluster, String, Header[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a GET request
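The rest.client.Client get(...) overloads above all issue an HTTP GET against a REST gateway. A minimal sketch; the host, port, and the /version/cluster path are assumptions about a running gateway, not taken from this index:

    import org.apache.hadoop.hbase.rest.client.Client;
    import org.apache.hadoop.hbase.rest.client.Cluster;
    import org.apache.hadoop.hbase.rest.client.Response;
    import org.apache.hadoop.hbase.util.Bytes;

    public class RestGetExample {
      public static void main(String[] args) throws Exception {
        // Host, port, and the /version/cluster path assume a running HBase REST gateway.
        Cluster cluster = new Cluster();
        cluster.add("localhost", 8080);
        Client client = new Client(cluster);
        Response response = client.get("/version/cluster", "text/plain");
        System.out.println(response.getCode());
        System.out.println(Bytes.toString(response.getBody()));
      }
    }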
- get(Get) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- get(List<Get>) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- get(int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Retrieve the byte at index.
- get(int, byte[]) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Fill dst with bytes from the range, starting from index.
- get(int, byte[], int, int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Fill dst with bytes from the range, starting from index.
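A minimal sketch of absolute reads through the ByteRange get(...) overloads above, using SimpleByteRange over an arbitrary placeholder array:

    import org.apache.hadoop.hbase.util.ByteRange;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.hbase.util.SimpleByteRange;

    public class ByteRangeExample {
      public static void main(String[] args) {
        ByteRange range = new SimpleByteRange(Bytes.toBytes("abcdef")); // wraps the array
        System.out.println((char) range.get(2)); // byte at index 2 -> 'c'
        byte[] dst = new byte[3];
        range.get(1, dst);                        // fill dst starting at index 1 -> "bcd"
        System.out.println(Bytes.toString(dst));
      }
    }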
- get() - Method in class org.apache.hadoop.hbase.util.Bytes
-
Get the data from the Bytes.
- get() - Method in class org.apache.hadoop.hbase.util.Counter
-
- get() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Retrieve the next byte from this range.
- get(byte[]) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Fill dst with bytes from the range, starting from position.
- get(byte[], int, int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Fill dst with bytes from the range, starting from the current position.
- get(int, byte[]) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- get(int, byte[], int, int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- get(int, byte[]) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- get(int, byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- get(int, byte[]) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- get(int, byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- getACL() - Method in class org.apache.hadoop.hbase.client.Mutation
-
- getACL() - Method in class org.apache.hadoop.hbase.client.Query
-
- getActions() - Method in class org.apache.hadoop.hbase.security.access.Permission
-
- getActiveMaster() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
Gets the current active master, if available.
- getAdmin() - Method in interface org.apache.hadoop.hbase.client.Connection
-
Retrieve an Admin implementation to administer an HBase cluster.
- getAdmin() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Retrieve an Admin implementation to administer an HBase cluster.
- getAdmin(ServerName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use thru HConnection
- getAdmin(ServerName, boolean) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
You can pass master flag but nothing special is done.
- getAdmin() - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Allows subclasses to get the Admin.
- getAliasPassword(String) - Method in class org.apache.hadoop.hbase.io.crypto.KeyStoreKeyProvider
-
- getAllFilters() - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Returns all known filters
- getAllowPartialResults() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getAllRegionLocations() - Method in interface org.apache.hadoop.hbase.client.RegionLocator
-
Retrieves all of the regions associated with this table.
- getAlphabet(int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Returns one of the _SOMETHING_ALPHABET byte arrays depending on the options
specified.
- getAlterStatus(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get the status of the alter command - indicates how many regions have received the updated schema.
Asynchronous operation.
- getAlterStatus(byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get the status of the alter command - indicates how many regions have received the updated schema.
Asynchronous operation.
- getAttribute(String) - Method in interface org.apache.hadoop.hbase.client.Attributes
-
Gets an attribute
- getAttribute(String) - Method in class org.apache.hadoop.hbase.client.OperationWithAttributes
-
- getAttributeSize() - Method in class org.apache.hadoop.hbase.client.OperationWithAttributes
-
- getAttributesMap() - Method in interface org.apache.hadoop.hbase.client.Attributes
-
Gets all attributes
- getAttributesMap() - Method in class org.apache.hadoop.hbase.client.OperationWithAttributes
-
- getAuthorizations() - Method in class org.apache.hadoop.hbase.client.Query
-
- getAuths(Configuration, String) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
- getAuths(Connection, String) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
- getAverageLatencyForEachRegionServer() - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer.HTableMultiplexerStatus
-
- getAverageLoad() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getBackoffTime(ServerName, byte[], ServerStatistics) - Method in interface org.apache.hadoop.hbase.client.backoff.ClientBackoffPolicy
-
- getBackoffTime(ServerName, byte[], ServerStatistics) - Method in class org.apache.hadoop.hbase.client.backoff.ExponentialClientBackoffPolicy
-
- getBackupMasters() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getBackupMastersSize() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getBackupZooKeeperServerNum() - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- getBadLineCount() - Method in class org.apache.hadoop.hbase.mapreduce.TextSortReducer
-
- getBadLineCount() - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- getBadLineCount() - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterTextMapper
-
- getBalancerOn() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getBatch() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getBlocksize() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getBlockSize() - Method in interface org.apache.hadoop.hbase.io.crypto.Decryptor
-
Get the cipher's internal block size
- getBlockSize() - Method in interface org.apache.hadoop.hbase.io.crypto.Encryptor
-
Get the cipher's internal block size
- getBloomFilterType() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getBody() - Method in class org.apache.hadoop.hbase.rest.client.Response
-
- getBufferedCounterForEachRegionServer() - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer.HTableMultiplexerStatus
-
- getBufferedMutator(TableName) - Method in interface org.apache.hadoop.hbase.client.Connection
-
- getBufferedMutator(BufferedMutatorParams) - Method in interface org.apache.hadoop.hbase.client.Connection
-
- getByteBuffer() - Method in class org.apache.hadoop.hbase.io.ByteBufferOutputStream
-
This flips the underlying BB so be sure to use it _last_!
- getBytes() - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
The underlying byte[].
- getBytes(ByteBuffer) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Returns a new byte array, copied from the given buf, from the position (inclusive) to the limit (exclusive).
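A minimal sketch of Bytes.getBytes(ByteBuffer), copying only the unconsumed portion of a buffer; the "payload" string and the position of 3 are arbitrary:

    import java.nio.ByteBuffer;
    import org.apache.hadoop.hbase.util.Bytes;

    public class GetBytesExample {
      public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.wrap(Bytes.toBytes("payload"));
        buf.position(3);                          // pretend the first three bytes were consumed
        byte[] rest = Bytes.getBytes(buf);        // copies position (inclusive) to limit (exclusive)
        System.out.println(Bytes.toString(rest)); // load
      }
    }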
- getCacheBlocks() - Method in class org.apache.hadoop.hbase.client.Get
-
Get whether blocks should be cached for this Get.
- getCacheBlocks() - Method in class org.apache.hadoop.hbase.client.Scan
-
Get whether blocks should be cached for this Scan.
- getCaching() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getCause(int) - Method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- getCauses() - Method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- getCellKeyAsString(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- getCellKeySerializedAsKeyValueKey(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
This method exists just to encapsulate how we serialize keys.
- getCellValueFromProto(HBaseProtos.DoubleMsg) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- getCellVisibility() - Method in class org.apache.hadoop.hbase.client.Mutation
-
- getChance() - Method in class org.apache.hadoop.hbase.filter.RandomRowFilter
-
- getCipher(String) - Method in interface org.apache.hadoop.hbase.io.crypto.CipherProvider
-
Get a Cipher
- getCipher() - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- getCipher(String) - Method in class org.apache.hadoop.hbase.io.crypto.DefaultCipherProvider
-
- getCipher(Configuration, String) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Get a cipher given a name
- getCipherProvider(Configuration) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
- getClient(ServerName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use thru HConnection
- getClientAckTime() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getClientPort() - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- getClientPortList() - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
Get the list of client ports.
- getCluster() - Method in class org.apache.hadoop.hbase.rest.client.Client
-
- getClusterId() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getClusterIds() - Method in class org.apache.hadoop.hbase.client.Mutation
-
- getClusterKey() - Method in class org.apache.hadoop.hbase.replication.ReplicationPeerConfig
-
- getClusterStatus() - Method in interface org.apache.hadoop.hbase.client.Admin
-
- getClusterStatus() - Method in class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
- getClusterVersion() - Method in class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
- getCode() - Method in class org.apache.hadoop.hbase.rest.client.Response
-
- getColumnCells(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Result
-
Return the Cells for the specific column.
- getColumnFamilies() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getColumnLatestCell(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Result
-
The Cell for the most recent timestamp for a given column.
- getColumnLatestCell(byte[], int, int, byte[], int, int) - Method in class org.apache.hadoop.hbase.client.Result
-
The Cell for the most recent timestamp for a given column.
- getColumnOffset() - Method in class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
- getCompactionCompression() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getCompactionCompressionType() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getCompactionState(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get the current compaction state of a table.
- getCompactionState(TableName, Admin.CompactType) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get the current compaction state of a table.
- getCompactionStateForRegion(byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get the current compaction state of region.
- getComparator() - Method in class org.apache.hadoop.hbase.filter.CompareFilter
-
- getComparator() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- getComparator() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
Deprecated.
Use Region#getCellComparator(). deprecated for hbase 2.0, remove for hbase 3.0
- getCompleteSequenceId() - Method in class org.apache.hadoop.hbase.RegionLoad
-
This does not really belong inside RegionLoad but its being done in the name of expediency.
- getCompression() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getCompressionType() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getCompressor() - Method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
- getConf() - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- getConf() - Method in class org.apache.hadoop.hbase.io.crypto.DefaultCipherProvider
-
- getConf() - Method in class org.apache.hadoop.hbase.mapreduce.GroupingTableMapper
-
Returns the current configuration.
- getConf() - Method in class org.apache.hadoop.hbase.mapreduce.HRegionPartitioner
-
Returns the current configuration.
- getConf() - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormat
-
Returns the current configuration.
- getConf() - Method in class org.apache.hadoop.hbase.mapreduce.SimpleTotalOrderPartitioner
-
- getConf() - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Returns the current configuration.
- getConf() - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputFormat
-
- getConfiguration() - Method in interface org.apache.hadoop.hbase.client.Admin
-
- getConfiguration() - Method in interface org.apache.hadoop.hbase.client.BufferedMutator
-
Returns the Configuration object used by this instance.
- getConfiguration() - Method in interface org.apache.hadoop.hbase.client.Connection
-
- getConfiguration() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
- getConfiguration() - Method in interface org.apache.hadoop.hbase.client.Table
-
Returns the Configuration object used by this instance.
- getConfiguration() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getConfiguration() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getConfiguration() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- getConfiguration() - Method in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- getConfiguration() - Method in class org.apache.hadoop.hbase.replication.ReplicationPeerConfig
-
- getConfiguration() - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- getConfigurationValue(String) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Getter for accessing the configuration value by key.
- getConfigurationValue(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Getter for accessing the configuration value by key
- getConfigurationValue(String) - Method in class org.apache.hadoop.hbase.NamespaceDescriptor
-
Getter for accessing the configuration value by key
- getConnection() - Method in interface org.apache.hadoop.hbase.client.Admin
-
- getConsistency() - Method in class org.apache.hadoop.hbase.client.Query
-
Returns the consistency level for this operation
- getCoprocessors() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Return the list of attached co-processors represented by their name className
- getCounter(String) - Method in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
- getCurrent() - Static method in class org.apache.hadoop.hbase.security.User
-
Returns the User instance within current execution context.
- getCurrentCompactedKVs() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getCurrentCompactedKVs() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getCurrentKey() - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
Returns the current key.
- getCurrentKey() - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
Returns the current key.
- getCurrentNrHRS() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
This method will be changed from public to package protected.
- getCurrentValue() - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
Returns the current value.
- getCurrentValue() - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
Returns the current value.
- getDataBlockEncoderById(short) - Static method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
Find and create data block encoder for given id;
- getDataBlockEncoding() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getDataLocality() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getDate() - Static method in class org.apache.hadoop.hbase.util.VersionInfo
-
The date that hbase was compiled.
- getDeadServerNames() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getDeadServers() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getDeadServersSize() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getDecodabet(int) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Returns one of the _SOMETHING_DECODABET byte arrays depending on the
options specified.
- getDecompressor() - Method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
- getDecryptor() - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Return a decryptor for decrypting data.
- getDefaultValues() - Static method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getDesc(List<Throwable>, List<? extends Row>, List<String>) - Static method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- getDesc(Map<String, Integer>) - Static method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- getDescriptiveNameFromRegionStateForDisplay(RegionState, Configuration) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Get the descriptive name as RegionState does it but with hidden startkey optionally.
- getDFSReplication() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getDropDependentColumn() - Method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- getDurability() - Method in class org.apache.hadoop.hbase.client.Mutation
-
Get the current durability
- getDurability() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Returns the durability setting for the table.
- getEncodedName() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getEncodedNameAsBytes() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getEncoder() - Method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
Return new data block encoder for given algorithm type.
- getEncodingById(short) - Static method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
- getEncryptionKey() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Return the raw crypto key attribute for the family, or null if not set
- getEncryptionType() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Return the encryption algorithm in use by this family
- getEncryptor() - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Get an encryptor for encrypting data.
- getEnd() - Method in exception org.apache.hadoop.hbase.errorhandling.TimeoutException
-
- getEndKey() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getEndKeyForDisplay(HRegionInfo, Configuration) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Get the end key for display.
- getEndKeys() - Method in interface org.apache.hadoop.hbase.client.RegionLocator
-
Gets the ending row key for every region in the currently open table.
- getEndRow() - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getEndRow() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Returns the end row.
- getException() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getExceptionCause() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getExceptionFullMessage() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getExceptionMessage() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getExhaustiveDescription() - Method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- getExists() - Method in class org.apache.hadoop.hbase.client.Result
-
- getExpression() - Method in class org.apache.hadoop.hbase.security.visibility.CellVisibility
-
- getExtraHeader(String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Get an extra header value.
- getExtraHeaders() - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Get all extra headers (read-only).
- getFailedCounterForEachRegionServer() - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer.HTableMultiplexerStatus
-
- getFailureCount() - Method in exception org.apache.hadoop.hbase.exceptions.PreemptiveFastFailException
-
- getFamilies() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getFamilies() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Returns an unmodifiable collection of all the HColumnDescriptor of all the column families of the table.
- getFamiliesKeys() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Returns all the column family names of the current table.
- getFamily() - Method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- getFamily() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- getFamily(byte[]) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Returns the HColumnDescriptor for a specific column family with name as
specified by the parameter column.
- getFamilyArray() - Method in interface org.apache.hadoop.hbase.Cell
-
Contiguous bytes composed of legal HDFS filename characters which may start at any index in the
containing array.
- getFamilyCellMap() - Method in class org.apache.hadoop.hbase.client.Mutation
-
Method for retrieving the put's familyMap
- getFamilyLength() - Method in interface org.apache.hadoop.hbase.Cell
-
- getFamilyMap() - Method in class org.apache.hadoop.hbase.client.Get
-
Method for retrieving the get's familyMap
- getFamilyMap(byte[]) - Method in class org.apache.hadoop.hbase.client.Result
-
Map of qualifiers to values.
- getFamilyMap() - Method in class org.apache.hadoop.hbase.client.Scan
-
Getting the familyMap
- getFamilyMapOfLongs() - Method in class org.apache.hadoop.hbase.client.Increment
-
Before 0.95, when you called Increment#getFamilyMap(), you got back
a map of families to a list of Longs.
- getFamilyOffset() - Method in interface org.apache.hadoop.hbase.Cell
-
- getFilter() - Method in class org.apache.hadoop.hbase.client.Query
-
- getFilter() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getFilter() - Method in class org.apache.hadoop.hbase.filter.SkipFilter
-
- getFilter() - Method in class org.apache.hadoop.hbase.filter.WhileMatchFilter
-
- getFilterArguments(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Returns the arguments of the filter from the filter string
- getFilterIfMissing() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
Get whether entire row should be filtered if column is not found.
- getFilterName(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Returns the filter name given a simple filter expression
- getFilters() - Method in class org.apache.hadoop.hbase.filter.FilterList
-
Get the filters.
- getFingerprint() - Method in class org.apache.hadoop.hbase.client.Get
-
Compile the table and column family (i.e.
- getFingerprint() - Method in class org.apache.hadoop.hbase.client.Mutation
-
Compile the column family (i.e.
- getFingerprint() - Method in class org.apache.hadoop.hbase.client.Operation
-
Produces a Map containing a fingerprint which identifies the type and
the static schema components of a query (i.e.
- getFingerprint() - Method in class org.apache.hadoop.hbase.client.Scan
-
Compile the table and column family (i.e.
- getFirst() - Method in class org.apache.hadoop.hbase.util.Pair
-
Return the first element stored in the pair.
- getFirst() - Method in class org.apache.hadoop.hbase.util.PairOfSameType
-
Return the first element stored in the pair.
- getFirstFailureAt() - Method in exception org.apache.hadoop.hbase.exceptions.PreemptiveFastFailException
-
- getFlushPolicyClassName() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
This gets the class associated with the flush policy which determines which stores need to be
flushed when flushing a region.
- getForeignExceptionMessage() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getGroupAuths(String[], boolean) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
Retrieve the visibility labels for the groups.
- getGroupNames() - Method in class org.apache.hadoop.hbase.security.User
-
Returns the list of groups of which this user is a member.
- getHBaseVersion() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getHeader(String) - Method in class org.apache.hadoop.hbase.rest.client.Response
-
- getHeaders() - Method in class org.apache.hadoop.hbase.rest.client.Response
-
- getHostAndPort() - Method in class org.apache.hadoop.hbase.ServerName
-
- getHostname() - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getHostname() - Method in exception org.apache.hadoop.hbase.ipc.RemoteWithExtrasException
-
- getHostname() - Method in class org.apache.hadoop.hbase.ServerName
-
- getHostnamePort(int) - Method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- getHostnamePort() - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getHTable() - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
- getHTable() - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
- getHTableDescriptor(TableName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- getHTableDescriptor(byte[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- getHTableDescriptors(List<String>) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
- getHTableDescriptorsByTableName(List<TableName>) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
- getHTableMultiplexerStatus() - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
- getHttpClient() - Method in class org.apache.hadoop.hbase.rest.client.Client
-
- getId() - Method in class org.apache.hadoop.hbase.client.OperationWithAttributes
-
This method allows you to retrieve the identifier for the operation if one
was set.
- getId() - Method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
- getInfoServerPort() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getInstance() - Static method in class org.apache.hadoop.hbase.io.crypto.DefaultCipherProvider
-
- getInt(Configuration, String, String, int) - Static method in class org.apache.hadoop.hbase.HBaseConfiguration
-
Get the value of the name property as an int, possibly referring to the deprecated name of the configuration property.
- getInt(int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Retrieve the int value at index
- getInt() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Retrieve the next int value from this range.
- getIsolationLevel() - Method in class org.apache.hadoop.hbase.client.Query
-
- getIv() - Method in interface org.apache.hadoop.hbase.io.crypto.Encryptor
-
Get the initialization vector
- getIvLength() - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Return the expected initialization vector length, in bytes, or 0 if not applicable
- getIvLength() - Method in interface org.apache.hadoop.hbase.io.crypto.Decryptor
-
Get the expected length for the initialization vector
- getIvLength() - Method in interface org.apache.hadoop.hbase.io.crypto.Encryptor
-
Get the expected length for the initialization vector
- getKeepAliveMasterService() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
- getKeepDeletedCells() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getKey() - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- getKey(String) - Method in interface org.apache.hadoop.hbase.io.crypto.KeyProvider
-
Retrieve the key for a given key alias
- getKey(String) - Method in class org.apache.hadoop.hbase.io.crypto.KeyStoreKeyProvider
-
- getKeyBytes() - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- getKeyBytesHash() - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- getKeyFormat() - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- getKeyLength() - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Return the key length required by this cipher, in bytes
- getKeyProvider(Configuration) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
- getKeys(String[]) - Method in interface org.apache.hadoop.hbase.io.crypto.KeyProvider
-
Retrieve keys for a given set of key aliases
- getKeys(String[]) - Method in class org.apache.hadoop.hbase.io.crypto.KeyStoreKeyProvider
-
- getKeyValues(Configuration, String) - Static method in class org.apache.hadoop.hbase.util.ConfigurationUtil
-
Retrieve a list of key value pairs from configuration, stored under the provided key
- getKeyValues(Configuration, String, char) - Static method in class org.apache.hadoop.hbase.util.ConfigurationUtil
-
Retrieve a list of key value pairs from configuration, stored under the provided key
- getLabels() - Method in class org.apache.hadoop.hbase.security.visibility.Authorizations
-
- getLabels(User, Authorizations) - Method in interface org.apache.hadoop.hbase.security.visibility.ScanLabelGenerator
-
Helps to get a list of labels associated with a UGI
- getLastAttemptAt() - Method in exception org.apache.hadoop.hbase.exceptions.PreemptiveFastFailException
-
- getLastMajorCompactionTimestamp(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get the timestamp of the last major compaction for the passed table.
The timestamp of the oldest HFile resulting from a major compaction of that table,
or 0 if no such HFile could be found.
- getLastMajorCompactionTimestampForRegion(byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get the timestamp of the last major compaction for the passed region.
- getLastMajorCompactionTs() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getLastMajorCompactionTsForRegion(byte[]) - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getLastMajorCompactionTsForTable(TableName) - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getLastUpdate() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getLatestVersionOnly() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
Get whether only the latest version of the column value should be compared.
- getLength() - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- getLength() - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getLength() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Returns the length of the split.
- getLength() - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
Retrieve the maximum length (in bytes) of encoded values.
- getLength() - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
The length of the range.
- getLength() - Method in class org.apache.hadoop.hbase.util.Bytes
-
- getLimit() - Method in class org.apache.hadoop.hbase.filter.ColumnCountGetFilter
-
- getLimit() - Method in class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
- getLimit() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Return the current limit
- getListener() - Method in class org.apache.hadoop.hbase.client.BufferedMutatorParams
-
- getLiveMasters() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- getLiveRegionServers() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- getLoad(ServerName) - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getLoad() - Method in class org.apache.hadoop.hbase.ServerLoad
-
Originally, this method factored in the effect of requests going to the
server as well.
- getLoadColumnFamiliesOnDemandValue() - Method in class org.apache.hadoop.hbase.client.Scan
-
Get the raw loadColumnFamiliesOnDemand setting; if it's not set, can be null.
- getLocation() - Method in class org.apache.hadoop.hbase.rest.client.Response
-
- getLocations() - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getLocations() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Returns the region's location as an array.
- getLong(int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Retrieve the long value at index
- getLong() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Retrieve the next long value from this range.
- getMap() - Method in class org.apache.hadoop.hbase.client.Result
-
Map of families to all versions of its qualifiers and values.
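An illustrative helper showing how the nested map returned by Result.getMap() might be walked; it assumes the Get or Scan that produced the Result requested multiple versions, and that an empty Result yields a null map.

  import java.util.NavigableMap;
  import org.apache.hadoop.hbase.client.Result;
  import org.apache.hadoop.hbase.util.Bytes;

  public class DumpResult {
    // Prints every family/qualifier/version stored in a Result.
    static void dumpAllVersions(Result result) {
      NavigableMap<byte[], NavigableMap<byte[], NavigableMap<Long, byte[]>>> map = result.getMap();
      if (map == null) {
        return; // assumed to be null for an empty Result
      }
      for (byte[] family : map.keySet()) {
        for (byte[] qualifier : map.get(family).keySet()) {
          for (Long ts : map.get(family).get(qualifier).keySet()) {
            System.out.println(Bytes.toString(family) + ":" + Bytes.toString(qualifier)
                + " @ " + ts + " = " + Bytes.toString(map.get(family).get(qualifier).get(ts)));
          }
        }
      }
    }
  }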
- getMaster() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use thru HConnection
- getMaster() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
Returns detailed information about the current master ServerName.
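A short sketch combining this accessor with other ClusterStatus getters listed in this index (getServers, getServersSize, getRegionsCount, getLoad); it assumes an Admin instance, obtained elsewhere, on which getClusterStatus() is available.

  import java.io.IOException;
  import org.apache.hadoop.hbase.ClusterStatus;
  import org.apache.hadoop.hbase.ServerName;
  import org.apache.hadoop.hbase.client.Admin;

  public class ClusterSummary {
    // Assumes an Admin instance obtained elsewhere, e.g. connection.getAdmin().
    static void print(Admin admin) throws IOException {
      ClusterStatus status = admin.getClusterStatus();
      System.out.println("Master: " + status.getMaster()
          + ", servers: " + status.getServersSize()
          + ", regions: " + status.getRegionsCount());
      for (ServerName server : status.getServers()) {
        System.out.println("  " + server + " regions="
            + status.getLoad(server).getNumberOfRegions());
      }
    }
  }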
- getMaster(int) - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- getMasterCoprocessors() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Helper delegate to getClusterStatus().getMasterCoprocessors().
- getMasterCoprocessors() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getMasterInfoPort() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get the info port of the current master if one is available.
- getMasters() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- getMax() - Method in class org.apache.hadoop.hbase.io.TimeRange
-
- getMaxAllowedOperationTime() - Method in exception org.apache.hadoop.hbase.errorhandling.TimeoutException
-
- getMaxColumn() - Method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- getMaxColumnInclusive() - Method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- getMaxFileSize() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Returns the maximum size up to which a region can grow, after which a region
split is triggered.
- getMaxHeapMB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getMaxKeyValueSize() - Method in class org.apache.hadoop.hbase.client.BufferedMutatorParams
-
- getMaxLatency() - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer.HTableMultiplexerStatus
-
- getMaxLatencyForEachRegionServer() - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer.HTableMultiplexerStatus
-
- getMaxResultSize() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getMaxResultsPerColumnFamily() - Method in class org.apache.hadoop.hbase.client.Get
-
Method for retrieving the get's maximum number of values
to return per Column Family
- getMaxResultsPerColumnFamily() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getMaxValue() - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- getMaxVersions() - Method in class org.apache.hadoop.hbase.client.Get
-
Method for retrieving the get's maximum number of versions
- getMaxVersions() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getMaxVersions() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getMD5AsHex(byte[]) - Static method in class org.apache.hadoop.hbase.util.MD5Hash
-
Given a byte array, returns its MD5 hash as a hex string.
- getMD5AsHex(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.MD5Hash
-
Given a byte array, returns its MD5 hash as a hex string.
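A small, self-contained sketch of hashing a byte array (and a sub-range of it) with these static helpers; the input value is arbitrary.

  import org.apache.hadoop.hbase.util.Bytes;
  import org.apache.hadoop.hbase.util.MD5Hash;

  public class Md5Example {
    public static void main(String[] args) {
      byte[] data = Bytes.toBytes("row-key-0001");
      // Hash the whole array, then only its first three bytes.
      System.out.println(MD5Hash.getMD5AsHex(data));
      System.out.println(MD5Hash.getMD5AsHex(data, 0, 3));
    }
  }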
- getMemStoreFlushSize() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Returns the size of the memstore after which a flush to filesystem is triggered.
- getMemstoreSizeInMB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getMemStoreSizeMB() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getMetricsMap() - Method in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
Get all of the values since the last time this function was called.
- getMin() - Method in class org.apache.hadoop.hbase.filter.TimestampsFilter
-
Gets the minimum timestamp requested by filter.
- getMin() - Method in class org.apache.hadoop.hbase.io.TimeRange
-
- getMinColumn() - Method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- getMinColumnInclusive() - Method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- getMinValue() - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- getMinVersions() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getMobThreshold() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Gets the mob threshold of the family.
- getMutations() - Method in class org.apache.hadoop.hbase.client.RowMutations
-
- getName() - Method in interface org.apache.hadoop.hbase.client.BufferedMutator
-
Gets the fully qualified table name instance of the table that this BufferedMutator writes to.
- getName() - Method in interface org.apache.hadoop.hbase.client.RegionLocator
-
Gets the fully qualified table name instance of this table.
- getName() - Method in enum org.apache.hadoop.hbase.client.security.SecurityCapability
-
- getName() - Method in interface org.apache.hadoop.hbase.client.Table
-
Gets the fully qualified table name instance of this table.
- getName() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getName() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getName() - Method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
- getName() - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Return this Cipher's name
- getName() - Method in interface org.apache.hadoop.hbase.io.crypto.CipherProvider
-
Return the provider's name
- getName() - Method in class org.apache.hadoop.hbase.io.crypto.DefaultCipherProvider
-
- getName() - Method in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- getName() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getName() - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- getName() - Method in class org.apache.hadoop.hbase.security.User
-
Returns the full user name.
- getName() - Method in class org.apache.hadoop.hbase.TableName
-
- getNameAsString() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getNameAsString() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Get the name of the table as a String
- getNameAsString() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getNameAsString() - Method in class org.apache.hadoop.hbase.TableName
-
- getNameFromId(short) - Static method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
Find and return the name of data block encoder for the given id.
- getNameInBytes() - Method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
- getNamespace() - Method in class org.apache.hadoop.hbase.quotas.QuotaSettings
-
- getNamespace() - Method in class org.apache.hadoop.hbase.TableName
-
- getNamespaceAsString() - Method in class org.apache.hadoop.hbase.TableName
-
- getNamespaceDescriptor(String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get a namespace descriptor by name
- getNamespaceFilter() - Method in class org.apache.hadoop.hbase.quotas.QuotaFilter
-
- getNameWithNamespaceInclAsString() - Method in class org.apache.hadoop.hbase.TableName
-
Ideally, getNameAsString should contain namespace within it,
but if the namespace is default, it just returns the name.
- getNextCellHint(Cell) - Method in class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
- getNextCellHint(Cell) - Method in class org.apache.hadoop.hbase.filter.ColumnPrefixFilter
-
- getNextCellHint(Cell) - Method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- getNextCellHint(Cell) - Method in class org.apache.hadoop.hbase.filter.Filter
-
If the filter returns the match code SEEK_NEXT_USING_HINT, then it should also tell which is
the next key it must seek to.
- getNextCellHint(Cell) - Method in class org.apache.hadoop.hbase.filter.FilterList
-
- getNextCellHint(Cell) - Method in class org.apache.hadoop.hbase.filter.FuzzyRowFilter
-
- getNextCellHint(Cell) - Method in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- getNextCellHint(Cell) - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter
-
- getNonceGenerator() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use thru HConnection
- getNonceKey() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getNoVersionMap() - Method in class org.apache.hadoop.hbase.client.Result
-
Map of families to their most recent qualifiers and values.
- getNumberOfRegions() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getNumberOfRequests() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getNumExceptions() - Method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- getOffset() - Method in class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
- getOffset() - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- getOffset() - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
The offset, the index into the underlying byte[] at which this range
begins.
- getOffset() - Method in class org.apache.hadoop.hbase.util.Bytes
-
- getOnlineRegions(ServerName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get all the online regions on a region server.
- getOperationTimeout() - Method in interface org.apache.hadoop.hbase.client.Admin
-
- getOperator() - Method in class org.apache.hadoop.hbase.filter.BitComparator
-
- getOperator() - Method in class org.apache.hadoop.hbase.filter.CompareFilter
-
- getOperator() - Method in class org.apache.hadoop.hbase.filter.FilterList
-
Get the operator.
- getOperator() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- getOrder() - Method in interface org.apache.hadoop.hbase.types.DataType
-
Retrieve the sort Order imposed by this data type, or null when natural ordering is not preserved.
- getOrder() - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.OrderedBytesBase
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.PBType
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.RawByte
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.RawBytes
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.RawDouble
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.RawFloat
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.RawInteger
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.RawLong
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.RawShort
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.RawString
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.Struct
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.Union2
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.Union3
-
- getOrder() - Method in class org.apache.hadoop.hbase.types.Union4
-
- getOutputCommitter(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat
-
- getOutputCommitter(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputFormat
-
Returns the output committer.
- getOverallAverageLatency() - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer.HTableMultiplexerStatus
-
- getOwnerString() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
- getPageSize() - Method in class org.apache.hadoop.hbase.filter.PageFilter
-
- getParentId() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getPartition(ImmutableBytesWritable, V2, int) - Method in class org.apache.hadoop.hbase.mapred.HRegionPartitioner
-
- getPartition(ImmutableBytesWritable, VALUE, int) - Method in class org.apache.hadoop.hbase.mapreduce.HRegionPartitioner
-
Gets the partition number for a given key (hence record) given the total
number of partitions, i.e. the number of reduce tasks for the job.
- getPartition(ImmutableBytesWritable, VALUE, int) - Method in class org.apache.hadoop.hbase.mapreduce.SimpleTotalOrderPartitioner
-
- getPassword(Configuration, String, String) - Static method in class org.apache.hadoop.hbase.HBaseConfiguration
-
Get the password from the Configuration instance using the
getPassword method if it exists.
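A hedged sketch of looking up a secret through this helper; the key "my.service.password" is hypothetical, and the third argument is the fallback returned when neither a credential provider entry nor the plain configuration key is present.

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.hbase.HBaseConfiguration;

  public class PasswordLookup {
    public static void main(String[] args) throws Exception {
      Configuration conf = HBaseConfiguration.create();
      // "my.service.password" is an illustrative key, not a standard HBase property.
      String secret = HBaseConfiguration.getPassword(conf, "my.service.password", "default-secret");
      System.out.println(secret);
    }
  }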
- getPeerConfig(String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
- getPeerData() - Method in class org.apache.hadoop.hbase.replication.ReplicationPeerConfig
-
- getPeersCount() - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Get the number of slave clusters the local cluster has.
- getPeerState(String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Get the state of the specified peer cluster
- getPeerTableCFs(String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Get the replicable table-cf config of the specified peer.
- getPool() - Method in class org.apache.hadoop.hbase.client.BufferedMutatorParams
-
- getPort() - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getPort() - Method in exception org.apache.hadoop.hbase.ipc.RemoteWithExtrasException
-
- getPort() - Method in class org.apache.hadoop.hbase.ServerName
-
- getPos() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- getPos() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- getPosition() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
The current position marker.
- getPrefix() - Method in class org.apache.hadoop.hbase.filter.ColumnPrefixFilter
-
- getPrefix() - Method in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- getPrefix() - Method in class org.apache.hadoop.hbase.filter.PrefixFilter
-
- getProcId() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getProcName() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getProcOwner() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getProcState() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getProgress() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- getProgress() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- getProgress() - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
The current progress of the record reader through its data.
- getProgress() - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
The current progress of the record reader through its data.
- getPromotedValueFromProto(HBaseProtos.DoubleMsg) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- getProtocol() - Method in exception org.apache.hadoop.hbase.exceptions.UnknownProtocolException
-
- getProtoForCellType(Double) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- getProtoForPromotedType(Double) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- getProvider() - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Return the provider for this Cipher
- getQualifier() - Method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- getQualifier() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- getQualifier() - Method in class org.apache.hadoop.hbase.TableName
-
- getQualifierArray() - Method in interface org.apache.hadoop.hbase.Cell
-
Contiguous raw bytes that may start at any index in the containing array.
- getQualifierAsString() - Method in class org.apache.hadoop.hbase.TableName
-
- getQualifierBufferShallowCopy(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Deprecated.
As of release 2.0.0, this will be removed in HBase 3.0.0.
- getQualifierLength() - Method in interface org.apache.hadoop.hbase.Cell
-
- getQualifierOffset() - Method in interface org.apache.hadoop.hbase.Cell
-
- getQuantiles(double[]) - Method in class org.apache.hadoop.hbase.util.FastLongHistogram
-
Computes the quantiles given the ratios.
- getQuotaRetriever(QuotaFilter) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Return a QuotaRetriever to list the quotas based on the filter.
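A sketch of listing quotas through the retriever returned by this method; it assumes the retriever is Closeable and iterable over the QuotaSettings entries listed elsewhere in this index, and uses an empty QuotaFilter to match everything.

  import java.io.IOException;
  import org.apache.hadoop.hbase.client.Admin;
  import org.apache.hadoop.hbase.quotas.QuotaFilter;
  import org.apache.hadoop.hbase.quotas.QuotaRetriever;
  import org.apache.hadoop.hbase.quotas.QuotaSettings;

  public class ListQuotas {
    // Assumes an Admin instance obtained elsewhere.
    static void list(Admin admin) throws IOException {
      try (QuotaRetriever quotas = admin.getQuotaRetriever(new QuotaFilter())) {
        for (QuotaSettings settings : quotas) {
          System.out.println(settings.getQuotaType()
              + " table=" + settings.getTableName()
              + " user=" + settings.getUserName());
        }
      }
    }
  }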
- getQuotaType() - Method in class org.apache.hadoop.hbase.quotas.QuotaSettings
-
- getRandomKey() - Method in class org.apache.hadoop.hbase.io.crypto.Cipher
-
Create a random symmetric key
- getReadRequestsCount() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getReadRequestsCount() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getRecordReader(InputSplit, JobConf, Reporter) - Method in class org.apache.hadoop.hbase.mapred.MultiTableSnapshotInputFormat
-
- getRecordReader(InputSplit, JobConf, Reporter) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
Builds a TableRecordReader.
- getRecordReader(InputSplit, JobConf, Reporter) - Method in class org.apache.hadoop.hbase.mapred.TableSnapshotInputFormat
-
- getRecordWriter(FileSystem, JobConf, String, Progressable) - Method in class org.apache.hadoop.hbase.mapred.TableOutputFormat
-
Creates a new record writer.
- getRecordWriter(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat
-
Deprecated.
- getRecordWriter(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2
-
- getRecordWriter(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat
-
- getRecordWriter(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputFormat
-
Creates a new record writer.
- getRegionCachePrefetch(TableName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
always return false since 0.99
- getRegionCachePrefetch(byte[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
always return false since 0.99
- getRegionId() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getRegionInfo() - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getRegionLocation(TableName, byte[], boolean) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use thru HConnection
- getRegionLocation(byte[], byte[], boolean) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- getRegionLocation(byte[]) - Method in interface org.apache.hadoop.hbase.client.RegionLocator
-
Finds the region on which the given row is being served.
- getRegionLocation(byte[], boolean) - Method in interface org.apache.hadoop.hbase.client.RegionLocator
-
Finds the region on which the given row is being served.
- getRegionLocation() - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getRegionLocation() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Returns the region location.
- getRegionLocator(TableName) - Method in interface org.apache.hadoop.hbase.client.Connection
-
Retrieve a RegionLocator implementation to inspect region information on a table.
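A minimal sketch of locating the region serving a row via a RegionLocator; the table name "t1" and the row key are illustrative assumptions, and the Connection is assumed to be open.

  import java.io.IOException;
  import org.apache.hadoop.hbase.HRegionLocation;
  import org.apache.hadoop.hbase.TableName;
  import org.apache.hadoop.hbase.client.Connection;
  import org.apache.hadoop.hbase.client.RegionLocator;
  import org.apache.hadoop.hbase.util.Bytes;

  public class LocateRow {
    static void locate(Connection connection) throws IOException {
      try (RegionLocator locator = connection.getRegionLocator(TableName.valueOf("t1"))) {
        HRegionLocation location = locator.getRegionLocation(Bytes.toBytes("row-0001"));
        System.out.println("Region: " + location.getRegionInfo().getRegionNameAsString()
            + " on " + location.getServerName());
      }
    }
  }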
- getRegionLocator(TableName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Retrieve a RegionLocator implementation to inspect region information on a table.
- getRegionLocator() - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
- getRegionName() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getRegionNameAsString() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getRegionNameAsStringForDisplay(HRegionInfo, Configuration) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Get the region name for display.
- getRegionNameForDisplay(HRegionInfo, Configuration) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Get the region name for display.
- getRegionReplication() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Returns the configured replicas per region
- getRegionsCount() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getRegionServer(int) - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- getRegionServerCoprocessors() - Method in class org.apache.hadoop.hbase.ServerLoad
-
Return the RegionServer-level coprocessors
- getRegionServers() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- getRegionsInTransition() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getRegionsLoad() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getRegionSplitPolicyClassName() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
This gets the class associated with the region split policy which
determines when a region split should occur.
- getRemaining() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
The number of bytes remaining between position and the end of the range.
- getReplicaId() - Method in class org.apache.hadoop.hbase.client.Query
-
Returns region replica id where Query will fetch data from.
- getReplicaId() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
Returns the region replica id
- getReplicationEndpointImpl() - Method in class org.apache.hadoop.hbase.replication.ReplicationPeerConfig
-
- getReplicationLoadSink() - Method in class org.apache.hadoop.hbase.ServerLoad
-
Call directly from client such as hbase shell
- getReplicationLoadSourceList() - Method in class org.apache.hadoop.hbase.ServerLoad
-
Call directly from client such as hbase shell
- getRequestData() - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- getRequestsCount() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getRequestsCount() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getRequestsPerSecond() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getRestVersion() - Method in class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
- getResult() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getRevision() - Static method in class org.apache.hadoop.hbase.util.VersionInfo
-
Get the subversion revision number for the root directory
- getRootIndexSizeKB() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getRootIndexSizeKB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getRow() - Method in class org.apache.hadoop.hbase.client.Get
-
Method for retrieving the get's row
- getRow() - Method in class org.apache.hadoop.hbase.client.Mutation
-
Method for retrieving the mutation's row
- getRow() - Method in class org.apache.hadoop.hbase.client.Result
-
Method for retrieving the row key that corresponds to
the row from which this Result was created.
- getRow(int) - Method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- getRow() - Method in interface org.apache.hadoop.hbase.client.Row
-
- getRow() - Method in class org.apache.hadoop.hbase.client.RowMutations
-
- getRowArray() - Method in interface org.apache.hadoop.hbase.Cell
-
Contiguous raw bytes that may start at any index in the containing array.
- getRowAsInt(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Converts the rowkey bytes of the given cell into an int value
- getRowByte(Cell, int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Returns the row byte at the given index of the cell.
- getRowComparator() - Method in class org.apache.hadoop.hbase.TableName
-
Deprecated.
The comparator is an internal property of the table. Should
not have been exposed here
- getRowLength() - Method in interface org.apache.hadoop.hbase.Cell
-
- getRowOffset() - Method in interface org.apache.hadoop.hbase.Cell
-
- getRowOffsetPerColumnFamily() - Method in class org.apache.hadoop.hbase.client.Get
-
Method for retrieving the get's offset per row per column
family (#kvs to be skipped)
- getRowOffsetPerColumnFamily() - Method in class org.apache.hadoop.hbase.client.Scan
-
Method for retrieving the scan's offset per row per column
family (#kvs to be skipped)
- getRowRanges() - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter
-
- getRsCoprocessors() - Method in class org.apache.hadoop.hbase.ServerLoad
-
Return the RegionServer-level and Region-level coprocessors
- getScan() - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Gets the scan defining the actual details like columns etc.
- getScan() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Returns a Scan object from the stored string representation.
- getScanMetrics() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getScanner(Scan) - Method in interface org.apache.hadoop.hbase.client.Table
-
Returns a scanner on the current table as specified by the Scan object.
- getScanner(byte[]) - Method in interface org.apache.hadoop.hbase.client.Table
-
Gets a scanner on the current table for the given family.
- getScanner(byte[], byte[]) - Method in interface org.apache.hadoop.hbase.client.Table
-
Gets a scanner on the current table for the given family and qualifier.
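A short sketch of scanning one column family with these getScanner variants; the family name "cf" is illustrative, and the Table handle is assumed to come from an open Connection.

  import java.io.IOException;
  import org.apache.hadoop.hbase.client.Result;
  import org.apache.hadoop.hbase.client.ResultScanner;
  import org.apache.hadoop.hbase.client.Table;
  import org.apache.hadoop.hbase.util.Bytes;

  public class ScanFamily {
    static void scanFamily(Table table) throws IOException {
      try (ResultScanner scanner = table.getScanner(Bytes.toBytes("cf"))) {
        for (Result result : scanner) {
          System.out.println(Bytes.toString(result.getRow()));
        }
      }
    }
  }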
- getScanner(Scan) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- getScanner(byte[]) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- getScanner(byte[], byte[]) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- getScans() - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormatBase
-
Allows subclasses to get the list of Scan objects.
- getScope() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getSecond() - Method in class org.apache.hadoop.hbase.util.Pair
-
Return the second element stored in the pair.
- getSecond() - Method in class org.apache.hadoop.hbase.util.PairOfSameType
-
Return the second element stored in the pair.
- getSecretKeyForSubject(String, Configuration) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Resolves a key for the given subject
- getSecurityCapabilities() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Return the set of supported security capabilities.
- getSeqNum() - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getSequenceId() - Method in interface org.apache.hadoop.hbase.Cell
-
A region-specific unique monotonically increasing sequence ID given to each Cell.
- getServerName() - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- getServerName() - Method in class org.apache.hadoop.hbase.ServerName
-
- getServerName(String, long) - Static method in class org.apache.hadoop.hbase.ServerName
-
- getServerNameLessStartCode(String) - Static method in class org.apache.hadoop.hbase.ServerName
-
Utility method to excise the start code from a server name
- getServers() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getServersSize() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getServerStartcodeFromServerName(String) - Static method in class org.apache.hadoop.hbase.ServerName
-
- getShort(int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Retrieve the short value at index
- getShort() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Retrieve the next short value from this range.
- getShortName() - Method in class org.apache.hadoop.hbase.security.User
-
Returns the shortened version of the user name -- the portion that maps
to an operating system user name.
- getShortNameToLog() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getSize() - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- getSize() - Method in class org.apache.hadoop.hbase.util.Bytes
-
- getSkipBadLines() - Method in class org.apache.hadoop.hbase.mapreduce.TextSortReducer
-
- getSkipBadLines() - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- getSkipBadLines() - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterTextMapper
-
- getSnapshotDescription() - Method in exception org.apache.hadoop.hbase.snapshot.HBaseSnapshotException
-
- getSnapshotList(Configuration) - Static method in class org.apache.hadoop.hbase.snapshot.SnapshotInfo
-
Returns the list of available snapshots in the specified location
- getSnapshotStats(Configuration, HBaseProtos.SnapshotDescription) - Static method in class org.apache.hadoop.hbase.snapshot.SnapshotInfo
-
Returns the snapshot stats
- getSource() - Method in exception org.apache.hadoop.hbase.errorhandling.ForeignException
-
- getSourceName() - Method in exception org.apache.hadoop.hbase.errorhandling.TimeoutException
-
- getSplitKey(byte[], byte[], boolean) - Static method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Select a split point in the region.
- getSplits(JobConf, int) - Method in class org.apache.hadoop.hbase.mapred.MultiTableSnapshotInputFormat
-
- getSplits(JobConf, int) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
Calculates the splits that will serve as input for the map tasks.
- getSplits(JobConf, int) - Method in class org.apache.hadoop.hbase.mapred.TableSnapshotInputFormat
-
- getSplits(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.HLogInputFormat
-
Deprecated.
- getSplits(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormatBase
-
Calculates the splits that will serve as input for the map tasks.
- getSplits(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableSnapshotInputFormat
-
- getSplits(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Calculates the splits that will serve as input for the map tasks.
- getSplits(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Calculates the splits that will serve as input for the map tasks.
- getSplits(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormat
-
- getSplits(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.WALInputFormat
-
- getSrcChecksum() - Static method in class org.apache.hadoop.hbase.util.VersionInfo
-
Get the checksum of the source files from which HBase was compiled.
- getStart() - Method in exception org.apache.hadoop.hbase.errorhandling.TimeoutException
-
- getStartcode() - Method in class org.apache.hadoop.hbase.ServerName
-
- getStartEndKeys() - Method in interface org.apache.hadoop.hbase.client.RegionLocator
-
Gets the starting and ending row keys for every region in the currently
open table.
- getStartEndKeys() - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
- getStartEndKeys() - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
- getStartKey(byte[]) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Gets the start key from the specified region name.
- getStartKey() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- getStartKeyForDisplay(HRegionInfo, Configuration) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Get the start key for display.
- getStartKeys() - Method in interface org.apache.hadoop.hbase.client.RegionLocator
-
Gets the starting row key for every region in the currently open table.
- getStartRow() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getStartRow() - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
- getStartRow() - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getStartRow() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Returns the start row.
- getStartTime() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- getStats() - Method in class org.apache.hadoop.hbase.client.Result
-
- getStopRow() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getStopRow() - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
- getStopRowKey() - Method in class org.apache.hadoop.hbase.filter.InclusiveStopFilter
-
- getStoreCompleteSequenceId() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getStorefileIndexSizeInMB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getStorefileIndexSizeMB() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getStorefiles() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getStorefiles() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getStorefileSizeInMB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getStorefileSizeMB() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getStores() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getStores() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getStoreUncompressedSizeMB() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getStoreUncompressedSizeMB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getStream() - Method in class org.apache.hadoop.hbase.rest.client.Response
-
Gets the input stream instance.
- getSupportedCiphers() - Method in interface org.apache.hadoop.hbase.io.crypto.CipherProvider
-
Return the set of Ciphers supported by this provider
- getSupportedCiphers() - Method in class org.apache.hadoop.hbase.io.crypto.DefaultCipherProvider
-
- getSupportedCiphers() - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Get names of supported encryption algorithms
- getSupportedCiphers(Configuration) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Get names of supported encryption algorithms
- getSupportedFilters() - Method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Return a Set of filters supported by the Filter Language
- getTable(TableName) - Method in interface org.apache.hadoop.hbase.client.Connection
-
Retrieve a Table implementation for accessing a table.
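A minimal sketch of the usual lifecycle around this method: a shared, heavyweight Connection and a lightweight Table handle that is obtained and closed per unit of work. The table name "t1" is an illustrative assumption.

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.hbase.HBaseConfiguration;
  import org.apache.hadoop.hbase.TableName;
  import org.apache.hadoop.hbase.client.Connection;
  import org.apache.hadoop.hbase.client.ConnectionFactory;
  import org.apache.hadoop.hbase.client.Table;

  public class TableLifecycle {
    public static void main(String[] args) throws Exception {
      Configuration conf = HBaseConfiguration.create();
      try (Connection connection = ConnectionFactory.createConnection(conf);
           Table table = connection.getTable(TableName.valueOf("t1"))) {
        System.out.println("Opened " + table.getName());
      }
    }
  }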
- getTable(TableName, ExecutorService) - Method in interface org.apache.hadoop.hbase.client.Connection
-
Retrieve a Table implementation for accessing a table.
- getTable(String) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Retrieve an HTableInterface implementation for access to a table.
- getTable(byte[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Retrieve an HTableInterface implementation for access to a table.
- getTable(TableName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Retrieve an HTableInterface implementation for access to a table.
- getTable(String, ExecutorService) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Retrieve an HTableInterface implementation for access to a table.
- getTable(byte[], ExecutorService) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Retrieve an HTableInterface implementation for access to a table.
- getTable(TableName, ExecutorService) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Retrieve an HTableInterface implementation for access to a table.
- getTable(byte[]) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Gets the table name from the specified region name.
- getTable() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
Get current table name of the region
- getTable() - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
Allows subclasses to get the Table.
- getTable() - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getTable() - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Allows subclasses to get the Table.
- getTable() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Returns the table name.
- getTableDescriptor(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Method for getting the tableDescriptor
- getTableDescriptor() - Method in interface org.apache.hadoop.hbase.client.Table
-
- getTableDescriptor() - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- getTableDescriptors(List<String>) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get tableDescriptors
- getTableDescriptorsByTableName(List<TableName>) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get tableDescriptors
- getTableDir(Path, byte[]) - Static method in class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
- getTableFilter() - Method in class org.apache.hadoop.hbase.quotas.QuotaFilter
-
- getTableList() - Method in class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
- getTableName() - Method in class org.apache.hadoop.hbase.client.BufferedMutatorParams
-
- getTableName() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Get the name of the table
- getTableName() - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- getTableName() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Returns the table name converted to a byte array.
- getTableName() - Method in class org.apache.hadoop.hbase.quotas.QuotaSettings
-
- getTableName() - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- getTableNames() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
- getTableRegions(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Get the regions of a given table.
- getTableState(TableName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Retrieve TableState, represent current table state.
- getTagArray(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Returns tag value in a new byte array.
- getTagsArray() - Method in interface org.apache.hadoop.hbase.Cell
-
- getTagsLength() - Method in interface org.apache.hadoop.hbase.Cell
-
- getTagsOffset() - Method in interface org.apache.hadoop.hbase.Cell
-
- getTimeRange() - Method in class org.apache.hadoop.hbase.client.Get
-
Method for retrieving the get's TimeRange
- getTimeRange() - Method in class org.apache.hadoop.hbase.client.Increment
-
Gets the TimeRange used for this increment.
- getTimeRange() - Method in class org.apache.hadoop.hbase.client.Scan
-
- getTimestamp() - Method in interface org.apache.hadoop.hbase.Cell
-
- getTimeStamp() - Method in class org.apache.hadoop.hbase.client.Mutation
-
Method for retrieving the timestamp
- getTimestamps() - Method in class org.apache.hadoop.hbase.filter.TimestampsFilter
-
- getTimeToLive() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getToken(String, String) - Method in class org.apache.hadoop.hbase.security.User
-
Returns the Token of the specified kind associated with this user,
or null if the Token is not present.
- getTokens() - Method in class org.apache.hadoop.hbase.security.User
-
Returns all the tokens stored in the user's credentials.
- getTotalBufferedCounter() - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer.HTableMultiplexerStatus
-
- getTotalCompactingKVs() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getTotalCompactingKVs() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getTotalFailedCounter() - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer.HTableMultiplexerStatus
-
- getTotalNumberOfRequests() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getTotalSizeOfCells(Result) - Static method in class org.apache.hadoop.hbase.client.Result
-
Get total size of raw cells
- getTotalStaticBloomSizeKB() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getTotalStaticBloomSizeKB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getTotalStaticIndexSizeKB() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getTotalStaticIndexSizeKB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getTs() - Method in class org.apache.hadoop.hbase.mapreduce.TextSortReducer
-
- getTs() - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- getTTL() - Method in class org.apache.hadoop.hbase.client.Mutation
-
Return the TTL requested for the result of the mutation, in milliseconds.
- getType() - Method in exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- getTypeByte() - Method in interface org.apache.hadoop.hbase.Cell
-
- getTypeFilters() - Method in class org.apache.hadoop.hbase.quotas.QuotaFilter
-
- getUGI() - Method in class org.apache.hadoop.hbase.security.User
-
- getUnit(String) - Static method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getUrl() - Static method in class org.apache.hadoop.hbase.util.VersionInfo
-
Get the subversion URL for the root hbase directory.
- getUsedHeapMB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getUser() - Static method in class org.apache.hadoop.hbase.util.VersionInfo
-
The user that compiled hbase.
- getUserAuths(byte[], boolean) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
Retrieve the visibility labels for the user.
- getUserFilter() - Method in class org.apache.hadoop.hbase.quotas.QuotaFilter
-
- getUserName() - Method in class org.apache.hadoop.hbase.quotas.QuotaSettings
-
- getUserPermissions(Connection, String) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
List all the userPermissions matching the given pattern.
- getUtf8ByteArrays(List<String>) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- getValue(byte[], byte[], Cell) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- getValue(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Result
-
Get the latest version of the specified column.
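A small sketch of reading the latest value of one column with a Get and this accessor; the row key, family "cf" and qualifier "q1" are illustrative assumptions.

  import java.io.IOException;
  import org.apache.hadoop.hbase.client.Get;
  import org.apache.hadoop.hbase.client.Result;
  import org.apache.hadoop.hbase.client.Table;
  import org.apache.hadoop.hbase.util.Bytes;

  public class ReadOneCell {
    // Assumes a Table handle obtained from an open Connection.
    static String readCell(Table table) throws IOException {
      Result result = table.get(new Get(Bytes.toBytes("row-0001")));
      byte[] value = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("q1"));
      return value == null ? null : Bytes.toString(value);
    }
  }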
- getValue() - Method in enum org.apache.hadoop.hbase.client.security.SecurityCapability
-
- getValue() - Method in class org.apache.hadoop.hbase.filter.ByteArrayComparable
-
- getValue() - Method in class org.apache.hadoop.hbase.filter.SubstringComparator
-
- getValue(byte[]) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getValue(String) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getValue(byte[]) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Getter for accessing the metadata associated with the key
- getValue(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Getter for accessing the metadata associated with the key
- getValueArray() - Method in interface org.apache.hadoop.hbase.Cell
-
Contiguous raw bytes that may start at any index in the containing array.
- getValueAsBigDecimal(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Converts the value bytes of the given cell into a BigDecimal
- getValueAsByteBuffer(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Result
-
Returns the value wrapped in a new ByteBuffer.
- getValueAsByteBuffer(byte[], int, int, byte[], int, int) - Method in class org.apache.hadoop.hbase.client.Result
-
Returns the value wrapped in a new ByteBuffer.
- getValueAsDouble(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Converts the value bytes of the given cell into a double value
- getValueAsLong(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Converts the value bytes of the given cell into a long value
- getValueBufferShallowCopy(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- getValueLength() - Method in interface org.apache.hadoop.hbase.Cell
-
- getValueOffset() - Method in interface org.apache.hadoop.hbase.Cell
-
- getValues() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- getValues() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- getVersion() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- getVersion() - Method in class org.apache.hadoop.hbase.security.access.Permission
-
- getVersion() - Static method in class org.apache.hadoop.hbase.util.VersionInfo
-
Get the hbase version.
- getVersionedBytes() - Method in class org.apache.hadoop.hbase.ServerName
-
- getVisibilityExpEvaluator(Authorizations) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
Creates VisibilityExpEvaluator corresponding to given Authorizations.
- getVisibilityExpressionResolver() - Method in class org.apache.hadoop.hbase.mapreduce.CellCreator
-
- getVLong(int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Retrieve the long value at index, which is stored as VLong
- getVLong() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Retrieve the next long value, which is stored as VLong, from this range
- getWaitInterval() - Method in exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- getWriteBufferSize() - Method in interface org.apache.hadoop.hbase.client.BufferedMutator
-
Returns the maximum size in bytes of the write buffer for this HTable.
- getWriteBufferSize() - Method in class org.apache.hadoop.hbase.client.BufferedMutatorParams
-
- getWriteBufferSize() - Method in interface org.apache.hadoop.hbase.client.Table
-
- getWriteBufferSize() - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- getWriteRequestsCount() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- getWriteRequestsCount() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- getZooKeeperServerNum() - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- grant(Connection, TableName, String, byte[], byte[], Permission.Action...) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
Grants permission on the specified table for the specified user
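A hedged sketch of a table-scope grant through this utility; it assumes the AccessController coprocessor is enabled on the cluster, that null family and qualifier mean table scope, and that the call may surface a Throwable. The table "t1" and user "alice" are illustrative.

  import org.apache.hadoop.hbase.TableName;
  import org.apache.hadoop.hbase.client.Connection;
  import org.apache.hadoop.hbase.security.access.AccessControlClient;
  import org.apache.hadoop.hbase.security.access.Permission;

  public class GrantRead {
    // Assumes an open Connection with sufficient admin privileges.
    static void grantRead(Connection connection) throws Throwable {
      AccessControlClient.grant(connection, TableName.valueOf("t1"), "alice",
          null, null, Permission.Action.READ);
    }
  }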
- grant(Connection, String, String, Permission.Action...) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
Grants permission on the specified namespace for the specified user.
- grant(Connection, String, Permission.Action...) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
- GREATER_THAN_ARRAY - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
GREATER_THAN Array
- GREATER_THAN_BUFFER - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
- GREATER_THAN_OR_EQUAL_TO_ARRAY - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
GREATER_THAN_OR_EQUAL_TO Array
- GREATER_THAN_OR_EQUAL_TO_BUFFER - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
- GROUP_COLUMNS - Static variable in class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
JobConf parameter to specify the columns used to produce the key passed to
collect from the map phase
- GROUP_COLUMNS - Static variable in class org.apache.hadoop.hbase.mapreduce.GroupingTableMapper
-
JobConf parameter to specify the columns used to produce the key passed to
collect from the map phase.
- GroupingTableMap - Class in org.apache.hadoop.hbase.mapred
-
Extract grouping columns from input record
- GroupingTableMap() - Constructor for class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
- GroupingTableMapper - Class in org.apache.hadoop.hbase.mapreduce
-
Extract grouping columns from input record.
- GroupingTableMapper() - Constructor for class org.apache.hadoop.hbase.mapreduce.GroupingTableMapper
-
- groupOrSplit(Multimap<ByteBuffer, LoadIncrementalHFiles.LoadQueueItem>, LoadIncrementalHFiles.LoadQueueItem, Table, Pair<byte[][], byte[][]>) - Method in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
Attempt to assign the given load queue item into its target region group.
- GZIP - Static variable in class org.apache.hadoop.hbase.util.Base64
-
Specify that data should be gzip-compressed.
- H - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for 'H'
- has(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Put
-
A convenience method to determine if this object's familyMap contains
a value assigned to the given family & qualifier.
- has(byte[], byte[], long) - Method in class org.apache.hadoop.hbase.client.Put
-
A convenience method to determine if this object's familyMap contains
a value assigned to the given family, qualifier and timestamp.
- has(byte[], byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.Put
-
A convenience method to determine if this object's familyMap contains
the given value assigned to the given family and qualifier.
- has(byte[], byte[], long, byte[]) - Method in class org.apache.hadoop.hbase.client.Put
-
A convenience method to determine if this object's familyMap contains
the given value assigned to the given family, qualifier and timestamp.
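A small, self-contained sketch of the simplest has() overload after adding a cell to a Put; the row, family and qualifier names are arbitrary.

  import org.apache.hadoop.hbase.client.Put;
  import org.apache.hadoop.hbase.util.Bytes;

  public class PutHasExample {
    public static void main(String[] args) {
      byte[] family = Bytes.toBytes("cf");
      byte[] qualifier = Bytes.toBytes("q1");
      Put put = new Put(Bytes.toBytes("row-0001"));
      put.addColumn(family, qualifier, Bytes.toBytes("v1"));
      // true: a cell for this family/qualifier pair was added above.
      System.out.println(put.has(family, qualifier));
      // false: nothing was added for this qualifier.
      System.out.println(put.has(family, Bytes.toBytes("q2")));
    }
  }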
- HAS_LARGE_RESULT - Static variable in class org.apache.hadoop.hbase.mapreduce.Import
-
- hasBody() - Method in class org.apache.hadoop.hbase.rest.client.Response
-
- hasClientAckTime() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- hasCoprocessor(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Check if the table has an attached co-processor represented by the name className
- hasCounter(String) - Method in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
- hasFamilies() - Method in class org.apache.hadoop.hbase.client.Get
-
Method for checking if any families have been inserted into this Get
- hasFamilies() - Method in class org.apache.hadoop.hbase.client.Increment
-
Method for checking if any families have been inserted into this Increment
- hasFamilies() - Method in class org.apache.hadoop.hbase.client.Scan
-
- hasFamily(byte[]) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Checks to see if this table contains the given column family
- hasFilter() - Method in class org.apache.hadoop.hbase.client.Scan
-
- hasFilterRow() - Method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- hasFilterRow() - Method in class org.apache.hadoop.hbase.filter.Filter
-
Primarily used to check for conflicts with scans (such as scans that do not read a full row at a
time).
- hasFilterRow() - Method in class org.apache.hadoop.hbase.filter.FilterList
-
- hasFilterRow() - Method in class org.apache.hadoop.hbase.filter.PageFilter
-
- hasFilterRow() - Method in class org.apache.hadoop.hbase.filter.RandomRowFilter
-
- hasFilterRow() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueExcludeFilter
-
- hasFilterRow() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- hasFilterRow() - Method in class org.apache.hadoop.hbase.filter.SkipFilter
-
- hasFilterRow() - Method in class org.apache.hadoop.hbase.filter.WhileMatchFilter
-
- hasFoundKV() - Method in class org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter
-
- hash128(String...) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Return the MD5 digest of the concatenation of the supplied arguments.
- hash128(byte[]...) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Return the MD5 digest of the concatenation of the supplied arguments.
- hash256(String...) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Return the SHA-256 digest of the concatenation of the supplied arguments.
- hash256(byte[]...) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Return the SHA-256 digest of the concatenation of the supplied arguments.
- hashCode() - Method in class org.apache.hadoop.hbase.client.Get
-
- hashCode() - Method in class org.apache.hadoop.hbase.client.Increment
-
- hashCode() - Method in class org.apache.hadoop.hbase.client.RowMutations
-
- hashCode() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- hashCode() - Method in class org.apache.hadoop.hbase.filter.NullComparator
-
- hashCode() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- hashCode() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- hashCode() - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- hashCode() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- hashCode() - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- hashCode() - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- hashCode() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
- hashCode() - Method in class org.apache.hadoop.hbase.security.access.Permission
-
- hashCode() - Method in class org.apache.hadoop.hbase.security.User
-
- hashCode() - Method in class org.apache.hadoop.hbase.ServerName
-
- hashCode() - Method in class org.apache.hadoop.hbase.TableName
-
- hashCode() - Method in class org.apache.hadoop.hbase.util.Bytes
-
- hashCode(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- hashCode(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- hashCode(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- hashCode() - Method in class org.apache.hadoop.hbase.util.Pair
-
- hashCode() - Method in class org.apache.hadoop.hbase.util.PairOfSameType
-
- hasHigherPriority(ByteBuffer, ByteBuffer) - Method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Returns which operator has higher precedence
- hasMaxHeapMB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- hasNext() - Method in class org.apache.hadoop.hbase.types.StructIterator
-
- hasNumberOfRequests() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- hasParentId() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- hasRegionMemstoreReplication() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- hasResultData() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- hasTotalNumberOfRequests() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- hasUsedHeapMB() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- havingSystemAuth(User) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
System checks for user auth during admin operations.
- HBASE_CANARY_WRITE_DATA_TTL_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Canary config keys
- HBASE_CANARY_WRITE_PERSERVER_REGIONS_LOWERLIMIT_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CANARY_WRITE_PERSERVER_REGIONS_UPPERLIMIT_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CANARY_WRITE_TABLE_CHECK_PERIOD_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CANARY_WRITE_VALUE_SIZE_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CHECKSUM_VERIFICATION - Static variable in class org.apache.hadoop.hbase.HConstants
-
If this parameter is set to true, then hbase will read
data and then verify checksums.
- HBASE_CLIENT_CONNECTION_IMPL - Static variable in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Configuration key whose value is the class used to implement a
new HConnection instance.
- HBASE_CLIENT_ENABLE_FAST_FAIL_MODE_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CLIENT_FAST_FAIL_CLEANUP_DURATION_MS_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CLIENT_FAST_FAIL_CLEANUP_MS_DURATION_MS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CLIENT_FAST_FAIL_INTERCEPTOR_IMPL - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CLIENT_FAST_FAIL_MODE_ENABLED - Static variable in class org.apache.hadoop.hbase.HConstants
-
Config for enabling/disabling the fast fail mode.
- HBASE_CLIENT_FAST_FAIL_THREASHOLD_MS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CLIENT_FAST_FAIL_THREASHOLD_MS_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_CLIENT_INSTANCE_ID - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for unique identifier for this Configuration
instance.
- HBASE_CLIENT_IPC_POOL_SIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for HBase client IPC pool size
- HBASE_CLIENT_IPC_POOL_TYPE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for HBase client IPC pool type
- HBASE_CLIENT_MAX_PERREGION_TASKS - Static variable in class org.apache.hadoop.hbase.HConstants
-
The maximum number of concurrent connections the client will maintain to a single
Region.
- HBASE_CLIENT_MAX_PERSERVER_TASKS - Static variable in class org.apache.hadoop.hbase.HConstants
-
The maximum number of concurrent connections the client will maintain to a single
RegionServer.
- HBASE_CLIENT_MAX_TOTAL_TASKS - Static variable in class org.apache.hadoop.hbase.HConstants
-
The maximum number of concurrent connections the client will maintain.
- HBASE_CLIENT_META_OPERATION_TIMEOUT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for HBase client operation timeout, which overrides RPC timeout
- HBASE_CLIENT_OPERATION_TIMEOUT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for HBase client operation timeout, which overrides RPC timeout
- HBASE_CLIENT_PAUSE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for client pause value, used mostly as value to wait
before running a retry of a failed get, region lookup, etc.
- HBASE_CLIENT_RETRIES_NUMBER - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for maximum retries, used as maximum for all retryable
operations such as fetching of the root region from root region server,
getting a cell's value, starting a row update, etc.
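A minimal sketch of tuning these client retry settings before creating a connection (the values shown are illustrative, not recommended defaults; assumes the standard org.apache.hadoop.hbase imports):
    Configuration conf = HBaseConfiguration.create();
    conf.setInt(HConstants.HBASE_CLIENT_RETRIES_NUMBER, 10);       // retry failed operations up to 10 times
    conf.setLong(HConstants.HBASE_CLIENT_PAUSE, 200);              // wait roughly 200 ms between retries
    conf.setInt(HConstants.HBASE_CLIENT_OPERATION_TIMEOUT, 60000); // cap a whole client operation at 60 s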
- HBASE_CLIENT_SCANNER_ASYNC_PREFETCH - Static variable in class org.apache.hadoop.hbase.client.Scan
-
Parameter name for client scanner sync/async prefetch toggle.
- HBASE_CLIENT_SCANNER_CACHING - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name to set the default scanner caching for all clients.
- HBASE_CLIENT_SCANNER_MAX_RESULT_SIZE_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for maximum number of bytes returned when calling a scanner's next method.
- HBASE_CLIENT_SCANNER_TIMEOUT_PERIOD - Static variable in class org.apache.hadoop.hbase.HConstants
-
The client scanner timeout period in milliseconds.
- HBASE_CLUSTER_MINIMUM_MEMORY_THRESHOLD - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_COORDINATED_STATE_MANAGER_CLASS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Config for pluggable consensus provider
- HBASE_DIR - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for HBase instance root directory
- HBASE_MASTER_LOADBALANCE_BYTABLE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Config for balancing the cluster by table
- HBASE_MASTER_LOADBALANCER_CLASS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Config for pluggable load balancers
- HBASE_MASTER_LOGCLEANER_PLUGINS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_MASTER_NORMALIZER_CLASS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Config for pluggable region normalizer
- HBASE_META_BLOCK_SIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for the HFile block size used by the meta table.
- HBASE_META_SCANNER_CACHING - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for number of rows that will be fetched when calling next on
a scanner if it is not served from memory.
- HBASE_META_VERSIONS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for number of versions, kept by meta table.
- HBASE_NON_TABLE_DIRS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Directories that are not HBase table directories
- HBASE_NON_USER_TABLE_DIRS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Directories that are not HBase user table directories
- HBASE_NORMALIZER_ENABLED - Static variable in class org.apache.hadoop.hbase.HConstants
-
Config for enabling/disabling pluggable region normalizer
- HBASE_REGION_SPLIT_POLICY_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HBASE_REGIONSERVER_LEASE_PERIOD_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Deprecated.
This config option is deprecated and will be removed in later releases after 0.96.
- HBASE_RPC_SHORTOPERATION_TIMEOUT_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
timeout for short operation RPC
- HBASE_RPC_TIMEOUT_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
timeout for each RPC
- HBASE_RS_NONCES_ENABLED - Static variable in class org.apache.hadoop.hbase.HConstants
-
Whether nonces are enabled; default is true.
- HBASE_SECURITY_AUTHORIZATION_CONF_KEY - Static variable in class org.apache.hadoop.hbase.security.User
-
- HBASE_SECURITY_CONF_KEY - Static variable in class org.apache.hadoop.hbase.security.User
-
- HBASE_SERVER_PAUSE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for server pause value, used mostly as value to wait before
running a retry of a failed operation.
- HBASE_SERVER_SCANNER_MAX_RESULT_SIZE_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for maximum number of bytes returned when calling a scanner's next method.
- HBASE_SPLITLOG_MANAGER_TIMEOUT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for SplitLog manager timeout
- HBASE_TEMP_DIRECTORY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Temporary directory used for table creation and deletion
- HBASECLIENT_IMPL - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for what hbase client implementation to use.
- HBaseConfiguration - Class in org.apache.hadoop.hbase
-
Adds HBase configuration files to a Configuration
- HBaseConfiguration() - Constructor for class org.apache.hadoop.hbase.HBaseConfiguration
-
Deprecated.
Please use create() instead.
- HBaseConfiguration(Configuration) - Constructor for class org.apache.hadoop.hbase.HBaseConfiguration
-
Deprecated.
Please use create(conf) instead.
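A short sketch of the recommended factory methods that replace the deprecated constructors above (otherConf is an assumed pre-existing Configuration):
    Configuration conf = HBaseConfiguration.create();            // loads hbase-default.xml and hbase-site.xml
    Configuration merged = HBaseConfiguration.create(otherConf); // merges the HBase resources into an existing conf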
- HBaseInterfaceAudience - Class in org.apache.hadoop.hbase
-
This class defines constants for the different classes of HBase LimitedPrivate APIs.
- HBaseIOException - Exception in org.apache.hadoop.hbase
-
All HBase-specific IOExceptions should be subclasses of HBaseIOException.
- HBaseIOException() - Constructor for exception org.apache.hadoop.hbase.HBaseIOException
-
- HBaseIOException(String) - Constructor for exception org.apache.hadoop.hbase.HBaseIOException
-
- HBaseIOException(String, Throwable) - Constructor for exception org.apache.hadoop.hbase.HBaseIOException
-
- HBaseIOException(Throwable) - Constructor for exception org.apache.hadoop.hbase.HBaseIOException
-
- HBaseSnapshotException - Exception in org.apache.hadoop.hbase.snapshot
-
General exception base class for when a snapshot fails
- HBaseSnapshotException(String) - Constructor for exception org.apache.hadoop.hbase.snapshot.HBaseSnapshotException
-
Some exception occurred for a snapshot, and the snapshot it concerned is not known.
- HBaseSnapshotException(String, HBaseProtos.SnapshotDescription) - Constructor for exception org.apache.hadoop.hbase.snapshot.HBaseSnapshotException
-
Exception for the given snapshot that has no previous root cause
- HBaseSnapshotException(String, Throwable, HBaseProtos.SnapshotDescription) - Constructor for exception org.apache.hadoop.hbase.snapshot.HBaseSnapshotException
-
Exception for the given snapshot due to another exception
- HBaseSnapshotException(String, Exception) - Constructor for exception org.apache.hadoop.hbase.snapshot.HBaseSnapshotException
-
Exception when the description of the snapshot cannot be determined, due to some other root cause.
- HBCK_CODE_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
HBCK special code name used as server name when manipulating ZK nodes
- HBCK_SIDELINEDIR_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
Used by HBCK to sideline backup data
- HColumnDescriptor - Class in org.apache.hadoop.hbase
-
An HColumnDescriptor contains information about a column family such as the
number of versions, compression settings, etc.
- HColumnDescriptor(String) - Constructor for class org.apache.hadoop.hbase.HColumnDescriptor
-
Construct a column descriptor specifying only the family name
The other attributes are defaulted.
- HColumnDescriptor(byte[]) - Constructor for class org.apache.hadoop.hbase.HColumnDescriptor
-
Construct a column descriptor specifying only the family name
The other attributes are defaulted.
- HColumnDescriptor(HColumnDescriptor) - Constructor for class org.apache.hadoop.hbase.HColumnDescriptor
-
Constructor.
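A minimal sketch of building a column family descriptor (family name and settings are illustrative):
    HColumnDescriptor family = new HColumnDescriptor("cf");
    family.setMaxVersions(3);   // keep up to three versions of each cell
    family.setInMemory(true);   // prefer the in-memory section of the block cache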
- HConnection - Interface in org.apache.hadoop.hbase.client
-
- HConstants - Class in org.apache.hadoop.hbase
-
HConstants holds a bunch of HBase-related constants
- head(String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a HEAD request
- head(Cluster, String, Header[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a HEAD request
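A sketch of issuing a HEAD request through the REST client (host, port, and path are hypothetical; assumes the HBase REST gateway is running and that head() returns a Response):
    Cluster cluster = new Cluster();
    cluster.add("resthost.example.com", 8080);
    Client client = new Client(cluster);
    Response response = client.head("/mytable/schema");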
- head(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- HEALTH_CHORE_WAKE_FREQ - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HEALTH_FAILURE_THRESHOLD - Static variable in class org.apache.hadoop.hbase.HConstants
-
The maximum number of health check failures a server can encounter consecutively.
- HEALTH_SCRIPT_LOC - Static variable in class org.apache.hadoop.hbase.HConstants
-
Health script related settings.
- HEALTH_SCRIPT_TIMEOUT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HEAP_OCCUPANCY_HIGH_WATERMARK_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HEAP_OCCUPANCY_LOW_WATERMARK_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- heapSize() - Method in class org.apache.hadoop.hbase.client.Mutation
-
- HFILE_ARCHIVE_DIRECTORY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Directory under /hbase where archived hfiles are stored
- HFILE_BLOCK_CACHE_SIZE_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HFILE_BLOCK_CACHE_SIZE_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for the size of the block cache
- HFILEBLOCK_DUMMY_HEADER - Static variable in class org.apache.hadoop.hbase.HConstants
-
Just an array of bytes of the right size.
- HFILEBLOCK_HEADER_SIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
The size of a version 2 HFile block header, minor version 1.
- HFILEBLOCK_HEADER_SIZE_NO_CHECKSUM - Static variable in class org.apache.hadoop.hbase.HConstants
-
The size of a version 2 HFile block header when the minor version is 0 (no checksum support).
- HFileOutputFormat - Class in org.apache.hadoop.hbase.mapreduce
-
- HFileOutputFormat() - Constructor for class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat
-
Deprecated.
- HFileOutputFormat2 - Class in org.apache.hadoop.hbase.mapreduce
-
Writes HFiles.
- HFileOutputFormat2() - Constructor for class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2
-
- HIDDEN_END_KEY - Static variable in class org.apache.hadoop.hbase.HRegionInfo
-
- HIDDEN_START_KEY - Static variable in class org.apache.hadoop.hbase.HRegionInfo
-
- HIGH_QOS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- hint - Variable in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- HLogInputFormat - Class in org.apache.hadoop.hbase.mapreduce
-
- HLogInputFormat() - Constructor for class org.apache.hadoop.hbase.mapreduce.HLogInputFormat
-
Deprecated.
- HOUR_IN_SECONDS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HREGION_COMPACTIONDIR_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
Used to construct the name of the compaction directory during compaction
- HREGION_EDITS_REPLAY_SKIP_ERRORS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- HREGION_LOGDIR_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
Used to construct the name of the log directory for a region server
- HREGION_MAX_FILESIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Conf key for the max file size after which we split the region
- HREGION_MEMSTORE_BLOCK_MULTIPLIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
Block updates if memstore has hbase.hregion.memstore.block.multiplier
times hbase.hregion.memstore.flush.size bytes.
- HREGION_MEMSTORE_FLUSH_SIZE - Static variable in class org.apache.hadoop.hbase.HConstants
-
Conf key for the memstore size at which we flush the memstore
- HREGION_OLDLOGDIR_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
Like HREGION_LOGDIR_NAME, but for old logs that are about to be deleted
- HRegionInfo - Class in org.apache.hadoop.hbase
-
Information about a region.
- HRegionInfo(long, TableName, int) - Constructor for class org.apache.hadoop.hbase.HRegionInfo
-
- HRegionInfo(TableName) - Constructor for class org.apache.hadoop.hbase.HRegionInfo
-
- HRegionInfo(TableName, byte[], byte[]) - Constructor for class org.apache.hadoop.hbase.HRegionInfo
-
Construct HRegionInfo with explicit parameters
- HRegionInfo(TableName, byte[], byte[], boolean) - Constructor for class org.apache.hadoop.hbase.HRegionInfo
-
Construct HRegionInfo with explicit parameters
- HRegionInfo(TableName, byte[], byte[], boolean, long) - Constructor for class org.apache.hadoop.hbase.HRegionInfo
-
Construct HRegionInfo with explicit parameters
- HRegionInfo(TableName, byte[], byte[], boolean, long, int) - Constructor for class org.apache.hadoop.hbase.HRegionInfo
-
Construct HRegionInfo with explicit parameters
- HRegionInfo(HRegionInfo) - Constructor for class org.apache.hadoop.hbase.HRegionInfo
-
Construct a copy of another HRegionInfo.
- HRegionInfo(HRegionInfo, int) - Constructor for class org.apache.hadoop.hbase.HRegionInfo
-
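A minimal sketch of constructing a region descriptor with explicit boundaries (table name and keys are illustrative):
    HRegionInfo region = new HRegionInfo(TableName.valueOf("mytable"),
        Bytes.toBytes("aaa"),   // start key, inclusive
        Bytes.toBytes("mmm"));  // end key, exclusive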
- HRegionLocation - Class in org.apache.hadoop.hbase
-
Data structure to hold HRegionInfo and the address for the hosting
HRegionServer.
- HRegionLocation(HRegionInfo, ServerName) - Constructor for class org.apache.hadoop.hbase.HRegionLocation
-
- HRegionLocation(HRegionInfo, ServerName, long) - Constructor for class org.apache.hadoop.hbase.HRegionLocation
-
- HRegionPartitioner<K2,V2> - Class in org.apache.hadoop.hbase.mapred
-
This is used to partition the output keys into groups of keys.
- HRegionPartitioner() - Constructor for class org.apache.hadoop.hbase.mapred.HRegionPartitioner
-
- HRegionPartitioner<KEY,VALUE> - Class in org.apache.hadoop.hbase.mapreduce
-
This is used to partition the output keys into groups of keys.
- HRegionPartitioner() - Constructor for class org.apache.hadoop.hbase.mapreduce.HRegionPartitioner
-
- HSTORE_OPEN_AND_CLOSE_THREADS_MAX - Static variable in class org.apache.hadoop.hbase.HConstants
-
The max number of threads used for opening and closing stores or store
files in parallel
- HTableDescriptor - Class in org.apache.hadoop.hbase
-
HTableDescriptor contains the details about an HBase table such as the descriptors of
all the column families, whether the table is a catalog table (-ROOT- or hbase:meta),
whether the table is read only, the maximum size of the memstore,
when the region split should occur, coprocessors associated with it, etc.
- HTableDescriptor(TableName, HColumnDescriptor[]) - Constructor for class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Private constructor used internally to create table descriptors for
catalog tables, hbase:meta and -ROOT-.
- HTableDescriptor(TableName, HColumnDescriptor[], Map<Bytes, Bytes>) - Constructor for class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Private constructor used internally to create table descriptors for
catalog tables, hbase:meta and -ROOT-.
- HTableDescriptor() - Constructor for class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
As of release 0.96 (HBASE-5453).
This was made protected in 2.0.0 and will be removed in HBase 3.0.0.
Used by Writables and Writables are going away.
- HTableDescriptor(TableName) - Constructor for class org.apache.hadoop.hbase.HTableDescriptor
-
Construct a table descriptor specifying a TableName object
- HTableDescriptor(byte[]) - Constructor for class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
- HTableDescriptor(String) - Constructor for class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
- HTableDescriptor(HTableDescriptor) - Constructor for class org.apache.hadoop.hbase.HTableDescriptor
-
Construct a table descriptor by cloning the descriptor passed as a parameter.
- HTableDescriptor(TableName, HTableDescriptor) - Constructor for class org.apache.hadoop.hbase.HTableDescriptor
-
Construct a table descriptor by cloning the descriptor passed as a parameter
but using a different table name.
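A minimal sketch of assembling and creating a table from these descriptors (assumes an open Admin named admin; names are illustrative):
    HTableDescriptor table = new HTableDescriptor(TableName.valueOf("mytable"));
    table.addFamily(new HColumnDescriptor("cf"));
    admin.createTable(table);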
- HTableMultiplexer - Class in org.apache.hadoop.hbase.client
-
HTableMultiplexer provides a thread-safe, non-blocking PUT API across all the tables.
- HTableMultiplexer(Configuration, int) - Constructor for class org.apache.hadoop.hbase.client.HTableMultiplexer
-
- HTableMultiplexer.HTableMultiplexerStatus - Class in org.apache.hadoop.hbase.client
-
HTableMultiplexerStatus keeps track of the current status of the HTableMultiplexer.
- HTableMultiplexer.HTableMultiplexerStatus(Map<HRegionLocation, HTableMultiplexer.FlushWorker>) - Constructor for class org.apache.hadoop.hbase.client.HTableMultiplexer.HTableMultiplexerStatus
-
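A sketch of queueing a put through the multiplexer (queue size and row data are illustrative; put returns false when the destination region's local queue is full):
    HTableMultiplexer multiplexer = new HTableMultiplexer(conf, 1000); // per-region queue size
    Put put = new Put(Bytes.toBytes("row1"));
    put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
    boolean queued = multiplexer.put(TableName.valueOf("mytable"), put);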
- I - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for 'I'
- ID_ATRIBUTE - Static variable in class org.apache.hadoop.hbase.client.OperationWithAttributes
-
- ID_SIZE - Static variable in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
- IdentityTableMap - Class in org.apache.hadoop.hbase.mapred
-
Pass the given key and record as-is to reduce
- IdentityTableMap() - Constructor for class org.apache.hadoop.hbase.mapred.IdentityTableMap
-
constructor
- IdentityTableMapper - Class in org.apache.hadoop.hbase.mapreduce
-
Pass the given key and record as-is to the reduce phase.
- IdentityTableMapper() - Constructor for class org.apache.hadoop.hbase.mapreduce.IdentityTableMapper
-
- IdentityTableReduce - Class in org.apache.hadoop.hbase.mapred
-
Write to table each key, record pair
- IdentityTableReduce() - Constructor for class org.apache.hadoop.hbase.mapred.IdentityTableReduce
-
- IdentityTableReducer - Class in org.apache.hadoop.hbase.mapreduce
-
Convenience class that simply writes all values (which must be Put or Delete instances)
passed to it out to the configured HBase table.
- IdentityTableReducer() - Constructor for class org.apache.hadoop.hbase.mapreduce.IdentityTableReducer
-
- idx - Variable in class org.apache.hadoop.hbase.types.StructIterator
-
- ImmutableBytesWritable - Class in org.apache.hadoop.hbase.io
-
A byte sequence that is usable as a key or value.
- ImmutableBytesWritable() - Constructor for class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
Create a zero-size sequence.
- ImmutableBytesWritable(byte[]) - Constructor for class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
Create a ImmutableBytesWritable using the byte array as the initial value.
- ImmutableBytesWritable(ImmutableBytesWritable) - Constructor for class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
Set the new ImmutableBytesWritable to the contents of the passed ibw.
- ImmutableBytesWritable(byte[], int, int) - Constructor for class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
Set the value to a given byte range
- ImmutableBytesWritable.Comparator - Class in org.apache.hadoop.hbase.io
-
A Comparator optimized for ImmutableBytesWritable.
- ImmutableBytesWritable.Comparator() - Constructor for class org.apache.hadoop.hbase.io.ImmutableBytesWritable.Comparator
-
constructor
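A minimal sketch of wrapping a row key for MapReduce output (the row key is illustrative):
    ImmutableBytesWritable key = new ImmutableBytesWritable(Bytes.toBytes("row-1"));
    byte[] backing = key.get();   // underlying byte array
    int length = key.getLength(); // number of valid bytes in the range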
- implies(Permission.Action) - Method in class org.apache.hadoop.hbase.security.access.Permission
-
- Import - Class in org.apache.hadoop.hbase.mapreduce
-
Import data written by Export.
- Import() - Constructor for class org.apache.hadoop.hbase.mapreduce.Import
-
- ImportTsv - Class in org.apache.hadoop.hbase.mapreduce
-
Tool to import data from a TSV file.
- ImportTsv() - Constructor for class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- IN_MEMORY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- includeRegionInSplit(byte[], byte[]) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormatBase
-
Test if the given region is to be included in the InputSplit while
splitting the regions of a table.
- includeRegionInSplit(byte[], byte[]) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Test if the given region is to be included in the InputSplit while splitting
the regions of a table.
- InclusiveStopFilter - Class in org.apache.hadoop.hbase.filter
-
A Filter that stops after the given row.
- InclusiveStopFilter(byte[]) - Constructor for class org.apache.hadoop.hbase.filter.InclusiveStopFilter
-
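A minimal sketch of using the filter to make a scan's stop row inclusive (row keys are illustrative):
    Scan scan = new Scan(Bytes.toBytes("row-aaa"));
    scan.setFilter(new InclusiveStopFilter(Bytes.toBytes("row-mmm"))); // stop after, and include, "row-mmm"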
- IncompatibleFilterException - Exception in org.apache.hadoop.hbase.filter
-
Used to indicate a filter incompatibility
- IncompatibleFilterException() - Constructor for exception org.apache.hadoop.hbase.filter.IncompatibleFilterException
-
constructor
- IncompatibleFilterException(String) - Constructor for exception org.apache.hadoop.hbase.filter.IncompatibleFilterException
-
constructor
- increment(Double) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- Increment - Class in org.apache.hadoop.hbase.client
-
Used to perform Increment operations on a single row.
- Increment(byte[]) - Constructor for class org.apache.hadoop.hbase.client.Increment
-
Create a Increment operation for the specified row.
- Increment(byte[], int, int) - Constructor for class org.apache.hadoop.hbase.client.Increment
-
Create a Increment operation for the specified row.
- Increment(Increment) - Constructor for class org.apache.hadoop.hbase.client.Increment
-
Copy constructor
- increment(Increment) - Method in interface org.apache.hadoop.hbase.client.Table
-
Increments one or more columns within a single row.
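A minimal sketch of a single-row increment (assumes an open Table named table; column names are illustrative):
    Increment inc = new Increment(Bytes.toBytes("row1"));
    inc.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("hits"), 1L);
    Result result = table.increment(inc); // returns the new cell values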
- increment(Increment) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- increment() - Method in class org.apache.hadoop.hbase.util.Counter
-
- incrementBadLineCount(int) - Method in class org.apache.hadoop.hbase.mapreduce.TextSortReducer
-
- incrementBadLineCount(int) - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- incrementBadLineCount(int) - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterTextMapper
-
- incrementBytes(byte[], long) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Bytewise binary increment/decrement of a long contained in a byte array
by the given amount.
- incrementColumnValue(byte[], byte[], byte[], long) - Method in interface org.apache.hadoop.hbase.client.Table
-
- incrementColumnValue(byte[], byte[], byte[], long, Durability) - Method in interface org.apache.hadoop.hbase.client.Table
-
Atomically increments a column value.
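A minimal sketch of the single-column increment shortcut (assumes an open Table named table; names are illustrative):
    long newValue = table.incrementColumnValue(Bytes.toBytes("row1"),
        Bytes.toBytes("cf"), Bytes.toBytes("hits"), 1L, Durability.SYNC_WAL);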
- incrementColumnValue(byte[], byte[], byte[], long) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- incrementColumnValue(byte[], byte[], byte[], long, Durability) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- incrementIv(byte[]) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
- incrementIv(byte[], int) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
- INDEX_KEY_MAGIC - Static variable in class org.apache.hadoop.hbase.HConstants
-
Used as a magic return value while the optimized index key feature is enabled (HBASE-7845)
- indexOf(byte[], byte) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Returns the index of the first appearance of the value target in array.
- indexOf(byte[], byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Returns the start position of the first occurrence of the specified target within array,
or -1 if there is no such occurrence.
- inferBoundaries(TreeMap<byte[], Integer>) - Static method in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
- init(String) - Method in interface org.apache.hadoop.hbase.io.crypto.KeyProvider
-
Initialize the key provider
- init(String) - Method in class org.apache.hadoop.hbase.io.crypto.KeyStoreKeyProvider
-
- init() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
Build the scanner.
- init() - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
Build the scanner.
- init() - Method in interface org.apache.hadoop.hbase.mapreduce.VisibilityExpressionResolver
-
Gives the implementation a chance to perform its initialization.
- init(RegionCoprocessorEnvironment) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
System calls this after opening of regions.
- initCredentials(JobConf) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
- initCredentials(Job) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
- initCredentialsForCluster(Job, String) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Obtain an authentication token, for the specified cluster, on behalf of the current user
and add it to the credentials for the given map reduce job.
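A minimal sketch of attaching an HBase delegation token before job submission (assumes a configured Job named job):
    TableMapReduceUtil.initCredentials(job); // obtains a token for the current user on the job's cluster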
- initialize(HBaseProtos.EmptyMsg) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- initialize(JobConf) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormat
-
- initialize(JobConf) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
Handle subclass specific set up.
- initialize(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
- initialize(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Handle subclass specific set up.
- initialize(InputSplit, TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
Initializes the reader.
- initialize(InputSplit, TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
Build the scanner.
- initializeTable(Connection, TableName) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
Allows subclasses to initialize the table information.
- initializeTable(Connection, TableName) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Allows subclasses to initialize the table information.
- initJob(String, String, String, Class<? extends TableMap>, JobConf) - Static method in class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
Use this before submitting a TableMap job.
- initJob(String, String, Class<? extends TableMap>, JobConf) - Static method in class org.apache.hadoop.hbase.mapred.IdentityTableMap
-
Use this before submitting a TableMap job.
- initJob(String, Scan, String, Class<? extends TableMapper>, Job) - Static method in class org.apache.hadoop.hbase.mapreduce.GroupingTableMapper
-
Use this before submitting a TableMap job.
- initJob(String, Scan, Class<? extends TableMapper>, Job) - Static method in class org.apache.hadoop.hbase.mapreduce.IdentityTableMapper
-
Use this before submitting a TableMap job.
- initMultiTableSnapshotMapperJob(Map<String, Collection<Scan>>, Class<? extends TableMap>, Class<?>, Class<?>, JobConf, boolean, Path) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Sets up the job for reading from one or more multiple table snapshots, with one or more scans
per snapshot.
- initMultiTableSnapshotMapperJob(Map<String, Collection<Scan>>, Class<? extends TableMapper>, Class<?>, Class<?>, Job, boolean, Path) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Sets up the job for reading from one or more table snapshots, with one or more scans
per snapshot.
- initTableMapJob(String, String, Class<? extends TableMap>, Class<?>, Class<?>, JobConf) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Use this before submitting a TableMap job.
- initTableMapJob(String, String, Class<? extends TableMap>, Class<?>, Class<?>, JobConf, boolean) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
- initTableMapJob(String, String, Class<? extends TableMap>, Class<?>, Class<?>, JobConf, boolean, Class<? extends InputFormat>) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Use this before submitting a TableMap job.
- initTableMapperJob(String, Scan, Class<? extends TableMapper>, Class<?>, Class<?>, Job) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableMap job.
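A minimal sketch of wiring a table-scanning map job (MyMapper is a hypothetical TableMapper subclass; assumes a Job named job):
    Scan scan = new Scan();
    scan.setCaching(500);        // fetch rows in larger batches for MapReduce
    scan.setCacheBlocks(false);  // avoid polluting the block cache with a full scan
    TableMapReduceUtil.initTableMapperJob("mytable", scan, MyMapper.class,
        ImmutableBytesWritable.class, Result.class, job);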
- initTableMapperJob(TableName, Scan, Class<? extends TableMapper>, Class<?>, Class<?>, Job) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableMap job.
- initTableMapperJob(byte[], Scan, Class<? extends TableMapper>, Class<?>, Class<?>, Job) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableMap job.
- initTableMapperJob(String, Scan, Class<? extends TableMapper>, Class<?>, Class<?>, Job, boolean, Class<? extends InputFormat>) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableMap job.
- initTableMapperJob(String, Scan, Class<? extends TableMapper>, Class<?>, Class<?>, Job, boolean, boolean, Class<? extends InputFormat>) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableMap job.
- initTableMapperJob(byte[], Scan, Class<? extends TableMapper>, Class<?>, Class<?>, Job, boolean, Class<? extends InputFormat>) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableMap job.
- initTableMapperJob(byte[], Scan, Class<? extends TableMapper>, Class<?>, Class<?>, Job, boolean) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableMap job.
- initTableMapperJob(String, Scan, Class<? extends TableMapper>, Class<?>, Class<?>, Job, boolean) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableMap job.
- initTableMapperJob(List<Scan>, Class<? extends TableMapper>, Class<?>, Class<?>, Job) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a Multi TableMap job.
- initTableMapperJob(List<Scan>, Class<? extends TableMapper>, Class<?>, Class<?>, Job, boolean) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a Multi TableMap job.
- initTableMapperJob(List<Scan>, Class<? extends TableMapper>, Class<?>, Class<?>, Job, boolean, boolean) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a Multi TableMap job.
- initTableReduceJob(String, Class<? extends TableReduce>, JobConf) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Use this before submitting a TableReduce job.
- initTableReduceJob(String, Class<? extends TableReduce>, JobConf, Class) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Use this before submitting a TableReduce job.
- initTableReduceJob(String, Class<? extends TableReduce>, JobConf, Class, boolean) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Use this before submitting a TableReduce job.
- initTableReducerJob(String, Class<? extends TableReducer>, Job) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableReduce job.
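A minimal sketch of wiring the reduce side to write back to a table (the output table name is illustrative; assumes the same Job):
    TableMapReduceUtil.initTableReducerJob("output-table", IdentityTableReducer.class, job);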
- initTableReducerJob(String, Class<? extends TableReducer>, Job, Class) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableReduce job.
- initTableReducerJob(String, Class<? extends TableReducer>, Job, Class, String, String, String) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableReduce job.
- initTableReducerJob(String, Class<? extends TableReducer>, Job, Class, String, String, String, boolean) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Use this before submitting a TableReduce job.
- initTableSnapshotMapJob(String, String, Class<? extends TableMap>, Class<?>, Class<?>, JobConf, boolean, Path) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Sets up the job for reading from a table snapshot.
- initTableSnapshotMapperJob(String, Scan, Class<? extends TableMapper>, Class<?>, Class<?>, Job, boolean, Path) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Sets up the job for reading from a table snapshot.
- INPUT_AUTOBALANCE_MAXSKEWRATIO - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Specify the ratio for data skew in M/R jobs; it is used together with the
hbase.mapreduce.input.autobalance property.
- INPUT_TABLE - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Job parameter that specifies the input table.
- inputStreamFromByteRange(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.types.PBType
-
- instantiateFilter(Configuration) - Static method in class org.apache.hadoop.hbase.mapreduce.Import
-
Create a Filter to apply to all incoming keys (KeyValues) to optionally exclude them
from the job output.
- InterfaceAudience - Class in org.apache.hadoop.hbase.classification
-
Annotation to inform users of a package, class or method's intended audience.
- InterfaceStability - Class in org.apache.hadoop.hbase.classification
-
Annotation to inform users of how much to rely on a particular package,
class or method not changing over time.
- InterfaceStability() - Constructor for class org.apache.hadoop.hbase.classification.InterfaceStability
-
- intFitsIn(int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Check how many bytes are required to store the value.
- InvalidFamilyOperationException - Exception in org.apache.hadoop.hbase
-
Thrown if a table schema modification is requested but
is made for an invalid family name.
- InvalidFamilyOperationException() - Constructor for exception org.apache.hadoop.hbase.InvalidFamilyOperationException
-
default constructor
- InvalidFamilyOperationException(String) - Constructor for exception org.apache.hadoop.hbase.InvalidFamilyOperationException
-
Constructor
- InvalidFamilyOperationException(Exception) - Constructor for exception org.apache.hadoop.hbase.InvalidFamilyOperationException
-
Constructor taking another exception.
- InvalidLabelException - Exception in org.apache.hadoop.hbase.security.visibility
-
- InvalidLabelException(String) - Constructor for exception org.apache.hadoop.hbase.security.visibility.InvalidLabelException
-
- InvalidRowFilterException - Exception in org.apache.hadoop.hbase.filter
-
Used to indicate an invalid RowFilter.
- InvalidRowFilterException() - Constructor for exception org.apache.hadoop.hbase.filter.InvalidRowFilterException
-
constructor
- InvalidRowFilterException(String) - Constructor for exception org.apache.hadoop.hbase.filter.InvalidRowFilterException
-
constructor
- IS_META - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Used by rest interface to access this metadata
attribute which denotes if it is a catalog table, either hbase:meta or -ROOT-
- IS_MOB - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- IS_MOB_BYTES - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- IS_ROOT - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Used by rest interface to access this metadata
attribute which denotes if the table is a -ROOT- region or not
- isAborted() - Method in interface org.apache.hadoop.hbase.client.Admin
-
- isAccessControllerRunning(Connection) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
- isAllTime() - Method in class org.apache.hadoop.hbase.io.TimeRange
-
Check if it is for all time
- isAsyncPrefetch() - Method in class org.apache.hadoop.hbase.client.Scan
-
- isAuthorizationEnabled(Connection) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
Return true if authorization is supported and enabled
- isAutoFlush() - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- isBalancerEnabled() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Query the current state of the balancer
- isBalancerOn() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- isBlobCopy(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses BlobCopy encoding, false otherwise.
- isBlobVar(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses BlobVar encoding, false otherwise.
- isBlockCacheEnabled() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isCacheBloomsOnWrite() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isCacheDataInL1() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isCacheDataOnWrite() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isCacheIndexesOnWrite() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isCatalogJanitorEnabled() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Query on the catalog janitor state (Enabled/Disabled?)
- isCellAuthorizationEnabled(Connection) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
Return true if cell authorization is supported and enabled
- isCellVisibilityEnabled(Connection) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
Return true if cell visibility features are supported and enabled
- isCheckExistenceOnly() - Method in class org.apache.hadoop.hbase.client.Get
-
- isClosed() - Method in interface org.apache.hadoop.hbase.client.Connection
-
Returns whether the connection is closed or not.
- isClosed() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
- isClosestRowBefore() - Method in class org.apache.hadoop.hbase.client.Get
-
Deprecated.
since 2.0.0 and will be removed in 3.0.0
- isCompactionEnabled() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Check if the compaction enable flag of the table is true.
- isCompressTags() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isCorrectEncoder(DataBlockEncoder, short) - Static method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
Check if given encoder has this id.
- isDeadServer(ServerName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use thru HConnection
- isDelete(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- isDelete(byte) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- isDeleteColumnOrFamily(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- isDeleteColumns(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- isDeleteColumnVersion(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- isDeleteFamily(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- isDeleteFamilyVersion(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- isDeleteType(Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- isDoNotRetry() - Method in exception org.apache.hadoop.hbase.ipc.RemoteWithExtrasException
-
- isEmpty() - Method in class org.apache.hadoop.hbase.client.Mutation
-
Method to check if the familyMap is empty
- isEmpty() - Method in class org.apache.hadoop.hbase.client.Result
-
Check if the underlying Cell[] is empty or not
- isEmpty() - Method in class org.apache.hadoop.hbase.rest.client.Cluster
-
- isEmpty() - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
- isEncodedValue(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Returns true when src appears to be positioned at an encoded value, false otherwise.
- isEvictBlocksOnClose() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isFailed() - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- isFamilyEssential(byte[]) - Method in class org.apache.hadoop.hbase.filter.Filter
-
Check that given column family is essential for filter to check row.
- isFamilyEssential(byte[]) - Method in class org.apache.hadoop.hbase.filter.FilterList
-
- isFamilyEssential(byte[]) - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
The only CF this filter needs is given column family.
- isFamilyEssential(byte[]) - Method in class org.apache.hadoop.hbase.filter.SkipFilter
-
- isFamilyEssential(byte[]) - Method in class org.apache.hadoop.hbase.filter.WhileMatchFilter
-
- isFixedFloat32(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses fixed-width Float32 encoding, false otherwise.
- isFixedFloat64(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses fixed-width Float64 encoding, false otherwise.
- isFixedInt32(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses fixed-width Int32 encoding, false otherwise.
- isFixedInt64(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses fixed-width Int64 encoding, false otherwise.
- isFullServerName(String) - Static method in class org.apache.hadoop.hbase.ServerName
-
- isGetScan() - Method in class org.apache.hadoop.hbase.client.Scan
-
- isHBaseSecurityEnabled(Configuration) - Static method in class org.apache.hadoop.hbase.security.User
-
Returns whether or not secure authentication is enabled for HBase.
- isInMemory() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isLegalFamilyName(byte[]) - Static method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isLegalFullyQualifiedTableName(byte[]) - Static method in class org.apache.hadoop.hbase.TableName
-
Check that the passed byte array, "tableName", is a legal user-space table name.
- isLegalNamespaceName(byte[]) - Static method in class org.apache.hadoop.hbase.TableName
-
- isLegalNamespaceName(byte[], int, int) - Static method in class org.apache.hadoop.hbase.TableName
-
Valid namespace characters are [a-zA-Z_0-9]
- isLegalTableQualifierName(byte[]) - Static method in class org.apache.hadoop.hbase.TableName
-
- isLegalTableQualifierName(byte[], boolean) - Static method in class org.apache.hadoop.hbase.TableName
-
- isLegalTableQualifierName(byte[], int, int) - Static method in class org.apache.hadoop.hbase.TableName
-
Qualifier names can only contain 'word' characters [a-zA-Z_0-9] or '_', '.' or '-'.
- isLegalTableQualifierName(byte[], int, int, boolean) - Static method in class org.apache.hadoop.hbase.TableName
-
- isLocal(Configuration) - Static method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
- isMasterRunning() - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use thru HConnection
- isMaxColumnInclusive() - Method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- isMetaRegion() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- isMetaRegion() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Checks if this table is the hbase:meta region.
- isMetaTable() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- isMetaTable() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Checks if the table is a hbase:meta table
- isMinColumnInclusive() - Method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- isMobEnabled() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Gets whether the mob is enabled for the family.
- isNormalizationEnabled() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Check if normalization enable flag of the table is true.
- isNormalizerEnabled() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Query the current state of the region normalizer
- isNull() - Method in class org.apache.hadoop.hbase.quotas.QuotaFilter
-
- isNull(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src is null, false otherwise.
- isNullable() - Method in interface org.apache.hadoop.hbase.types.DataType
-
Indicates whether this instance supports encoding null values.
- isNullable() - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.OrderedBytesBase
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.OrderedFloat32
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.OrderedFloat64
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.OrderedInt16
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.OrderedInt32
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.OrderedInt64
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.OrderedInt8
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.PBType
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.RawByte
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.RawBytes
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.RawDouble
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.RawFloat
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.RawInteger
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.RawLong
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.RawShort
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.RawString
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.Struct
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.Union2
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.Union3
-
- isNullable() - Method in class org.apache.hadoop.hbase.types.Union4
-
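A minimal sketch of inspecting these properties on a concrete DataType (OrderedInt32.ASCENDING is one of the provided singleton instances):
    boolean orderPreserving = OrderedInt32.ASCENDING.isOrderPreserving(); // true for the Ordered* types
    boolean nullable = OrderedInt32.ASCENDING.isNullable();
    boolean skippable = OrderedInt32.ASCENDING.isSkippable();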
- isNumeric(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses Numeric encoding, false otherwise.
- isNumericInfinite(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses Numeric encoding and is Infinite, false otherwise.
- isNumericNaN(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses Numeric encoding and is NaN, false otherwise.
- isNumericZero(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses Numeric encoding and is 0, false otherwise.
- isOffline() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- IsolationLevel - Enum in org.apache.hadoop.hbase.client
-
Specify Isolation levels in Scan operations.
- isOrderPreserving() - Method in interface org.apache.hadoop.hbase.types.DataType
-
Indicates whether this instance writes encoded byte[]'s which preserve the natural sort order of the unencoded value.
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.OrderedBytesBase
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.PBType
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.RawByte
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.RawBytes
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.RawDouble
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.RawFloat
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.RawInteger
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.RawLong
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.RawShort
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.RawString
-
- isOrderPreserving - Variable in class org.apache.hadoop.hbase.types.Struct
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.Struct
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.Union2
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.Union3
-
- isOrderPreserving() - Method in class org.apache.hadoop.hbase.types.Union4
-
- isPartial() - Method in class org.apache.hadoop.hbase.client.Result
-
Whether or not the result is a partial result.
- isPrefetchBlocksOnOpen() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- isProcedureFinished(String, String, Map<String, String>) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Check the current state of the specified procedure.
- isProcedureOwner(ProcedureInfo, User) - Static method in class org.apache.hadoop.hbase.ProcedureInfo
-
Check if the user is this procedure's owner
- isQuoteUnescaped(byte[], int) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Returns a boolean indicating whether the quote was escaped or not
- isRaw() - Method in class org.apache.hadoop.hbase.client.Scan
-
- isReadOnly() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Check if the readOnly flag of the table is set.
- isRecoverySupported() - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputCommitter
-
- isRemote() - Method in exception org.apache.hadoop.hbase.errorhandling.ForeignException
-
The cause of a ForeignException can be an exception that was generated on a local in-process
thread, or a thread from a 'remote' separate process.
- isReturnResults() - Method in class org.apache.hadoop.hbase.client.Append
-
- isReturnResults() - Method in class org.apache.hadoop.hbase.client.Increment
-
- isReversed() - Method in class org.apache.hadoop.hbase.client.Scan
-
Get whether this scan is a reversed one.
- isReversed() - Method in class org.apache.hadoop.hbase.filter.Filter
-
- isRootRegion() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Check if the descriptor represents a -ROOT- region.
- isSameHostnameAndPort(ServerName, ServerName) - Static method in class org.apache.hadoop.hbase.ServerName
-
- isScanMetricsEnabled() - Method in class org.apache.hadoop.hbase.client.Scan
-
- isSecurityEnabled() - Static method in class org.apache.hadoop.hbase.security.User
-
Returns whether or not Kerberos authentication is configured for Hadoop.
- isShowConfInServlet() - Static method in class org.apache.hadoop.hbase.HBaseConfiguration
-
- isSkippable() - Method in interface org.apache.hadoop.hbase.types.DataType
-
Indicates whether this instance is able to skip over its encoded value.
- isSkippable() - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.OrderedBlob
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.OrderedBytesBase
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.PBType
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.RawByte
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.RawBytes
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.RawDouble
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.RawFloat
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.RawInteger
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.RawLong
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.RawShort
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.RawString
-
- isSkippable - Variable in class org.apache.hadoop.hbase.types.Struct
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.Struct
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.Union2
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.Union3
-
- isSkippable() - Method in class org.apache.hadoop.hbase.types.Union4
-
- isSmall() - Method in class org.apache.hadoop.hbase.client.Scan
-
Get whether this scan is a small scan
- isSnapshotFinished(HBaseProtos.SnapshotDescription) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Check the current state of the passed snapshot.
- isSorted(Collection<byte[]>) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- isSplit() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- isSplitParent() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- isStale() - Method in class org.apache.hadoop.hbase.client.Result
-
Whether or not the results are coming from possibly stale data.
- isStartRowInclusive() - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
- isStopRowInclusive() - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
- isSystemTable() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- isSystemTable() - Method in class org.apache.hadoop.hbase.TableName
-
- isTableAvailable(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
- isTableAvailable(TableName, byte[][]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Use this API to check whether the table has been created with the split keys
that were specified when the table was created.
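A minimal sketch of the split-key availability check (assumes an open Admin named admin; the split keys must match those used at table creation):
    byte[][] splitKeys = { Bytes.toBytes("m") };
    boolean ready = admin.isTableAvailable(TableName.valueOf("mytable"), splitKeys);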
- isTableAvailable(TableName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
- isTableAvailable(byte[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
- isTableAvailable(TableName, byte[][]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- isTableAvailable(byte[], byte[][]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- isTableAvailable(String) - Method in class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
- isTableAvailable(byte[]) - Method in class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
- isTableDisabled(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
- isTableDisabled(TableName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
- isTableDisabled(byte[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
- isTableEnabled(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
- isTableEnabled(TableName) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
A table that isTableEnabled == false and isTableDisabled == false
is possible.
- isTableEnabled(byte[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
- isText(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Return true when the next encoded value in src uses Text encoding, false otherwise.
- isValid() - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
- iterateOnSplits(byte[], byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Iterate over keys within the passed range, splitting at an [a,b) boundary.
- iterateOnSplits(byte[], byte[], boolean, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Iterate over keys within the passed range.
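A minimal sketch of iterating over generated split boundaries (bounds and split count are illustrative):
    for (byte[] boundary : Bytes.iterateOnSplits(Bytes.toBytes("a"), Bytes.toBytes("z"), 10)) {
      // each boundary lies between the start and stop keys
    }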
- iterator() - Method in class org.apache.hadoop.hbase.quotas.QuotaRetriever
-
- iterator(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.Struct
-
Retrieve an Iterator over the values encoded in src.
- iterator() - Method in class org.apache.hadoop.hbase.util.PairOfSameType
-
- main(String[]) - Static method in class org.apache.hadoop.hbase.HBaseConfiguration
-
For debugging.
- main(String[]) - Static method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
Test things basically work.
- main(String[]) - Static method in class org.apache.hadoop.hbase.mapred.RowCounter
-
- main(String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.CellCounter
-
Main entry point.
- main(String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.CopyTable
-
Main entry point.
- main(String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.Export
-
Main entry point.
- main(String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.Import
-
Main entry point.
- main(String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- main(String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
- main(String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.RowCounter
-
Main entry point.
- main(String[]) - Static method in class org.apache.hadoop.hbase.mapreduce.WALPlayer
-
Main entry point.
- main(String[]) - Static method in class org.apache.hadoop.hbase.mob.mapreduce.Sweeper
-
- main(String[]) - Static method in class org.apache.hadoop.hbase.snapshot.ExportSnapshot
-
- main(String[]) - Static method in class org.apache.hadoop.hbase.snapshot.SnapshotInfo
-
- main(String[]) - Static method in class org.apache.hadoop.hbase.util.Base64
-
Main program.
- main(String[]) - Static method in class org.apache.hadoop.hbase.util.VersionInfo
-
- MAJOR_COMPACTION_PERIOD - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for how often a region should perform a major compaction
- majorCompact(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Major compact a table.
- majorCompact(TableName, byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Major compact a column family within a table.
- majorCompact(TableName, Admin.CompactType) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Major compact a table.
- majorCompact(TableName, byte[], Admin.CompactType) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Major compact a column family within a table.
- majorCompactRegion(byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Major compact a table or an individual region.
- majorCompactRegion(byte[], byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Major compact a column family within region.
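A minimal sketch of requesting major compactions through Admin (the table and column-family names are hypothetical; compaction requests are asynchronous and return immediately):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.util.Bytes;

    public class MajorCompactionExample {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin()) {
          TableName table = TableName.valueOf("example_table");  // hypothetical table name
          // Request a major compaction of the whole table.
          admin.majorCompact(table);
          // Request a major compaction of a single column family within the table.
          admin.majorCompact(table, Bytes.toBytes("cf"));
        }
      }
    }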
- map(ImmutableBytesWritable, Result, OutputCollector<ImmutableBytesWritable, Result>, Reporter) - Method in class org.apache.hadoop.hbase.mapred.GroupingTableMap
-
Extract the grouping columns from value to construct a new key.
- map(ImmutableBytesWritable, Result, OutputCollector<ImmutableBytesWritable, Result>, Reporter) - Method in class org.apache.hadoop.hbase.mapred.IdentityTableMap
-
Pass the key, value to reduce
- map(ImmutableBytesWritable, Result, Mapper<ImmutableBytesWritable, Result, ImmutableBytesWritable, Result>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.GroupingTableMapper
-
Extract the grouping columns from value to construct a new key.
- map(ImmutableBytesWritable, Result, Mapper<ImmutableBytesWritable, Result, ImmutableBytesWritable, Result>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.IdentityTableMapper
-
Pass the key, value to reduce.
- map(LongWritable, Text, Mapper<LongWritable, Text, ImmutableBytesWritable, Put>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
Convert a line of TSV text into an HBase table row.
- map(LongWritable, Text, Mapper<LongWritable, Text, ImmutableBytesWritable, Text>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterTextMapper
-
Convert a line of TSV text into an HBase table row.
- mapKey(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- mapKey(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- MAPPER_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- MAPREDUCE_INPUT_AUTOBALANCE - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Specify if we enable auto-balance for input in M/R jobs.
- MASK_FOR_LOWER_INT_IN_LONG - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Mask to apply to a long to reveal the lower int only.
- MASTER_HANDLER_COUNT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- MASTER_IMPL - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for what master implementation to use.
- MASTER_INFO_PORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for master web API port
- MASTER_PORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for port master listens on.
- MASTER_TYPE_BACKUP - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for the master type being backup (waits for primary to go inactive).
- MasterNotRunningException - Exception in org.apache.hadoop.hbase
-
Thrown if the master is not running
- MasterNotRunningException() - Constructor for exception org.apache.hadoop.hbase.MasterNotRunningException
-
default constructor
- MasterNotRunningException(String) - Constructor for exception org.apache.hadoop.hbase.MasterNotRunningException
-
Constructor
- MasterNotRunningException(Exception) - Constructor for exception org.apache.hadoop.hbase.MasterNotRunningException
-
Constructor taking another exception.
- MasterNotRunningException(String, Exception) - Constructor for exception org.apache.hadoop.hbase.MasterNotRunningException
-
- matchedColumn - Variable in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- matchingColumn(Cell, byte[], byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingColumn(Cell, byte[], int, int, byte[], int, int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingColumn(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingFamily(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingFamily(Cell, byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingFamily(Cell, byte[], int, int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingQualifier(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingQualifier(Cell, byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Finds if the qualifier part of the cell and the KV serialized byte[] are equal
- matchingQualifier(Cell, byte[], int, int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Finds if the qualifier part of the cell and the KV serialized byte[] are equal
- matchingRow(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingRow(Cell, byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingRow(Cell, byte[], int, int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingRowColumn(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Compares the row and column of two keyvalues for equality
- matchingRows(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Compares the row of two keyvalues for equality
- matchingTimestamp(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingType(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingValue(Cell, Cell) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchingValue(Cell, byte[]) - Static method in class org.apache.hadoop.hbase.CellUtil
-
- matchVisibility(List<Tag>, Byte, List<Tag>, Byte) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
The system uses this for deciding whether a Cell can be deleted, by matching the visibility expression in the Delete mutation against the cell under consideration.
- MAX_BACKOFF_KEY - Static variable in class org.apache.hadoop.hbase.client.backoff.ExponentialClientBackoffPolicy
-
- MAX_FILES_PER_REGION_PER_FAMILY - Static variable in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
- MAX_FILESIZE - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Used by HBase Shell interface to access this metadata
attribute which denotes the maximum size of the store file after which
a region split occurs
- MAX_PRECISION - Static variable in class org.apache.hadoop.hbase.util.OrderedBytes
-
Max precision guaranteed to fit into a long.
- MAX_ROW_LENGTH - Static variable in class org.apache.hadoop.hbase.HConstants
-
Max length a row can have because of the limitation in TFile.
- maxColumn - Variable in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- maxColumnInclusive - Variable in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- MAXIMUM_VALUE_LENGTH - Static variable in class org.apache.hadoop.hbase.HConstants
-
Maximum value length, enforced on KeyValue construction
- maxKeyValueSize(int) - Method in class org.apache.hadoop.hbase.client.BufferedMutatorParams
-
Override the maximum key-value size specified by the provided Connection's Configuration instance, via the configuration key hbase.client.keyvalue.maxsize.
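A small sketch of overriding this limit when building a BufferedMutator (the table name, column family, and size value are hypothetical):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.BufferedMutator;
    import org.apache.hadoop.hbase.client.BufferedMutatorParams;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class BufferedMutatorExample {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf)) {
          BufferedMutatorParams params = new BufferedMutatorParams(TableName.valueOf("example_table"))
              .maxKeyValueSize(2 * 1024 * 1024);  // cap individual cells at 2 MB instead of the configured default
          try (BufferedMutator mutator = connection.getBufferedMutator(params)) {
            Put put = new Put(Bytes.toBytes("row-1"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
            mutator.mutate(put);  // buffered; flushed when the write buffer fills or on close()
          }
        }
      }
    }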
- mayHaveClusterIssues() - Method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- MD5_HEX_LENGTH - Static variable in class org.apache.hadoop.hbase.HRegionInfo
-
- MD5Hash - Class in org.apache.hadoop.hbase.util
-
Utility class for MD5. An MD5 hash produces a 128-bit digest.
- MD5Hash() - Constructor for class org.apache.hadoop.hbase.util.MD5Hash
-
- MEMSTORE_FLUSHSIZE - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Used by HBase Shell interface to access this metadata
attribute which represents the maximum size of the memstore after which
its contents are flushed onto the disk
- merge(Configuration, Configuration) - Static method in class org.apache.hadoop.hbase.HBaseConfiguration
-
Merge two configurations.
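A minimal sketch of merging an override Configuration into a destination Configuration (the property used here is just an example):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    public class ConfigurationMergeExample {
      public static void main(String[] args) {
        Configuration dest = HBaseConfiguration.create();
        Configuration overrides = new Configuration(false);
        overrides.set("hbase.client.retries.number", "5");  // example override
        // Copy every entry from 'overrides' into 'dest', replacing existing values.
        HBaseConfiguration.merge(dest, overrides);
        System.out.println(dest.get("hbase.client.retries.number"));
      }
    }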
- MERGEA_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The lower-half merge region column qualifier
- MERGEB_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The upper-half merge region column qualifier
- MergeRegionException - Exception in org.apache.hadoop.hbase.exceptions
-
Thrown when something is wrong in trying to merge two regions.
- MergeRegionException() - Constructor for exception org.apache.hadoop.hbase.exceptions.MergeRegionException
-
default constructor
- MergeRegionException(String) - Constructor for exception org.apache.hadoop.hbase.exceptions.MergeRegionException
-
Constructor
- mergeRegions(byte[], byte[], boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Merge two regions.
- META_REPLICAS_NUM - Static variable in class org.apache.hadoop.hbase.HConstants
-
- META_ROW_DELIMITER - Static variable in class org.apache.hadoop.hbase.HConstants
-
delimiter used between portions of a region name
- META_TABLE_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
Deprecated.
For upgrades of 0.94 to 0.96
- META_TABLE_NAME - Static variable in class org.apache.hadoop.hbase.TableName
-
The hbase:meta table's name.
- META_TABLEDESC - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
- META_VERSION - Static variable in class org.apache.hadoop.hbase.HConstants
-
The current version of the meta table.
- META_VERSION_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The meta table version column qualifier.
- METADATA - Static variable in class org.apache.hadoop.hbase.HConstants
-
- METRICS_RIT_STUCK_WARNING_THRESHOLD - Static variable in class org.apache.hadoop.hbase.HConstants
-
Region in Transition metrics threshold time
- MIGRATION_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
Any artifacts left from migration can be moved here
- MIMETYPE_BINARY - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- MIMETYPE_HTML - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- MIMETYPE_JSON - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- MIMETYPE_PROTOBUF - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- MIMETYPE_PROTOBUF_IETF - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- MIMETYPE_TEXT - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- MIMETYPE_XML - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- MIN_KEEP_SEQID_PERIOD - Static variable in class org.apache.hadoop.hbase.HConstants
-
Keep MVCC values in HFiles for at least 5 days.
- MIN_VERSIONS - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- minColumn - Variable in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- minColumnInclusive - Variable in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- MiniZooKeeperCluster - Class in org.apache.hadoop.hbase.zookeeper
-
TODO: Most of the code in this class is ripped from ZooKeeper tests.
- MiniZooKeeperCluster() - Constructor for class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- MiniZooKeeperCluster(Configuration) - Constructor for class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- MINUS_SIGN - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII for Minus Sign
- MINUTE_IN_SECONDS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- MOB_CACHE_BLOCKS - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_CACHE_EVICT_PERIOD - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_CACHE_EVICT_REMAIN_RATIO - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_CLEANER_PERIOD - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_COMPACTION_BATCH_SIZE - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
The max number of mob files allowed in a batch of the mob compaction.
- MOB_COMPACTION_CHORE_PERIOD - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
The period that MobCompactionChore runs.
- MOB_COMPACTION_MERGEABLE_THRESHOLD - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
If the size of a mob file is less than this value, it's regarded as a small file and needs to
be merged in mob compaction.
- MOB_COMPACTION_THREADS_MAX - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
The max number of threads used in MobCompactor.
- MOB_COMPACTOR_CLASS_KEY - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_DELFILE_MAX_COUNT - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
The max number of del files that is allowed in the mob file compaction.
- MOB_DIR_NAME - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_FILE_CACHE_SIZE_KEY - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_REF_TAG - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_REGION_NAME - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_REGION_NAME_BYTES - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_SCAN_RAW - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_SCAN_REF_ONLY - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_SWEEP_TOOL_COMPACTION_MEMSTORE_FLUSH_SIZE - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_SWEEP_TOOL_COMPACTION_MERGEABLE_SIZE - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_SWEEP_TOOL_COMPACTION_RATIO - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_SWEEP_TOOL_COMPACTION_START_DATE - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_SWEEP_TOOL_COMPACTION_TEMP_DIR_NAME - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_TABLE_LOCK_SUFFIX - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- MOB_THRESHOLD - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- MOB_THRESHOLD_BYTES - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- MobConstants - Class in org.apache.hadoop.hbase.mob
-
The constants used in mob.
- modifyColumn(TableName, HColumnDescriptor) - Method in interface org.apache.hadoop.hbase.client.Admin
-
- modifyColumnFamily(TableName, HColumnDescriptor) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Modify an existing column family on a table.
- modifyFamily(HColumnDescriptor) - Method in class org.apache.hadoop.hbase.client.UnmodifyableHTableDescriptor
-
- modifyFamily(HColumnDescriptor) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Modifies the existing column family.
- modifyNamespace(NamespaceDescriptor) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Modify an existing namespace
- modifyTable(TableName, HTableDescriptor) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Modify an existing table, more IRB friendly version.
- move(byte[], byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Move the region r to dest.
- moveBufferToStream(OutputStream, ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copy the data to the output stream and update position in buffer.
- multiple(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Create a byte array consisting of the given bytes repeated the specified number of times.
- MultipleColumnPrefixFilter - Class in org.apache.hadoop.hbase.filter
-
This filter is used for selecting only those keys with columns that match a particular prefix.
- MultipleColumnPrefixFilter(byte[][]) - Constructor for class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
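A short sketch of using this filter on a Scan (the qualifier prefixes are hypothetical):

    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter;
    import org.apache.hadoop.hbase.util.Bytes;

    public class MultiplePrefixScanExample {
      public static Scan buildScan() {
        // Only columns whose qualifiers start with "stat_" or "meta_" will be returned.
        byte[][] prefixes = { Bytes.toBytes("stat_"), Bytes.toBytes("meta_") };
        Scan scan = new Scan();
        scan.setFilter(new MultipleColumnPrefixFilter(prefixes));
        return scan;
      }
    }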
- multiply(Double, Double) - Method in class org.apache.hadoop.hbase.client.coprocessor.DoubleColumnInterpreter
-
- MultiRowRangeFilter - Class in org.apache.hadoop.hbase.filter
-
Filter to support scan multiple row key ranges.
- MultiRowRangeFilter(List<MultiRowRangeFilter.RowRange>) - Constructor for class org.apache.hadoop.hbase.filter.MultiRowRangeFilter
-
- MultiRowRangeFilter.RowRange - Class in org.apache.hadoop.hbase.filter
-
- MultiRowRangeFilter.RowRange() - Constructor for class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
- MultiRowRangeFilter.RowRange(String, boolean, String, boolean) - Constructor for class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
If the startRow is empty or null, it is set to HConstants.EMPTY_BYTE_ARRAY, meaning the range begins at the start row of the table.
- MultiRowRangeFilter.RowRange(byte[], boolean, byte[], boolean) - Constructor for class org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange
-
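A short sketch of scanning two disjoint row-key ranges with this filter (the row keys are hypothetical):

    import java.io.IOException;
    import java.util.Arrays;
    import java.util.List;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.MultiRowRangeFilter;
    import org.apache.hadoop.hbase.filter.MultiRowRangeFilter.RowRange;
    import org.apache.hadoop.hbase.util.Bytes;

    public class MultiRowRangeScanExample {
      public static Scan buildScan() throws IOException {
        // Two disjoint row-key ranges scanned in a single pass.
        List<RowRange> ranges = Arrays.asList(
            new RowRange(Bytes.toBytes("row-010"), true, Bytes.toBytes("row-020"), false),
            new RowRange(Bytes.toBytes("row-100"), true, Bytes.toBytes("row-200"), false));
        Scan scan = new Scan();
        scan.setFilter(new MultiRowRangeFilter(ranges));
        return scan;
      }
    }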
- MultiTableInputFormat - Class in org.apache.hadoop.hbase.mapreduce
-
Convert HBase tabular data from multiple scanners into a format that
is consumable by Map/Reduce.
- MultiTableInputFormat() - Constructor for class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormat
-
- MultiTableInputFormatBase - Class in org.apache.hadoop.hbase.mapreduce
-
- MultiTableInputFormatBase() - Constructor for class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormatBase
-
- MultiTableOutputFormat - Class in org.apache.hadoop.hbase.mapreduce
-
Hadoop output format that writes to one or more HBase tables.
- MultiTableOutputFormat() - Constructor for class org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat
-
- MultiTableSnapshotInputFormat - Class in org.apache.hadoop.hbase.mapred
-
MultiTableSnapshotInputFormat generalizes TableSnapshotInputFormat, allowing a MapReduce job to run over one or more table snapshots, with one or more scans configured for each.
- MultiTableSnapshotInputFormat() - Constructor for class org.apache.hadoop.hbase.mapred.MultiTableSnapshotInputFormat
-
- MultiTableSnapshotInputFormat - Class in org.apache.hadoop.hbase.mapreduce
-
MultiTableSnapshotInputFormat generalizes
TableSnapshotInputFormat
allowing a MapReduce job to run over one or more table snapshots, with one or more scans
configured for each.
- MultiTableSnapshotInputFormat() - Constructor for class org.apache.hadoop.hbase.mapreduce.MultiTableSnapshotInputFormat
-
- mutate(Mutation) - Method in interface org.apache.hadoop.hbase.client.BufferedMutator
-
- mutate(List<? extends Mutation>) - Method in interface org.apache.hadoop.hbase.client.BufferedMutator
-
- mutateRow(RowMutations) - Method in interface org.apache.hadoop.hbase.client.Table
-
Performs multiple mutations atomically on a single row.
- mutateRow(RowMutations) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- Mutation - Class in org.apache.hadoop.hbase.client
-
- Mutation() - Constructor for class org.apache.hadoop.hbase.client.Mutation
-
- MUTATION_OVERHEAD - Static variable in class org.apache.hadoop.hbase.client.Mutation
-
- P - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for 'P'
- padHead(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- padTail(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- PageFilter - Class in org.apache.hadoop.hbase.filter
-
Implementation of Filter interface that limits results to a specific page
size.
- PageFilter(long) - Constructor for class org.apache.hadoop.hbase.filter.PageFilter
-
Constructor that takes a maximum page size.
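A short sketch (the page size is arbitrary); note that the filter is applied independently on each region server, so the client can still receive more rows than the page size and should enforce the final limit itself:

    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.PageFilter;

    public class PageFilterExample {
      public static Scan buildScan() {
        Scan scan = new Scan();
        // Ask each region server to stop after roughly 25 rows.
        scan.setFilter(new PageFilter(25));
        return scan;
      }
    }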
- Pair<T1,T2> - Class in org.apache.hadoop.hbase.util
-
A generic class for pairs.
- Pair() - Constructor for class org.apache.hadoop.hbase.util.Pair
-
Default constructor.
- Pair(T1, T2) - Constructor for class org.apache.hadoop.hbase.util.Pair
-
Constructor
- PairOfSameType<T> - Class in org.apache.hadoop.hbase.util
-
A generic, immutable class for pairs of objects, both of type T.
- PairOfSameType(T, T) - Constructor for class org.apache.hadoop.hbase.util.PairOfSameType
-
Constructor
- parseComparator(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Splits a column in comparatorType:comparatorValue form into separate byte arrays
- ParseConstants - Class in org.apache.hadoop.hbase.filter
-
ParseConstants holds a bunch of constants related to parsing filter strings. It is used by ParseFilter.
- ParseConstants() - Constructor for class org.apache.hadoop.hbase.filter.ParseConstants
-
- parseDelimitedFrom(byte[], int, int) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Parses all the HRegionInfo instances from the passed in stream until EOF.
- ParseFilter - Class in org.apache.hadoop.hbase.filter
-
This class allows a user to specify a filter via a string. The string is parsed using the methods of this class and a filter object is constructed.
- ParseFilter() - Constructor for class org.apache.hadoop.hbase.filter.ParseFilter
-
- parseFilterString(String) - Method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Parses the filterString and constructs a filter using it
- parseFilterString(byte[]) - Method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Parses the filterString and constructs a filter using it
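A short sketch of building a Filter from a shell-style filter string (the expression here is just an example):

    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.Filter;
    import org.apache.hadoop.hbase.filter.ParseFilter;

    public class ParseFilterExample {
      public static Scan buildScan() throws Exception {
        // Construct a filter from its shell-style string representation.
        Filter filter = new ParseFilter().parseFilterString("PrefixFilter('user-') AND PageFilter(10)");
        Scan scan = new Scan();
        scan.setFilter(filter);
        return scan;
      }
    }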
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.BinaryComparator
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.BinaryPrefixComparator
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.BitComparator
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ByteArrayComparable
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ColumnCountGetFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ColumnPrefixFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.FamilyFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.Filter
-
Concrete implementers can signal a failure condition in their code by throwing an IOException.
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.FilterList
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.FirstKeyValueMatchingQualifiersFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.FuzzyRowFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.InclusiveStopFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.KeyOnlyFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.LongComparator
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.NullComparator
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.PageFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.PrefixFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.QualifierFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.RandomRowFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.RegexStringComparator
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.RowFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.SingleColumnValueExcludeFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.SkipFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.SubstringComparator
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.TimestampsFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ValueFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.filter.WhileMatchFilter
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
- parseFrom(byte[], int, int) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
- parseFrom(DataInputStream) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Parses an HRegionInfo instance from the passed in stream.
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.HTableDescriptor
-
- parseFrom(byte[]) - Static method in class org.apache.hadoop.hbase.ServerName
-
Get a ServerName from the passed in data bytes.
- parseFromOrNull(byte[]) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
- parseFromOrNull(byte[], int, int) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
- parseHostname(String) - Static method in class org.apache.hadoop.hbase.ServerName
-
- parsePort(String) - Static method in class org.apache.hadoop.hbase.ServerName
-
- parser - Variable in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- parseRegionName(byte[]) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Separate elements of a regionName.
- parseServerName(String) - Static method in class org.apache.hadoop.hbase.ServerName
-
- parseSimpleFilterExpression(byte[]) - Method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Constructs a filter object given a simple filter expression
- parseStartcode(String) - Static method in class org.apache.hadoop.hbase.ServerName
-
- parseTableCFsFromConfig(String) - Static method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
- parseVersionedServerName(byte[]) - Static method in class org.apache.hadoop.hbase.ServerName
-
- passedPrefix - Variable in class org.apache.hadoop.hbase.filter.PrefixFilter
-
- PASSWORD - Static variable in interface org.apache.hadoop.hbase.io.crypto.KeyProvider
-
- password - Variable in class org.apache.hadoop.hbase.io.crypto.KeyStoreKeyProvider
-
- PASSWORDFILE - Static variable in interface org.apache.hadoop.hbase.io.crypto.KeyProvider
-
- passwordFile - Variable in class org.apache.hadoop.hbase.io.crypto.KeyStoreKeyProvider
-
- pbkdf128(String...) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Return a 128 bit key derived from the concatenation of the supplied
arguments using PBKDF2WithHmacSHA1 at 10,000 iterations.
- pbkdf128(byte[]...) - Static method in class org.apache.hadoop.hbase.io.crypto.Encryption
-
Return a 128 bit key derived from the concatenation of the supplied
arguments using PBKDF2WithHmacSHA1 at 10,000 iterations.
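A minimal sketch of deriving a key with this utility (the passphrase and qualifier strings are made up):

    import org.apache.hadoop.hbase.io.crypto.Encryption;

    public class KeyDerivationExample {
      public static void main(String[] args) {
        // Derive a 128-bit (16-byte) key from a passphrase and a cluster-specific qualifier.
        byte[] key = Encryption.pbkdf128("correct horse battery staple", "example-cluster");
        System.out.println("Derived key length: " + key.length + " bytes");
      }
    }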
- PBType<T extends com.google.protobuf.Message> - Class in org.apache.hadoop.hbase.types
-
A base-class for DataType implementations backed by protobuf.
- PBType() - Constructor for class org.apache.hadoop.hbase.types.PBType
-
- peek() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Retrieve the next byte from this range without incrementing position.
- Permission - Class in org.apache.hadoop.hbase.security.access
-
Base permissions instance representing the ability to perform a given set
of actions.
- Permission() - Constructor for class org.apache.hadoop.hbase.security.access.Permission
-
Empty constructor for Writable implementation.
- Permission(Permission.Action...) - Constructor for class org.apache.hadoop.hbase.security.access.Permission
-
- Permission(byte[]) - Constructor for class org.apache.hadoop.hbase.security.access.Permission
-
- Permission.Action - Enum in org.apache.hadoop.hbase.security.access
-
- PHOENIX - Static variable in class org.apache.hadoop.hbase.HBaseInterfaceAudience
-
- PleaseHoldException - Exception in org.apache.hadoop.hbase
-
This exception is thrown by the master when a region server was shut down and restarted so fast that the master still hasn't processed the server shutdown of the first instance, or when the master is initializing and a client calls admin operations, or when an operation is performed on a region server that is still starting.
- PleaseHoldException(String) - Constructor for exception org.apache.hadoop.hbase.PleaseHoldException
-
- PleaseHoldException(String, Throwable) - Constructor for exception org.apache.hadoop.hbase.PleaseHoldException
-
- PleaseHoldException(Throwable) - Constructor for exception org.apache.hadoop.hbase.PleaseHoldException
-
- pluralize(Collection<?>) - Static method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- pluralize(int) - Static method in exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- pool(ExecutorService) - Method in class org.apache.hadoop.hbase.client.BufferedMutatorParams
-
Override the default executor pool defined by the hbase.htable.threads.*
configuration values.
- popArguments(Stack<ByteBuffer>, Stack<Filter>) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Pops an argument from the operator stack and the number of arguments required by the operator
from the filterStack and evaluates them
- populatePut(byte[], ImportTsv.TsvParser.ParsedLine, Put, int) - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- PositionedByteRange - Interface in org.apache.hadoop.hbase.util
-
Extends ByteRange with additional methods to support tracking a consumer's position within the viewport.
- post(String, String, byte[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a POST request
- post(Cluster, String, String, byte[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a POST request
- post(String, Header[], byte[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a POST request
- post(Cluster, String, Header[], byte[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a POST request
- PreemptiveFastFailException - Exception in org.apache.hadoop.hbase.exceptions
-
Thrown when the client believes that the server we are trying to communicate with has been repeatedly unresponsive for a while.
- PreemptiveFastFailException(long, long, long, ServerName) - Constructor for exception org.apache.hadoop.hbase.exceptions.PreemptiveFastFailException
-
- PREFETCH_BLOCKS_ON_OPEN - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
Key for the PREFETCH_BLOCKS_ON_OPEN attribute.
- prefix - Variable in class org.apache.hadoop.hbase.filter.ColumnPrefixFilter
-
- prefix - Variable in class org.apache.hadoop.hbase.filter.PrefixFilter
-
- PrefixFilter - Class in org.apache.hadoop.hbase.filter
-
Pass results that have same row prefix.
- PrefixFilter(byte[]) - Constructor for class org.apache.hadoop.hbase.filter.PrefixFilter
-
- prettyPrint(String) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Use logging.
- ProcedureInfo - Class in org.apache.hadoop.hbase
-
Procedure information
- ProcedureInfo(long, String, String, ProcedureProtos.ProcedureState, long, ErrorHandlingProtos.ForeignExceptionMessage, long, long, byte[]) - Constructor for class org.apache.hadoop.hbase.ProcedureInfo
-
- processBatch(List<? extends Row>, TableName, ExecutorService, Object[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
- processBatch(List<? extends Row>, byte[], ExecutorService, Object[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- processBatchCallback(List<? extends Row>, TableName, ExecutorService, Object[], Batch.Callback<R>) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
- processBatchCallback(List<? extends Row>, byte[], ExecutorService, Object[], Batch.Callback<R>) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
Unsupported API
- processParameter(String, String) - Method in class org.apache.hadoop.hbase.io.crypto.KeyStoreKeyProvider
-
- processParameters(URI) - Method in class org.apache.hadoop.hbase.io.crypto.KeyStoreKeyProvider
-
- put(TableName, Put) - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
The put request will be buffered by its corresponding buffer queue.
- put(TableName, List<Put>) - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
The puts request will be buffered by their corresponding buffer queue.
- put(byte[], List<Put>) - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
- put(TableName, Put, int) - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
The put request will be buffered by its corresponding buffer queue.
- put(byte[], Put, int) - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
- put(byte[], Put) - Method in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
- Put - Class in org.apache.hadoop.hbase.client
-
Used to perform Put operations for a single row.
- Put(byte[]) - Constructor for class org.apache.hadoop.hbase.client.Put
-
Create a Put operation for the specified row.
- Put(byte[], long) - Constructor for class org.apache.hadoop.hbase.client.Put
-
Create a Put operation for the specified row, using a given timestamp.
- Put(byte[], int, int) - Constructor for class org.apache.hadoop.hbase.client.Put
-
We make a copy of the passed-in row key to keep it local.
- Put(ByteBuffer, long) - Constructor for class org.apache.hadoop.hbase.client.Put
-
- Put(ByteBuffer) - Constructor for class org.apache.hadoop.hbase.client.Put
-
- Put(byte[], int, int, long) - Constructor for class org.apache.hadoop.hbase.client.Put
-
We make a copy of the passed-in row key to keep it local.
- Put(Put) - Constructor for class org.apache.hadoop.hbase.client.Put
-
Copy constructor.
- put(Put) - Method in interface org.apache.hadoop.hbase.client.Table
-
Puts some data in the table.
- put(List<Put>) - Method in interface org.apache.hadoop.hbase.client.Table
-
Puts some data in the table, in batch.
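A minimal sketch of single and batched puts against a Table (the table name, column family, qualifier, and row keys are hypothetical):

    import java.util.ArrayList;
    import java.util.List;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class PutExample {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("example_table"))) {
          // Single put.
          Put put = new Put(Bytes.toBytes("row-1"));
          put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value-1"));
          table.put(put);

          // Batched puts: the whole list is submitted together.
          List<Put> puts = new ArrayList<>();
          for (int i = 2; i <= 4; i++) {
            Put p = new Put(Bytes.toBytes("row-" + i));
            p.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value-" + i));
            puts.add(p);
          }
          table.put(puts);
        }
      }
    }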
- put(String, String, byte[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a PUT request
- put(Cluster, String, String, byte[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a PUT request
- put(String, Header[], byte[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a PUT request
- put(Cluster, String, Header[], byte[]) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Send a PUT request
- put(Put) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- put(List<Put>) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- put(int, byte) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Store val at index.
- put(int, byte[]) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Store val at index.
- put(int, byte[], int, int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Store length bytes from val into this range, starting at index.
- put(byte) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Store val at the next position in this range.
- put(byte[]) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Store the content of val in this range, starting at the next position.
- put(byte[], int, int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Store length bytes from val into this range.
- put(int, byte) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- put(int, byte[]) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- put(int, byte[], int, int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- put(int, byte) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- put(int, byte[]) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- put(int, byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- put(int, byte) - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- put(int, byte[]) - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- put(int, byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- put(byte) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- put(byte[]) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- put(byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- put(int, byte) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- put(int, byte[]) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- put(int, byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- put(byte) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- put(byte[]) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- put(byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- put(int, byte) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- put(int, byte[]) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- put(int, byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
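A small sketch of positional writes using SimplePositionedMutableByteRange, which implements these put methods (the capacity and values are arbitrary):

    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.hbase.util.PositionedByteRange;
    import org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange;

    public class PositionedByteRangeExample {
      public static void main(String[] args) {
        // A 16-byte mutable range; the position starts at 0 and advances with each put.
        PositionedByteRange range = new SimplePositionedMutableByteRange(16);
        range.put((byte) 0x01);            // 1 byte,  position -> 1
        range.putInt(42);                  // 4 bytes, position -> 5
        range.putLong(123456789L);         // 8 bytes, position -> 13
        range.put(Bytes.toBytes("end"));   // 3 bytes, position -> 16
        System.out.println("position = " + range.getPosition());
      }
    }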
- putAsShort(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Put an int value as short out to the specified byte array position.
- putBigDecimal(byte[], int, BigDecimal) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Put a BigDecimal value out to the specified byte array position.
- putByte(ByteBuffer, int, byte) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- putByte(byte[], int, byte) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Write a single byte out to the specified byte array position.
- putByteBuffer(byte[], int, ByteBuffer) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Add the whole content of the ByteBuffer to the bytes arrays.
- putBytes(byte[], int, byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Put bytes at the specified byte array position.
- PutCombiner<K> - Class in org.apache.hadoop.hbase.mapreduce
-
Combine Puts.
- PutCombiner() - Constructor for class org.apache.hadoop.hbase.mapreduce.PutCombiner
-
- putCompressedInt(OutputStream, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Put an integer into the stream using 7-bit encoding.
- putDouble(byte[], int, double) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- putFloat(byte[], int, float) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- putInt(OutputStream, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Put a 32-bit integer into the output stream (big-endian byte order).
- putInt(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Put an int value out to the given ByteBuffer's current position in big-endian format.
- putInt(int, int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Store the int value at index
- putInt(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Put an int value out to the specified byte array position.
- putInt(int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Store int val at the next position in this range.
- putInt(int, int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- putInt(int, int) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- putInt(int, int) - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- putInt(int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- putInt(int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- putInt(int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- putInt(int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- putIntUnsafe(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Deprecated.
As of release 2.0.0, this will be removed in HBase 3.0.0.
- putLong(OutputStream, long, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- putLong(ByteBuffer, long) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Put a long value out to the given ByteBuffer's current position in big-endian format.
- putLong(int, long) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Store the long value at index
- putLong(byte[], int, long) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Put a long value out to the specified byte array position.
- putLong(long) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Store long val at the next position in this range.
- putLong(int, long) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- putLong(int, long) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- putLong(int, long) - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- putLong(long) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- putLong(int, long) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- putLong(long) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- putLong(int, long) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- putLongUnsafe(byte[], int, long) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Deprecated.
As of release 2.0.0, this will be removed in HBase 3.0.0.
- putShort(ByteBuffer, short) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Put a short value out to the given ByteBuffer's current position in big-endian format.
- putShort(int, short) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Store the short value at index
- putShort(byte[], int, short) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Put a short value out to the specified byte array position.
- putShort(short) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Store short val at the next position in this range.
- putShort(int, short) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- putShort(int, short) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- putShort(int, short) - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- putShort(short) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- putShort(int, short) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- putShort(short) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- putShort(int, short) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- putShortUnsafe(byte[], int, short) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Deprecated.
As of release 2.0.0, this will be removed in HBase 3.0.0.
- PutSortReducer - Class in org.apache.hadoop.hbase.mapreduce
-
Emits sorted Puts.
- PutSortReducer() - Constructor for class org.apache.hadoop.hbase.mapreduce.PutSortReducer
-
- putVLong(int, long) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Store the long value at index as a VLong
- putVLong(long) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Store the long val at the next position as a VLong
- putVLong(int, long) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- putVLong(int, long) - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- putVLong(long) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- putVLong(int, long) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- putVLong(long) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- putVLong(int, long) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- R - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for 'R'
- random - Static variable in class org.apache.hadoop.hbase.filter.RandomRowFilter
-
- random(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Fill given array with random bytes.
- random(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Fill given array with random bytes at the specified position.
- RandomRowFilter - Class in org.apache.hadoop.hbase.filter
-
A filter that includes rows based on a chance.
- RandomRowFilter(float) - Constructor for class org.apache.hadoop.hbase.filter.RandomRowFilter
-
Create a new filter with a specified chance for a row to be included.
- RawByte - Class in org.apache.hadoop.hbase.types
-
- RawByte() - Constructor for class org.apache.hadoop.hbase.types.RawByte
-
- RawBytes - Class in org.apache.hadoop.hbase.types
-
- RawBytes() - Constructor for class org.apache.hadoop.hbase.types.RawBytes
-
- RawBytes(Order) - Constructor for class org.apache.hadoop.hbase.types.RawBytes
-
- RawBytesFixedLength - Class in org.apache.hadoop.hbase.types
-
- RawBytesFixedLength(Order, int) - Constructor for class org.apache.hadoop.hbase.types.RawBytesFixedLength
-
Create a RawBytesFixedLength using the specified order and length.
- RawBytesFixedLength(int) - Constructor for class org.apache.hadoop.hbase.types.RawBytesFixedLength
-
Create a RawBytesFixedLength of the specified length.
- RawBytesTerminated - Class in org.apache.hadoop.hbase.types
-
- RawBytesTerminated(Order, byte[]) - Constructor for class org.apache.hadoop.hbase.types.RawBytesTerminated
-
Create a RawBytesTerminated using the specified terminator and order.
- RawBytesTerminated(Order, String) - Constructor for class org.apache.hadoop.hbase.types.RawBytesTerminated
-
Create a RawBytesTerminated using the specified terminator and order.
- RawBytesTerminated(byte[]) - Constructor for class org.apache.hadoop.hbase.types.RawBytesTerminated
-
Create a RawBytesTerminated using the specified terminator.
- RawBytesTerminated(String) - Constructor for class org.apache.hadoop.hbase.types.RawBytesTerminated
-
Create a RawBytesTerminated using the specified terminator.
- rawCells() - Method in class org.apache.hadoop.hbase.client.Result
-
Return the array of Cells backing this Result instance.
- RawDouble - Class in org.apache.hadoop.hbase.types
-
- RawDouble() - Constructor for class org.apache.hadoop.hbase.types.RawDouble
-
- RawFloat - Class in org.apache.hadoop.hbase.types
-
- RawFloat() - Constructor for class org.apache.hadoop.hbase.types.RawFloat
-
- RawInteger - Class in org.apache.hadoop.hbase.types
-
- RawInteger() - Constructor for class org.apache.hadoop.hbase.types.RawInteger
-
- RawLong - Class in org.apache.hadoop.hbase.types
-
- RawLong() - Constructor for class org.apache.hadoop.hbase.types.RawLong
-
- RawShort - Class in org.apache.hadoop.hbase.types
-
- RawShort() - Constructor for class org.apache.hadoop.hbase.types.RawShort
-
- RawString - Class in org.apache.hadoop.hbase.types
-
- RawString() - Constructor for class org.apache.hadoop.hbase.types.RawString
-
- RawString(Order) - Constructor for class org.apache.hadoop.hbase.types.RawString
-
- RawStringFixedLength - Class in org.apache.hadoop.hbase.types
-
- RawStringFixedLength(Order, int) - Constructor for class org.apache.hadoop.hbase.types.RawStringFixedLength
-
Create a RawStringFixedLength using the specified order and length.
- RawStringFixedLength(int) - Constructor for class org.apache.hadoop.hbase.types.RawStringFixedLength
-
Create a RawStringFixedLength of the specified length.
- RawStringTerminated - Class in org.apache.hadoop.hbase.types
-
- RawStringTerminated(Order, byte[]) - Constructor for class org.apache.hadoop.hbase.types.RawStringTerminated
-
Create a RawStringTerminated using the specified terminator and order.
- RawStringTerminated(Order, String) - Constructor for class org.apache.hadoop.hbase.types.RawStringTerminated
-
Create a RawStringTerminated using the specified terminator and order.
- RawStringTerminated(byte[]) - Constructor for class org.apache.hadoop.hbase.types.RawStringTerminated
-
Create a RawStringTerminated using the specified terminator.
- RawStringTerminated(String) - Constructor for class org.apache.hadoop.hbase.types.RawStringTerminated
-
Create a RawStringTerminated using the specified terminator.
- read() - Method in class org.apache.hadoop.hbase.util.Base64.Base64InputStream
-
Reads enough of the input stream to convert to/from Base64 and returns
the next byte.
- read(byte[], int, int) - Method in class org.apache.hadoop.hbase.util.Base64.Base64InputStream
-
- readAsInt(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to an int value
- readAsVLong(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Reads a zero-compressed encoded long from input buffer and returns it.
- readByteArray(DataInput) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Read byte-array written with a WritableUtils.vint prefix.
- readByteArrayThrowsRuntime(DataInput) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Read byte-array written with a WritableUtils.vint prefix.
- readCompressedInt(InputStream) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Read integer from stream coded in 7 bits and increment position.
- readCompressedInt(ByteBuffer) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Read integer from buffer coded in 7 bits and increment position.
- readFields(DataInput) - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- readFields(DataInput) - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- readFields(DataInput) - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Reads the values of each field.
- readFields(DataInput) - Method in class org.apache.hadoop.hbase.security.access.Permission
-
- readLong(InputStream, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Read long which was written to fitInBytes bytes and increment position.
- readLong(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Read long which was written to fitInBytes bytes and increment position.
- READONLY - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Used by rest interface to access this metadata
attribute which denotes if the table is Read Only
- ReadOnlyByteRangeException - Exception in org.apache.hadoop.hbase.util
-
Exception thrown when a read only byte range is modified
- ReadOnlyByteRangeException() - Constructor for exception org.apache.hadoop.hbase.util.ReadOnlyByteRangeException
-
- readStringFixedSize(DataInput, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Reads a fixed-size field and interprets it as a string padded with zeros.
- readVLong(ByteBuffer) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Similar to WritableUtils.readVLong(DataInput) but reads from a ByteBuffer.
- readVLong(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- RECOVERED_EDITS_DIR - Static variable in class org.apache.hadoop.hbase.HConstants
-
- recoverTask(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputCommitter
-
- reduce(Stack<ByteBuffer>, Stack<Filter>, ByteBuffer) - Method in class org.apache.hadoop.hbase.filter.ParseFilter
-
This function is called while parsing the filterString and an operator is parsed
- reduce(ImmutableBytesWritable, Iterator<Put>, OutputCollector<ImmutableBytesWritable, Put>, Reporter) - Method in class org.apache.hadoop.hbase.mapred.IdentityTableReduce
-
No aggregation, output pairs of (key, record)
- reduce(Writable, Iterable<Mutation>, Reducer<Writable, Mutation, Writable, Mutation>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.IdentityTableReducer
-
Writes each given record, consisting of the row key and the given values, to the configured OutputFormat.
- reduce(ImmutableBytesWritable, Iterable<KeyValue>, Reducer<ImmutableBytesWritable, KeyValue, ImmutableBytesWritable, KeyValue>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.KeyValueSortReducer
-
- reduce(K, Iterable<Put>, Reducer<K, Put, K, Put>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.PutCombiner
-
- reduce(ImmutableBytesWritable, Iterable<Put>, Reducer<ImmutableBytesWritable, Put, ImmutableBytesWritable, KeyValue>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.PutSortReducer
-
- reduce(ImmutableBytesWritable, Iterable<Text>, Reducer<ImmutableBytesWritable, Text, ImmutableBytesWritable, KeyValue>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.TextSortReducer
-
- RegexStringComparator - Class in org.apache.hadoop.hbase.filter
-
- RegexStringComparator(String) - Constructor for class org.apache.hadoop.hbase.filter.RegexStringComparator
-
Constructor. Adds Pattern.DOTALL to the underlying Pattern.
- RegexStringComparator(String, RegexStringComparator.EngineType) - Constructor for class org.apache.hadoop.hbase.filter.RegexStringComparator
-
Constructor. Adds Pattern.DOTALL to the underlying Pattern.
- RegexStringComparator(String, int) - Constructor for class org.apache.hadoop.hbase.filter.RegexStringComparator
-
Constructor
- RegexStringComparator(String, int, RegexStringComparator.EngineType) - Constructor for class org.apache.hadoop.hbase.filter.RegexStringComparator
-
Constructor
- RegexStringComparator.EngineType - Enum in org.apache.hadoop.hbase.filter
-
Engine implementation type (default=JAVA)
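A short sketch pairing RegexStringComparator with a RowFilter on a Scan (the regular expression is just an example):

    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
    import org.apache.hadoop.hbase.filter.RegexStringComparator;
    import org.apache.hadoop.hbase.filter.RowFilter;

    public class RegexRowFilterExample {
      public static Scan buildScan() {
        // Keep only rows whose keys look like "user-<digits>".
        RegexStringComparator comparator = new RegexStringComparator("^user-\\d+$");
        Scan scan = new Scan();
        scan.setFilter(new RowFilter(CompareOp.EQUAL, comparator));
        return scan;
      }
    }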
- regexStringType - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
RegexStringType byte array
- REGION_IMPL - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REGION_MEMSTORE_REPLICATION - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL flag to indicate whether or not the memstore should be replicated
for read-replicas (CONSISTENCY => TIMELINE).
- REGION_REPLICATION - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL number of region replicas for the table.
- REGION_SERVER_CLASS - Static variable in class org.apache.hadoop.hbase.mapreduce.TableOutputFormat
-
Optional specification of the rs class name of the peer cluster
- REGION_SERVER_HANDLER_ABORT_ON_ERROR_PERCENT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REGION_SERVER_HANDLER_COUNT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REGION_SERVER_HIGH_PRIORITY_HANDLER_COUNT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REGION_SERVER_IMPL - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for what region server implementation to use.
- REGION_SERVER_IMPL - Static variable in class org.apache.hadoop.hbase.mapreduce.TableOutputFormat
-
Optional specification of the region server implementation name of the peer cluster
- REGION_SERVER_REPLICATION_HANDLER_COUNT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REGION_SPLIT_THREADS_MAX - Static variable in class org.apache.hadoop.hbase.HConstants
-
The max number of threads used for splitting storefiles in parallel during
the region split process.
- RegionException - Exception in org.apache.hadoop.hbase
-
Thrown when something happens related to region handling.
- RegionException() - Constructor for exception org.apache.hadoop.hbase.RegionException
-
default constructor
- RegionException(String) - Constructor for exception org.apache.hadoop.hbase.RegionException
-
Constructor
- REGIONINFO_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The regioninfo column qualifier
- REGIONINFO_QUALIFIER_STR - Static variable in class org.apache.hadoop.hbase.HConstants
-
The RegionInfo qualifier as a string
- RegionInRecoveryException - Exception in org.apache.hadoop.hbase.exceptions
-
Thrown when a read request is issued against a region which is in recovering state.
- RegionInRecoveryException() - Constructor for exception org.apache.hadoop.hbase.exceptions.RegionInRecoveryException
-
default constructor
- RegionInRecoveryException(String) - Constructor for exception org.apache.hadoop.hbase.exceptions.RegionInRecoveryException
-
Constructor
- RegionLoad - Class in org.apache.hadoop.hbase
-
Encapsulates per-region load metrics.
- RegionLoad(ClusterStatusProtos.RegionLoad) - Constructor for class org.apache.hadoop.hbase.RegionLoad
-
- regionLoadPB - Variable in class org.apache.hadoop.hbase.RegionLoad
-
- RegionLocator - Interface in org.apache.hadoop.hbase.client
-
Used to view region location information for a single HBase table.
- RegionOfflineException - Exception in org.apache.hadoop.hbase.client
-
Thrown when a table can not be located
- RegionOfflineException() - Constructor for exception org.apache.hadoop.hbase.client.RegionOfflineException
-
default constructor
- RegionOfflineException(String) - Constructor for exception org.apache.hadoop.hbase.client.RegionOfflineException
-
- REGIONSERVER_INFO_PORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
A configuration key for regionserver info port
- REGIONSERVER_INFO_PORT_AUTO - Static variable in class org.apache.hadoop.hbase.HConstants
-
A flag that enables automatic selection of regionserver info port
- REGIONSERVER_METRICS_PERIOD - Static variable in class org.apache.hadoop.hbase.HConstants
-
The period (in milliseconds) between computing region server point in time metrics
- REGIONSERVER_PORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for port region server listens on.
- RegionServerAbortedException - Exception in org.apache.hadoop.hbase.regionserver
-
Thrown by the region server when it is aborting.
- RegionServerAbortedException(String) - Constructor for exception org.apache.hadoop.hbase.regionserver.RegionServerAbortedException
-
- RegionServerRunningException - Exception in org.apache.hadoop.hbase.regionserver
-
Thrown if the region server log directory exists (which indicates another
region server is running at the same address)
- RegionServerRunningException() - Constructor for exception org.apache.hadoop.hbase.regionserver.RegionServerRunningException
-
Default Constructor
- RegionServerRunningException(String) - Constructor for exception org.apache.hadoop.hbase.regionserver.RegionServerRunningException
-
Constructs the exception and supplies a string as the message
- RegionServerStoppedException - Exception in org.apache.hadoop.hbase.regionserver
-
Thrown by the region server when it is in shutting down state.
- RegionServerStoppedException(String) - Constructor for exception org.apache.hadoop.hbase.regionserver.RegionServerStoppedException
-
- RegionTooBusyException - Exception in org.apache.hadoop.hbase
-
Thrown by a region server if it will block and wait to serve a request.
- RegionTooBusyException() - Constructor for exception org.apache.hadoop.hbase.RegionTooBusyException
-
default constructor
- RegionTooBusyException(String) - Constructor for exception org.apache.hadoop.hbase.RegionTooBusyException
-
Constructor
- registerFilter(String, String) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Register a new filter with the parser.
- relocateRegion(TableName, byte[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- relocateRegion(byte[], byte[]) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
internal method, do not use through HConnection
- RemoteAdmin - Class in org.apache.hadoop.hbase.rest.client
-
- RemoteAdmin(Client, Configuration) - Constructor for class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
Constructor
- RemoteAdmin(Client, Configuration, String) - Constructor for class org.apache.hadoop.hbase.rest.client.RemoteAdmin
-
Constructor
- RemoteHTable - Class in org.apache.hadoop.hbase.rest.client
-
HTable interface to remote tables accessed via REST gateway
- RemoteHTable(Client, String) - Constructor for class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
Constructor
- RemoteHTable(Client, Configuration, String) - Constructor for class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
Constructor
- RemoteHTable(Client, Configuration, byte[]) - Constructor for class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
Constructor
- RemoteWithExtrasException - Exception in org.apache.hadoop.hbase.ipc
-
A RemoteException
with some extra information.
- RemoteWithExtrasException(String, String, boolean) - Constructor for exception org.apache.hadoop.hbase.ipc.RemoteWithExtrasException
-
- RemoteWithExtrasException(String, String, String, int, boolean) - Constructor for exception org.apache.hadoop.hbase.ipc.RemoteWithExtrasException
-
- remove(byte[]) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- remove(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- remove(Bytes) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- remove(byte[]) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- remove(String) - Method in class org.apache.hadoop.hbase.rest.client.Cluster
-
Remove a node from the cluster
- remove(String, int) - Method in class org.apache.hadoop.hbase.rest.client.Cluster
-
Remove a node from the cluster
- remove() - Method in class org.apache.hadoop.hbase.types.StructIterator
-
- removeConfiguration(String) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- removeConfiguration(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- removeConfiguration(String) - Method in class org.apache.hadoop.hbase.NamespaceDescriptor.Builder
-
- removeConfiguration(String) - Method in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- removeCoprocessor(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Remove a coprocessor from those set on the table
- removeExtraHeader(String) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Remove an extra header.
- removeFamily(byte[]) - Method in class org.apache.hadoop.hbase.client.UnmodifyableHTableDescriptor
-
- removeFamily(byte[]) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Removes the HColumnDescriptor with name specified by the parameter column
from the table descriptor
- removePeer(String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Removes a peer cluster and stops the replication to it.
- removePeerTableCFs(String, String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Remove some table-cfs from table-cfs config of the specified peer
- removePeerTableCFs(String, Map<TableName, ? extends Collection<String>>) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Remove some table-cfs from config of the specified peer
- removeQuotesFromByteArray(byte[]) - Static method in class org.apache.hadoop.hbase.filter.ParseFilter
-
Takes a quoted byte array and converts it into an unquoted byte array
For example: given a byte array representing 'abc', it returns a
byte array representing abc
- renewLease() - Method in interface org.apache.hadoop.hbase.client.ResultScanner
-
Allow the client to renew the scanner's lease on the server.
- renewLease() - Method in class org.apache.hadoop.hbase.client.TableSnapshotScanner
-
- REPLAY_QOS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REPLICA_ID_DELIMITER - Static variable in class org.apache.hadoop.hbase.HRegionInfo
-
- REPLICA_ID_FORMAT - Static variable in class org.apache.hadoop.hbase.HRegionInfo
-
- REPLICATION - Static variable in class org.apache.hadoop.hbase.HBaseInterfaceAudience
-
- REPLICATION_CODEC_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for setting replication codec class name
- REPLICATION_ENABLE_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REPLICATION_ENABLE_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REPLICATION_QOS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REPLICATION_SCOPE - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- REPLICATION_SCOPE_BYTES - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- REPLICATION_SCOPE_GLOBAL - Static variable in class org.apache.hadoop.hbase.HConstants
-
Scope tag for globally scoped data.
- REPLICATION_SCOPE_LOCAL - Static variable in class org.apache.hadoop.hbase.HConstants
-
Scope tag for locally scoped data.
- REPLICATION_SERVICE_CLASSNAME_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REPLICATION_SINK_SERVICE_CLASSNAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REPLICATION_SOURCE_MAXTHREADS_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- REPLICATION_SOURCE_MAXTHREADS_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Maximum number of threads used by the replication source for shipping edits to the sinks
- REPLICATION_SOURCE_SERVICE_CLASSNAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
- ReplicationAdmin - Class in org.apache.hadoop.hbase.client.replication
-
This class provides the administrative interface to HBase cluster
replication.
- ReplicationAdmin(Configuration) - Constructor for class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Constructor that creates a connection to the local ZooKeeper ensemble.
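A sketch of basic peer management with ReplicationAdmin, assuming an HBase Configuration named conf and the 1.x addPeer(String, ReplicationPeerConfig, Map) overload; the peer id and cluster key below are illustrative:

    import org.apache.hadoop.hbase.client.replication.ReplicationAdmin;
    import org.apache.hadoop.hbase.replication.ReplicationPeerConfig;

    ReplicationAdmin replicationAdmin = new ReplicationAdmin(conf);
    ReplicationPeerConfig peerConfig = new ReplicationPeerConfig();
    peerConfig.setClusterKey("zk1,zk2,zk3:2181:/hbase");   // slave cluster's quorum:port:znode
    replicationAdmin.addPeer("1", peerConfig, null);        // null table-CFs = replicate everything
    // ... later, stop replicating to that peer:
    replicationAdmin.removePeer("1");
    replicationAdmin.close();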
- ReplicationException - Exception in org.apache.hadoop.hbase.replication
-
An HBase Replication exception.
- ReplicationException() - Constructor for exception org.apache.hadoop.hbase.replication.ReplicationException
-
- ReplicationException(String) - Constructor for exception org.apache.hadoop.hbase.replication.ReplicationException
-
- ReplicationException(String, Throwable) - Constructor for exception org.apache.hadoop.hbase.replication.ReplicationException
-
- ReplicationException(Throwable) - Constructor for exception org.apache.hadoop.hbase.replication.ReplicationException
-
- REPLICATIONGLOBAL - Static variable in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
- ReplicationPeerConfig - Class in org.apache.hadoop.hbase.replication
-
A configuration for the replication peer cluster.
- ReplicationPeerConfig() - Constructor for class org.apache.hadoop.hbase.replication.ReplicationPeerConfig
-
- REPLICATIONTYPE - Static variable in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
- RESERVED_NAMESPACES - Static variable in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- RESERVED_NAMESPACES_BYTES - Static variable in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- reset() - Method in class org.apache.hadoop.hbase.filter.ColumnCountGetFilter
-
- reset() - Method in class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
- reset() - Method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- reset() - Method in class org.apache.hadoop.hbase.filter.Filter
-
Reset the state of the filter between rows.
- reset() - Method in class org.apache.hadoop.hbase.filter.FilterList
-
- reset() - Method in class org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter
-
- reset() - Method in class org.apache.hadoop.hbase.filter.PrefixFilter
-
- reset() - Method in class org.apache.hadoop.hbase.filter.RandomRowFilter
-
- reset() - Method in class org.apache.hadoop.hbase.filter.RowFilter
-
- reset() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- reset() - Method in class org.apache.hadoop.hbase.filter.SkipFilter
-
- reset() - Method in class org.apache.hadoop.hbase.filter.WhileMatchFilter
-
- reset() - Method in interface org.apache.hadoop.hbase.io.crypto.Decryptor
-
Reset state, reinitialize with the key and iv
- reset() - Method in interface org.apache.hadoop.hbase.io.crypto.Encryptor
-
Reset state, reinitialize with the key and iv
- reset() - Method in class org.apache.hadoop.hbase.types.StructBuilder
-
Reset the sequence of accumulated fields.
- reset() - Method in class org.apache.hadoop.hbase.util.FastLongHistogram
-
Resets the histogram for new counting.
- resetCacheConfig(Configuration) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Enable a basic on-heap cache for these jobs.
- Response - Class in org.apache.hadoop.hbase.rest.client
-
The HTTP result code, response headers, and body of a HTTP response.
- Response(int) - Constructor for class org.apache.hadoop.hbase.rest.client.Response
-
Constructor
- Response(int, Header[]) - Constructor for class org.apache.hadoop.hbase.rest.client.Response
-
Constructor
- Response(int, Header[], byte[]) - Constructor for class org.apache.hadoop.hbase.rest.client.Response
-
Constructor
- Response(int, Header[], byte[], InputStream) - Constructor for class org.apache.hadoop.hbase.rest.client.Response
-
Constructor
- REST_AUTHENTICATION_PRINCIPAL - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- REST_AUTHENTICATION_TYPE - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- REST_DNS_INTERFACE - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- REST_DNS_NAMESERVER - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- REST_KERBEROS_PRINCIPAL - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- REST_KEYTAB_FILE - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- REST_SSL_ENABLED - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- REST_SSL_KEYSTORE_KEYPASSWORD - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- REST_SSL_KEYSTORE_PASSWORD - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- REST_SSL_KEYSTORE_STORE - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- restart(byte[]) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
Restart from survivable exceptions by creating a new scanner.
- restart(byte[]) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
Restart from survivable exceptions by creating a new scanner.
- restart(byte[]) - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
Restart from survivable exceptions by creating a new scanner.
- restart(byte[]) - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
Restart from survivable exceptions by creating a new scanner.
- restoreSnapshot(byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Restore the specified snapshot on the original table.
- restoreSnapshot(String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Restore the specified snapshot on the original table.
- restoreSnapshot(byte[], boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Restore the specified snapshot on the original table.
- restoreSnapshot(String, boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Restore the specified snapshot on the original table.
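A hedged sketch of restoring a snapshot via the Admin methods above, assuming an open Admin named admin; the table and snapshot names are illustrative, and the table must be disabled before the restore:

    import org.apache.hadoop.hbase.TableName;

    TableName table = TableName.valueOf("mytable");
    admin.disableTable(table);                          // restore requires the table to be offline
    admin.restoreSnapshot("mytable-snapshot-20150101");
    admin.enableTable(table);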
- RestoreSnapshotException - Exception in org.apache.hadoop.hbase.snapshot
-
Thrown when a snapshot could not be restored due to a server-side error when restoring it.
- RestoreSnapshotException(String, HBaseProtos.SnapshotDescription) - Constructor for exception org.apache.hadoop.hbase.snapshot.RestoreSnapshotException
-
- RestoreSnapshotException(String, Throwable, HBaseProtos.SnapshotDescription) - Constructor for exception org.apache.hadoop.hbase.snapshot.RestoreSnapshotException
-
- RestoreSnapshotException(String) - Constructor for exception org.apache.hadoop.hbase.snapshot.RestoreSnapshotException
-
- RestoreSnapshotException(String, Exception) - Constructor for exception org.apache.hadoop.hbase.snapshot.RestoreSnapshotException
-
- Result - Class in org.apache.hadoop.hbase.client
-
Single row result of a Get or Scan query.
- Result() - Constructor for class org.apache.hadoop.hbase.client.Result
-
Creates an empty Result with no KeyValue payload; returns null if you call Result.rawCells().
- ResultScanner - Interface in org.apache.hadoop.hbase.client
-
Interface for client-side scanning.
- resumeEncoding() - Method in class org.apache.hadoop.hbase.util.Base64.Base64OutputStream
-
Resumes encoding of the stream.
- RetriesExhaustedException - Exception in org.apache.hadoop.hbase.client
-
Exception thrown by HTable methods when an attempt to do something (like
commit changes) fails after a bunch of retries.
- RetriesExhaustedException(String) - Constructor for exception org.apache.hadoop.hbase.client.RetriesExhaustedException
-
- RetriesExhaustedException(String, IOException) - Constructor for exception org.apache.hadoop.hbase.client.RetriesExhaustedException
-
- RetriesExhaustedException(String, int, List<Throwable>) - Constructor for exception org.apache.hadoop.hbase.client.RetriesExhaustedException
-
Create a new RetriesExhaustedException from the list of prior failures.
- RetriesExhaustedException(int, List<RetriesExhaustedException.ThrowableWithExtraContext>) - Constructor for exception org.apache.hadoop.hbase.client.RetriesExhaustedException
-
Create a new RetriesExhaustedException from the list of prior failures.
- RetriesExhaustedWithDetailsException - Exception in org.apache.hadoop.hbase.client
-
This subclass of RetriesExhaustedException is thrown when we have more information about which rows were causing which exceptions on what servers.
- RetriesExhaustedWithDetailsException(List<Throwable>, List<Row>, List<String>) - Constructor for exception org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException
-
- retrieveGetCounterWithStringsParams(TaskAttemptContext) - Static method in class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
In the new mapreduce APIs, TaskAttemptContext has two getCounter methods.
Check if the getCounter(String, String) method is available.
- RETRY_BACKOFF - Static variable in class org.apache.hadoop.hbase.HConstants
-
When retrying, we multiply the hbase.client.pause setting by the values in this array until we
run out of array items.
- returnCompressor(Compressor) - Method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
- returnDecompressor(Decompressor) - Method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
- reversed - Variable in class org.apache.hadoop.hbase.filter.Filter
-
- reverseDNS(InetAddress) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Deprecated.
Mistakenly made public in 0.98.7; scope will change to package-private.
- revoke(Connection, TableName, String, byte[], byte[], Permission.Action...) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
Revokes the permission on the table
- revoke(Connection, String, String, Permission.Action...) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
Revokes the permission on the table for the specified user.
- revoke(Connection, String, Permission.Action...) - Static method in class org.apache.hadoop.hbase.security.access.AccessControlClient
-
Revoke global permissions for the specified user.
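A hedged sketch of the table-scoped revoke overload above, assuming an open Connection named connection; the table, user and action are illustrative:

    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.security.access.AccessControlClient;
    import org.apache.hadoop.hbase.security.access.Permission;

    // Revoke WRITE on all families/qualifiers of "mytable" from user "bob".
    AccessControlClient.revoke(connection, TableName.valueOf("mytable"), "bob",
        null, null, Permission.Action.WRITE);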
- rollWALWriter(ServerName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Roll the log writer.
- row - Variable in class org.apache.hadoop.hbase.client.Mutation
-
- Row - Interface in org.apache.hadoop.hbase.client
-
Has a row.
- ROW_KEYS_PARAM_NAME - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- RowCounter - Class in org.apache.hadoop.hbase.mapred
-
A job with a map to count rows.
- RowCounter() - Constructor for class org.apache.hadoop.hbase.mapred.RowCounter
-
- RowCounter - Class in org.apache.hadoop.hbase.mapreduce
-
A job with just a map phase to count rows.
- RowCounter() - Constructor for class org.apache.hadoop.hbase.mapreduce.RowCounter
-
- RowFilter - Class in org.apache.hadoop.hbase.filter
-
This filter is used to filter based on the key.
- RowFilter(CompareFilter.CompareOp, ByteArrayComparable) - Constructor for class org.apache.hadoop.hbase.filter.RowFilter
-
Constructor.
- RowMutations - Class in org.apache.hadoop.hbase.client
-
Performs multiple mutations atomically on a single row.
- RowMutations() - Constructor for class org.apache.hadoop.hbase.client.RowMutations
-
Constructor for Writable.
- RowMutations(byte[]) - Constructor for class org.apache.hadoop.hbase.client.RowMutations
-
Create an atomic mutation for the specified row.
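A short sketch of atomic per-row mutations, assuming an already-opened org.apache.hadoop.hbase.client.Table named table and illustrative column names:

    import org.apache.hadoop.hbase.client.Delete;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.RowMutations;
    import org.apache.hadoop.hbase.util.Bytes;

    byte[] row = Bytes.toBytes("row1");
    RowMutations mutations = new RowMutations(row);
    // Both mutations target the same row and are applied atomically.
    mutations.add(new Put(row).addColumn(Bytes.toBytes("cf"), Bytes.toBytes("a"), Bytes.toBytes("1")));
    mutations.add(new Delete(row).addColumns(Bytes.toBytes("cf"), Bytes.toBytes("b")));
    table.mutateRow(mutations);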
- RowTooBigException - Exception in org.apache.hadoop.hbase.client
-
Gets or Scans throw this exception if running without the in-row scan flag
set and the row size appears to exceed the maximum configured size (configurable via
hbase.table.max.rowsize).
- RowTooBigException(String) - Constructor for exception org.apache.hadoop.hbase.client.RowTooBigException
-
- RowTooBigException - Exception in org.apache.hadoop.hbase.regionserver
-
- RowTooBigException(String) - Constructor for exception org.apache.hadoop.hbase.regionserver.RowTooBigException
-
Deprecated.
- RPAREN - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for RPAREN
- RPC_CODEC_CONF_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Configuration key for setting RPC codec class name
- RPC_CURRENT_VERSION - Static variable in class org.apache.hadoop.hbase.HConstants
-
- RPC_HEADER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The first four bytes of Hadoop RPC connections
- RpcRetryingCaller<T> - Interface in org.apache.hadoop.hbase.client
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mapred.RowCounter
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.CellCounter
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.CopyTable
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.Export
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.Import
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.RowCounter
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mapreduce.WALPlayer
-
- run(String[]) - Method in class org.apache.hadoop.hbase.mob.mapreduce.Sweeper
-
Main method for the tool.
- run(String[]) - Method in class org.apache.hadoop.hbase.snapshot.ExportSnapshot
-
Execute the export snapshot by copying the snapshot metadata, hfiles and wals.
- run(String[]) - Method in class org.apache.hadoop.hbase.snapshot.SnapshotInfo
-
- runAs(PrivilegedAction<T>) - Method in class org.apache.hadoop.hbase.security.User
-
Executes the given action within the context of this user.
- runAs(PrivilegedExceptionAction<T>) - Method in class org.apache.hadoop.hbase.security.User
-
Executes the given action within the context of this user.
- runAsLoginUser(PrivilegedExceptionAction<T>) - Static method in class org.apache.hadoop.hbase.security.User
-
Executes the given action as the login user
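A small sketch of running client code under a specific identity with the methods above; the work done inside run() is illustrative:

    import java.security.PrivilegedExceptionAction;
    import org.apache.hadoop.hbase.security.User;

    User user = User.getCurrent();
    user.runAs(new PrivilegedExceptionAction<Void>() {
      @Override
      public Void run() throws Exception {
        // HBase calls issued here execute in the context of 'user'.
        return null;
      }
    });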
- runCatalogScan() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Ask for a scan of the catalog table
- S - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for 'S'
- Scan - Class in org.apache.hadoop.hbase.client
-
Used to perform Scan operations.
- Scan() - Constructor for class org.apache.hadoop.hbase.client.Scan
-
Create a Scan operation across all rows.
- Scan(byte[], Filter) - Constructor for class org.apache.hadoop.hbase.client.Scan
-
- Scan(byte[]) - Constructor for class org.apache.hadoop.hbase.client.Scan
-
Create a Scan operation starting at the specified row.
- Scan(byte[], byte[]) - Constructor for class org.apache.hadoop.hbase.client.Scan
-
Create a Scan operation for the range of rows specified.
- Scan(Scan) - Constructor for class org.apache.hadoop.hbase.client.Scan
-
Creates a new instance of this class while copying all values.
- Scan(Get) - Constructor for class org.apache.hadoop.hbase.client.Scan
-
Builds a scan object with the same specs as get.
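A minimal sketch of a range scan using the constructors above, assuming an already-opened Table named table; the row keys and caching value are illustrative:

    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.util.Bytes;

    // Scan the row range [row-000, row-100) and print each row key.
    Scan scan = new Scan(Bytes.toBytes("row-000"), Bytes.toBytes("row-100"));
    scan.setCaching(100);                      // rows fetched per scanner RPC
    try (ResultScanner scanner = table.getScanner(scan)) {
      for (Result result : scanner) {
        System.out.println(Bytes.toString(result.getRow()));
      }
    }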
- SCAN - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Base-64 encoded scanner.
- SCAN_ATTRIBUTES_METRICS_DATA - Static variable in class org.apache.hadoop.hbase.client.Scan
-
Deprecated.
- SCAN_ATTRIBUTES_METRICS_ENABLE - Static variable in class org.apache.hadoop.hbase.client.Scan
-
- SCAN_ATTRIBUTES_TABLE_NAME - Static variable in class org.apache.hadoop.hbase.client.Scan
-
- SCAN_BATCH_SIZE - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_BATCHSIZE - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Set the maximum number of values to return for each call to next().
- SCAN_CACHEBLOCKS - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Set to false to disable server-side caching of blocks for this scan.
- SCAN_CACHEDROWS - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
The number of rows for caching that will be passed to scanners.
- SCAN_COLUMN - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_COLUMN_FAMILY - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Column Family to Scan
- SCAN_COLUMNS - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Space delimited list of columns and column families to scan.
- SCAN_END_ROW - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_END_TIME - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_FETCH_SIZE - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_FILTER - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_LIMIT - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_MAX_VERSIONS - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_MAXVERSIONS - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
The maximum number of versions to return.
- SCAN_ROW_START - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Scan start row
- SCAN_ROW_STOP - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Scan stop row
- SCAN_START_ROW - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_START_TIME - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- SCAN_TIMERANGE_END - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
The ending timestamp used to filter columns with a specific range of versions.
- SCAN_TIMERANGE_START - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
The starting timestamp used to filter columns with a specific range of versions.
- SCAN_TIMESTAMP - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
The timestamp used to filter columns with a specific timestamp.
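A sketch of driving these scan keys from a MapReduce job configuration; the table name, family and row keys are illustrative, and INPUT_TABLE is assumed to be the TableInputFormat key naming the table to read (it is not listed in this part of the index):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat;

    Configuration conf = HBaseConfiguration.create();
    conf.set(TableInputFormat.INPUT_TABLE, "mytable");
    conf.set(TableInputFormat.SCAN_COLUMN_FAMILY, "cf");   // limit the scan to one family
    conf.set(TableInputFormat.SCAN_ROW_START, "row-000");  // start row (inclusive)
    conf.set(TableInputFormat.SCAN_ROW_STOP, "row-100");   // stop row (exclusive)
    conf.setInt(TableInputFormat.SCAN_CACHEDROWS, 500);    // rows cached per scanner RPC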
- ScanLabelGenerator - Interface in org.apache.hadoop.hbase.security.visibility
-
This is the interface used to add labels to the RPC context;
these labels are stored against the UGI.
- ScanMetrics - Class in org.apache.hadoop.hbase.client.metrics
-
Provides metrics related to scan operations (both server side and client side metrics).
- ScanMetrics() - Constructor for class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
constructor
- ScannerTimeoutException - Exception in org.apache.hadoop.hbase.client
-
Thrown when a scanner has timed out.
- SCANS - Static variable in class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormat
-
Job parameter that specifies the scan list.
- searchDelimiterIndex(byte[], int, int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- searchDelimiterIndexInReverse(byte[], int, int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Find index of passed delimiter walking from end of buffer backwards.
- second - Variable in class org.apache.hadoop.hbase.util.Pair
-
- SecurityCapability - Enum in org.apache.hadoop.hbase.client.security
-
Available security capabilities
- SEPARATOR_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- SEQNUM_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The open seqnum column qualifier
- SEQNUM_QUALIFIER_STR - Static variable in class org.apache.hadoop.hbase.HConstants
-
The open seqnum column qualifier
- serialize(String, Throwable) - Static method in exception org.apache.hadoop.hbase.errorhandling.ForeignException
-
Converts a ForeignException to an array of bytes.
- SERVER_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The server column qualifier
- SERVER_QUALIFIER_STR - Static variable in class org.apache.hadoop.hbase.HConstants
-
The server column qualifier
- ServerLoad - Class in org.apache.hadoop.hbase
-
This class is used for exporting current state of load on a RegionServer.
- ServerLoad(ClusterStatusProtos.ServerLoad) - Constructor for class org.apache.hadoop.hbase.ServerLoad
-
- serverLoad - Variable in class org.apache.hadoop.hbase.ServerLoad
-
- ServerName - Class in org.apache.hadoop.hbase
-
Instance of an HBase ServerName.
- SERVERNAME_PATTERN - Static variable in class org.apache.hadoop.hbase.ServerName
-
- SERVERNAME_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
- SERVERNAME_QUALIFIER_STR - Static variable in class org.apache.hadoop.hbase.HConstants
-
The serverName column qualifier.
- SERVERNAME_SEPARATOR - Static variable in class org.apache.hadoop.hbase.ServerName
-
This character is used as a separator between the server hostname, port and
startcode.
- ServerNotRunningYetException - Exception in org.apache.hadoop.hbase.ipc
-
- ServerNotRunningYetException(String) - Constructor for exception org.apache.hadoop.hbase.ipc.ServerNotRunningYetException
-
- ServerSideScanMetrics - Class in org.apache.hadoop.hbase.client.metrics
-
Provides server side metrics related to scan operations.
- ServerSideScanMetrics() - Constructor for class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
- set(byte[]) - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- set(byte[], int, int) - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- set(int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Reuse this ByteRange
over a new byte[].
- set(byte[]) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Reuse this ByteRange
over a new byte[].
- set(byte[], int, int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Reuse this ByteRange
over a new byte[].
- set(byte[]) - Method in class org.apache.hadoop.hbase.util.Bytes
-
- set(byte[], int, int) - Method in class org.apache.hadoop.hbase.util.Bytes
-
- set(long) - Method in class org.apache.hadoop.hbase.util.Counter
-
- set(int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- set(byte[]) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- set(byte[], int, int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- set(int) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- set(byte[]) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- set(byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- set(int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- set(byte[]) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- set(byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- set(int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- set(byte[]) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- set(byte[], int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- setACL(String, Permission) - Method in class org.apache.hadoop.hbase.client.Append
-
- setACL(Map<String, Permission>) - Method in class org.apache.hadoop.hbase.client.Append
-
- setACL(String, Permission) - Method in class org.apache.hadoop.hbase.client.Delete
-
- setACL(Map<String, Permission>) - Method in class org.apache.hadoop.hbase.client.Delete
-
- setACL(Map<String, Permission>) - Method in class org.apache.hadoop.hbase.client.Get
-
- setACL(String, Permission) - Method in class org.apache.hadoop.hbase.client.Get
-
- setACL(String, Permission) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setACL(Map<String, Permission>) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setACL(String, Permission) - Method in class org.apache.hadoop.hbase.client.Mutation
-
- setACL(Map<String, Permission>) - Method in class org.apache.hadoop.hbase.client.Mutation
-
- setACL(String, Permission) - Method in class org.apache.hadoop.hbase.client.Put
-
- setACL(Map<String, Permission>) - Method in class org.apache.hadoop.hbase.client.Put
-
- setACL(String, Permission) - Method in class org.apache.hadoop.hbase.client.Query
-
- setACL(Map<String, Permission>) - Method in class org.apache.hadoop.hbase.client.Query
-
- setACL(Map<String, Permission>) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setACL(String, Permission) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setAllowPartialResults(boolean) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set whether the caller wants to see the partial results that may be returned from the
server.
- setAsyncPrefetch(boolean) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setAttribute(String, byte[]) - Method in class org.apache.hadoop.hbase.client.Append
-
- setAttribute(String, byte[]) - Method in interface org.apache.hadoop.hbase.client.Attributes
-
Sets an attribute.
- setAttribute(String, byte[]) - Method in class org.apache.hadoop.hbase.client.Delete
-
- setAttribute(String, byte[]) - Method in class org.apache.hadoop.hbase.client.Get
-
- setAttribute(String, byte[]) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setAttribute(String, byte[]) - Method in class org.apache.hadoop.hbase.client.OperationWithAttributes
-
- setAttribute(String, byte[]) - Method in class org.apache.hadoop.hbase.client.Put
-
- setAttribute(String, byte[]) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setAuthorizations(Authorizations) - Method in class org.apache.hadoop.hbase.client.Get
-
- setAuthorizations(Authorizations) - Method in class org.apache.hadoop.hbase.client.Query
-
Sets the authorizations to be used by this Query
- setAuthorizations(Authorizations) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setAuths(Configuration, String[], String) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
- setAuths(Connection, String[], String) - Static method in class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
Sets given labels globally authorized for the user.
- setAuths(byte[], List<byte[]>) - Method in interface org.apache.hadoop.hbase.security.visibility.VisibilityLabelService
-
Sets given labels globally authorized for the user.
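A brief sketch using the Connection-based overload above, assuming an open Connection named connection; the labels and user name are illustrative:

    import org.apache.hadoop.hbase.security.visibility.VisibilityClient;

    // Grant the visibility labels "secret" and "confidential" to user "alice".
    VisibilityClient.setAuths(connection, new String[] { "secret", "confidential" }, "alice");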
- setBalancerRunning(boolean, boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Turn the load balancer on or off.
- setBatch(int) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set the maximum number of values to return for each call to next()
- setBlockCacheEnabled(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setBlocksize(int) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setBloomFilterType(BloomType) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setBody(byte[]) - Method in class org.apache.hadoop.hbase.rest.client.Response
-
- setCacheBlocks(boolean) - Method in class org.apache.hadoop.hbase.client.Get
-
Set whether blocks should be cached for this Get.
- setCacheBlocks(boolean) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set whether blocks should be cached for this Scan.
- setCacheBloomsOnWrite(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setCacheDataInL1(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setCacheDataOnWrite(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setCacheIndexesOnWrite(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setCaching(int) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set the number of rows for caching that will be passed to scanners.
- setCellVisibility(CellVisibility) - Method in class org.apache.hadoop.hbase.client.Append
-
- setCellVisibility(CellVisibility) - Method in class org.apache.hadoop.hbase.client.Delete
-
- setCellVisibility(CellVisibility) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setCellVisibility(CellVisibility) - Method in class org.apache.hadoop.hbase.client.Mutation
-
Sets the visibility expression associated with cells in this Mutation.
- setCellVisibility(CellVisibility) - Method in class org.apache.hadoop.hbase.client.Put
-
- setChance(float) - Method in class org.apache.hadoop.hbase.filter.RandomRowFilter
-
Set the chance that a row is included.
- setCharset(Charset) - Method in class org.apache.hadoop.hbase.filter.RegexStringComparator
-
Specifies the
Charset
to use to convert the row key to a String.
- setCheckExistenceOnly(boolean) - Method in class org.apache.hadoop.hbase.client.Get
-
- setCipher(Cipher) - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- setCipher(Cipher) - Method in class org.apache.hadoop.hbase.io.crypto.Encryption.Context
-
- setClientAckTime(long) - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- setClosestRowBefore(boolean) - Method in class org.apache.hadoop.hbase.client.Get
-
Deprecated.
since 2.0.0 and will be removed in 3.0.0
- setCluster(Cluster) - Method in class org.apache.hadoop.hbase.rest.client.Client
-
- setClusterIds(List<UUID>) - Method in class org.apache.hadoop.hbase.client.Append
-
- setClusterIds(List<UUID>) - Method in class org.apache.hadoop.hbase.client.Delete
-
- setClusterIds(List<UUID>) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setClusterIds(List<UUID>) - Method in class org.apache.hadoop.hbase.client.Mutation
-
Marks that the clusters with the given clusterIds have consumed the mutation
- setClusterIds(List<UUID>) - Method in class org.apache.hadoop.hbase.client.Put
-
- setClusterKey(String) - Method in class org.apache.hadoop.hbase.replication.ReplicationPeerConfig
-
Set the clusterKey which is the concatenation of the slave cluster's:
hbase.zookeeper.quorum:hbase.zookeeper.property.clientPort:zookeeper.znode.parent
- setCode(int) - Method in class org.apache.hadoop.hbase.rest.client.Response
-
- setCompactionCompressionType(Compression.Algorithm) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Compression types supported in hbase.
- setCompactionEnabled(boolean) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Setting the table compaction enable flag.
- setCompressionType(Compression.Algorithm) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Compression types supported in hbase.
- setCompressTags(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Set whether the tags should be compressed along with DataBlockEncoding.
- setConf(Configuration) - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- setConf(Configuration) - Method in class org.apache.hadoop.hbase.io.crypto.DefaultCipherProvider
-
- setConf(Configuration) - Method in class org.apache.hadoop.hbase.mapreduce.GroupingTableMapper
-
Sets the configuration.
- setConf(Configuration) - Method in class org.apache.hadoop.hbase.mapreduce.HRegionPartitioner
-
Sets the configuration.
- setConf(Configuration) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormat
-
Sets the configuration.
- setConf(Configuration) - Method in class org.apache.hadoop.hbase.mapreduce.SimpleTotalOrderPartitioner
-
- setConf(Configuration) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Sets the configuration.
- setConf(Configuration) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputFormat
-
- setConfiguration(String, String) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setConfiguration(String, String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- setConfiguration(String, String) - Method in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- setConsistency(Consistency) - Method in class org.apache.hadoop.hbase.client.Get
-
- setConsistency(Consistency) - Method in class org.apache.hadoop.hbase.client.Query
-
Sets the consistency level for this operation
- setConsistency(Consistency) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setCounter(String, long) - Method in class org.apache.hadoop.hbase.client.metrics.ServerSideScanMetrics
-
- setDataBlockEncoding(DataBlockEncoding) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Set data block encoding algorithm used in block cache.
- setDefaultClientPort(int) - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- setDFSReplication(short) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Set the replication factor for hfile(s) belonging to this family
- setDurability(Durability) - Method in class org.apache.hadoop.hbase.client.Append
-
- setDurability(Durability) - Method in class org.apache.hadoop.hbase.client.Delete
-
- setDurability(Durability) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setDurability(Durability) - Method in class org.apache.hadoop.hbase.client.Mutation
-
Set the durability for this mutation
- setDurability(Durability) - Method in class org.apache.hadoop.hbase.client.Put
-
- setDurability(Durability) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- setEncryptionKey(byte[]) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Set the raw crypto key attribute for the family
- setEncryptionType(String) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Set the encryption algorithm for use with this family
- setEndKey(Configuration, byte[]) - Static method in class org.apache.hadoop.hbase.mapreduce.SimpleTotalOrderPartitioner
-
- setEndRow(byte[]) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- setEndRow(byte[]) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- setEvictBlocksOnClose(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setExists(Boolean) - Method in class org.apache.hadoop.hbase.client.Result
-
- setFamilyCellMap(NavigableMap<byte[], List<Cell>>) - Method in class org.apache.hadoop.hbase.client.Append
-
- setFamilyCellMap(NavigableMap<byte[], List<Cell>>) - Method in class org.apache.hadoop.hbase.client.Delete
-
- setFamilyCellMap(NavigableMap<byte[], List<Cell>>) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setFamilyCellMap(NavigableMap<byte[], List<Cell>>) - Method in class org.apache.hadoop.hbase.client.Mutation
-
Method for setting the put's familyMap
- setFamilyCellMap(NavigableMap<byte[], List<Cell>>) - Method in class org.apache.hadoop.hbase.client.Put
-
- setFamilyMap(Map<byte[], NavigableSet<byte[]>>) - Method in class org.apache.hadoop.hbase.client.Scan
-
Setting the familyMap
- setFilter(Filter) - Method in class org.apache.hadoop.hbase.client.Get
-
- setFilter(Filter) - Method in class org.apache.hadoop.hbase.client.Query
-
Apply the specified server-side filter when performing the Query.
- setFilter(Filter) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setFilterIfMissing(boolean) - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
Set whether entire row should be filtered if column is not found.
- setFirst(T1) - Method in class org.apache.hadoop.hbase.util.Pair
-
Replace the first element of the pair.
- setFlushPolicyClassName(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
This sets the class associated with the flush policy, which determines the stores that
need to be flushed when flushing a region.
- setFoundKV(boolean) - Method in class org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter
-
- setHeaders(Header[]) - Method in class org.apache.hadoop.hbase.rest.client.Response
-
- setHTable(HTable) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
- setHTable(Table) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- setHTable(Table) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- setHTable(HTable) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
- setHTable(Table) - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
Deprecated.
Use setTable() instead.
- setHTable(Table) - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
Sets the HBase table.
- setId(String) - Method in class org.apache.hadoop.hbase.client.Append
-
- setId(String) - Method in class org.apache.hadoop.hbase.client.Delete
-
- setId(String) - Method in class org.apache.hadoop.hbase.client.Get
-
- setId(String) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setId(String) - Method in class org.apache.hadoop.hbase.client.OperationWithAttributes
-
This method allows you to set an identifier on an operation.
- setId(String) - Method in class org.apache.hadoop.hbase.client.Put
-
- setId(String) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setInMemory(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setInput(Configuration, Map<String, Collection<Scan>>, Path) - Static method in class org.apache.hadoop.hbase.mapred.MultiTableSnapshotInputFormat
-
Configure conf to read from snapshotScans, with snapshots restored to a subdirectory of
restoreDir.
- setInput(JobConf, String, Path) - Static method in class org.apache.hadoop.hbase.mapred.TableSnapshotInputFormat
-
Configures the job to use TableSnapshotInputFormat to read from a snapshot.
- setInput(Configuration, Map<String, Collection<Scan>>, Path) - Static method in class org.apache.hadoop.hbase.mapreduce.MultiTableSnapshotInputFormat
-
- setInput(Job, String, Path) - Static method in class org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormat
-
Configures the job to use TableSnapshotInputFormat to read from a snapshot.
- setInputColumns(byte[][]) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
- setInputColumns(byte[][]) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- setInputColumns(byte[][]) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- setIsolationLevel(IsolationLevel) - Method in class org.apache.hadoop.hbase.client.Get
-
- setIsolationLevel(IsolationLevel) - Method in class org.apache.hadoop.hbase.client.Query
-
Set the isolation level for this query.
- setIsolationLevel(IsolationLevel) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setIv(byte[]) - Method in interface org.apache.hadoop.hbase.io.crypto.Decryptor
-
Set the initialization vector
- setIv(byte[]) - Method in interface org.apache.hadoop.hbase.io.crypto.Encryptor
-
Set the initialization vector
- setKeepDeletedCells(KeepDeletedCells) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setKey(Key) - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- setKey(Key) - Method in interface org.apache.hadoop.hbase.io.crypto.Decryptor
-
Set the secret key
- setKey(Key) - Method in class org.apache.hadoop.hbase.io.crypto.Encryption.Context
-
- setKey(byte[]) - Method in class org.apache.hadoop.hbase.io.crypto.Encryption.Context
-
- setKey(Key) - Method in interface org.apache.hadoop.hbase.io.crypto.Encryptor
-
Set the secret key
- setKeyValues(Configuration, String, Collection<Map.Entry<String, String>>) - Static method in class org.apache.hadoop.hbase.util.ConfigurationUtil
-
- setKeyValues(Configuration, String, Collection<Map.Entry<String, String>>, char) - Static method in class org.apache.hadoop.hbase.util.ConfigurationUtil
-
Store a collection of Map.Entry's in conf, with each entry separated by ','
and key values delimited by delimiter.
- setLatestVersionOnly(boolean) - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
Set whether only the latest version of the column value should be compared.
- setLength(int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Update the length of this range.
- setLength(int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- setLength(int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
Update the length of this range.
- setLimit(int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Limits the byte range up to a specified value.
- setLimit(int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- setLoadColumnFamiliesOnDemand(boolean) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set the value indicating whether loading CFs on demand should be allowed (cluster
default is false).
- setMaxFileSize(long) - Method in class org.apache.hadoop.hbase.client.UnmodifyableHTableDescriptor
-
- setMaxFileSize(long) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Sets the maximum size up to which a region can grow, after which a region
split is triggered.
- setMaxResultSize(long) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set the maximum result size.
- setMaxResultsPerColumnFamily(int) - Method in class org.apache.hadoop.hbase.client.Get
-
Set the maximum number of values to return per row per Column Family
- setMaxResultsPerColumnFamily(int) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set the maximum number of values to return per row per Column Family
- setMaxVersions() - Method in class org.apache.hadoop.hbase.client.Get
-
Get all available versions.
- setMaxVersions(int) - Method in class org.apache.hadoop.hbase.client.Get
-
Get up to the specified number of versions of each column.
- setMaxVersions() - Method in class org.apache.hadoop.hbase.client.Scan
-
Get all available versions.
- setMaxVersions(int) - Method in class org.apache.hadoop.hbase.client.Scan
-
Get up to the specified number of versions of each column.
- setMaxVersions(int) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setMemStoreFlushSize(long) - Method in class org.apache.hadoop.hbase.client.UnmodifyableHTableDescriptor
-
- setMemStoreFlushSize(long) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Represents the maximum size of the memstore after which the contents of the
memstore are flushed to the filesystem.
- setMetaRegion(boolean) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Used to denote if the current table represents -ROOT- or hbase:meta region.
- setMinVersions(int) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setMobEnabled(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Enables the mob for the family.
- setMobThreshold(long) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Sets the mob threshold of the family.
- setName(byte[]) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
- setName(TableName) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
- setNamespaceFilter(String) - Method in class org.apache.hadoop.hbase.quotas.QuotaFilter
-
Set the namespace filter regex
- setNonceKey(NonceKey) - Method in class org.apache.hadoop.hbase.ProcedureInfo
-
- setNormalizationEnabled(boolean) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Setting the table normalization enable flag.
- setNormalizerRunning(boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Turn region normalizer on or off.
- setNumMapTasks(String, JobConf) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Sets the number of map tasks for the given job configuration to the
number of regions the given table has.
- setNumReduceTasks(String, JobConf) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Sets the number of reduce tasks for the given job configuration to the
number of regions the given table has.
- setNumReduceTasks(String, Job) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Sets the number of reduce tasks for the given job configuration to the
number of regions the given table has.
- setOffline(boolean) - Method in class org.apache.hadoop.hbase.HRegionInfo
-
The parent of a region split is offline while split daughters hold
references to the parent.
- setOffset(int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Update the beginning of this range.
- setOffset(int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- setOffset(int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
Update the beginning of this range.
- setOwner(User) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
- setOwnerString(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Deprecated.
- setPeerTableCFs(String, String) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
- setPeerTableCFs(String, Map<TableName, ? extends Collection<String>>) - Method in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
Set the replicable table-cf config of the specified peer
- setPosition(int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
Update the position index.
- setPrefetchBlocksOnOpen(boolean) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setQuota(QuotaSettings) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Apply the new quota settings.
- setRaw(boolean) - Method in class org.apache.hadoop.hbase.client.Scan
-
Enable/disable "raw" mode for this scan.
- setReadOnly(boolean) - Method in class org.apache.hadoop.hbase.client.UnmodifyableHTableDescriptor
-
- setReadOnly(boolean) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Setting the table as read only sets all the columns in the table as read
only.
- setRegionCachePrefetch(TableName, boolean) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
does nothing since 0.99
- setRegionCachePrefetch(byte[], boolean) - Method in interface org.apache.hadoop.hbase.client.HConnection
-
Deprecated.
does nothing since 0.99
- setRegionMemstoreReplication(boolean) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Enable or Disable the memstore replication from the primary region to the replicas.
- setRegionReplication(int) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
Sets the number of replicas per region.
- setRegionSplitPolicyClassName(String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
This sets the class associated with the region split policy which
determines when a region split should occur.
- setReplicaId(int) - Method in class org.apache.hadoop.hbase.client.Get
-
- setReplicaId(int) - Method in class org.apache.hadoop.hbase.client.Query
-
Specify region replica id where Query will fetch data from.
- setReplicaId(int) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setReplicationEndpointImpl(String) - Method in class org.apache.hadoop.hbase.replication.ReplicationPeerConfig
-
Sets the ReplicationEndpoint plugin class for this peer.
- setReturnResults(boolean) - Method in class org.apache.hadoop.hbase.client.Append
-
- setReturnResults(boolean) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setReversed(boolean) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set whether this scan is a reversed one
- setReversed(boolean) - Method in class org.apache.hadoop.hbase.filter.Filter
-
alter the reversed scan flag
- setReversed(boolean) - Method in class org.apache.hadoop.hbase.filter.FilterList
-
- setRootRegion(boolean) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
INTERNAL Used to denote if the current table represents the -ROOT- region.
- setRowFilter(Filter) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
Allows subclasses to set the Filter to be used.
- setRowFilter(Filter) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- setRowFilter(Filter) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- setRowOffsetPerColumnFamily(int) - Method in class org.apache.hadoop.hbase.client.Get
-
Set offset for the row per Column Family.
- setRowOffsetPerColumnFamily(int) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set offset for the row per Column Family.
- setRowPrefixFilter(byte[]) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set a filter (using stopRow and startRow) so the result set only contains rows where the
rowKey starts with the specified prefix.
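As a minimal sketch (the prefix below is hypothetical), setRowPrefixFilter configures startRow and stopRow so only rows beginning with the given prefix are returned:

    import java.io.IOException;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class PrefixScanSketch {
      // Scans all rows whose key starts with "user|42|" (a hypothetical prefix).
      static void scanByPrefix(Table table) throws IOException {
        Scan scan = new Scan();
        scan.setRowPrefixFilter(Bytes.toBytes("user|42|"));  // derives startRow and stopRow internally
        try (ResultScanner rs = table.getScanner(scan)) {
          for (Result r : rs) {
            System.out.println(Bytes.toString(r.getRow()));
          }
        }
      }
    }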
- setScan(Scan) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Sets the scan defining the actual details like columns etc.
- setScan(Scan) - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
Sets the scan defining the actual details like columns etc.
- setScan(Scan) - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
Sets the scan defining the actual details like columns etc.
- setScanMetricsEnabled(boolean) - Method in class org.apache.hadoop.hbase.client.Scan
-
- setScannerCaching(JobConf, int) - Static method in class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
Sets the number of rows to return and cache with each scanner iteration.
- setScannerCaching(Job, int) - Static method in class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
Sets the number of rows to return and cache with each scanner iteration.
- setScans(List<Scan>) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormatBase
-
Allows subclasses to set the list of Scan objects.
- setScope(int) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setSecond(T2) - Method in class org.apache.hadoop.hbase.util.Pair
-
Replace the second element of the pair.
- setSequenceId(Cell, long) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Sets the given seqId to the cell.
- setSmall(boolean) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set whether this scan is a small scan
- setSplit(boolean) - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- setStartKey(Configuration, byte[]) - Static method in class org.apache.hadoop.hbase.mapreduce.SimpleTotalOrderPartitioner
-
- setStartRow(byte[]) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set the start row of the scan.
- setStartRow(byte[]) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- setStartRow(byte[]) - Method in class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- setStatistics(ClientProtos.RegionLoadStats) - Method in class org.apache.hadoop.hbase.client.Result
-
Set the load information about the region that produced this result
- setStopRow(byte[]) - Method in class org.apache.hadoop.hbase.client.Scan
-
Set the stop row of the scan.
- setTable(Table) - Method in class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
- setTableFilter(String) - Method in class org.apache.hadoop.hbase.quotas.QuotaFilter
-
Set the table filter regex
- setTableRecordReader(TableRecordReader) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
- setTableRecordReader(TableRecordReader) - Method in class org.apache.hadoop.hbase.mapreduce.MultiTableInputFormatBase
-
- setTableRecordReader(TableRecordReader) - Method in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
- setTickTime(int) - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- setTimeRange(long, long) - Method in class org.apache.hadoop.hbase.client.Get
-
Get versions of columns only within the specified timestamp range,
[minStamp, maxStamp).
- setTimeRange(long, long) - Method in class org.apache.hadoop.hbase.client.Increment
-
Sets the TimeRange to be used on the Get for this increment.
- setTimeRange(long, long) - Method in class org.apache.hadoop.hbase.client.Scan
-
Get versions of columns only within the specified timestamp range,
[minStamp, maxStamp).
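Putting the bounded-scan setters together, here is a minimal sketch (row keys and timestamps are hypothetical) combining setStartRow, setStopRow, and setTimeRange; the start row and minStamp are inclusive, the stop row and maxStamp exclusive:

    import java.io.IOException;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class BoundedScanSketch {
      static void scanWindow(Table table) throws IOException {
        Scan scan = new Scan();
        scan.setStartRow(Bytes.toBytes("row0100"));         // inclusive, hypothetical key
        scan.setStopRow(Bytes.toBytes("row0200"));           // exclusive, hypothetical key
        scan.setTimeRange(1400000000000L, 1500000000000L);   // [minStamp, maxStamp)
        try (ResultScanner rs = table.getScanner(scan)) {
          for (Result r : rs) {
            System.out.println(r);
          }
        }
      }
    }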
- setTimestamp(Cell, long) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Sets the given timestamp to the cell.
- setTimestamp(Cell, byte[], int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Sets the given timestamp to the cell.
- setTimestamp(long) - Method in class org.apache.hadoop.hbase.client.Delete
-
Set the timestamp of the delete.
- setTimeStamp(long) - Method in class org.apache.hadoop.hbase.client.Get
-
Get versions of columns with the specified timestamp.
- setTimeStamp(long) - Method in class org.apache.hadoop.hbase.client.Scan
-
Get versions of columns with the specified timestamp.
- setTimeToLive(int) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setTimeToLive(String) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setTTL(long) - Method in class org.apache.hadoop.hbase.client.Append
-
- setTTL(long) - Method in class org.apache.hadoop.hbase.client.Delete
-
- setTTL(long) - Method in class org.apache.hadoop.hbase.client.Increment
-
- setTTL(long) - Method in class org.apache.hadoop.hbase.client.Mutation
-
Set the TTL desired for the result of the mutation, in milliseconds.
- setTTL(long) - Method in class org.apache.hadoop.hbase.client.Put
-
- setup(Reducer<ImmutableBytesWritable, Text, ImmutableBytesWritable, KeyValue>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.TextSortReducer
-
Handles initializing this class with objects specific to it (i.e., the parser).
- setup(Mapper<LongWritable, Text, ImmutableBytesWritable, Put>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
Handles initializing this class with objects specific to it (i.e., the parser).
- setup(Mapper<LongWritable, Text, ImmutableBytesWritable, Text>.Context) - Method in class org.apache.hadoop.hbase.mapreduce.TsvImporterTextMapper
-
Handles initializing this class with objects specific to it (i.e., the parser).
- setupJob(JobContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputCommitter
-
- setupSetQuotaRequest(MasterProtos.SetQuotaRequest.Builder) - Method in class org.apache.hadoop.hbase.quotas.QuotaSettings
-
Called by toSetQuotaRequestProto()
the subclass should implement this method to set the specific SetQuotaRequest
properties.
- setupTask(TaskAttemptContext) - Method in class org.apache.hadoop.hbase.mapreduce.TableOutputCommitter
-
- setUserFilter(String) - Method in class org.apache.hadoop.hbase.quotas.QuotaFilter
-
Set the user filter regex
- setValue(byte[], byte[]) - Method in class org.apache.hadoop.hbase.client.UnmodifyableHTableDescriptor
-
- setValue(String, String) - Method in class org.apache.hadoop.hbase.client.UnmodifyableHTableDescriptor
-
- setValue(byte[], byte[]) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setValue(String, String) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- setValue(byte[], byte[]) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- setValue(Bytes, Bytes) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- setValue(String, String) - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- setVersions(int, int) - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
Set minimum and maximum versions to keep
- setWriteBufferSize(long) - Method in interface org.apache.hadoop.hbase.client.Table
-
- setWriteBufferSize(long) - Method in class org.apache.hadoop.hbase.rest.client.RemoteHTable
-
- shallowCopy() - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Create a new ByteRange
that points at this range's byte[].
- shallowCopy() - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- shallowCopy() - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- shallowCopy() - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- shallowCopy() - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- shallowCopy() - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- shallowCopySubRange(int, int) - Method in interface org.apache.hadoop.hbase.util.ByteRange
-
Create a new ByteRange
that points at this range's byte[].
- shallowCopySubRange(int, int) - Method in interface org.apache.hadoop.hbase.util.PositionedByteRange
-
- shallowCopySubRange(int, int) - Method in class org.apache.hadoop.hbase.util.SimpleByteRange
-
- shallowCopySubRange(int, int) - Method in class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
- shallowCopySubRange(int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
- shallowCopySubRange(int, int) - Method in class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
- SHUFFLE_MAPS - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
Specify if we have to shuffle the map tasks.
- shutdown() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Shuts down the HBase cluster
- shutdown() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
Shut down the mini HBase cluster
- shutdown() - Method in class org.apache.hadoop.hbase.rest.client.Client
-
Shut down the client.
- shutdown() - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- SimpleByteRange - Class in org.apache.hadoop.hbase.util
-
- SimpleByteRange() - Constructor for class org.apache.hadoop.hbase.util.SimpleByteRange
-
- SimpleByteRange(int) - Constructor for class org.apache.hadoop.hbase.util.SimpleByteRange
-
- SimpleByteRange(byte[]) - Constructor for class org.apache.hadoop.hbase.util.SimpleByteRange
-
Create a new ByteRange over the provided bytes.
- SimpleByteRange(byte[], int, int) - Constructor for class org.apache.hadoop.hbase.util.SimpleByteRange
-
Create a new ByteRange over the provided bytes.
- SimpleMutableByteRange - Class in org.apache.hadoop.hbase.util
-
- SimpleMutableByteRange() - Constructor for class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
Create a new ByteRange
lacking a backing array and with an
undefined viewport.
- SimpleMutableByteRange(int) - Constructor for class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
Create a new ByteRange over a new backing array of size capacity.
- SimpleMutableByteRange(byte[]) - Constructor for class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
Create a new ByteRange over the provided bytes.
- SimpleMutableByteRange(byte[], int, int) - Constructor for class org.apache.hadoop.hbase.util.SimpleMutableByteRange
-
Create a new ByteRange over the provided bytes.
- SimplePositionedByteRange - Class in org.apache.hadoop.hbase.util
-
- SimplePositionedByteRange() - Constructor for class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
Create a new PositionedByteRange
lacking a backing array and with
an undefined viewport.
- SimplePositionedByteRange(int) - Constructor for class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
Create a new PositionedByteRange over a new backing array of size capacity.
- SimplePositionedByteRange(byte[]) - Constructor for class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
Create a new PositionedByteRange over the provided bytes.
- SimplePositionedByteRange(byte[], int, int) - Constructor for class org.apache.hadoop.hbase.util.SimplePositionedByteRange
-
Create a new PositionedByteRange over the provided bytes.
- SimplePositionedMutableByteRange - Class in org.apache.hadoop.hbase.util
-
Extends the basic AbstractPositionedByteRange implementation with position support; this is the mutable version.
- SimplePositionedMutableByteRange() - Constructor for class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
Create a new PositionedByteRange
lacking a backing array and with
an undefined viewport.
- SimplePositionedMutableByteRange(int) - Constructor for class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
Create a new PositionedByteRange over a new backing array of size capacity.
- SimplePositionedMutableByteRange(byte[]) - Constructor for class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
Create a new PositionedByteRange over the provided bytes.
- SimplePositionedMutableByteRange(byte[], int, int) - Constructor for class org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange
-
Create a new PositionedByteRange over the provided bytes.
- SimpleTotalOrderPartitioner<VALUE> - Class in org.apache.hadoop.hbase.mapreduce
-
A partitioner that takes start and end keys and uses BigDecimal to figure out which reducer a key belongs to.
- SimpleTotalOrderPartitioner() - Constructor for class org.apache.hadoop.hbase.mapreduce.SimpleTotalOrderPartitioner
-
- SINGLE_QUOTE - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for a single quote
- SingleColumnValueExcludeFilter - Class in org.apache.hadoop.hbase.filter
-
A Filter that checks a single column value, but does not emit the tested column.
- SingleColumnValueExcludeFilter(byte[], byte[], CompareFilter.CompareOp, byte[]) - Constructor for class org.apache.hadoop.hbase.filter.SingleColumnValueExcludeFilter
-
Constructor for binary compare of the value of a single column.
- SingleColumnValueExcludeFilter(byte[], byte[], CompareFilter.CompareOp, ByteArrayComparable) - Constructor for class org.apache.hadoop.hbase.filter.SingleColumnValueExcludeFilter
-
Constructor for binary compare of the value of a single column.
- SingleColumnValueExcludeFilter(byte[], byte[], CompareFilter.CompareOp, ByteArrayComparable, boolean, boolean) - Constructor for class org.apache.hadoop.hbase.filter.SingleColumnValueExcludeFilter
-
Constructor for protobuf deserialization only.
- SingleColumnValueFilter - Class in org.apache.hadoop.hbase.filter
-
This filter is used to filter cells based on value.
- SingleColumnValueFilter(byte[], byte[], CompareFilter.CompareOp, byte[]) - Constructor for class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
Constructor for binary compare of the value of a single column.
- SingleColumnValueFilter(byte[], byte[], CompareFilter.CompareOp, ByteArrayComparable) - Constructor for class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
Constructor for binary compare of the value of a single column.
- SingleColumnValueFilter(byte[], byte[], CompareFilter.CompareOp, ByteArrayComparable, boolean, boolean) - Constructor for class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
Constructor for protobuf deserialization only.
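A minimal sketch of the binary-compare constructor in use (the family, qualifier, and value below are hypothetical); setFilterIfMissing(true) drops rows that lack the tested column entirely:

    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.CompareFilter;
    import org.apache.hadoop.hbase.filter.SingleColumnValueFilter;
    import org.apache.hadoop.hbase.util.Bytes;

    public class ValueFilterSketch {
      static Scan scanWhereStatusIsActive() {
        SingleColumnValueFilter filter = new SingleColumnValueFilter(
            Bytes.toBytes("cf"),              // hypothetical column family
            Bytes.toBytes("status"),          // hypothetical qualifier
            CompareFilter.CompareOp.EQUAL,
            Bytes.toBytes("active"));         // value to compare against
        filter.setFilterIfMissing(true);      // skip rows without the column
        Scan scan = new Scan();
        scan.setFilter(filter);
        return scan;
      }
    }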
- size() - Method in class org.apache.hadoop.hbase.client.Mutation
-
Number of KeyValues carried by this Mutation.
- size() - Method in class org.apache.hadoop.hbase.client.Result
-
- size() - Method in class org.apache.hadoop.hbase.io.ByteBufferOutputStream
-
- SIZEOF_BOOLEAN - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Size of boolean in bytes
- SIZEOF_BYTE - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Size of byte in bytes
- SIZEOF_CHAR - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Size of char in bytes
- SIZEOF_DOUBLE - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Size of double in bytes
- SIZEOF_FLOAT - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Size of float in bytes
- SIZEOF_INT - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Size of int in bytes
- SIZEOF_LONG - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Size of long in bytes
- SIZEOF_SHORT - Static variable in class org.apache.hadoop.hbase.util.Bytes
-
Size of short in bytes
- sizeToString(long) - Static method in class org.apache.hadoop.hbase.quotas.QuotaSettings
-
- skip(PositionedByteRange) - Method in interface org.apache.hadoop.hbase.types.DataType
-
Skip src's position forward over one encoded value.
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.FixedLengthWrapper
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.OrderedBytesBase
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawByte
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawBytes
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawDouble
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawFloat
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawInteger
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawLong
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawShort
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.RawString
-
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.Struct
-
- skip() - Method in class org.apache.hadoop.hbase.types.StructIterator
-
Bypass the next encoded value.
- skip(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
Skip src's position forward over one encoded value.
- skip(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Increment position in buffer.
- skip(PositionedByteRange) - Static method in class org.apache.hadoop.hbase.util.OrderedBytes
-
Skip buff's position forward over one encoded value.
- SKIP_ARRAY - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
SKIP Array
- SKIP_BUFFER - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
- SKIP_LINES_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- SkipFilter - Class in org.apache.hadoop.hbase.filter
-
A wrapper filter that filters an entire row if any of the Cell checks do
not pass.
- SkipFilter(Filter) - Constructor for class org.apache.hadoop.hbase.filter.SkipFilter
-
- snapshot(String, TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Take a snapshot for the given table.
- snapshot(byte[], TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Create a timestamp consistent snapshot for the given table.
- snapshot(String, TableName, HBaseProtos.SnapshotDescription.Type) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Create typed snapshot of the table.
- snapshot(HBaseProtos.SnapshotDescription) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Take a snapshot and wait for the server to complete that snapshot (blocking).
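As a minimal sketch (the snapshot and table names below are hypothetical), the blocking String/TableName variant can be driven through an Admin obtained from a Connection:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;

    public class SnapshotSketch {
      public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Admin admin = conn.getAdmin()) {
          // Blocks until the snapshot completes; both names are hypothetical.
          admin.snapshot("orders_snapshot_20240101", TableName.valueOf("orders"));
        }
      }
    }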
- SNAPSHOT_DIR_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
Name of the directory to store all snapshots.
- SnapshotCreationException - Exception in org.apache.hadoop.hbase.snapshot
-
Thrown when a snapshot could not be created due to a server-side error when
taking the snapshot.
- SnapshotCreationException(String) - Constructor for exception org.apache.hadoop.hbase.snapshot.SnapshotCreationException
-
Used internally by the RPC engine to pass the exception back to the client.
- SnapshotCreationException(String, HBaseProtos.SnapshotDescription) - Constructor for exception org.apache.hadoop.hbase.snapshot.SnapshotCreationException
-
Failure to create the specified snapshot
- SnapshotCreationException(String, Throwable, HBaseProtos.SnapshotDescription) - Constructor for exception org.apache.hadoop.hbase.snapshot.SnapshotCreationException
-
Failure to create the specified snapshot due to an external cause
- SnapshotDoesNotExistException - Exception in org.apache.hadoop.hbase.snapshot
-
Thrown when the server is looking for a snapshot but can't find the snapshot on the filesystem
- SnapshotDoesNotExistException(String) - Constructor for exception org.apache.hadoop.hbase.snapshot.SnapshotDoesNotExistException
-
- SnapshotDoesNotExistException(HBaseProtos.SnapshotDescription) - Constructor for exception org.apache.hadoop.hbase.snapshot.SnapshotDoesNotExistException
-
- SnapshotExistsException - Exception in org.apache.hadoop.hbase.snapshot
-
Thrown when a snapshot exists but should not
- SnapshotExistsException(String) - Constructor for exception org.apache.hadoop.hbase.snapshot.SnapshotExistsException
-
- SnapshotExistsException(String, HBaseProtos.SnapshotDescription) - Constructor for exception org.apache.hadoop.hbase.snapshot.SnapshotExistsException
-
Failure due to the snapshot already existing
- SnapshotInfo - Class in org.apache.hadoop.hbase.snapshot
-
Tool for dumping snapshot information.
- SnapshotInfo() - Constructor for class org.apache.hadoop.hbase.snapshot.SnapshotInfo
-
- SOCKET_RETRY_WAIT_MS - Static variable in class org.apache.hadoop.hbase.HConstants
-
The delay when re-trying a socket operation in a loop (HBASE-4712)
- sortAndMerge(List<MultiRowRangeFilter.RowRange>) - Static method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter
-
Sort the ranges and, if any of them overlap, merge them.
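For context, a minimal sketch constructing a MultiRowRangeFilter over two hypothetical row ranges; the filter sorts and merges overlapping ranges internally, and the constructor may throw IOException when given invalid ranges:

    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.filter.MultiRowRangeFilter;

    public class RowRangeSketch {
      static Scan scanTwoRanges() throws IOException {
        List<MultiRowRangeFilter.RowRange> ranges = new ArrayList<>();
        // Hypothetical ranges: [a, d) and [m, p), start-inclusive, stop-exclusive.
        ranges.add(new MultiRowRangeFilter.RowRange("a", true, "d", false));
        ranges.add(new MultiRowRangeFilter.RowRange("m", true, "p", false));
        Scan scan = new Scan();
        scan.setFilter(new MultiRowRangeFilter(ranges));
        return scan;
      }
    }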
- sortedPrefixes - Variable in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- split(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Split a table.
- split(TableName, byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Split a table.
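A minimal sketch of an explicit split (the table name and split point are hypothetical); without a split point the region server picks the split key itself:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.util.Bytes;

    public class SplitSketch {
      public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Admin admin = conn.getAdmin()) {
          // Request a split of table "orders" at the hypothetical row key "row5000".
          admin.split(TableName.valueOf("orders"), Bytes.toBytes("row5000"));
        }
      }
    }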
- split(byte[], byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Split passed range.
- split(byte[], byte[], boolean, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Split passed range.
- SPLIT_LOGDIR_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
Used to construct the name of the splitlog directory for a region server
- SPLIT_POLICY - Static variable in class org.apache.hadoop.hbase.HTableDescriptor
-
- SPLITA_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The lower-half split region column qualifier
- SPLITB_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The upper-half split region column qualifier
- splitRegion(byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Split an individual region.
- splitRegion(byte[], byte[]) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Split an individual region.
- splitStoreFile(LoadIncrementalHFiles.LoadQueueItem, Table, byte[], byte[]) - Method in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
- src - Variable in class org.apache.hadoop.hbase.types.StructIterator
-
- stampSet - Variable in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- START - Static variable in class org.apache.hadoop.hbase.mapreduce.SimpleTotalOrderPartitioner
-
Deprecated.
- START_TIME_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.HLogInputFormat
-
Deprecated.
- START_TIME_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.WALInputFormat
-
- STARTCODE_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The startcode column qualifier
- STARTCODE_QUALIFIER_STR - Static variable in class org.apache.hadoop.hbase.HConstants
-
The startcode column qualifier
- startsWith(byte[], byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Return true if the byte array on the right is a prefix of the byte
array on the left.
- startup() - Method in class org.apache.hadoop.hbase.LocalHBaseCluster
-
Start the cluster.
- startup(File) - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- startup(File, int) - Method in class org.apache.hadoop.hbase.zookeeper.MiniZooKeeperCluster
-
- STATE_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
- STATE_QUALIFIER_STR - Static variable in class org.apache.hadoop.hbase.HConstants
-
The state column qualifier
- STATUS_MULTICAST_ADDRESS - Static variable in class org.apache.hadoop.hbase.HConstants
-
IP to use for the multicast status messages between the master and the clients.
- STATUS_MULTICAST_BIND_ADDRESS - Static variable in class org.apache.hadoop.hbase.HConstants
-
The address to use for binding the local socket for receiving multicast.
- STATUS_MULTICAST_PORT - Static variable in class org.apache.hadoop.hbase.HConstants
-
The port to use for the multicast messages.
- STATUS_PUBLISHED - Static variable in class org.apache.hadoop.hbase.HConstants
-
Setting to activate, or not, the publication of the status by the master.
- STATUS_PUBLISHED_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
- stopMaster() - Method in interface org.apache.hadoop.hbase.client.Admin
-
Shuts down the current HBase master only.
- StoppedRpcClientException - Exception in org.apache.hadoop.hbase.ipc
-
- StoppedRpcClientException() - Constructor for exception org.apache.hadoop.hbase.ipc.StoppedRpcClientException
-
- StoppedRpcClientException(String) - Constructor for exception org.apache.hadoop.hbase.ipc.StoppedRpcClientException
-
- stopRegionServer(String) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Stop the designated regionserver
- store - Variable in class org.apache.hadoop.hbase.io.crypto.KeyStoreKeyProvider
-
- Struct - Class in org.apache.hadoop.hbase.types
-
Struct is a simple DataType for implementing "compound rowkey" and "compound qualifier" schema design strategies.
- Struct(DataType[]) - Constructor for class org.apache.hadoop.hbase.types.Struct
-
Create a new Struct instance defined as the sequence of DataTypes in memberTypes.
- StructBuilder - Class in org.apache.hadoop.hbase.types
-
A helper for building Struct instances.
- StructBuilder() - Constructor for class org.apache.hadoop.hbase.types.StructBuilder
-
Create an empty StructBuilder.
- StructIterator - Class in org.apache.hadoop.hbase.types
-
An Iterator over encoded Struct members.
- StructIterator(PositionedByteRange, DataType[]) - Constructor for class org.apache.hadoop.hbase.types.StructIterator
-
Construct StructIterator over the values encoded in src using the specified types definition.
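Tying these together, a minimal sketch of a compound rowkey built from an OrderedString and an OrderedInt32 (the field choices and buffer size are hypothetical); StructBuilder accumulates the field types, Struct encodes and decodes, and StructIterator underlies the decode path:

    import org.apache.hadoop.hbase.types.OrderedInt32;
    import org.apache.hadoop.hbase.types.OrderedString;
    import org.apache.hadoop.hbase.types.Struct;
    import org.apache.hadoop.hbase.types.StructBuilder;
    import org.apache.hadoop.hbase.util.PositionedByteRange;
    import org.apache.hadoop.hbase.util.SimplePositionedMutableByteRange;

    public class CompoundRowkeySketch {
      public static void main(String[] args) {
        Struct shape = new StructBuilder()
            .add(OrderedString.ASCENDING)   // e.g. a user id
            .add(OrderedInt32.ASCENDING)    // e.g. an event sequence number
            .toStruct();

        PositionedByteRange buf = new SimplePositionedMutableByteRange(64);  // hypothetical capacity
        shape.encode(buf, new Object[] { "user-42", 7 });

        buf.setPosition(0);
        Object[] decoded = shape.decode(buf);
        System.out.println(decoded[0] + " / " + decoded[1]);   // user-42 / 7
      }
    }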
- SubstringComparator - Class in org.apache.hadoop.hbase.filter
-
This comparator is for use with SingleColumnValueFilter, for filtering based on
the value of a given column.
- SubstringComparator(String) - Constructor for class org.apache.hadoop.hbase.filter.SubstringComparator
-
Constructor
- substringType - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
SubstringType byte array
- sumOfMillisSecBetweenNexts - Variable in class org.apache.hadoop.hbase.client.metrics.ScanMetrics
-
sum of milliseconds between sequential next calls
- suspendEncoding() - Method in class org.apache.hadoop.hbase.util.Base64.Base64OutputStream
-
Suspends encoding of the stream.
- Sweeper - Class in org.apache.hadoop.hbase.mob.mapreduce
-
The sweep tool.
- Sweeper() - Constructor for class org.apache.hadoop.hbase.mob.mapreduce.Sweeper
-
- SYSTEM_NAMESPACE - Static variable in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- SYSTEM_NAMESPACE_NAME - Static variable in class org.apache.hadoop.hbase.NamespaceDescriptor
-
System namespace name.
- SYSTEM_NAMESPACE_NAME_STR - Static variable in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- SYSTEMTABLE_QOS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- TAB - Static variable in class org.apache.hadoop.hbase.filter.ParseConstants
-
ASCII code for tab
- Table - Interface in org.apache.hadoop.hbase.client
-
Used to communicate with a single HBase table.
- TABLE_FAMILY - Static variable in class org.apache.hadoop.hbase.HConstants
-
The catalog family
- TABLE_FAMILY_STR - Static variable in class org.apache.hadoop.hbase.HConstants
-
The catalog family as a string
- TABLE_MAX_ROWSIZE_DEFAULT - Static variable in class org.apache.hadoop.hbase.HConstants
-
Default max row size (1 Gb).
- TABLE_MAX_ROWSIZE_KEY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Max size of a single row for Gets or Scans without the in-row scanning flag set.
- TABLE_MULTIPLEXER_FLUSH_PERIOD_MS - Static variable in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
- TABLE_MULTIPLEXER_INIT_THREADS - Static variable in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
- TABLE_MULTIPLEXER_MAX_RETRIES_IN_QUEUE - Static variable in class org.apache.hadoop.hbase.client.HTableMultiplexer
-
- TABLE_NAME - Static variable in class org.apache.hadoop.hbase.mapreduce.Import
-
- TABLE_ROW_TEXTKEY - Static variable in class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
Specify whether the row key in the table is text (ASCII between 32 and 126);
default is true.
- TABLE_STATE_QUALIFIER - Static variable in class org.apache.hadoop.hbase.HConstants
-
The serialized table state qualifier
- tableExists(TableName) - Method in interface org.apache.hadoop.hbase.client.Admin
-
- TableExistsException - Exception in org.apache.hadoop.hbase
-
Thrown when a table exists but should not
- TableExistsException() - Constructor for exception org.apache.hadoop.hbase.TableExistsException
-
default constructor
- TableExistsException(String) - Constructor for exception org.apache.hadoop.hbase.TableExistsException
-
Constructor
- TableExistsException(TableName) - Constructor for exception org.apache.hadoop.hbase.TableExistsException
-
- TableInfoMissingException - Exception in org.apache.hadoop.hbase
-
Failed to find .tableinfo file under table dir
- TableInfoMissingException() - Constructor for exception org.apache.hadoop.hbase.TableInfoMissingException
-
- TableInfoMissingException(String) - Constructor for exception org.apache.hadoop.hbase.TableInfoMissingException
-
- TableInfoMissingException(String, Throwable) - Constructor for exception org.apache.hadoop.hbase.TableInfoMissingException
-
- TableInfoMissingException(Throwable) - Constructor for exception org.apache.hadoop.hbase.TableInfoMissingException
-
- TableInputFormat - Class in org.apache.hadoop.hbase.mapred
-
Convert HBase tabular data into a format that is consumable by Map/Reduce.
- TableInputFormat() - Constructor for class org.apache.hadoop.hbase.mapred.TableInputFormat
-
- TableInputFormat - Class in org.apache.hadoop.hbase.mapreduce
-
Convert HBase tabular data into a format that is consumable by Map/Reduce.
- TableInputFormat() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableInputFormat
-
- TableInputFormatBase - Class in org.apache.hadoop.hbase.mapred
-
- TableInputFormatBase() - Constructor for class org.apache.hadoop.hbase.mapred.TableInputFormatBase
-
- TableInputFormatBase - Class in org.apache.hadoop.hbase.mapreduce
-
- TableInputFormatBase() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableInputFormatBase
-
- TableMap<K extends org.apache.hadoop.io.WritableComparable<? super K>,V> - Interface in org.apache.hadoop.hbase.mapred
-
Scan an HBase table to sort by a specified sort column.
- TableMapper<KEYOUT,VALUEOUT> - Class in org.apache.hadoop.hbase.mapreduce
-
Extends the base Mapper
class to add the required input key
and value classes.
- TableMapper() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableMapper
-
- TableMapReduceUtil - Class in org.apache.hadoop.hbase.mapred
-
- TableMapReduceUtil() - Constructor for class org.apache.hadoop.hbase.mapred.TableMapReduceUtil
-
- TableMapReduceUtil - Class in org.apache.hadoop.hbase.mapreduce
-
- TableMapReduceUtil() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
-
- TableName - Class in org.apache.hadoop.hbase
-
Immutable POJO class for representing a table name.
- TableNotDisabledException - Exception in org.apache.hadoop.hbase
-
Thrown if a table should be offline but is not
- TableNotDisabledException() - Constructor for exception org.apache.hadoop.hbase.TableNotDisabledException
-
default constructor
- TableNotDisabledException(String) - Constructor for exception org.apache.hadoop.hbase.TableNotDisabledException
-
Constructor
- TableNotDisabledException(byte[]) - Constructor for exception org.apache.hadoop.hbase.TableNotDisabledException
-
- TableNotDisabledException(TableName) - Constructor for exception org.apache.hadoop.hbase.TableNotDisabledException
-
- TableNotEnabledException - Exception in org.apache.hadoop.hbase
-
Thrown if a table should be enabled but is not
- TableNotEnabledException() - Constructor for exception org.apache.hadoop.hbase.TableNotEnabledException
-
default constructor
- TableNotEnabledException(String) - Constructor for exception org.apache.hadoop.hbase.TableNotEnabledException
-
Constructor
- TableNotEnabledException(TableName) - Constructor for exception org.apache.hadoop.hbase.TableNotEnabledException
-
- TableNotEnabledException(byte[]) - Constructor for exception org.apache.hadoop.hbase.TableNotEnabledException
-
- TableNotFoundException - Exception in org.apache.hadoop.hbase
-
Thrown when a table can not be located
- TableNotFoundException() - Constructor for exception org.apache.hadoop.hbase.TableNotFoundException
-
default constructor
- TableNotFoundException(String) - Constructor for exception org.apache.hadoop.hbase.TableNotFoundException
-
- TableNotFoundException(byte[]) - Constructor for exception org.apache.hadoop.hbase.TableNotFoundException
-
- TableNotFoundException(TableName) - Constructor for exception org.apache.hadoop.hbase.TableNotFoundException
-
- TableOutputCommitter - Class in org.apache.hadoop.hbase.mapreduce
-
Small committer class that does not do anything.
- TableOutputCommitter() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableOutputCommitter
-
- TableOutputFormat - Class in org.apache.hadoop.hbase.mapred
-
Convert Map/Reduce output and write it to an HBase table
- TableOutputFormat() - Constructor for class org.apache.hadoop.hbase.mapred.TableOutputFormat
-
- TableOutputFormat<KEY> - Class in org.apache.hadoop.hbase.mapreduce
-
Convert Map/Reduce output and write it to an HBase table.
- TableOutputFormat() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableOutputFormat
-
- TablePartiallyOpenException - Exception in org.apache.hadoop.hbase.snapshot
-
Thrown if a table should be online/offline but is partially open
- TablePartiallyOpenException() - Constructor for exception org.apache.hadoop.hbase.snapshot.TablePartiallyOpenException
-
- TablePartiallyOpenException(String) - Constructor for exception org.apache.hadoop.hbase.snapshot.TablePartiallyOpenException
-
- TablePartiallyOpenException(TableName) - Constructor for exception org.apache.hadoop.hbase.snapshot.TablePartiallyOpenException
-
- TablePartiallyOpenException(byte[]) - Constructor for exception org.apache.hadoop.hbase.snapshot.TablePartiallyOpenException
-
- TableRecordReader - Class in org.apache.hadoop.hbase.mapred
-
Iterate over HBase table data, returning (Text, RowResult) pairs
- TableRecordReader() - Constructor for class org.apache.hadoop.hbase.mapred.TableRecordReader
-
- TableRecordReader - Class in org.apache.hadoop.hbase.mapreduce
-
Iterate over HBase table data, returning (ImmutableBytesWritable, Result) pairs.
- TableRecordReader() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableRecordReader
-
- TableRecordReaderImpl - Class in org.apache.hadoop.hbase.mapred
-
Iterate over HBase table data, returning (Text, RowResult) pairs
- TableRecordReaderImpl() - Constructor for class org.apache.hadoop.hbase.mapred.TableRecordReaderImpl
-
- TableRecordReaderImpl - Class in org.apache.hadoop.hbase.mapreduce
-
Iterate over HBase table data, returning (ImmutableBytesWritable, Result) pairs.
- TableRecordReaderImpl() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl
-
- TableReduce<K extends org.apache.hadoop.io.WritableComparable,V> - Interface in org.apache.hadoop.hbase.mapred
-
Write a table, sorting by the input key
- TableReducer<KEYIN,VALUEIN,KEYOUT> - Class in org.apache.hadoop.hbase.mapreduce
-
Extends the basic Reducer
class to add the required key and
value input/output classes.
- TableReducer() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableReducer
-
- TableSnapshotInputFormat - Class in org.apache.hadoop.hbase.mapred
-
TableSnapshotInputFormat allows a MapReduce job to run over a table snapshot.
- TableSnapshotInputFormat() - Constructor for class org.apache.hadoop.hbase.mapred.TableSnapshotInputFormat
-
- TableSnapshotInputFormat - Class in org.apache.hadoop.hbase.mapreduce
-
TableSnapshotInputFormat allows a MapReduce job to run over a table snapshot.
- TableSnapshotInputFormat() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormat
-
- TableSnapshotScanner - Class in org.apache.hadoop.hbase.client
-
A Scanner which performs a scan over snapshot files.
- TableSnapshotScanner(Configuration, Path, String, Scan) - Constructor for class org.apache.hadoop.hbase.client.TableSnapshotScanner
-
Creates a TableSnapshotScanner.
- TableSnapshotScanner(Configuration, Path, Path, String, Scan) - Constructor for class org.apache.hadoop.hbase.client.TableSnapshotScanner
-
Creates a TableSnapshotScanner.
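A minimal sketch of a client-side snapshot scan (the snapshot name and restore directory are hypothetical); the restore directory is a temporary location on the same filesystem as the HBase root dir and typically must not sit under the root dir itself:

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.client.TableSnapshotScanner;

    public class SnapshotScanSketch {
      public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        Path restoreDir = new Path("/tmp/snapshot-restore");   // hypothetical temp dir
        try (TableSnapshotScanner scanner =
                 new TableSnapshotScanner(conf, restoreDir, "orders_snapshot_20240101", new Scan())) {
          for (Result r : scanner) {
            System.out.println(r);
          }
        }
      }
    }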
- TableSplit - Class in org.apache.hadoop.hbase.mapred
-
A table split corresponds to a key range [low, high)
- TableSplit() - Constructor for class org.apache.hadoop.hbase.mapred.TableSplit
-
default constructor
- TableSplit(TableName, byte[], byte[], String) - Constructor for class org.apache.hadoop.hbase.mapred.TableSplit
-
Constructor
- TableSplit(byte[], byte[], byte[], String) - Constructor for class org.apache.hadoop.hbase.mapred.TableSplit
-
- TableSplit - Class in org.apache.hadoop.hbase.mapreduce
-
A table split corresponds to a key range (low, high) and an optional scanner.
- TableSplit() - Constructor for class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Default constructor.
- TableSplit(TableName, Scan, byte[], byte[], String) - Constructor for class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Creates a new instance while assigning all variables.
- TableSplit(TableName, Scan, byte[], byte[], String, long) - Constructor for class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Creates a new instance while assigning all variables.
- TableSplit(TableName, byte[], byte[], String) - Constructor for class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Creates a new instance without a scanner.
- TableSplit(TableName, byte[], byte[], String, long) - Constructor for class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Creates a new instance without a scanner.
- TableState.State - Enum in org.apache.hadoop.hbase.client
-
- tagsIterator(byte[], int, int) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Util method to iterate through the tags
- tail(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- takeSnapshotAsync(HBaseProtos.SnapshotDescription) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Take a snapshot without waiting for the server to complete that snapshot (asynchronous). Only a
single snapshot should be taken at a time, or results may be undefined.
- targetReplicaId - Variable in class org.apache.hadoop.hbase.client.Query
-
- TEMP_DIR_NAME - Static variable in class org.apache.hadoop.hbase.mob.MobConstants
-
- term - Variable in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
- TerminatedWrapper<T> - Class in org.apache.hadoop.hbase.types
-
Wraps an existing DataType
implementation as a terminated
version of itself.
- TerminatedWrapper(DataType<T>, byte[]) - Constructor for class org.apache.hadoop.hbase.types.TerminatedWrapper
-
Create a terminated version of the wrapped.
- TerminatedWrapper(DataType<T>, String) - Constructor for class org.apache.hadoop.hbase.types.TerminatedWrapper
-
Create a terminated version of the wrapped.
- terminatorPosition(PositionedByteRange) - Method in class org.apache.hadoop.hbase.types.TerminatedWrapper
-
Return the position at which term begins within src, or -1 if term is not found.
- testCipherProvider(Configuration) - Static method in class org.apache.hadoop.hbase.util.EncryptionTest
-
Check that the configured cipher provider can be loaded and initialized, or
throw an exception.
- testEncryption(Configuration, String, byte[]) - Static method in class org.apache.hadoop.hbase.util.EncryptionTest
-
Check that the specified cipher can be loaded and initialized, or throw
an exception.
- testKeyProvider(Configuration) - Static method in class org.apache.hadoop.hbase.util.EncryptionTest
-
Check that the configured key provider can be loaded and initialized, or
throw an exception.
- TextSortReducer - Class in org.apache.hadoop.hbase.mapreduce
-
Emits Sorted KeyValues.
- TextSortReducer() - Constructor for class org.apache.hadoop.hbase.mapreduce.TextSortReducer
-
- THREAD_WAKE_FREQUENCY - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for how often threads should wake up
- throttleNamespace(String, ThrottleType, long, TimeUnit) - Static method in class org.apache.hadoop.hbase.quotas.QuotaSettingsFactory
-
Throttle the specified namespace.
- throttleTable(TableName, ThrottleType, long, TimeUnit) - Static method in class org.apache.hadoop.hbase.quotas.QuotaSettingsFactory
-
Throttle the specified table.
- ThrottleType - Enum in org.apache.hadoop.hbase.quotas
-
Describe the Throttle Type.
- throttleUser(String, ThrottleType, long, TimeUnit) - Static method in class org.apache.hadoop.hbase.quotas.QuotaSettingsFactory
-
Throttle the specified user.
- throttleUser(String, TableName, ThrottleType, long, TimeUnit) - Static method in class org.apache.hadoop.hbase.quotas.QuotaSettingsFactory
-
Throttle the specified user on the specified table.
- throttleUser(String, String, ThrottleType, long, TimeUnit) - Static method in class org.apache.hadoop.hbase.quotas.QuotaSettingsFactory
-
Throttle the specified user on the specified namespace.
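As a minimal sketch (the limit and table name below are hypothetical), these factory methods pair with Admin.setQuota; for example, capping a table at 1000 requests per second:

    import java.io.IOException;
    import java.util.concurrent.TimeUnit;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.quotas.QuotaSettingsFactory;
    import org.apache.hadoop.hbase.quotas.ThrottleType;

    public class ThrottleSketch {
      public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Admin admin = conn.getAdmin()) {
          // Hypothetical limit: at most 1000 requests/sec against table "orders".
          admin.setQuota(QuotaSettingsFactory.throttleTable(
              TableName.valueOf("orders"), ThrottleType.REQUEST_NUMBER, 1000, TimeUnit.SECONDS));
        }
      }
    }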
- ThrottlingException - Exception in org.apache.hadoop.hbase.quotas
-
Describe the throttling result.
- ThrottlingException(String) - Constructor for exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- ThrottlingException(ThrottlingException.Type, long, String) - Constructor for exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- ThrottlingException.Type - Enum in org.apache.hadoop.hbase.quotas
-
- throwNumReadRequestsExceeded(long) - Static method in exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- throwNumRequestsExceeded(long) - Static method in exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- throwNumWriteRequestsExceeded(long) - Static method in exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- throwReadSizeExceeded(long) - Static method in exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- throwRequestSizeExceeded(long) - Static method in exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- throwWriteSizeExceeded(long) - Static method in exception org.apache.hadoop.hbase.quotas.ThrottlingException
-
- TimeoutException - Exception in org.apache.hadoop.hbase.errorhandling
-
Exception for timeout of a task.
- TimeoutException(String, long, long, long) - Constructor for exception org.apache.hadoop.hbase.errorhandling.TimeoutException
-
Exception indicating that an operation attempt has timed out
- TimeRange - Class in org.apache.hadoop.hbase.io
-
Represents an interval of version timestamps.
- TimeRange() - Constructor for class org.apache.hadoop.hbase.io.TimeRange
-
Default constructor.
- TimeRange(long) - Constructor for class org.apache.hadoop.hbase.io.TimeRange
-
Represents interval [minStamp, Long.MAX_VALUE)
- TimeRange(byte[]) - Constructor for class org.apache.hadoop.hbase.io.TimeRange
-
Represents interval [minStamp, Long.MAX_VALUE)
- TimeRange(long, long) - Constructor for class org.apache.hadoop.hbase.io.TimeRange
-
Represents interval [minStamp, maxStamp)
- TimeRange(byte[], byte[]) - Constructor for class org.apache.hadoop.hbase.io.TimeRange
-
Represents interval [minStamp, maxStamp)
- TIMESTAMP_CONF_KEY - Static variable in class org.apache.hadoop.hbase.mapreduce.ImportTsv
-
- TimestampsFilter - Class in org.apache.hadoop.hbase.filter
-
Filter that returns only cells whose timestamp (version) is
in the specified list of timestamps (versions).
- TimestampsFilter(List<Long>) - Constructor for class org.apache.hadoop.hbase.filter.TimestampsFilter
-
Constructor for filter that retains only those
cells whose timestamp (version) is in the specified
list of timestamps.
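A minimal sketch (the row key and timestamps are hypothetical) attaching the filter to a Get; only cells whose version timestamp appears in the list are returned:

    import java.util.Arrays;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.filter.TimestampsFilter;
    import org.apache.hadoop.hbase.util.Bytes;

    public class TimestampsFilterSketch {
      static Get getSpecificVersions() {
        Get get = new Get(Bytes.toBytes("row0001"));   // hypothetical row key
        get.setMaxVersions();                          // otherwise only the newest version is considered
        get.setFilter(new TimestampsFilter(Arrays.asList(1400000000000L, 1400000100000L)));
        return get;
      }
    }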
- timeToString(TimeUnit) - Static method in class org.apache.hadoop.hbase.quotas.QuotaSettings
-
- TNAME - Static variable in class org.apache.hadoop.hbase.client.replication.ReplicationAdmin
-
- toArray(List<byte[]>) - Static method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- toArray(List<byte[]>) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toBigDecimal(ByteBuffer, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Reads a BigDecimal value at the given buffer's offset.
- toBigDecimal(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to a BigDecimal
- toBigDecimal(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to a BigDecimal value
- toBinaryByteArrays(String[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toBinaryFromHex(byte) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Takes an ASCII digit in the range A-F0-9 and returns
the corresponding integer/ordinal value.
- toBoolean(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toByte() - Method in enum org.apache.hadoop.hbase.client.IsolationLevel
-
- toByte(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.BinaryComparator
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.BinaryPrefixComparator
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.BitComparator
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.ByteArrayComparable
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.ColumnCountGetFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.ColumnPrefixFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.FamilyFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.Filter
-
TODO: JAVADOC Concrete implementers can signal a failure condition in their code by throwing an IOException.
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.FilterList
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.FirstKeyValueMatchingQualifiersFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.FuzzyRowFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.InclusiveStopFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.KeyOnlyFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.LongComparator
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.MultiRowRangeFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.NullComparator
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.PageFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.PrefixFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.QualifierFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.RandomRowFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.RegexStringComparator
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.RowFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueExcludeFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.SkipFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.SubstringComparator
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.TimestampsFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.ValueFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.filter.WhileMatchFilter
-
- toByteArray() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- toByteArray() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- toByteArray() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- toByteArray(int, int) - Method in class org.apache.hadoop.hbase.io.ByteBufferOutputStream
-
- toByteArrays(String[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toByteArrays(String) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toByteArrays(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toBytes() - Method in enum org.apache.hadoop.hbase.client.IsolationLevel
-
- toBytes() - Method in class org.apache.hadoop.hbase.TableName
-
- toBytes(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copy the bytes from position to limit into a new byte[] of the exact length and sets the
position and limit back to their original values (though not thread safe).
- toBytes(ByteBuffer, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Copy the given number of bytes from specified offset into a new byte[]
- toBytes(ByteBuffer) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Returns a new byte array, copied from the given buf, from the index 0 (inclusive) to the limit (exclusive), regardless of the current position.
- toBytes(String) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a string to a UTF-8 byte array.
- toBytes(boolean) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Convert a boolean to a byte array.
- toBytes(long) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Convert a long value to a byte array using big-endian.
- toBytes(float) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toBytes(double) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Serialize a double as the IEEE 754 double format output.
- toBytes(int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Convert an int value to a byte array.
- toBytes(short) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toBytes(BigDecimal) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Convert a BigDecimal value to a byte array
- toBytesBinary(String) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toByteString() - Method in class org.apache.hadoop.hbase.util.Bytes
-
- toDelimitedByteArray() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
Use this instead of
HRegionInfo.toByteArray()
when writing to a stream and you want to use
the pb mergeDelimitedFrom (w/o the delimiter, pb reads to EOF which may not be what you want).
- toDelimitedByteArray(HRegionInfo...) - Static method in class org.apache.hadoop.hbase.HRegionInfo
-
Serializes given HRegionInfo's as a byte array.
- toDouble(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Reads a double value at the given buffer's offset.
- toDouble(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toDouble(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toFloat(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Presumes float encoded as IEEE 754 floating-point "single format"
- toFloat(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Presumes float encoded as IEEE 754 floating-point "single format"
- toHex(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Convert a byte range into a hex string
- toHex(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Convert a byte array into a hex string
- toInt(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Reads an int value at the given buffer's offset.
- toInt(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to an int value
- toInt(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to an int value
- toInt(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to an int value
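For illustration, a minimal round-trip sketch with the Bytes conversion helpers (the values are arbitrary); toBytes and the corresponding toX methods are inverses for fixed-width primitives:

    import org.apache.hadoop.hbase.util.Bytes;

    public class BytesRoundTripSketch {
      public static void main(String[] args) {
        byte[] i = Bytes.toBytes(42);               // 4 bytes, big-endian
        byte[] l = Bytes.toBytes(1400000000000L);   // 8 bytes, big-endian
        byte[] s = Bytes.toBytes("hello");          // UTF-8

        System.out.println(Bytes.toInt(i));         // 42
        System.out.println(Bytes.toLong(l));        // 1400000000000
        System.out.println(Bytes.toString(s));      // hello
      }
    }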
- toIntUnsafe(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Deprecated.
As of release 2.0.0, this will be removed in HBase 3.0.0.
- toJSON(int) - Method in class org.apache.hadoop.hbase.client.Operation
-
Produces a JSON object for fingerprint and details exposure in a
parseable format.
- toJSON() - Method in class org.apache.hadoop.hbase.client.Operation
-
Produces a JSON object sufficient for description of a query
in a debugging or logging context.
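A minimal sketch (the row key, family, and qualifier are hypothetical) of using toJSON to log a client operation; Get, Scan, Put, and the other client operations inherit these methods from Operation:

    import java.io.IOException;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.util.Bytes;

    public class OperationLoggingSketch {
      public static void main(String[] args) throws IOException {
        Get get = new Get(Bytes.toBytes("row0001"));               // hypothetical row key
        get.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("status"));
        System.out.println(get.toJSON());                          // full details as parseable JSON
        System.out.println(get.toString(5));                       // truncated human-readable form
      }
    }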
- TokenUtil - Class in org.apache.hadoop.hbase.security.token
-
Utility methods for obtaining authentication tokens.
- TokenUtil() - Constructor for class org.apache.hadoop.hbase.security.token.TokenUtil
-
- toLong(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Reads a long value at the given buffer's offset.
- toLong(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to a long value.
- toLong(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to a long value.
- toLong(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to a long value.
- toLongUnsafe(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Deprecated.
As of release 2.0.0, this will be removed in HBase 3.0.0.
- toMap(int) - Method in class org.apache.hadoop.hbase.client.Delete
-
- toMap(int) - Method in class org.apache.hadoop.hbase.client.Get
-
Compile the details beyond the scope of getFingerprint (row, columns,
timestamps, etc.) into a Map along with the fingerprinted information.
- toMap(int) - Method in class org.apache.hadoop.hbase.client.Mutation
-
Compile the details beyond the scope of getFingerprint (row, columns,
timestamps, etc.) into a Map along with the fingerprinted information.
- toMap(int) - Method in class org.apache.hadoop.hbase.client.Operation
-
Produces a Map containing a summary of the details of a query
beyond the scope of the fingerprint (i.e.
- toMap() - Method in class org.apache.hadoop.hbase.client.Operation
-
Produces a Map containing a full summary of a query.
- toMap(int) - Method in class org.apache.hadoop.hbase.client.Scan
-
Compile the details beyond the scope of getFingerprint (row, columns,
timestamps, etc.) into a Map along with the fingerprinted information.
- TOOLS - Static variable in class org.apache.hadoop.hbase.HBaseInterfaceAudience
-
Denotes classes used as tools (Used from cmd line).
- toShort(ByteBuffer, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
Reads a short value at the given buffer's offset.
- toShort(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to a short value
- toShort(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to a short value
- toShort(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts a byte array to a short value
- toShortString() - Method in class org.apache.hadoop.hbase.ServerName
-
- toShortUnsafe(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Deprecated.
As of release 2.0.0, this will be removed in HBase 3.0.0.
- toString(Cell, boolean) - Static method in class org.apache.hadoop.hbase.CellUtil
-
Returns a string representation of the cell
- toString() - Method in class org.apache.hadoop.hbase.client.Increment
-
- toString(int) - Method in class org.apache.hadoop.hbase.client.Operation
-
Produces a string representation of this Operation.
- toString() - Method in class org.apache.hadoop.hbase.client.Operation
-
Produces a string representation of this Operation.
- toString() - Method in class org.apache.hadoop.hbase.client.Result
-
- toString() - Method in class org.apache.hadoop.hbase.ClusterStatus
-
- toString() - Method in exception org.apache.hadoop.hbase.errorhandling.ForeignException
-
- toString() - Method in class org.apache.hadoop.hbase.filter.ColumnCountGetFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.ColumnPaginationFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.ColumnPrefixFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.ColumnRangeFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.CompareFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.DependentColumnFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.FilterList
-
- toString(int) - Method in class org.apache.hadoop.hbase.filter.FilterList
-
- toString() - Method in class org.apache.hadoop.hbase.filter.FuzzyRowFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.InclusiveStopFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- toString(int) - Method in class org.apache.hadoop.hbase.filter.MultipleColumnPrefixFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.PageFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.PrefixFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.SingleColumnValueFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.SkipFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.TimestampsFilter
-
- toString(int) - Method in class org.apache.hadoop.hbase.filter.TimestampsFilter
-
- toString() - Method in class org.apache.hadoop.hbase.filter.WhileMatchFilter
-
- toString() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- toString() - Method in class org.apache.hadoop.hbase.HRegionInfo
-
- toString() - Method in class org.apache.hadoop.hbase.HRegionLocation
-
- toString() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- toString() - Method in class org.apache.hadoop.hbase.io.crypto.Context
-
- toString() - Method in class org.apache.hadoop.hbase.io.ImmutableBytesWritable
-
- toString() - Method in class org.apache.hadoop.hbase.io.TimeRange
-
- toString() - Method in class org.apache.hadoop.hbase.mapred.TableSplit
-
- toString() - Method in class org.apache.hadoop.hbase.mapreduce.TableSplit
-
Returns the details about this instance as a string.
- toString() - Method in class org.apache.hadoop.hbase.NamespaceDescriptor
-
- toString() - Method in class org.apache.hadoop.hbase.RegionLoad
-
- toString() - Method in class org.apache.hadoop.hbase.replication.ReplicationPeerConfig
-
- toString() - Method in class org.apache.hadoop.hbase.rest.client.Cluster
-
- toString() - Method in class org.apache.hadoop.hbase.security.access.Permission
-
- toString() - Method in class org.apache.hadoop.hbase.security.User
-
- toString() - Method in class org.apache.hadoop.hbase.security.visibility.Authorizations
-
- toString() - Method in class org.apache.hadoop.hbase.security.visibility.CellVisibility
-
- toString() - Method in class org.apache.hadoop.hbase.ServerLoad
-
- toString() - Method in class org.apache.hadoop.hbase.ServerName
-
- toString() - Method in class org.apache.hadoop.hbase.TableName
-
- toString() - Method in class org.apache.hadoop.hbase.util.Bytes
-
- toString(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- toString(byte[], String, byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Joins two byte arrays together using a separator.
- toString(byte[], int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts UTF-8 encoded bytes into a String.
- toString(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts UTF-8 encoded bytes into a String.
- toString() - Method in class org.apache.hadoop.hbase.util.Counter
-
- toString() - Method in class org.apache.hadoop.hbase.util.Pair
-
- toString() - Method in class org.apache.hadoop.hbase.util.PairOfSameType
-
- toStringBinary(ByteBuffer, int, int) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- toStringBinary(ByteBuffer) - Static method in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- toStringBinary(byte[]) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Write a printable representation of a byte array.
- toStringBinary(ByteBuffer) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Converts the given byte buffer to a printable representation,
from the index 0 (inclusive) to the limit (exclusive),
regardless of the current position.
- toStringBinary(byte[], int, int) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
Write a printable representation of a byte array.
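For illustration, a small sketch of toStringBinary producing a printable form of a byte array that mixes text and raw bytes; the contents are arbitrary:

```java
import org.apache.hadoop.hbase.util.Bytes;

public class ToStringBinaryExample {
  public static void main(String[] args) {
    byte[] raw = Bytes.add(Bytes.toBytes("row"), new byte[] { 0x00, 0x01 });
    // Printable ASCII is kept as-is; other bytes are escaped as \xNN sequences.
    System.out.println(Bytes.toStringBinary(raw));        // row\x00\x01
    System.out.println(Bytes.toStringBinary(raw, 0, 3));  // row
  }
}
```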
- toStringCustomizedValues() - Method in class org.apache.hadoop.hbase.HColumnDescriptor
-
- toStringCustomizedValues() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- toStringTableAttributes() - Method in class org.apache.hadoop.hbase.HTableDescriptor
-
- toStruct() - Method in class org.apache.hadoop.hbase.types.StructBuilder
-
Retrieve the Struct represented by this.
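For illustration, a hedged sketch of building a Struct with StructBuilder and toStruct(); the field types used here (OrderedString, OrderedInt32) are just an assumed example of a composite row-key layout:

```java
import org.apache.hadoop.hbase.types.OrderedInt32;
import org.apache.hadoop.hbase.types.OrderedString;
import org.apache.hadoop.hbase.types.Struct;
import org.apache.hadoop.hbase.types.StructBuilder;

public class StructBuilderExample {
  public static void main(String[] args) {
    // Accumulate fields in order, then materialize the composite DataType.
    Struct keyType = new StructBuilder()
        .add(OrderedString.ASCENDING)
        .add(OrderedInt32.ASCENDING)
        .toStruct();
    // A Struct is itself a DataType over Object[] values.
    System.out.println(keyType.encodedLength(new Object[] { "user", 42 }));
  }
}
```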
- transformCell(Cell) - Method in class org.apache.hadoop.hbase.filter.Filter
-
Give the filter a chance to transform the passed KeyValue.
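For illustration, KeyOnlyFilter is one filter whose transformCell implementation rewrites matching cells (dropping the value and keeping only the key); a minimal sketch of attaching it to a Scan:

```java
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.KeyOnlyFilter;

public class KeyOnlyExample {
  public static void main(String[] args) {
    // Each matching cell is transformed so only row/family/qualifier/timestamp
    // come back to the client; values are stripped.
    Scan scan = new Scan();
    scan.setFilter(new KeyOnlyFilter());
  }
}
```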
- transformCell(Cell) - Method in class org.apache.hadoop.hbase.filter.FilterList
-
- transformCell(Cell) - Method in class org.apache.hadoop.hbase.filter.KeyOnlyFilter
-
- transformCell(Cell) - Method in class org.apache.hadoop.hbase.filter.SkipFilter
-
- transformCell(Cell) - Method in class org.apache.hadoop.hbase.filter.WhileMatchFilter
-
- truncateTable(TableName, boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Truncate a table.
- truncateTableAsync(TableName, boolean) - Method in interface org.apache.hadoop.hbase.client.Admin
-
Truncate the table asynchronously; does not block waiting for the table to be completely enabled.
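For illustration, a hedged sketch of the synchronous truncateTable call; the connection setup is standard client boilerplate and the table name is hypothetical:

```java
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class TruncateTableExample {
  public static void main(String[] args) throws Exception {
    try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
         Admin admin = conn.getAdmin()) {
      TableName table = TableName.valueOf("my_table"); // hypothetical table name
      admin.disableTable(table);                       // table must be disabled first
      admin.truncateTable(table, true);                // true = preserve region split points
    }
  }
}
```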
- tryAtomicRegionLoad(Connection, TableName, byte[], Collection<LoadIncrementalHFiles.LoadQueueItem>) - Method in class org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
-
Attempts to do an atomic load of many hfiles into a region.
- ts - Variable in class org.apache.hadoop.hbase.client.Mutation
-
- ts - Variable in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
Timestamp for all inserted rows
- TsvImporterMapper - Class in org.apache.hadoop.hbase.mapreduce
-
Mapper used by the ImportTsv tool to parse TSV input lines into HBase Put objects.
- TsvImporterMapper() - Constructor for class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- TsvImporterTextMapper - Class in org.apache.hadoop.hbase.mapreduce
-
Mapper used by the ImportTsv tool that passes TSV input lines through as text key/value pairs for downstream reducers.
- TsvImporterTextMapper() - Constructor for class org.apache.hadoop.hbase.mapreduce.TsvImporterTextMapper
-
- TTL - Static variable in class org.apache.hadoop.hbase.HColumnDescriptor
-
- ttl - Variable in class org.apache.hadoop.hbase.mapreduce.TsvImporterMapper
-
- typeA - Variable in class org.apache.hadoop.hbase.types.Union2
-
- typeB - Variable in class org.apache.hadoop.hbase.types.Union2
-
- typeC - Variable in class org.apache.hadoop.hbase.types.Union3
-
- typeD - Variable in class org.apache.hadoop.hbase.types.Union4
-
- types - Variable in class org.apache.hadoop.hbase.types.StructIterator
-
- VALID_NAMESPACE_REGEX - Static variable in class org.apache.hadoop.hbase.TableName
-
- VALID_TABLE_QUALIFIER_REGEX - Static variable in class org.apache.hadoop.hbase.TableName
-
- VALID_USER_TABLE_REGEX - Static variable in class org.apache.hadoop.hbase.TableName
-
- validateInput(JobConf) - Method in class org.apache.hadoop.hbase.mapred.TableInputFormat
-
- value() - Method in class org.apache.hadoop.hbase.client.Result
-
Returns the value of the first column in the Result.
- VALUE_MASK - Static variable in class org.apache.hadoop.hbase.util.ByteBufferUtils
-
- ValueFilter - Class in org.apache.hadoop.hbase.filter
-
This filter is used to filter based on column value.
- ValueFilter(CompareFilter.CompareOp, ByteArrayComparable) - Constructor for class org.apache.hadoop.hbase.filter.ValueFilter
-
Constructor.
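For illustration, a minimal sketch of constructing a ValueFilter and attaching it to a Scan; the comparison value is hypothetical:

```java
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.BinaryComparator;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.ValueFilter;
import org.apache.hadoop.hbase.util.Bytes;

public class ValueFilterExample {
  public static void main(String[] args) {
    // Keep only cells whose value equals "approved"; other cells are filtered out.
    ValueFilter filter =
        new ValueFilter(CompareOp.EQUAL, new BinaryComparator(Bytes.toBytes("approved")));
    Scan scan = new Scan();
    scan.setFilter(filter);
  }
}
```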
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.client.Admin.CompactType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.client.Consistency
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.client.Durability
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.client.IsolationLevel
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.client.security.SecurityCapability
-
Returns the enum constant of this type with the specified name.
- valueOf(int) - Static method in enum org.apache.hadoop.hbase.client.security.SecurityCapability
-
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.client.TableState.State
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.filter.BitComparator.BitwiseOp
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.filter.CompareFilter.CompareOp
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.filter.Filter.ReturnCode
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.filter.FilterList.Operator
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.filter.RegexStringComparator.EngineType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.KeepDeletedCells
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.quotas.QuotaScope
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.quotas.QuotaType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.quotas.ThrottleType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.quotas.ThrottlingException.Type
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.regionserver.BloomType
-
Returns the enum constant of this type with the specified name.
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.security.access.Permission.Action
-
Returns the enum constant of this type with the specified name.
- valueOf(String, int, long) - Static method in class org.apache.hadoop.hbase.ServerName
-
Retrieve an instance of ServerName.
- valueOf(String) - Static method in class org.apache.hadoop.hbase.ServerName
-
Retrieve an instance of ServerName.
- valueOf(String, long) - Static method in class org.apache.hadoop.hbase.ServerName
-
Retrieve an instance of ServerName.
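For illustration, a small sketch of the ServerName.valueOf variants; the host, port, and start code are made up:

```java
import org.apache.hadoop.hbase.ServerName;

public class ServerNameExample {
  public static void main(String[] args) {
    // From host, port and start code...
    ServerName a = ServerName.valueOf("rs1.example.com", 16020, 1530000000000L);
    // ...or from the single "host,port,startcode" form seen in logs and znodes.
    ServerName b = ServerName.valueOf("rs1.example.com,16020,1530000000000");
    System.out.println(a.equals(b));        // true
    System.out.println(a.toShortString());  // e.g. rs1.example.com:16020
  }
}
```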
- valueOf(String, String) - Static method in class org.apache.hadoop.hbase.TableName
-
- valueOf(byte[]) - Static method in class org.apache.hadoop.hbase.TableName
-
- valueOf(String) - Static method in class org.apache.hadoop.hbase.TableName
-
- valueOf(byte[], byte[]) - Static method in class org.apache.hadoop.hbase.TableName
-
- valueOf(ByteBuffer, ByteBuffer) - Static method in class org.apache.hadoop.hbase.TableName
-
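For illustration, a small sketch of the TableName.valueOf variants; the namespace and table names are hypothetical:

```java
import org.apache.hadoop.hbase.TableName;

public class TableNameExample {
  public static void main(String[] args) {
    // Fully qualified "namespace:qualifier" string...
    TableName t1 = TableName.valueOf("prod:orders");
    // ...or namespace and qualifier supplied separately.
    TableName t2 = TableName.valueOf("prod", "orders");
    System.out.println(t1.equals(t2));              // true
    System.out.println(t2.getNamespaceAsString());  // prod
  }
}
```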
- valueOf(String) - Static method in enum org.apache.hadoop.hbase.util.Order
-
Returns the enum constant of this type with the specified name.
- values() - Static method in enum org.apache.hadoop.hbase.client.Admin.CompactType
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.client.Consistency
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.client.Durability
-
Returns an array containing the constants of this enum type, in
the order they are declared.
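The valueOf(String)/values() pairs listed for each enum are the standard generated Java enum methods; for illustration, a minimal sketch using Durability:

```java
import org.apache.hadoop.hbase.client.Durability;

public class EnumLookupExample {
  public static void main(String[] args) {
    // valueOf(String) resolves a constant by its exact declared name...
    Durability d = Durability.valueOf("SYNC_WAL");
    // ...and values() lists all constants in declaration order.
    for (Durability option : Durability.values()) {
      System.out.println(option);
    }
    System.out.println(d == Durability.SYNC_WAL); // true
  }
}
```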
- values() - Static method in enum org.apache.hadoop.hbase.client.IsolationLevel
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.client.security.SecurityCapability
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.client.TableState.State
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.filter.BitComparator.BitwiseOp
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.filter.CompareFilter.CompareOp
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.filter.Filter.ReturnCode
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.filter.FilterList.Operator
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.filter.RegexStringComparator.EngineType
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.io.compress.Compression.Algorithm
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.io.encoding.DataBlockEncoding
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.KeepDeletedCells
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.quotas.QuotaScope
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.quotas.QuotaType
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.quotas.ThrottleType
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.quotas.ThrottlingException.Type
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.regionserver.BloomType
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.security.access.Permission.Action
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- values() - Static method in enum org.apache.hadoop.hbase.util.Order
-
Returns an array containing the constants of this enum type, in
the order they are declared.
- VERSION - Static variable in class org.apache.hadoop.hbase.security.access.Permission
-
- VERSION_FILE_NAME - Static variable in class org.apache.hadoop.hbase.HConstants
-
name of version file
- VERSION_FILE_WRITE_ATTEMPTS - Static variable in class org.apache.hadoop.hbase.HConstants
-
Parameter name for how many times we should try to write the version file before failing.
- VERSION_STRING - Static variable in interface org.apache.hadoop.hbase.rest.Constants
-
- VersionInfo - Class in org.apache.hadoop.hbase.util
-
This class finds the package info for hbase and the VersionAnnotation
information.
- VersionInfo() - Constructor for class org.apache.hadoop.hbase.util.VersionInfo
-
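For illustration, a hedged sketch of reading the compiled-in version information via VersionInfo, assuming the static getVersion() accessor:

```java
import org.apache.hadoop.hbase.util.VersionInfo;

public class VersionExample {
  public static void main(String[] args) {
    // Static accessors read the version info compiled into the HBase jars.
    System.out.println("HBase " + VersionInfo.getVersion());
  }
}
```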
- VERSIONS - Static variable in class org.apache.hadoop.hbase.HConstants
-
- vintToBytes(long) - Static method in class org.apache.hadoop.hbase.util.Bytes
-
- VISIBILITY_EXP_RESOLVER_CLASS - Static variable in class org.apache.hadoop.hbase.mapreduce.CellCreator
-
- VisibilityClient - Class in org.apache.hadoop.hbase.security.visibility
-
Utility client for doing visibility labels admin operations.
- VisibilityClient() - Constructor for class org.apache.hadoop.hbase.security.visibility.VisibilityClient
-
- VisibilityControllerNotReadyException - Exception in org.apache.hadoop.hbase.security.visibility
-
- VisibilityControllerNotReadyException(String) - Constructor for exception org.apache.hadoop.hbase.security.visibility.VisibilityControllerNotReadyException
-
- VisibilityExpEvaluator - Interface in org.apache.hadoop.hbase.security.visibility
-
During the read (i.e. Get/Scan) path, evaluates whether a cell's visibility expression is satisfied by the requesting user's authorizations.
- VisibilityExpressionResolver - Interface in org.apache.hadoop.hbase.mapreduce
-
Interface to convert visibility expressions into Tags for storing along with Cells in HFiles.
- VisibilityLabelService - Interface in org.apache.hadoop.hbase.security.visibility
-
The interface that handles the visibility labels and user auths admin service, as well as cell
visibility expression storage and read-time evaluation.
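For illustration, a hedged sketch of the write-path side of visibility labels, attaching a CellVisibility expression to a Put; the label and column names are hypothetical, and the labels themselves must first be defined through the admin APIs such as VisibilityClient:

```java
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.security.visibility.CellVisibility;
import org.apache.hadoop.hbase.util.Bytes;

public class CellVisibilityExample {
  public static void main(String[] args) throws Exception {
    // Attach a visibility expression to the cells written by this Put; only users
    // whose authorizations satisfy the expression can read them back.
    Put put = new Put(Bytes.toBytes("row-1"));
    put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
    put.setCellVisibility(new CellVisibility("secret & finance")); // hypothetical labels
  }
}
```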