Package com.mapr.fs.hbase
Class HTableImpl
java.lang.Object
  org.apache.hadoop.hbase.client.mapr.AbstractHTable
    com.mapr.fs.hbase.HTableImpl
- Direct Known Subclasses:
HTableImpl11
public class HTableImpl extends org.apache.hadoop.hbase.client.mapr.AbstractHTable
-
-
Nested Class Summary
Nested Classes:
class HTableImpl.FamilyInfo
-
Field Summary
Fields:
protected boolean autoFlush
static java.lang.String CONFIG_PARAM_FLUSH_ON_READ
protected boolean flushOnRead
protected com.mapr.fs.MapRHTable maprTable
protected byte[] tableName - Stores the table path in a byte array.
-
Constructor Summary
Constructors:
HTableImpl(org.apache.hadoop.conf.Configuration conf, byte[] tableName) - Creates an object to access a MapR table.
-
Method Summary
All Methods, Instance Methods, Concrete Methods:
org.apache.hadoop.hbase.client.Result append(org.apache.hadoop.hbase.client.Append append)
java.lang.Object[] batch(java.util.List<? extends org.apache.hadoop.hbase.client.Row> actions)
void batch(java.util.List<? extends org.apache.hadoop.hbase.client.Row> actions, java.lang.Object[] results)
boolean checkAndDelete(byte[] row, byte[] family, byte[] qualifier, byte[] value, org.apache.hadoop.hbase.client.Delete delete)
boolean checkAndDelete(byte[] row, byte[] family, byte[] qualifier, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp compareOp, byte[] value, org.apache.hadoop.hbase.client.Delete delete)
boolean checkAndMutate(byte[] row, byte[] family, byte[] qualifier, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp compareOp, byte[] value, org.apache.hadoop.hbase.client.RowMutations rm)
boolean checkAndMutateImpl(byte[] row, byte[] family, byte[] qualifier, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp compareOp, byte[] value, org.apache.hadoop.hbase.client.RowMutations rm, boolean throwerr)
boolean checkAndPut(byte[] row, byte[] family, byte[] qualifier, byte[] value, org.apache.hadoop.hbase.client.Put put)
boolean checkAndPut(byte[] row, byte[] family, byte[] qualifier, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp compareOp, byte[] value, org.apache.hadoop.hbase.client.Put put)
protected void checkMutation(org.apache.hadoop.hbase.client.Mutation m)
void close()
void delete(java.util.List<org.apache.hadoop.hbase.client.Delete> deletes)
void delete(org.apache.hadoop.hbase.client.Delete delete)
java.lang.Boolean[] exists(java.util.List<org.apache.hadoop.hbase.client.Get> gets)
boolean exists(org.apache.hadoop.hbase.client.Get get)
void flushCommits()
org.apache.hadoop.hbase.client.Result[] get(java.util.List<org.apache.hadoop.hbase.client.Get> gets)
org.apache.hadoop.hbase.client.Result get(org.apache.hadoop.hbase.client.Get get)
org.apache.hadoop.conf.Configuration getConfiguration()
HTableImpl.FamilyInfo getFamilyInfo(byte[] row, byte[] family)
org.apache.hadoop.hbase.HRegionLocation getRegionLocation(byte[] row)
java.util.NavigableMap<org.apache.hadoop.hbase.HRegionInfo,org.apache.hadoop.hbase.ServerName> getRegionLocations()
org.apache.hadoop.hbase.client.Result getRowOrBefore(byte[] row, byte[] family)
org.apache.hadoop.hbase.client.ResultScanner getScanner(org.apache.hadoop.hbase.client.Scan scan)
org.apache.hadoop.hbase.util.Pair<byte[][],byte[][]> getStartEndKeys()
org.apache.hadoop.hbase.HTableDescriptor getTableDescriptor()
byte[] getTableName() - Returns the complete table path as a byte array.
org.apache.hadoop.hbase.client.Result increment(org.apache.hadoop.hbase.client.Increment increment)
long incrementColumnValue(byte[] row, byte[] family, byte[] qualifier, long amount)
long incrementColumnValue(byte[] row, byte[] family, byte[] qualifier, long amount, boolean writeToWAL)
long incrementColumnValue(byte[] row, byte[] family, byte[] qualifier, long amount, org.apache.hadoop.hbase.client.Durability durability)
boolean isAutoFlush()
void mutateRow(org.apache.hadoop.hbase.client.RowMutations rm)
com.mapr.fs.jni.MapRPut MutateToMapRPut(byte[] row, org.apache.hadoop.hbase.client.Mutation mut)
void put(java.util.List<org.apache.hadoop.hbase.client.Put> puts)
void put(org.apache.hadoop.hbase.client.Put put)
void setAutoFlush(boolean autoFlush)
void setAutoFlush(boolean autoFlush, boolean clearBufferOnFail)
void setFlushOnRead(boolean val)
boolean shouldFlushOnRead()
Methods inherited from class org.apache.hadoop.hbase.client.mapr.AbstractHTable
batchCoprocessorService, batchCoprocessorService, clearRegionCache, coprocessorService, coprocessorService, coprocessorService, getEndKeys, getRegionLocation, getRegionLocation, getScanner, getScanner, getStartKeys, getWriteBufferSize, setWriteBufferSize
-
-
-
-
Field Detail
-
CONFIG_PARAM_FLUSH_ON_READ
public static final java.lang.String CONFIG_PARAM_FLUSH_ON_READ
- See Also:
- Constant Field Values
-
autoFlush
protected boolean autoFlush
-
flushOnRead
protected boolean flushOnRead
-
maprTable
protected final com.mapr.fs.MapRHTable maprTable
-
tableName
protected byte[] tableName
Stores table path in a byte array
-
-
Constructor Detail
-
HTableImpl
public HTableImpl(org.apache.hadoop.conf.Configuration conf, byte[] tableName) throws java.io.IOException
Creates an object to access a MapR table.
- Parameters:
conf - Configuration object to use.
tableName - Name of the table.
- Throws:
java.io.IOException - if a remote or network exception occurs
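The tableName argument is the table path encoded as a byte array. A minimal sketch of preparing that argument, assuming the usual HBase convention of UTF-8 string encoding (what org.apache.hadoop.hbase.util.Bytes.toBytes(String) produces); the table path shown is hypothetical:

```java
import java.nio.charset.StandardCharsets;

public class TableNameExample {
    public static void main(String[] args) {
        // Hypothetical MapR table path; real paths point into the MapR filesystem.
        String tablePath = "/user/alice/mytable";

        // HBase convention: names and keys are UTF-8 encoded byte arrays.
        byte[] tableName = tablePath.getBytes(StandardCharsets.UTF_8);

        // Decoding the byte array recovers the original path.
        String roundTrip = new String(tableName, StandardCharsets.UTF_8);
        System.out.println(roundTrip); // prints /user/alice/mytable
    }
}
```

This byte array is what the constructor stores in the tableName field and what getTableName() later returns.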
-
-
Method Detail
-
getTableName
public byte[] getTableName()
Returns the complete table path as a byte array.
- Specified by:
getTableName in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
-
getConfiguration
public org.apache.hadoop.conf.Configuration getConfiguration()
- Specified by:
getConfiguration in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
-
flushCommits
public void flushCommits() throws java.io.InterruptedIOException
- Specified by:
flushCommits in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.InterruptedIOException
-
close
public void close() throws java.io.IOException
- Specified by:
close in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
getTableDescriptor
public org.apache.hadoop.hbase.HTableDescriptor getTableDescriptor() throws java.io.IOException
- Specified by:
getTableDescriptor in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
exists
public boolean exists(org.apache.hadoop.hbase.client.Get get) throws java.io.IOException
- Specified by:
exists in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
exists
public java.lang.Boolean[] exists(java.util.List<org.apache.hadoop.hbase.client.Get> gets) throws java.io.IOException
- Specified by:
exists in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
batch
public void batch(java.util.List<? extends org.apache.hadoop.hbase.client.Row> actions, java.lang.Object[] results) throws java.io.IOException, java.lang.InterruptedException
- Specified by:
batch in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
java.lang.InterruptedException
-
batch
public java.lang.Object[] batch(java.util.List<? extends org.apache.hadoop.hbase.client.Row> actions) throws java.io.IOException, java.lang.InterruptedException
- Specified by:
batch in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
java.lang.InterruptedException
-
get
public org.apache.hadoop.hbase.client.Result get(org.apache.hadoop.hbase.client.Get get) throws java.io.IOException
- Specified by:
get in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
get
public org.apache.hadoop.hbase.client.Result[] get(java.util.List<org.apache.hadoop.hbase.client.Get> gets) throws java.io.IOException
- Specified by:
get in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
getRowOrBefore
public org.apache.hadoop.hbase.client.Result getRowOrBefore(byte[] row, byte[] family) throws java.io.IOException
- Specified by:
getRowOrBefore in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
getScanner
public org.apache.hadoop.hbase.client.ResultScanner getScanner(org.apache.hadoop.hbase.client.Scan scan) throws java.io.IOException
- Specified by:
getScanner in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
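A scan returns rows ordered lexicographically by row key, compared as unsigned bytes. Since a real ResultScanner needs a live table, the ordering can be illustrated locally with a TreeMap and an unsigned byte comparator (the same ordering org.apache.hadoop.hbase.util.Bytes.compareTo implements); the row keys below are made up:

```java
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.TreeMap;

public class ScanOrderSketch {
    // Lexicographic comparison of byte arrays as unsigned bytes, the row-key
    // ordering a Scan over the table observes.
    static int compareUnsigned(byte[] a, byte[] b) {
        int n = Math.min(a.length, b.length);
        for (int i = 0; i < n; i++) {
            int cmp = (a[i] & 0xFF) - (b[i] & 0xFF);
            if (cmp != 0) return cmp;
        }
        return a.length - b.length;
    }

    public static void main(String[] args) {
        // Simulated table: row key -> value, inserted out of order.
        TreeMap<byte[], String> table = new TreeMap<>(ScanOrderSketch::compareUnsigned);
        table.put("row3".getBytes(StandardCharsets.UTF_8), "c");
        table.put("row1".getBytes(StandardCharsets.UTF_8), "a");
        table.put("row2".getBytes(StandardCharsets.UTF_8), "b");

        // Iteration yields rows in key order, as a scanner would.
        for (Map.Entry<byte[], String> e : table.entrySet()) {
            System.out.println(new String(e.getKey(), StandardCharsets.UTF_8) + " -> " + e.getValue());
        }
        // prints row1 -> a, row2 -> b, row3 -> c
    }
}
```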
-
put
public void put(org.apache.hadoop.hbase.client.Put put) throws java.io.InterruptedIOException
- Specified by:
put in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.InterruptedIOException
-
put
public void put(java.util.List<org.apache.hadoop.hbase.client.Put> puts) throws java.io.InterruptedIOException
- Specified by:
put in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.InterruptedIOException
-
getFamilyInfo
public HTableImpl.FamilyInfo getFamilyInfo(byte[] row, byte[] family) throws org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException, java.io.IOException
- Throws:
org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException
java.io.IOException
-
MutateToMapRPut
public com.mapr.fs.jni.MapRPut MutateToMapRPut(byte[] row, org.apache.hadoop.hbase.client.Mutation mut) throws java.io.IOException
- Throws:
java.io.IOException
-
checkAndPut
public boolean checkAndPut(byte[] row, byte[] family, byte[] qualifier, byte[] value, org.apache.hadoop.hbase.client.Put put) throws org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException, java.io.IOException
- Specified by:
checkAndPut in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException
java.io.IOException
-
checkAndPut
public boolean checkAndPut(byte[] row, byte[] family, byte[] qualifier, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp compareOp, byte[] value, org.apache.hadoop.hbase.client.Put put) throws java.io.IOException
- Overrides:
checkAndPut in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
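checkAndPut applies the Put only if the current value of the specified cell satisfies the comparison, and the check and write happen as one atomic server-side operation. The compare-and-set semantics for the EQUAL case (the behavior of the four-argument overload) can be sketched locally with a plain map; the cell key string and helper name here are illustrative, not part of the API:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class CheckAndPutSketch {
    static final Map<String, byte[]> cells = new HashMap<>();

    // Simulated checkAndPut with EQUAL semantics: write newValue only if the
    // current cell value matches expected; a null expected value means the
    // cell must be absent. The real method performs this atomically on the
    // server; this local model ignores concurrency entirely.
    static boolean checkAndPut(String cellKey, byte[] expected, byte[] newValue) {
        byte[] current = cells.get(cellKey);
        boolean matches = (expected == null) ? current == null
                                             : Arrays.equals(current, expected);
        if (matches) {
            cells.put(cellKey, newValue);
        }
        return matches;
    }

    public static void main(String[] args) {
        byte[] v1 = {1}, v2 = {2};
        System.out.println(checkAndPut("r1/cf/q", null, v1)); // true: cell was absent
        System.out.println(checkAndPut("r1/cf/q", null, v2)); // false: cell now exists
        System.out.println(checkAndPut("r1/cf/q", v1, v2));   // true: value matched
    }
}
```

The overload taking a CompareOp generalizes the check beyond equality (LESS, GREATER, NOT_EQUAL, and so on) over the same unsigned byte ordering used for row keys.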
-
delete
public void delete(org.apache.hadoop.hbase.client.Delete delete) throws java.io.IOException
- Specified by:
delete in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
delete
public void delete(java.util.List<org.apache.hadoop.hbase.client.Delete> deletes) throws java.io.IOException
- Specified by:
delete in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
checkAndDelete
public boolean checkAndDelete(byte[] row, byte[] family, byte[] qualifier, byte[] value, org.apache.hadoop.hbase.client.Delete delete) throws org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException, java.io.IOException
- Specified by:
checkAndDelete in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException
java.io.IOException
-
checkAndDelete
public boolean checkAndDelete(byte[] row, byte[] family, byte[] qualifier, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp compareOp, byte[] value, org.apache.hadoop.hbase.client.Delete delete) throws java.io.IOException
- Overrides:
checkAndDelete in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
mutateRow
public void mutateRow(org.apache.hadoop.hbase.client.RowMutations rm) throws java.io.IOException
- Specified by:
mutateRow in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
append
public org.apache.hadoop.hbase.client.Result append(org.apache.hadoop.hbase.client.Append append) throws java.io.IOException
- Specified by:
append in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
increment
public org.apache.hadoop.hbase.client.Result increment(org.apache.hadoop.hbase.client.Increment increment) throws java.io.IOException
- Specified by:
increment in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
incrementColumnValue
public long incrementColumnValue(byte[] row, byte[] family, byte[] qualifier, long amount) throws java.io.IOException
- Specified by:
incrementColumnValue in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
incrementColumnValue
public long incrementColumnValue(byte[] row, byte[] family, byte[] qualifier, long amount, boolean writeToWAL) throws java.io.IOException
- Specified by:
incrementColumnValue in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
incrementColumnValue
public long incrementColumnValue(byte[] row, byte[] family, byte[] qualifier, long amount, org.apache.hadoop.hbase.client.Durability durability) throws java.io.IOException
- Specified by:
incrementColumnValue in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
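incrementColumnValue treats the cell as a counter: in the HBase model the value is an 8-byte big-endian signed long (the org.apache.hadoop.hbase.util.Bytes.toBytes(long) encoding), and the server atomically reads, adds the amount, and writes back. A local sketch of just the encoding and arithmetic, with no claim about the server-side implementation:

```java
import java.nio.ByteBuffer;

public class IncrementSketch {
    // Apply an increment to a counter cell stored as an 8-byte big-endian long.
    // An absent cell is treated as zero; negative amounts decrement. The real
    // incrementColumnValue does this atomically on the server.
    static byte[] increment(byte[] cell, long amount) {
        long current = (cell == null) ? 0L : ByteBuffer.wrap(cell).getLong();
        return ByteBuffer.allocate(Long.BYTES).putLong(current + amount).array();
    }

    public static void main(String[] args) {
        byte[] cell = null;
        cell = increment(cell, 5);   // counter is now 5
        cell = increment(cell, -2);  // counter is now 3
        System.out.println(ByteBuffer.wrap(cell).getLong()); // prints 3
    }
}
```

The writeToWAL and Durability overloads control whether the increment is logged before being acknowledged; trading durability for speed is the only difference between the three variants.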
-
setFlushOnRead
public void setFlushOnRead(boolean val)
-
shouldFlushOnRead
public boolean shouldFlushOnRead()
-
setAutoFlush
public void setAutoFlush(boolean autoFlush)
- Specified by:
setAutoFlush in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
-
setAutoFlush
public void setAutoFlush(boolean autoFlush, boolean clearBufferOnFail)
- Specified by:
setAutoFlush in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
-
isAutoFlush
public boolean isAutoFlush()
- Specified by:
isAutoFlush in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
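The autoFlush flag follows the standard HBase client model: when it is false, put() only queues mutations in a client-side write buffer, and nothing reaches the server until flushCommits() is called (or the buffer fills, or the table is closed). A simplified local model of that buffering, with hypothetical names and String stand-ins for Put objects:

```java
import java.util.ArrayList;
import java.util.List;

public class AutoFlushSketch {
    // Simplified client-side write buffer. With autoFlush disabled, put() only
    // queues; flushCommits() sends the whole batch in one round trip.
    static boolean autoFlush = false;
    static final List<String> buffer = new ArrayList<>();
    static int sent = 0; // stand-in for "mutations delivered to the server"

    static void put(String mutation) {
        buffer.add(mutation);
        if (autoFlush) {
            flushCommits(); // autoFlush on: every put is sent immediately
        }
    }

    static void flushCommits() {
        sent += buffer.size();
        buffer.clear();
    }

    public static void main(String[] args) {
        put("p1");
        put("p2");
        System.out.println(sent + " sent, " + buffer.size() + " buffered"); // 0 sent, 2 buffered
        flushCommits();
        System.out.println(sent + " sent, " + buffer.size() + " buffered"); // 2 sent, 0 buffered
    }
}
```

This is why setFlushOnRead exists alongside it: with buffering enabled, a read can observe stale data unless pending writes are flushed first.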
-
getRegionLocation
public org.apache.hadoop.hbase.HRegionLocation getRegionLocation(byte[] row) throws java.io.IOException
- Specified by:
getRegionLocation in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
getStartEndKeys
public org.apache.hadoop.hbase.util.Pair<byte[][],byte[][]> getStartEndKeys() throws java.io.IOException
- Specified by:
getStartEndKeys in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
getRegionLocations
public java.util.NavigableMap<org.apache.hadoop.hbase.HRegionInfo,org.apache.hadoop.hbase.ServerName> getRegionLocations() throws java.io.IOException
- Specified by:
getRegionLocations in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
checkMutation
protected void checkMutation(org.apache.hadoop.hbase.client.Mutation m) throws java.io.IOException
- Throws:
java.io.IOException
-
checkAndMutate
public boolean checkAndMutate(byte[] row, byte[] family, byte[] qualifier, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp compareOp, byte[] value, org.apache.hadoop.hbase.client.RowMutations rm) throws java.io.IOException
- Overrides:
checkAndMutate in class org.apache.hadoop.hbase.client.mapr.AbstractHTable
- Throws:
java.io.IOException
-
checkAndMutateImpl
public boolean checkAndMutateImpl(byte[] row, byte[] family, byte[] qualifier, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp compareOp, byte[] value, org.apache.hadoop.hbase.client.RowMutations rm, boolean throwerr) throws java.io.IOException
- Throws:
java.io.IOException
-
-