Are you on trunk? The Filter interface is richer there. As to whether the below
is possible, it sounds reasonable, though I can't say for certain unless I dug
in. You might be up against the edge of what's possible. It seems like you want
to detect when we change rows in your filter. Filters are inherently
row-scoped. Perhaps that is the issue you are running into?
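The row-scoped pattern under discussion — look at every cell, remember whether any column of the target family matched, and only decide at the end of the row — can be sketched without any HBase dependency. This is an illustrative model only, not HBase's actual Filter interface: the class name, the two-argument `filterKeyValue` signature, and the `ReturnCode` enum here are assumptions that merely echo the 0.20-era Filter contract (`filterKeyValue`, `filterRow`, `reset`).

```java
import java.util.Arrays;

// Dependency-free sketch of the "match a value in any column of a family,
// decide per row" filter described in this thread. Illustrative only.
public class FamilyValueFilter {
    public enum ReturnCode { INCLUDE, SKIP, NEXT_ROW }

    private final byte[] family;
    private final byte[] value;
    private boolean foundMatch = false;

    public FamilyValueFilter(byte[] family, byte[] value) {
        this.family = family;
        this.value = value;
    }

    // Called once per cell. We must not return NEXT_ROW on a miss, because
    // a later column -- or another family -- in the same row may still match.
    public ReturnCode filterKeyValue(byte[] cellFamily, byte[] cellValue) {
        if (Arrays.equals(cellFamily, family) && Arrays.equals(cellValue, value)) {
            foundMatch = true;
        }
        return ReturnCode.INCLUDE;
    }

    // The per-row "last whack": returning true excludes the whole row.
    public boolean filterRow() {
        return !foundMatch;
    }

    // Per-row state must be cleared before the scanner moves to the next row.
    public void reset() {
        foundMatch = false;
    }
}
```

If the scanner honors `filterRow()`, a row with no matching column is dropped wholesale even though every cell was INCLUDEd along the way — which is why an early `NEXT_ROW` (as in SingleColumnValueFilter) is the wrong tool for this multi-column check.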
Look at trunk. I believe you get a last whack at influencing what's returned
just before it's passed back to the client per row, IIRC.
St.Ack

On Fri, Jun 4, 2010 at 12:40 PM, Raghava Mutharaju
<[email protected]> wrote:
> Hi Stack,
>
> The custom filter is meant to do this --- given a columnFamily and a
> value, it should check in all the columns whether that value is present or
> not. If it is present then include the row, or else skip it.
>
> I started off by writing this along the lines of SingleColumnValueFilter
> (scvf). But the important difference is that in scvf, if the column is found
> and the given value is not found against that column, then we can skip to
> the next row (using ReturnCode.NEXT_ROW). In my case, if there is a family
> match, I have to check all the columns. Even if the family does not match,
> I cannot do a skip or next_row because there could be a match with other
> families of the row.
>
> I put in a few log statements and traced the flow. I am wondering whether
> this type of filter is possible?
>
> Regards,
> Raghava.
>
> On Fri, Jun 4, 2010 at 10:50 AM, Stack <[email protected]> wrote:
>
>> Then something else is going on. Can you do a bit of digging?
>> St.Ack
>>
>> On Thu, Jun 3, 2010 at 11:38 PM, Raghava Mutharaju
>> <[email protected]> wrote:
>>> Nope, it doesn't.
>>>
>>> After recompiling, I just changed $HBASE_HOME to the newly created build
>>> folder. I think that is the only change required?
>>>
>>> Regards,
>>> Raghava.
>>>
>>> On Fri, Jun 4, 2010 at 1:46 AM, Stack <[email protected]> wrote:
>>>
>>>> Does your filter start working if you add it in here with a code of its
>>>> own and recompile and relaunch?
>>>> St.Ack
>>>>
>>>> On Thu, Jun 3, 2010 at 10:40 PM, Raghava Mutharaju
>>>> <[email protected]> wrote:
>>>>> I found the codes in the HBaseObjectWritable file and saw the jira
>>>>> too :). This change only affects the performance, doesn't it
>>>>> (particularly for small cell data), not the actual functionality?
>>>>>
>>>>> Regards,
>>>>> Raghava.
>>>>>
>>>>> On Fri, Jun 4, 2010 at 1:18 AM, Stack <[email protected]> wrote:
>>>>>
>>>>>> Look in that java file and see how all extant filters are mentioned
>>>>>> and assigned a code. You need to add yours too.
>>>>>> St.Ack
>>>>>>
>>>>>> On Thu, Jun 3, 2010 at 10:14 PM, Raghava Mutharaju
>>>>>> <[email protected]> wrote:
>>>>>>> Thank you Angus and Guilherme :). I added the filter jar to
>>>>>>> $HBase/lib and the exceptions went away. Although, the filter now
>>>>>>> doesn't return any rows - I have to check this out.
>>>>>>>
>>>>>>> Stack:
>>>>>>> About HBaseObjectWritable, I have used it in serializing (write()
>>>>>>> method) and deserializing (readFields() method) as was done in other
>>>>>>> existing filters. Are you referring to this or something else? What
>>>>>>> does adding it to HBaseObjectWritable mean?
>>>>>>>
>>>>>>> Thank you again.
>>>>>>>
>>>>>>> Regards,
>>>>>>> Raghava.
>>>>>>>
>>>>>>> On Fri, Jun 4, 2010 at 1:04 AM, Stack <[email protected]> wrote:
>>>>>>>
>>>>>>>> You have to add it to HBaseObjectWritable too, IIRC. See how other
>>>>>>>> filters are mentioned in there (the need to do this has to go away;
>>>>>>>> I filed HBASE-2666).
>>>>>>>> St.Ack
>>>>>>>>
>>>>>>>> On Thu, Jun 3, 2010 at 9:01 PM, Angus He <[email protected]> wrote:
>>>>>>>>> Even if HBase is running in standalone mode, the scan operation is
>>>>>>>>> still running in another jvm, to be specific, in the HMaster
>>>>>>>>> process.
>>>>>>>>>
>>>>>>>>> So you still have to either put the custom filter jar in $HBASE/lib
>>>>>>>>> or set up $HBASE/conf/hbase-env.sh properly.
>>>>>>>>>
>>>>>>>>> On Fri, Jun 4, 2010 at 4:37 AM, Raghava Mutharaju
>>>>>>>>> <[email protected]> wrote:
>>>>>>>>>> The custom filter doesn't need any additional jars.
>>>>>>>>>> Another point I forgot to mention is that I am running this on a
>>>>>>>>>> single node (laptop) to test my filter.
>>>>>>>>>>
>>>>>>>>>> Regards,
>>>>>>>>>> Raghava.
>>>>>>>>>>
>>>>>>>>>> On Thu, Jun 3, 2010 at 4:23 PM, Guilherme Germoglio
>>>>>>>>>> <[email protected]> wrote:
>>>>>>>>>>
>>>>>>>>>>> please check if the jars needed for your custom filter
>>>>>>>>>>> implementation are in hbase's classpath
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Jun 3, 2010 at 5:16 PM, Raghava Mutharaju
>>>>>>>>>>> <[email protected]> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi all,
>>>>>>>>>>>>
>>>>>>>>>>>> I wrote a custom filter and used it with scan. I am getting the
>>>>>>>>>>>> following exceptions. If I use any built-in filters, it works
>>>>>>>>>>>> fine. I searched around and one of the suggestions was to
>>>>>>>>>>>> increase the lease timeout. Since it works fine for built-in
>>>>>>>>>>>> filters, I am assuming that this is not the case. Is it that
>>>>>>>>>>>> there is something wrong with my filter implementation?
>>>>>>>>>>>>
>>>>>>>>>>>> Regards,
>>>>>>>>>>>> Raghava.
>>>>>>>>>>>>
>>>>>>>>>>>> Master log:
>>>>>>>>>>>>
>>>>>>>>>>>> org.apache.hadoop.hbase.UnknownScannerException: Name: -1
>>>>>>>>>>>>   at org.apache.hadoop.hbase.regionserver.HRegionServer.next(HRegionServer.java:1889)
>>>>>>>>>>>>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>>>>>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>>>>>>>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>>>>>>>   at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>>>>>>>>   at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:657)
>>>>>>>>>>>>   at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:915)
>>>>>>>>>>>> 2010-06-03 16:02:10,775 INFO org.apache.hadoop.ipc.HBaseServer:
>>>>>>>>>>>> IPC Server handler 9 on 60770, call next(-1, 1) from
>>>>>>>>>>>> 130.108.56.225:61091: error:
>>>>>>>>>>>> org.apache.hadoop.hbase.UnknownScannerException: Name: -1
>>>>>>>>>>>>
>>>>>>>>>>>> Console:
>>>>>>>>>>>>
>>>>>>>>>>>> Exception in thread "main"
>>>>>>>>>>>> org.apache.hadoop.hbase.client.RetriesExhaustedException: Trying
>>>>>>>>>>>> to contact region server 130.108.56.225:60770 for region
>>>>>>>>>>>> table1,,1274648045785, row '', but failed after 10 attempts.
>>>>>>>>>>>> Exceptions:
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>> java.io.IOException: Call to /130.108.56.225:60770 failed on local exception: java.io.EOFException
>>>>>>>>>>>>
>>>>>>>>>>>>   at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.getRegionServerWithRetries(HConnectionManager.java:1055)
>>>>>>>>>>>>   at org.apache.hadoop.hbase.client.HTable$ClientScanner.nextScanner(HTable.java:2003)
>>>>>>>>>>>>   at org.apache.hadoop.hbase.client.HTable$ClientScanner.initialize(HTable.java:1923)
>>>>>>>>>>>>   at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:403)
>>>>>>>>>>>>   at org.knoesis.reasoning.MR.FilterReadClient.main(FilterReadClient.java:58)
>>>>>>>>>>>>
>>>>>>>>>>>> Line 58 above has a call to getScanner(CustomFilterInstance) of HTable.
>>>>>>>>>>>
>>>>>>>>>>> --
>>>>>>>>>>> Guilherme
>>>>>>>>>>>
>>>>>>>>>>> msn: [email protected]
>>>>>>>>>>> homepage: http://sites.google.com/site/germoglio/
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Regards
>>>>>>>>> Angus
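The write()/readFields() pair Raghava mentions follows Hadoop's Writable pattern: the client flattens the filter's fields into bytes, and the region server rebuilds the filter via a no-arg constructor plus readFields(). Below is a stand-alone sketch of that pattern. The class name, the length-prefixed byte layout, and the `toBytes`/`fromBytes` helpers are assumptions added for illustration; real 0.20-era filters used HBase's own byte utilities rather than hand-rolled framing.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// Writable-style serialization sketch: write() is what the client sends,
// readFields() is what the server uses to reconstruct the filter.
public class FamilyValueFilterIO {
    private byte[] family;
    private byte[] value;

    public FamilyValueFilterIO() { }  // no-arg ctor needed for deserialization
    public FamilyValueFilterIO(byte[] family, byte[] value) {
        this.family = family;
        this.value = value;
    }

    public void write(DataOutput out) throws IOException {
        out.writeInt(family.length);   // length-prefix each field
        out.write(family);
        out.writeInt(value.length);
        out.write(value);
    }

    public void readFields(DataInput in) throws IOException {
        family = new byte[in.readInt()];
        in.readFully(family);
        value = new byte[in.readInt()];
        in.readFully(value);
    }

    // Convenience round-trip helpers, illustrative only.
    public byte[] toBytes() {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            write(new DataOutputStream(bos));
            return bos.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static FamilyValueFilterIO fromBytes(byte[] data) {
        try {
            FamilyValueFilterIO f = new FamilyValueFilterIO();
            f.readFields(new DataInputStream(new ByteArrayInputStream(data)));
            return f;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public byte[] getFamily() { return family; }
    public byte[] getValue() { return value; }
}
```

Note that correct serialization is separate from registration: even a perfectly symmetric write/readFields pair fails at RPC time if the class is not known to HBaseObjectWritable on both sides, which is the point Stack makes about assigning the filter its own code.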

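Angus's observation is the key deployment fact in this thread: the scan executes in the server JVM, not the client's, so a filter class present only on the client classpath cannot be resolved server side, and the failure surfaces opaquely on the client (here as EOFException). A JVM's view of a class can be probed directly with `Class.forName`; the second class name used in the check below is a made-up placeholder, not a class from this thread.

```java
// Minimal probe for "is this class visible to the current JVM's classpath?" --
// the same question the region server implicitly asks when deserializing a
// custom filter. Illustrative only.
public class ClasspathCheck {
    public static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }
}
```

Running such a probe inside the server process (or simply dropping the filter jar into $HBASE/lib on every node and restarting, as suggested above) removes the classpath variable from the debugging.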