How many columns? It's columns, right, and not column families? Are the 1k rows contiguous? Can you Scan? For the insert of 1k rows, you know how to do that now, right? Will they be substantial rows -- 10s to 100s of KBs -- or just small? Is multiput available in the REST interface? I don't recall.
Try REST since you know that interface. Jython might be faster, though a test done more than a year ago showed jython as slow (http://ryantwopointoh.blogspot.com/2009/01/performance-of-hbase-importing.html); a bunch has changed since then -- hbase-wise, and jython has probably gotten a lot better. If you go the jython route, make sure you keep the interpreter afloat rather than launching it per request (so yes, fastcgi would make sense).

St.Ack

On Fri, Dec 10, 2010 at 9:59 PM, Jack Levin <[email protected]> wrote:
> Hello. We plan to run a set of queries on tables with multiple
> columns. What is the most efficient method to, say, insert 1000 rows
> and/or read 1000 rows?
> We are considering just using REST, but what about jython? Will it be
> faster? Another option would be to have our apps talk to nginx and some
> sort of app tier running via fast-cgi.
>
> Any ideas?
>
> -Jack
>
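For the jython route, a rough sketch of a batched 1k-row insert and a caching scan against the Java client could look like the below. The table name 'mytable', family 'cf', qualifier 'col', and the 'row-NNNNN' keys are made-up placeholders, and it assumes a 0.90-era client with the hbase jars and conf on jython's classpath:

```python
# Rough sketch: batched write/read of 1000 rows from jython via the HBase Java client.
# 'mytable', 'cf', 'col', and the row keys are placeholders; adjust to your schema.
from org.apache.hadoop.hbase import HBaseConfiguration
from org.apache.hadoop.hbase.client import HTable, Put, Scan
from org.apache.hadoop.hbase.util import Bytes
from java.util import ArrayList

conf = HBaseConfiguration.create()
table = HTable(conf, 'mytable')
table.setAutoFlush(False)            # buffer puts client-side instead of one RPC per row

puts = ArrayList()
for i in range(1000):
    p = Put(Bytes.toBytes('row-%05d' % i))
    p.add(Bytes.toBytes('cf'), Bytes.toBytes('col'), Bytes.toBytes('value-%d' % i))
    puts.add(p)

table.put(puts)                      # HTable.put(List<Put>) ships the batch
table.flushCommits()                 # push anything still sitting in the write buffer

# Reading 1k contiguous rows: scan the key range and fetch in big batches.
scan = Scan(Bytes.toBytes('row-00000'), Bytes.toBytes('row-01000'))
scan.setCaching(1000)                # avoid one RPC per row
scanner = table.getScanner(scan)
for result in scanner:
    pass                             # result.getValue(family, qualifier) pulls cells
scanner.close()
table.close()
```

Whatever sits in front of this (fastcgi or otherwise) should keep the interpreter and the HTable open across requests rather than paying JVM and connection startup on every call.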
