If filtering and sorting speed is the main focus of your queries, with few join operations, you might not get any benefit from HAWQ or GPDB; they are analytic databases. What is the total size of the dataset? Maybe Geode can help in your case.
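To make the distinction concrete, here is a minimal sketch of the workload shape described in the thread: a single-table filter plus sort that an index-backed store answers without a full scan. It uses sqlite3 purely as a stand-in engine (not HAWQ, GPDB, or Geode); the table and column names follow the example query in the quoted email, and the index name is hypothetical.

```python
import sqlite3

# In-memory stand-in database; table t(a, b, c) mirrors the query
# in the original question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b TEXT, c INTEGER)")
conn.executemany(
    "INSERT INTO t VALUES (?, ?, ?)",
    [(1, "hello", 3), (1, "world", 2), (2, "hello", 1)],
)

# A composite index on the filter columns lets the engine locate the
# matching rows directly instead of scanning the whole table -- the
# access pattern that favors an indexed store over a scan-oriented
# analytic database for this kind of query.
conn.execute("CREATE INDEX idx_t_a_b ON t (a, b)")

rows = conn.execute(
    "SELECT a, b, c FROM t WHERE a = 1 AND b = 'hello' ORDER BY 1, 2"
).fetchall()
print(rows)  # -> [(1, 'hello', 3)]
```

At billions of rows, the same principle is what makes a point-lookup system return in seconds while a scan-based engine must touch far more data.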
On Thursday, January 28, 2016, 陶进 <[email protected]> wrote:
> hi guys,
>
> We have several huge tables, and some of them have more than 10 billion rows. Each table has the same columns, and each row is about 100 bytes.
>
> Our queries run on a single table to filter and sort some records, such as: select a, b, c from t where a = 1 and b = 'hello' order by 1, 2.
>
> Now we use MongoDB, and the biggest table has 4 billion rows; it can return in 10 seconds. Now we want to use HAWQ as our query engine. Could it run the above query in 10 seconds? What hardware would the servers need? How many nodes would be needed?
>
> Thanks.
>
> ---
> This email has been checked for viruses by Avast antivirus software.
> https://www.avast.com/antivirus

-- Yu-wei Sung
