On Sep 19, 2011, at 19:46, Stephen Frost wrote:
> I agree that it'd be interesting to do, but I share Lord Stark's
> feelings about the challenges and lack of potential gain - it's a very
> small set of queries that would benefit from this. You need to be
> working with enough data to make the cost of transferring it all over to
> the GPU worthwhile, just for starters.
I wonder if anyone has ever tried to employ a GPU for more low-level tasks. Things like sorting or hashing are hard to move to the GPU in postgres because, in the general case, they involve essentially arbitrary user-defined functions. But couldn't, for example, the WAL CRC computation be moved to a GPU? Or, to get really crazy, even the search for the optimal join order (only for a large number of joins, though, i.e. where we currently switch to a genetic algorithm)?

best regards,
Florian Pflug

--
Sent via pgsql-hackers mailing list (email@example.com)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-hackers