On Sat, June 9, 2007 07:36, Gregory Stark wrote:
> "Billings, John" <[EMAIL PROTECTED]> writes:
>> Does anyone think that PostgreSQL could benefit from using the video
>> card as a parallel computing device? I'm working on a project using
>> Nvidia's CUDA with an 8800 series video card to handle non-graphical
>> algorithms. I'm curious if anyone thinks that this technology could be
>> used to speed up a database? If so which part of the database, and what
>> kind of parallel algorithms would be used?
> There has been some interesting research on sorting using the GPU which
> could be very interesting for databases.
> Perhaps this can be done using OpenGL already but I kind of doubt it.
GPUs have been used to great effect for spatial joins. And yes, using
OpenGL, so that it was portable. I saw a paper a few years back that
implemented this as an Oracle plugin.
It works something like this, IIRC: a spatial join looks for objects that
overlap with the query area. Normally you go through an R-tree index to
identify objects that are in the same general area (space-filling curves
help there). Then you filter the objects you get, to see which ones
actually overlap your query area.
The GL trick inserted an intermediate filter that set up the objects found
in the R-tree index, and the query area, as 3D objects. Then it used GL's
collision detection as an intermediate filter to find apparent matches.
It has to be slightly conservative because GL doesn't make the sort of
guarantees you'd want for this trick, so there's a final software pass
that only needs to look at cases where there's any doubt.
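For concreteness, here's a rough Python sketch of that three-stage
pipeline. The GL collision-detection step is played by a deliberately
conservative stand-in that can report spurious overlaps but never
misses a real one; all names are made up for illustration, not taken
from any real library.

```python
# Sketch of filter-and-refine spatial join with a conservative
# intermediate filter, as described above. Boxes are axis-aligned
# rectangles: (xmin, ymin, xmax, ymax).

def exact_overlaps(a, b):
    # Exact rectangle-overlap test: the final software pass.
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def gpu_maybe_overlaps(a, b, slop=0.5):
    # Stand-in for the GL collision test. It checks slightly enlarged
    # boxes, so it's conservative: it may say "yes" when the answer is
    # "no", but never the reverse -- mimicking hardware that doesn't
    # guarantee exact results.
    grown = (b[0] - slop, b[1] - slop, b[2] + slop, b[3] + slop)
    return exact_overlaps(a, grown)

def spatial_join(index_candidates, query):
    # Stage 1 (not shown): index_candidates came from an R-tree lookup.
    # Stage 2: cheap conservative filter -- the GL collision step.
    maybes = [c for c in index_candidates if gpu_maybe_overlaps(query, c)]
    # Stage 3: exact software pass, only over the surviving candidates.
    return [c for c in maybes if exact_overlaps(query, c)]
```

The point is that the conservative middle stage is harmless to
correctness: anything it lets through in error is caught by the exact
pass, which now only has to examine the doubtful cases instead of every
index candidate.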