Re: [HACKERS] Using the GPU

2007-06-09 Thread Nicolas Barbier

2007/6/9, Gregory Stark [EMAIL PROTECTED]:

> There has been some interesting research on sorting using the GPU which could
> be very interesting for databases.
>
> However I think Postgres would be unlikely to go the route of having compiled
> driver code for every possible video card. It's unlikely to be interesting for
> database developers until there's some abstract interface designed for these
> kinds of optimizations which it can use without caring about the specific
> graphics card.
>
> Perhaps this can be done using OpenGL already but I kind of doubt it.

http://en.wikipedia.org/wiki/GLSL

There are (of course) competing standards such as:

http://en.wikipedia.org/wiki/High_Level_Shader_Language

and:

http://en.wikipedia.org/wiki/Cg_%28programming_language%29

greetings,
Nicolas

--
Nicolas Barbier
http://www.gnu.org/philosophy/no-word-attachments.html

---(end of broadcast)---
TIP 1: if posting/reading through Usenet, please send an appropriate
  subscribe-nomail command to [EMAIL PROTECTED] so that your
  message can get through to the mailing list cleanly


Re: [HACKERS] Using the GPU

2007-06-09 Thread Lukas Kahwe Smith

Gregory Stark wrote:
> Billings, John [EMAIL PROTECTED] writes:
>
>> Does anyone think that PostgreSQL could benefit from using the video
>> card as a parallel computing device?  I'm working on a project using
>> Nvidia's CUDA with an 8800 series video card to handle non-graphical
>> algorithms.  I'm curious if anyone thinks that this technology could be
>> used to speed up a database?  If so which part of the database, and what
>> kind of parallel algorithms would be used?
>
> There has been some interesting research on sorting using the GPU which could
> be very interesting for databases.


Without knowing a thing about all of this, my first thought is that it might
be useful for GIS and things of that sort.


regards,
Lukas



Re: [HACKERS] Using the GPU

2007-06-09 Thread Jeroen T. Vermeulen
On Sat, June 9, 2007 07:36, Gregory Stark wrote:
> Billings, John [EMAIL PROTECTED] writes:
>
>> Does anyone think that PostgreSQL could benefit from using the video
>> card as a parallel computing device?  I'm working on a project using
>> Nvidia's CUDA with an 8800 series video card to handle non-graphical
>> algorithms.  I'm curious if anyone thinks that this technology could be
>> used to speed up a database?  If so which part of the database, and what
>> kind of parallel algorithms would be used?
>
> There has been some interesting research on sorting using the GPU which
> could be very interesting for databases.
>
> Perhaps this can be done using OpenGL already but I kind of doubt it.

GPUs have been used to great effect for spatial joins.  And yes, using
OpenGL so that it was portable.  I saw a paper about that as an Oracle
plugin a few years back.

It works something like this, IIRC: a spatial join looks for objects that
overlap with the query area.  Normally you go through an R-tree index to
identify objects that are in the same general area (space-filling curves
help there).  Then you filter the objects you get, to see which ones
actually overlap your query area.

The GL trick inserted an intermediate filter that set up the objects found
in the R-tree index, and the query area, as 3D objects.  Then it used GL's
collision detection as an intermediate filter to find apparent matches. 
It has to be slightly conservative because GL doesn't make the sort of
guarantees you'd want for this trick, so there's a final software pass
that only needs to look at cases where there's any doubt.
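For concreteness, that three-stage pipeline can be sketched in plain Python, with the GL collision pass played by a conservative bounding-box test. Everything here is illustrative (the object shapes, function names, and padding are made up, not taken from the Oracle plugin); the point is only the structure: a coarse index probe, a conservative hardware-style filter that may keep false positives but must never drop a true match, and an exact software pass at the end.

```python
from math import hypot

# Each object is a circle ((cx, cy), r); the coarse filters only ever
# see its axis-aligned bounding box.
def bbox(circle):
    (cx, cy), r = circle
    return (cx - r, cy - r, cx + r, cy + r)

def boxes_overlap(a, b):
    return not (a[2] < b[0] or a[0] > b[2] or a[3] < b[1] or a[1] > b[3])

def rtree_candidates(circles, query_box, pad=1.0):
    # Stage 1: stand-in for the R-tree probe, padded to mimic the
    # coarse granularity of index pages.
    q = (query_box[0] - pad, query_box[1] - pad,
         query_box[2] + pad, query_box[3] + pad)
    return [c for c in circles if boxes_overlap(bbox(c), q)]

def conservative_prefilter(candidates, query_box):
    # Stage 2: plays the role of the GL collision pass -- it may keep
    # non-matches, but never rejects a true match.
    return [c for c in candidates if boxes_overlap(bbox(c), query_box)]

def exact_pass(survivors, query_box):
    # Stage 3: exact circle-vs-rectangle test in software, applied only
    # to what survived the cheap filters.
    hits = []
    for (cx, cy), r in survivors:
        nx = min(max(cx, query_box[0]), query_box[2])
        ny = min(max(cy, query_box[1]), query_box[3])
        if hypot(cx - nx, cy - ny) <= r:
            hits.append(((cx, cy), r))
    return hits

circles = [((0, 0), 1), ((5, 5), 1), ((2.8, 2.8), 1)]
query = (0, 0, 2, 2)
stage1 = rtree_candidates(circles, query)        # drops the far-away circle
stage2 = conservative_prefilter(stage1, query)   # keeps one false positive
stage3 = exact_pass(stage2, query)               # exact answer
```

The circle at (2.8, 2.8) survives the bounding-box filter but fails the exact test, which is exactly the kind of "apparent match" the final software pass exists to weed out.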


Jeroen





[HACKERS] Using the GPU

2007-06-08 Thread Billings, John
  
Does anyone think that PostgreSQL could benefit from using the video
card as a parallel computing device?  I'm working on a project using
Nvidia's CUDA with an 8800 series video card to handle non-graphical
algorithms.  I'm curious if anyone thinks that this technology could be
used to speed up a database?  If so which part of the database, and what
kind of parallel algorithms would be used?
Thanks, sorry if this is a duplicate message.
-- John Billings
 


Re: [HACKERS] Using the GPU

2007-06-08 Thread Vincent Janelle
Aren't most databases constrained by I/O?  And PostgreSQL by how fast
your kernel can switch between processes under a concurrent load?

 

From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Billings, John
Sent: Friday, June 08, 2007 10:55 AM
To: pgsql-hackers@postgresql.org
Subject: [HACKERS] Using the GPU

Does anyone think that PostgreSQL could benefit from using the video
card as a parallel computing device?  I'm working on a project using
Nvidia's CUDA with an 8800 series video card to handle non-graphical
algorithms.  I'm curious if anyone thinks that this technology could be
used to speed up a database?  If so which part of the database, and what
kind of parallel algorithms would be used?

Thanks, sorry if this is a duplicate message.

-- John Billings



Re: [HACKERS] Using the GPU

2007-06-08 Thread Gregory Stark
Billings, John [EMAIL PROTECTED] writes:

> Does anyone think that PostgreSQL could benefit from using the video
> card as a parallel computing device?  I'm working on a project using
> Nvidia's CUDA with an 8800 series video card to handle non-graphical
> algorithms.  I'm curious if anyone thinks that this technology could be
> used to speed up a database?  If so which part of the database, and what
> kind of parallel algorithms would be used?

There has been some interesting research on sorting using the GPU which could
be very interesting for databases.

However I think Postgres would be unlikely to go the route of having compiled
driver code for every possible video card. It's unlikely to be interesting for
database developers until there's some abstract interface designed for these
kinds of optimizations which it can use without caring about the specific
graphics card.

Perhaps this can be done using OpenGL already but I kind of doubt it.
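For the curious: the GPU sorting research of that era generally mapped bitonic sorting networks onto the card, because within each stage of the network every compare-exchange is independent and can run on a separate fragment/compute unit. A rough sketch of the network in plain Python (the inner loop is the part that would execute in parallel on a GPU; this is an illustration of the algorithm, not of any particular paper's implementation):

```python
def bitonic_sort(data):
    """Sort a power-of-two-sized list with a bitonic network.

    Every iteration of the inner `for i` loop within a fixed (k, j)
    stage is independent of the others -- that data-parallel structure
    is what makes the network GPU-friendly.
    """
    a = list(data)
    n = len(a)
    assert n and n & (n - 1) == 0, "network needs a power-of-two size"
    k = 2
    while k <= n:                # size of the bitonic runs being merged
        j = k // 2
        while j >= 1:            # compare-exchange distance
            for i in range(n):   # independent work items on a GPU
                partner = i ^ j
                if partner > i:
                    ascending = (i & k) == 0
                    if (a[i] > a[partner]) == ascending:
                        a[i], a[partner] = a[partner], a[i]
            j //= 2
        k *= 2
    return a
```

Note the trade-off: the network always performs O(n log^2 n) comparisons regardless of input, which is worse than quicksort's average case on a CPU, but the fixed, oblivious comparison pattern is exactly what the graphics hardware of the time could execute.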

-- 
  Gregory Stark
  EnterpriseDB  http://www.enterprisedb.com

