Erik Hofman <[EMAIL PROTECTED]> writes:

> Alex Romosan wrote:
>> or you can add a call to glEnable(GL_POINT_SPRITE):
>
>> +    glEnable(GL_POINT_SPRITE);
>> this allowed me to use GL_POINT_SMOOTH on my nvidia card (enhanced
>> lighting works fine now).
>> i think this would be a better solution (tested only on my nvidia
>> card though).
>
> Ok, I've added support for point sprites. It does indeed increase
> the framerate on my PC. I find this a rather peculiar extension,
> though, especially since SGI has always used point sprites without
> naming them.

i still think we are papering over a real bug in the fgfs display code
when we enable point sprites. the attached program:

#include <stdio.h>
#include <stdlib.h>

#include <GL/gl.h>      /* Header File For The OpenGL Library */
#include <GL/glu.h>     /* Header File For The GLU Library */
#include <GL/glut.h>

#define NB_PIXELS 1000

/* display callback: scatter NB_PIXELS random points in a cube and
   spin them forever.  done is never set, so the do/while below is a
   deliberate infinite loop -- the program animates by busy-looping
   inside the callback instead of registering an idle function. */
void GLUTdraw(void)
{
  int i;
  int done = 0;
  GLfloat pixels[NB_PIXELS * 3];

  for (i = 0; i < NB_PIXELS; i++) {
    pixels[3*i]     = rand() % 250 - 125;
    pixels[3*i + 1] = rand() % 250 - 125;
    pixels[3*i + 2] = rand() % 250 - 125;
  }

  do {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* accumulate two small rotations every frame */
    glRotatef(2.0, 1.0, 1.0, 1.0);
    glRotatef(1.0, 0.0, 1.0, 1.0);

    glColor4ub(255, 255, 255, 255);
    glBegin(GL_POINTS);
    for (i = 0; i < NB_PIXELS; i++)
      glVertex3f(pixels[3*i], pixels[3*i + 1], pixels[3*i + 2]);
    glEnd();

    glutSwapBuffers();
  } while (!done);
}

int main(int argc, char *argv[])
{
  glutInit(&argc, argv);                 /* initialize the toolkit */
  glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH | GLUT_STENCIL);
  glutInitWindowSize(640, 480);          /* set window size */
  glutInitWindowPosition(100, 150);      /* set window position on screen */
  glutCreateWindow("testgl");            /* open the screen window */

  glutDisplayFunc(GLUTdraw);

  glViewport(0, 0, 640, 480);

  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  glOrtho(-100, 100, -100, 100, -500, 500);

  glMatrixMode(GL_MODELVIEW);
  glLoadIdentity();

  /* antialiased points, no point sprites anywhere */
  glEnable(GL_POINT_SMOOTH);
  glHint(GL_POINT_SMOOTH_HINT, GL_DONT_CARE);
  glPointSize(5.0f);

  glutMainLoop();

  return 0;
}

enables GL_POINT_SMOOTH and runs just fine on my nvidia card without
the need to enable GL_POINT_SPRITE. so it's something else we do
along the way that gets the driver into a very confused state.
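
one way to narrow this down would be to dump the point-related state
right before the glBegin(GL_POINTS) in both fgfs and the test program
and diff the output. the helper below isn't in either program -- it's
just a sketch built from standard GL queries:

/* hypothetical debugging helper: print the state that plausibly
   affects point rendering, so the two code paths can be compared */
static void dump_point_state(const char *tag)
{
    GLfloat size;
    glGetFloatv(GL_POINT_SIZE, &size);
    printf("%s: smooth=%d blend=%d depth=%d size=%.1f err=0x%x\n",
           tag,
           glIsEnabled(GL_POINT_SMOOTH),
           glIsEnabled(GL_BLEND),
           glIsEnabled(GL_DEPTH_TEST),
           size,
           glGetError());
}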

btw, if i understand correctly GL_POINT_SMOOTH is supposed to give
you round points instead of square ones (the GL_FASTEST/GL_NICEST
hints ask for the most efficient/most correct implementation, while
GL_DONT_CARE means the client has no preference). GL_POINT_SPRITE
also lets you map a texture across each point if GL_COORD_REPLACE is
enabled.
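
in case it helps anyone reading along, the whole interface is just a
couple of calls on top of normal texturing. this is a minimal sketch
of the extension's usage, not code from fgfs (it assumes a texture
object is already bound; older headers spell the tokens
GL_POINT_SPRITE_ARB and GL_COORD_REPLACE_ARB):

/* draw one textured point sprite: the driver generates texture
   coordinates across the point when GL_COORD_REPLACE is on */
glEnable(GL_TEXTURE_2D);
glEnable(GL_POINT_SPRITE);
glTexEnvi(GL_POINT_SPRITE, GL_COORD_REPLACE, GL_TRUE);
glPointSize(16.0f);

glBegin(GL_POINTS);
glVertex3f(0.0f, 0.0f, 0.0f);   /* one sprite centered here */
glEnd();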

still trying to figure out the real reason why the nvidia driver is
slow when we enable GL_POINT_SMOOTH in fgfs (and learning a lot more
about OpenGL than i ever wanted to know).
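
in the meantime, a crude way to put numbers on the slowdown is to
count frames against GLUT's millisecond clock. this isn't in the test
program above, just a sketch of something one could call once per
frame from the draw loop:

/* report frames per second roughly once a second;
   glutGet(GLUT_ELAPSED_TIME) returns milliseconds since glutInit() */
static int frames = 0, t0 = 0;

void count_frame(void)
{
    int t = glutGet(GLUT_ELAPSED_TIME);
    frames++;
    if (t - t0 >= 1000) {
        printf("%.1f fps\n", frames * 1000.0 / (t - t0));
        frames = 0;
        t0 = t;
    }
}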

--alex--

-- 
| I believe the moment is at hand when, by a paranoiac and active |
|  advance of the mind, it will be possible (simultaneously with  |
|  automatism and other passive states) to systematize confusion  |
|  and thus to help to discredit completely the world of reality. |
