What's wrong with this OpenGL model picking code?

  • openglNewbie:

    I am making a simple model viewer using OpenGL. When I try to pick an object, OpenGL returns either nothing or an object that is somewhere else entirely.

    This is my code:

    GLuint buff[1024] = {0};
    GLint hits,view[4];
    
    glSelectBuffer(1024,buff);
    glGetIntegerv(GL_VIEWPORT, view);
    
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    gluPickMatrix(x,y,1.0,1.0,view);
    gluPerspective(45,(float)view[2]/(float)view[4],1.0,1500.0);
    
    glMatrixMode(GL_MODELVIEW);
    glRenderMode(GL_SELECT);
    
    glLoadIdentity();
    // I apply the same transformations as in the normal render pass
    glTranslatef(0, 0, -zoom);
    glMultMatrixf(transform.M);
    glInitNames();
    glPushName(-1);
    for (int j = 0; j < allNodes.size(); j++)
    {
        glLoadName(allNodes.at(j)->id);
        allNodes.at(j)->Draw(textures);
    }
    glPopName();
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    hits = glRenderMode(GL_RENDER);
    

  • Do you have an ATI card? I think they deprecated some portions of OpenGL (picking included) a long time ago.

  • No one uses OpenGL's built-in picking these days; you have to implement it yourself. Transforming from camera space to world space is a bit tricky; here is a good article: http://trac.bookofhook.com/bookofhook/trac.cgi/wiki/MousePicking

  • Ok, but isn't OpenGL picking still a reasonable choice for a 3D viewer application? Mine didn't work because I had made two mistakes:

    1. The aspect ratio used the wrong viewport index (view[4] instead of view[3]); it should be:

    gluPerspective(45,(float)view[2]/(float)view[3],1.0,1500.0);
    

    2. The y coordinate passed to gluPickMatrix must be flipped, because window coordinates grow downward while OpenGL's viewport coordinates grow upward:

    gluPickMatrix(x,view[3]-y,1.0,1.0,view);
    

    So it looks like picking does still work on ATI; a friend told me that picking may be handled by a software implementation of OpenGL.

  • OpenGL picking using the selection buffer is deprecated. If your scene is small enough, you should use colour-based picking; otherwise, you can use gluUnProject to obtain a world coordinate from a mouse coordinate and a depth read from the depth buffer using glReadPixels().

Tags
c++ opengl graphics-programming