OpenGL ES Framebuffer creation problem

  • Neil M

    I'm having some problems porting my D3D code to OpenGL ES. I have a Graphics Device class that encapsulates all rendering commands. The code below is in an ObjC++ file.

    The problem is the calls that get the render buffer width and height: they don't actually return any values in the w or h GLints.

    It should be noted that the code below is from an older iPhone project, and it works there. Am I missing something really stupid?

    Checking the framebuffer status even returns complete; OpenGL ES seems quite happy to initialize a depth buffer with the garbage w and h values I pass to it! (A minimal diagnostic sketch follows the code below.)

    bool GraphicsDevice::Initialize(id<EAGLDrawable> eaglLayer)  
    {
       EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
       if(context == nil)
       { 
          return false;
       }
    
       [EAGLContext setCurrentContext:context];
    
       this->eaglContext = context;
    
       glGenFramebuffers(1, &frameBuffer);
       glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);
       glGenRenderbuffers(1, &renderBuffer);
       glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
       // attach the layer's storage to the colour renderbuffer
       [eaglContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:eaglLayer];
    
       GLint w, h;
       glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &w);
       glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &h);
    
       SetWidth(static_cast<unsigned int>(w));
       SetHeight(static_cast<unsigned int>(h));
    
       glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, renderBuffer);
    
       glGenRenderbuffers(1, &depthBuffer);
       glBindRenderbuffer(GL_RENDERBUFFER, depthBuffer);
       glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, w, h);
    
       glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthBuffer);
    
       GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
       if(status != GL_FRAMEBUFFER_COMPLETE)
       {
          System::Log("Framebuffer not complete");
          return false;
       }
    
       return true;
    }
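
    One way to narrow this down is to check whether the storage allocation itself succeeded, since renderbufferStorage:fromDrawable: returns a BOOL, and to poll glGetError() around the size query. This is only a diagnostic sketch, assuming the same eaglContext, renderBuffer and System::Log members used above:

     // Diagnostic sketch: renderbufferStorage:fromDrawable: returns NO when it
     // cannot attach storage for the drawable.
     if(![eaglContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:eaglLayer])
     {
        System::Log("renderbufferStorage:fromDrawable: failed");
        return false;
     }

     GLint w = 0, h = 0;
     glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &w);
     glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &h);

     // If the query itself is rejected, glGetError reports it; zero sizes also
     // indicate that no storage was attached.
     GLenum err = glGetError();
     if(err != GL_NO_ERROR || w == 0 || h == 0)
     {
        System::Log("Renderbuffer size query failed");
        return false;
     }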
    

  • In the book "iPhone 3D Programming" there are examples of this. The author uses the OES extension entry points, like:

    glGenRenderbuffersOES(1, &m_renderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_renderbuffer);
    

    or

    glGenFramebuffersOES(1, &m_framebuffer);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, m_framebuffer);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES,
                                 GL_COLOR_ATTACHMENT0_OES,
                                 GL_RENDERBUFFER_OES,
                                 m_renderbuffer);
    

    I've tested these commands:

    GLint w, h;
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &w);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &h);
    

    and the width and height are correct.

    You can download the book examples here

    See the "HelloArrow" project, in the class RenderingEngine1. Pieced together, the relevant setup looks roughly like the sketch below.
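
    This is only a sketch, assuming m_context is the current EAGLContext and layer is the CAEAGLLayer backing the view (the m_ names follow the book; the rest are placeholders):

    // ES 1.1 path: framebuffer objects come from the OES_framebuffer_object
    // extension, hence the OES-suffixed names.
    glGenRenderbuffersOES(1, &m_renderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_renderbuffer);
    [m_context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:layer];

    GLint width, height;
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &width);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &height);

    glGenFramebuffersOES(1, &m_framebuffer);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, m_framebuffer);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                                 GL_RENDERBUFFER_OES, m_renderbuffer);

    Note that the question creates an OpenGL ES 2.0 context, where framebuffer objects are core, so the unsuffixed glGetRenderbufferParameteriv calls are the direct equivalents of these OES ones.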

  • I was trying to create a framebuffer from an EAGLLayer that already had a framebuffer created from it. I don't know why OpenGL ES was saying the framebuffer was complete and not setting an error right away, but there you have it.
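
    If Initialize can end up being called more than once for the same layer, one way to guard against this is to tear down any previously created objects before recreating them. This is only a sketch, assuming the frameBuffer and renderBuffer members from the question start out as 0:

    // Hypothetical guard: make sure only one framebuffer/renderbuffer pair is ever
    // created from a given EAGL layer by deleting any previous ones first.
    if(renderBuffer != 0)
    {
       glDeleteRenderbuffers(1, &renderBuffer);
       renderBuffer = 0;
    }
    if(frameBuffer != 0)
    {
       glDeleteFramebuffers(1, &frameBuffer);
       frameBuffer = 0;
    }
    // ...then run the creation code from the question once per layer.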

Tags
c++ opengl-es objective-c