How to convert pixel coordinates to GL coordinates in 2D space?

  • user6484 asks:

    I know that if you want to display a sprite on screen (in 2D) you can use glOrtho and essentially make 1 GL unit equal to 1 pixel, so when I plot out the vertices for, say, a 128x128 image (on a quad), I can define the vertices as (-64, 64), (-64, -64), etc., and when I map my texture coords to that quad, the image is displayed at a 1:1 ratio (a minimal sketch of this setup follows below).

    However, let's say I don't want to use glOrtho and instead want a perspective view, so I can combine 2D sprites with 3D models and the like. I'm at a loss on how to set up or convert the coordinates of the planes/quads I draw images onto so that they match the resolution of the image. That is, how do I define the coordinates so that when a 2D sprite sits right at the near plane, it is rendered as 128x128 pixels on the screen?
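
    For reference, here is a minimal sketch of the 1:1 ortho setup I'm describing (legacy fixed-function OpenGL; windowWidth, windowHeight, cx, and cy are placeholder values):

    // 1:1 ortho projection: 1 GL unit == 1 pixel, origin at the top-left.
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, windowWidth, windowHeight, 0.0, -1.0, 1.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    // A 128x128 quad centered on (cx, cy); texels map 1:1 to screen pixels.
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(cx - 64.0f, cy - 64.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex2f(cx + 64.0f, cy - 64.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex2f(cx + 64.0f, cy + 64.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(cx - 64.0f, cy + 64.0f);
    glEnd();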

  • You can calculate the size of the screen at the near plane with a little trigonometry (FOV angles in radians):

    screen_width_in_world = 2.0 * tan(0.5 * horizontal_fov) * near_clip_distance
    screen_height_in_world = 2.0 * tan(0.5 * vertical_fov) * near_clip_distance
    

    That's the size, in world coordinate units, of the "window" on the near plane that represents the screen. Multiply screen_width_in_world by 128 / screen_width_in_pixels and screen_height_in_world by 128 / screen_height_in_pixels, and you get the appropriate world-space width and height for your sprite, making it 128x128 at the near plane (and proportionally smaller as it moves further away). This assumes the sprite is screen-facing.
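
    A minimal sketch of that calculation, as a hypothetical helper (the vertical FOV is taken in radians and the horizontal extent is derived from the aspect ratio; all names are illustrative):

    #include <cmath>

    struct SpriteSize { float w, h; };

    // World-space width/height for a pixelW x pixelH sprite so that it
    // covers exactly that many pixels when placed at the near plane.
    SpriteSize spriteWorldSize(float verticalFov, float nearClip,
                               float screenWpx, float screenHpx,
                               float pixelW, float pixelH)
    {
        // Size of the near-plane "window" in world units (formulas above).
        float screenHWorld = 2.0f * std::tan(0.5f * verticalFov) * nearClip;
        float screenWWorld = screenHWorld * (screenWpx / screenHpx);

        // Scale by the sprite's share of the screen in pixels.
        return { screenWWorld * (pixelW / screenWpx),
                 screenHWorld * (pixelH / screenHpx) };
    }

    For example, with a 60-degree vertical FOV, a near plane at 0.1, and a 1280x720 viewport, spriteWorldSize(60.0f * 3.14159265f / 180.0f, 0.1f, 1280, 720, 128, 128) gives the quad size that covers exactly 128x128 pixels at the near plane.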

Tags
c++ opengl