D3DXVec3Project returns odd values

  • SirYakalot

    I would have thought that if the object is on-screen, this function should return screen coordinates. When used in conjunction with the DirectX DrawText function it works fine: text overlays track the 3D objects perfectly. However, consider the following code:

    // project the head position into screen space (for the thought bubble)
    D3DXVec3Project(&xyCoords, &HeadPosition(), &viewport, &matProjection, &matView, &matFinal);
    
    // convert from screen pixels to the -1..1 range the line renderer expects
    D3DXVECTOR2 origin = D3DXVECTOR2(
        (0.0f / SCREEN_WIDTH) * 2.0f - 1.0f,
        (0.0f / SCREEN_HEIGHT) * 2.0f - 1.0f);
    
    D3DXVECTOR2 headPos = D3DXVECTOR2(
        (xyCoords.x / SCREEN_WIDTH) * 2.0f - 1.0f,
        (xyCoords.y / SCREEN_HEIGHT) * 2.0f - 1.0f);
    
    lineRenderer.Draw2DLine(origin, headPos, D3DXCOLOR(1, 0, 0, 1), D3DXCOLOR(1, 1, 0, 1));
    

    origin has a value of (-1, -1), which is fine (the renderer's screen coordinates go from -1 to 1 on each axis), but headPos has the value -1.#IND000 in both x and y. I think this may be because D3DXVec3Project is returning massive numbers, although I could be wrong.
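    A quick way to narrow this down (a minimal sketch; logging via OutputDebugStringA is just one way to inspect the values) is to dump what D3DXVec3Project actually returned, together with what the SCREEN_WIDTH and SCREEN_HEIGHT constants expand to, before doing any arithmetic with them:

    // Needs <cstdio> and <windows.h> for sprintf_s / OutputDebugStringA.
    char dbg[128];
    sprintf_s(dbg, "xyCoords = (%f, %f, %f), screen = %f x %f\n",
              xyCoords.x, xyCoords.y, xyCoords.z,
              (float)SCREEN_WIDTH, (float)SCREEN_HEIGHT);
    OutputDebugStringA(dbg);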

    Who knows what's going on under the hood of DrawTextA in the following code, but it clearly compensates for whatever isn't working in the code above.

    // project the head position into screen space, exactly as above
    D3DXVec3Project(&xyCoords, &HeadPosition(), &viewport, &matProjection, &matView, &matFinal);
    
    // rct is a RECT; with DT_NOCLIP its top-left corner just anchors the text
    rct.left   = xyCoords.x;
    rct.right  = xyCoords.x;
    rct.top    = xyCoords.y;
    rct.bottom = xyCoords.y;
    
    m_font->DrawTextA(NULL, stateStream.str().c_str(), -1, &rct, DT_LEFT | DT_NOCLIP, D3DXCOLOR(1, 0, 0, 1));
    

    This all works perfectly. Where am I going wrong with the first bit of code?

    EDIT - the code below (with the screen size hard-coded) gives me the values listed after it:

    D3DXVECTOR2 origin = D3DXVECTOR2(
        (0.0f / 1280.0f) * 2.0f - 1.0f,
        (0.0f / 720.0f) * 2.0f - 1.0f);
    
    D3DXVECTOR2 headPos = D3DXVECTOR2(
        (xyCoords.x / 1280.0f) * 2.0f - 1.0f,
        (xyCoords.y / 720.0f) * 2.0f - 1.0f);
    

    For origin I get (-1, 1.2) and for headPos I get (0, 1.401e-043#DEN). For reference, HeadPosition() returns a D3DXVECTOR3 with a zero z component and x and y values ranging from about 2 to roughly 33.

    Now how on earth can I, with a calculator, be better than the computer? What am I missing?

    (0.0f / 720.0f) * 2.0f - 1.0f should clearly equal -1, not 1.2! What is going on?

    0 / anything = 0, 0 * 2 = 0, 0 - 1 = -1!!!
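    (As a sanity check of just the arithmetic, a minimal standalone program evaluating the same expressions does print -1 for both, which suggests the odd values above are coming from somewhere other than these two lines. This is a hypothetical test program, separate from the game code:)

    #include <cstdio>
    
    int main()
    {
        // same expressions as the origin calculation above
        float ox = (0.0f / 1280.0f) * 2.0f - 1.0f;
        float oy = (0.0f / 720.0f)  * 2.0f - 1.0f;
        printf("(%f, %f)\n", ox, oy);   // prints (-1.000000, -1.000000)
        return 0;
    }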

  • -1.#IND indicates a division by zero; make sure that SCREEN_WIDTH and SCREEN_HEIGHT have non-zero values.
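    For example, with the back-buffer size written as explicit non-zero floats (a minimal sketch: the 1280x720 size is taken from the edit above, and the y flip assumes the line renderer expects clip-space coordinates with +1 at the top of the screen):

    const float screenW = 1280.0f;   // assumed back-buffer width in pixels
    const float screenH = 720.0f;    // assumed back-buffer height in pixels
    
    D3DXVECTOR3 head = HeadPosition();   // copy into a local in case HeadPosition() returns by value
    D3DXVECTOR3 xyCoords;
    D3DXVec3Project(&xyCoords, &head, &viewport, &matProjection, &matView, &matFinal);
    
    // D3DXVec3Project returns screen-space pixels with y = 0 at the top,
    // so the y axis is flipped when mapping into the -1..1 range.
    D3DXVECTOR2 headPos(
        (xyCoords.x / screenW) * 2.0f - 1.0f,
        1.0f - (xyCoords.y / screenH) * 2.0f);
    // headPos can now be passed to lineRenderer.Draw2DLine as before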

Tags
c++ directx directx10