OpenAL - alGetSourcei and AL_BUFFERS_PROCESSED gives junk

Anthony

  • Posted this question on SO but got no answers. Maybe somebody can help me here.

    I recently had a working program that streamed WAV and Ogg sounds with OpenAL. I then decided to abstract the source and buffer objects into C++ classes, and got as far as the source class. The problem: my function that returns the number of processed buffers never alters the integer passed to alGetSourcei.

    int ALSource::GetBuffersProcessed() const {
        ALint processed;
        alGetSourcei(this->source, AL_BUFFERS_PROCESSED, &processed);
        ALenum error = alGetError();
        if(error != AL_NO_ERROR)
            return -1;
        return processed;
    }
    

    I checked that error is never anything but AL_NO_ERROR. I generate the source in the constructor:

    alGenSources(1, &source);
    

    This also never gives any error.
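    For context, a minimal sketch of how that constructor and the surrounding class might look with the error check in place; everything beyond the source member and the alGenSources call is my assumption, not code from the original post:

    #include <AL/al.h>
    #include <stdexcept>

    class ALSource {
    public:
        ALSource() {
            alGetError();                    // clear any stale error state first
            alGenSources(1, &source);
            if(alGetError() != AL_NO_ERROR)  // reportedly never fires
                throw std::runtime_error("alGenSources failed");
        }
        int GetBuffersProcessed() const;
    private:
        ALuint source;
    };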

    The actual symptom is that processed, which is declared but never initialised, keeps whatever junk value it started with (usually something like -8834824334); alGetSourcei simply never writes to it.

    Would this be expected behaviour? The OpenAL specification states that the value should be in the range 0 to any, but is not really any more specific than that; the Programmers' Guide is the same.
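    One way to confirm that alGetSourcei never writes to the variable at all (as opposed to writing a junk value) is to seed it with a sentinel before the call. This is a diagnostic sketch, not code from the original program, and the helper name is illustrative:

    #include <AL/al.h>

    // Returns true only if alGetSourcei actually wrote to the output variable.
    bool WritesProcessedCount(ALuint source) {
        const ALint sentinel = -12345;   // a value no real buffer count can be
        ALint processed = sentinel;
        alGetSourcei(source, AL_BUFFERS_PROCESSED, &processed);
        return processed != sentinel;    // false: the call left it untouched
    }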

  • OK, here is the solution. The root cause was an implicit destructor call that destroyed my AL context. What's interesting is that a bug in the Windows implementation makes OpenAL fail silently when there is no current context, so all my calls to alGetError returned AL_NO_ERROR precisely because there was no context. It's a nasty little bug, in my opinion.
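    For anyone hitting the same symptom: this failure mode is detectable, because alcGetCurrentContext() returns nullptr once the context is gone. A guard like the sketch below (the function name and the -1 convention are illustrative, assuming the standard ALC API) catches the silent-failure case that alGetError misses:

    #include <AL/al.h>
    #include <AL/alc.h>
    #include <cassert>

    ALint GetBuffersProcessedChecked(ALuint source) {
        // With no current context, the buggy Windows implementation can
        // leave the output untouched while alGetError still reports
        // AL_NO_ERROR, so check for a live context explicitly.
        assert(alcGetCurrentContext() != nullptr && "no current AL context");
        ALint processed = 0;             // defined default if the call fails
        alGetSourcei(source, AL_BUFFERS_PROCESSED, &processed);
        return (alGetError() == AL_NO_ERROR) ? processed : -1;
    }

    Initialising processed to a defined value also keeps the returned number meaningful even when the call silently does nothing.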

Tags
c++ software-engineering openal