Problem With Smooth Animation In C++/SDL/OpenGL

  • oscar.rpr

    I've been working on the animation for a 2D platformer in C++/SDL/OpenGL, and my team and I have reached the point where every player animation (walking, running, etc.) needs a different framerate for our concept. I've been using Game Programming All In One as a guideline for smooth animation; the book recommends keeping variables that limit the movement and the frame changes.

    To clarify what I mean, my Sprite class has these members:

    std::vector< Vector2f > delayMovementSprite; // per-animation movement delays
    std::vector< int > frameDelayPerAnimation;   // per-animation frame delays
    GLfloat countX, countY;                      // counters incremented every frame
    

    The vector delayMovementSprite holds one delay value per animation, and countX is incremented every frame until it is equal to or greater than the corresponding entry in delayMovementSprite.

    Something like this:

    void Sprite::movePosXWithSpeed()
    {
      playerMoveInX = false || playerMoveInX; // no-op: keeps the previous value

      countX++;
      // Only move once countX exceeds the delay for the current animation.
      if ( countX > delayMovementSprite.at(getCurrentState()).x )
      {
        countX = 0;
        if ( handlerAnimation->getAnimationDirection() == SpriteData::RIGHT )
        {
          // Moving right: stop at the right edge of the level (6368 px wide).
          if ( position.x + getSpeedX() + width < 6368.f )
          {
            position.x += getSpeedX();
            playerMoveInX = true;
            return;
          }
        }
        // Moving left: stop at the left edge of the level.
        else if ( position.x + getSpeedX() + width > 0 )
        {
          position.x += getSpeedX();
          playerMoveInX = true;
          return;
        }
        playerMoveInX = false;
      }
    }
    

    For the frames I have an Animation class which tracks the following:

    Uint32 frameRate, oldTime;  // ms per animation tick, and time of the last tick
    int frameDelay;             // ticks to wait before advancing a frame
    int frameCount;             // ticks counted since the last frame change
    

    In the function that animates I do the following, much like the move function in Sprite:

    int Animation::animate()
    {
      // Not enough time has passed since the last tick; keep the current frame.
      if( oldTime + frameRate > SDL_GetTicks() )
      {
        return -1;
      }

      oldTime += frameRate; // advance in fixed steps to avoid drift
      frameCount++;

      if ( frameCount > frameDelay )
      {
        animationAlreadyEnd = false;
        frameCount = 0;
        currentFrame += incrementFrame;

        // Past the last frame: loop back to the return frame.
        if( currentFrame > maxFrames )
        {
          animationAlreadyEnd = true;
          currentFrame = returnFrame;
        }
      }

      return currentFrame;
    }
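
    For context, here is a hypothetical call site, run once per game loop iteration (playerAnimation, currentSpriteFrame, and drawPlayer are made-up names, not from our code):

    int frame = playerAnimation.animate();
    if ( frame != -1 )                 // -1 means the frame did not advance
      currentSpriteFrame = frame;
    drawPlayer( currentSpriteFrame );  // draw whatever the current frame is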
    

    I got all of that working and everything apparently executes fine, but at some points in the game the animation doesn't look really smooth, while at other points it does.

    Here is a video of the "gameplay" so everyone can see what I mean:

    http://www.youtube.com/watch?v=uoxKEYzwkcQ

    During the game's execution I'm currently using a general timer that caps the game at 60 FPS.
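
    A sketch of the kind of 60 FPS cap I mean (simplified, not our exact timer code; updateAndRender() stands in for the real loop body):

    const Uint32 FRAME_MS = 1000 / 60;        // ~16 ms per frame

    Uint32 frameStart = SDL_GetTicks();
    updateAndRender();                        // game update + draw
    Uint32 elapsed = SDL_GetTicks() - frameStart;
    if ( elapsed < FRAME_MS )
      SDL_Delay( FRAME_MS - elapsed );        // sleep off the rest of the frame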

    If anyone needs more information, don't hesitate to ask.

    Thanks a lot for the help.

  • Instead of delaying by a set number of frames, why not update the frame only if the time elapsed since the last update exceeds some threshold (for instance, 100 ms)?

    Your animations will look great, and they won't depend on your framerate.

    In other words: keep a class variable that stores the time of the last update, and when your update function rolls around, if (current time - previous update time > desired frame time), increment the frame.
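
    A minimal sketch of that idea with SDL's millisecond clock (all names here are illustrative, not from the question's code):

    #include <SDL.h>

    struct TimedAnimation
    {
      Uint32 lastUpdateMs = 0;   // time of the last frame change
      Uint32 frameTimeMs  = 100; // desired time per frame, e.g. 100 ms
      int    currentFrame = 0;
      int    maxFrames    = 8;

      void update()
      {
        Uint32 now = SDL_GetTicks();
        if ( now - lastUpdateMs >= frameTimeMs ) // enough real time passed?
        {
          lastUpdateMs = now;
          currentFrame = ( currentFrame + 1 ) % maxFrames; // wrap around
        }
      }
    };

    Each animation (walking, running, ...) can then carry its own frameTimeMs, which gives you the per-animation framerate without counting rendered frames.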

  • I use a slightly different approach that you may want to consider. Instead of messing around with framerates and vectors, I simply define some constants for the 'delay' I want between each frame:

    const float PLAYER_WALK_ANIMATION_DELAY = 0.5f;
    

    and then I throw some statics into the draw function:

    static float playerWalkDelay = 0.0f;   // persists across draw calls
    playerWalkDelay += 0.1f;               // accumulate a little every draw
    if (playerWalkDelay > PLAYER_WALK_ANIMATION_DELAY) {
       playerWalkDelay = 0.0f;
       animateWalk();                      // advance to the next walk frame
    }
    

    Like every solution to a problem, this has its pros and cons, but I have used it in a similar game and it has worked well.
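
    One of those cons: the fixed 0.1f increment ticks once per draw call, so the effective delay still depends on the framerate. A sketch of the same accumulator driven by the frame's delta time instead (drawPlayer is a made-up name; dt is assumed to be supplied by your main loop):

    void drawPlayer(float dt)   // dt: seconds since the last frame, from the main loop
    {
      static float playerWalkDelay = 0.0f;
      playerWalkDelay += dt;    // accumulate real elapsed time, not a fixed step
      if (playerWalkDelay > PLAYER_WALK_ANIMATION_DELAY) {
        playerWalkDelay = 0.0f;
        animateWalk();
      }
    }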

    I have two side notes. Firstly, your game looks great! Send me a PM if you need some alpha testing. Secondly, I would definitely isolate the player sprite while coding and testing the animations (the video is great, but it makes it hard to see where and when the problem occurs; if you are debugging like this you are making life hard for yourself).

Tags
c++ opengl 2d animation sdl