Passing an OpenGL texture ID loaded in NDK back to java

  • Ash McConnell:

    Is it possible to load a texture using the NDK, but pass the resulting texture ID back for use in Java?

    I have used the code in the link above and it appears to load the PNG correctly and generate a texture ID without an error (the ID is 0 before the loading function is called and non-zero afterwards).

    Is it not allowed to mix OpenGL operations between C++ and Java?
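
    For reference, the Java side of my setup looks roughly like this (the library and method names are simplified stand-ins for my actual code):

        public class NativeTextureLoader {
            static { System.loadLibrary("game"); } // placeholder library name

            // Implemented in C++ with the NDK: decodes the PNG, calls
            // glGenTextures/glTexImage2D, and returns the new texture ID.
            public static native int loadTexture(String assetPath);
        }

        // Called from GLSurfaceView.Renderer.onSurfaceCreated, i.e. on the
        // GL thread, so the returned ID should refer to the current context:
        // int textureId = NativeTextureLoader.loadTexture("textures/ship.png");
        // GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);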

    Thanks for your help. All the best, Ash

  • I do quite a bit of the opposite -- I leverage the Android facilities for loading images from disk into OpenGL, and then pass the texture ID back over JNI.

    So basically, I have native code, a JNI bridge, and then a series of utility functions in Java that do the actual work, and my call sequence looks something like:

    [native code] -> [JNI code] -> [Java code]

    and of course the texture ID is passed back to native code. There's no reason this can't work the other way around, though I have to wonder what compels you to do so, since it's so easy to do in Java/Android as it is. What is the problem you're encountering?
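
    In sketch form (assuming GLES 2.0 and a bitmap loaded from an Android resource; the names are mine, not anything official), the Java utility at the end of that chain looks something like:

        import android.content.Context;
        import android.graphics.Bitmap;
        import android.graphics.BitmapFactory;
        import android.opengl.GLES20;
        import android.opengl.GLUtils;

        public final class TextureLoader {
            // Invoked from native code over JNI (e.g. CallStaticIntMethod);
            // must run on the thread where the GL context is current.
            public static int loadTexture(Context context, int resourceId) {
                int[] handle = new int[1];
                GLES20.glGenTextures(1, handle, 0);
                if (handle[0] == 0) {
                    throw new RuntimeException("glGenTextures failed");
                }

                BitmapFactory.Options options = new BitmapFactory.Options();
                options.inScaled = false; // keep the bitmap's original size
                Bitmap bitmap = BitmapFactory.decodeResource(
                        context.getResources(), resourceId, options);

                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, handle[0]);
                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
                GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
                bitmap.recycle();

                return handle[0]; // plain int, safe to hand back across JNI
            }
        }

    The native side just looks this method up with GetStaticMethodID and calls it with CallStaticIntMethod. Since a texture name is nothing more than an integer handle into the current context, it is equally valid on either side of the JNI boundary.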

    Make sure you aren't attempting to make OpenGL calls simultaneously from multiple threads -- an OpenGL context can be current on only one thread at a time, so calls made on the wrong thread won't touch the context your textures live in -- and check to see if the OpenGL error flag is being set anywhere.
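
    For that error check, a tiny helper along these lines (GLES 2.0 shown; illustrative, not from the original code) lets you bracket suspect calls cheaply:

        import android.opengl.GLES20;
        import android.util.Log;

        public final class GlErrors {
            // Drains the GL error flag and logs everything found, so a
            // failure is attributed to the call site that tripped it.
            public static void check(String where) {
                int error;
                while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
                    Log.e("GL", where + ": glError 0x" + Integer.toHexString(error));
                }
            }
        }

        // Usage: GlErrors.check("after texImage2D");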

Tags
c++ java android ndk