How to keep graphics in memory when using a spawner

edited October 2010 in Help request
I tested the iPhone demo and found that it continually reloads the image after deleting a ball generated by the spawner. If I want to load the image before the game starts and keep it in memory, instead of loading it at runtime, what can I do?
I guess it would work to load the image manually by calling
orxGraphic_CreateFromConfig(...). Would that help?
By the way, when does orx delete the image automatically? If there are no references, will it be deleted?
I hope you catch my meaning.
Thanks.

Comments

  • edited October 2010
    The rule is that a texture will remain in memory as long as it's used (ie. as long as something is referencing it), so yes, a texture with no one referencing it will be deleted from memory.
    If you want to keep it in memory, the lazy and easy way would be to create a copy of the corresponding object or graphic and disable it.
    The most efficient way (and slightly less convenient) would be to call orxTexture_CreateFromFile() / orxTexture_Delete() to add/remove an extra reference on the texture, making sure it'll stay in memory.

    I personally like the lazy way! ;)
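    The extra-reference approach mentioned above could be sketched like this. This is a minimal sketch assuming the 2010-era orx API (orxTexture_CreateFromFile() taking only a filename); the texture name is hypothetical:

    ```c
    /* Minimal sketch of the extra-reference trick described above.
     * Assumes the 2010-era orx API; "ball.png" is a hypothetical filename. */
    orxTEXTURE *pstBallTexture;

    /* Adds an extra reference so the texture stays in memory even when
     * no spawned object is using it anymore */
    pstBallTexture = orxTexture_CreateFromFile("ball.png");

    /* ... spawner creates and deletes balls without any texture reload ... */

    /* Drops the extra reference; the texture can now be freed */
    orxTexture_Delete(pstBallTexture);
    ```

    This pairs one create with one delete, so the texture's reference count never drops to zero while the game is running.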
  • edited October 2010
    Will the whole picture stay in the OpenGL texture, without shrinking, as long as any part of the bitmap has been loaded and is used in the game?
    I think that will occupy a large amount of memory. Could you give me some advice on optimization? Thank you.
  • edited October 2010
    I'm not sure I understand what you mean on this one.
    Are you talking about texture atlases?
    As a rule of thumb, the less memory you waste, the better, so I'd group textures that are likely to be used together into the same atlas, sprite-sheet style. Not sure if that's the answer you were expecting.
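    The sprite-sheet approach above might look like this in config. This is a hypothetical fragment: the section and texture names are made up, and the TextureCorner/TextureSize keys are assumed from the 2010-era config conventions:

    ```ini
    ; Hypothetical sprite-sheet setup: both graphics share one atlas texture,
    ; so only atlas.png is loaded and kept in memory.
    [BallGraphic]
    Texture       = atlas.png
    TextureCorner = (0, 0, 0)
    TextureSize   = (32, 32, 0)

    [StarGraphic]
    Texture       = atlas.png
    TextureCorner = (32, 0, 0)
    TextureSize   = (32, 32, 0)
    ```

    With all the frames in one texture, the atlas stays referenced as long as any of its graphics is in use.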
  • edited November 2010
    orx with GLFW does not seem to set up a depth buffer.
    In the config, I can set a depth size, but after checking setVideoMode in the source, I found that it actually sets the total bit depth of the OpenGL framebuffer rather than the depth-buffer size.

    switch(_pstVideoMode->u32Depth)
    {
      /* 16-bit */
      case 16:
      {
        /* Updates video mode */
        eResult = (glfwOpenWindow((int)_pstVideoMode->u32Width, (int)_pstVideoMode->u32Height, 5, 6, 5, 0, 0, 0, sstDisplay.u32GLFWFlags) != GL_FALSE) ? orxSTATUS_SUCCESS : orxSTATUS_FAILURE;

        break;
      }

      /* 24-bit */
      case 24:
      {
        /* Updates video mode */
        eResult = (glfwOpenWindow((int)_pstVideoMode->u32Width, (int)_pstVideoMode->u32Height, 8, 8, 8, 0, 0, 0, sstDisplay.u32GLFWFlags) != GL_FALSE) ? orxSTATUS_SUCCESS : orxSTATUS_FAILURE;

        break;
      }

      /* 32-bit */
      default:
      case 32:
      {
        /* Updates video mode */
        eResult = (glfwOpenWindow((int)_pstVideoMode->u32Width, (int)_pstVideoMode->u32Height, 8, 8, 8, 8, 0, 0, sstDisplay.u32GLFWFlags) != GL_FALSE) ? orxSTATUS_SUCCESS : orxSTATUS_FAILURE;

        break;
      }
    }

    Then I tried orx with SDL; there, assigning the depth size works as expected.
    If the depth size cannot be set, combining 3D objects into my orx application is impossible. I hope this can be changed in the next release. Thank you for your passion for this lib.
  • edited November 2010
    laschweinski wrote:
    orx with GLFW does not seem to set up a depth buffer.
    In the config, I can set a depth size, but after checking setVideoMode in the source, I found that it actually sets the total bit depth of the OpenGL framebuffer rather than the depth-buffer size.

    ScreenDepth has nothing to do with the depth buffer; it's only about the depth of the pixel format (you know, 16-bit, 32-bit, ...).

    And yes, I didn't setup a depth buffer in the plugins as orx is 2D, but there's no problem in doing so by default. Do you need it very soon or can it wait a bit?
    Thank you for your passion to this lib.

    My pleasure. Glad you find it useful. I'm looking forward to the Android release. :)
  • edited November 2010
    I am able to recompile the lib myself, but I hope this can be changed in the next release. Some games need to add a few 3D features. Shaders are a good option, but I hope the lib can provide more options for developers.
  • edited November 2010
    I'll add the depth buffer, but I don't think orx would be the best choice for 3D games. At least not right now.

    Currently all the plugins are OpenGL based but that won't be true all the time nor for all platforms.

    I'd be happy if someone wants to add 3D support to orx but I have no plans on doing it myself in the foreseeable future.
  • edited November 2010
    Converting orx into a 3D game engine is not what I mean. I just want orx to provide some simple 3D support. In this case, adding the depth size is totally sufficient.
  • edited November 2010
    Sure, that shouldn't be too hard. It's very easy with the desktop plugins and requires a bit more work on iPhone.
    Should I look into the Android branch or will you take care of it?
  • edited November 2010
    I added this in the Android version yesterday by configuring the EGLConfig.
  • edited November 2010
    There's now an optional depth buffer (enabled with Display::DepthBuffer in the config file) in the GLFW, SDL and iPhone plugins.
    I haven't tested the iPhone version (only wrote the code on notepad so it might not even compile).

    For now the depth buffer will be there when rendering to the screen, but not when rendering to a texture. Also, the depth test is off by default (so as not to mess with orx's rendering). You can manually turn it on/off when needed.
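    Enabling the new depth buffer might then look like this. The Display::DepthBuffer key comes from the comment above; the other keys and values are hypothetical placeholders, and version defaults may differ:

    ```ini
    ; Hypothetical config fragment enabling the optional depth buffer.
    [Display]
    ScreenWidth  = 800
    ScreenHeight = 600
    DepthBuffer  = true
    ```

    Since the depth test is off by default, a 3D-using app would still toggle it manually around its own rendering, e.g. with glEnable(GL_DEPTH_TEST) / glDisable(GL_DEPTH_TEST).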

    Question: Did you port the rendering to texture support in the Android version?