I tested the iPhone demo and found that it continually reloads the image after deleting a ball generated by the spawner. If I want to load the images before the game starts and keep them in memory, instead of loading them at runtime, what can I do?
I guess it would work to load the image manually by calling
orxGraphic_CreateFromConfig(...). Would that be the right approach?
By the way, when does orx delete the image automatically? Is it deleted once there are no more references to it?
I hope my meaning is clear.
Thanks.
Comments
If you want to keep it in memory, the lazy and easy way would be to create a copy of the corresponding object or graphic and disabling it.
The most efficient way (and slightly less convenient) would be to call orxTexture_CreateFromFile() / orxTexture_Delete() to add/remove an extra reference on the texture, making sure it'll stay in memory.
I personally like the lazy way!
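To illustrate why holding an extra reference keeps the texture resident, here is a tiny self-contained C model of the behavior described above. This is not orx code; the struct and function names are made up for the sketch, and it only models the reference-counting idea behind orxTexture_CreateFromFile() / orxTexture_Delete():

```c
#include <assert.h>

/* Hypothetical model of a named-texture cache: each texture carries a
 * reference count, and the (expensive) pixel data is only loaded on the
 * first acquire and freed when the count drops back to zero. Holding an
 * extra reference therefore keeps the texture in memory even while no
 * game object is using it. */
typedef struct
{
  const char *zName;   /* texture name */
  int         s32Refs; /* current reference count */
  int         bLoaded; /* 1 while the pixel data is resident */
} Texture;

/* Acquires a reference, loading the data on first use */
static void Texture_Acquire(Texture *pstTexture)
{
  if(pstTexture->s32Refs == 0)
  {
    pstTexture->bLoaded = 1; /* the expensive load happens here */
  }
  pstTexture->s32Refs++;
}

/* Releases a reference, unloading once nobody uses the texture anymore */
static void Texture_Release(Texture *pstTexture)
{
  assert(pstTexture->s32Refs > 0);
  if(--pstTexture->s32Refs == 0)
  {
    pstTexture->bLoaded = 0; /* data freed: the next acquire reloads it */
  }
}
```

With this model, the "preload" pattern is simply acquiring once at startup and releasing at shutdown: intermediate acquire/release pairs from spawned objects then never trigger a reload.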
I think that will occupy a larger amount of memory. Could you give me some advice on how to optimize it? Thank you.
Are you talking about texture atlas?
As a rule of thumb, the less memory you waste, the better of course, so I'd regroup textures that will be used together in the same atlas, sprite-sheet style, as they are likely to be used together. Not sure if it was the answer you were expecting.
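For instance, a sprite-sheet setup could look something like the orx config sketch below: several graphics share one texture and each one picks a sub-rectangle of it. The section and key names here (TextureCorner, TextureSize) are from memory and may differ between orx versions, so treat this as an illustration rather than exact syntax:

```ini
; One shared sheet, several graphics addressing sub-rectangles of it
[BallGraphic]
Texture       = sheet.png
TextureCorner = (0, 0, 0)
TextureSize   = (32, 32, 0)

[WallGraphic]
Texture       = sheet.png
TextureCorner = (32, 0, 0)
TextureSize   = (32, 32, 0)
```

Since both graphics reference the same sheet.png, the texture is loaded once and shared, which is exactly the memory saving the atlas approach buys you.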
In the config script I can set the depth, but after checking setVideoMode in the source, I found that it assigns the total bit depth of the OpenGL color buffer rather than the depth-buffer size:
switch(_pstVideoMode->u32Depth)
{
  /* 16-bit */
  case 16:
  {
    /* Updates video mode */
    eResult = (glfwOpenWindow((int)_pstVideoMode->u32Width, (int)_pstVideoMode->u32Height, 5, 6, 5, 0, 0, 0, sstDisplay.u32GLFWFlags) != GL_FALSE) ? orxSTATUS_SUCCESS : orxSTATUS_FAILURE;
    break;
  }

  /* 24-bit */
  case 24:
  {
    /* Updates video mode */
    eResult = (glfwOpenWindow((int)_pstVideoMode->u32Width, (int)_pstVideoMode->u32Height, 8, 8, 8, 0, 0, 0, sstDisplay.u32GLFWFlags) != GL_FALSE) ? orxSTATUS_SUCCESS : orxSTATUS_FAILURE;
    break;
  }

  /* 32-bit */
  default:
  case 32:
  {
    /* Updates video mode */
    eResult = (glfwOpenWindow((int)_pstVideoMode->u32Width, (int)_pstVideoMode->u32Height, 8, 8, 8, 8, 0, 0, sstDisplay.u32GLFWFlags) != GL_FALSE) ? orxSTATUS_SUCCESS : orxSTATUS_FAILURE;
    break;
  }
}
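In GLFW 2.x, the arguments to glfwOpenWindow() after width and height are the red, green, blue, alpha, depth and stencil bit counts, followed by the window mode, so the plugin above always requests 0 depth-buffer bits. Requesting a depth buffer would be a one-argument change along these lines (a sketch based on the 32-bit case above, not the actual plugin code):

```c
/* Same call as the 32-bit case, but asking GLFW for a 24-bit
 * depth buffer (7th bit-count argument) instead of 0 */
eResult = (glfwOpenWindow((int)_pstVideoMode->u32Width, (int)_pstVideoMode->u32Height,
                          8, 8, 8, 8, 24, 0, sstDisplay.u32GLFWFlags) != GL_FALSE)
          ? orxSTATUS_SUCCESS : orxSTATUS_FAILURE;
```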
Then I tried orx with the SDL plugin, where assigning the depth size works as I expected.
If the depth size cannot be set, combining 3D objects into my orx application is impossible. I hope this could be changed in a new release. Thank you for your dedication to this library.
ScreenDepth has nothing to do with the depth buffer; it only refers to the depth of the pixel format (16-bit, 32-bit, etc.).
And yes, I didn't setup a depth buffer in the plugins as orx is 2D, but there's no problem in doing so by default. Do you need it very soon or can it wait a bit?
My pleasure. Glad you find it useful. I'm looking forward to the Android release.
Currently all the plugins are OpenGL based but that won't be true all the time nor for all platforms.
I'd be happy if someone wants to add 3D support to orx but I have no plans on doing it myself in the foreseeable future.
Should I look into the Android branch or will you take care of it?
I haven't tested the iPhone version (only wrote the code on notepad so it might not even compile).
For now the depth buffer will be there when rendering to the screen but not when rendering to a texture. Also, the depth test is off by default (so as not to mess with orx's rendering). You can manually turn it on/off when needed.
Question: Did you port the rendering to texture support in the Android version?