A graphical pseudo-3D environment in an Android app

Rebecca Nesson posted on Tuesday, March 13th, 2012 | Android, iPhone, Mobile

At PRX we’re lucky to work with creative, fun-loving clients who want their apps to be more interesting to play with than the average app built from standard iOS components or Android widgets.  In one app we’re currently developing, we’re creating an engaging, pop-up-book-style environment in which the user encounters the program content as she navigates through an imaginary world.  It’s beautiful and fun and a real programming challenge.  On the iOS side, Devin created the 3D-ish environment using native iOS layers positioned in 3D space.  It’s my job to create the same effect in the Android version of the app.  Native Android views don’t support this kind of positioning in z space, and there isn’t a built-in “camera” that can be transformed to give the illusion of depth.  OpenGL could provide the 3D environment, but it would be a steep learning curve for me, and it would make it harder to use the usual Android widgets and activities for the basic functions of the app, like presenting lists of content and playing audio.  Enter AndEngine.

AndEngine is a free 2D game engine for Android.  It lets me create a game activity that I can combine with other Android activities to present content.  (I use Android Fragments via the Android Support V4 library to incorporate traditional Android views into the game environment.)  Although AndEngine is intended for 2D games, a forum thread demonstrated how to apply to its camera the same perspective trick we’re using on the iOS side:

private void setFrustum(GL10 pGL)
{
    // set field of view to 60 degrees
    float fov_degrees = 60;
    float fov_radians = fov_degrees / 180 * (float) Math.PI;

    // set aspect ratio and distance of the screen
    float aspect = this.getWidth() / this.getHeight();
    float camZ = this.getHeight() / 2 / (float) Math.tan(fov_radians / 2);

    // set projection
    GLHelper.setProjectionIdentityMatrix(pGL);
    GLU.gluPerspective(pGL, fov_degrees, aspect, camZ / 10, camZ * 10);

    // set view
    GLU.gluLookAt(pGL, 0, 0, camZ, 0, 0, 0, 0, 1, 0); // move camera back
    pGL.glScalef(1, -1, 1); // reverse y-axis
    pGL.glTranslatef(-CAMERA_WIDTH / 2, -CAMERA_HEIGHT / 2, 0); // origin at top left
}

What’s happening here is that the camera is pulled back away from the scene and a perspective projection is applied, so that objects deeper in the scene appear smaller and recede toward a vanishing point.  I can’t explain it any better than the cryptic m34 transform that is applied to the camera on the iOS side, but the effect is the same.
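To make the numbers concrete (this is just arithmetic from the snippet above, for an assumed 480×800 portrait screen): camZ = (height / 2) / tan(fov / 2) = 400 / tan(30°) ≈ 693 pixels.  Pulling the camera back exactly that far makes the z = 0 plane fill the viewport pixel-for-pixel, so sprites left at z = 0 render at their normal 2D size and position, while sprites pushed away from the camera in z shrink toward the center of the screen.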

The only other modification I had to make to AndEngine was to create a 3D sprite class that wraps the provided Sprite class and lets me set the z position of sprites as well as their x,y position.  In our app world the user doesn’t interact directly with the scene but rather with a scrolling mechanism that moves the scene “on rails” as the user scrolls.  The effect is beautiful but also somewhat hard to capture in screenshots.  You’ll just have to buy the app when it comes out!
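Back to that sprite wrapper for a second: here’s a rough sketch of what such a class can look like.  It isn’t the exact class from our app, and it assumes the GLES1 branch of AndEngine, where a Sprite inherits a protected applyTranslation(GL10) hook it can override (other branches name this step differently); Sprite3D and mZ are just illustrative names.

import javax.microedition.khronos.opengles.GL10;

import org.anddev.andengine.entity.sprite.Sprite;
import org.anddev.andengine.opengl.texture.region.TextureRegion;

// Rough sketch only -- assumes the GLES1 AndEngine branch, where Sprite
// inherits a protected applyTranslation(GL10) hook; yours may differ.
public class Sprite3D extends Sprite {
    private float mZ;

    public Sprite3D(final float pX, final float pY, final float pZ,
            final TextureRegion pTextureRegion) {
        super(pX, pY, pTextureRegion);
        this.mZ = pZ;
    }

    public float getZ() {
        return this.mZ;
    }

    public void setZ(final float pZ) {
        this.mZ = pZ;
    }

    @Override
    protected void applyTranslation(final GL10 pGL) {
        // Translate in z as well as x,y so the perspective frustum set up in
        // setFrustum() shrinks sprites that sit deeper in the scene.
        pGL.glTranslatef(this.getX(), this.getY(), this.mZ);
    }
}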

The good news is, the app is shaping up beautifully and AndEngine has really come through for what we needed to do.  But there’s a big remaining issue that I’d like to solve.  AndEngine takes care of all of the touches on the scene and passes them to the sprites, but it does so based on their x,y coordinates.  Unfortunately, the x,y coordinates it calculates from touches on the screen don’t correspond to the locations of the sprites within the scene, because of the perspective transformation based on depth.  Under the covers OpenGL knows where the sprites are, because it drew them correctly on the screen, but AndEngine itself does not.  Additionally, I only have access to a GL10 instance, which doesn’t provide the functions I need to project and unproject coordinates.  For now I’m working around the issue, but in theory I should be able to do the math to convert 2D screen coordinates into 3D scene coordinates using the ratio of the scene size to the screen size, the position of the camera, the angle of view, and the distance of the object in question from the camera.  So far I haven’t succeeded, but when I get a few days to step back from the project I’ll turn to it again.  If you think you know how it should be done, please comment!
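In the meantime, here’s an untested sketch of the kind of similar-triangles calculation I have in mind.  It assumes the frustum from setFrustum() above, with the camera at (0, 0, camZ) looking at a z = 0 plane that maps pixel-for-pixel to the screen, and it ignores any scrolling or zooming of the camera; screenToScene and its parameters are just illustrative names, not AndEngine API.

// Untested sketch: map a touch at (screenX, screenY) to scene coordinates on the
// plane z = spriteZ, assuming the frustum from setFrustum() and an unmoved camera.
public static float[] screenToScene(final float pScreenX, final float pScreenY,
        final float pSpriteZ, final float pCameraWidth, final float pCameraHeight) {
    final float fovRadians = 60f / 180f * (float) Math.PI;
    final float camZ = pCameraHeight / 2f / (float) Math.tan(fovRadians / 2f);

    // A sprite at depth z is drawn scaled by camZ / (camZ - z), so offsets from the
    // screen center shrink by the inverse ratio when mapped back into the scene.
    final float ratio = (camZ - pSpriteZ) / camZ;
    final float sceneX = pCameraWidth / 2f + (pScreenX - pCameraWidth / 2f) * ratio;
    final float sceneY = pCameraHeight / 2f + (pScreenY - pCameraHeight / 2f) * ratio;
    return new float[] { sceneX, sceneY };
}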
