Android

Status Rejected: You Should Create A Canary Build For Your iPhone App

Matt MacDonald posted on Friday, March 15th, 2013 | Android, iPhone, Mobile | No Comments

Submitting your shiny new iOS app to Apple for review is anxiety-provoking. We’ve built a number of apps for both iOS and Android, and the review and approval process for iOS still makes me worry. We spend time developing apps, prepping marketing materials, and coordinating press releases with PR firms on larger projects. You don’t want to see this message just before your targeted launch date:

iTunes Connect rejection notice

Enter the “canary build”. We do our best to adhere to the code and UI guidelines that Apple publishes; they change, and we try to keep up. Still, we’ve found that the only way to really know whether your app is going to be approved is to actually submit it for review. We call these submissions “canary builds”. When we hit critical milestones in our development process, we often submit a build to Apple that we have no intention of releasing to the public. The approval process is a pain, but you really don’t want to find out that you have an issue just days before your app is released. We set Version Release Control to “Hold for Developer Release” when submitting these updates so that we can get an app into the approval queue, have the Apple review team ferret out any potential issues, and then make our changes.

After having a few apps rejected a little too close to a deadline, we started using canary builds as a way to catch problems earlier in the process.

Hope that helps someone else.

That is all,
Matt MacDonald

Announcing PlayerHater. Hate the Player, not the Game.

Chris Rhoden posted on Tuesday, March 27th, 2012 | Android, Mobile | No Comments

TL;DR: Happy Tuesday! I wrote a library for working with background audio in Android apps. PRX is letting me give it away. Yay Android! Yay PRX!

Let’s talk a little history, shall we?

PRX makes mobile apps for public radio programs and stations. When we were asked to make an Android app for This American Life, we found that the Android ecosystem was just a little bit fractured. We built a very large and somewhat messy chunk of code to work through the issues of supporting four different major versions of the operating system, including handling weird and widely covered bugs and device/OS interactions.

But no more! We found that by dropping support for the very oldest versions of Android, we were able to lock into a much more stable API. There’s still a whole bunch of work needed to play audio in the background properly, though (think foreground notifications, the prepare/play API, and handling audio session changes). So I set to work building something completely from scratch that tackles these problems. We even thought long and hard about what should happen if your audio is interrupted by a phone call (we start back up again when you hang up if the call took less than 5 minutes; otherwise, we kill the session).
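That resume-after-a-call policy is simple enough to sketch in plain Java. To be clear, this is just an illustration of the rule described above, not PlayerHater’s actual API; the class and method names are hypothetical:

```java
// Hypothetical sketch of the resume-after-interruption rule described
// above: resume playback after a call only if the call lasted less than
// five minutes; otherwise, kill the session.
public class InterruptionPolicy {
    private static final long MAX_RESUME_WINDOW_MS = 5 * 60 * 1000;
    private long interruptedAtMs = -1;

    // Record when the interruption (e.g. an incoming call) began.
    public void onInterrupted(long nowMs) {
        interruptedAtMs = nowMs;
    }

    // Decide whether to resume playback when the interruption ends.
    public boolean shouldResume(long nowMs) {
        return interruptedAtMs >= 0
                && (nowMs - interruptedAtMs) < MAX_RESUME_WINDOW_MS;
    }
}
```

In the real library this decision would be driven by telephony and audio-focus callbacks; the sketch only captures the five-minute rule itself.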

There are a whole bunch of goals for the player moving forward, including a default player widget and notification with play/pause buttons where they’re supported. For now, we hope that the dramatically simplified API and sensible default behavior will be useful to some people, and we can gain enough traction to make PlayerHater the de-facto way to play background audio on Android.

Check out PlayerHater on GitHub and let us know what you think!

A graphical pseudo-3D environment in an Android app

Rebecca Nesson posted on Tuesday, March 13th, 2012 | Android, iPhone, Mobile | 2 Comments

At PRX we’re lucky to work with creative, fun-loving clients who want their apps to be more interesting to play with than the average app built from standard iOS components or Android widgets.  In one app we’re currently developing, we’re creating an engaging and fun pop-up-book-style environment in which the user encounters the program content as she navigates through an imaginary world.  It’s beautiful and fun and a real programming challenge.  On the iOS side, Devin created the 3D-ish environment using native iOS layers positioned in 3D space.  It’s my job to create the same effect in the Android version of the app.  The native views in Android don’t support this kind of positioning in z space, and there isn’t a built-in “camera” that can be transformed to give the illusion of depth.  OpenGL could provide the 3D environment, but it would be a steep learning curve for me, and it would make it harder to use the usual Android widgets and activities for the basic functions of the app, like presenting lists of content and playing audio.  Enter AndEngine.

AndEngine is a free 2D game engine for Android.  It allows the creation of a game activity that I can combine with other Android activities to present content.  (I use Android Fragments via the Android Support v4 library to incorporate traditional Android views into the game environment.)  Although AndEngine is intended for 2D games, a forum thread demonstrated how to apply to the camera the same perspective trick we’re using on the iOS side:

private void setFrustum(GL10 pGL) {
    // set field of view to 60 degrees
    float fov_degrees = 60;
    float fov_radians = fov_degrees / 180 * (float) Math.PI;

    // set aspect ratio and distance of the screen
    float aspect = this.getWidth() / this.getHeight();
    float camZ = this.getHeight() / 2 / (float) Math.tan(fov_radians / 2);

    // set projection
    GLHelper.setProjectionIdentityMatrix(pGL);
    GLU.gluPerspective(pGL, fov_degrees, aspect, camZ / 10, camZ * 10);

    // set view
    GLU.gluLookAt(pGL, 0, 0, camZ, 0, 0, 0, 0, 1, 0); // move camera back
    pGL.glScalef(1, -1, 1); // reverse y-axis
    pGL.glTranslatef(-CAMERA_WIDTH / 2, -CAMERA_HEIGHT / 2, 0); // origin at top left
}

What’s happening here is that the camera is pulled back away from the scene and a perspective transform is applied, which makes things in the distance appear farther away.  I can’t explain it any better than the cryptic m34 transform applied to the camera on the iOS side, but the effect is the same.
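If it helps, there is a geometric reading of the camZ formula in that snippet: at distance camZ = (height/2) / tan(fov/2), the frustum’s visible height at the z = 0 plane works out to exactly the screen height, so one GL unit corresponds to one pixel for sprites at depth zero. A quick plain-Java check of that identity (the names here are mine, not AndEngine’s):

```java
// Check the camZ choice from setFrustum(): at distance camZ the camera
// sees a slab of height 2 * camZ * tan(fov / 2) at the z = 0 plane.
public class FrustumCheck {
    // Visible height of the z = 0 plane for a camera at distance camZ.
    public static double visibleHeightAt(double camZ, double fovDegrees) {
        double fovRadians = Math.toRadians(fovDegrees);
        return 2.0 * camZ * Math.tan(fovRadians / 2.0);
    }

    // The camZ that setFrustum() computes for a given screen height.
    public static double camZFor(double screenHeight, double fovDegrees) {
        double fovRadians = Math.toRadians(fovDegrees);
        return (screenHeight / 2.0) / Math.tan(fovRadians / 2.0);
    }
}
```

For a 480-pixel-tall screen and a 60-degree field of view, the visible height at that camZ comes back as 480, which is why sprites at z = 0 render pixel-for-pixel.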

The only other modification I had to make to AndEngine was to create a 3D sprite class that wraps the provided Sprite class and lets us set the z position of sprites as well as their x,y position.  In our app world the user doesn’t interact directly with the scene but rather with a scrolling mechanism that moves the scene “on rails” as the user scrolls.  The effect is beautiful but also somewhat hard to capture in screenshots.  You’ll just have to buy the app when it comes out!

The good news is, the app is shaping up beautifully and AndEngine has really come through for what we needed to do.  But there’s a big remaining issue that I’d like to solve.  AndEngine takes care of all of the touches on the scene and passes them to the sprites, but it does so based on their x,y coordinates.  Unfortunately, the x,y coordinates it calculates from screen touches don’t correspond to the sprites’ locations within the scene, because of the perspective transformation based on depth.  Under the covers OpenGL knows where the sprites are, because it drew them correctly on the screen, but AndEngine itself does not.  Additionally, I can only get access to a GL10 instance, which does not provide the functions I need to project and unproject coordinates.  For now I’m working around this issue, but in theory I should be able to convert 2D screen coordinates into 3D scene coordinates using the ratio of the scene size to the screen size, the position of the camera, the angle of view, and the distance of the object in question from the camera.  So far I haven’t succeeded, but when I get a few days to step back from the project I’ll turn to it again.  If you think you know how it should be done, please comment!
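For what it’s worth, here is my current understanding of the similar-triangles math, sketched in plain Java. It assumes the camera sits at distance camZ from the z = 0 plane (as in setFrustum() above), so a sprite at depth z is scaled by camZ / (camZ − z) around the screen center; unprojecting a touch means inverting that scale. I haven’t verified this against AndEngine’s actual touch handling, so treat it as a starting point:

```java
// Sketch: unproject a touch coordinate to scene coordinates for a sprite
// at depth z. Assumes a perspective scale of camZ / (camZ - z) around the
// screen center, so inverting it shrinks the offset from the center.
public class TouchUnproject {
    public static float sceneX(float touchX, float centerX, float camZ, float z) {
        return centerX + (touchX - centerX) * (camZ - z) / camZ;
    }

    public static float sceneY(float touchY, float centerY, float camZ, float z) {
        return centerY + (touchY - centerY) * (camZ - z) / camZ;
    }
}
```

A sprite at z = 0 maps straight through; a sprite halfway to the camera looks twice as big on screen, so its touch offsets shrink by half when unprojected.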

Android Love: Part 1 of 2

Chris Rhoden posted on Thursday, February 3rd, 2011 | Android, Mobile | No Comments

Hey everyone! First, I want to apologize for what has been a much longer than anticipated break in posts on our technical blog. The tech team is keeping very busy, and it’s difficult to remember to make a post sometimes. That said, we know it’s very important for us to share what we are working on at a more technical level.

Recently, I’ve been getting to play with Android much more and it’s been making me very happy. Specifically, I have been working with the code provided by our Google Summer of Code student last year which will (very shortly!) become the Public Radio Player port for Android Smartphones. We’re very, very excited.

Force Close Dialog. Oh No!

As I have been playing with the application, adding polish, and fleshing out functionality, I have run into a couple of situations where serious computation is being done on the UI thread. The application slows to a halt, and the user receives the dreaded Force Close dialog. No good.

In general, it’s a good idea to keep everything that does not deal directly with the UI on a separate thread. Unfortunately, because the Android UI framework is not thread safe, you can’t simply spin off threads that perform some long-running action and then update the UI; you need to actually tell the code living on the UI thread to update.

There are dozens and dozens of ways to accomplish this, but only two are appropriate if you want anyone to be able to read your code in the future: Handlers and AsyncTasks. They’re useful in different situations.

I’ll talk about AsyncTasks next week, but this week I will cover a pattern that is used all over Android, Handlers.

Android Handlers: How Do They Work?

Handlers enable inter-thread communication by passing Messages back and forth. This means that you can create a Handler on the UI thread, fire up a new thread with a reference to that Handler, and keep all of your UI work where it belongs.

Now, there are a couple of reasons why you might not want to use Handlers for everything. For one, they’re (sort of) expensive, and they’re not the most readable way to handle background tasks. There are some best practices for working with Handlers that alleviate some of this, and I’m going to scratch the surface here.

First, you should always use one Handler per Activity. Messages include an integer property called what, which is typically used to describe how the Handler should process the message with a switch statement:

Handler mHandler = new Handler() {
    @Override
    public void handleMessage(Message m) {
        switch (m.what) {
        case MESSAGE_1:
            doSomething(m.obj);
            break;
        case MESSAGE_A:
            doSomethingElse(m.obj);
            break;
        }
    }
};

You should also take advantage of the properties available in Messages as much as possible, rather than subclassing it. Check out the documentation for what properties are available.

For Example

Let’s take a look at using a Handler in the context of an Android Activity:


package org.prx.myapp;

import android.app.Activity;
import android.app.ProgressDialog;
import android.location.Location;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;

public class ShowLocationActivity extends Activity {

    protected LocationHelper locHelper;
    protected ProgressDialog mDialog;
    private Handler mHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            switch (msg.what) {
            case LocationHelper.LOCATION_OBTAINED:
                locHelper.calculateDistances((Location) msg.obj, this);
                mDialog.setMessage("Calculating distances...");
                break;
            case LocationHelper.DISTANCES_CALCULATED:
                mDialog.dismiss();
                break;
            }
        }
    };

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (locHelper == null) locHelper = new LocationHelper(this);
        mDialog = ProgressDialog.show(this, "", "Getting your location...", true);
        locHelper.obtainLocation(mHandler);
    }
}

In our LocationHelper class, we accept a Handler for each of the long running methods. Our calculateDistances() method might look something like the following:

public void calculateDistances(final Location loc, final Handler h) {
    new Thread(new Runnable() {
        public void run() {
            /* Do some expensive calculation here */
            Message.obtain(h, DISTANCES_CALCULATED, null, null).sendToTarget();
        }
    }).start();
}

This method starts up a new thread that then informs the main UI thread when the calculations are done. At no point is the main UI thread blocked, but we are still able to properly present the user with information about what is currently happening.

Handlers are not the Last Word

As I briefly mentioned earlier, there is another system available (called an AsyncTask) which provides an additional layer of abstraction for cases like this. I will go into those in detail next week, with a full working code example.

Thanks so much for reading, and if you have any questions, drop them in the comments below!
