Mobile

Status Rejected: You Should Create A Canary Build For Your iPhone App

Matt MacDonald posted on Friday, March 15th, 2013 | Android, iPhone, Mobile | No Comments

Submitting your shiny new iOS app to Apple for review is anxiety-provoking. We’ve built a number of apps for both iOS and Android, and the review and approval process for iOS still makes me worry. We spend time developing apps, prepping marketing materials, and coordinating press releases with PR firms on larger projects. You don’t want to see this message just before your targeted launch date:

iTunes Connect rejection notice

Enter the “canary build”. We do our best to adhere to the code and UI guidelines that Apple publishes; they change, and we try to keep up. Still, we’ve found that the only way to really know whether your app is going to be approved is to actually submit it for review. We call these submissions “canary builds”. When we hit critical milestones in our development process, we often submit a build to Apple that we have no intention of releasing to the public. The approval process is a pain, but you really don’t want to find out that you have an issue just days before your app is released. We use the “Hold for Developer Release” version release control setting when submitting these updates so that we can get an app into the approval queue, have the Apple review team ferret out any potential issues, and then make our changes.

After having a few apps rejected a little too close to a deadline, we started using these builds as a way to catch issues earlier in the process.

Hope that helps someone else.

That is all,
Matt MacDonald

iOS 6.0 Causes CDN Overages

Chris Rhoden posted on Wednesday, November 14th, 2012 | iPad, iPhone, Mobile | 26 Comments

We received a report from the folks at This American Life of extremely high bills from their CDN for the month of October. After researching the problem, we believe it is caused by bugs in the iOS 6 audio playback frameworks that result in files being downloaded multiple times, which could mean dramatic overage charges for both content distributors and data plan customers.

Background

We had seen a pretty intense spike in traffic on 99% Invisible and The Moth (both of which we host) last month, and had a pretty good idea of when the spike began as a result. At the time, we had chalked the rather extreme increase in bandwidth (seen below) up to the release of Apple’s new Podcast app, which featured 99% Invisible prominently at release. We figured that Apple had brought 99% Invisible and The Moth some new subscribers, and were pretty happy once we had battened down the hatches a bit.

But when we heard from This American Life that they were seeing an order of magnitude increase in their bandwidth usage, we needed to ensure that there wasn’t a problem with our apps that was causing unusual download behavior. Based on our research, it looks like the issue is iOS 6.

The Behavior

To begin, we wanted to know if there was a way that we could differentiate traffic originating from one of our apps from traffic originating from other apps. Because we are using the AV Foundation framework, it turned out that we couldn’t: the user agent string is the OS standard one, not app-specific. We were able to see that the version of iOS was 6.0, but not the name of the app playing the audio. However, the Apache logs we looked at suggested something unusual. In the following screenshot, the file being downloaded is 8614457 bytes long.


What you can see is that the first 2 bytes of the file (in most cases, this will be ID, as in ID3) are downloaded in one request, and then what appears to be the entire file is downloaded multiple times on iOS 6 and only once on iOS 5. (This appears to be an artifact of the way that Apache logs range requests, and we have reason to believe that the file was not downloaded many complete times, but there are still clearly problems.)

Following this, we set up a proxy so that we could watch requests as they were coming from the app. The player appears to get into a state where it makes multiple requests per second and closes them rapidly. Because the ranges of these requests seem to overlap and the requests themselves each carry some overhead, this causes a single download of an MP3 to use significantly more bandwidth than in iOS 5. In one case, the playback of a single 30MB episode caused the transfer of over 100MB of data.
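For the curious, here is roughly the kind of bookkeeping we did with the proxied requests: add up the bytes covered by every Range header a client sent and compare that with the distinct bytes of the file it actually needed. This is just an illustrative sketch, not the tooling we used; the ranges.txt input file, its one-range-per-line format, and the hard-coded file size are all made up for the example.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Compare the bytes a client actually pulled over the wire (counting
// overlapping ranges) with the distinct bytes of the file those ranges
// cover. A big gap between the two numbers means the same content is
// being re-downloaded.
public class RangeOverlapTally {
    private static final Pattern RANGE = Pattern.compile("bytes=(\\d+)-(\\d+)");

    public static void main(String[] args) throws IOException {
        long fileSize = 8614457L;                         // size of the episode being streamed
        long transferred = 0L;                            // bytes sent, overlaps included
        boolean[] covered = new boolean[(int) fileSize];  // which bytes were ever sent

        // ranges.txt holds one "bytes=start-end" value per request, as captured
        // from the proxy (open-ended ranges are ignored for simplicity).
        try (BufferedReader in = new BufferedReader(new FileReader("ranges.txt"))) {
            String line;
            while ((line = in.readLine()) != null) {
                Matcher m = RANGE.matcher(line);
                if (!m.find()) continue;
                int start = Integer.parseInt(m.group(1));
                int end = (int) Math.min(Long.parseLong(m.group(2)), fileSize - 1);
                transferred += (end - start + 1);
                for (int i = start; i <= end; i++) covered[i] = true;
            }
        }

        long distinct = 0;
        for (boolean b : covered) if (b) distinct++;

        System.out.printf("transferred:       %,d bytes%n", transferred);
        System.out.printf("distinct coverage: %,d of %,d bytes%n", distinct, fileSize);
        System.out.printf("overhead factor:   %.2fx%n", (double) transferred / fileSize);
    }
}

An overhead factor well above 1.0 for a single listening session is the signature of the re-download behavior described above.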

We saw this behavior start after a period of behaving correctly (in some cases behaving correctly for about 5 minutes before the issue appeared) in both our own apps and the Apple Podcast app. We were able to reproduce the issue with several podcasts in the Podcast app, including podcasts using Limelight and Akamai CDNs. We have been unable to reproduce the issue using iOS 5 or using iOS 6.0.1, but there are still many people using iOS 6.0.0. We believe that this issue, combined with the bug causing the phone to behave as though it is connected to WiFi even when it is not, could account for the significant data overages reported with the release of iOS 6.

The strangest bit of behavior happens when the ranges on these requests reach the end of the file. We were able to consistently see that when the file has completed downloading, it begins downloading again from the beginning of the file and continues for as long as one is streaming the file. This means that, for as long as one is listening to audio being streamed with iOS 6, it is using significant amounts of data. Watch the range headers in this video, which is monitoring the HTTP activity of the stock Podcast app (v1.1.2) on iOS 6.0.0 playing back an episode of This Week in Tech. The file finishes buffering and is completely downloaded at around 0:36.

Conclusion

There appears to be a system-wide problem with the AV Foundation framework in iOS 6.0.0, resulting in significantly higher data costs for iPhone users and internet distributors. Users who have not done so should immediately upgrade iOS 6.0.0 devices to iOS 6.0.1, which, as far as we can confirm, fixes the issue on WiFi. While some carriers are offering concessions to customers who may have been affected by this problem, Apple does not appear to have acknowledged the specific issue. The release notes for iOS 6.0.1 mention a change related to WiFi (likely referring to the problem with devices that reported they were connected to WiFi while actually on 3G and LTE networks), which may be related to the change that fixed this issue.

Caveats

Our tests did not cover 3G or LTE data, as we relied on connecting to WiFi to perform them. Based on the server logs we have access to, however, it appears that this issue exists over mobile broadband as well.

KCRW Music Mine app on Spotify

Matt MacDonald posted on Tuesday, June 19th, 2012 | KCRW | No Comments

When PRX worked with KCRW last year to design and develop the Music Mine app for iPad, we knew that there was a reason we spent time and effort building a reusable API — we just didn’t know that we’d be building a Spotify app with it.

Last fall, the popular music-streaming service Spotify announced that they were creating a platform for developers and music-focused companies to create apps that would be available in their desktop music software. Basically it’s the ability to have an app inside of an app — in our case Music Mine inside of Spotify.

KCRW recently reached out to PRX, and within a week, we had conceptualized, designed, built, and delivered a fully functional Music Mine Spotify app to KCRW. We got the word from KCRW late last week that the app had been approved. We’re excited to be the first public media app on the Spotify platform and are even more excited to see the analytics and numbers once they are available.

If you use Spotify you can click on this link and it’ll install the app for you.

Matt & The PRX Tech Team

PBS Frontline QR Code Hack

Chris Rhoden posted on Monday, May 7th, 2012 | Javascript, Mobile, Node.js | 1 Comment

Hey everyone! I recently built a system which was designed to turn your phone into a sort of smart remote control which works over the internet. This is the first part of a two-part post about that system, where I discuss what I built. In a future post, I will discuss how everything was put together from a much more technical side.

A couple of weeks ago, I participated in a hackfest which was thrown by the folks at Frontline. The goal was to spend a few hours putting together a glimpse of the future of documentary storytelling. Four projects came out of the day, but I would like to talk about the project I worked on.

As far as I could tell, I was the only developer resource on the team I was on, which resulted in a bit more focus on the technical aspects of things than is typical. As such, I can tell you that the documentary I worked with was:

  1. Already slated to receive some level of interactive work.
  2. About the abuses of our economic system by Wall Street.
  3. A Frontline documentary.

There’s really not much else I could say. I did end up watching the trailer a bunch of times, but even that started blurring, as my primary goal was to get something that sort of worked ready for demo.

The point of all of this context is that it seems very clear to me that this could be used in a variety of situations – not just those where a documentary is concerned. Anyway, on to the demo.

The first interaction you will have with the system is a QR code, which you are invited to scan with your phone. This QR code points at a mobile-optimized website.

When the website loads in your mobile browser, the QR code disappears and a video begins playing on your computer. The video on the computer screen is deliberately uncluttered, as the point is to allow you to watch the video in a lean-back manner, with no need to interact with your computer.

In your mobile browser, you are given a big green button which also acts as an indicator for your progress through the film. Pressing the button causes a bookmark to be saved of where you are in the film, including context for the segment you were watching and, in some cases, additional content and research. During our demo, one segment of the film, when bookmarked, provided access to watch additional raw footage.

When the bookmark button is pressed, a small indication is shown on the main screen that the bookmark has been saved.

The goal for the project was to provide a way to augment a viewing experience while in no way compromising the narrative created by the filmmaker. There were a number of additional features which I would have loved to have implemented, such as controls for the onscreen content, and a dashboard showing your bookmarks on the main screen when the video has finished. That having been said, I believe that the goals for the project were shown off well, and am proud of what we walked out with.

Next time, I’ll go into the actual guts of how this system worked, and maybe some other uses for the technology.

Announcing PlayerHater. Hate the Player, not the Game.

Chris Rhoden posted on Tuesday, March 27th, 2012 | Android, Mobile | No Comments

TL; DR: Happy Tuesday! I wrote a library for working with background audio in Android apps. PRX is letting me give it away. Yay Android! Yay PRX!

Let’s talk a little history, shall we?

PRX makes mobile apps for public radio programs and stations. When we were asked to make an Android app for This American Life, we found that the Android ecosystem was just a little bit fractured. We built a very large and somewhat messy chunk of code to help us work through the issues of supporting 4 different major versions of an operating system, including handling weird and widely covered bugs and device/OS interactions.

But no more! We found that by dropping support for the very, very old versions of Android, we were able to lock into a much more stable API. There’s still a whole bunch of work that needs to be done in order to start playing audio in the background properly, though (think foreground notifications, the prepare/play API, and handling audio session changes). So I set to work building something completely from scratch that tackles these problems. We even thought long and hard about what should happen if your audio is interrupted by a phone call (we start back up again when you hang up if the call took less than 5 minutes; otherwise, we kill the session).
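To give a flavor of what that interruption handling looks like, here is a minimal sketch of the phone-call policy written against the stock MediaPlayer and audio focus callbacks. This is not PlayerHater’s actual code, and the class name is made up; it just shows the shape of the behavior: pause when focus is lost, resume if it comes back within five minutes, otherwise kill the session.

import android.media.AudioManager;
import android.media.MediaPlayer;
import android.os.SystemClock;

// A sketch of the interruption policy: pause when audio focus is lost to a
// call, resume if focus returns within five minutes, otherwise tear the
// session down. Registering this listener with AudioManager.requestAudioFocus()
// is left out for brevity.
public class InterruptionPolicy implements AudioManager.OnAudioFocusChangeListener {
    private static final long RESUME_WINDOW_MS = 5 * 60 * 1000;

    private final MediaPlayer mPlayer;
    private long mPausedAt = -1;

    public InterruptionPolicy(MediaPlayer player) {
        mPlayer = player;
    }

    @Override
    public void onAudioFocusChange(int focusChange) {
        switch (focusChange) {
            case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
            case AudioManager.AUDIOFOCUS_LOSS:
                // A phone call (or another app) took over; remember when.
                if (mPlayer.isPlaying()) {
                    mPlayer.pause();
                    mPausedAt = SystemClock.elapsedRealtime();
                }
                break;

            case AudioManager.AUDIOFOCUS_GAIN:
                if (mPausedAt < 0) break;
                long pausedFor = SystemClock.elapsedRealtime() - mPausedAt;
                if (pausedFor <= RESUME_WINDOW_MS) {
                    mPlayer.start();   // short interruption: pick up where we left off
                } else {
                    mPlayer.stop();    // long interruption: kill the session
                    mPlayer.release();
                }
                mPausedAt = -1;
                break;
        }
    }
}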

There are a whole bunch of goals for the player moving forward, including a default player widget and a notification with play/pause buttons where they’re supported. For now, we hope that the dramatically simplified API and sensible default behavior will be useful to some people, and that we can gain enough traction to make PlayerHater the de facto way to play background audio on Android.

Check out PlayerHater on GitHub and let us know what you think!

A graphical pseudo-3D environment in an Android app

Rebecca Nesson posted on Tuesday, March 13th, 2012 | Android, iPhone, Mobile | 2 Comments

At PRX we’re lucky to get to work with creative, fun-loving clients who want their apps to be more interesting to play with than the average app made from standard iOS components or Android widgets. In one app we’re currently developing, we’re creating an engaging and fun pop-up book style environment in which the user encounters the program content as she navigates through an imaginary world. It’s beautiful and fun and a real programming challenge. On the iOS side, Devin created the 3D-ish environment using native iOS layers positioned in 3D space. It’s my job to create the same effect in the Android version of the app. The native views in Android don’t support this kind of positioning in z space, and there isn’t a built-in “camera” that can be transformed to give the illusion of depth. OpenGL could provide the 3D environment, but it would be a steep learning curve for me, and it would make it harder to use the usual Android widgets and activities for performing the basic functions of the app, like presenting lists of content and playing audio. Enter AndEngine.

AndEngine is a free 2D game engine for Android. It allows the creation of a game activity that I could combine with other Android activities to present content. (I use Android Fragments via the Android Support V4 library to incorporate traditional Android views into the game environment.) Although AndEngine is intended for 2D games, a forum thread demonstrated how to do the same perspective trick to AndEngine’s camera that we’re doing on the iOS side:

private void setFrustum(GL10 pGL)
{
    // set field of view to 60 degrees
    float fov_degrees = 60;
    float fov_radians = fov_degrees / 180 * (float) Math.PI;

    // set aspect ratio and distance of the screen
    float aspect = this.getWidth() / this.getHeight();
    float camZ = this.getHeight() / 2 / (float) Math.tan(fov_radians / 2);

    // set projection
    GLHelper.setProjectionIdentityMatrix(pGL);
    GLU.gluPerspective(pGL, fov_degrees, aspect, camZ / 10, camZ * 10);

    // set view
    GLU.gluLookAt(pGL, 0, 0, camZ, 0, 0, 0, 0, 1, 0); // move camera back
    pGL.glScalef(1, -1, 1);                            // reverse y-axis
    pGL.glTranslatef(-CAMERA_WIDTH / 2, -CAMERA_HEIGHT / 2, 0); // origin at top left
}

What’s happening here is that the camera is being pulled back away from the scene and a perspective transform is being applied that causes things in the distance to appear farther away.  I can’t explain it any better than the cryptic m34 transform that is applied to the camera on the iOS side, but the effect is the same.

The only other modification I had to make to AndEngine was to create a 3D sprite class that wraps the provided Sprite class and allows the user to set the z position of sprites as well as their x,y position. In our app world the user doesn’t interact directly with the scene but rather with a scrolling mechanism that moves the scene “on rails” as the user scrolls. The effect is beautiful but also somewhat hard to capture in screenshots. You’ll just have to buy the app when it comes out!

The good news is, the app is shaping up beautifully and AndEngine has really come through for what we needed to do.  But there’s a big remaining issue that I’d like to solve.  AndEngine takes care of all of the touches on the scene and passes them to the sprites.  But it does it based on their x,y coordinates.  Unfortunately, the x,y coordinates it calculates based on the touches of the screen do not correspond to the location of the sprites within the scene because of the perspective transformation based on depth.  Under the covers OpenGL knows where the sprites are because it drew them correctly on the screen, but AndEngine itself does not know.  Additionally, I can only get access to a GL10 instance which does not provide the functions I need to project and unproject coordinates.  For now I’m working around this issue, but theoretically I should be able to do the math to convert 2D screen coordinates into 3D scene coordinates using the ratio of the scene size to the screen size, the position of the camera, the angle of view, and the distance of the object in question from the camera.  So far I haven’t succeeded in doing it, but when I get a few days to step back from the project I’ll turn to it again.  If you think you know how it should be done, please comment!
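For reference, here is the similar-triangles math I have in mind, written out as a sketch. It assumes the same setup as setFrustum() above (scene plane at z = 0, camera pulled back to camZ, positive z meaning a sprite sits closer to the camera). I haven’t verified it against AndEngine’s touch handling yet, so treat it as a starting point rather than a working solution.

// Project a 2D touch point onto the plane a sprite occupies at depth
// spriteZ, using the same camera setup as setFrustum() above. Purely
// illustrative; not yet wired into the app.
public final class TouchUnproject {
    private TouchUnproject() {}

    public static float[] touchToScene(float touchX, float touchY,
                                       float screenWidth, float screenHeight,
                                       float fovDegrees, float spriteZ) {
        float fovRadians = fovDegrees / 180f * (float) Math.PI;
        float camZ = (screenHeight / 2f) / (float) Math.tan(fovRadians / 2f);

        // A sprite closer to the camera (larger spriteZ) is drawn larger on
        // screen, so a touch must be scaled down by the same ratio to land
        // on that sprite's plane.
        float ratio = (camZ - spriteZ) / camZ;

        float centerX = screenWidth / 2f;
        float centerY = screenHeight / 2f;
        return new float[] {
            centerX + (touchX - centerX) * ratio,
            centerY + (touchY - centerY) * ratio
        };
    }
}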

Pledge your support for mobile

Matt MacDonald posted on Wednesday, February 29th, 2012 | Mobile | 1 Comment

Hi,

I wanted to make a quick plug for a session that I’m organizing at the 2012 Integrated Media Association conference next week. The session is titled “Pledge your support for mobile,” and it will be both an overview of current pledge/donation practices and a hands-on working session. We’ll review existing pledge and donation practices, what works and where people get hung up, and then both the session panelists and attendees will attempt to streamline and significantly improve the pledge process for public radio and television stations.

The major themes for the IMA conference this year are:

  1. Motivate attendees to take individual ownership for innovation
  2. Provide the attendees with knowledge that they can take home and use immediately

I’m really excited about our session panelists and facilitators; I’ve worked with each of them in the past on mobile and tablet projects: Devin Chalmers (KCRW Music Mine iPad, Radiolab iPhone), Dan Saltzman from Effective UI (KCRW Music Mine iPad), and Marine Boudeau from New York Public Radio (WNYC, WQXR, Radiolab iPhone and Android). Each has worked outside of public media and has extensive experience in user-centered design and interfaces.

I’m also excited to try out a hybrid session with both a conference panel and a hands-on hackathon approach. My hope is that we all learn some new things, then create something new, and all leave with actionable steps. This will be a session where attendees are encouraged to participate, one where we use our collective strengths to improve a process that is ripe for revamp. If you are attending IMA, I’m hoping you can join us.

Our session is on Thursday March 8th starting at 2pm.

I’ve uploaded my in-progress slides for the session in an attempt to entice people to come check out our session.

Exploring The Depths Of KCRW Music Mine: Part One

Daniel Gross posted on Monday, September 19th, 2011 | iPad, KCRW, Mobile | No Comments

KCRW Music Mine launched recently to rave reviews. The free iPad app is the result of a collaboration between KCRW, RoundArch, and PRX, which led the project and development efforts. Music Mine pushes the limits of the tablet platform to create a truly unique music discovery experience. Daniel Gross interviews Matt MacDonald, PRX’s Director of Project Management, about the app.

What can Music Mine do?

The app is an iPad music player that opens up to a grid of 100 songs from artists that played on KCRW shows in the last week. What the app does is highlight tracks that have been hand-picked by a KCRW DJ who wanted you to hear them. You might find odd pairings, maybe TV on the Radio right next to the newest Jeff Bridges country song. What they have in common is that these are all great songs that were picked by humans with taste and a point of view, not selected by an algorithm. The app lets you explore each of these artists in greater depth, and we use software to add supporting songs, videos, photos, bios and blog posts.

How did it all get started?

We started talking with KCRW in the summer of 2010 about building an iPad app. PRX has been doing iOS development since late 2008, with apps including the Public Radio Player for iPhone and the This American Life app for iPhone, iPad and Android. So we knew we could make this project ambitious.

As a public radio station, KCRW produces a ton of content. But we wanted to build an app tightly focused on their music. Anil Dewan at KCRW helped us understand their online music identity. But given the existing competition facing KCRW and listening platforms, we were asking: how the heck do we stand out and differentiate ourselves?

And? What makes Music Mine so different from other apps?

The biggest difference is Music Mine’s human touch. Other music apps like Pandora, Discover, Spotify or Last.fm are designed to provide access to every song and every artist in their huge catalogs. When you’re listening to Wilco, say, those apps use an algorithm to automatically recommend the next 5 or 6 ‘related’ artists. KCRW has a huge and diverse catalog but we wanted to amplify and highlight the work that the DJs do by narrowing what you should be listening to.

The user interactions in the app have an attitude and a point of view, like the KCRW DJs. One goal for us was to encourage musical exploration and delight, so some of our interaction decisions basically force you to try out music that you might not be familiar with. We made sure to encourage that by not adding features like search or sorting and filtering tools. We also could have directly presented the underlying ordered list that backs the grid layout, but we felt that doing so would have made you less likely to try out new songs and that you might only search for and tap those you know.

So you want an app with people behind it – but you still want to take advantage of technology, right? What do you need to do that?

Yep, we definitely want to use technology to help us out here. Our office is in Harvard Square, and the local tech scene here is pretty amazing. One of the companies we’d wanted to work with for a long time was The Echo Nest over in Somerville. When we started dreaming up the Music Mine idea we went straight to them.

The Echo Nest provides APIs that allow us to request a ton of information about artists and songs: found audio, video, relevant blog posts, bios and photos. We knew that merging the KCRW playlists with Echo Nest data sources would give us the ability to create a unique window into the KCRW music universe. Rebecca Nesson here at PRX led the effort to merge the data sources from KCRW and Echo Nest. While each of the songs available in the grid is initially picked by a human, PRX wrote software to help narrow down the thousands of tracks that could be available to the 100 that appear on the grid.

How did the project evolve as you built and rebuilt the product?

A big question early in the project was: should the primary experience for how people listen in the app be focused on the long, multi-hour DJ sets, or center on playback of individual tracks? PRX and RoundArch interviewed people specifically about what they might want in a music discovery app from KCRW, and after hearing from them we decided to focus the experience on track-by-track selection. That decision proved challenging, and mid-way through the project there was a point where we really had to consider ditching the track-by-track access and moving back to a DJ-set-focused app.

The sub-views in the app, like the artist photo, song, video, bio and blog pages, came together pretty quickly and didn’t change much over the project, but the home view, the grid, was definitely an area where we iterated on ideas for a long time. Yeah — that took a really long time. We might have had 8 or 10 significant tweaks to the layout and presentation. But I think all the time we spent iterating on the ideas really paid off in the execution.

Alternate DJ set focused home view

Early DJ focused home view

Initial grid comp

Final grid design.

How do you feel about it? Is there anything you’d improve or add?

Well…no. Actually, I’m really happy with it. I just re-read the original product vision and I think we’ve really held true to that over the last ten or so months.

The KCRW iPad Experience project will focus on creating an iPad experience that quickly engages target users by leveraging KCRW’s unique style, taste and hand picked music. In addition, tight conceptual focus and appealing interaction models will aid in driving long-term user engagement.

People that we interviewed about what they might want in a music discovery app said they wanted us to make good music important to them, help them find new good music, and improve their awareness. We’ve done that. People wanted us to connect them with great new artists, and help them learn more about music and artists they’re already familiar with. I think we’ve done that too.

What’s next from the PRX team?

I’m really excited about an iPhone project that we’re working on with a very popular public radio show. I feel like we’re developing a unique way to help people interact and connect with the show. We’re pushing the device’s capabilities, coming up with new interactions that can surprise and delight people.

Any last thoughts?

People talk about software as engineering, and it’s partially that, but it’s also a craft. This kind of work is like building a beautiful piece of furniture. When it’s done poorly, you end up with a cheap nothing that you throw away. But when it’s done well, you’ll never see all the work that went into it. Sometimes that means a lack of appreciation for the thought and care that was put in.

Devin Chalmers, Rebecca Nesson and Andrew Kuklewicz are top-notch developers, maybe the best people I’ve ever worked with. But even with the best people working together, it requires time and freedom to explore and do it right. We could have finished a long time ago if we’d done it poorly. Instead, we came up with something that we’re proud of, KCRW is happy with, and it seems so are the people using it.

Part Two

Coming up: Rebecca Nesson and Devin Chalmers will dig into other, more technical aspects of the KCRW Music Mine app, including perspective transformations on scroll views, knob-twiddly stuff to tweak behavior, and how we had to figure out layouts of a 4×6 grid in both portrait and landscape modes.

Building a Better Contact Sheet

Chris Kalafarski posted on Monday, April 25th, 2011 | iPhone, Mobile | No Comments

Last week we decided to update the Photoshop file we provide to radio stations to allow them to customize the look of our core station iPhone app. There were two main problems with the version we had been using, the first being simply the organization of the layers. There was no standardized mapping of layers in the file to the individual files we need to use in the final app. The other problem was that our process for breaking the file down into its component parts was extremely time consuming. We were using an AppleScript that had been built in-house which extracted specific layers from the document, one-by-one. Additionally, we were maintaining low-res and high-res versions of the template for retina and standard displays.

The solution we decided to implement isn’t perfect, but it dramatically speeds up the processing step on our end, allows for a much more sensible organization of the component graphics within the document, and provides a buffer between what the station designer is doing on their end and what we’re doing on our end.

Buttons in the SpriteSheet

In order to ensure proper and meaningful organization of the template throughout the workflow, each graphical element we require from the stations is now represented as its own Smart Object. Each button, icon, background, etc. is a distinct smart object. This way we can be sure that all changes being made to a graphic are in the appropriate and anticipated place. If our default icon is a single raster layer but the station has a complex multi-layer vector replacement, there is no issue. Unlike the old version, where new layers essentially went unaccounted for unless we were told explicitly that they had been added, now all the changes being made to each graphic are entirely encapsulated in a single object. This also allows us to set the canvas size on a per-graphic basis, so a designer doesn’t accidentally build a graphic that is too large for the space it will have in the final iPhone app.

Beyond that, the file is simply organized in a way such that relevant smart objects are in layer groups, so it’s easy for the designer to find what she is looking for. Icons are all in one place, backgrounds are in another, and status indicators another. All of the default objects are positioned roughly as they would appear in the final app inside a virtual iPhone, ensuring the designer is building the graphics in the correct context.

By using smart objects, we are also able to rethink the way the file is processed once it’s returned to PRX. Rather than run it through a script that is heavily dependent on specific layer names, we are now taking advantage of the dynamic Slice tool that Photoshop offers. You may be familiar with using Slice in Photoshop or Fireworks from when people were building websites with tables. It allows you to define contiguous regions of a file which can automatically be batch output, and additionally each slice can be given a unique name that persists inside the Photoshop file itself. The tool also allows slices to be dynamically linked to specific layers in the file, with each slice resizing to fit the contents of its layer. In our case, each smart object is a layer that has been linked to a slice, so if a station designer should choose to replace a default with an image that is larger or smaller, the slice we end up outputting is adjusted automatically.

Unfortunately, the slice tool is a relatively simplistic tool, and is only intended to work on the merged contents of the defined area. This is problematic, for instance, with toolbars, where we allow the designer to change both the background and the buttons. In our template we keep those graphics stacked on top of each other, just as they would appear in the app. The slice tool, though, would not handle that situation properly, and would export a single image rather than the separate parts.

Because we’re using smart objects, though, it’s trivial to duplicate each smart object somewhere else in the file where they can be arranged to prevent overlap. Duplicates of smart objects all point to the same source, so when the designer makes their changes in the part of the file where the layers are arranged to mimic the iPhone, the exploded view of the graphics we maintain behind the scenes immediately reflects the changes.

Once we get the file back, we simply hide the designer-facing layers and bring up the matching layers we need for the batch export. A trip to the Save for Web dialog makes the export process take all of 10 seconds, orders of magnitude faster than the old AppleScript. Since the names of the files being generated are pulled from the slice metadata, we ensure all of the station’s images will end up in the appropriate files we need to produce the iPhone app.

There are two caveats with this process. The first is that even though the canvas size necessary for the user-facing layers is relatively small (technically just the resolution of a retina display), the actual file ends up being much larger. Because we need an unobstructed view of every single element, even when they are placed as closely together as possible the canvas ends up being about 4000×1500. This ends up being just empty space most of the time, and not really a major issue, but it is not ideal.

Because we will need to extract images through cropping, the canvas is much larger than the working area.

The other problem is something Photoshop is actually doing to give the user more control. Because smart objects can contain vector graphics, Photoshop allows them to be positioned at a sub-pixel level, even in the raster-based world of the actual Photoshop file. It does this even when the smart object contains only raster items itself. An unfortunate side effect of this is that unpredictable rendering can occur. Sometimes when the canvas height or width of a smart object is an odd number, Photoshop will try to center it inside the parent document, which places it halfway between real pixels. When that happens edge pixels of the smart object’s contents are improperly rendered (one edge is truncated a row early, and the edge pixels of the opposite edge are repeated). It’s not a hard problem to correct, but some care must be given since the entire document is based on smart objects.

Building an iOS Application Platform

Rebecca Nesson posted on Monday, March 21st, 2011 | Git, iPhone, Mobile | No Comments

In the last several months PRX introduced a new product to stations: an iPhone public radio station app platform. In the upcoming months the first several apps built on this platform will begin to appear in the iTunes app store. The goal with these apps is to provide a highly customizable application that allows stations to showcase their content and their brand. The challenge for the stations is that they are on constrained budgets and cannot pay the large app development costs usually associated with a custom-built iPhone app. The challenge for us is to provide them with a satisfying alternative that has a full feature set, a custom feel, and a price tag that they can afford. To meet this challenge we’ve developed a platform that abstracts as much data as possible out of the application and streamlines the maintenance and updates of shared code between our applications. Here’s a laundry list of the techniques we’re using to accomplish this.

The Server Backend

We use a Rails application as a backend server to maintain the data about the stations. This includes the information about their programs and their schedules as well as other app-related data, such as the content of various HTML-based views that appear in the app. We also use this backend to integrate with third-party providers of data for the app, such as Publink.com, which provides access to some stations’ member or listener discount systems. Using a backend server application makes it easy to update data in the app without a new application release to the iTunes store, offers the potential of letting stations maintain their own data, and helps us standardize the models and formats of the data on which the iPhone code relies.

App Customization using Inheritance

After removing as much of the data (and the variations in the data formats) from the phone as possible, there remains the problem of how to provide a core set of features and native views for the iPhone app while also giving stations a lot of leeway to customize the way their app looks and functions, and maintaining the ability to provide improvements to features without breaking or overwriting customizations. We’re using inheritance to solve this problem. Each model, view, and controller in our app has both a “Base” version and a “Custom” version, where the Base version is the parent class of the Custom version. We develop all features in the Base versions and expect clients not to make customizations or other changes within these files. That way, when we update a Base version of a class we can push the change into all of the apps without fear of overwriting a customization. Throughout the code outside of the class, we refer only to the Custom version of a class so that any customizations made in the Custom version will be used instead of the code in the Base. This allows a station to make small changes to a particular area of their app or even to completely redo the way a particular area of the application works.
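The apps themselves are Objective-C, but the shape of the pattern is easy to sketch in any object-oriented language. Here it is in Java, with hypothetical class names: features live in the Base class, a station’s changes live in the Custom subclass, and the rest of the code only ever instantiates the Custom version.

// Features are developed only in the Base class, which is shared across
// every station app. (Class and method names here are hypothetical; the
// real platform is Objective-C.)
class BaseScheduleController {
    String title() {
        return "Schedule";
    }

    void refresh() {
        // shared behavior: fetch the station's schedule from the backend, etc.
    }
}

// The Custom subclass starts out empty. A station that wants different
// behavior overrides methods here, so updates to Base merge in cleanly
// underneath the customizations.
class CustomScheduleController extends BaseScheduleController {
    @Override
    String title() {
        return "Morning Schedule"; // a station-specific tweak
    }
}

// Everywhere else, the code refers only to the Custom class, so any
// customizations are always picked up.
class AppWiring {
    BaseScheduleController scheduleController() {
        return new CustomScheduleController();
    }
}

Because the Custom classes start out empty, pulling an updated Base class into a station’s app never collides with the station’s own changes.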

Managing Improvements Using git branches

One of the trickiest parts of keeping the platform development manageable was figuring out how to maintain the code bases for each app in a way that allows easy integration of improvements over time. We’re using git (and GitHub) for this purpose. We maintain a master version of the code as a main branch. It includes all of the features and the Custom classes but no customizations. Each station app has its own branch based on the master branch but including all of the customizations. When I’m developing, I always work in a specific station’s branch. When I make changes to Custom classes or non-shared resources I commit them in the station branch. When I make a change to Base classes or other shared code or resources, I commit in the master branch and merge back into the branch I’m working on. When it’s time to go to work on another station’s app, I check out the branch and the first thing I do is merge with the master branch to pull in the latest Base changes.

A cool side effect of this process is that GitHub maintains a helpful view of how up-to-date a given app is with the changes to Base, as well as how customized a given app is. Here’s a snapshot of that view right now:

Customized Graphics with Photoshop Contact Sheets

It was important to us and to our clients to work with their own designers to create their own look for their apps. In order to do this we abstracted out as many common and reusable user interface elements as we could and created a Photoshop “contact sheet” for the app. This contact sheet provides templates of the graphics that are used throughout the app, separated into a single layer per item. We provide this contact sheet to our clients’ designers and they replace the default graphics with their own designs. This allows stations to use the defaults where they like but also to come up with their own design and look for the app. We then created an AppleScript that exports all of the images and saves them with the file names the app expects. This keeps the designer’s role strictly to design and limits the amount of time we as developers have to spend to incorporate the graphics into the app.

Using .strings Files for Text Customization

One other facet of customization worth mentioning, although we still haven’t quite worked out the kinks in it, is the repurposing of iOS internationalization features to do text customization within the app. Rather than using literal strings for text throughout the app, we pull the strings out of .strings files. This allows stations to provide their own “translations” for each bit of text in the app without having to make a customization within the code. I call this customization method half-baked because when we add new strings to the app, it is best if we can regenerate the .strings files, which causes them to be regenerated using the defaults specified in the application’s code. To avoid this we could add new strings in new .strings files, but this would result in a proliferation of these files over time.
