Improving UX on the Cheap

Being an app developer is a lot like being a parent: You never want to hear that you have an ugly baby.

But in the case of the developer, a bit of criticism can lead to improvement. App builders can avoid a lot of pain down the road if the bad news comes early enough in the software lifecycle. Smaller shops and individual developers can test early iterations of their apps on users, and the resulting feedback can inform subsequent versions that win over customers.

Usability testing is one technique for flagging dire problems before an app is unleashed on an unsuspecting world. Usability testing provides a snapshot of a user’s initial reaction to an app, while other methods may be employed to track longer-term usage.

Testing requires planning and the ability to craft focused and probing questions. As for monetary investment, tests don’t have to be expensive. There are plenty of low-cost DIY affairs, and even those involving an outside testing firm need not break the bank.

Testing Approaches

Usability testing can help developers unearth some scary app-killing deficiencies. “Usability testing is a wonderful, powerful research technique that finds the big hitters, the big problems that will shatter the user experience,” says Gavin Lew, managing director of User Centric Inc., a user experience research and design firm based in Oakbrook Terrace, Ill.

Developers should quickly build a simulator that users can test rather than taking the time to create a “gorgeous prototype,” suggests Lew. The objective is to focus on a handful of core app functions that naive users can put through their paces. Users may stumble on some features and find that others work well. The resulting feedback should help guide subsequent app iterations.

“Usability testing and the interactive design process are all about making mistakes faster,” says Lew. And fortunately, the user test population doesn’t have to be enormous to obtain actionable results.

Blink Interactive Inc., a user experience research and design firm in Seattle, Wash., recommends having eight to 10 users for simple studies, says Tom Satwicz, user researcher at Blink, on the company’s blog.

Developers who are concerned with focus group size -- and, more to the point, cost -- can use Blink’s usability sample size calculator to get an idea of what to expect. The calculator lets developers adjust such factors as the number of user groups to be compared (e.g., novice and expert users) and the number of designs to be compared.

Lew, meanwhile, says seven participants will work for a usability test. With usability testing -- also called formative or iterative user research -- major issues that unravel the user experience tend to manifest quickly and often, he notes. “So, if a user can test these two to three features in a 60-minute, one-on-one usability testing session -- without introducing undue bias from the feature use -- pragmatically speaking, seven participants is sufficient,” says Lew.
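The small-sample guidance has a simple statistical rationale: if each participant independently uncovers a given usability problem with probability p, then n participants are expected to find a 1 - (1 - p)^n share of the problems. A quick sketch in Python (the commonly cited p = 0.31 is an assumption drawn from the usability literature, not a figure from this article):

```python
def problems_found(n_users, p=0.31):
    """Expected share of usability problems uncovered by n_users,
    assuming each user independently hits a given problem with
    probability p (0.31 is a commonly cited estimate)."""
    return 1 - (1 - p) ** n_users

# Returns climb quickly and then flatten -- roughly two-thirds of
# problems with 3 users, and over 90 percent with 7.
for n in (3, 5, 7, 10):
    print(n, round(problems_found(n), 2))
```

The diminishing returns past seven or so participants are exactly why small, frequent tests beat one large study in an iterative design process.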

Lew’s company can conduct an application test involving two or three features in a single day, which shouldn’t be cost-prohibitive, he says. User Centric’s one-day test includes the user test and a workshop with developers to explain the results. Developers should sit in on the user test to get a firsthand look at the users’ experience and understand the context of use, says Lew.

The usability test isn’t the only one developers should consider, however. Usability testing, says Lew, works well when it comes to gauging the first hour or so of a user’s experience. After that point, other forms of testing will be required to determine how the user’s experience evolves over a week or a quarter of ownership. 

“You have to recognize that usability testing is not the only technique,” he says. “We must use other research techniques to truly create engaging user experiences.” He cites soak-testing and longitudinal studies as examples of tests that track an app over an extended period of use.


Test Preparation

Tests that focus on the user experience require planning to get the most out of the process.

Developers should take the time to define what a meaningful research question is before pursuing a user test, says Lew. Just asking users if they like an app isn’t particularly revealing. The idea is to focus on specific instances that can generate design insight.

“The point is to not have blanket statements like, ‘What are users doing?’” says Lew. Align your questions to business goals (e.g., features) and to whether the users actually use the features as the business intended, suggests Lew. “This way, the research is grounded. We can relate observations and findings to business objectives. This makes context and recommendations much more relevant.”

Some companies offer assistance with creating tests. One Mountain View, Calif., company that provides a usability testing service for websites and mobile apps offers a Task Bank from which clients can draw when setting up their mobile tests, notes a company spokesman. The service also lets customers select a test’s participants from the company’s user panel. The company is still in the process of growing its mobile user panel, so at this point the only demographic selectors are age range and gender. But a strategy game developer seeking strategy gamers could list “special requirements/demographic requirements” in the test’s “Scenario” section; users self-select based on the information listed. The company charges $39 per test participant.


A/B Testing Your Mobile App

A test long used by web developers to make sites more appealing or boost e-commerce sales is now emerging in the mobile app space.

A/B testing puts two versions of a website -- or a particular web page or some other website feature -- in front of visitors to gauge preferences. Feedback from such testing can prove valuable: Determining which site elements engage users can improve the user experience along with the sales conversion rate.
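A common way to judge whether the difference between two variants is real rather than noise is a two-proportion z-test on the conversion counts. A minimal sketch, with made-up numbers for illustration:

```python
from math import sqrt

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how many standard errors apart are the
    conversion rates of variant A and variant B?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 120/2000 conversions for A vs. 150/2000 for B.
z = ab_z_score(120, 2000, 150, 2000)
print(round(z, 2))  # ~1.89; |z| > 1.96 would be significant at the 5% level
```

In this example the 6.0 versus 7.5 percent split falls just short of conventional significance, which is the kind of result that tells a developer to keep the test running rather than ship variant B.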

While A/B testing is a common practice for websites, the same cannot be said for mobile apps. However, market developments of late point to a greater availability of such services. PathMapp recently launched with the aim of bringing A/B testing to mobile apps. In addition, Swrve, which provides mobile A/B testing as part of its real-time feedback platform, raised $6.25 million this fall to fund the company’s expansion. [Disclosure: Intel Capital is among the investors; Intel is the sponsor of this content.]

Time for Testing?

Swrve targets the game app sector with its testing service. Hugh Reynolds, chief executive officer at Swrve, says some mobile developers have embraced mobile A/B testing, while others “need some encouragement to get down and dirty.”

But Reynolds believes it’s only a matter of time before the majority of game makers pursue A/B testing. “The widespread acceptance of the concept of A/B testing in the online world -- think Google AdWords -- means that most game developers understand that sooner or later they’ll have to join the party,” Reynolds says. He contends that mobile A/B testing will become commonplace in the game app space within a couple of years.

On the other hand, one company that provides A/B testing software for retailers and agencies has discovered that its customers, for the most part, aren’t quite ready for mobile app testing. The company in October disclosed plans to support users of Clutch.IO, a mobile A/B testing company acquired earlier this year by Twitter. Following the acquisition, Twitter said it would disband Clutch.IO’s hosted A/B testing service. The company sought to pick up the mobile testing slack, but found few takers. “It turned out that there was very little interest,” says Dennis van der Heijden, the company’s chief executive officer and co-founder. “We thought it was a great opportunity, but our client base is more interested in testing the mobile versions of their sites than testing mobile apps.”

As it happens, the company’s interactive agency and e-commerce customers tend to outsource mobile app development to a third party, van der Heijden explains. So customers don’t become involved in mobile A/B testing, if any occurs.

A segment of the company’s customers -- about 15 percent -- may look into testing their mobile websites, however. Leading e-commerce clients have expressed interest in a beta version of the company’s mobile testing features, according to van der Heijden. The other 85 percent don’t have mobile websites. The need for mobile website A/B testing will increase as the percentage of e-commerce purchases made via mobile devices grows, van der Heijden suggests.

A Need for Testing

Game developers seem a bit more enthusiastic about mobile A/B testing than their e-commerce counterparts.

“As a mobile developer ourselves, we have a definite need for this that’s not being met by the current industry,” says Michael Orlando, chief executive officer of IDC Projects, a Rolla, Mo., research and development company focusing on games.

Orlando says he has heard of a number of places to go for in-app A/B testing. But he would like to be able to test outside of an app for external factors such as app names and icons.

“The importance of these factors is really high for those that want to have a big presence in the app store,” he notes. “The advertising needed in order to get up to the top of the stores relies heavily on these factors.”

Orlando says IDC Projects has created a tool the company uses internally to conduct some A/B testing inside its apps. The tool generates a custom HTML page that presents a player with two choices, which Orlando says could be anything from screenshots to general ideas for a game. The player then selects which one he or she thinks is better.

“Even if they choose randomly, if we do it enough, say over 1,000 responses, we can get a fairly accurate idea how good one idea is over the other,” Orlando says.
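Orlando’s back-of-the-envelope claim holds up under the normal approximation to the binomial: at roughly 1,000 two-choice responses, the 95 percent margin of error on an observed preference share is around three percentage points. A sketch (the 55/45 split is hypothetical):

```python
from math import sqrt

def margin_of_error(p_hat, n, z=1.96):
    """Half-width of the ~95% confidence interval for an observed
    preference share p_hat after n two-choice responses
    (normal approximation to the binomial)."""
    return z * sqrt(p_hat * (1 - p_hat) / n)

# With 1,000 responses and a 55/45 split, the margin is about
# +/- 3 points -- enough to separate the two options.
print(round(margin_of_error(0.55, 1000), 3))
```

That is why "enough" responses matter: quadrupling the sample only halves the margin, so a 52/48 split needs far more votes to call than a 60/40 one.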

Orlando has not seen this type of testing in other apps or in any kind of commercialized format, such as an SDK or a web service. He says the company may open the tool up to other developers if there is sufficient interest.

Mobile Apps and Accessibility

When it comes to making information technology accessible to people with disabilities, websites have received much of the attention.

In 1998, Congress amended the Rehabilitation Act of 1973, directing federal agencies to make IT resources accessible to both government employees and the public. A considerable chunk of the work under the amendments -- typically referred to as Section 508 -- involves getting government websites to comply. Soon after Section 508, the World Wide Web Consortium published its Web Content Accessibility Guidelines, a set of recommendations for improving the accessibility of web content.

Recently, mobile app development has also started coming into the accessibility discussion. Developers and accessibility experts now say that the general approaches used in the web world can also apply to the rapidly expanding field of mobile devices and apps. Mobile OS makers -- including Apple, RIM/BlackBerry and Google -- even offer specific guidance on developing accessible mobile apps.

Forms of Accessibility and How to Integrate Them

Accessibility in app design may take a number of forms. For blind and low-vision users, assistive technologies include screen readers. Screen reader software, such as Apple’s VoiceOver, translates the information appearing on a display to speech. A screen reader may also drive a braille display, which raises dots through holes in a keyboard-like device to permit reading.

Examples of accommodations for deaf and hearing-impaired users include captioning services such as Purple Communications’ ClearCaptions, which debuted in 2011. In the mobile category, the service is available for Android devices and iPhones.

Beyond purpose-built accessibility technologies, mobile app developers can help widen the range of mobile apps that disabled people can use. And the task of building accessible apps doesn’t have to be tremendously time-consuming, notes Doug Brashear, mobile practice director at NavigationArts, a web and application design and development consulting firm. That’s particularly the case when the mobile OS has accessibility features baked in.

“Surprisingly, the current crop of mobile devices, particularly iPhones, has more accessibility features built into the operating system than you’d ever expect,” Brashear says. “A small amount of additional design and development time -- over what is normally required -- can yield a highly usable and accessible app.” Apple iOS’ accessibility features, for example, can get developers 75 percent of the way there, according to Brashear.

Crista Earl, director of web operations at the American Foundation for the Blind, also notes Apple’s accessibility features. Among the major capabilities: VoiceOver and Zoom. VoiceOver, she notes, originated as a screen reader for the Mac platform and later migrated to iOS. Zoom lets users magnify an app’s entire screen as opposed to individual elements, according to Apple. Earl also says that Android, as open source software, enables app makers to develop accessible apps or apps geared toward niche markets.

Accessible App Design Tips

Many accessibility principles for websites also readily apply to mobile development. Section 508 guidelines, for instance, call for text labels to accompany images and navigational controls such as buttons. Screen readers can’t interpret a button without the supplemental text. “Put an explicit label on your controls,” Earl advises.

Similarly, web accessibility guidelines declare that information shouldn’t be conveyed only as color -- as in the case of distinguishing various subway lines on a map. Text labels provide an alternative method for conveying information here as well.

Much of what Brashear’s company would do to build a mobile app’s user interface, he says, would be the same steps it would take to create a website that complies with Section 508 or the Americans with Disabilities Act. But some elements of accessible development don’t carry over from websites to apps.

“There are a whole set of things specific to mobile because of the screen size and the fact they are touchable,” Brashear says. He suggests developers adhere to the standard UI elements for a given platform, which he says greatly aids the intuitiveness of an app. The idea is to let users leverage the experience they have had with other apps. The more customized the app, “the harder it is going to be, especially for a sight-challenged person, to understand,” Brashear says.

Developers should also enable landscape viewing as an accessibility practice, suggests Brashear, who notes that some apps lock the orientation to be portrait only. He says landscape mode is helpful in providing a bigger view, overall, and for facilitating the use of a virtual keyboard.

Brashear also cites the following mobile app accessibility recommendations:

· Keep the need to enter text to a minimum, since small or virtual keyboards can be difficult to use.

· Locate actions in your app away from areas of the screen that perform other functions.

· Provide large finger targets for on-screen buttons or links.

Know Your Audience

Understanding users is central to any app project. When developing for accessibility, app makers need to “understand the nature of the challenges involved,” Brashear says.

To that end, he advises developers to read, research and learn from people with accessibility needs. He points to forums such as AppleVis, a website designed for blind and low-vision users of iPhones, iPads and other Apple products.

Consulting disabled users is also important as the app moves through the development cycle. “When testing is being done, work with people who have the disabilities that you want to serve,” says Nancy Massey, president of a company that consults on accessibility and Section 508 issues.

Developers often tend to make accessibility more complicated than it needs to be, says Massey, who adds that there’s ample crossover between general usability and accessibility in the mobile technology field. An app built with a clear and simple design that’s attractive to users may go a long way toward meeting accessibility goals, she says. “What makes something user friendly often makes it accessible.”

Has UI and UX Innovation Plateaued?

Different strokes for different folks. That’s trite but true when it comes to how people interact with smartphones and tablets. That description also sums up a big challenge for app developers, OS vendors and device manufacturers: designing a UI that each one believes is the ideal way to interact with a device while still accepting the fact that many users will be confused or prefer an alternative.

Case in point: Steve Jobs famously dissed the stylus as a lousy substitute for the finger. But if most iPad -- and tablet -- users agreed, there wouldn’t be such a healthy market for Bluetooth keyboards. It’s easy to assume people buy these add-ons because they don’t like typing on glass, but that’s not the only reason.

“On many devices and within many apps, having a soft keyboard means not having the full real estate of the screen available,” says Daria Loi, Intel user experience (UX) innovation manager. “I have a never-ending list of users who report frustrations with their soft keyboard covering part of the screen.” [Disclosure: Intel is the sponsor of this content.]

UI and UX preferences also vary by culture, giving developers, OS vendors and device manufacturers another set of variables to accommodate. “I recently interviewed users in Japan who love using the stylus on their tablet or convertible as it enables touch without fingerprints,” Loi says. “Users in India told me that they love the hand for some applications but the stylus for others -- in particular more complex apps such as Illustrator, Photoshop, 3D Studio Max and so on.”

One UI to Rule Them All?

Whether it’s touch, speech control or even eye tracking, a break-the-mold UI has to be intuitive enough so that users aren’t daunted by a steep learning curve. With Metro, Microsoft takes that challenge to another level. Its new UI spans multiple device types and thus multiple use-case scenarios.

“All meaningful change requires learning,” says Casey McGee, senior marketing manager for Microsoft’s Windows Phone division. “The key is to expose people to as many relatable scenarios as possible and make learning an enjoyable and rewarding process of discovery. Microsoft is doing that by using similar UI constructs across several consumer products, phones, PCs and game consoles.”

Metro is noteworthy in part because if consumers and business users embrace it, then developers can leverage their work across multiple device types. “Windows Phone and Windows 8 in particular are both part of one comprehensive set of platform offerings from Microsoft, all based on a common set of tools, platform technologies and a consistent Metro UI,” McGee says. “The two use the same familiar Metro UI, and with Windows Phone 8, are now built on the same shared Windows core. This means that developers will be able to leverage much of their work writing applications and games for one to deliver experiences to the other.”

Metro gives developers a single framework for designing user experiences, with a set of controls for each form factor. “Developers building games based on DirectX will be able to reuse substantial amounts of their code in delivering games for both Windows and Windows Phone,” McGee says. “Developers building applications using XAML/.NET will be able to reuse substantial amounts of their business logic code across Windows and Windows Phone.”

Speak Up for a Better UX? Or Look Down?

The growing selection of speech-controlled apps and UIs -- including Google Now, Siri and Samsung’s S Voice -- shows that some developers and vendors believe that voice is a viable alternative to the finger, at least for some tasks. When people can simply say what they want, the theory goes, it’s less daunting and confusing than having to learn and remember that, say, two fingers swiped in a circle zooms in on the page.

But simply adding speech-control features doesn’t automatically guarantee an intuitive user experience. In fact, when implemented poorly, it can make the user experience worse.

One common pitfall is assuming that users will stick to industry lingo rather than using vernacular terms. For example, a travel app that expects users to say “depart” instead of “leave” might frustrate them by responding with: “Not recognized. Try again.” That’s also an example of the difference between simple speech recognition and what’s known as “natural language understanding”: The former looks for a match in its database of terminology, while the latter tries to understand the user’s intent.
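The gap between the two approaches can be illustrated with a toy lookup (all vocabulary here is invented for illustration): exact matching rejects everyday phrasing, while even a crude synonym map recovers the user’s intent.

```python
# Toy illustration: a travel app's canonical command lexicon,
# plus a small map from vernacular phrasing to those commands.
COMMANDS = {"depart", "arrive", "book"}

SYNONYMS = {
    "leave": "depart", "go": "depart", "take off": "depart",
    "get in": "arrive", "land": "arrive",
    "reserve": "book",
}

def recognize_exact(utterance):
    """Simple speech recognition: accept only terms in the app's lexicon."""
    return utterance if utterance in COMMANDS else None

def recognize_intent(utterance):
    """A first step toward natural language understanding: map
    vernacular phrasing onto the app's canonical commands."""
    return utterance if utterance in COMMANDS else SYNONYMS.get(utterance)

print(recognize_exact("leave"))   # None -- "Not recognized. Try again."
print(recognize_intent("leave"))  # depart
```

Real natural language understanding goes well beyond a static synonym table -- it models context and intent -- but the table already shows why matching only on the app’s own terminology frustrates users.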

“The correct answer for voice is to get more and more intelligent like humans so that you don’t have to get the right word,” says Don Norman, co-founder of the Nelson Norman Group, a consultancy that specializes in UX.

Eye tracking is another potential UI. It could be a good fit for messier applications, such as turning pages in a cookbook app instead of coating the tablet or smartphone screen with flour. But like voice and touch, eye tracking will see myriad implementations as the industry casts about for the best approach.

“Eye tracking is an interesting thing,” Loi says. “I can see usefulness and some interesting usages. My only concern is that we, the industry, might fall in love blindly with it and start to believe it can do anything. Do you remember the initial attitude toward gestural controls? All those unlikely UIs to be navigated through unlikely new gestural languages?  Sometimes the industry gets very excited about a new technology, and the first reaction is to squeeze it into every device and context, without carefully considering why and what for exactly.”

In the case of voice, the user experience could get worse before it gets better simply because there’s a growing selection of solutions that make it relatively easy for developers to speech-enable their apps. That freedom to experiment means users will have to contend with a wide variety of speech UI designs.

“I consider this an exciting, wonderful period,” Norman says. “But for the everyday person trying to get stuff done, it will be more and more frustrating.”


Augmented Reality: Expanding the User Experience

Augmented reality and mobile apps have met before, but some developers contend that the next encounters will produce an even more sophisticated class of technology.

The field of augmented reality places a digital overlay on the real-world view through a mobile device’s camera. Over the past couple of years, developers have taken advantage of a mobile platform’s camera and GPS to provide apps that help users find particular stores, restaurants or other points of interest. Games such as ARDefender also employ augmented reality.

But app creators have begun to engage more of a mobile device’s sensors -- accelerometers and gyroscopes, for example. Augmented reality apps that use detailed animations are also in the works. The objective: inject augmented reality technology into a wider range of apps to boost the user experience.

Expanding Augmented Reality’s Scope

“It’s been kind of a novelty item, but we are working with different companies that are putting augmented reality into everyday apps,” says Terry Hoy, senior vice president and director of sales at Gravity Jack, a company in Liberty Lake, Wash., that specializes in augmented reality software development.

Hoy cites one augmented reality app that adds an animation showing how a particular item may be taken apart and reassembled. He says that type of app is something the company can do today, noting that it should be available in the market in about six months. Another app is designed to pinpoint the location of a heating and cooling system in a building and then help technicians learn about the HVAC unit if they are not familiar with that particular model. Hoy says that app is a work in progress.

“Augmented reality is such a new technology that I think it is wide open,” says Hoy.

Gravity Jack worked for two years on an augmented reality SDK, which Hoy says will help spark the delivery of apps that go beyond the current offerings.

Trak Lord, head of U.S. marketing for metaio Inc., a software company that focuses on augmented reality, says metaio anticipates the technology moving toward the visualization of complex real-world objects, such as engine parts. He says store-finder apps still have their uses, but he adds that customers -- for the most part -- are no longer solely interested in location-driven augmented reality apps.

“No one says, ‘I only want to do GPS,’” he notes. Customers interested in GPS instead overlay it onto an existing, full-service application rather than building an application around that single function.

In another nod toward greater sophistication, software developers’ R&D shops are harnessing sensor data to further improve the user experience.

According to Lord, metaio “can now use sensor data to make phones aware of gravity.” A mobile device’s camera has no sense of up versus down, but metaio’s Gravity-Aligned Feature Description technology provides the camera with an up and down framework. Gravity awareness lets apps render “virtual content that behaves like real objects,” according to metaio. The company cites virtual accessories -- such as a pair of earrings that move according to how users turn their heads -- as an example of improved object rendering.
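metaio’s actual technology is far more involved, but the core idea -- recovering the direction of “down” in the camera image from an accelerometer reading -- can be sketched in a few lines. (The assumption below that the camera’s axes coincide with the device’s axes is a simplification for illustration.)

```python
from math import atan2, degrees, sqrt

def gravity_angle_in_image(ax, ay, az):
    """Given a raw accelerometer reading (a device at rest measures
    gravity), return the angle of 'down' within the image plane, in
    degrees. Assumes the camera's x/y axes coincide with the device's
    x/y axes -- a simplification for illustration."""
    norm = sqrt(ax * ax + ay * ay + az * az)
    gx, gy = ax / norm, ay / norm   # gravity projected onto the image plane
    return degrees(atan2(gy, gx))

# Device held upright in portrait: gravity points along -y in
# device coordinates, so 'down' in the image is at -90 degrees.
print(round(gravity_angle_in_image(0.0, -9.81, 0.0), 1))  # -90.0
```

With that angle in hand, an app can draw virtual content -- Lord’s dangling earrings, say -- so it hangs toward real-world down no matter how the user tilts the phone.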

“As mobile devices and the underlying hardware become more sophisticated, we’re starting to see more developers combining internal sensor data to create more robust augmented reality experiences,” says Lord.

Limits of Augmented Reality Development

Augmented reality’s growth track bumps into a couple of obstacles, namely computing power and programming skill.

The mobile device’s processor places restrictions on what developers can currently accomplish. Certain types of animation, for example, can run into some limitations, even on the more powerful tablets, says Hoy.

“If an augmented reality application has rich graphics, this will require higher performance of a smart device’s processor,” adds Eugene Filipkov, senior game developer and augmented reality expert at Eligraphics Studio, an Elinext Group member company based in Minsk, Belarus.

More graphics and functionality require more resources, says Filipkov. He notes that some new smartphones have dual-core processors in which graphics processing is among the second core’s functions.

Developments in the mobile processor space, however, seek to give augmented reality apps a boost. Intel, for example, is taking an integrated approach. The company’s Z2460 processor includes an integrated 2D/3D graphics engine. [Disclosure: Intel is the sponsor of this content.]

OpenCL, a standard maintained by the Khronos Group, also has implications for augmented reality. The specification lets developers code applications that harness GPUs for general-purpose, non-graphical processing. On a mobile device, the additional processing capability could help power augmented reality applications.

How App Developers Can Use Augmented Reality

On the skills side, a novice developer might find it “quite challenging to create a high-quality augmented reality application with rich functionality and cool graphics,” says Filipkov. In his opinion, the prerequisites for developing a successful augmented reality app include a very good command of mathematics, knowledge of matrices, a good sense of space, and knowledge of the C language. The latter, he notes, is especially helpful with OpenCV, a programming library that targets computer vision.

Some developments, however, aim to broaden the augmented reality base for app developers. In May, metaio launched software -- metaio Creator -- that lets non-developers take on augmented reality. Lord describes metaio Creator as “a more intuitive and straightforward way of making augmented reality more accessible.” The product, which reduces the process of adding augmented reality to a three-step drag-and-drop workflow, does not require knowledge of programming. Lord says the company has been presenting Creator at universities in the San Francisco area, where metaio’s U.S. office is located.

Creator users deploy augmented reality through metaio’s junaio mobile AR browser, so they need a free junaio developer account in addition to Creator. A future version of Creator will be compatible with all of metaio’s development tools, including the mobile SDK for creating custom mobile apps.