Park Mobile

Building a platform for National Parks

Timeframe

  • January 2016 – July 2019

Roles

  • Product Designer

Tasks

  • UI Design
  • UX
  • Prototyping
  • Animations
  • Info Architecture
  • Content Strategy

Tools

  • Sketch
  • Origami
  • Airtable
  • InVision
  • Xcode

Summary

I was a core member of the team tasked with building the National Park Service official mobile app. I provided direction and deliverables across the full scope of the project from piloting and prototypes to deployment and maintenance.

To produce a framework that could adapt reliably to the needs of 300 million annual visitors across more than 400 sites nationwide, we built a flexible content model and editor, facilitating high-quality contributions from hundreds of park rangers around the country.


Challenges

Experience at scale

Absent a mature content and data API from the Park Service, each instance of the app would require a new set of contributors from the relevant parks to create and organize a fresh dataset and content package. Our response was twofold: on the backend, I scaffolded UX fieldwork into a robust editor model that could be translated into a CMS; on the frontend, I helped polish the interface and reinforce visual patterns that could adapt to a wide variety of content types and image qualities without compromising the overarching Park Mobile and NPS brands.

Data (and other deserts)

Given the remoteness and extreme conditions of most national parks, we knew to expect significant challenges around connectivity and byte-budgeting. These were exacerbated by a lackluster CDN beyond our control and wildly inconsistent data policies across the park system. We responded creatively with a range of compromises to mitigate visitor pain and re-center the core experience, including amplifying preparatory touchpoints in Wi-Fi-enabled locations, improving the clarity of asynchronous interactions, and building resilient live features to deliver the freshest data possible at every turn.

Accessibility matters

I was responsible for devising a proprietary indoor proximity system to deliver audio descriptions at visitor centers across the system in compliance with federal accessibility guidelines. In order to accomplish this within budget, I led the design and configuration of an iBeacon-enabled indoor exhibit module that could be adapted for a wide variety of physical spaces.

Piloting content

When I joined the project, early deployment of a pilot app was already underway at a handful of parks. For the pilot, we had several goals in mind:

  1. Learn how the park service thinks about its content. What is important to them? What do they think is important to visitors?
  2. Determine how the app will be adapted for parks of various types and sizes. What guidelines should shape the adoption process of individual managers and contributors?
  3. Formalize content strategies that will empower each park to set their own priorities. What facets of the app should be universally required, as opposed to arbitrarily included on a case-by-case basis?


Analyzing patterns

In order to align our process with the needs of park editors, we looked at the content choices each park was making for its pilot app. This observational process informed our questions in stakeholder interviews and other design exercises.

A whiteboard depicting content types for 8 different parks
Identifying content patterns

It was important to be mindful of how we were grouping content types, because any assumptions we introduced (even implicitly) would have a ripple effect on how contributors thought about their role and imagined possibilities. While we needed some explicit terms and categories to ground our conversations, I took extra care to encourage outside-the-box thinking and remind everyone to center their own perspectives. In this case, unique expectations were especially useful for identifying edge cases and refining scope.

A list of content items at 10 different parks
Quantifying feature categories


Refactoring

I sorted an expanded set of content types into buckets according to interaction method, in order to understand how our new interface could be modeled. This provided clarity as we outlined how we would reinterpret content from the pilot app.

A diagram of content types organized by discovery mode
Modes of discovery for various content
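
To make the bucketing concrete, here is a minimal Swift sketch of content types mapped to discovery modes. The mode and type names are illustrative placeholders rather than the categories from our actual diagram.

```swift
import Foundation

// Hypothetical discovery modes; the real bucket names from the project
// diagram are not reproduced here.
enum DiscoveryMode {
    case browse      // scrolled lists and detail pages
    case map         // pins and geospatial layers
    case search      // keyword lookup
    case proximity   // surfaced by location or beacon ranging
}

// Illustrative content types mapped to the mode a visitor would most
// likely use to find them.
enum ContentType: CaseIterable {
    case newsRelease, trail, visitorCenter, exhibitAudio, eventCalendar

    var primaryDiscoveryMode: DiscoveryMode {
        switch self {
        case .newsRelease, .eventCalendar: return .browse
        case .trail, .visitorCenter:       return .map
        case .exhibitAudio:                return .proximity
        }
    }
}
```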

An outline of content from a pilot app
Outlining content elements

I built spreadsheets to refactor data points into the new model. A helpful byproduct was that this illuminated content balance and variety across different apps, which helped us align qualitative research with the nuts and bolts of the implementation.

A spreadsheet depicting a new location selection menu for rows of older content items
Refactoring old content

Rows depicting categories and their respective subitems
Grouping attributes

A list of categories in a menu being assigned to a site attribute
Categorizing service tags

Deriving an interface

Our next task was to identify an appropriate interaction pattern for the app's home screen. For context, the pilot's home page had been designed to feature a prominent image from the park and recent news, with other elements hidden behind a few layers of interaction.

Based on the availability of relevant news and a shifting understanding of visitor priorities, we came to question the value of the recent news feature, and we noticed that the slideshow of other park images (accessed by tapping the main image) was rarely viewed. Was this because the gallery was unimportant, or were our users struggling to access something they wanted?

Klondike National Park pilot app home screen, annotated
Pilot app home screen

Screens with certain touchable regions marked with percentages from analytics data
Reviewing analytics

One way or another we needed to make better use of the available screen real estate. We arrived at a grid-based system that could provide one-tap access to any number of content items or features. I helped to iteratively refine the home page by testing various arrangements and tiles, building color schemes, and identifying harmonious measurements for the layout.

A photo of whiteboard notes on the home page structure
Developing home tile logic

After introducing the updated grid interface to editors, and in conjunction with the aforementioned image gallery oversight, we noticed an editorial opportunity: using tile images instead of our default color palette provided a visual overview of the park at a glance while offering quick access to related information. I updated our documentation to reflect this insight in a learning moment for our content contributors.

Home screens, one with colorful backgrounds and white icons, another with photo backgrounds
Home tile progressions
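
For illustration only, here is a SwiftUI sketch of the tile concept: a photo background when a park supplies an image, and a brand color with a white icon otherwise. The production app predates SwiftUI and was built differently; the spacing, sizes, and model fields are assumptions.

```swift
import SwiftUI

// Hypothetical tile model: each tile opens one content item or feature,
// and falls back to a brand color when the park has not supplied a photo.
struct HomeTile: Identifiable {
    let id = UUID()
    let title: String
    let icon: String          // SF Symbol name
    let photo: Image?         // park-supplied image, if any
    let fallbackColor: Color
}

struct HomeGrid: View {
    let tiles: [HomeTile]
    // Two flexible columns with consistent gutters; the measurements here
    // are placeholders, not the values from our layout work.
    private let columns = [GridItem(.flexible(), spacing: 12),
                           GridItem(.flexible(), spacing: 12)]

    var body: some View {
        LazyVGrid(columns: columns, spacing: 12) {
            ForEach(tiles) { tile in
                ZStack {
                    if let photo = tile.photo {
                        photo.resizable().scaledToFill()
                    } else {
                        tile.fallbackColor
                    }
                    VStack(spacing: 4) {
                        Image(systemName: tile.icon).font(.title)
                        Text(tile.title).font(.caption.bold())
                    }
                    .foregroundColor(.white)
                }
                .frame(height: 110)
                .clipShape(RoundedRectangle(cornerRadius: 8))
            }
        }
        .padding(12)
    }
}
```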

In conjunction with design updates I produced documentation to assist editors in acclimating to the new app. Our guidelines took a number of forms over the project's lifespan, including PDF documents, a standalone knowledge base website, and eventually inline CMS field comments.

A snippet of documentation describing the parts of the home screen
An excerpted diagram from mid-stage documentation


Animating prototypes

I built a series of animated prototypes in Origami to conceive and illustrate behaviors for our developers.


Modeling a content editor

One of the biggest concerns for the project's long-term success was an effective content management system. Drawing on our client and visitor UX research and pilot data, I built an Airtable database to model a new CMS that would empower the project's hundreds of contributors to do their best work. This approach gave us an opportunity to keep our assumptions and intuitions in one scope and the technical requirements in another, then bring it all together in an intuitive and comprehensive accounting.

A spreadsheet grouping

I performed a thorough review of existing content in dynamic relationship to editor fields and version scope.

Spreadsheet of content items
Evaluating content for scope and context

I added screenshots to clarify how editor fields fed data to user-facing interfaces.

A database of screens with associated content fields
Connecting content and UI

I synthesized existing content and new fields into a comprehensive set of editor requirements.

Spreadsheet of editor pages and fields
Aligning user stories with editor contexts

The base automatically generated a comprehensive editor specification from its tables, views, and priorities. This specification could grow over time as new content and fields were introduced.

Kanban cards for editor items with pages per column
Bringing editor elements into focus
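
As a hypothetical sketch of the kind of record the base tracked, and of how a specification could be derived from it, consider the following Swift model. The field names, editor pages, and priority scheme are placeholders; the real model lived in Airtable.

```swift
import Foundation

// Placeholder record mirroring one row of the editor model.
struct EditorField {
    let name: String          // e.g. "Alert title"
    let editorPage: String    // e.g. "Alerts"
    let required: Bool
    let appearsOn: [String]   // app screens that render this field
    let priority: Int         // 1 = pilot scope, higher = later versions
}

// Groups fields by editor page, required fields first, then by priority,
// approximating how a specification could be generated from the tables.
func editorSpecification(from fields: [EditorField]) -> [String: [EditorField]] {
    Dictionary(grouping: fields, by: { $0.editorPage })
        .mapValues { page in
            page.sorted { ($0.required ? 0 : 1, $0.priority)
                        < ($1.required ? 0 : 1, $1.priority) }
        }
}
```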

Anticipating poor connections

We needed to figure out how best to deliver all this rich media and content to visitors in places with poor-to-no connectivity. I helped identify and prioritize downloads based on visitor needs, interface dependencies and timeliness.

A diagram of downloadable content types, sizes and priorities
Visualizing data priorities

Once we'd made sense of our priorities for downloading logic, I produced technical specifications that everyone on the team could understand and discuss.

A series of timelines depicting how information is downloaded
Downloading logic diagrams
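
A simplified Swift sketch of that prioritization idea follows. The item kinds, weights, and thresholds are invented for illustration and do not reflect the production downloading logic.

```swift
import Foundation

// Hypothetical download item: ranked by whether the interface depends on
// it, how quickly it goes stale, and how heavy the payload is.
struct DownloadItem {
    enum Kind { case homeTileImage, mapTiles, audioDescription, liveAlert }

    let kind: Kind
    let bytes: Int
    let blocksUI: Bool              // interface dependency
    let staleAfter: TimeInterval    // timeliness

    var priority: Int {
        var score = 0
        if blocksUI { score += 100 }             // UI can't render without it
        if staleAfter < 60 * 60 { score += 50 }  // timely data jumps the queue
        score -= bytes / 1_000_000               // penalize large payloads
        return score
    }
}

// Highest-priority items download first.
func downloadOrder(_ items: [DownloadItem]) -> [DownloadItem] {
    items.sorted { $0.priority > $1.priority }
}
```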

Dealing with imperfection

Our approach to offline use was never perfect. Low connectivity is a hard problem to solve, in our case made harder still by the bifurcation of resources between our AWS backend and the government CDN hosting NPS media.

As we continued to refine our implementation, I pushed to build out more transparency for our users around what was happening with their connection and how it was impacting their app.

Two screens with notifications for failed and successful download
Excerpt from data messaging
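
As a rough sketch of surfacing connection state to visitors, the following uses Apple's NWPathMonitor (iOS 12+); the message copy here is placeholder text rather than the wording we shipped.

```swift
import Foundation
import Network

// Minimal connectivity observer that translates path changes into
// visitor-facing messages (placeholder copy).
final class ConnectivityMessenger {
    private let monitor = NWPathMonitor()

    func start(onChange: @escaping (String) -> Void) {
        monitor.pathUpdateHandler = { path in
            let message: String
            switch path.status {
            case .satisfied where path.isExpensive:
                message = "Connected over cellular. Large downloads will wait for Wi-Fi."
            case .satisfied:
                message = "Connected. Park content is up to date."
            default:
                message = "Offline. Showing previously downloaded park content."
            }
            DispatchQueue.main.async { onChange(message) }
        }
        monitor.start(queue: DispatchQueue(label: "connectivity-monitor"))
    }
}
```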


Keeping things fresh

As the Park Service works to modernize its digital infrastructure, new opportunities are emerging to provide live data directly to visitors on their mobile devices. I led the effort to design and implement a series of live data layers in the app, which we prototyped at Yellowstone National Park.

An array of visual blocks depicting live data UI elements
Live data elements
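
Below is a minimal sketch of refreshing a single live data layer on an interval; the endpoint, payload shape, and refresh cadence are placeholders, not the production API.

```swift
import Foundation

// Placeholder payload for one live layer item (e.g. a road status entry).
struct LiveLayerItem: Decodable {
    let id: String
    let title: String
    let updatedAt: Date
}

// Polls a hypothetical endpoint every five minutes and hands fresh items
// to the UI; on failure the app keeps showing the last good data.
final class LiveLayerRefresher {
    private var timer: Timer?
    private let endpoint = URL(string: "https://example.com/live/road-status.json")!

    func start(onUpdate: @escaping ([LiveLayerItem]) -> Void) {
        let url = endpoint
        timer = Timer.scheduledTimer(withTimeInterval: 300, repeats: true) { _ in
            URLSession.shared.dataTask(with: url) { data, _, _ in
                guard let data = data else { return }   // keep stale data on failure
                let decoder = JSONDecoder()
                decoder.dateDecodingStrategy = .iso8601
                if let items = try? decoder.decode([LiveLayerItem].self, from: data) {
                    DispatchQueue.main.async { onUpdate(items) }
                }
            }.resume()
        }
        timer?.fire()
    }
}
```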

I also worked on evolutions of the push notification logic, an important component of the timely UX strategy.

Several phone screens with push notifications
Outlining notification touchpoints
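
For illustration, here is a small UserNotifications sketch that schedules a local notification when a timely alert arrives; the copy and trigger logic are assumptions rather than the shipped behavior.

```swift
import Foundation
import UserNotifications

// Hypothetical: surface a timely park alert (e.g. a road closure) as a
// local notification shortly after it is received.
func notifyVisitor(ofAlert title: String, details: String) {
    let center = UNUserNotificationCenter.current()
    center.requestAuthorization(options: [.alert, .sound]) { granted, _ in
        guard granted else { return }

        let content = UNMutableNotificationContent()
        content.title = title
        content.body = details
        content.sound = .default

        let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 1, repeats: false)
        let request = UNNotificationRequest(identifier: UUID().uuidString,
                                            content: content,
                                            trigger: trigger)
        center.add(request)
    }
}
```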

Improving Accessibility

In order to improve accessibility for park visitors, we were asked to design an indoor proximity system that delivers spatially relevant audio descriptions for visitor centers and other exhibits. I led the effort to conceive and model our approach using iBeacons and a proprietary ranging algorithm. The resulting feature is called the "Indoor Exhibit Module" or IEM.

Diagram of indoor exhibit module data model
The Indoor Exhibit Module
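
To give a sense of the underlying mechanics, here is a minimal CoreLocation ranging sketch using the pre-iOS 13 iBeacon APIs of that era. The UUID, identifier, and nearest-beacon selection are simplified placeholders, and the production ranging algorithm smoothed readings well beyond this.

```swift
import CoreLocation

// Ranges beacons in one exhibit region and reports the nearest exhibit
// by its major/minor identifiers (all values below are placeholders).
final class ExhibitRanger: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "00000000-0000-0000-0000-000000000000")!,
        identifier: "IEM.VisitorCenter")

    var onNearestExhibit: ((_ major: Int, _ minor: Int) -> Void)?

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: region)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        // Prefer the closest beacon with a usable reading; its major/minor
        // values identify which exhibit's audio description to offer.
        let usable = beacons.filter { $0.proximity != .unknown }
        guard let nearest = usable.min(by: { $0.accuracy < $1.accuracy }) else { return }
        onNearestExhibit?(nearest.major.intValue, nearest.minor.intValue)
    }
}
```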

I provided support both remotely and in person for dozens of IEM installations throughout the park system. One of my favorite trips was to install and configure beacons at Kennecott in Wrangell-St. Elias National Park, Alaska.

A map of beacons for exhibits at Kennecott in Alaska
IEM map for Kennecott

I helped to improve our guidelines for IEM implementation by aligning the experience of park staff and visitors with the technical eccentricities of Bluetooth and Apple's black-box iBeacon ranging functions. My role as liaison for this feature allowed me to bring the needs of visitors, park staff, and the app development team into a careful balance.

A panorama overlooking Kennecott Glacier in Alaska
A panorama at Kennecott

As our understanding of the challenges involved with successful audio descriptions improved, I helped to iteratively refactor the feature to meet evolving requirements.

Screenshots of data sheets for the indoor exhibit module
IEM refactoring

Polishing VoiceOver

I also helped to define and improve our audio description logic throughout the app so visitors using iOS VoiceOver would have the best possible experience.

Voiceover specifications for an app screen
Excerpt from VoiceOver specifications
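
As a small illustration of the annotation work involved, the following UIKit sketch collapses a hypothetical home tile into a single VoiceOver element; the label and hint are placeholder copy.

```swift
import UIKit

// Group a tile's image, icon, and text into one accessibility element so
// VoiceOver reads a single, meaningful description per tile.
func configureAccessibility(for tileView: UIView, title: String) {
    tileView.isAccessibilityElement = true
    tileView.accessibilityLabel = title
    tileView.accessibilityHint = "Opens \(title) for this park."
    tileView.accessibilityTraits = .button
}
```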

Outcomes

Since its introduction, Park Mobile has been adopted by more than 30 parks in various stages of production, enriching many thousands of visits annually. Foregrounding flexibility and accessibility throughout our design process allowed the work to empower hundreds of dedicated civil servants around the country to tell impactful stories and broaden their stewardship of the public wilderness.

The project imprinted on me the importance of transparency and inclusivity in the pursuit of public good. I relished the opportunity to approach issues from peripheral perspectives and build things for wider populations to enjoy and learn from.

I look forward to the app's widening presence across the park service and to the role similar technologies will play in the struggle to preserve environmental legacies for our future.