From these findings, we developed a persona and an outcome statement to represent our primary user:

Outcome Statement

Situation | Jamie arrives at the Barbican, looking forward to the exhibition.

Problem | She spots an exhibit she wants to see, but there’s a crowd and she can’t get near enough to see it or read the mounted information.

Outcome | An integrated app that provides an interactive information and audio experience.

Solution | Jamie can access exhibit information and enjoy the exhibition at her own pace.

User Flow

A user flow was devised to outline the steps Jamie will take when using the app during an exhibition. This flow assumes Jamie has previously set up an account and purchased a ticket through the app.

User Flow

Develop

Minimum Viable Product

Using the key flow, we held a design studio to identify the features that would define the MVP. We plotted the client’s list of features, along with concepts devised through the Crazy Eights exercise, onto a feature-prioritisation chart. The features deemed essential and requiring the least effort laid the foundations of the app and its Minimum Viable Product.

The features we started to develop included:

Profile — users create a personalised account where they can easily book tickets. The profile also stores their saved items, which helps power recommendations.

Audio & information — provision of additional information in the form of text and audio.

Interactive map — the map shows the layout of the exhibition space in relation to the exhibits and indicates the user’s location within it.

Technical Considerations

When deciding whether to develop a native app or a responsive site, we had to take into account the requirement for Bluetooth and a compass to geolocate the user around the exhibition. This would not have been possible with a responsive site, so a native app was chosen for the reasons highlighted in the images below.
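To illustrate the point, here is a minimal sketch of how a native iOS build might range the exhibition’s Bluetooth beacons using Apple’s CoreLocation framework, something a responsive website cannot do. This is illustrative only and not the project’s actual implementation; the class name, UUID and proximity threshold are placeholders.

```swift
import CoreLocation

/// Minimal sketch: ranging iBeacon-style Bluetooth beacons with CoreLocation.
/// The UUID is a placeholder; each exhibit's beacon would broadcast the same
/// UUID with a unique major/minor pair.
final class ExhibitionBeaconRanger: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let exhibitionUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!

    override init() {
        super.init()
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
    }

    /// Called when the visitor taps the 'Interactive Experience' CTA.
    func startRanging() {
        let constraint = CLBeaconIdentityConstraint(uuid: exhibitionUUID)
        locationManager.startRangingBeacons(satisfying: constraint)
    }

    // CoreLocation reports nearby beacons roughly once per second,
    // ordered by estimated proximity.
    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        guard let nearest = beacons.first,
              nearest.proximity == .near || nearest.proximity == .immediate else { return }
        // The major/minor pair identifies which exhibit the visitor is standing beside.
        print("Near exhibit \(nearest.major).\(nearest.minor)")
    }
}
```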

Style Guide

We based our app style on the Barbican’s existing brand, focusing on their brand colours for Art and Design, which use pink as the primary colour. The Barbican website also uses a lot of white space, which we carried through into the app design.

Style Guide

Prototyping & Testing

We then started ideating on the design and went through several rounds of user testing and design iteration, based on the main flow.

First flows in paper prototype

After each round, we synthesised our findings and iterated the design as we moved from lo-fidelity to high-fidelity.

The main iterations included: moving ‘enable interactive mode’ earlier in the flow; introducing on-boarding; and refining how the user advances to the next exhibit.

Interactive Mode

Enabling interactive mode is the most crucial part of the user journey. The Bluetooth signal identifies where the user is in the exhibition and provides them with the corresponding interactive exhibit information and audio stream.
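As a rough sketch of how that lookup could work (the types and keys here are hypothetical, not taken from the project), each beacon’s major/minor pair can map to one exhibit’s text and audio:

```swift
import Foundation

/// Hypothetical content model: one entry per exhibit beacon.
struct ExhibitContent {
    let title: String
    let description: String
    let audioURL: URL
}

struct ExhibitCatalogue {
    // Keyed by "major.minor", mirroring the beacon identifiers.
    private let exhibitsByKey: [String: ExhibitContent]

    init(exhibitsByKey: [String: ExhibitContent]) {
        self.exhibitsByKey = exhibitsByKey
    }

    /// Returns the information and audio stream for the exhibit the visitor is beside.
    func content(forMajor major: Int, minor: Int) -> ExhibitContent? {
        exhibitsByKey["\(major).\(minor)"]
    }
}
```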

At lo-fidelity, the ‘enable’ CTA sat mid-flow, which users found confusing: they assumed the interaction was manual, i.e. that they would navigate through the app to an exhibit to receive its information, rather than it being delivered by an automated Bluetooth trigger.

Interactive mode | Iteration 1

The design was iterated so that the enable CTA appears at the beginning of the flow. The user now has two viewing options, manual or interactive, introduced by a clear ‘Interactive Experience’ CTA.

On-boarding for first time users

The earliest version did not include on-boarding screens explaining how the interactive mode worked. This was a big pain-point during initial user testing. A series of on-boarding screens was introduced explaining how the app uses Bluetooth.

On-boarding | Iteration 2

Advancing to next exhibit

When moving between exhibits, once the user is in range of the next exhibit’s Bluetooth beacon, they receive a notification informing them that the corresponding interactive information and audio are available.

The challenge here was how to give users control without confusing them. Users found the initial ‘Next’ CTA misleading and needed an on-screen cue for what to do next.

Next exhibit interaction | Iteration 3

Initially, the information for each exhibit was selected manually by the user. We then moved to a more advanced version in which Bluetooth would locate the user in the exhibition and automatically provide the information associated with the exhibit they were beside. However, this meant the audio/info from the previous exhibit could be cut off before they had finished listening to it. We therefore arrived at a hybrid iteration: the geolocating tool provides a prompt to begin the new exhibit’s information, rather than changing it automatically. This struck a balance between convenience and usability for the user.
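A sketch of that hybrid behaviour, again illustrative only and assuming the hypothetical types above: when a new beacon comes into range, the controller asks the UI to show a prompt instead of switching content, so audio that is already playing is never cut off.

```swift
import Foundation

/// Sketch of the hybrid iteration: geolocation prompts rather than interrupts.
final class ExhibitSessionController {
    private(set) var currentExhibitKey: String?

    /// Hypothetical hook wired to the beacon-ranging callback.
    /// Returns true if the UI should show a "Start next exhibit?" prompt.
    func didDetectExhibit(key: String) -> Bool {
        guard key != currentExhibitKey else { return false }  // already on this exhibit
        // Never auto-switch: the previous exhibit's audio keeps playing
        // until the visitor accepts the prompt.
        return true
    }

    /// Called only when the visitor accepts the prompt.
    func startExhibit(key: String) {
        currentExhibitKey = key
        // Load the text and start the audio stream for `key` here.
    }
}
```

Keeping the switch behind an explicit user action is what resolved the interrupted-audio issue seen in testing.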

Outcome

Below is a video walk-through of the final prototype. It focuses on the user journey through the exhibition: entering the exhibition, enabling interactive mode, saving items to a profile, and receiving recommendations after the event.

Future steps

  • The app currently focuses on the Barbican’s Art and Design events. A natural next step would be to extend the app and its interactive features to the Barbican’s other event types.
  • Provide wider accessibility: additional audio streams in other languages, a separate stream for children, and the option of more descriptive information for visually impaired users, giving more people access to an interactive experience.

If you have any questions or fancy a chat, please feel free to email me at [email protected] or connect with me on LinkedIn: https://www.linkedin.com/in/emilypela/


