What we changed
Nobody can deny that the technology behind Translate is unparalleled, but there are some UX issues we intended to solve. We wanted to keep things within quick reach. Whether the user is at home or at a grocery shop, they’d want the most used actions, Type and Camera, nearby. So naturally, we thought of putting them all near the thumb.
With the home screen, we aimed to keep things near the thumb without changing the Information Architecture of the page. On one hand, we dropped the icon labels for the Camera and Mic. On the other, we added more information to the Conversation and Handwriting actions. The assumption here was that the user would already be familiar with buttons like Camera and Mic, while Conversation and Handwriting felt geared more towards power users and so deserved more information attached.
Guide with Colour
Taking inspiration from Google Tasks and Assistant for the Material Design 2.0 UI, we went with an all-white-and-blue approach. We wanted the main action of each state to be highlighted in blue. As the user’s primary need while using the app was to view the translated text, we tried to keep the translated text either in blue or on blue.
In the home state, the language switcher sits on blue. While typing, the real-time translation is in blue. And finally, in the result screen, the translation is on blue. This not only brings vibrance to the UI but also directs the user’s eye to the highlighted part of the screen.
This notion was shattered, though, when we read an article by Pendar Yousefi that was published while we were working on this project.
Making it seamless
There was a need for a smoother experience. Currently, the input boxes ‘jump’ up and down a lot between the three home–typing–result states. We aimed to clean that up by keeping components stacked from the bottom. This should also help keep the number of screens to a minimum, since the transitions can stay smooth when the keyboard pops up.
Now the problems started cropping up.