Designing the UI of Google Translate

Pendar Yousefi
Published in Google Design
May 21, 2018

The design story behind making Google Translate’s camera feature and real-time conversation mode more discoverable

Humans have been fascinated with magical creatures or devices that break language barriers for as long as languages have existed. At Google Translate, we have the opportunity to work on a product that helps us get one step closer to realizing the magic.

Discovering the magic

Google Translate’s instant camera translation and real-time conversation modes never cease to amaze our users. But beyond their wow-inducing nature, they are extremely useful as well, helping language learners, students, tourists, and all sorts of people who need to break through language barriers.

The problem

Unfortunately, many of our users did not know these features existed. We first suspected this when, during user research interviews, people asked for features we already had. A follow-up survey confirmed it: 38% of our users did not know these tools existed.

First attempts

Our hypothesis was that the entry points into these tools were too small and hidden. Our first instinct was to make them bigger and more noticeable. So we experimented with new UIs that did just that.

To our surprise, this had the opposite effect! Usage of the camera and conversation features actually went down. So what was going on? When we dug in, we found that many people simply ignored colorful elements below the main translation box, assuming they were unimportant, especially since we had historically used that space for feature promotion cards.

Back to the drawing board

So we went back to the drawing board and asked ourselves: what are the minimum changes we can make to the existing, familiar UI to make these features more clearly visible? We tweaked the icons, added text labels, and split the voice feature into its two distinct use cases: dictation and conversation.

This worked! Usage of all features increased across the board, with handwriting being used as much as 25% more than before. We even started seeing tweets from people who thought these were new features, even though they had existed for years.

Improving the conversation UI

We didn’t stop there. At Google Translate we’re always thinking about how we can improve the product experience for our users. We wanted to make the conversation mode UI easier and more intuitive to use.

Our old conversation mode UI had three buttons: two buttons on the left and right, one for each language, and a big mic icon in the middle, which we sometimes referred to as the magic mic. The magic mic was supposed to know when people were talking, what language they were speaking, when they stopped talking, and when to translate, all without any additional input from users.

It was magical when it worked. But for some languages, the magic mic wasn't quite as accurate as it was for others. And because the UI emphasized the magic mic, many people used only that, not realizing there were two other manual language buttons that gave them better control and accuracy. As a result, some users were encountering errors and abandoning this mode for other options, like typing.

Ch-Ch-Changes

Following the lessons we learned from improving discoverability by making simple changes, we asked ourselves: what are the simplest changes we can make to the UI to make the experience more intuitive?

We made the manual language buttons much more visible by changing their shape and color. We still kept the magic mic of course, and even gave it a more descriptive text label. But it was no longer the big star of the UI.

At the same time, we worked with the hardware team to make sure the experience worked with the Pixel Buds as well. We made small tweaks to the UI such as showing a headset icon when the Pixel Buds were connected. One of our brilliant designers, Liu Liu, even came up with the ingenious idea of the 'ice breaker' screen. He had noticed in user studies that the most awkward part of the interaction was the beginning of the conversation. He made a visual screen with a standard message that the user could just show to the other person to break the ice.

As a result of the UI changes as well as other technical improvements that made the conversation mode faster, we saw a noticeable increase in the usage of speech translations overall. While our quest for improving these and other features doesn’t end here, this was a great first step in making our most magical features more discoverable and easier to use.
