Siri in iOS 14 getting real-time Translate app, sleeker interface
Apple has given Siri a major update in iOS 14 with a new user interface design and a built-in Translate feature.

The new Siri interface in iOS 14. Credit: Apple
One of the biggest changes to Apple's digital assistant is an overhauled user interface that no longer takes over the entire display. Instead of the full-screen Siri pane, Siri will pop up as a small icon at the bottom of the display. Answered Siri requests will also be less intrusive, with details and information appearing as notification-style boxes at the top of the screen.
Siri is also getting a new Translate app that will support full, real-time translations across multiple languages. The feature is privacy-respecting and works completely offline.
At launch, Translate will support 11 languages:
- English
- Mandarin Chinese
- French
- German
- Spanish
- Italian
- Japanese
- Korean
- Arabic
- Portuguese
- Russian
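
Apple hasn't said it will expose the Translate app itself to developers, but on-device language handling of this kind is already available through the public NaturalLanguage framework. The sketch below is only a rough illustration of that offline approach, using a hypothetical helper rather than anything from the Translate app's actual implementation: it identifies the dominant language of a phrase without any network call.

```swift
import NaturalLanguage

// Hypothetical helper: identify the dominant language of a phrase entirely on device.
// This uses the public NaturalLanguage framework, not the Translate app's private machinery.
func detectLanguage(of text: String) -> String? {
    let recognizer = NLLanguageRecognizer()
    recognizer.processString(text)
    // dominantLanguage is nil when the recognizer can't make a confident guess.
    return recognizer.dominantLanguage?.rawValue
}

// Example: prints the BCP 47 code, e.g. "fr" for French, "ja" for Japanese.
print(detectLanguage(of: "Bonjour tout le monde") ?? "unknown")
print(detectLanguage(of: "こんにちは世界") ?? "unknown")
```

Because NLLanguageRecognizer runs locally, it fits the same privacy story Apple is telling with Translate: nothing leaves the device to figure out what language you're speaking or typing.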
Comments
- Guessing your intent based on sensory data, voice emotion, location, the current app that's open, Bluetooth connection, travel speed, etc., so it can decide what you probably want - not just based on the spoken words - and how you want the response back.
- Learn the person's habits and build some sort of profile of the user's preferences, then adapt accordingly. Apple says Siri does that, but there have been zero cases where I noticed anything even close to adaptability or personalization.