It has been more than a decade since Jeff Bezos excitedly sketched out his vision for Alexa on a whiteboard at Amazon's headquarters. His voice assistant would help with all manner of tasks: shopping online, controlling gadgets, even reading kids a bedtime story.
But the Amazon founder's grand vision of a new computing platform controlled by voice has fallen short. As hype in the tech world turns feverishly to generative AI as the "next big thing," the moment has caused many to ask hard questions of the previous "next big thing"—the much-lauded voice assistants from Amazon, Google, Apple, Microsoft, and others.
A "grow grow grow" culture described by one former Amazon Alexa marketing executive has now shifted to a more intense focus on how the device can help the e-commerce giant make money.
Over the past decade or so, cars have become pretty complicated machines, with often complex user interfaces. Mostly, the industry has added touch to the near-ubiquitous infotainment screenโit makes manufacturing simpler and cheaper and UI design more flexible, even if there's plenty of evidence that touchscreen interfaces increase driver distraction.
But as I've been discovering in several new cars recently, there may be a better way to tell our cars what to do—literally telling them what to do, out loud. After years of being, frankly, quite rubbish, voice control in cars has finally gotten really good—in some makes, at least. Imagine it: a car that understands your accent, lets you interrupt its prompts, and actually does what you ask rather than spitting back a "Sorry Dave, I can't do that."
You don't actually have to imagine it if you've used a recent BMW with iDrive 8 or a Mercedes-Benz with MBUXโadmittedly, a rather small sample population. In these cars, some of which are also pretty decent EVs, you really can dispense with poking the touchscreen for most functions while you're driving.