By Parmy Olson
It seems fitting that one of Google’s most important inventions — one that would come back to haunt the company — was initially devised over lunch.
In 2017, researchers at Alphabet’s Mountain View, California, headquarters were talking over their midday meal about how to make computers generate text more efficiently. Over the next five months they ran experiments and, not realising the magnitude of what they’d discovered, wrote their findings up in a research paper called Attention is All You Need. The result was a leap forward in AI.
The paper’s eight authors had created the Transformer, a system that made it possible for machines to generate human-like text, images, DNA sequences and many other kinds of data more efficiently than ever before. Their paper would eventually be cited more than 80,000 times by other researchers, and the AI architecture they designed would underpin OpenAI’s ChatGPT (the “T” stands for Transformer), image-generating tools such as Midjourney and more.
There was nothing unusual about Google sharing this discovery with the world. Tech companies often open source new techniques to get feedback, attract talent and build a community of supporters.
But Google itself didn’t use the new technology straight away. The system stayed in relative hibernation for years as the company grappled more broadly with turning its cutting-edge research into usable services.
Meanwhile, OpenAI exploited Google’s own invention to launch the most serious threat to the search giant in years. For all the talent and innovation Google had cultivated, competing firms were the ones to capitalise on its big discovery.
The researchers who co-authored the 2017 paper didn’t see a long-term future at Google either. In fact, all of them have since left the company. They’ve gone on to launch start-ups including Cohere, which makes enterprise software, and Character.ai, founded by Noam Shazeer, the longest-serving Googler in the group, who was seen as an AI legend at the company.
Combined, their businesses are now worth about $US4.1 billion ($5.9 billion), based on a tally of valuations from research firm Pitchbook and price-tracking site CoinMarketCap. They are AI royalty in Silicon Valley.
The last of the eight authors to remain at Google, Llion Jones, confirmed this week that he was leaving to start his own company. Watching the technology he co-created snowball this past year had been surreal, he said. “It’s only recently that I’ve felt … famous,” Jones says. “No one knows my face or my name, but it takes five seconds to explain: ‘I was on the team that created the ‘T’ in ChatGPT.’”
It seems strange that Jones became a celebrity thanks to actions outside Google. Where did the company go wrong?
One obvious issue is scale. Google has an army of 7133 people working on AI, out of a workforce of about 140,000, according to an estimate from Glass.ai, an AI firm that scanned LinkedIn profiles to identify AI employees at big tech firms earlier this year for Bloomberg. Compare that with OpenAI, which sparked an AI arms race with a much smaller workforce — about 150 AI researchers out of approximately 375 staff in 2023.
Google’s sheer size meant that scientists and engineers had to go through multiple layers of management to sign off on ideas back when the Transformer was being created, several former scientists and engineers have said. Researchers at Google Brain, one of the company’s main AI divisions, also lacked a clear strategic direction, leaving many to obsess over career advancement and their visibility on research papers.
The bar for turning ideas into new products was also exceptionally high. “Google doesn’t move unless [an idea is] a billion-dollar business,” says Illia Polosukhin, who was 25 when he first sat down with fellow researchers Ashish Vaswani and Jakob Uszkoreit at the Google canteen. But building a billion-dollar business takes constant iterating and plenty of dead ends, something Google didn’t always tolerate.
A spokeswoman for Google said the company was “proud of our industry-defining, breakthrough work on Transformers and are energised by the AI ecosystem it’s created — including new collaborators and bittersweet opportunities for our researchers to continue to advance their work outside of Google.”
In a way, the company became a victim of its own success. It had storied AI scientists, such as Geoffrey Hinton, in its ranks, and in 2017 was already using cutting-edge AI techniques to process text. The mindset among many researchers was “if it ain’t broke, don’t fix it”.
But that’s where the Transformer authors had an advantage: Polosukhin was preparing to leave Google and more willing than most to take risks (he’s since started a blockchain company). Vaswani, who would become their paper’s lead author, was eager to jump into a big project (he and Niki Parmar went off to start enterprise software firm Essential.ai). And Uszkoreit generally liked to challenge the status quo in AI research — his view was, if it ain’t broke, break it (he’s since co-founded a biotechnology company called Inceptive Nucleics).
In 2016, Uszkoreit had explored the concept of “attention” in AI, where a computer distinguishes the most important information in a dataset. A year later over lunch, the trio discussed using that idea to translate words more efficiently. Google Translate back then was clunky, especially with non-Latin languages. “Chinese to Russian was terrible,” Polosukhin remembers.
The problem was that recurrent neural networks processed words in a sequence. That was slow, and didn’t take full advantage of chips that could process lots of tasks at the same time.
The CPU in your computer at home probably has a handful of cores, which process and execute instructions. The graphics chips used in servers that run AI systems, by contrast, have thousands of cores. That means an AI model can, in principle, read every word in a sentence at once. At the time, no one was taking full advantage of that.
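The core trick the paper described, self-attention, can be sketched in a few lines. This is a deliberately simplified illustration rather than Google's actual implementation: it compares every word in a sentence with every other word in a single matrix operation, instead of stepping through the words one at a time the way a recurrent network does.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: turn raw scores into weights that sum to 1.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Simplified scaled dot-product self-attention over a whole sentence.

    X is a (num_words, dim) matrix of word vectors. Every word is compared
    with every other word in one matrix multiply -- no sequential loop.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)       # all pairwise comparisons at once
    weights = softmax(scores, axis=-1)  # how much each word "attends" to the rest
    return weights @ X                  # each word blended with the words it attends to

# Five toy word vectors standing in for a five-word sentence.
sentence = np.random.rand(5, 8)
out = self_attention(sentence)
print(out.shape)  # every word is updated in parallel: (5, 8)
```

Because the pairwise comparisons are independent, a chip with thousands of cores can compute them simultaneously, which is the efficiency gain the researchers were chasing.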
Uszkoreit would walk around the Google office scribbling diagrams of the new architecture on white boards, and was often met with incredulity. His team wanted to remove the recurrent part of the recurrent neural networks being used at the time, which sounded mad, says Jones.
But as a few other researchers such as Parmar, Aidan Gomez and Lukasz Kaiser joined the group, they started seeing improvements.
Here’s an example. In the sentence “The animal didn’t cross the street because it was too tired”, the word “it” refers to the animal. Earlier AI systems would struggle if the sentence changed to “because it was too wide”, since “it” would then refer to the street and the reference is more ambiguous. The new system worked it out. Jones remembers watching it happen. “I thought, ‘This is special’,” he says.
Uszkoreit, who is fluent in German, also noticed the new technique could translate English into German far more accurately than Google Translate ever had.
But it took a long time for Google itself to apply the technique to its free translation tool, or to its language model BERT, and the company never deployed it in a chatbot that anyone could test out. That is, until the launch of ChatGPT in late 2022 forced Google to quickly release a rival called Bard in March 2023.
Over the years, the authors watched their ideas get applied to an array of tasks by others, from OpenAI’s early iterations of ChatGPT to DALL-E, and from Midjourney’s image tool to DeepMind’s protein-folding system AlphaFold. It was hard not to notice that the most exciting innovations were happening outside Mountain View.
You could argue that Google has simply been careful about deploying AI services.
But slow doesn’t always mean cautious. It can also just be inertia and bloat. Today some of the most interesting AI advancements are coming from small, nimble start-ups. It is a shame that many of them will get swallowed by big tech players, who are poised to reap the biggest financial benefits in the AI race even as they play catch-up.
Google may have the last laugh in the end, but in many ways it will have been an unimpressive journey.
Source: smh.com