Portrait of the Vietnamese researcher credited with creating algorithms that both Google and ChatGPT rely on
Who is the “father” of the algorithm that both Google and ChatGPT rely on?
Mr. Le Viet Quoc was born in 1982 in a small village in Huong Thuy (Thua Thien-Hue province), an area that had no electricity when he was growing up. He often went to a library near his house to read about inventions in the pages of books, nurturing the dream of one day having inventions of his own.
After graduating from Quoc Hoc Hue High School for the Gifted, Mr. Quoc went on to study at the Australian National University (Australia) and then pursued a PhD in artificial intelligence at Stanford University (USA).
In 2011, Quoc co-founded Google Brain together with his advisor Dr. Andrew Ng, Google senior engineer Jeff Dean, and Google researcher Greg Corrado. The group’s goal was to explore deep learning on top of Google’s enormous volumes of data. Before that, Quoc had carried out research at Stanford on unsupervised deep learning.
Deep learning is a class of methods inspired by ideas from the brain: it learns many layers of representation, from concrete to abstract, thereby clarifying the meaning of the data. Deep learning can help tackle a wide range of problems in areas such as education and climate change.
For example, remote sensors around the world continuously monitor and record environmental data. Most of that data is currently unprocessed, and deep learning can be applied to make sense of it and point toward solutions.
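To make the “many layers of representation” idea concrete, here is a minimal sketch of a small multi-layer network in PyTorch (an illustration chosen for this article, not code from Google Brain); each layer re-encodes its input into a slightly more abstract representation:

```python
import torch
import torch.nn as nn

# A tiny deep network: each Linear + ReLU layer re-encodes its input
# into a more abstract representation of the (synthetic) data.
model = nn.Sequential(
    nn.Linear(64, 128),  # layer 1: low-level features
    nn.ReLU(),
    nn.Linear(128, 64),  # layer 2: more abstract intermediate features
    nn.ReLU(),
    nn.Linear(64, 10),   # layer 3: task-level output (e.g. 10 classes)
)

x = torch.randn(8, 64)  # a batch of 8 synthetic 64-dimensional inputs
logits = model(x)
print(logits.shape)     # torch.Size([8, 10])
```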
After completing his PhD in 2013, Quoc officially joined Google as a researcher. He soon achieved impressive breakthroughs in machine translation, one of the most active research areas in the machine learning community.
To get there, Quoc had to go beyond the deep learning methods that already worked well on images and single words, which can be analyzed as fixed-size inputs.
In 2014, together with Google researchers Ilya Sutskever and Oriol Vinyals, Quoc proposed sequence-to-sequence (seq2seq) learning. It is an encoder-decoder framework for training models that convert sequences from one domain into sequences in another, such as translating sentences between languages.
Seq2seq learning requires fewer engineering design choices and allowed Google’s translation system to work efficiently and accurately on very large datasets. It is used mainly for machine translation and has proven applicable in many other areas as well, including text summarization, conversational AI, and question answering.
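As a rough sketch of the encoder-decoder idea behind seq2seq (a simplified toy in PyTorch, not Google’s production translation system; all names and sizes here are made up for illustration):

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Toy encoder-decoder: reads a source token sequence into a single
    hidden state, then unrolls a decoder to emit target-token logits."""
    def __init__(self, src_vocab, tgt_vocab, hidden=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        _, state = self.encoder(self.src_emb(src))           # compress source
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)  # condition on it
        return self.out(dec_out)                             # next-token logits

model = Seq2Seq(src_vocab=1000, tgt_vocab=1200)
src = torch.randint(0, 1000, (4, 12))  # 4 source sentences, 12 tokens each
tgt = torch.randint(0, 1200, (4, 10))  # shifted target sentences
logits = model(src, tgt)               # shape: (4, 10, 1200)
```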
Why does CEO Nguyen Tu Quang say that ChatGPT’s success is thanks to Le Viet Quoc?
In a recent post, CEO Nguyen Tu Quang argued that it was Quoc’s seq2seq work that paved the way for Google to create the Transformer, the architecture currently used by both Google and ChatGPT.
The Transformer is an architecture invented by Google Brain, Google’s dedicated AI research unit, and announced in August 2017.
The Transformer was groundbreaking for language-AI training. Before it, anyone who wanted to teach an AI had to build a training dataset of question-answer pairs (labeled data). The AI would merely memorize the available sentence pairs without understanding their meaning, and there is a huge difference between rote learning and comprehension.
To quote Google’s Transformer publication verbatim: “With transformers, computers can see the same patterns humans see”.
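The Transformer’s core mechanism is self-attention, which lets every word in a sentence weigh its relationship to every other word rather than memorizing fixed sentence pairs. Below is a minimal sketch of scaled dot-product attention, the operation at the heart of the architecture (an illustrative reimplementation, not Google’s published code):

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    """Core Transformer operation: each query attends to all keys,
    producing a weighted mix of the values."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5  # pairwise similarities
    weights = F.softmax(scores, dim=-1)            # attention distribution
    return weights @ v

# 1 sentence of 5 tokens, each a 64-dimensional representation
x = torch.randn(1, 5, 64)
out = scaled_dot_product_attention(x, x, x)  # self-attention: q = k = v
print(out.shape)  # torch.Size([1, 5, 64])
```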
Google made the detailed documentation of the Transformer publicly accessible to all and, at the same time, released the implementation as open source.
Since then, the entire AI research community has benefited from Google’s invention. Among the beneficiaries is OpenAI, a company founded in 2015 that had no standout achievements until after 2017.
Within only a few months of Google’s Transformer announcement, the first language AIs based on the new architecture appeared in large numbers. In June 2018, OpenAI released its first Transformer-based AI, GPT-1, putting the architecture to work very quickly, faster than Google itself.
GPT stands for Generative Pre-trained Transformer, that is, a text-generation program pre-trained using the Transformer architecture.
ChatGPT was created with the main purpose of generating text. In effect, you play a word-continuation game with the AI: you write a sentence, the chatbot reads it, and then, based on the knowledge stored in its parameters, it “generates” the words that follow what you wrote.
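That word-continuation loop is easy to try. The sketch below uses the open-source Hugging Face transformers library and the small public GPT-2 checkpoint, assumed here purely for illustration (the article itself mentions neither):

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Load the small public GPT-2 checkpoint: a Transformer-based text generator.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# The "word-continuation game": hand the model a sentence and let it
# generate the tokens that follow, one at a time.
prompt = "Deep learning can help solve problems such as"
input_ids = tokenizer.encode(prompt, return_tensors="pt")
output = model.generate(
    input_ids,
    max_length=40,          # total length of prompt + continuation
    do_sample=True,         # sample instead of always taking the top token
    top_k=50,               # restrict sampling to the 50 most likely tokens
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```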
Source: Soha.vn