Google's new AI model can read an entire book
To address this, Google introduced a new machine learning model called Reformer, which can process the content and context of a work up to 1 million words long - roughly the length of a novel - while using only about 16GB of memory. Reformer was developed to overcome the shortcomings of the Transformer, an older neural network architecture that works by comparing the words in a passage to understand the relationships between them.
However, because it compares every pair of words, the Transformer consumes an enormous amount of memory once a text runs beyond a few thousand words. This weakness makes it impractical to use the Transformer on long articles, let alone an entire book.
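The scale of the problem follows from simple arithmetic: a full attention score matrix has one entry per pair of tokens, so memory grows with the square of the sequence length. A small back-of-the-envelope sketch (the function name is ours, for illustration only):

```python
# Rough arithmetic for why full pairwise attention is infeasible on long texts:
# a Transformer's attention compares every token with every other token,
# so one score matrix has seq_len * seq_len entries.

def attention_matrix_bytes(seq_len: int, bytes_per_entry: int = 4) -> int:
    """Memory for one full attention score matrix (float32 entries by default)."""
    return seq_len * seq_len * bytes_per_entry

# A few thousand words is manageable...
print(attention_matrix_bytes(4_000) / 1e9)       # 0.064 GB
# ...but a book-length sequence is not.
print(attention_matrix_bytes(1_000_000) / 1e12)  # 4.0 TB for a single matrix
```

At a million tokens, a single score matrix alone would need terabytes, far beyond the 16GB budget mentioned above.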
Google developed Reformer to address both of these key issues: handling long texts and reducing the memory consumption of the older model.
To solve the first problem, Google's new model uses a method called locality-sensitive hashing (LSH). Instead of comparing every word with every other word, the model applies a hash function to gather similar words into groups, then compares only words within the same group or in adjacent groups. This limits the processing overload and lets the model handle much larger amounts of text.
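The bucketing step can be sketched with random-projection (angular) LSH, the flavor Reformer uses: vectors pointing in similar directions get the same sign pattern across a few hyperplanes, and that pattern becomes their bucket id. This is a minimal illustration, not Reformer's actual implementation; all names are ours, and the hyperplanes are fixed here (rather than random) so the demo is reproducible.

```python
import numpy as np

def lsh_bucket(vectors: np.ndarray, planes: np.ndarray) -> np.ndarray:
    """Bucket id for each row vector, from the signs of its projections
    onto a set of hyperplanes: same sign pattern -> same bucket."""
    signs = (vectors @ planes) > 0                     # (n, k) boolean pattern
    powers = 2 ** np.arange(planes.shape[1])           # encode pattern as an int
    return signs.astype(int) @ powers

# In practice the hyperplane normals are drawn at random; fixed here for a
# deterministic demo. Shape: (dim=2, n_hyperplanes=4).
planes = np.array([[1.0, 0.2, 0.7, -0.4],
                   [0.3, -1.0, 0.7, 0.9]])

# Two nearly parallel vectors and one pointing the opposite way:
vectors = np.array([[1.0, 0.0], [0.99, 0.05], [-1.0, 0.1]])
buckets = lsh_bucket(vectors, planes)
print(buckets)  # [7 7 8] - the similar pair shares a bucket, the outlier does not
```

Attention then only needs to compare items inside each bucket, which is what cuts the pairwise comparison cost.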
To solve the memory consumption problem, the researchers used a technique called the Reversible Residual Network (RevNet). RevNets were developed from Deep Residual Networks (ResNets), an architecture whose performance improves as the network grows deeper and wider. A RevNet is built from layers whose inputs can be reconstructed exactly from their outputs. As a result, the activations of most layers do not need to be stored in memory during backpropagation.
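The reversibility trick can be shown in a few lines. The layer splits its input into two halves; each output half is the other input half plus a residual, so the inputs can be recomputed exactly from the outputs instead of being stored. This is a minimal numeric sketch, with stand-in residual functions F and G of our own choosing:

```python
import numpy as np

def F(x):
    return np.tanh(x)   # placeholder residual block

def G(x):
    return 0.5 * x      # placeholder residual block

def rev_forward(x1, x2):
    """Forward pass of one reversible residual layer."""
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def rev_inverse(y1, y2):
    """Recover the layer's inputs from its outputs - no stored activations."""
    x2 = y2 - G(y1)     # undo the second residual first
    x1 = y1 - F(x2)     # then the first
    return x1, x2

x1, x2 = np.array([1.0, -2.0]), np.array([0.5, 3.0])
y1, y2 = rev_forward(x1, x2)
r1, r2 = rev_inverse(y1, y2)
print(np.allclose(r1, x1) and np.allclose(r2, x2))  # True
```

Because the backward pass can regenerate each layer's inputs on the fly this way, memory use stays roughly constant in the number of layers.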
To test the model's effectiveness, Google fed Reformer small cropped fragments of images, and it generated full-frame images from them.
Google engineers say the new model can easily handle an entire book with high accuracy, opening up the potential for large-scale text processing.