Google's new AI model can read an entire book
One of the biggest challenges for artificial intelligence (AI) language models today is understanding both the content and the context of each segment of text.
To solve this problem, Google introduced a new machine learning model called Reformer, which can process the content and context of a work up to 1 million words long - roughly the length of a novel - while using only about 16GB of memory. Reformer was developed to overcome the shortcomings of Transformer, an earlier neural network architecture that works by comparing the words in a passage to understand the relationships between them.
However, because Transformer compares every word with every other word in pairs, its memory use grows rapidly once a text exceeds a few thousand words. This weakness makes it impractical to apply Transformer to long articles, let alone an entire book.
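The pairwise comparison described above can be sketched with a simple count (illustrative only; the function name is invented for this example): full attention scores every word against every other word, so the number of comparisons grows with the square of the text length.

```python
# Sketch (not Google's code): why full pairwise attention becomes
# unmanageable on long texts. Every word is scored against every
# other word, producing an n x n matrix of comparisons.

def attention_score_count(n):
    """Number of pairwise comparisons full attention performs."""
    return n * n

# A few thousand words is manageable...
print(attention_score_count(2_000))      # 4,000,000 comparisons
# ...but a novel-length text of ~1,000,000 words is not.
print(attention_score_count(1_000_000))  # 1,000,000,000,000 comparisons
```

At a million words, the score matrix alone would require trillions of entries, which is why the original architecture cannot handle book-length input.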
Google developed Reformer to address both of these key issues: handling long texts and the heavy memory consumption of the older model.
To solve the first problem, Google's new model uses a method called locality-sensitive hashing (LSH). Instead of comparing every word with every other word as before, the model uses a hash function to group similar words together, then compares only words within the same group or in adjacent groups. This avoids the overload and allows much larger amounts of text to be handled.
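The grouping idea can be illustrated with a minimal random-projection hash (a common LSH scheme, shown here as a sketch rather than Reformer's actual implementation): similar vectors tend to produce the same pattern of projection signs and therefore land in the same bucket.

```python
import random

# Sketch of locality-sensitive hashing via random projections
# (illustrative; not Reformer's actual kernel). Similar vectors
# tend to fall into the same bucket, so comparisons can be
# restricted to words sharing a bucket.

def lsh_bucket(vec, planes):
    """Hash a vector to a bucket id from the signs of its projections."""
    bits = 0
    for plane in planes:
        dot = sum(v * p for v, p in zip(vec, plane))
        bits = (bits << 1) | (1 if dot >= 0 else 0)
    return bits

random.seed(0)
dim, n_planes = 8, 4
planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(n_planes)]

a = [1.0] * dim   # a word vector...
b = [0.9] * dim   # ...and a near-duplicate, scaled slightly
print(lsh_bucket(a, planes) == lsh_bucket(b, planes))  # True: same bucket
```

Because only words in matching (or adjacent) buckets are compared, the cost of attention no longer scales with the square of the full text length.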
To solve the memory consumption problem, the researchers use a technique called the Reversible Residual Network (RevNet). RevNet was developed from Deep Residual Networks (ResNets), an architecture whose performance improves as networks grow deeper and wider. RevNet's layers are designed so that the input of each layer can be reconstructed from its output. As a result, the activations of most layers do not need to be stored in memory during backpropagation.
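The reversibility property can be sketched as follows (a toy example with stand-in functions, not the real sub-networks): the block's input is recovered exactly from its output by running the same computations in reverse, so intermediate activations can be recomputed instead of stored.

```python
# Sketch of a reversible residual block (illustrative). The input
# (x1, x2) is reconstructed exactly from the output (y1, y2), so
# activations need not be kept in memory for backpropagation.

def rev_forward(x1, x2, F, G):
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def rev_inverse(y1, y2, F, G):
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

F = lambda v: 2 * v + 1   # stand-ins for the block's sub-networks
G = lambda v: 3 * v - 2

y1, y2 = rev_forward(5.0, 7.0, F, G)
print(rev_inverse(y1, y2, F, G))  # (5.0, 7.0) - the original input
```

Because the inverse is exact, memory use stays roughly constant regardless of how many such layers the network stacks.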
To test the effectiveness of this model, Google fed Reformer a number of small cropped image fragments, and it generated full-frame images from them.
Google's engineers say the new model can easily process an entire book with high accuracy, opening up the potential for large-scale text processing.