Google's new AI model can read an entire book
To solve this problem, Google introduced a new machine learning model called Reformer, which can understand the content and context of a work of up to 1 million words - the equivalent of a novel - while using only about 16 GB of memory. Reformer was developed to overcome the drawbacks of Transformer, an earlier neural network architecture that works by comparing the words in a passage to understand the relationships between them.
However, because it compares every pair of words, Transformer consumes an enormous amount of memory once a text grows beyond a few thousand words. This weakness makes using Transformer to handle long articles, let alone an entire book, practically impossible.
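As a rough illustration (not Google's code), the sketch below shows why full pairwise attention becomes impractical: the score matrix that compares every word with every other word grows with the square of the text length.

```python
# A minimal sketch of standard full attention over one sequence, to show
# why memory grows quadratically: every word is scored against every other.
import numpy as np

def full_attention(queries, keys, values):
    # queries, keys, values: (seq_len, d_model) arrays
    scores = queries @ keys.T                          # (seq_len, seq_len) pairs
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ values                            # (seq_len, d_model)

x = np.random.randn(8, 4)                              # tiny toy sequence
print(full_attention(x, x, x).shape)                   # (8, 4)

# For a book-length input, the pairwise score matrix alone is huge:
seq_len = 100_000
print(f"{seq_len**2 * 4 / 1e9:.0f} GB for one float32 score matrix")  # ~40 GB
```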
Google developed Reformer to address both of these key issues: the ability to handle long texts and the heavy memory consumption of the older model.
To solve the first problem, Google's new model uses a method called locality-sensitive hashing (LSH). Instead of comparing every word with every other word as before, the model uses a hash function to group similar words together, then compares only words within the same group or in adjacent groups. This avoids the overload of full pairwise comparison and allows much larger amounts of text to be handled.
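The following simplified sketch shows the LSH idea (the helper names are illustrative, not the Reformer implementation): random projections assign similar word vectors to the same bucket, and attention is then restricted to words that share a bucket instead of all pairs.

```python
# Simplified angular-LSH bucketing: similar vectors tend to land in the
# same bucket, so attention only needs to compare words within a bucket.
import numpy as np

def lsh_buckets(vectors, n_buckets, rng):
    # Project onto random directions; the index of the largest projection
    # (including its negation) becomes the bucket id.
    d_model = vectors.shape[-1]
    projections = rng.standard_normal((d_model, n_buckets // 2))
    rotated = vectors @ projections                    # (seq_len, n_buckets/2)
    rotated = np.concatenate([rotated, -rotated], axis=-1)
    return rotated.argmax(axis=-1)                     # bucket id per word

rng = np.random.default_rng(0)
word_vectors = rng.standard_normal((16, 8))            # 16 toy word vectors
buckets = lsh_buckets(word_vectors, n_buckets=4, rng=rng)
for b in np.unique(buckets):
    members = np.where(buckets == b)[0]
    print(f"bucket {b}: words {members.tolist()}")     # attend only within these
```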
To solve the memory consumption problem, the researchers use a technique called the Reversible Residual Network (RevNet). RevNet was developed from Deep Residual Networks (ResNets), an architecture whose performance improves as the network grows deeper and wider. RevNet uses reversible layers, in which the inputs of each layer can be reconstructed from its outputs. As a result, the activations of most layers do not need to be stored in memory during backpropagation.
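Below is a minimal sketch of a reversible residual block, with toy stand-in functions F and G rather than the actual Reformer layers: because the inputs can be reconstructed exactly from the outputs, they do not need to be kept in memory for the backward pass.

```python
# Toy reversible residual block: forward computes (y1, y2) from (x1, x2),
# and backward recovers (x1, x2) from (y1, y2) instead of storing them.
import numpy as np

def F(x): return np.tanh(x)             # stand-ins for attention / feed-forward
def G(x): return 0.5 * x

def reversible_forward(x1, x2):
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def reversible_backward(y1, y2):
    # Recompute the inputs from the outputs.
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

x1, x2 = np.random.randn(4), np.random.randn(4)
y1, y2 = reversible_forward(x1, x2)
r1, r2 = reversible_backward(y1, y2)
print(np.allclose(x1, r1) and np.allclose(x2, r2))     # True
```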
To test the effectiveness of the model, Google fed Reformer a number of small cropped image fragments, and it generated full-frame images from them.
Google engineers say the new model can easily handle an entire book with high accuracy, opening up the potential for large-scale text processing.