Intel reveals new AI chips to compete with Nvidia's GPUs
Details are scarce, but Intel says the new chips will improve training times for deep learning.
When the AI boom took off, Intel was slow to respond. Now the company is trying to reassert its position in the silicon world by unveiling a new line of chips designed specifically for artificial intelligence, called the Intel Nervana Neural Network Processor, or NNP for short.
The NNP line is a response to the demands of machine learning, and it is destined for data centers, not personal computers.
Intel's CPUs may be giants of the server sector (the company estimates they account for 96% of the data center market), but today's AI workloads still run largely on graphics processors, or GPUs, from companies like Nvidia and ARM. As a result, demand for those companies' chips keeps growing (Nvidia's revenue rose 56% year over year).
Google has joined in as well, designing its own Tensor Processing Unit chips to power its cloud services, while newer companies like the UK's Graphcore are also trying to fill the gap.
Intel's reaction has been to buy up AI hardware talent. It acquired the computer vision company Mobileye in March, chipmaker Movidius (whose vision chips are used in DJI's drones) in September last year, and the deep learning startup Nervana Systems in August 2016.
Intel is investing heavily in AI chip hardware expertise
Since then, the company has been busy with the Neural Network Processor, formerly known under the codename Lake Crest. The NNP chips are a direct result of the Nervana acquisition and are aimed at delivering 'faster training times for deep learning models.' (Intel also says it received input from Facebook on the chip's design, but did not elaborate.)
Exactly how much faster its deep learning hardware will be, Intel doesn't say. While Google promotes its new TPU chips by publishing comparative benchmarks against rivals (https://cloud.google.com/blog/big-data/2017/05/an-in-depth-look-at-googles-first-tensor-processing-unit-tpu), Intel has only said that it is on track toward its goal of speeding up deep learning training 100-fold by 2020.
It is also still vague about when NNP chips will actually reach customers.