Semiconductor sensors are manufactured in large batches on a circular silicon sheet called a 'wafer', rather than being made one at a time. Individual sensors are then cut from the wafer, and cutting rectangular pieces from a circular object inevitably leaves excess material.
In fact, one wafer can yield about 244 1-inch sensors with roughly 12.6% of the area wasted; APS-C (1.5x crop) sensors with about 18% waste; or 20 to 24 full-frame sensors with about 36% of the area left over. No matter how efficiently the dies are arranged, some area is always lost.
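The trade-off above is easy to reproduce with a little geometry. The sketch below uses a deliberately simplified model and assumed dimensions not given in the article (a 300 mm wafer and a 36 x 24 mm full-frame die): dies are laid out on a plain grid, a die counts only if all four of its corners lie inside the wafer circle, and saw kerf and edge-exclusion zones are ignored, so the numbers will differ from the article's figures.

```python
import math

def count_dies(wafer_diameter_mm, die_w, die_h):
    """Count die_w x die_h dies on a square grid that fit fully inside the wafer circle."""
    r = wafer_diameter_mm / 2.0
    nx = math.ceil(wafer_diameter_mm / die_w)
    ny = math.ceil(wafer_diameter_mm / die_h)
    count = 0
    for i in range(-nx, nx):
        for j in range(-ny, ny):
            x0, y0 = i * die_w, j * die_h
            x1, y1 = x0 + die_w, y0 + die_h
            # a die fits only if all four corners are inside the circle
            if all(x * x + y * y <= r * r
                   for x, y in [(x0, y0), (x0, y1), (x1, y0), (x1, y1)]):
                count += 1
    return count

# assumed dimensions: 300 mm wafer, full-frame die 36 x 24 mm
n = count_dies(300, 36, 24)
wafer_area = math.pi * 150 ** 2
waste = 1 - n * 36 * 24 / wafer_area
print(f"{n} dies, {waste:.1%} of the wafer area wasted")
```

Even this idealised model shows the core point: a meaningful fraction of the circle can never be covered by rectangles, and the wasted fraction grows as the die gets larger relative to the wafer.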
Image sensors are cut from a wafer.
These wafers are quite expensive: a high-quality 8-inch (200 mm) silicon wafer is estimated to cost up to 5,000 USD. So why didn't people make rectangular or square wafers in the first place, so that nothing is wasted when cutting out CPUs and image sensors?
A large silicon ingot.
Wafers are round because of the way they are made. A semiconductor wafer begins as a large cylindrical 'lump' called a silicon ingot, grown using the Czochralski process, invented by the Polish scientist Jan Czochralski.
In this process, extremely pure silicon, with roughly one impurity atom per 10,000,000 silicon atoms, is melted. A silicon seed crystal is dipped into the melt, then slowly rotated and pulled upward. As the material cools, silicon solidifies onto the seed, and the steady rotation helps it grow evenly into a cylindrical ingot.
Czochralski process.
A video explaining the Czochralski process.
Once the ingot has cooled, it is sliced into thin discs with high precision using diamond saw blades. The slices are then cleaned and polished to produce silicon wafers.
To cut costs in this process, manufacturers have essentially one option: gradually increasing the diameter of the silicon ingot. A larger ingot means a larger wafer, which yields more products per run and so raises productivity.
The size of the wafer gradually increases over time.
However, because the manufacturing process is complex and must meet many stringent standards, producing these wafers remains very expensive even as efficiency improves. That is why only a handful of manufacturers worldwide make high-quality camera sensors and CPUs.