Smartphone sales in some economies have stagnated, and phone manufacturers are responding with innovative features such as multiple primary and secondary cameras, hole-punch cameras and foldable displays.
We already saw Time of Flight (ToF) cameras in smartphones back in 2018, but 2019 seems to be the year more companies adopt the technology. The market considers this the way forward for the camera, which has become one of the most important parts of a smartphone and has seen unprecedented innovation in the past several years.
The competition to enhance and distinguish smartphones focuses increasingly on the camera and everything it provides. You have probably heard the buzz around Time of Flight cameras, as companies have begun to use them in smartphones and market them heavily. But what exactly is a ToF camera?
What is a Time of Flight camera?
A Time of Flight camera features a sensor that uses a small laser to emit infrared light. This light bounces back to the sensor from anything or anyone in front of the camera. The sensor measures how long the light takes to return, and that timing is converted into a distance from which a depth map can be generated.
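The conversion from timing to distance is simple physics: the light travels to the surface and back, so the depth is half the round-trip distance. A minimal sketch, with illustrative numbers (the function name and timing values are made up for this example):

```python
C = 299_792_458.0  # speed of light in m/s

def depth_from_tof(round_trip_seconds: float) -> float:
    """Depth in metres: light covers the distance twice (out and back)."""
    return C * round_trip_seconds / 2.0

# Light returning after ~6.67 nanoseconds means the surface is ~1 m away.
print(depth_from_tof(6.67e-9))
```

Note how short these intervals are: a difference of one metre in depth changes the round-trip time by only about 6.7 nanoseconds, which is why the sensor's clock must be so precise.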
Most devices have so far depended on stereo vision, which uses two sensors to estimate depth. This technique is not very precise, and it does not work in poor light or darkness.
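To see why stereo depth is imprecise, consider a minimal sketch assuming a simple pinhole-camera model, where depth is focal length times baseline divided by pixel disparity (all numbers here are illustrative, not from any real device):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in metres from the pixel disparity between two camera views."""
    return focal_px * baseline_m / disparity_px

# With a 1000 px focal length and a 2 cm baseline, a 20 px disparity
# corresponds to roughly 1 m of depth.
print(stereo_depth(1000.0, 0.02, 20.0))
# A 1 px matching error shifts the estimate by about 5 cm at that range,
# and the error grows quickly as disparity shrinks with distance.
print(stereo_depth(1000.0, 0.02, 19.0))
```

Because disparity depends on matching visual features between the two images, the method also fails in the dark, where there are no features to match.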
A better way to calculate depth is infrared structured light, where a dot pattern is projected onto a scene or a face and the sensor measures the distance between the dots.
This technology is suited for tasks such as facial recognition in the short range, up to a couple of feet, which is why Apple used it for their TrueDepth Face ID camera.
A Time of Flight camera works in a related way, but without the dot pattern. Because these techniques rely on infrared light, they can be used in dimly lit environments too. The Time of Flight camera floods the scene with an even wash of light, and the camera sensor examines every pixel in the frame.
The sensor is synchronised with an extremely precise clock that can measure the tiny differences in how long the light takes to bounce back from different surfaces. The result is an accurate depth map, with depth data for every pixel.
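The per-pixel idea above can be sketched as a toy illustration (not a real sensor API): given a grid of measured round-trip times, one per pixel, the camera produces a grid of depths. The timing values are made up for the example:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_depth_map(round_trip_times):
    """Convert a 2-D grid of round-trip times (seconds) into depths (metres)."""
    return [[C * t / 2.0 for t in row] for row in round_trip_times]

# A tiny 2x2 "frame": nearer surfaces return light sooner.
times = [
    [3.34e-9, 3.34e-9],  # top row ~0.5 m away
    [6.67e-9, 6.67e-9],  # bottom row ~1.0 m away
]
for row in tof_depth_map(times):
    print([round(d, 2) for d in row])
```

A real sensor does this for millions of pixels per frame in hardware; the principle per pixel is the same.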
Why is a Time of Flight camera used?
ToF camera sensors can be used to measure the distance and size of objects, and for object mapping, indoor navigation, obstacle avoidance, motion detection, object tracking and reactive altimeters.
Sensor data can also contribute to improved 3D imagery and Augmented Reality (AR) experiences. In phones, 3D animation, AR and portrait mode in particular are the most likely uses of ToF camera sensors.
So while ToF cameras were not invented yesterday, they are still state-of-the-art technology, fast-moving, inexpensive, and widely available. As the trend suggests, many more smartphones with ToF camera sensors will be launched in the next few years — possibly changing our perception of what a smartphone camera can do.
Tech head. Loves Manchester United and headbanging to Rock music. Avid timelapse maker. Currently studying to become a web developer and an artist.