Qualcomm (NASDAQ:QCOM) has announced that its subsidiary Qualcomm Technologies has introduced a new mobile camera technology called Qualcomm Clear Sight. This technology could be a game changer for the company, as well as for the cameras of Android-based smartphones powered by Alphabet (NASDAQ:GOOG) (NASDAQ:GOOGL) subsidiary Google's software. Meanwhile, Apple (NASDAQ:AAPL) has launched its iPhone 7 and 7 Plus with meaningful improvements in camera operations.
In a previous article, I presented my view that Apple's then-upcoming iPhone 7 and 7 Plus would see significant demand primarily due to their advanced camera systems. Apple didn't disappoint buyers on that front. With the dual-lens camera of the iPhone 7 Plus, Apple will turn the device's camera into a DSLR-like camera using AI (artificial intelligence) later this year, when it releases a software update. The big question is: can Apple maintain its camera lead in current and future iPhones with the advent of Qualcomm's new mobile camera technology?
Apple vs. Qualcomm in Making Smartphone Cameras
Apple used its A10 Fusion quad-core processor in the latest iPhone 7 and 7 Plus smartphones. The iPhone maker has made the processor's onboard ISP (image signal processor) meaningfully more advanced than the ISPs of earlier versions of its processors. The ISP can perform over 100 billion operations on a single photo frame in just 25 milliseconds. This can produce images and videos of superior quality for a smartphone, although according to some recent reviews, Samsung's (OTC:SSNLF) Galaxy S7 camera with its f/1.7 aperture (compared to the iPhone 7's f/1.8) is almost equally impressive. Wider apertures have helped both companies improve their camera systems.
Qualcomm's latest technology involves mimicking the human eye to produce better quality images. The ISPs of Qualcomm's Snapdragon 820 and 821 processors, known as Qualcomm Spectra, have been designed to mimic the rod and cone cells of the human eye. Rod cells capture light in low-light conditions, while cone cells capture color. The technology uses a dual-camera system with separate image sensors: one monochrome (black and white) sensor to mimic the rod cells and another color sensor to mimic the cone cells.
Image Credit: Qualcomm
Qualcomm's technology is nothing new. In fact, Chinese smartphone maker Huawei's P9 has a dual-lens camera system that works the same way. To develop that camera system, Huawei partnered with Leica, the German optics company. Notably, the partnership is still continuing: the two companies have announced that they will open a new research and innovation center at Leica's headquarters in Wetzlar, Germany.
Apple's approach is different: it seeks to deliver better image quality by enhancing the power of the A10 Fusion SoC's ISP and widening the aperture of the lens. As an investor, I believe Qualcomm's technology has the potential to surpass Apple's for three reasons.
- Boosting image quality by enhancing the ISP's processing power and widening the lens aperture has a lower ceiling than achieving the same with dual-lens optics; no camera can be improved past the limits imposed by physics and biology. The dual-lens camera's monochrome sensor captures more detail of a scene than the color sensor because it receives more light (the color sensor's filter array must block part of the incoming light in order to determine each pixel's color). Merging the information from both sensors therefore yields a final photograph with greater sharpness and less noise in low light. By applying AI-based algorithms on top, Qualcomm can make the final image superior in every respect, including color and contrast.
- AI-based algorithms play a crucial role in smartphone cameras. Earlier this year, Qualcomm introduced a new machine learning SDK for the Snapdragon 820 processor line based on its Zeroth AI platform, which will allow device makers to run their own neural networks on smartphones, security cameras (which are far more complex) and many other devices. Since Apple has no AI platform on par with Qualcomm's Zeroth platform, as an investor I am more bullish on Qualcomm's new camera technology.
- In the iPhone 7 Plus, Apple incorporated a dual-lens camera system to deliver depth mapping. How would Apple simultaneously deliver depth mapping as well as sharper, less noisy low-light images via a single dual-lens camera system? A third lens might be necessary, which could make the iPhone even more expensive.
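The monochrome-plus-color fusion described in the first point can be sketched as a toy luma/chroma swap: keep the color sensor's chroma but replace its (noisier) luminance with the sharper monochrome capture. This is my own illustration of the general idea, not Qualcomm's actual Clear Sight algorithm.

```python
import numpy as np

def fuse_mono_color(mono, rgb):
    """Toy mono+color fusion (illustrative, not Qualcomm's algorithm).

    mono: (H, W) monochrome frame in [0, 1]
    rgb:  (H, W, 3) color frame in [0, 1]
    """
    # BT.601 luma and color-difference chroma from the color sensor
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    cr = rgb[..., 0] - y  # red-difference chroma
    cb = rgb[..., 2] - y  # blue-difference chroma
    # Rebuild RGB around the monochrome sensor's cleaner luminance
    r = mono + cr
    b = mono + cb
    g = (mono - 0.299 * r - 0.114 * b) / 0.587
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)
```

The resulting image inherits its sharpness and noise floor from the monochrome sensor while retaining the color sensor's hue information, which is the essence of the argument above.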
Apple's Competitive Advantage: Depth Mapping
Of the two 12-megapixel cameras on the iPhone 7 Plus, one has a 28mm wide-angle lens with an f/1.8 aperture (the iPhone 7's camera has the same lens, as mentioned above), and the other has a 56mm telephoto lens with an f/2.8 aperture. Although true telephoto lenses generally start at around 70mm, let's accept Apple's "telephoto" label for the 56mm lens, since it offers 2x optical zoom and up to 10x digital zoom.
To produce DSLR-quality effects, the two lenses will work in tandem. The wide-angle lens captures an image, and the telephoto lens adds a shallow depth-of-field (bokeh) effect to the captured image via depth mapping. Through the upcoming software update's machine learning algorithm, the A10 Fusion processor's onboard ISP will let the iPhone 7 Plus understand the content of an image, so the device can separate the background from the content (or foreground). Once the background is separated from the foreground, the camera can use the depth map to blur the background in real time, achieving a DSLR-quality bokeh effect.
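The separate-and-blur step just described can be sketched as a toy pipeline: treat pixels whose estimated depth exceeds a threshold as background and blur only those. This is my own illustration of the general idea, not Apple's actual implementation; the function names and parameters are hypothetical.

```python
import numpy as np

def box_blur(img, r):
    """Naive box blur: average over a (2r+1)x(2r+1) neighborhood, edge-clamped."""
    h, w = img.shape[:2]
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            i0, i1 = max(0, i - r), min(h, i + r + 1)
            j0, j1 = max(0, j - r), min(w, j + r + 1)
            out[i, j] = img[i0:i1, j0:j1].mean(axis=(0, 1))
    return out

def fake_bokeh(image, depth, threshold, blur_radius=2):
    """Toy depth-based bokeh (illustrative, not Apple's pipeline).

    image: (H, W, 3) array; depth: (H, W) estimated depth per pixel.
    Pixels deeper than `threshold` are treated as background and blurred;
    the foreground is left sharp.
    """
    blurred = box_blur(image, blur_radius)
    background = depth > threshold            # True where background
    return np.where(background[..., None], blurred, image)
```

A real implementation would use a smooth depth-weighted blur rather than a hard threshold, but the structure, a per-pixel depth map selecting between sharp and blurred versions of the frame, is the same.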
We have seen that for superior image quality in terms of sharpness, color, contrast and low-light noise, the role of a processor's onboard ISP is limited. That's not the case for depth mapping, where the ISP's role is significant. To obtain depth information, the ISP processes live images from the two lenses, which have different focal lengths, with the help of software that performs triangulation calculations.
What is a triangulation calculation? It's a mathematical method for finding the distance to a point using a series of triangles. The distance between the dual-cam's two lenses is known and forms one side (the baseline) of a triangle; by measuring the angles at which each lens sees the same point, the lengths of the triangle's other two sides, and hence the point's distance or depth, can be deduced.
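In the standard pinhole-camera model, this triangulation reduces to a one-line formula: depth = focal length × baseline / disparity, where disparity is how far the same point appears shifted between the two views. A minimal sketch follows; the numbers in the test are illustrative, not iPhone 7 Plus specifications.

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Classic stereo triangulation under the pinhole-camera model.

    focal_px:     focal length expressed in pixels
    baseline_mm:  known distance between the two lenses
    disparity_px: pixel shift of the same point between the two views

    Returns depth in the same unit as baseline_mm (here, mm).
    """
    if disparity_px <= 0:
        raise ValueError("point must appear shifted between the two views")
    return focal_px * baseline_mm / disparity_px
```

Note the inverse relationship: nearby objects shift more between the two views (larger disparity) and so resolve to smaller depths, which is exactly why a known baseline lets the camera build a per-pixel depth map.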
Since the smartphone processor and its ISP play a significant role in depth mapping, Apple has a competitive advantage here over Qualcomm's comparable technology, which Qualcomm released last year. Apple can tailor the specifications of its processor and ISP to their intended use; Qualcomm can't, because its processors and ISPs target multiple smartphones with different camera features.
Since I am bullish on both Apple and Qualcomm, the rivalry between the two companies in developing better smartphone camera systems actually benefits me as an investor. Qualcomm's competitive advantage lies in a new technology that can create a better camera for general-purpose use, while Apple's lies in camera technology intended for specialized use.
Disclosure: I am/we are long AAPL, QCOM.
I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.