I Don’t Want To See Camera Setups With A Depth Sensor Anymore


The smartphone market is insanely competitive. The landscape changes fast, and manufacturers have to stay on top of trends and keep delivering what consumers want if they hope to keep moving units.

When the multi-camera craze started, a lot of smartphones opted for monochrome sensors and depth sensors as secondary cameras, since these could augment the primary sensor in their own special ways. The monochrome sensor has mostly died out, but the depth sensor still lingers, and I really don’t think it should.

The Depth Sensor Became A Thing To Enable High-Quality Portrait Photography

Image: HTC

I haven’t always been an advocate for the dissolution of the depth sensor establishment. Quite the contrary, in fact. Back when depth sensors were first making themselves known to consumers, I found them to be rather exciting because of what they enabled. The HTC One (M8) was one of the first devices to come with one.

Years ago, smartphones were never great at mimicking the shallow depth of field that the right lens on a professional camera can produce, and there was only so much that editing could do. That’s why taking photos of people wasn’t such an enthralling experience at the time.

However, with the advent of depth sensors, phones could use a mix of hardware and software to determine the subject of a photo and apply artificial bokeh to everything else. Looking back, the earliest implementations left plenty of room for improvement, but they were impressive regardless. That was when taking portraits of people really became fun.

The Thing Is, I Think Software Is Sufficient In 2024 When It Comes To Portrait Shots

Image: Google

So, it is clear why depth sensors became a thing, and I don’t dispute their usefulness back then. I just find it hard to argue that they’re useful in 2024. I had a first-generation Google Pixel XL back in the day, and the Pixel line didn’t get a portrait mode until the second generation. Still, using a GCam mod, I was able to shoot in portrait mode, and the results were surprisingly good.

Why am I mentioning this? Because the Google Pixel XL only had a single rear camera. In fact, up to the third generation, Pixel phones only had a single rear camera, and yet they were regarded as the absolute best phones you could get for photography. Their portrait shots were top quality.

Google Pixel 3 XL
Image: Talk Android

Google achieved all of that through the magic of software, and in 2024, when AI lets you tap a subject in a shot and move it to another spot in the same picture, it is evident that software is good enough to trace out a photo’s subject and apply the necessary bokeh to the rest of the image. Wouldn’t you agree?
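To make that concrete, here is a minimal sketch of a purely software “portrait mode”: segment the subject, blur everything else, and composite the two. It leans on OpenCV’s classic GrabCut only as a stand-in for the far more capable ML segmentation modern phones run, and the file name and subject rectangle are assumptions for illustration, not anything a manufacturer actually ships.

```python
# Sketch of software-only portrait bokeh: segment the subject, blur the rest.
import cv2
import numpy as np

img = cv2.imread("portrait.jpg")  # assumed input photo
mask = np.zeros(img.shape[:2], np.uint8)

# Rough box around the subject; a phone would get this from a
# person-detection model rather than a hand-picked rectangle.
h, w = img.shape[:2]
rect = (w // 4, h // 8, w // 2, int(h * 0.8))  # (x, y, width, height)

bgd_model = np.zeros((1, 65), np.float64)
fgd_model = np.zeros((1, 65), np.float64)
cv2.grabCut(img, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

# Keep definite and probable foreground pixels as the subject.
fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype(np.uint8)

# Synthetic "bokeh": heavy blur on the background, sharp subject on top.
blurred = cv2.GaussianBlur(img, (51, 51), 0)
result = np.where(fg[:, :, None] == 1, img, blurred)

cv2.imwrite("portrait_bokeh.jpg", result)
```

No depth hardware is involved anywhere in that pipeline, which is the whole point.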

A 2MP depth sensor is just there to beef up the spec sheet and make consumers think they’re getting a more capable camera system than they really are. I don’t think we need them anymore.

Let’s Stick To Telephotos, Ultrawides, Or Even Macro Cameras

Oppo Find X7 Ultra
Image: Oppo

I’d prefer a device with a single camera over one with a depth sensor as the secondary module (especially if it shows in the price and design), but the time of single-camera smartphones is gone, and I acknowledge that. That’s why even the CMF Phone 1 just had to have a second camera.

Manufacturers should aim to fit more “useful” modules as secondary units in their devices, like telephotos and ultrawides (the war over which of those is more useful will rage on). Heck, I’d even take a macro lens, despite how little use it might end up getting.

Anyway, I’m a dreamer, I know. All I’m saying is, manufacturers should throw that depth sensor money at something more worthwhile for consumers.





