OnePlus 8T renders leak, showing new placement for the rear cameras

14 September 2020
Some specs are along for the ride too.

Sort by:

AnonD-949291, 15 Sep 2020Well i m opposite ...dont like motorised or any mechanical set upIn 2 years I've used my motorized front camera maybe 10 times. If I could choose, I'd have a smartphone without a front-facing camera, with, let's say, the option to connect an external camera via USB-C.

  • Jp

Now it looks like a cheaper Samsung M series phone.. 😝

LG Superfan, 16 Sep 2020What I am saying is that those camera marked as Depth Senso... moreYou also have to realize that many LiDARs utilise a "regular" camera, but one that actually works in IR and captures the dots, which are then internally processed to give a distance value to each point; so maybe what you did was only see through that camera without the dots being projected. By definition it is still a LiDAR and a depth sensor if it uses projected dots to gather distances.
If it uses continuously projected dots and figures out their relative deformation caused by distance, it is a structured-light type of LiDAR. It can use lines or dots, with either a single camera or multiple cameras (which helps a lot).
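The structured-light idea described above (depth recovered from how far the projected dots shift, rather than from timing) boils down to simple triangulation between the projector and the camera. A minimal sketch, with every number made up purely for illustration:

```python
# Toy structured-light triangulation sketch (illustrative only).
# A dot projected from a known baseline shifts sideways in the camera
# image depending on surface distance; the shift (disparity, in pixels)
# converts to depth via the pinhole model, just like stereo vision.

def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in metres of a point, given the dot's displacement in pixels."""
    if disparity_px <= 0:
        raise ValueError("dot not displaced: point at infinity or not matched")
    return f_px * baseline_m / disparity_px

# e.g. focal length 600 px, projector-camera baseline 7.5 cm,
# dot shifted by 30 px -> the surface is 1.5 m away.
print(depth_from_disparity(600.0, 0.075, 30.0))
```

A bigger baseline or a longer focal length makes the same depth change produce a larger pixel shift, which is why multiple cameras "help a lot": they give extra baselines to triangulate from.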

And if it works as a real ToF, through structured light (except that here the timing of each point is what matters, not the distortion) or a wide-beam/omnidirectional flash or modulation (Flash LiDAR), you technically wouldn't be able to gather video from it normally.

But in the end, the non-ToF LiDAR/depth sensor is a simple camera (one that normally only sees near infrared in a narrow spectrum). I am surprised it is accessible, though; I don't know why they left it accessible.
And if the picture is RGB, well, they just went cheap and easy by taking a regular 2MP camera sensor and removing only the IR filter glass to make it work with the pattern projection (structured light), which is a really cheap and probably quite poor implementation of depth sensing, and certainly not what computational photography requires, as regular dual-camera stereoscopy would be way better.

In truth, there are many possible implementations, from little hacks that anyone can do at home with little hardware up to highly specialized sensors used in mission-critical situations (like the one used for mapping the landing area on the JPL rovers); many kinds of hardware and many ways to do it are feasible.
But to do photography really properly, the per-pixel, Flash LiDAR type of ToF is the one that should be used; sadly that isn't the case, which is why the depth sensors on smartphones are of only little use.

And as I said, the Huawei P30 Pro and P40 Pro have a 2MP ToF; it is NOT the case that 2MP = depth-sensor camera and 0.3MP or 0.5MP = ToF. As I literally explained, a depth sensor can be a ToF or a LiDAR, the same way a vehicle can be a motorcycle or a car. And unless they only use the image to feed an algorithm that tries to get depth from a regular CMOS sensor (in which case "depth CAMERA" is a proper name but "depth SENSOR" is not), chances are they use something else, which you simply didn't see, to make this camera output measured (and not AI-computed) depth information, which still makes them LiDAR and possibly ToF depending on the implementation.

Demongornot, 16 Sep 2020You can disagree all you want, what I said aren't my p... moreWhat I am saying is that those cameras marked as Depth Sensor are nothing but a 2MP camera. I'm not saying what you said is false; what I said is just limited to the so-called depth sensor in smartphone specs. It is actually a regular camera being used for depth data, not one of those advanced ToF/LiDAR ones.
If a phone's specs say Depth Sensor, it means just a regular 2MP camera.
If it says ToF or something similar, it is that.
If you have a phone with a depth camera (called a depth sensor by the OEM), try the method I described.

Anonymous, 16 Sep 2020Pretty much everything in this comment is incorrect. please... moreSure, because you obviously know better than information taken directly from many videos, explanation websites, manufacturer websites, papers, different research and other implementations?

Time of Flight :
Time of flight (ToF) is the measurement of the time taken by an object, particle or wave (be it acoustic, electromagnetic, etc.) to travel a distance through a medium.
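For light-based ToF, that definition reduces to very simple arithmetic: measure the round-trip time of a pulse, multiply by the speed of light, and halve it. A minimal sketch:

```python
# Minimal sketch of the time-of-flight distance calculation.
# The pulse travels to the target and back (round trip), so the
# one-way distance is half of speed * time.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the target from a measured round-trip time."""
    return C * round_trip_seconds / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m,
# which shows why sub-nanosecond timing precision is needed indoors.
print(tof_distance(10e-9))
```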

Depth sensor :
I mean, this one is a no-brainer; it is literally in the name: it is a sensor that measures depth. There isn't any particular description of how a depth sensor works, except that it is a sensor (not an implementation, like using regular camera(s) to get depth from software) that gets depth data from multiple points (as opposed to a single laser rangefinder, which gets a single-point measurement).
"Depth sensor
If you find a phone with a depth sensor, it's designed to do exactly that—sense depth."

LiDAR :
"Laser Imaging, Detection, And Ranging", but most importantly, as it isn't always only about lasers, "Light Detection And Ranging".
Basic description :
"is a method for measuring distances (ranging) by illuminating the target with laser light and measuring the reflection with a sensor"
Quote : "LIDAR – sometimes called time of flight (ToF), laser scanners or laser radar"

If a phone uses a simple depth CAMERA and markets it as a depth SENSOR, that isn't my fault or the fault of the description; they are just lying. What I say still holds true, the same way "Hoverboard" was the name of the flying, hovering skateboard: that it was wrongly used for stupid marketing purposes doesn't mean calling a flying skateboard a "Hoverboard" is wrong.
"The fact is that very little information is published about ToF sensors." Which is a big reason why most people don't actually know what they are talking about; they just read a single thing and think that's enough. I've spent days reading about these.

Now I am curious to know what you actually believe all of those are, though.

LG Superfan, 16 Sep 2020Well I disagree that Depth sensor on phones is ToF sensor c... moreYou can disagree all you want; what I said isn't my personal opinion, those are facts.
If you can "take a picture with the depth sensor", then it isn't a depth sensor; it is just a crappy camera marketed as a depth sensor.
Note that there is a big difference between a depth SENSOR and a depth CAMERA: the first is a sensor specially made for getting depth data; the other is simply a regular camera where an algorithm, using either AI on that single camera or stereoscopy with a second camera, will try to get depth data, which is the least precise and most resource-intensive way to do it.

The software is as important as the camera; actually, it is even more important, which is why brands like Google, Sony or Apple get top-quality pictures out of 12MP sensors while other brands with bazillion-megapixel sensors can't get close. A 2MP ToF won't change much: how can you expect a 2MP depth map to significantly enhance a 12MP+ picture?
It would be like trying to crop a person out of a scene using a selection layer taken from a low-quality picture; the only thing you'll get is a crappy crop.

And your theory about ToF and depth sensors has a flaw: the Huawei ToF/depth sensor on the P30 Pro and P40 Pro is 2MP, not 0.3MP or 0.5MP.

  • Anonymous

Demongornot, 15 Sep 2020Haha, no problems. Basically ToF, Depth sensor and LiDAR c... morePretty much everything in this comment is incorrect. Please, if you do not know what you're talking about, just refrain from commenting.

Demongornot, 15 Sep 2020Haha, no problems. Basically ToF, Depth sensor and LiDAR c... moreI replied, and for some reason my comment was saved to the DB; I don't know if it will be posted.

Demongornot, 15 Sep 2020Haha, no problems. Basically ToF, Depth sensor and LiDAR c... moreWell, I disagree that the depth sensor on phones is a ToF sensor, cause if it was, every phone would have flawless portrait mode like the P30 Pro and P40 Pro.
If your phone has a depth sensor, install the Open Camera app and try switching cameras; you will be able to take pictures with it, and it is just an RGB 2MP camera, at least that's what it looks like.

This is the Oppo A72 with a Snapdragon 865. How to make tons of money? Just rebrand a phone, change its CPU, and that's it. Success.

  • RWE

Oh nice! That's OnePlus S20+.

LG Superfan, 15 Sep 2020I'm overwhelmed can be a bit more simple lolHaha, no problem.
Basically, ToF, depth sensor and LiDAR can be similar or different things.
It is like "car", "vehicle", "SUV" and "motorcycle": depending on what you are talking about, something can be one or several of those.

ToF is the name of a method of telling distance using any kind of "pulses".
A depth sensor is any sensor that directly gives you depth data regardless of the method; it can also be 2 cameras used for stereoscopy.
LiDAR is anything that, based on how you use it, gives you depth data using light.

And most depth sensors on smartphones are actually ToF, and because they use light, they are also LiDAR.
But they are different from regular cameras, as they only look in infrared and need super-fast memory built in, so a regular camera can't be used as a depth sensor.

It is just that the terms are sometimes used the wrong way, giving the impression that a phone with a "depth sensor" has something different from a phone with a "ToF", while it can be the exact same sensor under a different name; I guess some brands simply prefer one name over the other.
Like the new LiDAR on the latest Apple products, which is simply a depth sensor, which is also a ToF; they probably preferred the newly popular LiDAR term because it sounds way more impressive than the now old and negatively viewed "depth sensor/camera" and "ToF".

  • Anonymous

The screen is really small for a flat screen. 6.67 inches is the norm in 2020.

  • Anonymous

Why does OnePlus have the same camera bump as the Samsung phones haha. What a shame.

Demongornot, 15 Sep 2020Not really, the naming isn't intuitive, but in short :... moreI'm overwhelmed, can it be a bit more simple lol

  • Henz

Not buying any OnePlus phones until they make smaller phones and stop competing on how many cameras they can fit on the back of a phone. Just put in one lens that can actually take good photos!

LG Superfan, 15 Sep 2020Most ToF are 0.3MP as far as I know and 2MP sensor is depth... moreNot really; the naming isn't intuitive, but in short:
ToF: anything that uses emission (light, mm-waves, sound, etc.) to detect the distance between the sensor and an object.
LiDAR: any implementation that uses a light source to gather distances and build a depth map from them.
Depth sensor: any sensor that, by itself and without external assistance, can gather depth data.

ToF can be as much the laser autofocus (which can be a camera or a single light-detecting diode without any specific resolution, or more precisely a 1x1 px resolution) as it can be a LiDAR that builds 3D depth data from a lot of lasers.

LiDAR can cover more than ToF: "structured light" (like the Kinect, being a bunch of lasers) doesn't rely on the time light takes to bounce back to get distance data (dot/line projectors rely on the deformation of the points to tell the distance). It can also work as a ToF (like structured light, but measuring distance from the time light takes to bounce back), or, like laser scanning and surveying instruments, be a single laser using the ToF method to scan dot after dot by rotating quickly, gathering a 3D scan of an area.

A depth sensor, meanwhile, is any sensor capable of gathering depth data by itself. So in short, structured light or Flash LiDAR (one big omnidirectional light rather than many dots) can be a depth sensor, but the laser autofocus or the laser scanning and survey equipment isn't, as its sensor by itself doesn't gather depth from a scene; only the implementation, by rotating, allows it to do so.

The best depth sensor for higher resolution and for photography use is the Flash LiDAR: you only need an unfocused (wide-beam) laser or LED, and the resolution depends on the sensor. It is almost like a regular camera, with the difference that it uses a monochrome filter array, so rather than a Bayer array with RGB filters, you only have one IR filter. The other difference is the ultra-fast memory required, as you need extremely high precision to measure timing differences of light over short distances: in a Flash LiDAR, each pixel corresponds to an ultra-fast memory address in an array, and though I don't know how they handle phase detection, each pixel simply records the delay from the initial light pulse until it receives a photon that can pass through the filter.
This allows an almost-regular camera to get, per pixel, a depth measurement.
Camera sensors are analog, and only become binary after an analog-to-digital converter, so the sensor itself isn't really "speed dependent"; only the electronics are, as analog doesn't need a clock frequency to operate.
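The per-pixel idea above can be sketched in a few lines. This is purely illustrative (a toy 2x2 delay array, not real sensor data): each pixel's recorded round-trip delay converts directly into a depth value, yielding a depth map at the sensor's full resolution.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def delays_to_depth_map(delays_s: np.ndarray) -> np.ndarray:
    """Convert a per-pixel round-trip delay array into a depth map in metres."""
    return C * delays_s / 2.0

# Toy 2x2 "Flash LiDAR sensor": each entry is the delay (seconds)
# recorded by one pixel before its photon arrived.
delays = np.array([[10e-9, 20e-9],
                   [10e-9, 13.3e-9]])
depth = delays_to_depth_map(delays)
print(np.round(depth, 2))
```

This is exactly why the memory has to be so fast: a 1 cm depth difference corresponds to only about 67 picoseconds of extra round-trip time per pixel.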

Anonymous, 15 Sep 2020Yeah, it's better. But still trash. Macro camera is u... moreThat's what happens when you talk about a subject you don't actually know.

The 5MP macro camera of the Poco F2 Pro delivers amazing and really solid shots. If you say that, you probably just haven't checked the links I posted (those aren't there to make the comment look beautiful, you know), where we can see the amazing results of the Poco F2 Pro's 5MP macro with 2x optical zoom, AND the difference between the dedicated macro's ultra-short focus distance and the ultrawide with a purposely built-in macro capability that can't keep up.

If you think depth sensors are only used for depth effects, you don't know anything about smartphone photography; they are used for a lot of things, and the only reason they don't show up much, or make a big difference between phones with and without them, is exactly because they are too low resolution.
If smartphones had a 12MP (or higher) depth sensor and the proper software/AI, they would be able to do things you can barely imagine.

There are more cameras because, while a DSLR can easily swap lenses for all those specialized situations, a smartphone has built-in camera + lens combinations, making this impossible; so to offer the different options, it needs different types of cameras.
*The general-purpose camera, made for all sorts of non-specialized pictures.
*Wide/ultrawide, made to give a larger field of view, allowing you to capture more of a single scene without having to step back.
*Telephoto, specially made for shooting at long distances.
*Portrait, which has two specific settings: the "ideal" focus distance for taking a picture of someone, and a depth of field that keeps the focus on the subject and blurs the background.
*Macro, made for extreme close-ups (and not just taking a picture up close) with a specially made focus distance, often 2 cm.

The ONLY way to do it with a single camera would be a really complex lens setup, but even then, multiple cameras are better, as good software can actually use the data from multiple sensors to improve quality.

  • FAQ

FAQ, 15 Sep 2020Yeay.. Now oneplus also have stove on their phone... Apple ... moreMy bad... I can't count, apparently.. Actually they have 5 stoves.. Who knows, maybe for the Pro version they will add another one.. There is a placeholder there..

  • FAQ

Yeay.. Now OnePlus also has a stove on their phone... Apple only has 4, but OP might have 6...

Grr.. Enough with this crap.. OnePlus lost their identity... especially in design and pricing..

Is it really a sin not to follow Apple's design nowadays? Even Huawei has started putting their square/rectangular camera island off-center on their midrange phones.. Pfft..