Samsung Galaxy S21 to come without ToF sensor

15 August 2020
The Korean tech giant hasn't entirely dropped the concept of ToF sensors, though.


  • I
  • Itaphone
  • bJb
  • 07 Sep 2020

AnonD-754814, 17 Aug 2020About your ultrawide camera resolution theory. You'r... moreIt's still a phone that produces point-and-shoot images. If you want good images, use a proper tool: a DSLR with a good lens.

    • A
    • A7sii
    • 80i
    • 06 Sep 2020

    Samsung removes features one by one... Samsung fans: Oh look, innovation...

      • D
      • AnonD-909757
      • pZQ
      • 19 Aug 2020

      AnonD-754814, 17 Aug 2020You're not understanding how LiDAR works. About your... moreSorry, but you are the one clearly not understanding LiDAR.
      Literally, from the Wikipedia article on LiDAR's etymology:

      "The term lidar was originally a portmanteau of light and radar.[1][2] It is now also used as an acronym of "light detection and ranging"[3] and "laser imaging, detection, and ranging"."

      A laser is ONE of the ways a LiDAR can work, and even then, the laser can be used in many different ways.
      There is laser scanning, which uses a SINGLE laser and is the most common, so I don't know why you insist on 100 lasers, and 100 lasers will NOT give the same result as 1000 lasers; it is like saying a 480p video is the same as a 1080p video.
      And if that were true, why would every structured-beam type of 3D facial recognition, such as the one in the iPhone's Face ID or the Google Pixel, project 30,000 beams?
      Well, because you need many dots to get a high-resolution map.
      Which, by the way, is an example of IR beams that is NOT ToF but is still LiDAR, as light is used to get a 3D map.

      And again, I am talking about the NON-LASER type, the same one used in all smartphone 3D ToF sensors, such as the Huawei P40's 3D facial recognition; here is an explanation:

      Binning doesn't make pixel size work that way; a 42MP sensor binned to 12MP is literally 4x1µm.
      Actually, it can even be beneficial to have multiple smaller sensors instead of one large one; this is literally how we do astronomy at the highest level, using interferometry.
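      The binning mentioned above can be sketched in a few lines of Python (a toy 2x2 example; the values and the helper name are illustrative, not any vendor's actual pipeline):

```python
# Minimal sketch of 2x2 pixel binning: four neighbouring pixel values
# are combined into one, trading resolution for light sensitivity.

def bin_2x2(frame):
    """frame: list of rows of pixel values; both dimensions assumed even."""
    return [
        [frame[r][c] + frame[r][c + 1] + frame[r + 1][c] + frame[r + 1][c + 1]
         for c in range(0, len(frame[0]), 2)]
        for r in range(0, len(frame), 2)
    ]

print(bin_2x2([[1, 2], [3, 4]]))  # -> [[10]]
```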

      OK, so let's see how this literal patent description of Apple's Face ID proves that I am right then:
      Also, just look at THIS:
      So much for what they teach you in school... Reality seems to disagree.

      LiDAR = every method using an active light source to gather 3D imaging of something. It comprises multiple techniques, including laser scanning, where the laser pulses at regular intervals and the distance is determined by time, which is also what all ToF sensors do (a rangefinder, on the other hand, only shines a single beam at a single point; it can't map anything, only tell the distance between the laser and the object, so it isn't LiDAR but is still ToF), AND structured light patterns, which don't measure time but rather how multiple beams/lines/patterns are distorted as they cover an object's surface, which can then be used to gather depth data.

      So please refrain from insulting my knowledge; everything I said was written based on research on the matter. I ALWAYS research the subject of any technical talk before affirming something.

      LED-based ToF can be as good as LiDAR, and the only reason they aren't good is EXACTLY what I have said since the beginning: they use a low-resolution sensor, which gives a low-resolution depth map that is then supposed to be used on a high-resolution image.
      This is literally as if I gave you a 480p monitor, then instructed you to retouch a 4K image without being able to zoom.
      ToF does NOT necessarily burst multiple beams; the same way a sonar only emits a sound wave, light (which also acts as a wave, by the way) can be used the same way, and many LiDARs have only ONE laser.
      While you CAN use multiple lasers, many designs use a single one that scans line by line; it depends on the application.

      You TOTALLY didn't understand the resolution part...
      Here is a single simple image that you should understand without any explanation; it simply shows how a wider view angle means fewer pixels for the same object:
      Let's just combine it with this image:
      And this image:

      Now, with all that, if you still can't understand, well, I can't do anything for you at this point.

        • C
        • CCE
        • n74
        • 18 Aug 2020

        From my bad experience with the S20, I'm NOT so HAPPY. I don't like the camera's automatic background blurring; there should be an option to turn it off. And I don't like the heat issue; it lags the phone's performance (hangs). Sorry Samsung! I already own 7 Samsung phones and they're really good, but the S20, omg...

          • ?
          • Anonymous
          • rJJ
          • 17 Aug 2020

          Wow amazing phone. With wonderful features

            • D
            • AnonD-754814
            • HxI
            • 17 Aug 2020

            AnonD-909757, 17 Aug 2020Yes indeed the Mi 10 Ultra have a really good camera setup,... moreAbout your ultrawide camera resolution theory.
            You're saying all the correct things about this topic (only this one, not the others), but you're drawing the wrong conclusion.
            Now you're saying that 12MP isn't enough for an ultrawide angle.
            That's true.
            It's also true that 12MP isn't enough for the main camera either.
            No matter how great that 12MP sensor is, it will never give enough detail for current high-resolution displays.

            I mean, why does your theory sound so lame?
            "The wider the shot, the more megapixels you need."
            So, according to your logic,
            if you use 12MP to shoot a portrait, do you need a 120MP sensor to take a group picture of 20 people?

            People shoot different types of photos at different times. But that doesn't make one resolution bad or worse.

            Yeah, sensor size matters. But it's the opposite of what you're saying.
            ** Ultrawide and telephoto camera sensors on smartphones have to be lower resolution/smaller than the regular camera's.
            Because it's difficult to make lenses for ultrawide or telephoto in the compact body of a smartphone. There is a reason why Huawei used 12MP for the P40 Pro's 5x zoom and 8MP for the P40 Pro+'s 10x zoom, and why Huawei's ultrawide shot is much narrower than Galaxy phones'.

            Like I said, think logically first.

              • D
              • AnonD-754814
              • HxI
              • 17 Aug 2020

              AnonD-909757, 17 Aug 2020Yes indeed the Mi 10 Ultra have a really good camera setup,... moreNow that I've given the main answer,
              let's point out how ignorant your theory sounds. For your sake, I'm going to number the points.

              ** 1: Apple uses one laser and it splits into many lasers?
              ANS: Man! Did you ever attend your physics class? Did you even go to high school?
              Light travels in straight lines. So, unless there were that many beams at the beginning, there is no way there will be that many in the middle.
              I think you should go back to high school and study a little physics.

              *** 2: LiDAR doesn't measure time.
              ANS: This again proves how weak you are in physics. What does LiDAR do? How does it do it?
              The main task of LiDAR is to measure the distance of the object (every single part of it). For ToF/LiDAR,
              the equation is very simple.

              # Distance = (measured time * speed of light) / 2 {as the light has to cover the same distance twice}

              How on earth is anyone going to measure the distance without measuring the time?
              So, a LiDAR's essential task is calculating the reflection time.
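              The round-trip formula above can be checked with a few lines of Python (the 10 ns example value is illustrative):

```python
# Distance = (measured time * speed of light) / 2
# Halved because the pulse travels to the object and back.

C = 299_792_458  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance in metres for a measured round-trip time in seconds."""
    return round_trip_time_s * C / 2

print(tof_distance(10e-9))  # a 10 ns echo -> about 1.5 m
```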

              Your base knowledge is weak. So don't search the web and bring back illogical things as proof.

              LiDAR is one type of ToF, and I never said a laser is a must for ToF. ToF can be based on LEDs as well, because all ToF needs to do is emit a single burst of light; the rest is receiving the reflection and doing the math. But LED-based ToF isn't good.
              Now, LiDAR works the same as ToF. Instead of one burst of multiple laser beams, LiDAR sends multiple bursts per second. That's why it can also work like a RADAR: LiDAR is continuously getting the object's distance and 3D measurements.

              Don't think of me as an enemy. You don't have to hold the exact opposite of my opinion.
              Before giving examples or proof, think about it: does it sound logical based on basic physics and mathematics?

                • D
                • AnonD-754814
                • HxI
                • 17 Aug 2020

                AnonD-909757, 17 Aug 2020Yes indeed the Mi 10 Ultra have a really good camera setup,... moreYou're not understanding how LiDAR works.
                About your LiDAR theory: what do you think is used in LiDAR other than a LASER?
                This time I'm going to keep the comment short, because I feel you're not getting the main part.

                100 laser beams or thousands, it doesn't really differ much. What matters is how much light you get back. The less light you send, the less you get back.
                From a smartphone, you can't send laser beams at a high enough resolution.

                Just answer this question. You've got a very dark scene. For your main camera,
                will you use a 3µm 8MP sensor or a 42MP 1µm sensor?
                It doesn't really matter which you choose.
                What really matters is: which one is going to give a better result in the dark?
                It's definitely the 8MP sensor.
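                The arithmetic behind that low-light claim is simple: per-pixel light-gathering area scales with the square of the pixel pitch (a sketch, using the pixel sizes from the comment above):

```python
# A 3 µm pixel has 9x the area, hence roughly 9x the light, of a 1 µm pixel.

def pixel_area_um2(pitch_um: float) -> float:
    """Photosensitive area of a square pixel, in square micrometres."""
    return pitch_um ** 2

ratio = pixel_area_um2(3.0) / pixel_area_um2(1.0)
print(ratio)  # -> 9.0
```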

                  • D
                  • AnonD-909757
                  • pZQ
                  • 17 Aug 2020

                  Duck of death, 17 Aug 2020Dismissing facts about your nonsense to keep going with you... moreSure, you know so much better, comparing two things whose only common point is the word "AI".

                    AnonD-909757, 17 Aug 2020Why are you talking about templates or color balance ? I l... moreDismissing facts about your nonsense to keep going with your nonsense...

                      • D
                      • AnonD-909757
                      • pZQ
                      • 17 Aug 2020

                      AnonD-754814, 16 Aug 2020I know LiDAR is one type of ToF. What stage current ToF... moreYes indeed, the Mi 10 Ultra has a really good camera setup, but it lacks macro (the number of times I took a close-up and wished I had a macro camera...) and a high-resolution ToF sensor with matching algorithms.

                      I think you misunderstood how LiDAR works. LiDAR also means "light detection and ranging"; it doesn't necessarily require a laser, and when it does use one, in most applications it uses a scanning method rather than multiple lasers. Even Apple's Face ID actually uses a single initial laser that is then split into many others; this is light-array depth scanning, and it isn't a ToF technology, as you only gather 3D (meaning depth) information from how the dots are affected; you aren't measuring the time they take to bounce back, which makes this tech not ToF.
                      And I think the blinking is only there so the software can filter out any parasitic infrared light and only read the dots.

                      And ToF doesn't even need a laser. Here is how laser ToF (typical LiDAR) data looks:
                      You can see, down below, that there are lines; this is because the laser is scanning, like what happens on an old CRT monitor, which only had one (well, technically three) electron guns. It rotates, and at each rotation it goes slightly higher, making it able to scan the environment; that's basically what the big rotating head on self-driving-car prototypes is. It only has a single laser. The reason you see the lines at the bottom of the image is that the ground is close, so the laser only illuminates a small part of it.

                      But you can also use a single infrared LED to make a ToF sensor: you just pulse the light and, exactly like a sonar, gather data from when the light bounces back, except that here you already have a pixel-by-pixel return, meaning you don't need any special processing except some corrections (filtering, rejection and other stuff).

                      So it is literally a regular camera with a single sensor layer and only one physical pixel per numerical one (as you don't need RGB), reading the light from an LED that just pulses.

                      So here all the laser stuff is irrelevant, as all smartphone ToF sensors (AFAIK) use LEDs rather than lasers; the only laser I know of on smartphones is laser autofocus.

                      I don't get what you meant about the 12MP here.
                      A wide camera has a wider view angle than the regular camera, so if you want a person in the same picture, taken with both the main and the wide/ultrawide camera, to have the same resolution, you obviously need the wide shooter to have a higher resolution:
                      Since the surface the camera is looking at is wider, for the same number of pixels, the same object in the picture will have less pixel density with a wide/ultrawide camera.
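                      The pixel-density point can be put in numbers (a sketch; the resolutions and field-of-view angles are illustrative, not any specific phone's specs):

```python
# For the same horizontal pixel count, a wider field of view spreads the
# pixels over more of the scene, so any one object gets fewer of them.

def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    return horizontal_pixels / fov_degrees

main = pixels_per_degree(4000, 79)        # ~12 MP main camera, 79-degree FOV
ultrawide = pixels_per_degree(4000, 120)  # same pixel count, 120-degree FOV

# The ultrawide puts fewer pixels on the same object than the main camera.
print(main > ultrawide)  # -> True
```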

                      It is the opposite with telephoto, where you aim for digital zoom capabilities on top of the optical one.

                      Yes, there are limitations, but I mean, there are 108MP sensors that, even counting binning, are still able to take excellent pictures. 19MP isn't too much for getting quality pictures; the sensor itself and its size affect the quality far more than that resolution difference. Also, having 19MP on an ultrawide sensor is almost like taking part of the scene with a 12MP main sensor and then half of it with a 7MP one; the optics play a much bigger role than the resolution here.

                        • D
                        • AnonD-909757
                        • pZQ
                        • 17 Aug 2020

                        Duck of death, 16 Aug 2020"AI" for colour balance is as easy as comparing w... moreWhy are you talking about templates or color balance?
                        I literally showed you an AI that reconstructs an image with 90% of the information missing, intending to show you how an AI can easily do the same for transition areas. Depth is a piece of information used to make changes to the RAW image; it can be processed (and needs to be anyway, actually), and the overlapping areas can basically be processed and have their errors corrected before being used.
                        And again, with a higher-resolution depth sensor it would be easier anyway, as there would be more data to perform those calculations, and AI works better the more data it has.

                          • D
                          • AnonD-909757
                          • pZQ
                          • 17 Aug 2020

                          gulfer1, 16 Aug 2020thats what the Sony xperia 1 II has done so right?Wat¿

                            • D
                            • AnonD-909757
                            • pZQ
                            • 17 Aug 2020

                            Anonymous, 16 Aug 2020What is ToF?? Why wouldn't the writer explain this?ToF is Time of Flight. It basically works like radar and sonar: you can measure distance with an infrared laser (laser autofocus) by pulsing it and seeing how much time it needs to come back; knowing the time, and since we know the speed of light, we can tell the distance.
                            Then there are depth sensors using ToF, which work like a sonar: you pulse a bright LED in the infrared spectrum, then, by measuring the time each pixel takes to get light back, you can measure, pixel by pixel, the distance of objects, creating a map of distances that, if we transform the time (distance) information into color, gives something like this:
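                            The per-pixel idea can be sketched with a toy grid of round-trip times (the times are made up; this is just the time-to-distance conversion applied to every pixel):

```python
# Each pixel records a round-trip time; converting every entry to a
# distance yields the depth map described above.

C = 299_792_458  # speed of light in m/s

times_ns = [
    [10, 10, 20],
    [10, 20, 30],
]

# distance = (time * c) / 2, with nanoseconds converted to seconds
depth_m = [[t * 1e-9 * C / 2 for t in row] for row in times_ns]
print(depth_m[0][0])  # a 10 ns echo -> about 1.5 m
```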

                              Anonymous, 16 Aug 2020What is ToF?? Why wouldn't the writer explain this?You can Google it yourself bro

                                • D
                                • AnonD-754814
                                • HxI
                                • 16 Aug 2020

                                ypcx, 16 Aug 2020Apple's both Lidar (iPad 2020, iPhone 2020) and Face I... moreThe way you're talking, it feels like Apple themselves made the LiDAR used in the iPad.
                                It's a Sony product.

                                  • ?
                                  • Anonymous
                                  • sxe
                                  • 16 Aug 2020

                                  Anonymous, 16 Aug 2020What is ToF?? Why wouldn't the writer explain this?Must be a toff

                                    • y
                                    • ypcx
                                    • nm5
                                    • 16 Aug 2020

                                    Apple's Lidar (iPad 2020, iPhone 2020) and Face ID are both miles ahead of whatever Samsung has. But Apple also removed the fingerprint sensor, which still has its uses (e.g. when wearing sunglasses), and I'm waiting for "I want to be like Apple" Samsung to remove it soon too. Btw, do you remember the non-screen fingerprint sensors? They worked close to 100%.

                                      Anonymous, 16 Aug 2020S30, not S21.S21 not S30

                                        • D
                                        • Deng
                                        • sxs
                                        • 16 Aug 2020

                                        Anonymous, 16 Aug 2020S30, not S21.It's obviously S21, bro; it would match the current year, like 2020 with the S20 and Note 20.