Smartphone Camera Lens Design: A Patent Study

  • Published 30 Nov 2024

COMMENTS • 32

  • @DavidBrown-zw5lr • 1 year ago +16

    It is a privilege watching you, a real optical expert, walk through this design, bringing your insight and experience so clearly to its analysis. Thank you.

  • @NoobomgWhy • 1 year ago +4

    Thanks for the great video! Regarding your question: the RI (relative illumination) in modern sensors is corrected by a 17x17 array of values (luma shading). Those shading values are measured after assembly of the lens and sensor with a white scene target and are (usually) written to the sensor or some other available memory in the module. Later those values can be used to correct the RI digitally. The same is done for R/G and B/G values to correct 'color' shading (depending on wavelength and angle, the transmission changes as well).
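A minimal sketch of how a stored shading grid like the one described could be applied digitally. The 17x17 grid, bilinear upsampling to full resolution, and the function name are assumptions about a typical pipeline, not the actual sensor firmware:

```python
import numpy as np

def apply_luma_shading(raw, gain_grid):
    """Correct relative-illumination falloff with a coarse gain grid.

    raw:       (H, W) image as float
    gain_grid: (17, 17) multiplicative gains measured against a white
               target after module assembly (gain > 1 toward corners)
    """
    h, w = raw.shape
    gh, gw = gain_grid.shape
    # Map each pixel to fractional coordinates on the coarse grid
    ys = np.linspace(0, gh - 1, h)
    xs = np.linspace(0, gw - 1, w)
    y0 = np.clip(ys.astype(int), 0, gh - 2)
    x0 = np.clip(xs.astype(int), 0, gw - 2)
    fy = (ys - y0)[:, None]
    fx = (xs - x0)[None, :]
    # Bilinear interpolation of the four surrounding grid gains
    g00 = gain_grid[y0][:, x0]
    g01 = gain_grid[y0][:, x0 + 1]
    g10 = gain_grid[y0 + 1][:, x0]
    g11 = gain_grid[y0 + 1][:, x0 + 1]
    gain = (g00 * (1 - fy) * (1 - fx) + g01 * (1 - fy) * fx
            + g10 * fy * (1 - fx) + g11 * fy * fx)
    return raw * gain
```

The same interpolation applied to stored R/G and B/G grids would handle the color-shading correction the comment mentions.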

  • @vintech88 • 10 months ago +2

    Thank you for sharing the insight, and it is wonderful to see the real-time simulation. Please keep it up.

  • @FWDSlip • 1 year ago +3

    Awesome review, lots of information. Thank you

  • @enotdetcelfer • 1 year ago +5

    Fantastic, thank you

  • @jyothishkumar3098 • 1 year ago +3

    This is amazing. I had seen a similar patent from Apple and I wanted to read the paper someday. A video is much better as an introduction. Also, I wondered what kinds of optical CAD software were used, and that's explained here too!

  • @刘宇熙-o5u • 3 months ago

    First of all, thank you for your video; it’s very beginner-friendly. Recently, I tried to replicate the lens design based on your video and the patent mentioned in it. All the surface parameters match the table exactly, and I kept five decimal places in the simulation, just like you did. However, I noticed that my layout is somewhat different from the one in the patent and your layout. Specifically, lenses 1 and 2 are very close to each other, with their edges overlapping, which results in the edge light not converging on the image plane. The shapes of the other lenses appear to be consistent. I carefully checked the parameters for the four surfaces of lenses 1 and 2 and didn’t find any issues. Additionally, I used EFLY to check the focal length of each lens, and they are essentially the same as in the table. I personally believe that the shape of an aspherical lens is determined by its radius, thickness, and aspherical coefficients. Since these data match the patent data, why do the results differ? Could you please advise if there’s something I might have overlooked that’s causing my results to differ?

    • @stephenremillard1 • 3 months ago

      The first thing I would check are the clear apertures on those four surfaces. After I optimized, the first lens element had a front-side clear semi-diameter of 0.724mm and a back-side of 0.792mm. If all of the surface geometry is entered without error, then the next thing I would check is how the field and the aperture are defined. I used Real Image Height for the field and set the max field to 2.801mm. I used Image Space F/# for the aperture, and I set it to 2.35. Hope this helps.

  • @danielandren2377 • 1 year ago

    Great video as usual, thanks a ton for taking the time to make these deep dives!
    Regarding how the falloff in relative illumination is corrected in software, one thing that is done for some applications is to create an inverse map of the falloff (often from calibration of individual lenses during manufacturing) and then multiplicatively scale the pixel readings to reach a flat field. Sure, this decreases the SNR of the outer portions of the image, but given the use is often far from the noise floor, it does not become terribly noticeable for the end user.
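A minimal sketch of the inverse-map approach described above. Names and shapes are hypothetical, and real pipelines typically work per Bayer channel on raw data:

```python
import numpy as np

def inverse_falloff_map(flat_frame, eps=1e-6):
    """Build a multiplicative gain map from a flat-field calibration
    frame (uniform white target shot through the lens)."""
    flat = flat_frame.astype(float)
    # Gain > 1 wherever the lens rendered the flat field darker.
    return flat.max() / np.maximum(flat, eps)

def correct(image, inv_map):
    # Scaling a pixel by gain g also scales its noise by g, so SNR
    # drops toward the corners -- usually acceptable when the signal
    # sits far above the noise floor, as the comment notes.
    return image * inv_map
```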

    • @stephenremillard1 • 10 months ago +2

      Thank you for the informative reply to my end-of-video question.

  • @FW190D9 • 1 year ago +4

    Great videos!
    Thanks for producing them!!

  • @joshhyyym • 11 months ago +2

    Great video, thanks for posting.

  • @-eduarth_ab6693 • 5 months ago +1

    What is your opinion of Leica Co working with Xiaomi?

  • @FalconSmart • 1 year ago +1

    Thanks a lot for this tutorial!!! Is it possible to download a Zemax file of this design? Thank you.

    • @stephenremillard1 • 1 year ago +2

      I don't have a good way to share files, but you can find a lot of good Zemax files, including some similar to this one, at www.lens-designs.com

  • @IMSAIGuy • 1 year ago +1

    Now try to tolerance this thing. It's so hard to get the plastic molders to commit to anything.

    • @JohnDir-xw3hf • 7 months ago +1

      But they got it. Sharpness is good enough.

  • @hamalatsubuh4867 • 1 year ago

    Can we just have one lens and one plastic filter that corrects the light?

  • @zacherychen484 • 4 months ago +1

    Thank you for this video!

  • @ToanNguyen-vf3hc • 6 months ago

    Thanks for the wonderful video, but I am still a little confused. The magnitude of the distortion should change with the curve of the lens shape, so how can you get a single number describing distortion? Since lenses 5 and 6 are not spherical, the distortion might change from barrel to pincushion depending on how the chief ray is refracted along the curve. Can you help me understand this more deeply? I am a college student trying to research distortion, thank you a lot.

    • @stephenremillard1 • 6 months ago

      This is a really good question. Wavefront aberration coefficients can be computed analytically through fourth order aspheric. But for higher order aspherics, ray tracing is exclusively used to understand the final image locations. A fourth order aspheric will cause a small departure from a spherical surface, unlike the higher order terms in the surfaces used here. As you noted, the distortion is hybrid, meaning that there is a change in sign moving from the center to the image edge, and ray tracing, rather than a single number such as a Seidel coefficient, is the only way to look at it. Zemax, and all other programs, do compute a table of Seidel coefficients. But when higher order aberrations dominate, and high order aspherics are used, I really don't know what meaning, if any, they have. I'm sure they are meaningful, and maybe someone can help us out here.
      By the way, you can get hybrid distortion without using aspherics. Some lenses balance out the third-order distortion, leaving higher-order field-dependent magnifications that can result in a wavy distortion-versus-field plot. Bear in mind that a distortion plot is not the result of only the third-order polynomial term, but of all polynomial terms that describe a displacement in the chief ray.
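A small numerical illustration of that last point: with a hypothetical chief-ray height polynomial whose third-order and fifth-order terms have opposite signs, the distortion-versus-field curve changes sign (barrel near the axis, pincushion at the edge), which no single Seidel number can capture. The coefficients below are invented for illustration:

```python
import numpy as np

# Hypothetical polynomial coefficients for the chief-ray displacement:
# a third-order term and a higher-order term of opposite sign.
A3, A5 = -0.02, 0.01

def distortion_pct(y_ref):
    """Percent distortion: (real - paraxial) / paraxial chief-ray height.
    Every polynomial term contributes, not just the third-order one."""
    y_real = y_ref * (1 + A3 * y_ref**2 + A5 * y_ref**4)
    return 100.0 * (y_real - y_ref) / y_ref

fields = np.linspace(0.1, 2.0, 20)   # image heights in mm (hypothetical)
d = distortion_pct(fields)
# d starts negative (barrel) and ends positive (pincushion); the zero
# crossing sits at y = sqrt(-A3 / A5), about 1.41 mm here.
```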

    • @ToanNguyen-vf3hc • 6 months ago

      @stephenremillard1 Thanks for the great explanation. I did some experiments with a Cooke triplet structure and realized that the distortion's magnitude really is hybrid through the lenses, but in another lens structure it doesn't quite work that way. When the system had significant barrel distortion and I added another lens that also had barrel distortion, the final result was that the distortion decreased but the image got smaller. This is quite interesting: the image is really compressed to expand the FOV with lower distortion. But I still don't know how it works.

  • @julianprzybysawski8543 • 5 months ago

    I wonder why plastic is not used for photographic lenses in larger formats.

    • @AleksandrCheplakov • 5 months ago +1

      Small range of refractive indices, high coefficient of thermal expansion, and poor optical properties.
      I have seen plastic lenses on AliExpress; they have poor quality but a low price.

    • @maxhammick948 • 4 months ago +1

      It is. Canon has gotten quite good at moulding elements at that size, so their cheap lenses tend to include some very extreme plastic aspherics (e.g. the RF 28mm f/2.8).

  • @multiforc271 • 1 year ago

    Interesting, thank you for sharing.
    I loved it. I don't own any sophisticated device, telescope, or the like, but it was fun to learn more about my phone.
    An unrelated question: can you tell me about absorption in the lens? My knowledge of optics is limited, but I am fascinated by this field. I would appreciate it if you could guide me and tell me more about absorption in the lenses.
    Also, I don't know if this is common in the market, but it would be interesting if I could filter a specific band or frequency out of my camera. I know there are different filters in the scientific world, low/high pass or band pass, but I am dreaming of something I can play with or make at home.
    Also, you mentioned the glass layer that filters the infrared for the CMOS. Wouldn't it be nice if we could remove that glass layer and also filter out the visible range? Then we could end up with an infrared camera in our phone (and yes, I know there are dongles and gadgets that act as infrared cameras for phones, but when you make something at home it is more fun and enjoyable).
    A final question: is there free software for simulating optics? I don't have much money to buy expensive software for a hobby.
    I have seen an add-on for FreeCAD, but it is not that advanced, and the simulation is RAM-consuming.

    • @stephenremillard1 • 1 year ago

      These plastics are engineered for very low absorption across the visible spectrum. I see from the datasheet (jp.mitsuichemicals.com/en/special/apel/lineup/) that the APL series of plastics has a transmissivity at the d-line of 91% through 3 mm of material - assuming I'm reading it correctly. So, that would be about 99% transmitted through each lens element in this patent. I don't have any information about IR and UV for these materials. It seems that replacing the IR filter with an IR window (iriss.com/articles/what-type-of-lens-materials-are-used-in-infrared-inspection-windows/#:~:text=The%20most%20common%20materials%20used,used%20infrared%20window%20optic%20materials) would certainly make for an interesting IR camera.
      There are some budget options for optical design software. Here are two. There is ATMOS-ATM which can be purchased for $400 from www.astro-physics.com/software/. If you can use non-sequential ray tracing, you can request the demo version of FRED at photonengr.com/fred-software/. It comes with a perpetual license. The limitation on the free demo is that you can’t save files or write scripts. The number of rays is also limited. In other words, it's exactly what it is called. A demo. But it’s free.
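The per-element transmission estimate in the reply above can be sanity-checked with simple Beer-Lambert thickness scaling. The 0.5 mm element thickness below is a hypothetical value, and Fresnel surface reflections are ignored, which is a simplification:

```python
# Datasheet value quoted in the comment: 91% transmission through
# 3 mm of APL-series plastic. Assuming absorption follows Beer-Lambert
# scaling, transmission through thickness t is T_3mm ** (t / 3).
T_3MM = 0.91

def element_transmission(t_mm):
    """Estimated transmission through t_mm of the same material."""
    return T_3MM ** (t_mm / 3.0)

# For a hypothetical 0.5 mm element this gives about 0.98, in the
# same ballpark as the ~99%-per-element estimate in the comment.
```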

    • @multiforc271 • 1 year ago +1

      @stephenremillard1 Thank you so much for the information.
      As for the camera, yes; I also had an eye on the market, but when I make something myself it has a thousand times more value to me than buying something pretty off the shelf.
      As for FRED, I know about it; my professor had a commercial version of it, but I was wondering about something I could use commercially. In the CAD and graphics worlds there are many free and open-source programs, and I was hoping to find something similar for optics.

    • @maxhammick948 • 4 months ago +1

      Filtering specific frequencies is easily done with a filter in front of the lens, either a screw-in type for a camera or one that clips onto a smartphone. You can get a few cheap UV filters (which are generally just plain glass these days) and play with coatings on them at home without risking damage to any expensive optics, and mass-produced filters that block or pass specific frequencies are widely available for astrophotography enthusiasts.