Hi Azad,
I started with LiDAR scanning before I discovered photogrammetry. It's clear that LiDAR cannot match photogrammetry's precision in any way. But if you ever want to try it again, I have the feeling that Scaniverse produces slightly better results than Polycam. Unfortunately you didn't mention how you scanned with the LiDAR; the best way to scan with LiDAR is very different from the best way to shoot for photogrammetry. Slow, even movement and not going back to scan some areas twice is important - misalignment often comes from going back to fill holes and scanning things twice. There is a video on YouTube from a guy who compared LiDAR scanning by hand and with a gimbal, and he got a much more precise scan with the gimbal. So it seems slow and even movement is very important when using LiDAR for scanning.
I still use it on site for size reference so I can match my photogrammetry scan to the original size of things. And it also really shines in a room with just white walls, which is difficult with photogrammetry but no problem for LiDAR.
Regards, Kai
I had been looking into iPhone's LIDAR scans for months. Thank you for making this very in-depth review. This has cleared a lot of confusion I had in my mind.
Love how you manage to mix a thorough analytical method with an artist's eye.
What I learned is that it can be the best of both worlds, so to speak, to pair LiDAR with photogrammetry. Certain objects are easier to capture with LiDAR, others are better in photogrammetry, and there is software out there that combines the two for a better overall result.
Thank you for sharing your observations! I'm watching this after a long day, but still was absolutely absorbed by your delivery
I don't usually comment on videos, but I have to say that the research and comparison drawn from your experience are remarkable and made me realize what I want. Thanks!
Awesome Job Azad! Really great summary to get anyone up to speed on the latest and greatest in Photogrammetry!
Amazing content! Was researching photogrammetry yesterday and happened upon your channel. I'm surprised you don't have more subs.
I use photogrammetry for detail and then do a quick LIDAR scan to properly scale the photogrammetry model.
This isn't really useful if you are going to use your model as art. But it is useful if you are using your model to build upon.
It's a way more cost-effective approach than 'proper' 3D scanners.
Good idea to use it for scale. I outline a few more methods of setting 1:1 scale in RC here: wizardofaz.medium.com/setting-1-1-scale-in-realitycapture-aadeaafc3737
Great video, I've also tried capturing some facades with Lidar and the alignment issues are there. In my case I usually create more object captures, and Polycam photo mode has been amazing so far. It saves so much time and for objects like shoes the result is almost the same as Reality Capture.
You helped me convince my team to stick with photogrammetry when they were all so excited about buying a new iPhone to test LiDAR lol
I mean it's worth a try!
I like how professional your video is.
Thanks for doing this. I'm glad to see I haven't been doing it wrong. Intuitively, Lidar seems like it must be better, because lasers… but its best use is for self-driving cars, where it's fast and good enough. But if you need sub-centimetre accuracy, photogrammetry is better. I'm impressed by just how accurate it is if you take the right photos with 75% overlap, no exposed light bulbs or reflective/transparent surfaces.
Awesome video Az, very thorough. Although not usable, I found the LiDAR better than expected (thanks to my low expectations!).
Thank you for sharing this. I was considering going for an iPhone to take advantage of it but I think the technology is still not quite a replacement for a photography and photogrammetry workflow. I will continue as I am - thanks
It's great you're back! And congrats on escaping the basement :D
I wish you had included a comparison between them on a small object like a small statue, and also more subjects which are black and reflective. I know it won't beat professional photogrammetry, but in terms of consumer use.
Putting your iphone on a gimbal to capture a lidar scan will remove a lot of the drift error!
Thanks for the detailed analysis! It would be really nice to see a comparison between Polycam and RealityCapture using a proper camera vs the iPhone 13. I think that would be the ultimate review for a year or two.
Thanks much! I was initially excited about LiDAR for 3D scanning, but I quickly discovered that it's pretty iffy. Now I understand it's probably not me, but LiDAR: it's just not meant for that application. Now I guess I have to start over again, learning 3D Object Capture.
great video, as someone who doesn't have that much experience in photogrammetry, this clears a lot of stuff up. (btw: woah, that Blender navigation is pretty unique! how's it done?)
With a 3d spacemouse!
Great vid mate. Super informative and to the point. Really helpful. Thanks!
What the iPhone LiDAR or standalone structured-light scanners excel at is time and portability. You trade quality for time with a pocketable form factor. I am hopeful that as this technology matures, it will translate better into the engineering + construction industry.
I use scanning to do industrial modifications and installations. Dimensional accuracy and perfect alignments are a top priority. Compared to TLS (terrestrial laser scanning), Lidar has been more or less unusable for scanning items or buildings in an industrial setting. There are specific cases where it is usable though.
However, the fact that the iPhone can be thrown into a pocket to get a 'good enough' preliminary scan, or that you can use photos from an iPhone, GoPro or camera, is pretty cool. Photogrammetry 'can' be great in ideal conditions, but I encounter reflective surfaces and/or poor lighting far too often, or spend hours trying to get everything to align, so I am stuck with TLS for the time being.
I have scanned (or done photogrammetry) with a Faro X330, Focus, Focus Premium, DotProduct DPI-10, iPad Pro, iPhone 12+14, Pixel 3, Pixel 6, GeoSLAM, and various Sony Alpha and Canon DSLR/mirrorless cameras (~24 MP).
I treat iPad/Iphone lidar as just another tool. It has its place and purpose.
Great video. I wish you showed the untextured mesh more often though, because the texture can often hide flaws in the mesh.
I did watch the entire video though and I really appreciate how much effort you put into this!
Great video! One minor point on the Object Capture API: according to the WWDC presentation, it can use depth data from photos taken with iPhones that have dual cameras to recover the true scale and orientation of the object, though I haven't been able to get that to work.
True, having the scale of the model is a big element that with typical PG workflows involves actually measuring things.
Excellent review, thank you for taking the time to put this together.
Nice video! It made me understand why Tesla chose video cameras instead of LiDAR.
You should also compare something that I do: 4K video instead of photos. While the resolution of a 4K video frame is roughly 4x worse than a photo, you get many more images out of the video. And the workflow is much faster and easier - you will always have sufficient overlap.
What's the software you use to process video? Does it work directly with the video file, or do you have to pick frames and use them as photos?
@@animaokio I use Agisoft Metashape but 3DF Zephyr can also work with video files. I can directly upload the video into Agisoft and pick how many frames I want to have.
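If your photogrammetry software of choice doesn't import video directly, here's a rough sketch of pulling every Nth frame out of a clip with OpenCV in Python - the file names and the frame step are just placeholder assumptions:

```python
import cv2               # pip install opencv-python
from pathlib import Path

VIDEO = "walkaround_4k.mp4"   # hypothetical input clip
OUT_DIR = Path("frames")      # extracted stills for the photogrammetry software
STEP = 15                     # keep every 15th frame (~2 fps from 30 fps footage)

OUT_DIR.mkdir(exist_ok=True)
cap = cv2.VideoCapture(VIDEO)

index = saved = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break                 # end of video (or read error)
    if index % STEP == 0:
        # write a high-quality JPEG so the aligner keeps as much detail as possible
        cv2.imwrite(str(OUT_DIR / f"frame_{saved:04d}.jpg"), frame,
                    [cv2.IMWRITE_JPEG_QUALITY, 95])
        saved += 1
    index += 1

cap.release()
print(f"Extracted {saved} of {index} frames")
```

The extracted frames then go into the aligner just like regular photos.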
great video; just some quick feedback: to make it easier to tell which image is photogrammetry and which is LiDAR, it would have helped to add supers on the screen and write which image is which. Thanks for the time you spent making this video.
Where can I find a video showing how to use an iPhone to get an STL file?
How is your zooming so smooth??!
It’s mesmerizing
Thanks for the video, since it confirmed it is not me using the device improperly. I am still a novice with these things but my findings are similar. I was using a Structure Sensor Mark II with an iPad Air 4, with the Scanner SDK app and the itSeez3D app. Apparently it is similar to LiDAR. One can calibrate the sensor, but sometimes the app just gets confused and misaligns things. Mesh quality and texture quality were much better using a Canon 7D with Meshroom. I was scanning objects of about 30 to 50 cm. The Structure Sensor does not work for small objects of, say, 10 cm high. One could argue my comparisons were not fair. Your comparisons were really fair. My conclusion is that I do not want to use the Structure Sensor; the quality is not to my liking.
Really good video. I enjoy your laid-back, un-hyped style and the many good comparisons.
Thanks Azad. Very helpful. I think RealityCapture will be sufficient with a good camera.
Actually, the LiDAR scanner is horrible for object capture. The Face ID scanner is OK, but not the back-facing LiDAR.
Great video. Just what I needed at the moment. Clear and to the point. Thank you.
Everypoint does combine Lidar and Photogrammetry
Thanks for all of this, Azad. I've been interested in getting into photogrammetry with RealityCapture, and definitely Light Field Capturing. This helps.
Wondering how the new iPhone 15 LiDAR sensor from Sony compares to the old one.
I doubt Polycam's photo mode uses any LiDAR?
I'm using Lidar only for point clouds on very shiny objects instead of flouring surfaces or cumbersome polariser setups. It's where lidar really shines. No pun intended 😜
Does the Lidar function in widescreen and raw modes?
Try 3D Scanner App
Can you make a tutorial about how you fix the windows? Thanks!
ok... do you think an extra 200 dollars is worth it for the LiDAR? This is my current predicament. I got an iPhone 12 mini for facial mocap stuff, not realizing there was LiDAR on the Pro models. The Pro models are an extra 200 bucks... but it really, really looks like the photogrammetry is just sooooooooo much better... The only use case I can think of for the LiDAR so far is AR-related stuff, and even then it's not EXACTLY needed... So my question to you is: do you think it is worth it? Or is it just a cool thing to have in your pocket with no actual practical uses? Those scans with the LiDAR are, as you said, unfixable in some cases.
Thank you. Curious: if you built a multi-camera rig with very precise camera positions and lens optics mapped, and then gave those positions/lens specs to the photogrammetry software as a reference for the matching virtual camera positions/lenses, would all of these misalignment issues potentially disappear?
Are they true to the scale of the product?
that's some quality content right there
Awesome research; you took a lot of time making these comparisons. Great video.
When I search for "Prora format" on Google, nothing comes up. Can you please share a link that explains more about the format and conversion possibilities?
Thanks for the content 🙂👍
I would like to find the best way to make photogrammetry of rooms or houses (the inside). Is the iPhone a good choice for this?
Wow, your photogrammetry is so much nicer than what I've been able to achieve with Polycam. Definitely into using secondary software to process my photographs. What is the best software option for a Mac? Or do I need to run Windows and use RealityCapture as my best bet for this kind of work?
Absolute class mate!
Your Blender navigation is very smooth, are you using a special device or settings ?
3Dconnexion 3d spacemouse!
What's your recommended workflow for generating 3D assets from building facades?
I'm considering an iPhone so that I can just capture something good enough for import into Unreal whenever I come across it, without having to lug around specialty equipment. I don't need 'THE BEST' quality assets using cross-polarization filters etc... but they do need to be better than, e.g., previz/concept-art quality.
Can't find any other content on this specifically. Your thoughts on the current best practice would be valuable!
Can you maybe try mounting an iPhone to a drone and doing a LiDAR scan? I wish I had a chance to try that. I have an iPad Pro just for what I need it for: scanning spaces and using them for augmented reality. But I keep thinking: if I had an iPhone Pro and the ability to use a drone, would I have a much better chance of creating different stuff (3D models and augmented objects integrated into certain sites and places, combined with my digital sculpture and work...)? Have you considered that?
By the way, living in Germany, I also want to capture some façades. Are there any regulations or rules preventing this because they are private buildings?
How has your experience been so far using drones for photogrammetry in Germany? Have you had any problems with the police, or do you always get permission? I need to capture monuments and memorials, but I have no one to ask whether it is legal or requires extra paperwork.
Did you try Luma yet?
What photo app did you use for manual settings and Apple RAW?
Check out Trnio + LiDAR-informed photogrammetry on your iPhone!
We need to output our meshes at approx 50k vertices for optimal realtime AR performance.
Does the RealityCapture PC app allow for a target output resolution?
Which other apps, e.g. Polycam, allow for target geometry resolutions? Ideally scaling the texture output as well. We could destructively optimise in a 3D package, but I believe this would make the texture map redundant.
Great info on the LiDAR vs photo-capture pros and cons, thank you :)
Yes, in RC you can decimate to a target triangle count (it's called Simplify in RC), unwrap, and texture/bake onto that simplified mesh. You can export it in the desired file format too.
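If you'd rather do the decimation outside RC (keeping in mind the caveat above: it breaks the existing UVs/texture, so you'd need to re-unwrap and re-bake afterwards), a minimal sketch using Open3D's quadric decimation could look like this - the file names and the target count are assumptions:

```python
import open3d as o3d    # pip install open3d

# hypothetical high-poly mesh exported from the photogrammetry software
mesh = o3d.io.read_triangle_mesh("scan_highpoly.obj")

# ~50k vertices is roughly ~100k triangles on a typical closed mesh,
# so target the triangle count accordingly
simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=100_000)
simplified.compute_vertex_normals()   # normals need recomputing after decimation

o3d.io.write_triangle_mesh("scan_50k.obj", simplified)
print(len(simplified.vertices), "vertices /", len(simplified.triangles), "triangles")
```

For a game/AR asset you'd still want to unwrap and bake a texture onto the simplified mesh afterwards, which is exactly what the RC workflow above does in one place.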
Thanks - very useful.
Yo Azad, huge fan of your work, thank you for making this video! You have inspired me a lot!
I'm a 3D artist and from time to time I do scans for different projects. I shoot with a Nikon DSLR (Sigma 18-35) and the results are great - but I'm thinking about switching to an iPhone Pro, and since you are using an Alpha 7 I was curious:
is it worth getting an iPhone Pro for this purpose? Do you notice a lot of difference between the two?
I mean, the DSLR will likely still produce better results than your phone's camera. But for quick and dirty scans, convenience of a phone is really nice.
One thing that sucks going from iOS to Win10 is transferring the images. There are many ways of doing it and they all mostly suck. That could be its own video.
Gr8 video!
What is your configuration and model of 3D mouse? My 3d connexion works like s**t in Blender.
Pan 4, Orbit 4, Navigation Orbit, Rotation Turntable
Thx!
First I used LiDAR; the results were fast but kind of meh 😒. Today I tried photogrammetry on just a small object and was blown away 🤯🤯🤩 by the result.
This is a great video; I really enjoyed it and learned a lot.
Can we use 360 photos taken with a 360 camera to create better photogrammetry?
Metashape supports 360 photos as an input but otherwise, not really.
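For anyone who wants to try the 360-photo route anyway, here's a rough sketch of how it might look with the Metashape Pro Python API - this assumes the Metashape module is installed and licensed, and the exact call names can differ between versions:

```python
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()

# hypothetical equirectangular stills exported from a 360 camera
chunk.addPhotos(["pano_001.jpg", "pano_002.jpg", "pano_003.jpg"])

# tell Metashape the images are spherical (equirectangular) projections
for sensor in chunk.sensors:
    sensor.type = Metashape.Sensor.Type.Spherical

doc.save("pano_scan.psx")  # depth-map processing wants a saved project

chunk.matchPhotos()        # feature detection + matching
chunk.alignCameras()       # solve camera poses
chunk.buildDepthMaps()
chunk.buildModel()         # mesh from the depth maps
doc.save()
```

Whether the result is actually usable is another question - as noted above, 360 photos generally won't beat a set of regular stills.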
Thanks for this really informative video :)
Hey, if you had to choose a phone nowadays that is good enough for photogrammetry, would you pick the Pixel 6 from Google or the iPhone 14 Pro?
both would work - I think it's a matter of preferred OS at that point.
Thanks so much... well, I can take the iPhone off my Christmas list then.
What software should I use on PC to get started with photos? Thanks for the great video.
realitycapture
@@AzadBalabanian thank you, I'll check out your other videos :)
Fascinating video of your findings and process. It's going to sound a bit anorak-ish of me, but you might want to know that the caps on the columns are Ionic, not Roman Corinthian: if it's plain with scrolls, it's Ionic; if it's detailed with leaves and volutes, it's Roman. I gave this video a thumbs up, not a thumbs down, by the way :D
Much appreciated :)
@@AzadBalabanian I do apologise; it's just that being a former stone mason... oh, you get it. Seriously though, it's a terrific video. I was most surprised by the fact that LiDAR doesn't automatically equate to better quality output. I loved the fact that you managed to get those iron bars in the window to pop and look good.
This was really useful, thanks!
Great! Thanks!
Great video
Great explanation.
That's great
You're great!
Like with Tesla, cameras are the future, not LiDAR/radar. AI and photos are the golden ticket.
😇