Texture Scanning with a DJI Mavic 2 Pro Drone - by Grzegorz Baran

  • Published 14 Aug 2019
  • #photogrammetry #scanning #drone #pbr #workflow #mavic2pro
    Texture Scanning with a DJI Mavic 2 Pro Drone by Grzegorz Baran
    In this video I present my complete photogrammetry workflow for surface scanning with a DJI Mavic 2 Pro drone.
    Initially, by making this video, I was trying to answer the question: can I use a drone in my daily photogrammetry workflow to successfully create high quality materials?
    The workflow I present is just one option, as there is no single or perfect workflow - only those which work for us.
    This time I used a cliff wall surface as an example.
    Since the video is quite long I split it into main sections. In detail, I covered:
    - introduction to the drone experience (equipment, landing pads, energy bank)
    - surface capture with the Mavic 2 Pro drone (picking the right spot)
    - shadow removal and image postproduction in PhotoLab 2 (introduction to histogram and color depth)
    - photogrammetry reconstruction in Agisoft Metashape (vertex color approach)
    - creating, UV-mapping and aligning the lowpoly model around the highpoly in ZBrush (lowpoly projection)
    - PBR texture baking in Substance Designer
    - seam removal, surface fixes and material generation in Artomatix Artengine
    - presenting the material in Marmoset Toolbag 3 with an external HDRI map applied as the light source
    - overall summary of what I think about using a drone in photogrammetry and material creation
    For those interested in getting even deeper into the details, here is a link to my photogrammetry guide:
    gum.co/YanD
    The offline version of this video is available on my gumroad:
    gum.co/zMxeW
    Thanks to those who bought it already, as it really helps me move forward with my research and create even more content like this.
    Feel free to follow me on Artstation: www.artstation.com/gbaran
    Hope you have found this video helpful.
    Thanks for watching.
    Grzegorz Baran

COMMENTS • 75

  • @CONTORART
    @CONTORART 4 years ago +12

    Felt like I should be paying for this. It's so in depth and I learnt so much. Also good to be introduced to some new software I hadn't seen before. Thanks a ton.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago

      You are welcome, I am really super happy to hear that you have found something useful for yourself here, cheers! :)

  • @danielkuzev4992
    @danielkuzev4992 2 years ago +1

    Great tutorial mate!

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  2 years ago +1

      Cheers! Shame it has aged - I've developed so many things since the release of this one :) .. for example, I don't use any landing pads anymore, as in 90% of cases I take off and land from hand.

  • @nickoftime7937
    @nickoftime7937 4 years ago +1

    Very in-depth tutorial and review. Answered many of my questions. Well done!

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago +1

      Thanks, super happy to hear that. Feel free to watch my other videos, especially the recent ones, since I improved the workflow a bit since the time I recorded this video. Cheers!

  • @adrianalfordphotography
    @adrianalfordphotography 4 years ago +1

    Beautiful video, thanks so much for sharing, this was great to watch. Cheers!

  • @Barnyz
    @Barnyz 4 years ago +1

    Excellent and informative video, the drone seems like a cool tool, wish i had one!

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago +1

      Thanks Barnyz :) . I am sure you will get one sooner or later :)

  • @simonb1009
    @simonb1009 4 years ago +1

    Good equipment. I've watched you for a long time!

  • @GeorgeNicola
    @GeorgeNicola 4 years ago +1

    Good work! Definitely deserved a hit of the like button.

  • @exotane
    @exotane 4 years ago +1

    Really nice! Good job!

  • @nicolaverzura6436
    @nicolaverzura6436 4 years ago +1

    Amazing. Thanks for this video, liked it 👍

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago +1

      Thanks man, appreciated

    • @nicolaverzura6436
      @nicolaverzura6436 4 years ago

      Grzegorz Baran liked and subscribed. If you have time, pass by my channel and let me know if you like it too

  • @CGMatter
    @CGMatter 4 years ago +1

    great video!

  • @lennutrajektoor
    @lennutrajektoor 2 years ago +1

    Regarding the drone hitting the landing pad anchor: use anchors that don't stick out of the ground. There are thin anchors that can be pushed down to soil level so that no parts stick out. This also lets you use a smaller landing pad in terms of area. Yes, it forces you to land in a smaller area, but there isn't always plenty of space for a large pad anyway.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  2 years ago

      That's a very good tip, especially for those who are starting their adventure with drones like I was at the time. But after 3 years of using the drone I don't use any landing pads anymore :). I simply take off from the hand and land on the hand.

  • @Gringottone
    @Gringottone 4 years ago +1

    Grzegorz Baran here! Subscribed

  • @gen2ssi396
    @gen2ssi396 2 years ago +1

    25:35
    Some background information:
    1. In 8-bit RGB, 8 bits are allocated to each channel (R, G, B) of a single pixel, which means there are 2^8 = 256 options (from 00000000 to 11111111) per channel. Hence there are 256*256*256 = 16,777,216 possible colors for one pixel.
    2. To convert an 8-bit binary representation to the RGB values we live and breathe with, take red as an example: R=11111111, G=00000000, B=00000000. We convert it by calculating R = 2^7+2^6+2^5+2^4+2^3+2^2+2^1+2^0 = 128+64+32+16+8+4+2+1 = 255, with G=0 and B=0, hence R=255 G=0 B=0.
    3. To convert RGB to hex (digits from 0 to 15, that is 0-9, then A=10, B=11, C=12, D=13, E=14, F=15), divide each channel by 16: the quotient is the high hex digit and the remainder is the low one.
    Red: R=255, G=0, B=0
    255 = 15*16 + 15, so both hex digits are 15, i.e. F
    hence R=FF, G=00, B=00
    that is #FF0000
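The conversion described above can be sketched in a few lines of Python (my own minimal illustration, not from the video or the thread):

```python
def rgb_to_hex(r, g, b):
    """Convert 8-bit RGB channel values (0-255) to a #RRGGBB hex string."""
    # Each channel splits into two hex digits: value // 16 (high) and
    # value % 16 (low); the :02X format specifier performs exactly that split.
    return "#{:02X}{:02X}{:02X}".format(r, g, b)

# 256 options per channel -> 256**3 possible colors
print(256 ** 3)               # 16777216
print(divmod(255, 16))        # (15, 15) -> hex digits F and F
print(rgb_to_hex(255, 0, 0))  # #FF0000
```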

  • @user-qe4rt2pm5y
    @user-qe4rt2pm5y 4 years ago +1

    Thank you for sharing. You show that a Mavic can do decent material work. Do you think it can handle photogrammetry for large objects?

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago

      Thanks, I hope so. It has quite a focused focal length for this, but unfortunately a shallow aperture range (f/2.8-11). So it depends on how deep the object is. I believe it is totally fine, or even better than a standard camera, for big targets like a huge cliff wall. But for a complex sculpture, a DSLR camera mounted on a tripod, where you can set the aperture to f/16 and keep it steady during the shot, would be better. So it depends on what exactly you are trying to capture. I am planning to test it in the future and record a video to have it covered, but I am not sure when yet.

  • @panphoto
    @panphoto 4 years ago +1

    Thanks, very useful video, I learned quite a lot from this. May I suggest shortening it a little and ditching the distracting background noise.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago

      Thanks :). And big thanks for the feedback. I agree.. I need to learn how to keep my vids shorter but still informative enough. Regarding the background music.. I hope it matches better in the new videos.. but if it still sucks I will try to find another solution. With the background music, for example, you can't hear my dog licking his . . :) when I was recording the voiceover. Also.. I spent a lot of time and effort composing it.. this is the way I share it with the world ;)

    • @panphoto
      @panphoto 4 years ago +1

      @@GrzegorzBaranArt Hi Grzegorz, regarding the background music, I always have a problem with it because it is inevitably a distraction from the content ... otherwise I'm sure university lecturers the world over would be embellishing their lectures with it. The best YouTube videos just get straight into it without bull*%£! graphics and self-important fanfares. You have a lot of useful information to communicate and I praise you for it. On the important matter in hand, I'm wanting to do a photogrammetric exercise on a cliff face and beach for a 3D scene using a drone. I've found that I can use the drone's 4K video to scan a cliff face. The procedure is to open the (MP4) video in Photoshop and export a jpeg sequence from it. Provided that the shutter speed is sufficiently high it works very well, and you can instruct Photoshop to spit out as many or as few images as the original fps will allow. I'm using a Phantom 4 Pro Plus at its best aperture, f/5.6. Alternatively, though I've not tried this, it might be possible to set the still camera to shoot every, say, 1-2 seconds? If so, this would allow us to concentrate on the flight path rather than the shutter! The main virtue of using the video camera might be the sheer quantity of imagery available for the photogrammetry software. I use Metashape as well as RealityCapture, which for this type of work is far superior in speed and accuracy. Try it!

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago

      @@panphoto Thanks. There is no way video gives you the same quality as still images. Video usually has its own compression and is stored with a much lower color depth compared to RAW images.
      f/5.6 seems pretty decent, but for a drone I would use f/4.0. I made a few experiments and found f/4.0 the best. Of course I experimented with the M2P, so your Phantom might return different results.
      I can't see any problems with taking images.. but to take them I first let the drone hover in a static position. When it moves, it motion-blurs the image. Just compare an image extracted from video with a still one... I mean.. compare all the details.. use the image quality estimation tool in Metashape etc. and you will see the difference. Or compare the reconstruction quality from video with the reconstruction quality from still images. I can make a bet that reconstruction quality, image alignment etc. from still images is way better. Of course it depends on how much detail works for you. If you use photogrammetry just for a sketchy aerial reconstruction, video should be fine and way quicker.
      Good luck with video scanning though, would be cool to see the results you get this way. Cheers!

  • @mr.z2618
    @mr.z2618 4 years ago +1

    Hi, this is quite a nice location and beautiful scenery. Can you please share where it is in the UK?

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago +1

      Sure, with pleasure :)
      The exact position of the cliff wall I captured is: 54°59'26.4"N 1°23'48.6"W
      It's in the north of the UK - a shoreline south of South Shields, 20 miles east of Newcastle. You can even get there by metro from the town centre.. just a 20-30 min walk from the station.
      In detail.. this is a walking path from Graham's Sands through Frenchman's Bay and Marsden Bay (the Marsden Grotto location).. and further south depending on how much time you have to walk :).
      Hope that makes sense :), cheers!

  • @soeyehtet3348
    @soeyehtet3348 4 years ago +1

    Wow! What a hidden gem! Is it possible to combine a DSLR and a drone to scan buildings and pagodas?

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago +1

      Hey, thanks, of course it is. Just don't mix different focal lengths too much. Use one for the DSLR camera and the one the drone has, and you should be totally fine.

  • @eisklotz5642
    @eisklotz5642 4 years ago +1

    Thanks a lot! Now I need a drone ;-(

  •  3 years ago +1

    Looking at your decimation in Agisoft - isn't it better to go slowly in 50% steps, i.e. from 67M to 38.5M to 19M etc.?
    Also, some people stress the importance of using TIF files throughout the process, but here I agree with you - 100% JPGs are more than enough, especially for a drone workflow.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  3 years ago +1

      Hey Pavel. This video was recorded ages ago :) and a lot has changed since then. For example, photogrammetry software can now fully utilise 16-bit image files. When this video was recorded that wasn't the case. So I strongly recommend watching a few of my recent videos and using RAW files only, which can be converted in any photo-editing software to TIFF files with 16-bit color depth. I explained it in detail in one of my videos.
      Regarding decimation, why would I go in steps? Currently in my workflow I use as dense a highpoly model as possible and usually use a 16K texture to carry the color information. This way there is no 67M cap anymore. I decimate the model just to get a light one so I can position the lowpoly plane in space, and I bin the decimated one later. Hope that makes sense :)

  •  3 years ago +1

      @Grzegorz Baran Thanks! I understand high color depth is super important during capture and RAW editing, but after that, I've never gotten any measurable difference in the final export files. The same goes for sharpness in JPG compression vs TIFF.
      As for the decimation, I've read about the gradual 50% steps on multiple forums / tutorials, and RealityCapture also recommends it in its Help files. I guess the Agisoft algorithm is similar. With smaller steps, the mesh is smoother (in my tests) and it allows to go even

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  3 years ago +1

      @ 16-bit color depth (RAW/TIFF/PNG) is very important and makes a difference during the image alignment stage. During reconstruction it is not as relevant, but still.. the more data the better. I would say that if you have enough image overlap, 8-bit images (JPEG) used during the reconstruction should do the job. With less image overlap, 16 bits can come in handy. So with 16 bits you risk nothing, with 8 you do. This is why I strongly recommend shooting in RAW, getting the most from it in photo-editing software, exporting it in a 16-bit file format like TIFF or PNG.. and using that as the source for photogrammetry reconstruction :)
      As said, I use the decimated model only as a reference in 3D space to position the lowpoly model. Its quality and detail accuracy don't really matter, as I don't bake anything from it. When the lowpoly model is exported I can simply delete it, as I use the lowpoly model as a canvas for baking from the actual highpoly one.
      ZBrush is totally optional, and since it is quite an expensive piece of software I can understand picking alternative software for lowpoly creation. I simply purchased the ZBrush license years ago and I keep using it since all updates are free. I love its projection tool, as it allows me to avoid the cross-surface gradient which appears when the lowpoly is slightly misaligned. But I am sure it can be done in 3ds Max, Blender or any other 3D app.
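The bit-depth point above can be illustrated numerically with a short Python sketch (my own illustration, not part of the thread): reducing 16-bit tones to 8 bits collapses many distinct values into one, which is exactly the subtle tonal detail an 8-bit JPEG source gives up.

```python
def to_8bit(v16):
    # Standard bit-depth reduction: keep the top 8 of the 16 bits.
    return v16 >> 8

# A smooth ramp of 256 distinct 16-bit tone values...
ramp16 = range(1000, 1256)
# ...collapses to just two distinct 8-bit values.
ramp8 = {to_8bit(v) for v in ramp16}
print(len(set(ramp16)), "16-bit tones ->", len(ramp8), "8-bit tones")
# 256 16-bit tones -> 2 8-bit tones
```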

  •  3 years ago +1

      @@GrzegorzBaranArt Thanks, I still have a lot to learn ;) Nevertheless, I never ran into alignment problems since Agisoft 1.60 - it only skips blurred shots in the scenes I'm working on

  • @milkfloat22
    @milkfloat22 4 years ago +1

    Why the Mavic and not the Phantom 4 Pro? The Phantom's camera has a mechanical shutter and films at 60fps. Surely this would help improve the overall quality of photos / videos etc.? Cool video btw!

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago +1

      Thanks Craig. The Mavic 2 Pro had very good reviews, was cheaper and had everything I needed.. especially the 77° field of view (I guess you were suggesting the 'Phantom 4 Pro V2.0', which also has a 1-inch sensor but an 84° FOV, as an alternative). The M2P is also very portable and easy to transport compared to the Phantom.
      Since I pay for my equipment with my own money, I considered this purchase a risk, as I had no idea whether it was going to do the job for photogrammetry. I wasn't able to find anyone who had already used it for material reconstruction and was open to sharing their experience. I had heard that the Inspire 2 struggled with material capture, so getting one was a big risk since it might have been useless.
      Regarding the mechanical shutter.. I don't shoot objects in motion and I usually shoot from quite a close distance.. also, a mechanical shutter has a limited number of shots it can handle before it has to be replaced.. in comparison, an electronic shutter has a nearly unlimited lifetime, which is especially important for photogrammetry as you sometimes shoot a few hundred images per single capture.
      Btw, as far as I know, a mechanical shutter has nothing to do with video quality, since it works only for still images. For video it opens, and the electronic shutter simulates it until the video is over. It would make a lot of noise if it clicked 60 times per second :) .. especially when you record with sound.
      But as I said.. these are not big NOs. When I decided to get the M2P, it was a choice between 'why not the Mavic Air' and 'why not the Inspire 2' rather than 'why not the Phantom'. Mostly due to portability and a slightly more focused FOV, I would still pick the M2P even if the price for the Phantom were the same. But.. I also wouldn't cry if I had a Phantom 4 Pro V2 instead of a Mavic 2 Pro :). Finally, if I had the budget and made a living from scanning, I would consider getting an Inspire 2 :D.
      So my purchase was already a way too expensive experiment I made to find answers to a few questions I had in my head. I also made it to share the results and experience with the photogrammetry community, since I wasn't able to find anything like this anywhere when I was looking for it.
      Hope that makes sense :), cheers!

    • @milkfloat22
      @milkfloat22 4 years ago

      @@GrzegorzBaranArt Thanks for the very detailed answer Grzegorz!

  • @mikeben9587
    @mikeben9587 9 months ago +1

    Hm, is there a better solution now for creating a seamless texture? Since Artomatix ArtEngine won't be developed further and it's quite expensive.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  9 months ago

      Hey, there are many other solutions for seam removal, but I don't think any of them is better. This is why I still use ArtEngine on a daily basis. To be honest, I planned to record a comprehensive video on this subject, but I struggle with time. In the worst-case scenario I will cover it in my upcoming book about photogrammetry for materials, or sooner on Patreon if I finally decide to make it public :D

  • @Wozner
    @Wozner 4 years ago

    Hey man! Thanks for the tutorial. A few questions. You don't bake the texture in Metashape, right? You just process the calculations with vertex color, use the vertex color in the FBX file, and then get the texture in ZBrush using vertex colors? Another question: what PC specs do you have? I have a Core i5-4570 along with a GTX 1080 Ti 10 GB on board, and my compute process took about 2:45 hours using CPU+GPU in Metashape. Thanks for your help man!

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago

      Hey, yeah, in this case I don't bake anything in Metashape, just generate and export the mesh with color stored as vertex color information. A midpoly version of this mesh in ZBrush is used just to create and position a LOWPOLY model for baking. What I get from ZBrush is just this lowpoly model, nothing more.
      Next I bake in a baker - in this video, Substance Designer - where I use the LOWPOLY plane exported from ZBrush, and as the HIGHPOLY I use the FBX exported from Metashape :).
      This way I simply bake all the maps I need in one go (except roughness, of course).
      Regarding the PC, I have an i7-7700K 4.2GHz with 64GB of RAM and a GF 1080 with 11GB of RAM.
      The reconstruction time you had was quite fast though; usually it takes longer and depends on many factors - the number of images and their resolution, the quality of reconstruction, the mode used for reconstruction (Metashape offers 3 modes, each with its pros and cons) - but usually photogrammetry reconstruction takes a while, so 3h sounds quite fast ;).
      Hope that makes sense. Cheers!

    • @RyanDashkevicz
      @RyanDashkevicz 7 months ago

      @@GrzegorzBaranArt I generally bake the textures in Metashape and overshoot the texture resolution - sometimes multiple 8K textures depending on the scan - then bake down in Blender. Always shocked by how much more detail is present when I do that.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  7 months ago

      @@RyanDashkevicz Did you mean just the albedo or the full available PBR texture set? I did a comparison recently (haven't published the results as I had no time to put them together) and the baking quality in Metashape is certainly significantly lower than the baking quality from, let's say, the Substance Designer baker. It doesn't matter if you triple the size of the texture, it is just way more glitchy. Metashape's baker simply isn't as good as other, dedicated bakers.

    • @RyanDashkevicz
      @RyanDashkevicz 7 months ago +1

      @@GrzegorzBaranArt Last I checked, it just reprojects (with some filtering/intelligence) the textures from the camera views onto the mesh as a texture. You can then use a lower resolution mesh with the albedo to bake out the PBR texture set from that.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  7 months ago +1

      @@RyanDashkevicz Yeah, I wasn't talking about color/albedo projection, but about normal, height and ambient occlusion baked onto a lowpoly canvas. I did some tests recently using the same data sets, and the results I baked within a photogrammetry app were way more glitchy and of lower quality in comparison to those I got from the Substance Designer baker or xNormal.

  • @izoyt
    @izoyt 4 years ago +1

    Doesn't Metashape support the Mavic 2 Pro for barrel distortion?

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago +1

      Yes, it does, but Metashape loads images with the metadata settings they were captured with. So loaded images go in as they are, without any color calibration, without noise reduction, without any white balance correction etc. Since I use those images to reconstruct surface details, color calibration is a big part of this process. Color data after reconstruction in Metashape is reduced to 8 bits no matter how deep the color input was. So pure 3D shape reconstruction will work without any problems if processed directly from RAW data, and Metashape would correct the distortion, but from a color point of view it would be a terrible waste, since the color depth would be flattened to 8 bits. Hope that makes sense :)

    • @izoyt
      @izoyt 4 years ago

      @@GrzegorzBaranArt Yes, this all makes sense, thank you. Thanks and keep doing good work.

  • @Benjamin-Springer
    @Benjamin-Springer 4 years ago +1

    Hello Grzegorz, thanks for your super detailed video and explanations. One question: why do you use 8-bit footage for building the chunk and depth maps? Metashape works well with RAW data and handles more detail with 16-bit+. I added you on Facebook (Benjamin Springer); feel free to get in touch for more exchange of information and my PG experiences.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago +1

      Hey Benjamin, I covered this in the next video I made: ua-cam.com/video/k4btbj6IGhw/v-deo.html
      It comes down to the fact that Metashape and RC didn't support 16-bit data before - even if you used 16 bits, it was trimmed to 8. 16-bit support was introduced only recently.
      I talked about this in the next video, since I didn't know it when I was working on this one :). So of course you should use 16 bits.. it is going to make a big difference, especially for camera alignment, though it won't affect reconstruction quality much. I would recommend using a color-calibrated TIFF file instead of pure RAW. Hope that helps :). Cheers!

  • @vfxexpert1425
    @vfxexpert1425 4 years ago +1

    Hey, just a quick tip. You can actually land the drone in your hand. I always do this because I don't want to buy new props.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago

      I was landing my Tello on my hand all the time.. until once the wind pushed it and a propeller hit my fingers. Since the Tello is just a toy.. it just hurt and did no harm.. but I realised that if I tried to land a bigger drone and it was pushed by the wind the same way.. I would have fewer fingers :).
      So I decided that it is safer for me not to try it with a bigger drone. I totally agree it would be very useful to master hand landing.. I was even considering practicing it recently when I was in high grass.. but changed my mind at the last second, since the drone moved up after it detected my hand.
      But I agree.. it's a very useful skill.. also, I haven't heard of any Mavic 2 cutting off someone's finger.. or whether it was just very painful when the blade hit it :).. but I am still not confident enough to try. I am sure once I have no other choice I will :). Cheers and thanks for the tip!

    • @vfxexpert1425
      @vfxexpert1425 4 years ago +1

      @@GrzegorzBaranArt My Phantom once hit my fingers and it didn't even hurt. I wouldn't catch an Inspire though.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago

      @@vfxexpert1425 Thanks, it means I should practice some hand landing then.. even just to know how to do it when I really have to. Thanks for the advice.. I will wait for good weather without any wind and will see.. hopefully I won't record any story-driven video about how I lost my fingers ;P trying.

    • @dennisvolkerts
      @dennisvolkerts 4 years ago +1

      @@GrzegorzBaranArt I'm new to the Mavic 2 but do it almost every time.
      Just stand behind your drone. It first sees your hand when you put it underneath, and will move up a bit... just hold your hand there flat and don't grab it... push the remote stick down and it will land in your hand without any danger. (ua-cam.com/video/hLiOB3W9XtQ/v-deo.html - this is the Mini, but the Mavic is even better and just lands without going up that much)
      It works all the time.
      Never try to turn the Mavic upside down like he does.... I tried that but it didn't shut off and kept fighting while upside down.

    • @GrzegorzBaranArt
      @GrzegorzBaranArt  4 years ago +1

      @@dennisvolkerts Thanks man, but I have found a way better way to hand-land the Mavic 2 Pro. I guess I should record a video to share it with the community.. it's freaking awesome.. works great, never had any issues with it.. even during a strong wind.. no running away from the hand, no drone flipping afterwards.. just a soft, clean hand landing :). If you think it would be worthwhile and would benefit anyone if I record this video, just let me know. Cheers!