This is my last depth rant

  • Published 15 Oct 2024

COMMENTS • 105

  • @JDLeeArt · 1 day ago · +66

    We need to try this on some 360 video mapped to sphere...

    • @wozniakowski1217 · 22 hours ago · +5

      I had the same idea! Though I think those AI depth-map algorithms could have a hard time deciphering depth from a 360 video, since it's a very wacky projection and they weren't trained on that. But I need to see it nonetheless

    • @harry1010 · 14 hours ago

      @@wozniakowski1217 Because I cbf looking through the literature, I remember seeing a couple of approaches. One involves figuring out what kind of 360 camera people use to correct the distortion; the other tries to convert the segmenting model (which passes over the image and is used to guess depth) to handle a curved plane instead of a flat one. So yeah, definitely an area of research!

  • @InterPixelYoutube · 1 day ago · +39

    Ian Hubert will have fun with these tools 100%

  • @khalatelomara · 1 day ago · +52

    The deal breaker for this method is whether we can actually get a non-shrinking depth for a room; if you try it on a room in perspective, it kind of goes into a curved trapezoid shape

    • @xabblll · 1 day ago · +4

      I'm not sure, but it should be possible to apply some exponential correction to the depth. Usually in CG graphics, when we want to save linear depth, we compress it to a smaller range with a logarithmic conversion, so objects closer to the camera have more depth information compared to objects further away. So to get linear depth back, we need to apply the reverse conversion.
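
The log-to-linear round trip this comment describes can be sketched in Python. The `near`/`far` clip distances and the function names here are illustrative assumptions, not values from the video:

```python
import numpy as np

# Hypothetical sketch of logarithmic depth encoding and its inverse.
# near/far are assumed clip distances for this example only.

def compress_depth(linear_depth, near=0.1, far=100.0):
    """Map linear depth in [near, far] to [0, 1], spending more of the
    range on objects close to the camera (logarithmic conversion)."""
    return np.log(linear_depth / near) / np.log(far / near)

def decompress_depth(log_depth, near=0.1, far=100.0):
    """Reverse conversion: recover linear depth from the log encoding."""
    return near * (far / near) ** log_depth

d = np.array([0.1, 1.0, 10.0, 100.0])
print(compress_depth(d))                    # decades map to evenly spaced values
print(decompress_depth(compress_depth(d)))  # round-trips back to d
```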

    • @khalatelomara · 1 day ago

      @@xabblll Exactly. Most monocular depth techniques are pretty good for close objects, but when it comes to perspective it breaks down quite quickly

    • @merseyviking · 1 day ago · +1

      @@khalatelomara So you mean make the doorframe the same relative size as the objects near the camera? Well if you know the parameters of the capturing camera, you could easily apply the inverse perspective transform on the depth map, and retrieve the original scale. You would need to know the aspect ratio and FOV, and if you wanted real-world scale you'd have to measure something in the real world scene and scale the resulting geometry appropriately. It might take some faff because the near and far clip planes generated by the ML algorithm are arbitrary (well I presume they are, or at least are rough guesses).
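
The unprojection this comment describes can be sketched as follows, assuming a simple pinhole camera model; the FOV and aspect-ratio values are made up for illustration, not recovered from any real footage:

```python
import numpy as np

# Hedged sketch: given a depth map plus the camera's vertical FOV and
# aspect ratio, recover 3D points so distant objects keep their true
# relative size (the inverse of the perspective projection).

def unproject(depth, fov_y_deg=60.0, aspect=16 / 9):
    h, w = depth.shape
    tan_y = np.tan(np.radians(fov_y_deg) / 2)
    tan_x = tan_y * aspect
    # Normalized device coordinates in [-1, 1] for every pixel centre
    xs = (np.arange(w) + 0.5) / w * 2 - 1
    ys = 1 - (np.arange(h) + 0.5) / h * 2
    u, v = np.meshgrid(xs, ys)
    # Scaling by depth undoes the perspective divide: far pixels spread out
    x = u * tan_x * depth
    y = v * tan_y * depth
    return np.stack([x, y, depth], axis=-1)

pts = unproject(np.full((4, 4), 2.0))  # a flat wall 2 units away
print(pts.shape)  # (4, 4, 3)
```

For real-world scale you would still need to measure something in the scene, as the comment notes, since the model's depth range is arbitrary.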

  • @stoef · 1 day ago · +42

    FYI you can open a command prompt for a specific folder by clicking into the path at the top of the explorer and then just typing cmd and pressing enter.

    • @ClaytonOrgles · 1 day ago · +1

      Yep! Also if you have Windows Terminal installed, you can right click on the window and select "open in terminal"

  • @sugar_ltd · 1 day ago · +19

    Dude that's wild. You're right that the applications are plenty.

  • @zackmercurys · 1 day ago · +30

    in the end, you could de-light your scene using the Ian Hubert's trick, to make it absolutely de-lightful!

    • @Dude_Blender · 1 day ago · +3

      What trick?

    • @omgbutterbee7978 · 1 day ago · +12

      @@Dude_Blender Dividing projected textures by the light values of an HDRI that was taken at the same place lets you flatten the image and remove shadows and highlights. InLightVFX had a good video called "How Ian Hubert Hacked VFX (and you can too!)" that goes over the whole process. It's REALLY cool.
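
The de-lighting division described here can be sketched as a per-pixel divide. The arrays below are stand-in values, not real footage or a real HDRI render, and `delight` is a made-up name for this sketch:

```python
import numpy as np

# Minimal sketch of the de-lighting trick: divide the shot by an image of
# the lighting (e.g. rendered from an HDRI captured on location) to cancel
# shadows and highlights, leaving an approximation of flat albedo.

def delight(photo, lighting, eps=1e-6):
    """Per-pixel division; eps avoids dividing by zero in black regions."""
    albedo = photo / (lighting + eps)
    return np.clip(albedo, 0.0, 1.0)

photo = np.array([[0.2, 0.8]])      # one shadowed pixel, one lit pixel
lighting = np.array([[0.4, 0.8]])   # light intensity at those pixels
print(delight(photo, lighting))     # both recover the same ratio-based albedo
```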

    • @Dude_Blender · 1 day ago · +3

      @@omgbutterbee7978 thanks mate!

    • @MasqueArt · 1 day ago · +1

      @@omgbutterbee7978 You can fake the surrounding lights as well if you don't have an HDRI, but an HDRI is simpler, if you can make one.

    • @grinningtiki220 · 23 hours ago · +4

      Somehow Ian was doing that 18 years ago before hdri was even a thing. The man is a wizard.

  • @DQBlizzard_ · 1 day ago · +9

    Very in-depth video. I'll see myself out.

  • @ChrisKGallon · 17 hours ago · +1

    I love that you're on the Davinci Resolve/Studio Train

  • @AMTunLimited · 1 day ago · +3

    IIRC, this is a somewhat common thing to do in Davinci Resolve so you can use it as a mask for various adjustments

  • @gordonbrinkmann · 1 day ago · +13

    "If this can be done with a photo, why not do depth on a video?" As far as I know DaVinci Resolve can do this quite easily (at least in the Studio version), so I would assume some other video editing software can do this as well...?

    • @tomcattermole1844 · 1 day ago · +3

      Would you be surprised if Adobe didn't? 😢
      Only workaround I've found is exporting footage as a PNG sequence and then running a batch in Photoshop using the blur neural filter with depth output checked. Is it janky? Yes. Is it accurate? No. Does it work? Barely. Am I an idiot for still using Adobe? Absolutely.

    • @gordonbrinkmann · 1 day ago

      @@tomcattermole1844 No, why should I be surprised...? Never thought Adobe was the ne plus ultra. Maybe there is other software out there than just Adobe? I don't know, I'm not using all video software that exists... I just said DaVinci Resolve can do depth on a video and it probably isn't the only software.

    • @tomcattermole1844 · 1 day ago

      @@gordonbrinkmann Unfortunately Blackmagic knows exactly what their customers want and puts in the effort to implement it. Most other software either doesn't have customers asking for these features or the vendors don't want to put in the effort to implement them.

  • @one_stz · 1 day ago · +1

    huh??? THE LAST??? I NEED MOREE

  • @toastbrot97 · 20 hours ago · +1

    I've been using a similar method for image-to-video for a while now. Essentially doing small camera pans and dollies into the scene to make it look a little fancier than just a scrolling 2D image. I was always wondering how stable the technique would be when done on a video instead, and I have to say, it doesn't look too shabby. I think if you're trying to relight the scene the artifacts will definitely be the biggest problem, but other than that it could be quite handy for some quick and dirty VFX, or for enhancing a video shot on a tripod with some subtle realistic 3D camera shake.

  • @jamus1217 · 1 day ago · +2

    That's pretty dang cool

  • @sikliztailbunch · 1 day ago · +4

    I don't understand why depth passes are white in the foreground. The background can extend virtually infinitely, as can positive float values. And a distance of zero having a color value of zero also makes more sense to me...

    • @AlessaBaker · 1 day ago · +2

      This is essentially why compression formats aren't great for this kind of thing: when you treat this data like a standard 8-bit image, you are confining the data to the min-max of that compression's range. If you store the depth map in a float32 image, for example, your range is then between 0 and the max of a 32-bit float.
      This is mostly how game engines handle and store coordinates in screen buffers. Since you can in theory store whatever you want in RGBA and interpret it any way you want, you could store object locations in RGB as X, Y, Z for example. You're no longer limited to 0-255.
      Hope that helps? :3
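
The quantization point above can be demonstrated numerically. The depth values below are arbitrary examples, and the 8-bit round trip is a sketch of what any standard image format does to the data:

```python
import numpy as np

# Storing depth in 8 bits confines it to 256 levels between the frame's
# min and max, while float32 keeps essentially the full value directly.

depth = np.array([0.0, 1.7, 523.25, 10000.0], dtype=np.float32)

# 8-bit storage: normalize to the min-max range and round to 0-255
lo, hi = depth.min(), depth.max()
as_u8 = np.round((depth - lo) / (hi - lo) * 255).astype(np.uint8)
back = as_u8.astype(np.float32) / 255 * (hi - lo) + lo

print(np.abs(back - depth))  # quantization error; a float32 EXR-style
                             # channel would have stored these exactly
```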

    • @merseyviking · 1 day ago · +2

      It's because the depth is normalized to the viewing frustum. So 0 is at the near clip plane and 1 is at the far clip plane. The reason the colour is white at the near plane is because usually the interesting stuff happens there, so it makes it easier to see. But that is just a colour ramp inverted wrt the normalized depth value.
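
The normalization and inverted ramp described here can be sketched in a few lines; the near/far clip distances below are assumed example values:

```python
import numpy as np

# Depth normalized to the viewing frustum: 0 at the near clip plane,
# 1 at the far clip plane, then inverted so near objects display white.

def normalized_depth(z, near=0.1, far=50.0):
    t = (z - near) / (far - near)  # 0 at near plane, 1 at far plane
    return np.clip(t, 0.0, 1.0)

def depth_to_display(z, near=0.1, far=50.0):
    # Inverted colour ramp: the "interesting" foreground reads as white
    return 1.0 - normalized_depth(z, near, far)

z = np.array([0.1, 25.05, 50.0])  # near plane, midpoint, far plane
print(depth_to_display(z))        # white, mid-grey, black
```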

    • @sikliztailbunch · 1 day ago · +1

      @@merseyviking that makes sense.

    • @sikliztailbunch · 1 day ago

      @@AlessaBaker Sounds reasonable. Thank you for the explanation

    • @merseyviking · 1 day ago · +2

      It might also explain the flickering: if there's little to no temporal coherence then the algorithm will be choosing different depths for the frustum planes, and so the same absolute depth is getting mapped to different normalised depths.
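
A toy demonstration of the flicker mechanism this comment describes, contrasting per-frame normalization with one global range for the clip (the depth values are made up):

```python
import numpy as np

# Two frames of raw depth; an object sits at absolute depth 4.0 in both,
# but the deepest point the model picks changes between frames (4.0 vs 9.0).
frames = [np.array([[1.0, 4.0]]), np.array([[4.0, 9.0]])]

# Per-frame normalization: depth 4.0 maps to 1.0 in frame 1 but 0.0 in
# frame 2 -- the same absolute depth jumps between values, i.e. flicker.
per_frame = [(f - f.min()) / (f.max() - f.min()) for f in frames]

# Global normalization over the whole clip: depth 4.0 maps to 0.375 in
# both frames, so the mapping stays temporally stable.
lo = min(f.min() for f in frames)
hi = max(f.max() for f in frames)
global_norm = [(f - lo) / (hi - lo) for f in frames]

print([p.tolist() for p in per_frame])    # [[[0.0, 1.0]], [[0.0, 1.0]]]
print([g.tolist() for g in global_norm])  # [[[0.0, 0.375]], [[0.375, 1.0]]]
```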

  • @ArdaHamamcoglu · 1 day ago · +2

    Isn't the top edge caused by the image being wrapped, so it's actually interpolating the bottom row of pixels?

  • @batleram2946 · 1 day ago · +8

    Fun fact, if you want to open command prompt in a specific folder, you can type cmd into the file explorer path, and it'll open cmd in the current folder

    • @InterPixelYoutube · 1 day ago · +1

      Your profile picture invokes so many great childhood memories (:

    • @slavsit7600 · 23 hours ago

      @@InterPixelYoutube same, do you by any chance remember the name of that game

  • @LifeFromAbove. · 19 hours ago

    Thank you so much this is Epic, I’ve been looking for a way to make 3D titles in Videos more realistic. I will give this a try!

  • @mven · 23 hours ago · +1

    Lots of people mentioning the cmd in the address bar trick, but did you know you can open a Powershell terminal via shift + right click context menu on any folder? Cmd is old and busted, Powershell is the new hotness.

  • @DONO_da_Razao_ABSOLUTA · 1 day ago · +2

    How can I use this to generate a live VR video where I can walk freely... like I was inside a movie?? The 1000 USD question.

    • @wlockuz4467 · 1 day ago · +1

      You'd need more information than just a single camera frame to make it feel like you're actually walking in another scene.

  • @MichaelProstka · 2 hours ago

    You meshed yourself! That is so cool!

  • @ralphmoreau2768 · 1 day ago · +1

    Very cool experiment, would love to see it in its true glorious fidelity

  • @thejaredwilcurt · 1 day ago · +2

    0:45 You can just click into the address bar in Explorer and type cmd and press enter and it will open the command prompt from that folder.

    • @undefined06855 · 22 hours ago

      or even just right click -> open terminal

  • @blainecodes · 1 day ago

    I was just experimenting with marigold and was thinking about trying out other models to see if they work better with video!
    Great video!

  • @aedanp07 · 1 day ago · +1

    YESS!! This will be so useful!

  • @reasonablyrad · 21 hours ago

    this is so niche and genius

  • @sparkbark7640 · 11 hours ago

    ok that's actually crazy

  • @Kozlov_Production · 1 day ago · +3

    Can you make a stereo video from usual video?

  • @agyab_cg · 16 hours ago

    this can be a great weapon for 3d artist

  • @markobozic-c46 · 1 day ago

    You sir are brilliant. Thank you for your brain.

  • @br11bronc34 · 16 hours ago

    I have a question: why don't you use directly the depth map generator included in davinci? is there any practical reason? very good video, thanks and best regards

  • @JCtheMusicMan_ · 1 day ago · +1

    Could this be used to make a green screen 3D studio backdrop? 😎

  • @cosmobawler · 1 hour ago

    I tried to follow this video not being overly experienced with Blender, however when I got to around 2:06 in the video, my image isn't showing any colour and is just the greyscale model, am I missing something? I feel like it's probably just a button I've accidentally tapped or something lmao

  • @simonabunker · 1 day ago

    Very nice! Is it possible to generate 16/32 bit float depth images?

  • @Abreu3dfx · 7 hours ago

    what is the difference between this and the switchlight?

  • @PedroScherz · 1 day ago · +1

    It's your latest* depth rant. We all know this isn't over

  • @Ricoxemani · 1 day ago · +1

    this is cool, but the problem with the geonodes setup is that it does not account for the perspective of the video. The geometry should get larger as it gets further away.

  • @BoyceBailey · 1 day ago

    And that's how Minority Report videos got started.

    • @uriinbar6046 · 1 day ago

      my thought exactly. amazing that we are witnessing that future materialize

  • @jinchoung · 1 day ago

    I wonder if this will be good enough to get rid of keying and/or rotoing....

    • @vfx.360 · 1 day ago

      No, it looks horrendous and complicated. He just spams semi-professional videos about topics no one really makes videos on. But at conventions this year (FMX for example) I saw more impressive use of AI and code for depth and modelling. So this won't be useful unless packed inside a program, which he probably can't do.

  • @Gonicoxdofficial · 1 day ago · +1

    My question is why you are not at the Blender Conference 🙃

  • @William-Stark · 17 hours ago

    what about the depth-map in davinci resolve?

  • @staticdrama · 1 day ago

    you open new things! thanks

  • @nm-com · 1 day ago · +4

    You can simply use DaVinci Resolve's depth map filter to avoid the quirky workflow of downloading cryptic packages that are driven via the prompt and use tons of RAM.

  • @bradleywilkie · 12 hours ago

    I feel that @CorridorCrew should see this.

  • @Al_KR_t · 1 day ago

    Hi! Is there any way to use webcam feed as a texture? I've been looking for it for a very long time and couldn't find anything, I feel like you would be the person who knows, haha

    • @zurasaur · 1 day ago · +2

      You want to map a live webcam feed to a texture on a model in blender? May I ask why 😂😂😂

    • @zurasaur · 1 day ago · +3

      Was just thinking, and my guess is that you have some sort of realtime Eevee setup and you wanna map a live input to something like a TV screen haha
      I would personally go for Unreal Engine instead for this specific use case, but I'm here waiting to see if the genius knows a way

    • @NotSoMuchFrankly · 1 day ago

      It appears Depth-Anything took about 2-3 minutes to get about 8-9 seconds of depth maps so it doesn't appear it's ready for real-time.

  • @SceneOfAction · 18 hours ago

    It's kinda neat you can just check in on geniuses doing genius stuff. noice.

  • @Derpduck. · 1 day ago · +2

    This was a pretty "in depth" rant Mr.Matter.

  • @o1ecypher · 1 day ago

    so you can make a video into a 3d video model? or 3d model video? What... my mind just broke... next episode??? interacting with object like the square i put in the scene? turn scene into mesh and run simulations ???

  • @ilanlee3025 · 1 day ago

    Great video, appreciated

  • @cocamaster0 · 1 day ago

    please don't flash the screen

  • @crvvv · 16 hours ago

    this is crazy

  • @_Jude-St.-Francis_ · 1 day ago

    Omg you're literally my crush you're so cool❤

  • @HalcyonSmyle · 1 day ago

    brilliant work

  • @f4r3l0 · 22 hours ago

    Awesome video!

  • @letsexplodetogether5667 · 16 hours ago · +1

    dude this is the same as your eyes inferring the depth, but via computer. If you want GOOD depth maps, just get a camera/sensor that supports it.

    • @letsexplodetogether5667 · 16 hours ago

      Or just two cameras for stereo vision. Fact is, the """depth""" caps out at such a near distance, varies due to outside mechanics, and has no fixed start and end for depth measuring.
      It's just not good for any use other than saying "Wow 3D so cool!!!!!11!1!!!"

    • @CygnusXUno · 9 hours ago

      @@letsexplodetogether5667 That seems like an extreme view; it is useful for subject separation and background separation. Feel free to list out the uses of your two-camera setup and judge whether they could instead use this easier workflow, where cost and ease of setup are drastically different. It really feels like you are downplaying how competent these are at depth mapping; depending on the application you might want something better, of course, but people here are not after big-budget CGI, they want what works with the tools they have. Your comment implies you would only use this on footage shot with depth capture in mind, rather than on pre-existing footage or photos. And the depth it caps at is adjustable when running the command.
      "Just get a camera that supports it" does not feel very constructive.

  • @alirezaakhavi9943 · 1 day ago

    amazing! thanks man

  • @deleted_handle · 21 hours ago

    this is heckin cool

  • @voyageruk2002 · 1 day ago

    Isn't 2048 2k?

  • @bartoszrozmus1585 · 1 day ago

    you are the best!

  • @FabrizioAscari · 1 day ago

    This video would still be very interesting if it was 15 minutes long instead of 7...

  • @neorientalist · 1 day ago

    Just wow

  • @Zhincore · 22 hours ago

    is the voice AI? sounds weird

  • @monke1709 · 1 day ago

    You smart

  • @ASchnacky · 1 day ago

    seems kinda shallow

  • @arch.blender1178 · 5 hours ago

    so cool

  • @asterion2499 · 1 day ago

    dude sounds like AI generated

  • @linohype · 1 day ago

    for someone like me not being a super-pro, all your fast cuts make it very hard to follow the actual workflow.

  • @YHK_YT · 19 hours ago

    awesome

  • @user-iu9yw8nc5d · 1 day ago

    nice

  • @gradmongur · 1 day ago

    why are you using an ai voice of your own voice?

  • @pile333 · 1 day ago

    Let's think if this were used by A.I. in video creations. That would be super awesome.

  • @b.eanimationsandotherstuff1536

    ;)

  • @importon · 1 day ago

    mid

  • @treakzy_9594 · 1 day ago

    this is crazy