Blender's Cycles vs. Eevee (Ray Tracing vs. Real Time)

  • Published 11 Aug 2017
  • Eevee is Blender 3D's brand new real-time rendering engine coming in the new Blender 2.8 release ... but how is it any different from Cycles? Which is better? When would you use Eevee? We'll answer these questions and more in this video.
    ⭐ Streamline Your 3D Workflow! ⭐
    Remington Creative specializes in improving your skills and abilities as a 3D artist. We help you achieve your goals with top-of-the-line Blender 3D tutorials, game-changing software experiences, and growing communities of creatives.
    ✏️ CREATE with Remington Creative!
    • Software » remington.pro/software
    • Resources » remington.pro/resources
    🔗 CONNECT with Remington Creative!
    • Join our Discord » remington.pro/discord
    • Follow on Instagram » / remi_creative
    • Follow on Twitter » / remi_creative
    • More Communities » remington.pro/community
    We ❤️ Blender! fund.blender.org/

COMMENTS • 290

  • @JustinGoran
    @JustinGoran 6 years ago +351

    Not to nit-pick, but just to make sure misinformation isn't being spread:
    Cycles is a path tracer, not a ray tracer. The big difference between the two is how they calculate light. While both fire rays from the camera, a ray tracer's rays don't actually bounce around the scene: each ray travels until it intersects an object, and then rays are fired at all the lights in the scene to calculate the value of that pixel. This means a ray tracer can only trace direct light (which is why ray-tracing engines had to do a second calculation for GI).
    A path tracer, on the other hand, traces a path from the camera to a light source: it fires a ray from the camera, the ray hits a material, and based on that material's properties it bounces in a new direction. This process is repeated until the ray either reaches the bounce limit or hits a light source.
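
    To make the distinction above concrete, here is a minimal sketch of the path-tracing loop the comment describes, in Python. The scene/material API (`scene.intersect`, `hit.is_light`, `hit.material.scatter`, etc.) is purely illustrative, not Cycles' actual code.

    ```python
    MAX_BOUNCES = 8  # the "ray-bounce limit" mentioned above

    def trace_path(ray, scene):
        """Follow one camera ray until it reaches a light or runs out of bounces."""
        throughput = [1.0, 1.0, 1.0]            # colour energy surviving the bounces so far
        for _ in range(MAX_BOUNCES):
            hit = scene.intersect(ray)          # closest surface along the ray, or None
            if hit is None:
                return scene.background(ray)    # the ray escaped the scene
            if hit.is_light:
                # Path reached a light source: return its emission tinted by the path.
                return [t * e for t, e in zip(throughput, hit.emission)]
            # Absorb some energy and pick a new direction based on the material.
            throughput = [t * c for t, c in zip(throughput, hit.material.albedo)]
            ray = hit.material.scatter(ray, hit)  # e.g. diffuse bounce, mirror reflection
        return [0.0, 0.0, 0.0]                    # bounce limit hit without finding a light
    ```

    A plain ray tracer would instead stop at the first hit and fire shadow rays straight at the lights, which is exactly the "direct light only" limitation described above.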

    • @pfannkuchengesicht42
      @pfannkuchengesicht42 6 years ago +26

      A path tracer is a kind of ray tracer. What you describe as a ray tracer is usually called a "forward ray tracer". Ray tracing alone only describes the fact that rays are traced to find an intersection.

    • @potatosalad5355
      @potatosalad5355 6 years ago +1

      Why don't you talk about biased and unbiased renderers...?

    • @Argoon1981
      @Argoon1981 6 years ago +5

      Of course a path tracer is a ray tracer; if rays are being employed, then it is a ray tracer, just a more complex one compared to old, simplistic ray tracing.

    • @D0NTREPLY
      @D0NTREPLY 5 years ago

      I know Weta Digital uses a real-time ray tracing engine.

  • @lawrencedoliveiro9104
    @lawrencedoliveiro9104 6 years ago +106

    1:44 Note that the light rays are traced backward from the camera, not forward from the light source(s). This way you don’t bother following lots of extra light rays that never reach the camera.

  • @sdhpCH
    @sdhpCH 6 years ago +378

    What about using Eevee while creating the scene and using Cycles for the result? Possible?

    • @earomc
      @earomc 6 years ago +67

      sdhpCH that's how it works at the moment.

    • @RemingtonCreative
      @RemingtonCreative  6 years ago +210

      +sdhpCH Certainly possible! This is one of the main intentions of Eevee!

    • @sdhpCH
      @sdhpCH 6 years ago +30

      Thx. Then I think I got the concept :) I was just unsure whether one or the other messes something up so that you can't just switch between them without making changes, etc.

    • @sudd3660
      @sudd3660 6 years ago +19

      This was my hope too, that Eevee would just be an addition to the viewport options, between rendered view and material.

    • @user-pw5by5jw8p
      @user-pw5by5jw8p 6 years ago +18

      If you stick to the PBR model, using only the PBR shader and textures to manipulate that shader (which allows you to do pretty much anything you want), you'll have compatibility with pretty much every modern rendering engine, including Cycles (Principled BSDF / PBR node groups) and Eevee (built to be a PBR renderer from the start).
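
      As a rough illustration of that shared-PBR workflow, here is a small Blender Python sketch (2.8x-era API names; the colour values and the assumption that a mesh object is active are just for the example):

      ```python
      import bpy

      # One node-based material built around the Principled BSDF, so it reads
      # the same way in Eevee and in Cycles.
      mat = bpy.data.materials.new(name="SharedPBR")
      mat.use_nodes = True
      bsdf = mat.node_tree.nodes["Principled BSDF"]      # default node of a new material
      bsdf.inputs["Base Color"].default_value = (0.8, 0.2, 0.1, 1.0)
      bsdf.inputs["Metallic"].default_value = 1.0
      bsdf.inputs["Roughness"].default_value = 0.4

      # Assign it to the active object; switching scene.render.engine between
      # 'BLENDER_EEVEE' and 'CYCLES' needs no material changes.
      bpy.context.object.data.materials.append(mat)
      ```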

  • @matthewcarson9707
    @matthewcarson9707 6 years ago +20

    I read all the comments so far, and here is a summary (as I understand it myself).
    -Eevee will be the new high-end way to edit your scene, with it looking as close as possible to the final render (but not exactly).
    -Eevee uses real-time graphics processing based on the OpenGL library, the same thing used for the old viewport but much, much more sophisticated, and it will also allow you to layer rendering types in the viewport, unlike Cycles. You will be able to draw a wire-frame object within a full 'Eevee' scene. Right now, when the viewport is in 'Rendered' mode (Cycles), you basically can't edit anything because you can't see it! You can edit blindly if you like to punish yourself.
    -Eevee is basically doing the same thing as Unreal Engine, although the specifics and quality choices will vary. In that case Eevee will provide a better preview of performance within a game engine like Unreal.
    -Eevee would be the basis of a new Game Engine (if they put one in 2.8 or later). That would be an alternative way to make a game instead of Unreal or another engine, although there are many, MANY reasons why sticking with a commercial game engine is preferable (and also why there is some doubt about whether the Blender Game Engine will be revived after 2.8).
    -Cycles is offline, meaning calculations aren't real-time. That also means Cycles can use as much time as your patience will allow for more complex calculations, and that will *always* give better results than trying to do all the calculations in real time.
    That being said, real-time engines like Unreal are getting better and better. Eventually it will be hard to tell the difference between a full rendering method and a real-time method.
    -Eevee would be the method used to capture a quick animation in the same way an 'OpenGL' render does now (since Eevee is OpenGL anyway; see the snippet below).
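
    For that last point, the quick 'OpenGL' capture is already scriptable today; a minimal sketch (assuming an animation is set up, and the output path is just an example):

    ```python
    import bpy

    # "Playblast"-style capture: draw the viewport (OpenGL) for every frame of the
    # animation instead of invoking the full render engine.
    scene = bpy.context.scene
    scene.render.filepath = "//preview_"   # example path, relative to the .blend file
    bpy.ops.render.opengl(animation=True)  # Viewport Render Animation
    ```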

    • @TimeoutMegagameplays
      @TimeoutMegagameplays 6 years ago

      Matthew Carson You can actually edit while using Cycles, but you won't be in render preview mode; you would normally use the material, textured, or solid mode, which is a preview very close to the Blender Render preview. It works like Eevee, but it's actually working with the Cycles materials and such, so you don't need to remake materials or anything like that.

    • @TimeoutMegagameplays
      @TimeoutMegagameplays 6 years ago

      Matthew Carson And very good work collecting information from the comments 👏

    • @ciaranpmryan
      @ciaranpmryan 5 years ago +1

      Pretty much this.
      I waited long enough and I just downloaded 2.8 last week. I’ve been trying a few wee test scenes and my experience is to work and build the shaders in EEVEE and then render in cycles. The difference is day and night, but the trick is to have the confidence that once it looks good in EEVEE then it’ll be awesome in Cycles. The first few renders I did were all in EEVEE thinking that it was great. I then just decided to render in cycles with the default settings, not expecting much, but holy moly, it blew my socks off. I had never had anything look so good, so quickly. Building shaders and node trees will be a dream for me now with EEVEE.

  • @richardcopperwaite4333
    @richardcopperwaite4333 6 years ago +91

    This video is mostly accurate, but as a graphics dev, just wanted to help clarify a couple of things.
    For starters, I would use the term Rasterizers to describe what you call Real Time engines, because most of the other aspects of the Real Time process are done by both approaches anyway - both can be executed on the GPU to take advantage of its parallel (bulk data) processor, and both take texture, shader and material information into account when producing their final images.
    But, where the Ray Tracer calculates the path of rays through a virtual space, and applies the material rules based on where each ray has struck each object, a Rasterizer instead takes that virtual space, re-orients it in the direction of the camera, flattens it from 3D into perspective 2D, and uses the material rules to fill the 2D outline of each object. Because the Ray Tracer more closely simulates the transport of light, it automatically benefits from accurate shadows and reflections, whereas the Rasterizer doesn't - so it needs to fake them by running extra passes over the scene to generate shadow textures for each important light source, along with a bunch of other magic tricks.
    The main reason that a renderer like Eevee is possible nowadays instead of 5-10 years ago, is IMO because of Physically Based Shading, which is a simple but really smart new way to define Rasterizer materials without giving them impossible amounts of shine or color absorption :)
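
    To illustrate the "flatten 3D into perspective 2D" step at the heart of a rasterizer, here is a toy Python sketch (a fixed camera looking down -Z, no rotation or clipping planes — an assumption made purely to keep it short):

    ```python
    def project_vertex(v, focal_length, width, height):
        """Flatten one 3D point into 2D screen coordinates (the core rasterizer step)."""
        x, y, z = v
        if z >= 0.0:
            return None                              # behind the camera, clipped away
        # Perspective divide: farther points shrink toward the screen centre.
        sx = focal_length * x / -z + width / 2.0
        sy = focal_length * y / -z + height / 2.0
        return sx, sy

    # Triangles are projected vertex by vertex, their 2D outlines filled and shaded;
    # shadows and reflections then need the extra passes described above.
    print(project_vertex((1.0, 0.5, -4.0), 800.0, 1920, 1080))
    ```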

    • @RemingtonCreative
      @RemingtonCreative  6 years ago +12

      +Richard Copperwaite I was hoping to get a response like this! A first person account always beats research. Thanks :)

    • @TimeoutMegagameplays
      @TimeoutMegagameplays 6 years ago +2

      Richard Copperwaite lol, real-time rendering without PBR has existed for a long time, and so has PBR; Eevee should have been possible before, because Cycles actually supports non-PBR workflows such as spec/gloss.

  • @PaulGuevara
    @PaulGuevara 6 years ago +2

    Great video Grant; good overview of the differences between Cycles and Eevee, nicely broken down :)

  • @tmkdigital6361
    @tmkdigital6361 6 years ago +2

    Sometimes those definitions can be hard to understand without examples; this video cleared up my confusion, thanks Grant! 😀

  • @AlimayoArango
    @AlimayoArango 6 years ago +3

    Amazing and excellent video :D

  • @knampf9779
    @knampf9779 5 years ago +33

    It's funny how you created an animation showing how cycles renders in Blender ...in Blender.

  • @SteelBlueVision
    @SteelBlueVision 6 years ago +25

    We really need some sample scenes and how they render with both renderers to compare and contrast them.

    • @aandre311
      @aandre311 6 years ago +3

      There are many videos showing the difference side by side, even comparing it with Unreal 4; Eevee can deliver Unreal 4-level graphics, but it is nowhere close to the photorealism you can get from Cycles.

  • @moldysann7083
    @moldysann7083 6 years ago +1

    Thanks a lot for the info! Very helpful indeed

  • @creedolala6918
    @creedolala6918 6 years ago +9

    "Here ya go!"
    "Thanks GPU Buddy!"

  • @virtual_intel
    @virtual_intel 5 years ago

    Nice way of breaking down the technical differences between both render engines in Blender 2.8 Beta. I like having both options; I will simply need to create different lighting setups for each, as the lighting shifts drastically when you go from EEVEE to Cycles and back the other way. I think duplicating all my lights and labeling them for each engine after making adjustments will do the trick, plus using the tiny hide (eye) toggle via the outliner/collections.

  • @ThePir869
    @ThePir869 4 years ago

    I wish I could give this explanation 2+ thumbs up. Great job sir.

  • @yoanfalquet4636
    @yoanfalquet4636 3 years ago +3

    Three years later, how have these two rendering engines evolved? Have there been any major changes? It could be interesting to have a new video on this subject, with updates on the two rendering engines. Anyway, thanks for making this video; I'm switching from C4D to Blender, and it helps me see things more clearly!

  • @BillyBuntin
    @BillyBuntin 5 years ago

    hugely helpful. Thanks bro.

  • @xanestudios
    @xanestudios 6 years ago

    smart geek
    Thanks a lot for explaining
    I looove the sampling effect outro
    really creative

  • @TheHorreK2
    @TheHorreK2 5 years ago

    That was really good to know, thanks!

  • @katharsisherbie3413
    @katharsisherbie3413 5 years ago +2

    Very informative, thank you! Off topic, but I have to ask: I love the music in the background starting at 8:14 — what's the name?

  • @user-pw5by5jw8p
    @user-pw5by5jw8p 6 years ago

    Wow that visualization is really impressive!

  • @BlankerWahnsinn
    @BlankerWahnsinn 1 year ago +1

    An updated comparison video would be very interesting, like how the render engines have improved.

  • @virtual_intel
    @virtual_intel 5 years ago

    Oh, and one other thing I found to be a huge difference between the two is the errors created in the Blender 2.80 beta when using Ctrl-Z to undo. Sometimes all my textures vanish into thin digital air after pressing Ctrl-Z or the undo option. Cycles seems to handle such an error without causing any issues, but EEVEE tends to have a huge problem with this. My temporary fix is to turn the Node option off and back on via the Shader Editor. After doing so my textures reappear, but not all the time. Other times I have to add them back in manually from the actual diffuse maps via the Shader Editor, aka the node editor. Such a pain, but Blender 2.8 isn't a full release yet, so I guess we have to deal with buggy glitches like this. Hopefully it won't happen once the full release gets launched.

  • @vangoumazg
    @vangoumazg 3 years ago

    very informative, great job!

  • @rathernotdisclose8064
    @rathernotdisclose8064 6 years ago +1

    Sounds like the way to go is to check your render with the real-time engine first, and then do your final render with Cycles.

  • @aryankulkarni6066
    @aryankulkarni6066 5 years ago

    thanks, man. helped a lot.

  • @waltage
    @waltage 6 years ago

    The key thing is to unite both technologies into one smooth process between real time and photorealism.

  • @shitshow_1
    @shitshow_1 3 years ago

    Great Explanation : )

  • @mainecoon6122
    @mainecoon6122 6 years ago

    very helpful vid, thx

  • @cevenseven6262
    @cevenseven6262 6 years ago

    I'm thinking of visualizing sound bounces in a very similar way to this — do you think you could do a tutorial on how to set up those firing particles?

  • @quizcanners
    @quizcanners 6 years ago

    I know more about rendering and shaders and less about Blender specifically. But I think a better way to explain it would be: while ray tracing calculates light the way you showed — by tracing many possible light routes — real-time rendering relies on a set of methods to simulate individual aspects. Shadows are calculated separately and later multiplied by the light's color; reflections are produced from a texture containing a sort of 360° picture (a cube texture), plus you can use screen-space reflections; the depth buffer can be used for ambient occlusion; diffuse static lighting can be baked. These are all optional, and there are often different methods for each of them — whereas ray tracing alone would produce all of these elements, because they are results of light's natural behavior, which ray tracing emulates. Real time uses math formulas that predict how bright the light will be at a given fragment depending on position, angle to the camera, distance to the light source, smoothness, and so on.
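
    As a toy version of those "math formulas evaluated per fragment", here is a generic Lambert + Blinn-Phong style shade function in Python — an illustration of the idea, not Eevee's actual BSDF code (all vectors are assumed to be normalized 3-tuples):

    ```python
    import math

    def shade_fragment(normal, light_dir, view_dir, light_color, albedo, smoothness):
        """Per-pixel lighting from a formula: no rays are traced anywhere."""
        dot = lambda a, b: sum(x * y for x, y in zip(a, b))
        # Diffuse: brightness depends only on the angle between surface and light.
        diffuse = max(dot(normal, light_dir), 0.0)
        # Specular: the half-vector trick approximates the mirror-like highlight.
        half = [l + v for l, v in zip(light_dir, view_dir)]
        length = math.sqrt(dot(half, half)) or 1.0
        half = [h / length for h in half]
        specular = max(dot(normal, half), 0.0) ** (1.0 + smoothness * 255.0)
        return [lc * (a * diffuse + specular) for lc, a in zip(light_color, albedo)]
    ```

    Shadow maps, cube-map reflections, SSAO and baked lighting are then layered on top of this basic formula — the per-feature methods listed in the comment above.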

  • @yagz7820
    @yagz7820 4 years ago +7

    I still don't get how some game engines' renders look similar to Cycles, but in real time?

    • @shamaiahellis8810
      @shamaiahellis8810 3 years ago +3

      It's all a matter of skill; if you're skilled you can make Eevee look very close to Cycles.

    • @Byefriendo
      @Byefriendo 3 years ago +7

      While real-time engines can get close to Cycles, it involves a lot of artistry and trickery: placing irradiance volumes in the right places, tweaking lights to unrealistic values, adding fake lights for bounce lighting, etc. With Cycles everything just works out of the box the way it is supposed to, and it is more accurate than even the best real-time approximation. The price you pay for this is render time.
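
      For reference, the irradiance-volume step mentioned above can be scripted too; a minimal sketch using 2.8x/3.x-era operator names (the probe size and location are placeholder values):

      ```python
      import bpy

      # Eevee needs help with bounce light: add an irradiance volume probe and bake
      # indirect lighting, which is part of the "trickery" described above.
      bpy.ops.object.lightprobe_add(type='GRID', location=(0.0, 0.0, 1.0))
      bpy.context.object.scale = (4.0, 4.0, 2.0)   # cover the area that needs bounce light

      bpy.context.scene.render.engine = 'BLENDER_EEVEE'
      bpy.ops.scene.light_cache_bake()             # bake Eevee's indirect lighting
      ```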

  • @3polygons
    @3polygons 6 years ago

    So.... I hope I am not wrong..... I guess this is a way to preview your scene, the way 3D packages have always allowed in their textured 3D mode (Max, Maya, etc.) — just much better, although I guess this kind of thing is being implemented in every package, or will be — and which Blender already allowed to some extent, just with a lot more accuracy than what we have now in textured mode, fully accounting for the newer game texturing workflows, PBR workflows. If so, there are 3 HUGE benefits that I can see for now, despite knowing it is just an approximation of what you will see in a final rendered image:
    - If you are an artist producing assets (even entire characters modeled, with normal maps and all the material stuff) for triple-A games, or producing content for indie games using PBR workflows (quite a few), be it objects, characters, etc... you just got a very good previewer like the one in Substance Painter. So no more blindly guessing while in the modeling stage if you like to produce a bit of everything at a time, or if you are not planning to export to Substance but to do it with Blender and other tools. This would be a great addition (and clearly Blender keeping up with the times for an entire large industry, not only for rendering) for high-end game art production (I am not speaking about making games inside Blender, but producing art for AAA games, or the higher-end indie games of today).
    - If you are rendering images, it can speed up your scene building, material fine-tuning, lighting, etc., a lot, thanks to an instant previewer. All to an extent, as the results from Cycles would be quite different. Still, I see it as very practical and useful for speeding things up. And anyway, my scene materials and lights in preview mode in Max, Maya, XSI — at least in the past — were never even close to a render. (Again, it's been a while since I used those instead of Blender.)
    - If you are animating, it can be very helpful for fast previews very close to the rendered production in its main raw look, not just some OpenGL grey solids or plain textures. That also saves a lot of time.
    Is all of that so?
    I see an important win in this new thing. I've been hoping for this in Blender almost since 2002...

  • @homeschoolerzrock4813
    @homeschoolerzrock4813 4 years ago

    Thank you so much!

  • @manojpandeymsp4018
    @manojpandeymsp4018 1 year ago

    I really liked your explanation. I am working on a camera coverage simulation where I need to calculate the volume of empty space that is covered by the camera. That is, there can be obstacles that hinder the camera view, and I need to calculate the volume of a room (let's say) that is viewed by the camera. Can you give me some suggestions on how to approach this simulation using Blender?

  • @FloMoonYeah
    @FloMoonYeah 6 years ago +1

    I think the arrival of EEVEE is wonderful news for most 3D noobs (like me): it gives you the opportunity to learn 3D modelling, UV mapping, texturing and shaders, animation, lighting, etc... seeing the results of your work in real time, which is the best way to "know what you are doing", compared to Cycles, where you spend most of your time trying to figure out what the materials are doing without being able to see it.
    Also, biggest difference: achieving something "nice" in EEVEE will require a much cheaper rig than what you'd need with Cycles.
    I have a GTX 1050 with only 2 GB of VRAM, and there is no way I can upgrade anytime soon; without EEVEE I'm stuck with painfully long render times for very simple scenes (a ball bouncing in a room), which makes the Blender learning curve reaalllly flat.
    Another thing to consider: if, like me, you're learning Blender and want to export your work to Unity, this means no more exporting every minute to see the result of a single change, and that is the best part for me.
    I really hope this is released soon; the Blender Foundation rocks!

    • @TimeoutMegagameplays
      @TimeoutMegagameplays 6 years ago

      TheBoltMaster Or you can change to material mode and have a low-quality scene for editing; like, wtf, without the material or solid real-time modes you wouldn't even be able to edit your scene properly.

  • @morizanova8219
    @morizanova8219 4 years ago +1

    Several decades ago, music and cameras went down a similar path, and nowadays look at which ones most people are willing to use. In the end, fast, simple, and affordable is always hard to beat, no matter how big the limitations. And as has happened with other tech, I believe Eevee will keep getting better :)

  • @totheknee
    @totheknee 6 years ago +1

    +1 simply for the amazing ray tracing simulation.

  • @bbaovanc
    @bbaovanc 4 years ago

    How did you visualize cycles like that?

  • @zionsky3342
    @zionsky3342 2 years ago

    Dude, light behaves like a particle and a wave, and you just saying that got me really suspicious about the reality we're living in xD

  • @Illasera
    @Illasera 6 years ago

    Excellent video

  • @ClaudioMalagrino
    @ClaudioMalagrino 6 years ago

    I read something on the developers' blog about a "Cycles fallback" in 2.8 development. What do they mean?

  • @kageratiridia
    @kageratiridia 6 years ago

    How can I set an emission shader on a material in Eevee? Or can't I set an emission shader on a material in Blender 2.8, so I need to use the default set of lamps? What should I do if I need a lamp that has a plate form?

  • @xanecosmo5061
    @xanecosmo5061 5 years ago +5

    The captions call Eevee "eating" 😂😂😂

  • @tildey6661
    @tildey6661 6 years ago

    Hey man, love the videos — just a few questions regarding Eevee/Blender 2.8.
    Are Cycles and Eevee materials interchangeable? I.e. can I make a scene and preview it in Eevee and then make the final render in Cycles without having to redo my nodes?
    And how, in Blender 2.8, do you change from render preview mode to modes like solid/material/wireframe? I looked for an option but I couldn't find a way.

    • @RemingtonCreative
      @RemingtonCreative  6 years ago +2

      +Tildey Cycles and Eevee will be interchangeable. I'm not sure about your second question, but it's probably just an issue that will be resolved as 2.8 undergoes further development.

  • @BastianHyldahlFilms
    @BastianHyldahlFilms 6 years ago

    Lets just say:
    You better start saving up for that Titan V

  • @doc.4zure514
    @doc.4zure514 6 years ago

    Hi, great videos... do you know how I can upload and render a motion-tracking video on the SheepIt render farm? Plz help me...

  • @yudingzhou8683
    @yudingzhou8683 4 years ago

    Thanks for sharing

  • @gipen
    @gipen 6 years ago +2

    Was expecting comparative examples :P

  • @adarshsingh764
    @adarshsingh764 6 years ago +3

    Or use Eevee for previews and Cycles for the final render.

  • @aldairgonzalez860
    @aldairgonzalez860 5 years ago +1

    Is there any book or article you recommend where I can get more information on how real-time engines work?

  • @ZacharyWhite25
    @ZacharyWhite25 3 years ago

    Is there an updated review on Eevee and Cycles since Blender is coming out with 2.93? Is there any change to Eevee?

  • @UncleBurrito15
    @UncleBurrito15 6 years ago +1

    Upload more often please

  • @bigtime9597
    @bigtime9597 6 years ago +1

    @Remington Graphics Alright, so I understand the idea that Ray Tracing is best for PBR. But, I need to know this, just to be sure. Is it possible for Eevee to be used for, let's say NPR scenes/animations? Good example would be, dare I say it?... I do dare... Rooster Teeth's web series, RWBY?

  • @bonbonpony
    @bonbonpony 6 years ago

    Nice explanation of ray tracing, with a nice visualization :> But I don't get one thing: you said that the data is being collected behind the light source, which I guess is accurate, but then you also say that "that's what you see when you render", which I'm not sure is correct :q If that were the case, you would see the scene from the point of view of the light source — and of only one light source — instead of from the point of view of the camera. I guess what you meant is that the data is collected when the projected rays hit the light source, just to record which ray it was and what trajectory it followed, but when it comes to rendering, the rays are sort of "back-tracked" to the camera to make the final image there.

    • @SerBallister
      @SerBallister 6 years ago

      A rough explanation would be: we fire a ray from the camera, and for each surface the ray hits, the material is added into a kind of stack. Then, if we successfully reach a light source (there can be multiple light sources in the scene, and this can include things like skyboxes too), we modulate the light source's colour by the colour of all the surfaces we hit along that path. This has the bonus of transferring light energy between surfaces, AKA global illumination. There are some tricks like bi-directional tracing which kind of combine both methods, but that has its own pros and cons.
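
      That "modulate the light by the stack of surfaces" idea fits in a few lines; a sketch with made-up colour values:

      ```python
      def shade_path(surface_colors, light_color):
          """Tint the light's colour by every surface albedo gathered along one path."""
          result = list(light_color)
          for albedo in surface_colors:
              result = [r * a for r, a in zip(result, albedo)]
          return result

      # White light seen after bouncing off a red wall and then a grey floor:
      # the red tint carried onto the floor is the colour bleeding of global illumination.
      print(shade_path([(0.9, 0.1, 0.1), (0.5, 0.5, 0.5)], (1.0, 1.0, 1.0)))
      ```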

  • @christianimations7853
    @christianimations7853 5 years ago

    You know what I think?.... I think this should help with making your own VFX movie.

  • @jlco
    @jlco 6 years ago

    3:56 You switched the red and the green.
    I may or may not be slightly OCD

  • @Thomason1005
    @Thomason1005 6 years ago

    I kind of expected a side-by-side comparison of scenes in both render engines, so you can see the differences in reflections, volumetrics, etc. E.g. how big is the difference in lighting / GGX (shading model) calculation / tone mapping (in a simple scene with common features only)? I would love to have a breakdown of where the precise differences are in day-to-day use. Or is it too early in development to do so?

    • @Thomason1005
      @Thomason1005 6 years ago

      I really liked the similar Blender PBR branch (with real-time area lights, omg) and wanted to use it for an animation last year, but I switched to Cycles after I noticed that a lot of things did not update during PBR playback. E.g. animated light colors or materials would stay static in real-time mode. That really killed it for animation purposes. You can work around the different handling of tone mapping and GGX calculation, but not being able to animate materials or lights is a deal-breaker. Surprisingly, the shadow quality and the screen-space reflection quality would have been sufficient. Also, the SSAO was really good — it was on point. But I needed quite heavy compositing, which wasn't supported in PBR.

    • @Thomason1005
      @Thomason1005 6 years ago

      Turns out 2.8 crashes before even building the window on my PC...

    • @Thomason1005
      @Thomason1005 6 years ago

      I made this comparison as an image: twitter.com/TheMarkedVapor/status/905422728496369666 — might make a more detailed video later.

  • @su-swagatam
    @su-swagatam 4 years ago

    Nice explanation — can you do one on ProRender and Modo?

  • @jamesburnes3297
    @jamesburnes3297 6 years ago

    Could you just make EEVEE another option for the 3D view? That way you could stay in EEVEE mode until you needed a more sophisticated view (and were prepared to take a hit on visualization times). That would also require you to specify the kind of render you wanted when doing full animations. This should work as long as the node models are mostly compatible, right?

    • @Hayreddin
      @Hayreddin 5 years ago

      Eevee IS another option for the 3D view; the shaders work with both Eevee and Cycles and produce very similar results (Cycles being the most accurate, of course). They really are doing a great job with 2.8. If you're interested, I'd suggest you follow Blender Today on YouTube; Pablo Vazquez is documenting all the work being done week by week (even daily during the Code Quest).
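
      That "same materials, different engine" workflow is also a one-line switch in Blender's Python API; a minimal sketch assuming a 2.8x/3.x build (the engine identifiers changed again in later releases):

      ```python
      import bpy

      # Look-dev in Eevee for instant feedback...
      bpy.context.scene.render.engine = 'BLENDER_EEVEE'
      # ...then flip the same scene and the same node materials over to Cycles
      # for the final, more accurate frame.
      bpy.context.scene.render.engine = 'CYCLES'
      bpy.ops.render.render(write_still=True)
      ```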

  • @maulanalaser4748
    @maulanalaser4748 5 years ago +3

    Since I'm an impatient guy and don't worship photorealism, I love the EEVEE engine.
    Especially after learning the name is taken from a Pokémon character.

  • @keshavashukla
    @keshavashukla 3 years ago

    Not sure, but shouldn't light come from the light and fall on the camera? So the black screen catching the light should be behind the camera?

  • @is0295
    @is0295 4 years ago +1

    I'm just thinking: back then they were trying to make real time look like ray tracing; now we are trying to make ray tracing run in real time.

  • @chr1st0pher
    @chr1st0pher 6 years ago +3

    FASCINATING. Love this video, tremendous work — a really good way to break down these concepts. Have you heard of voxel cone tracing? It seems to be a really cool, newish technique that is different from these. Check out the Two Minute Papers video on it if you haven't.

    • @sebastianmestre2145
      @sebastianmestre2145 6 years ago

      tokyomegaplex I would call voxel cone tracing a form of ray tracing. It is done in a voxel-based environment, but it is ray tracing nonetheless.
      Cone tracing may refer to shooting multiple rays or using a shape to register hits instead of a point.
      Put these two concepts together and you get voxel cone tracing.

    • @chr1st0pher
      @chr1st0pher 6 years ago

      Right, but it's actually fast enough to be done in real time with current hardware, whereas classic ray tracing is wayyyyyy too slow for this. It's kind of an in-between ground, but I get that it's more similar to ray tracing than rasterization. While Eevee and all these real-time rendering apps are trying to make rasterization as good-looking as possible, voxel cone tracing seems like it's sort of taking ray tracing and adding some shortcuts that make it less accurate but way faster.

    • @sebastianmestre2145
      @sebastianmestre2145 6 years ago

      tokyomegaplex Voxel cone tracing is usually used to make rasterization look better, so you could call any engine that uses it a "hybrid" engine.

    • @chr1st0pher
      @chr1st0pher 6 years ago

      neat!

  • @alfredgisc9171
    @alfredgisc9171 6 years ago

    So, if I summarize this, EEVEE is basically just pushing your render through an OpenGL pipeline... and it's fast because it's using the GPU's actual intended pipeline, unlike Cycles, which (if you have GPU rendering activated) has the GPU calculating paths with CUDA and then processing the result through the regular pipeline.

    • @alfredgisc9171
      @alfredgisc9171 6 years ago

      In the end, EEVEE is going to replace the internal renderer.

  • @luigimaster111
    @luigimaster111 5 years ago

    Eevee sounds like such a step up, it's a shame my PC is too old to run it.

  • @noc2_art
    @noc2_art 6 years ago +11

    A pointless comparison. Cycles is a heavy-duty path tracer that is able to produce results that get as close to light simulation as possible. The other is an OpenGL/CL-based real-time representation of the former that can come nowhere even close to path-traced or ray-traced output...

    • @RemingtonCreative
      @RemingtonCreative  6 years ago +13

      +ajlan altug You'd be surprised how many people have asked me if Eevee is going to replace Cycles. That's why I decided to make this.

    • @noc2_art
      @noc2_art 6 years ago +1

      Point taken... Fair enough.

    • @rfx8459
      @rfx8459 4 years ago

      ajlan altug Yes, but Eevee is a hybrid engine. It does ray tracing, while Cycles is a full unbiased path tracer.
      If you really hold Eevee's hand and give the scene a lot of care, it is 100% indistinguishable from Cycles.
      The biggest factor is global illumination, which is where Eevee struggles compared to Cycles.

  • @jaytwentytenone2068
    @jaytwentytenone2068 4 years ago +2

    Wait... So according to the simulation, the camera acts as the light and the light acts as the camera?

    • @fakefoogarelated7456
      @fakefoogarelated7456 2 years ago

      (I know it's been a year, but) yes — simply because it's simpler and faster to cast rays from the camera than to do the same from every light source.

  • @LightBWK
    @LightBWK 6 years ago

    The lamp rays on the headlamps are wrong, if you observe carefully.

  • @daveindezmenez
    @daveindezmenez 6 years ago

    Wouldn't the light come from the light and bounce to the camera, instead of the other way around?

  • @maayaa77
    @maayaa77 5 years ago

    What rendered the car?

  • @dasneakygiraffe
    @dasneakygiraffe 4 years ago

    I've been trying to figure out why my render looks fine using the Eevee engine but not the Cycles engine. I'm getting a whited-out, high-noise render from Cycles for some reason.

    • @artffan5413
      @artffan5413 4 years ago

      Increase your samples and use a denoiser.
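
      In script form that advice looks roughly like this (property paths moved around between Blender versions; this assumes the 2.8x layout where denoising sits on the view layer):

      ```python
      import bpy

      scene = bpy.context.scene
      scene.render.engine = 'CYCLES'
      scene.cycles.samples = 256                          # more samples = less noise, slower render
      bpy.context.view_layer.cycles.use_denoising = True  # denoise the finished render
      ```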

  • @0zyris
    @0zyris 4 years ago

    It is surely the camera that detects the rays, not the light source. There can be multiple light sources, but one camera collects light from all of them. While it is true, for efficiency, that the ray tracer must first establish which rays leaving the light source will eventually end up at the camera's image detection surface, and ignore the rest, the rays are affected by various factors in sequence. The fog between the light and the reflective surface or refractive object, for example, must first modify the rays; they then strike or pass through the objects and are modified some more, then perhaps strike another surface and are affected further. Only when all the effects have been compiled can the finished data be passed to the camera as a pixel.
    If you had a second camera recording a different view simultaneously, you would be collecting a completely different set of resulting rays originating from all or some of the same multiple lights.
    Or am I wrong?

  • @yofadhli
    @yofadhli 6 years ago

    Why not edit the scene in Eevee and, when it looks great, *THEN* change to Cycles? Mostly, the reason I go back and forth when rendering is that something sometimes messes up when testing materials/lights. So if I fix all that in Eevee in real time and then change to Cycles for the actual render, it won't take too long. (Except if there's noise, or those white specks that sometimes appear from reflections.)

  • @bopin1083
    @bopin1083 4 years ago

    Imagine if Eevee and Cycles merged into one:
    *A REAL-TIME PATH TRACER FOR VFX, CGI, GAMES, AND VISUALIZATION ALL AT THE SAME TIME*

  • @Rocketman1105Gaming
    @Rocketman1105Gaming 6 years ago

    I feel like real-time engines could be very useful for indie studios, since they will greatly reduce rendering cost and time, but I don't see them ever being taken up by large film companies until one is made that can compete with the ray-tracing engines in quality and flexibility.

    • @eyondev
      @eyondev 6 years ago

      Yes, I've been starting to get into animation, but the rendering times have me terrified of doing complex things, so I hope that Eevee is good and fast enough.

    • @ulls1984
      @ulls1984 6 years ago

      Real-time rendering engines in blockbuster movies are already a thing... www.polygon.com/2017/3/1/14777806/gdc-epic-rogue-one-star-wars-k2so

  • @sudd3660
    @sudd3660 6 years ago

    I want to know if Eevee is a tool I can use in my normal workflow, like a better version of the viewport, and then use Cycles for render tests and everything it is good at.

    • @RemingtonCreative
      @RemingtonCreative  6 years ago +1

      +xuddish p That's one of its primary purposes!

    • @sudd3660
      @sudd3660 6 years ago

      Excellent :)
      I'm so curious how it will work; I hope it comes soon.
      I'm guessing it runs best on a good video card, and for some projects you could even just take a screenshot straight out of the Eevee window.

  • @bardiakenway
    @bardiakenway 3 years ago

    Will 50 samples with a denoiser ruin my animation sequence? Or is it enough? I really need to speed up my renders in Cycles.

  • @MASQUALER0
    @MASQUALER0 4 years ago +2

    Now we need real-time path tracing.

    • @LolmenTV
      @LolmenTV 4 years ago +1

      It will probably come in new updates with RTX cards.

    • @MASQUALER0
      @MASQUALER0 4 years ago

      @@LolmenTV Maybe, but RTX only accelerates ray tracing.

  • @MrPaceTv
    @MrPaceTv 5 years ago

    Update the description — things seem to have changed.

  • @philhacker1137
    @philhacker1137 6 years ago

    Awesome video! I'm stuck between both worlds, architectural design and virtual reality. Eevee is my best choice, I think. tg4eevee!

  • @mihailazar2487
    @mihailazar2487 5 years ago

    Well, if you're doing still frames you can't go wrong with Cycles, but if you get familiar with EEVEE you could really get into animation heavily.
    (Realistically, you'd need a server to do animations in Cycles.) I had to do an animation for school, presenting my concept for a robot to our robotics team... a 30-second animation took 18 HOURS TO RENDER @ 480p, 24 fps, on an Nvidia 1050.
    Don't use Cycles if you want to do animations... you'll get sick of waiting pretty soon.

  • @davidnguyen5183
    @davidnguyen5183 5 years ago +2

    Well, Cycles is another module to use, let's say, and Eevee is a Pokémon.

  • @eliechobok1893
    @eliechobok1893 6 years ago +1

    +remington graphics What about archviz — which engine is best for it?

    • @RemingtonCreative
      @RemingtonCreative  6 years ago

      +Elie Chobok There is no distinct best, but I'd say Eevee, especially if you're sitting with a client.

    • @abstractwaves6166
      @abstractwaves6166 6 years ago +1

      I do my arch-viz scenes in Blender and then export them to UE4 to render them, because my clients always want fast changes and in most cases I don't have the time to re-render them in Cycles; that's why I use UE4 in my pipeline. Now, with Eevee, I'm going to be much faster because I won't have to export my scenes every time :D

    • @eliechobok1893
      @eliechobok1893 6 years ago

      I have to wait for Eevee because I don't know how to work in UE4.

    • @eliechobok1893
      @eliechobok1893 6 years ago

      I started learning Blender 2 years ago, but I'm not planning on learning UE4 now; I want to improve my skills in Blender for now.

  • @warlord76i
    @warlord76i 6 years ago

    I really don't understand why the light in the scene doesn't emit the photons, with the camera being what catches the light (I mean, that's how the real physical world works). Why is it the other way around?

    • @hOREP245
      @hOREP245 6 years ago +1

      If you send out the 'photons' from the light, there is a high chance of that photon never even reaching the camera and just bouncing away into space. If you send them out from the camera, you guarantee that they won't be wasted like that, making it more efficient.

    • @SerBallister
      @SerBallister 6 years ago

      What is the probability of a photon exiting the sun and ending up on your retina? Very low, right? But if I trace a path from my retina into the scene, it can find an illuminated object with much higher probability.
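
      A toy Monte Carlo illustration of that asymmetry (a 2D stand-in for solid angles, with made-up sizes):

      ```python
      import math, random

      def fraction_hitting(half_angle, trials=200_000):
          """Fraction of uniformly random directions that land inside a target window."""
          hits = sum(1 for _ in range(trials)
                     if abs(random.uniform(-math.pi, math.pi)) < half_angle)
          return hits / trials

      # A camera aperture seen from a distant lamp spans a tiny angle...
      print(fraction_hitting(0.001))  # ~0.0003: almost every light-first ray is wasted
      # ...while brightly lit geometry seen from the camera spans a large one.
      print(fraction_hitting(1.0))    # ~0.32: camera-first rays find lit surfaces often
      ```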

  • @italojfk2378
    @italojfk2378 6 years ago

    About mocap: will Blender 2.8 depend on another program?

    • @RemingtonCreative
      @RemingtonCreative  6 years ago

      Mocap in Blender 2.8 will be the same as mocap on Blender 2.79

    • @italojfk2378
      @italojfk2378 6 years ago

      Remington I'm sorry, I only understood just now.

  • @sagarkumar2754
    @sagarkumar2754 4 years ago

    Ryzen 5 + Vega 8 mobile graphics + 12 GB RAM.
    Which engine is best for fast rendering?? Please answer.

  • @joelankamah4633
    @joelankamah4633 6 years ago

    How do I contact you when I want to ask a question unrelated to Eevee?

  • @krazybubbler
    @krazybubbler 5 years ago

    Real-time pros... what's the difference between 'instant updates', 'virtually no render time' and 'easy to preview'? Seems like these three should be listed as one, tbh.

  • @bruh-qi7ue
    @bruh-qi7ue 6 years ago

    So it's basically a material viewport with light and shadows and.... bugs? I guess

  • @iftikharmalik639
    @iftikharmalik639 4 years ago

    Please vote.
    Which one is better?
    Cycles? Or Eevee?

  • @alisaim8260
    @alisaim8260 4 years ago +1

    My laptop turns itself off while rendering with Cycles, but this does not happen with Eevee.

    • @liquidphilosopher1816
      @liquidphilosopher1816 4 years ago

      Cycles takes an insane amount of time for animation rendering....
      I have no choice but Eevee.

  • @JacobKinsley
    @JacobKinsley 4 years ago

    Why is this video slightly louder in my right ear? lol

  • @dwiseftiaji
    @dwiseftiaji 6 years ago +1

    Which one is the ray tracer and which one is the real-time engine? -_-
    Is Eevee the ray tracer, or is Cycles the ray tracer? o.O

    • @trx6049
      @trx6049 4 years ago +1

      Cycles is Ray Tracing
      EEVEE is Real Time

  • @UnauthorizedExpression
    @UnauthorizedExpression 6 years ago +2

    I don't understand why there is scale at all in Blender. If you run a simulation with small objects (1 Blender unit in size), the engine treats them differently than if the objects are 100 Blender units. I'm certain this is causing many problems for every area of Blender, including the way light behaves. I don't know it for sure, but I have a gut feeling it is. Size and weight are relative in a 3D environment. I would like to see Blender make this the default, where 1 BU is no different from 1000 BU until you specify the difference. That, or make the physics more realistic. Hire a few particle physicists.

    • @MrPaceTv
      @MrPaceTv 5 years ago

      Particle physicists? Haha, do you want an image-processing engine or a fusion reactor?

  • @kmcasi2037
    @kmcasi2037 6 years ago

    Game engines like UE4 use something like Eevee, so maybe once it is implemented in Blender they will also update the game engine in Blender. Anyway, I used Blender for 3D modelling and UE4 for rendering, just because rendering in UE4 is faster, like Eevee, and now I will not need to switch between these 2 softwares. But as you say, Eevee is still in development; maybe I will live to see Eevee get better, and if not, then I will tell my grandsons to use it :))

  • @animatorladka614
    @animatorladka614 3 years ago

    I use Eevee for cartoony animations and Cycles for more realistic ones.

  • @irfangusani6853
    @irfangusani6853 4 years ago

    What about an animated movie like Disney's or Pixar's? Eevee or Cycles?
    Please let me know if anybody has the right answer!

  • @jacobbrink6059
    @jacobbrink6059 6 years ago

    Why don't they only trace rays for particles emitted from the light? That way, only light that actually matters would be calculated.

  • @neoneo3622
    @neoneo3622 5 years ago

    Recommended now because it includes "ray tracing".

  • @sammyjaohnson5631
    @sammyjaohnson5631 3 years ago

    Wait, it uses real-world samples?