HUGE! The Fastest Local Video Generator EVER! XODA Video Pack with [LightTricks LTXV]

  • Published Nov 24, 2024

COMMENTS • 30

  • @Andro-Meta
    @Andro-Meta 16 hours ago +2

    A little explanation on base shift and max shift for anyone looking to play with that:
    Base shift is a small, consistent adjustment that stabilizes the image generation process, while max shift is the maximum allowable change to the latent vectors, preventing extreme deviations in the output. Together, these parameters balance stability and flexibility in image generation.
    Using a bird as an example:
    Increasing Base Shift: Raising the base shift results in a more consistent and stable depiction of the bird. For instance, the image might consistently show a bird with clear, well-defined features such as a distinct beak, feathers, and wings. However, this increased stability could lead to less variation, making the bird’s appearance feel repetitive or overly uniform.
    Decreasing Base Shift: Lowering the base shift allows for more subtle variations and intricate details, like nuanced patterns in the bird’s feathers or unique postures. However, this added variability might make the bird’s image less stable, with occasional irregularities or minor distortions.
    Increasing Max Shift: A higher max shift enables the model to explore the latent space more freely, leading to creative or exaggerated interpretations of the bird. For example, the bird might develop surreal colors, elongated wings, or fantastical plumage, but it risks straying far from a realistic bird representation.
    Decreasing Max Shift: Reducing the max shift tightly constrains the model, resulting in a more controlled and realistic depiction of the bird. The image is likely to stay close to a conventional bird appearance, but it might lack creative or distinctive elements that make the bird unique or captivating.
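    For anyone who wants the mechanics behind that intuition: a minimal sketch of how these two parameters typically enter the sampler, assuming the Flux/SD3-style resolution-dependent timestep shift. The function names and default values below are illustrative, not LTXV's actual node API:

    ```python
    def compute_shift(seq_len, base_shift=0.95, max_shift=2.05,
                      base_len=1024, max_len=4096):
        # Linearly interpolate the effective shift from base_shift toward
        # max_shift as the latent token count grows (bigger image/video
        # -> stronger shift). Defaults here are illustrative only.
        slope = (max_shift - base_shift) / (max_len - base_len)
        intercept = base_shift - slope * base_len
        return seq_len * slope + intercept

    def shift_sigma(sigma, shift):
        # Warp one noise level in [0, 1]: a larger shift keeps sigma higher
        # for longer, i.e. the sampler spends more steps on coarse, stable
        # structure and fewer on fine detail.
        return shift * sigma / (1 + (shift - 1) * sigma)
    ```

    Note that shift_sigma(0, s) == 0 and shift_sigma(1, s) == 1 for any shift, so only the middle of the noise schedule is warped: raising the shift pushes mid-schedule noise levels up, trading fine variation for stable global structure, which matches the behavior described above.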

  • @electronicmusicartcollective
    @electronicmusicartcollective 21 hours ago

    Thank you for presenting this stunning LTX model. Trying it out right now!

  • @BlackMixture
    @BlackMixture 1 day ago +1

    This is HUGE! Thanks for being a hero in the community and showing us how powerful local video gen could be!

  • @Afr0man4peace
    @Afr0man4peace 1 day ago

    Hi, thanks for all your work. I will test it today and will leave some review videos on Civitai when I get it to work.

  • @joshuadelorimier1619
    @joshuadelorimier1619 2 days ago +3

    I think the model is trained at 25. Also, I've been getting 5 seconds no problem; however, I have to do 3 seconds whenever I change the prompt.

  • @Kaoru8168
    @Kaoru8168 2 days ago +1

    I was never interested in local models until this came out. I'm going to find the best settings and squeeze every last thing out of this goldmine.

  • @Alchemist284
    @Alchemist284 2 hours ago

    It would be cool if you opened a Discord server where people could discuss their projects and problems.

  • @TomHimanen
    @TomHimanen 2 days ago +1

    Could you create a video that explains all the basic parameters and what they affect? Examples would be great, because written articles often don't have any, and trying things on a slow GPU is a very slow way to learn. I have learned by trial and error what CFG does, for example, but I still don't fully understand its interactions with other node parameters.

  • @VFXShawn
    @VFXShawn 1 day ago

    This is fast, but we need to be able to control the strength of the latents and images.

  • @AgustinCaniglia1992
    @AgustinCaniglia1992 2 days ago +1

    ❤ Thank you, sir

  • @TheClubPlazma
    @TheClubPlazma 1 day ago

    What software do you use to create the white moving avatar, bro? Thanks.

  • @gjsxnobody7534
    @gjsxnobody7534 1 day ago

    So what is the longest you got it to do? Can you do 30 seconds?

  • @krakenunbound
    @krakenunbound 2 days ago

    I had to update from the batch file in the update folder (ComfyUI), and then the custom nodes finally installed correctly. Simply using the built-in "update everything" button in Manager did not work.

    • @begineditingfilm
      @begineditingfilm 1 day ago

      I had the same problem. I tried the same thing you did and it worked. Thanks!

  • @brianmolele7264
      @brianmolele7264 1 day ago

    How much VRAM is required to run this? Or is it one of those that needs an RTX 4090 to run locally?

  • @ApexArtistX
    @ApexArtistX 1 day ago

    It's great and all, but is it crash-proof on 8 GB of VRAM?

  • @goodie2shoes
    @goodie2shoes 1 day ago

    These companies need to roll out this stuff more gradually; these constant dopamine spikes are wrecking my sleep! Oh, and don't think we forgot: you still owe us that deep dive into Flux tools. 😉

  • @TeddyLeppard
    @TeddyLeppard 1 day ago

    Another 8-12 months and these obscure interfaces will start to go away in favor of far more intuitive controls and production-friendly ways to create video.

  • @agente_ai
    @agente_ai 1 day ago +5

    THIS IS NOT HUGE... I tested half the day and all night, and there's no way this even comes close to what commercial platforms are able to offer. The only thing it does better is faster renders, but the quality and prompt adherence are pure shite! Not to mention it is far too early to be claiming that something with unfriendly UX is going to be huge or the next best thing...
    Disingenuous at best; dishonest at worst.

    • @imoon3d
      @imoon3d 1 day ago

      Yes, all local video creation models are just for fun and testing...

    • @Andro-Meta
      @Andro-Meta 1 day ago +4

      He made it pretty clear that for local video AI creation, this is huge. And it is.

    • @noonesbiznass5389
      @noonesbiznass5389 22 hours ago

      Agree... it is fast as hell, as the OP mentioned, but the quality isn't that good... I feel like CogVideoX is better, albeit slower.

    • @fotszyrzk79
      @fotszyrzk79 19 hours ago

      The only huge thing is the fast rendering time; the outcome is shit!

    • @Andro-Meta
      @Andro-Meta 16 hours ago +2

      @@fotszyrzk79 I don't know what you're expecting out of a fast model that doesn't require a GPU that breaks the bank, isn't fully trained yet, and will be fully trained and released open source, but it really seems like you're missing the point.