So you don't need to buy a device for Leap Motion tracking or hand tracking?
Warudo has been amazing to work with. The community for it is extremely helpful, too.
I've been using Vseeface, but I just might make the switch to Warudo. Thanks for making this video, Senpai!
No problem!
Another great video from my favourite Senpai! Thank you :)
Senpai thank you so much, I have been trying for a while with Warudo and this should def help.💙🎧
Glad I could help
I am Yoshino Mai! Nice to meet you, Mister Senpai! I was wondering... could you possibly share your opinion on the Leap Motion tracking system for 3D VTubers? I am very curious about how this works!
Leap Motion is awesome, and if you have the money I'd suggest you get one... but that's just me.
Warudo is really fun to play around with, I hope we see some creative content with it from the community!
Agreed!
Ah, this software seems new to me. Thanks for this!
No problem!
OG SENPAI CAME BACK FROM THE DEAD (for a little while..)
Thank you for doing this (I know you were probably already planning to do it before I requested it, but thanks)
Nope, wasn't going to... it was only because you suggested it that I got my butt in gear to make it.
😯
so helpful thank you!!
You're welcome :D
This method does not remove the position arrows at the bottom, and the output will be a weird resolution. To get rid of that and make it 16:9 (or whatever aspect ratio matches), use the built-in virtual camera.
Thanks for the info!
@VtuberSenpai Ah, getting rid of the transform arrows is just a matter of going to the character page, scrolling down, and toggling the transform tool.
Thanks for the good tutorial, I tried it and it's way better than VTuber Maker and VTuber Editor, which VRoid itself seems to suggest.
I noticed in both programs though, Warudo and VTuber Maker, that there is almost no face tracking.
It basically only does mouth and eyes open/close. It doesn't even smile, let alone do the A E I O U shapes I've set up in VRoid.
I am used to expression tracking like eyebrows and mouth shapes and even chin movement from my Live2D model, and while of course it doesn't happen by itself (I had to set up all that in Live2D), I didn't even see options for these things in VRoid?
I see the tracking programs could take in parameters like tongueOut and mouthLeft and so on, but I don't know how to set them up in my model. Do you know if that's outside of VRoid's scope?
And also, is there anything I can do to at least track smiles and mouth shapes?
Thanks!
MediaPipe tracking isn't loading for some reason.
za warudoooo!
Just found that using a chroma key looks better than Allow Transparency
Luckily YouTube put this in front of me; otherwise I'd be thinking about buying the Leap Motion controller just for my hands, hmhmhm...
Do we have vtubers plus or t.i.t.s compatibility yet? I didn't see that in the options.
It actually has all of the same features and more, using the blueprints system. The wiki has a ton of tutorials for setting up things like throwing props, and the community releases blueprints for new redeems constantly.
Just say it like "wa-doo-doe"😂
I tried and it still didn't work LOL
I’m a PNGtuber so this doesn’t apply to me lol. But I watched anyways coz Senpai deserves more Views
Also senpai what’s your favorite Pokémon
Onix all the way! Hbu?
@@VtuberSenpai Jigglypuff or Eevee
It can still be used. Warudo supports using browser sources (like Fugi's Discord reactive images) and window captures as items, so you could still use it for redeems or for placing yourself into a 3D environment as a PNGtuber.
Tried this program but my face has completely exploded :/
💕
Waldo xD
You don't know how to pronounce "Warudo"? Just watch JoJo.
So the thing is... I can hear it... I can say it right in my head... but when I say it... my mouth just doesn't do it right lol
Ty for this video cant wait to try it out 🩵💜