COULD YOU TELL THE INTRO WAS ANIMATED?!
Very sincere video where the "I am doing this with you" concept has finally been very successfully executed. Your personality shines through your presentation and it adds so much more to what we are watching.
Thank you
For the IK chain, you just need to disable rotation on the X and Z to stop the undesired rotation on the teeth.
Not sure where it's located in Maya, but it's under the Motion tab in 3ds Max.
omggggggggggggggggggggggggg something like this is what I've been waiting on.
It’s insane! Definitely planning on using it as much as possible
I'm so glad that this isn't an online service that can go down at any time, but software that I can install on my computer and keep forever... lol
It is protected by DRM but that's likely to be circumvented should anything happen.
AMAZING tutorial. Thank you so much. 😊💚⭐
They changed the settings; it's not working anymore.
I can’t run the program
Can you make a video guide?
Thank you
As we say in my country, "loco sos un crack!" (roughly, "man, you're a genius!"). Thank you very much, this is very useful. I'm taking my first steps with NVIDIA AI.
Thank you so much. I've been beating my head against Audio2Face for months. None of my custom faces will work right. This helps. What graphics card are you using? I have an A4500. It might be settings or something; mine is never this smooth.
Any chance you'll make a tutorial like this but exporting to Unreal Engine instead? The only tutorials out there covering that process involve Metahuman, nothing on setting it up for a custom rig.
Do you need RTX graphic cards for that to work?
You're explaining a visual effect that is happening to a character, but the camera keeps pointing at your face instead.
Useful tech. Glad I chose an RTX card ☺️
Which do u like better or feel is the easiest to use Audio2Face or FaceIt?
I'm not sure, but I think you can have the teeth X rotation (up and down) linked through the Set Key dialog. It wouldn't be absolutely perfect... but it could work.
We can export Maya caches for the eyes, head, and tongue, but for some reason the teeth cache isn't exporting. Your method helped, thanks!
When I clicked launch for the Audio2Face, the app says "Not Responding." How can I fix this?
Can you guys recreate the scene from Free Guy where the world shuts down and starts getting erased?
Hello, nice tutorial. Can we feed it audio at runtime? Will that work?
Hey, does NVIDIA Omniverse support GTX cards, or only RTX? 🤔🥺
After importing USD files from Blender into Omniverse A2F, no character or any sign of it was visible, even after creating a distant light and a PBR material. Please help.
Hey there, I have a problem with the animation. Can you help me?
Is that an Oculus in the background??
Is there going to be an Oculus game-creation video in the near future, possibly?
Well, my mesh bugged out on Post Wrap and disappeared. When I hit Ctrl+Z the mesh came back, but now I can't Post Wrap again or click on any of the settings.
Awesome tutorial!
Production Crate idea: with Amazon's Wheel of Time series coming, it would be awesome if you gave us Aes Sedai channeling effects. I think this series is going to be huge.
Hope so, read all the books, fingers crossed they do it right!
Whenever I try to add the male template, it doesn't show up in the perspective window; I can only see it when I change the viewport camera. Any ideas?
Love From Cambodia. Love your video.😍😍😍
Wow, great video... Maybe you can help me: I do the fitting and the Post Wrap, but when I select "this guy" and then go to add the A2F pipeline, I get "Failed to execute command." Do you know why? Thank you for your help.
17:18 Is it because the position of the IK handle on the face isn't centered?
Would you happen to know if this works with C4D? My model is animated, but the exported USD has no animation information attached when I open it up.
How did you animate the arms while you were talking in the intro video?
Very educational.
great explanation, thank you!
When I begin the process, it gets stuck at 62.5%. Am I just being impatient? I checked Task Manager and my 3090 is at 24% and my 2080 Ti is at 12%.
Hope someone can help clarify whether I'm just being impatient, haha.
Blender crashes when I import the mesh. Can anyone help me fix it?
Does the Maya character already have blendshapes, or will A2F generate them automatically? 🤔
It looks like the video is a little bit slower than the sound (intro). Still awesome though.
How do you add Flutter apps to this type of animation?
Cooooool 🔥
Very helpful thank you!
Thanks for sharing
Anyone know if you can do the same final import process into Blender, with shape keys driving the target mesh? Thank you.
If you've got the solution, please share it with us. We're looking for it too, thanks!
@@donut_4048 Sadly, I didn't find a workable solution. However, they did release a version of Blender within the Omniverse system that apparently addressed, or at least streamlined, the process with Blender. I've yet to explore their new update, because even when they demonstrated it on their YT channel, their own technician was hitting so many bugs in the demo session! It was a joke.
is there a way to animate the eyes?
I avoid tutorials that have an irritating noise in the background that interferes with the narrative. It would appear that the trend is for the speaker to invade one's monitor as well.
Just curious
I haven’t been watching the latest videos too much but I have a question
Did you release the glow plugin?
Also where is Adrian?
Audio2Face doesn't recognize my audio files.
No way to use this on a Mac?
How can you do it for Blender?
Omniverse has a channel here with a wonderful tutorial for the Blender workflow.
It only works with RTX cards.
What a program this is
When I click the A2F pipeline and Yes, Attach, nothing happens. Any clue what I could be doing wrong?
@Carter Crick Yes. Use the default male head that already has the a2f pipeline (char_male_model_hi) instead of the one that comes with the male template. Just open the demo scene (default scene), go to "Character Transfer", click the "+ MALE TEMPLATE" button, delete the new Mark head, keep mark_openMouth and the original char_male_model_hi, find your character in the content browser below and drag and drop it into the viewport. Now you just need to do the same process as in the video, but you don't have to add the a2f pipeline.
Sometimes the program just doesn't work properly, but you just need to save and restart, I don't know why. If everything goes wrong, you can try using Blender and the Faceit addon (paid) to transfer the default head animations with your audio to your characters as shape keys; for this you will need "male_bs_46", so make sure your Nucleus is set up with access to localhost.
@@almostagamedev How do I make sure my Nucleus is connected? I can't export the JSON right now, and that might be the reason.
If the character just moved their eyes a little, and blinked, it would already add a lot.
Killer!
Can it do talking animals too?
This is revolutionary...
Wowowow
Too complex for my abilities. But thanks.
Oh god… there goes my job as a 3d animator..
This tool will do more to help 3D animators than replace them. Add it to your arsenal and it will improve your workflow!
It's a bit off with the lip sync, so no, you are secure in your job still.
IK = inverse kinematics
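Since several comments hinge on it, here's what an IK solver actually computes: given a target position, it works backwards to find the joint rotations that reach it (whereas forward kinematics goes from rotations to position). A minimal sketch of a two-bone planar solver in plain Python, just the underlying math rather than any Maya or Max API:

```python
import math

def two_bone_ik(target_x, target_y, len1, len2):
    """Solve joint angles for a 2-link planar chain reaching (target_x, target_y).

    Returns (shoulder_angle, elbow_angle) in radians, or None if the target
    is out of reach for the chain.
    """
    dist = math.hypot(target_x, target_y)
    if dist > len1 + len2 or dist < abs(len1 - len2):
        return None  # target unreachable
    # Law of cosines gives the elbow bend
    cos_elbow = (dist**2 - len1**2 - len2**2) / (2 * len1 * len2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder aims at the target, corrected for the elbow bend
    shoulder = math.atan2(target_y, target_x) - math.atan2(
        len2 * math.sin(elbow), len1 + len2 * math.cos(elbow)
    )
    return shoulder, elbow

def fk(shoulder, elbow, len1, len2):
    """Forward kinematics: chain tip position for the given joint angles."""
    x = len1 * math.cos(shoulder) + len2 * math.cos(shoulder + elbow)
    y = len1 * math.sin(shoulder) + len2 * math.sin(shoulder + elbow)
    return x, y
```

A DCC app's IK handle solves this (in 3D, per frame) automatically, which is why an unconstrained handle can introduce the stray rotations mentioned above: locking axes removes degrees of freedom from the solve.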
Author: Any other program.
Blender: Am I a joke to you...
Valve's faceposer gives better results and was made 20 or so years ago.
Revelation 13:15
“And he had power to give life unto the image of the beast, that the image of the beast should both speak, and cause that as many as would not worship the image of the beast should be killed.”
Beware Transhumanism... The Future is Here... Get Ready
wait...u mean this guy is gonna use audio2face to kill us all??
First❤
awful results though
No, it looks very bad.
Awesome tutorial!