Amazing tutorials! I was looking for a beginner MetaSounds tutorial and this is exactly it.
Great tutorials man. UE5 tech is exciting, and personally I’m not a sound guy so I need this knowledge. Keep it up!
You are pure gold, sir.
Great! Thanks
Really like your videos, I get along very well with your methodology and explanations!
Solid! I'm at a point where I need to add sounds to a project to give it some more life and your videos have been a great quick intro as I dive in. Thanks.
I'm definitely glad to hear they're helping. If you're on Discord, feel free to stop by the channel listed in the description. It's not just a server to promote my videos. We have a small but helpful community of audio engineers of all skill levels and areas of expertise who are always asking and answering questions.
@@TheSoundFXGuy As far as mixing, I'm under the impression it's best to mix the channels using a Sound Class / Mix rather than adjusting gain in each sound. Is this still the case with MetaSounds? If so, a video on mixing would be a welcome addition. If you already have one, I wasn't able to find it with a quick search; could you link me?
@@KavanBahrami I don’t have a specific video on mixing techniques in MetaSounds, so that would definitely be a good video idea. But to answer your question, yes, mixing inside your MS is a viable workflow. The mono/stereo mixer nodes have float inputs for each input gain (up to 8 channels), so bouncing all your audio files at -6 to -3 dB out of your DAW will give you enough headroom to play around with. I saw that you joined my Discord server, so I’ll see if I can’t send you a screenshot or two later this evening.
Quick question: what is the difference between using an input with an array and using a variable with an array? Which one is better for MetaSounds?
Using an input with an array allows you to send data into the MetaSound from a Blueprint. A variable with an array is not accessible from the Blueprint; it's just a data set that you've predefined within the MetaSound graph. Hopefully that makes sense.
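To illustrate the difference, here's a minimal C++ sketch of pushing an array into a MetaSound *input* from game code, which is the same thing the "Set Object Array Parameter" Blueprint node does; a graph *variable* has no such handle. The input name "Waves" and the function name are assumptions for the example, not something from the video.

```cpp
// Minimal sketch (assumed names): sending a wave array into a MetaSound input.
// A MetaSound graph variable cannot be set this way -- only exposed inputs can.
#include "Components/AudioComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"
#include "Sound/SoundWave.h"

void SendWavesToMetaSound(UObject* WorldContext, USoundBase* MetaSoundSource,
                          const TArray<USoundWave*>& Waves)
{
    // Spawn the MetaSound as a 2D sound so we keep a handle to its audio component.
    UAudioComponent* Comp = UGameplayStatics::SpawnSound2D(WorldContext, MetaSoundSource);
    if (!Comp) { return; }

    // "Waves" must match the name of the array input defined in the MetaSound graph
    // (assumed here for illustration).
    TArray<UObject*> AsObjects;
    for (USoundWave* Wave : Waves) { AsObjects.Add(Wave); }
    Comp->SetObjectArrayParameter(TEXT("Waves"), AsObjects);
}
```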
Thank you for this tutorial!
I have a question: why do you use stereo delay instead of Start Time on a Wave Player node?
I absolutely could have used that instead. You’re 100% correct. I think one of my favorite parts of this whole thing is that there are multiple solutions to the same puzzle.
@@TheSoundFXGuy yeah, that's great!:)
Instead of mixing each of the elements down to stereo, do you know if there's a way to send them to separate... busses maybe... that can go out to different audio components placed in different areas in the map? So that you have the ONE gun MetaSound that gets ONE "Play" trigger, but can specialize the different elements out to different areas?
I would love to be able to do something like that (gracefully) for big machinery in VR.
Thanks for all your wonderful videos about MetaSounds! The manual is severely lacking and I'm really learning everything from you now 😂
Firstly, I'm really glad you're finding my videos informative! Secondly, and to actually answer your question, YES THERE IS!!!
At the time of typing this reply, the latest version of Unreal Engine is 5.3 Preview 1 and one of the new audio features in 5.3 is the Audio Bus Writing node that allows you to send audio data via a bus into other MetaSound sources that can have their own separate attenuation and be placed in different locations in 3D space.
My most recent video actually talks about the audio bus writer node and how it works. ua-cam.com/video/ur_RLVP6tHE/v-deo.html
Thank you for this tutorial=)
This is rad!!!!!
Quick question: Would offsetting the 'start time' value in the wave player node do the same as having a Delay Node?
No. 'Start Time' is for when you have the wave player set to loop. For example, if I have a 10-second intro followed by a 30-second loop, I can set the start time to 10 seconds. On the first play-through, it'll play the whole file. When it loops, it'll jump to 10 seconds into the file and loop from there. Personally, I prefer to cut the intro and the loop into two separate wave files in my DAW and use two separate wave players, but that's what Start Time is for.
Just in case... You can select multiple waves and drag-n-drop them directly in the array, instead of doing it 1-by-1. Saves a lot of time =)
I didn't know that at the time of making this video but it's definitely been a HUGE time saver since I found out.
You are amazing!!!!
this is great info
Nice
My main gripe with MetaSounds is how chunky the editor feels. DAW layouts like what you get with FMOD feel a lot more natural and intuitive to use to me than flowgraphs for sound design like this, but this is still very welcome.
I'm glad you enjoyed it. While there are some technical aspects that may persuade users one way or another, just like with different DAWs it comes down to personal preference. In the grand scheme of things, pick what works best for you, your workflow, and your budget. I prefer MetaSounds but I'll never judge anyone for using something else. As long as the end user has a pleasurable audio experience, that's what really matters.
@@TheSoundFXGuy That's a very good sentiment. I'll actually be using MetaSounds anyway, as much as I prefer FMOD's workflow and interface, simply because as far as I'm aware you can't use Steam Audio with FMOD as of now.
Hey, is it possible to trigger specific sounds by index from the array from a Blueprint? Random works fine, but specific sounds? I can't get it to work.
You can. You just have to be particular about the order in which things happen in your signal chain. Also, you can't use the default "On Play" input. Instead, you'll need to create a custom play input that can be called at your discretion, and you'll need to create a custom integer input. Basically, you'll initialize your MetaSound using an "Add Audio Component" node in your Blueprint. This starts the MetaSound playing, but because we're using a custom input, it doesn't start any audio; it simply activates the MetaSound. From there, you'll send the number of whichever index slot you want from your Blueprint to your MetaSound, and then you'll trigger your custom play input. The amount of time it takes to perform this process at runtime is so small it'll appear to happen instantly.
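For anyone doing this from C++ instead of Blueprint, here's a minimal sketch of the same flow (using the "Spawn Sound at Location" route that's confirmed to work later in this thread). It assumes the MetaSound exposes an integer input named "Index" and a custom trigger input named "PlayCustom", and that its wave players are not wired to the default On Play; both parameter names are made up for the example.

```cpp
// Minimal sketch (assumed parameter names): spawn the MetaSound, send the index,
// then fire the custom play trigger -- in that order.
#include "CoreMinimal.h"
#include "Components/AudioComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"

void PlaySoundAtIndex(UObject* WorldContext, USoundBase* MetaSoundSource,
                      const FVector& Location, int32 Index)
{
    // Spawning starts the MetaSound graph, but no audio plays yet because the
    // graph listens for the custom trigger instead of On Play.
    UAudioComponent* Comp =
        UGameplayStatics::SpawnSoundAtLocation(WorldContext, MetaSoundSource, Location);
    if (!Comp) { return; }

    // Send the desired array slot first, then fire the custom play trigger
    // (the C++ equivalent of the "Execute Trigger Parameter" Blueprint node).
    Comp->SetIntParameter(TEXT("Index"), Index);
    Comp->SetTriggerParameter(TEXT("PlayCustom"));
}
```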
@@TheSoundFXGuy wow, thanks a lot! I am a beginner; let's see if I can manage that 👍
@@dubtube6691 if you’re on Discord, my server is linked in the description. Feel free to join and ask questions in our help forum section.
@@TheSoundFXGuy cool thanks a lot
@The Sound FX Guy Hey, after two weeks of trial and error and pain in the brain 🤯, I finally solved the issue. None of my constructs worked until I replaced "Play Sound at Location" with "Spawn Sound at Location" and put the "Execute Trigger Parameter" after the spawn node, and now it works 🥳👍
Does anyone know what "On Finished" is actually used for?
On Finished is used to tell the MetaSound to stop playing. Without it, the MS timer will continue to run. You can also use the On Finished output to trigger other events. Let's say, for example, you have a musical intro and then a body that loops. You can wire the On Finished output of the intro's wave player into the wave player for the looping body, and it will seamlessly start playing your loop as soon as the intro is finished.
@@TheSoundFXGuy Thank you so much for the quick reply! That's pretty cool. I had some speculation but I couldn't really find much about it in the documentation.
@@ChrisScribs A lot of the documentation is still being written because, while we're still in preview, changes are being made. Once the full UE 5 version is released, I'm sure we'll see more official documentation come about.
😀