Awesome video as always. We love to see your videos. They are cleverly made, with a lot of details, knowledge and humour! A classic! 👌
Importing custom shaders is on our short-term goals list, but please be aware: it will not be just copy/paste. There will be a certain amount of work for each shader.
We also believe there is margin for performance improvements, but we are still investigating and it might take a while longer. Right now we rely on GLSL shaders, but in theory, if we use Metal shaders, there will be performance gains.
Regarding the playback speed of recorded videos changing when the frame rate drops: we are aware of the issue and we are trying to find ways of working around it. It is possible; we are just not there yet. Our ultimate goal is to have the audio in sync with the video while using the record function. We hope to have a solution to this problem very soon.
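For anyone curious what a workaround for that playback-speed issue could look like: when frames get captured at a variable rate, a recording can be retimed afterwards by snapping each frame of a fixed-fps output timeline to the nearest captured frame, so it stays in step with constant-rate audio. This is just a generic sketch of the idea, not VS's actual code; `retime` and its signature are hypothetical:

```python
def retime(frame_times, fps, duration):
    """Map a fixed-fps output timeline onto variably timed captured frames.

    frame_times: sorted capture timestamps in seconds (variable rate).
    Returns, for each output frame, the index of the nearest captured
    frame, so the video lines up with audio running at a constant rate.
    """
    out = []
    j = 0
    n = int(duration * fps)
    for i in range(n):
        t = i / fps
        # advance while the next captured frame is at least as close to t
        while j + 1 < len(frame_times) and \
                abs(frame_times[j + 1] - t) <= abs(frame_times[j] - t):
            j += 1
        out.append(j)
    return out

# frames captured unevenly (frame-rate drop in the middle of the take)
captured = [0.0, 0.033, 0.066, 0.2, 0.233]
indices = retime(captured, fps=30, duration=0.25)
```

Where the frame rate dropped, the same captured frame simply gets repeated in the output, instead of the whole clip playing back at the wrong speed.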
A small note: when you long press a material parameter, it does open the modulation matrix right on the page of that parameter so you don't have to look for it. That's why you always see the matrix being opened. It's a shortcut! ;)
You have only used audio in this video, but if you had MIDI clips hosted in LK, Atom, or whatever, you could use that MIDI to trigger elements and modulate parameters (e.g. with automation lanes). This was our primary goal with VS: making use of the musical information while it is being played. As for the timeline, there are many ways this could be interpreted. We would love to hear your thoughts about it.
Thanks for this amazing video! We hope to meet all your expectations very soon. 😊🙏
Imaginando, you make some of the finest apps for iOS, and the fact that you are approachable and interact with the community like this makes you even more special. And DRC is still the best iOS synth!
@@JamieMallender we feel very honoured by your kind words! Thank you! 🙏😊
@@ImaginandoPt 💜💜💜💜
Love @imaginando!
@@swancakes 😊🙌🙏
Hey Jakob, thank you sooooo much for the mention! Really grateful for that! 🙏❤️ Regarding VS, I totally get your points. On many of these, @Imaginando is already working hard.
Regarding performance: I don't know if you have noticed, but setting the output resolution in VS' settings also affects performance. So if you're just jamming around and don't need full HD resolution, you can easily go with 720p or lower while maintaining a more stable and higher framerate. Btw, I realized that VS performs better in standalone mode, I guess because it gets more resources from the system. That said, if you're working with MIDI instead of audio triggers, you could just route MIDI from AUM, for example, running in the background, into standalone VS. Standalone mode also has the advantage of fullscreen output via HDMI while still keeping the UI on the iPad, great for live tweaking and jamming with VS.
Regarding sequencing/timeline: for now, there is no built-in sequencer, but almost all parameters/layers are MIDI controllable. So if you use it inside a DAW or in combination with a MIDI sequencer in AUM (e.g. Atom), you could build sequenced visuals. Have a look at the official promo video I created for VS; it was actually made using just one instance of VS and a lot of automation/MIDI sequencing affecting visual parameters and the visibility of layers in VS. All running in realtime and in one take. The trailer was done in Ableton Live, admittedly, but this is theoretically also possible on iOS.
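The automation-to-MIDI idea above is just standard MIDI plumbing: a host automation lane produces a normalized value, and the sequencer turns it into a 7-bit Control Change message that a MIDI-controllable parameter (like the ones in VS) can consume. A minimal sketch of that conversion, with a made-up function name and CC number purely for illustration:

```python
def automation_to_cc(value, cc_number, channel=0):
    """Turn a normalized automation value (0.0-1.0) into the three bytes
    of a MIDI Control Change message: status, controller, 7-bit value."""
    v = max(0.0, min(1.0, value))      # clamp out-of-range automation
    status = 0xB0 | (channel & 0x0F)   # 0xB0 = Control Change, low nibble = channel
    return (status, cc_number & 0x7F, round(v * 127))

# a rising automation lane sampled at four points, sent as CC 74
ramp = [automation_to_cc(x / 3, cc_number=74) for x in range(4)]
```

Any host that can draw an automation lane and output MIDI (Ableton Live, Cubasis, Atom in AUM, etc.) is effectively doing this per tick.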
Oh, and since you mentioned LumaFusion and layer settings: there are also overlay methods built into VS. Above the color selection wheel, there's a dropdown with various blend methods (multiply, screen, subtract etc.). This is really great in combination with other materials or your own background layer; e.g. using the subtract mode on a shader "masks" all the underlying layers. Great for creative visual effects that immediately look more interesting.
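Those blend modes Marcus lists are the standard compositing formulas, usually defined per channel on values normalized to 0–1. A sketch of the math (illustrative, not VS's implementation):

```python
def multiply(a, b):
    # darkens: anything times 0 goes black, times 1 is unchanged
    return a * b

def screen(a, b):
    # lightens: inverse-multiply of the inverted channels
    return 1 - (1 - a) * (1 - b)

def subtract(a, b):
    # clamps at 0, which is why a bright shader on top can "mask"
    # everything underneath it, as described above
    return max(0.0, a - b)
```

With subtract, any pixel where the top layer is at least as bright as the bottom ends up at 0, so the shader's bright regions punch black holes through the layers below.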
Anyway, thanks again for your great videos and keep up the awesome work on your channel! Take care, Marcus
Perplex On, I'm thinking maybe feed VS MIDI from my Akai Force, though since I'm on an original iPad Pro 12.9” I have a feeling it's going to be a bottleneck. I still need to try the desktop demo on my M1 MacBook, as well as the Ganz Graf MOD Max4Live device. Sooooo cool, but looking like it's ALMOST there.
Thanks Jakob very helpful top man 👍😉
Thanks!
You can now import your own shaders.
Go and check out VS and Perplex on!
▼
▼
▼
Imaginando home ► www.imaginando.pt/products/vs-visual-synthesizer
Perplex On UA-cam ► ua-cam.com/users/perplexon
Perplex On Instagram ► instagram.com/perplex_on/
Looks very interesting. I saw it now supports video + audio recording simultaneously, I think I’m gonna buy it, Jakob! Thx 🙏 for the awesome review.
Been holding off on this app for far too long, you convinced me!
Have fun!
From this video I learnt about Magic Music Visuals, and it is way way way better if you don't need iOS.
That’s brilliant!! Great video as usual, Jakob 📱✨😺👍
Thank you!
Just scooped it in their birthday sale $19.99 had my eye on it since release, now is the time haha
wonderful tutorial
Another alternative for people who like to delve in much deeper is Magic Music Visuals, BTW. But Mac and Windows only.
Excellent review. Well done.
Thanks
Dude are you psychic or something!? I was literally just downloading this and watching some videos on it!!! 😂
I’m not psychic but I do eat a lot of cabbage! 💋
Whoa!
Awesome 👌 I was looking for an iOS program that could do this. Great tutorial too. Thanks man ✌
Awesome video as always!
Thank you!
On PC, just use Spout to send the video out to OBS and record there.
@Jakob, any thoughts to add to the Description of this video on the most recent version ?
Buddy, your 360 spin is my personal hello...❤️ U J!!
Aw 💋
The sync issues are exactly why I have not jumped on this app yet. I know sync is very frustrating. I will stick to Apple Motion if I want to do visualizations. Thanks for the video. A timeline would be awesome, as would export. Maybe the standalone version could have a timeline where you could sync video and audio together and export it all as one synced-up unit. Having something like Multiband would be nice for isolating the different frequencies to send to the visualizer.
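The frequency isolation this comment asks for can be approximated very cheaply: run the audio through a one-pole low-pass filter to get a low band (kicks, bass), and treat the residual as the high band (snares, hats), then let each band drive different visual layers. A generic DSP sketch, with a made-up function name and an arbitrary filter coefficient:

```python
def split_bands(samples, alpha=0.2):
    """Split a signal into a low band (one-pole low-pass) and a high band
    (the residual), e.g. so kicks and hi-hats can drive separate layers.

    alpha in (0, 1]: higher alpha means a higher cutoff frequency."""
    low, high = [], []
    state = 0.0
    for x in samples:
        state += alpha * (x - state)   # one-pole low-pass filter step
        low.append(state)
        high.append(x - state)         # residual = everything above cutoff
    return low, high

# a constant (DC) input settles almost entirely into the low band
low, high = split_bands([1.0] * 200)
```

By construction the two bands always sum back to the original signal, so nothing is lost in the split; a real multiband setup would just stack more of these crossovers.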
I’d love it if they make a separate app made for Audio visual sequencing. I think an app built around that concept might be better in the end. It’s kinda hard changing stuff sometimes when one didn’t plan for it. But yeah. Man I wish I had a timeline and export.
@@JakobHaq It would be nice if they made it like a standalone video DAW, with AUV3 support in the app and individual video tracks.
The sync issues are something we definitely want to fix. We also want to include audio in the video. I'm still puzzled by the way you see the timeline, though. If VS is used as a plugin, it can benefit from the host timeline, if there is any. Let's suppose Cubasis is used, and VS is loaded as a plugin into Cubasis. The arrangement timeline could have automations that send MIDI to VS. This is actually how the video trailer of VS was done in Ableton Live. But I would definitely like to hear how you envision the timeline workflow within VS itself.
@@ImaginandoPt I guess I am looking at it in standalone mode, not DJ style. Say I have a track, either with stems or without. I want to import the audio into a timeline, then manipulate VS, automate, etc., to do what I want, then have it render the video with audio so I can publish. If used in Cubasis, could it work that way and export audio and video? Just trying to eliminate extra apps.
Thanks for this nice eye candy adventure!
My pleasure!
Awesome video
Thank you!
Thanks for the tutorial. There’s really no way to route mic or external device audio directly into VS without going through AUM ?
Not bad, a bit of work. Does it really do audio beat matching?
Oh yeah custom shaders will be great, I'd also like to see webcam support, and more options for using video clips with modulation. I want to do things similar to the stuff like zwobot or ebosuite do inside of Ableton Live. You know what would be handy, is if you could see all the alpha settings in one page so you can adjust them quickly instead of going through each layer.
An alpha channel overview with controls would be amazing, yeah.
Thank you
My pleasure!
It's great to be back and see you presenting in the original way rather than stomping around your room with a microphone. There was a period in your development where your personality changed. I left and came back and you're back to normal! Ya! Many thanks. Whatever we're doing artistically, we're always changing.
This just makes me want to stomp around my room again. My personality is simple. Someone tells me they don’t want me to do something, I do it. 💋💋
Oh, and welcome back to the channel! 💜
@@JakobHaq I never asked you to do anything. I shared my return and that I liked your original presentation.
My main question would be if using your iPad for a live performance with AUM and several synths running, what would be a good option for running live visuals, which I assume would need to be done with a mac or PC that is somehow receiving audio from the iPad or a mixer. I assume a modern iPad couldn't do everything itself all at once. Seems like that would be really taxing.
Massive sale on this for the next 12ish hours. Picked it up for $4
Dude! I'm so glad you did a video on this app, I was really interested to hear your thoughts. I really like it but I haven't produced any UA-cam content with it yet. I will be doing so, but I feel that it requires a certain investment of time to do it justice, so I'm just looking for a nice little gap in my schedule.
It can take some time to get stuff going; however, for me it's mostly been like... well, I get lost in how much fun I'm having. 😂 Once you get the hang of the controls in there, it's quite straightforward in what it does.
@@JakobHaq I’ve had a really good pissypants around with it and made some short little sequences - I like using the midi to make the kick and snare make stuff move.
Dope…Enter the Apple Metal API 😂 help is on the way 🖤😎
Great demo Jakob, I was playing with the demo on my Mac Mini (2011, 16 GB RAM) and WOW, it brings it to its knees. What iPad were you using? I'm still on a 2015 iPad Pro 12.9”. I want to try the Ganz Graf MOD Max4Live device in Ableton Suite on my M1 MacBook too before making a decision which way to jump… And of course still drooling over Resolume, which is amazing but PRICEY.
VS doesn't record audio, just the video, eh? I could PROBABLY get around that by recording HDMI out into my Blackmagic Stream Deck… maybe…
I'm using an iPad Pro 11” from 2018. When it comes to graphical apps I'd recommend getting the latest model of iDevice. There's a lot that can be done with older devices, but apps like these need a fast and powerful graphics processor.
@@JakobHaq thank you for the insight
😳🤯😳🤯Take My Money 💵💰💸💰💵💸!!! 🤤🤤🤤
Omg Jakob, what’s the jam at the beginning? Release that shit. Haha
It’s called “You need to hear this babe” and it’s been released both as a music video here on UA-cam and as a track on all major streaming platforms.
▼
Video: ua-cam.com/video/ulOuHZlSF14/v-deo.html
▼
Streaming: distrokid.com/hyperfollow/jakobhaq/you-need-to-hear-this-babe
I noticed that background items like the cosmos were visible through the sun and the grids. Can that be fixed? Also do not ever worry about "smart". That term is relative. And hardware limitations. There is a reason engineers use Precision workstations when they are designing things.
The nebula is showing through the sun because of how the alpha is set. I’d do stuff a bit differently if I was working with something like MMV instead. It would be nice to work with transparency layers in a different way.
@@JakobHaq We don't know if you are aware of it, but you can control the Z index of a layer by pressing, holding, and dragging it to the left or right. Left is bottom-most, and right is top-most. This would allow having the nebula on top of the other stuff.
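The alpha behaviour discussed here is just standard "over" compositing: a top layer with alpha below 1 always lets some of the layer beneath bleed through, no matter what the Z order is, which is why the nebula shows through the sun. A single-channel sketch of that operator (illustrative only, not VS's renderer):

```python
def over(top, bottom, alpha):
    """Standard 'over' compositing of one colour channel: the top layer
    at the given alpha, drawn over an opaque bottom layer."""
    return alpha * top + (1 - alpha) * bottom

sun, nebula = 1.0, 0.4
# a fully opaque sun hides the nebula; at alpha 0.7 it bleeds through
opaque = over(sun, nebula, 1.0)
translucent = over(sun, nebula, 0.7)
```

So the two fixes are complementary: raise the top layer's alpha to hide what's below, or reorder the layers (as the drag gesture above allows) so the thing you want visible sits on top.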
Can Magic Music Visuals do 3D fractal shaders?
I’m not sure as I haven’t worked with it. But after doing some googling I’ve seen that people make tutorials on fractal stuff for mmv. No idea if they’re 3D though. Wish I could help but I know too little.
Lumen? I thought that was all you needed if you have a Mac.
Yes, great video synth for Mac users, but it's not available for iOS.
Nice video. I'll glean what info I can for PC lul
are we playing or drawing? did i get the wrong channel?
try synesthesia
$20 US is too much for something still working out the bugs. I could go $20 if the a/v sync was locked in.
Fair point
So you absolutely love the app despite it not being what you want an app like it to do. Are you sure?
I don’t have to like everything with a thing or person or animal in order to love it or them.
I have played with VS, but having no render option makes it almost entirely useless for video production. My M1 iPad Pro drops framerates with a couple of layers too, so no real optimisations there. I used Staella and LumaFusion to do this (with music done in Gadget). ua-cam.com/video/COqrxRgiWgY/v-deo.html
But for more elaborate stuff, I use VirtualDJ, which allows the SmartToys shaders and can render out files suitable for Luma Fusion.
I tried Staella and I liked it. It's just way too limited for my taste. I need loads of controls, basically.
@@JakobHaq Yep, that's its limitation, but it's still nice to play with. We definitely need a SmartToys-able client for iOS.
Do you have a link to a website where I can check out this "SmartToys" thing? I have no idea what it is!
Works brilliantly with Virtual DJ.
I haven’t been able to even get my screen recordings to save with VS on a Gen 3 Pro. First time I haven’t had full functionality of an AUV3 on this machine outside of Wavestorm, which I believe is gone from the App Store.
Staella is great…..
ua-cam.com/video/AL2XnTgC7ho/v-deo.html
Man, I thought this would work totally differently. It has all the great visuals, but how you set this up is super annoying, and they made BAM so intuitive. I'm just confused here.
I'm here to PLAY, not to DRAW. I'm already disturbed by Animoog becoming increasingly similar to AutoCAD. This seems to me like a generator of images that is uglier than many other hypnotic ones available on the market that don't claim to be called synths.
I tried to like this plugin but it's hot garbage...