Matthew Cooksey
Joined 1 Sep 2013
Videos
10 Unreal Engine 5 Tutorials in 10 Minutes
339 views · 8 months ago
Hey everyone! In my latest video, "10 Unreal Engine 5 Tutorials in 10 Minutes," I take you on a whirlwind journey from a blank canvas to crafting an interactive app filled with dynamic materials, 3D movement, and stunning particles. 🌌 Starting with absolutely nothing, I'll show you how Unreal Engine 5 can transform your creative visions into reality. This isn't just about making games; it's abo...
Virtual Production Tips - Mastering the Digital Studio
802 views · 8 months ago
Step into the future of filmmaking with our expert tips on virtual production. This video offers practical advice for navigating the digital studio landscape, from working with teams to integrating hardware. Learn how to approach virtual shoots, collaborate effectively, and make informed decisions to enhance your virtual production projects over time. Perfect for filmmakers looking to refine th...
How to Control Unreal Engine with Ableton Live [Tutorial, UE4, Ableton Live]
2.6K views · 9 months ago
Unlock the potential of 3D animation in Unreal Engine with this in-depth tutorial on controlling it using Ableton Live. Learn step-by-step how to set up Ableton Live to seamlessly integrate with Unreal Engine, allowing you to manipulate 3D animations live using OSC. Whether you're a game developer, animator, or music producer, this tutorial will open up a new world of creative possibilities. Di...
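For readers curious what the Ableton-to-Unreal link looks like on the wire: OSC messages are small UDP datagrams. Below is a minimal, dependency-free sketch of a sender; the `/kick` address and port 8000 are placeholders to be matched to whatever address and port the OSC server inside Unreal is configured with.

```python
import socket
import struct

def osc_pad(s: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    return s + b"\x00" * (4 - len(s) % 4)

def osc_message(address: str, value: float) -> bytes:
    # Minimal OSC 1.0 message: padded address, type-tag string ",f",
    # then the argument as a big-endian float32
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

def send_osc(address: str, value: float,
             host: str = "127.0.0.1", port: int = 8000) -> None:
    # UDP is fire-and-forget: whatever OSC server listens on host:port gets it
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message(address, value), (host, port))

# Example: fire a placeholder "/kick" trigger at a local OSC server
send_osc("/kick", 1.0)
```

In practice Ableton usually sends OSC via a Max for Live device rather than hand-rolled code, but the datagram format on the wire is the same.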
Unleashing AI Power in Gaming: Integrating ChatGPT with Unreal Engine
6K views · 9 months ago
Step into the future of game development with our latest video, where we explore the groundbreaking integration of Chat GPT into Unreal Engine using the HTTP GPT plugin. This powerful combination opens up new horizons for game developers and creators, enabling the creation of dynamic AI agents with unprecedented levels of interaction and intelligence. What You'll Discover: Cutting-Edge Integrat...
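As a rough illustration of what an HTTP-based integration like this does under the hood: each chat turn is an HTTP POST to OpenAI's Chat Completions endpoint, with the running message history in the JSON body. This is a sketch of the request shape only (no network call is made); the model name and the NPC persona are placeholders, not anything from the video.

```python
import json

# OpenAI Chat Completions endpoint that engine-side HTTP calls would target
API_URL = "https://api.openai.com/v1/chat/completions"

def build_chat_request(history, user_message, model="gpt-3.5-turbo"):
    """Build the JSON body a chat turn would POST.
    Note the whole history is resent each turn - this is what drives token cost."""
    messages = list(history) + [{"role": "user", "content": user_message}]
    return json.dumps({"model": model, "messages": messages})

# A hypothetical NPC persona supplied as the system prompt
history = [{"role": "system", "content": "You are a gruff blacksmith NPC."}]
body = build_chat_request(history, "Do you sell swords?")
```

Inside Unreal the same POST would be issued by the plugin (with an `Authorization: Bearer <key>` header), and the reply's `choices[0].message.content` becomes the NPC's line.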
Virtual Production Studio Upgrades - [UE 5, Aximmetry]
2.9K views · 9 months ago
Dive deep into the heart of innovation with our latest video, showcasing the groundbreaking updates we've made to our virtual production studio. We're pushing the boundaries of what's possible in virtual production, and we're excited to share our journey with you. What's New in Our Virtual Studio? 3D Masking Magic: Discover how we're using advanced 3D masking techniques to seamlessly integrate ...
A Weekend With Friends In Folkestone
121 views · 9 months ago
Live AV Virtual Production - Synchromotion Devlog 6 [Unreal Engine 5, Virtual Production]
386 views · 9 months ago
Welcome back to our journey with 'Synchromotion'! 🌟 Devlog 6 dives into the thrilling evolution of our project as we embark on building Version 2.0. This isn't just an update; it's a complete reimagination, bringing new dimensions to our audiovisual symphony. 🎵🖼️ In this episode, I'm thrilled to share my plans for the enhanced graphics that will redefine our visual storytelling. Expect a blend ...
Unreal Engine and ChatGPT Agents. Game design using custom dynamic agents.
463 views · 9 months ago
Synchromotion - Live AV Project Devlog Ep.5 [Virtual Production, Unreal Engine, Ableton Live]
328 views · 9 months ago
Dive into the latest episode of the Synchromotion Devlog, where we take you behind the scenes of our groundbreaking audiovisual project. This episode offers an exclusive look at the intricacies of our live shoot, showcasing the synergy of technology and creativity that brings Synchromotion to life. 🎥 Behind the Scenes: Join us on set to witness the magic of Synchromotion in action. From the set...
Synchromotion Live AV Performance [Unreal Engine, Ableton Live, Virtual Production]
2.1K views · 10 months ago
🌟 Welcome to the grand finale of Synchromotion, an audio-visual performance where music and graphics collide in real-time! Dive into a live session where I perform two mesmerizing songs using Ableton Live and Ableton Push, integrated with stunning live graphics powered by Unreal Engine. 🎶 What's Inside: Dual Performance: Experience the harmony of audio and visuals as I control both elements liv...
Synchromotion - Live AV Project Devlog Ep.4 [Virtual Production, Unreal Engine, Ableton Live]
243 views · 10 months ago
🎥 Welcome to the latest episode of our Synchromotion Devlog series! In this episode, we're diving into the exciting process of setting up our virtual production studio. Get ready for an in-depth look at the behind-the-scenes magic that brings Synchromotion to life. 🚀 What's Inside: Studio Setup: Watch as we transform our studio space into a hub of creativity and technology. From positioning the...
How to use ChatGPT
88 views · 10 months ago
Welcome to an insightful journey into the world of ChatGPT! 🚀 In this video, I, Cooksey, delve into the practical and innovative ways to leverage ChatGPT for enhanced productivity and creativity. Here's what you'll discover: Building Context with ChatGPT: Learn how to effectively build context in conversations with ChatGPT. I'll guide you through the process of getting ChatGPT to ask targeted q...
Synchromotion - Live AV Project Devlog Ep.3 [Unreal Engine, Ableton Live, Ableton Push, TouchOSC]
149 views · 10 months ago
Synchromotion - Live AV Project Devlog Ep.2 [Unreal Engine, Ableton Live]
101 views · 11 months ago
Content Innovations Devlog 24th November 2023
51 views · 11 months ago
Innovations Devlog 17th November 2023
38 views · 11 months ago
Synchromotion - Live AV Project Devlog Ep.1 [Unreal Engine, Ableton Live]
172 views · 11 months ago
R&D Team Visit the Cloud Academy Team
82 views · 11 months ago
Content Innovations Devlog 9th October 2023
59 views · 1 year ago
Virtual Production Studio Tour Deep Tech Dive (Unreal Engine, Aximmetry)
16K views · 1 year ago
Blackmagic Cloud Storage - How to Set it Up and Why It's Awesome
1.6K views · 1 year ago
Script for copying and renaming files based on Shotlist
13 views · 1 year ago
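A script like the one in the video above might look something like this in Python. This is a sketch, not the author's actual script; the `shot` and `filename` column headers are assumptions to be adjusted to your own shotlist.

```python
import csv
import shutil
from pathlib import Path

def copy_from_shotlist(shotlist_csv, source_dir, dest_dir):
    """Copy each clip named in the shotlist into dest_dir, renamed after its
    shot ID. Assumes two columns, 'shot' and 'filename' (hypothetical headers)."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    with open(shotlist_csv, newline="") as f:
        for row in csv.DictReader(f):
            src = Path(source_dir) / row["filename"]
            if src.exists():
                # Keep the original extension, prefix with the shot ID
                shutil.copy2(src, dest / f"{row['shot']}{src.suffix}")
```

`shutil.copy2` is used rather than `copy` so file timestamps survive the move, which matters when you later sort rushes by capture time.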
Really awesome - love the use case! AI is definitely disrupting the video game industry in a major way.
How do I find clients for my studio? I don't really understand how I can make money from my studio.
I still want to ask: is it possible to connect to the game through a ChatGPT interface?
Hey! I'm using this plugin with Whisper right now. For some reason, it works in Simulate mode only (with typing in the message). When I set the Content variable using my recognized speech in PIE, it doesn't work at all. I'm wondering if you or anyone else using this has encountered the same. Thanks!
Hi Matthew Cooksey, I hope you're doing well. Could I get in touch with you? It would be great if I could have your email address or phone number. I need some assistance and would really appreciate your help!
Can you show the "Flow" map in Aximmetry? Still trying to figure out how to record each camera separately. Thank you!!
I'd love to buy some of your SynchroMotion files/blueprints that are ready-made. I'm not very good at creating from scratch like you do, so I'd love to have ready made but flexible systems that I can use creatively with Ableton and UE!
Thanks for this video
So happy for you that you've got a 4K screen ... ever considered how much of that can be read on a 2K monitor?
Bruh, um, still waiting for the next one to drop 😅
Thank you for making this video. It answered many of my questions about Blackmagic Cloud. I have one question left. Does Blackmagic Cloud sync the little files you bring into a project that are collected or created after the shoot such as sound effects, music, narration audio clips, and images? Or does it only sync video proxies and originals?
Thanks a lot! This is truly amazing!
I synced a project from my client's PC to my system and edited it, but when new graphics and music are added to the project on the client's PC they don't get synced to my PC. They show as offline, even though the client's PC shows they are synced to the cloud.
Pretty slick setup! The prompter in the thumbnail caught my eye. 😎
Hello. I'm following each step, and while the signal from Ableton to Unreal is being sent and the print appears, the sound doesn't play along with the video. Could you suggest a solution? Thank you.
Do you think I could do all this on an M2 processor with 8 GB of RAM? I don't want to commit if my computer is going to quit before I do :(
I'm a drummer thinking about doing mocap + instruments for live virtual productions. I'm so glad I stumbled onto your video to know that others are doing this same kind of work! These kinds of virtual productions are super interesting, especially when it's done live and you can respond to the music in real-time. I think Unreal is perfect for these kinds of things and I'm definitely going to be following along!!
🙌 Thanks!
Hey Matthew, I'm working on something similar and was wondering if there's a way to contact you for information concerning this topic. Kind regards
Thanks for sharing your experiences with us... we are part of this community, testing and learning every day.
How do you select what pass to record on each HyperDeck? For example green screen on one and final composite on another?
Thanks for the tutorial, that's what I've been looking for!
Nice. Also, maybe you should make your display resolution 1920x1080 in display settings, just because it's easier to see on video, at least when you're recording.
Yes please! Even my 3440x1440 monitor cannot show the small text in Blueprints correctly
Any chance there's a way to get this to work using external audio like a microphone?
So vague from 3:30 on. We're watching tutorials for a reason, dude; it doesn't help that the text is illegible.
He lost me at getting the API Key....
How do you get the timecode from the cams and the Mars synced up together in Unreal in real time? We're using DeckLink cards in combination with Tentacles to send timecode into the cams and the Mars, but the cams are always out of sync because the Blackmagic Media Bundle doesn't get timecode. 😔
It's not the timecode that matters so much as genlock. Genlock ensures that every frame is in sync with the tracking data coming in, otherwise they are coming in at different rates and will drift over time, even if you set it up correctly. You do need timecode however if you want to do post comp work. That means that your tracking data and video will have the same timestamps making it much easier to synchronize them when you're doing compositing etc.
@@cookseyyy Ah, I understand. So the timecode is good for later syncing in post. 👍 But to sync the scene with the camera, after I've recorded it on the cam with timecode, how do I get the same timecode from something like a Tentacle Sync into the scene recording in Aximmetry, or even Unreal Engine directly? I mean, I can plug the same timecode from the Tentacle Sync into my PC's audio jack (they're in sync with each other, so the cams and the PC get the same audio timecode). But then I need to get the audio into Aximmetry, convert the left audio channel into a timecode, and feed it somehow into the scene recording inside Aximmetry to sync it up later in post. How are you handling timecode data in Aximmetry or Unreal? Are you providing each timecode separately, or is there a way to use the left channel of the SDI cam signal as timecode in Aximmetry? 😅
@univers Unreal does receive timecode input. In the Content Browser, find the Blackmagic Timecode Provider. After setting it up, go to Project Settings and change the timecode settings accordingly.
Thanks, but everything is very small on the screen; I can't see anything.
Great tutorial man, much appreciated!!
Glad it helped!
The mask has this techie-military, non-human vibe, but you're dancing and moving a lot, so it doesn't match the style. It also reduces your visibility, so you have to turn your head in a more dorky way.
Amazing tips. Thanks for all the advice Cooksey.
Any time!
Great tips, and all 100% true and valuable for people in virtual production! Also, what are the best Aximmetry forums you use? You're right about the documentation not being great.
I use the official ones on their site as well as some Discord channels. I find it helpful to go to the source though, and I've met some super helpful people on there! (I'm mrlargefoot on there, btw.)
@@cookseyyy Thanks. I have analysis paralysis on Aximmetry. Right now I am looking at the OWL Composure plug-in to see if it will work for me. I'll look for you!
Do you need genlock for the sync? Because the C70 has timecode in/out, depending on your preference.
It depends on the tracking system. The Vive Mars system needs genlock to stay in sync so the C70s aren't technically in sync. However this only matters if you plan to move the cameras independently. We typically only move our A camera which is an Ursa 12k which does have sync so we just use that.
Thanks for the tutorial. I managed to get it working on my computer, but I can't figure out how to receive the OSC data from a local server between computers. Is there a plugin you need to activate in order to receive messages from a network? I'm using the IP and port from the network; Ableton recognizes the incoming OSC data, but Unreal doesn't.
It's hard for me to visualize your setup so I'm guessing a bit here, but I'll give it a try! You should just be able to change the IP address of the server you set up in UE to be a proper network IP (not 127.0.0.1). The trick is pointing to that IP address from whatever other device you're using. So, for example, when I'm in the studio my render machine is 192.168.0.14 (or something), and then on my laptop in Ableton I make sure 1) I'm on the same network and subnet, and 2) I'm sending all my OSC data to 192.168.0.14, not some other IP such as the laptop itself.
Thanks!! I figured it out: I had to use a Max plugin on the computer running Unreal so it changes the incoming network IP into the computer's IP. Don't know why, but Unreal Engine doesn't recognize the IP of the network we are using.
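The fix discussed in this thread comes down to which address the receiving socket is bound to, a general networking point rather than anything Unreal-specific. A minimal stdlib UDP sketch (not Unreal's actual OSC server): a socket bound to 127.0.0.1 only ever sees loopback traffic, while one bound to 0.0.0.0 listens on every interface, so machines elsewhere on the LAN can reach it too.

```python
import socket

# Bind a UDP receiver to loopback only. An OSC server bound this way never
# sees packets arriving from other machines on the LAN; binding to "0.0.0.0"
# instead would listen on all interfaces.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
receiver.settimeout(2.0)
port = receiver.getsockname()[1]

# A sender on the same host can reach the loopback-bound socket just fine
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"/test", ("127.0.0.1", port))
data, addr = receiver.recvfrom(1024)
```

The same choice shows up as the server IP address you enter when creating the OSC server in UE: a loopback address works for same-machine Ableton-to-Unreal setups but not for two-computer ones.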
Hey dude! Thanks for this video. Pretty good. I'm actively researching cloud-based editing, so it was great to come across this. I'm still confused, though. I'm a freelancer, usually on a tight budget, a solo operator. Media's got more expensive ever since I started shooting R3D two years ago. And I'm always on the run. I shoot surfing, travel often, and need to take drives with me everywhere. Do you think that would be an affordable solution for me? How much does the whole BM Cloud workflow cost for you (with Dropbox or Google Drive, all that)?
It can get expensive if you're storing a lot of media. I'd say if you're solo and on one device then a NAS system is going to be much better value for you. However if you need to do the odd remote edit then you can just rely on blackmagic's proxy workflow which is really robust. Then you'd just need to be back on your main machine with OG rushes to get the full rez playout.
❤
Please dont stop working/researching on this topic <3
Hopefully I won't! I'd love suggestions though, or things you'd be interested in learning more about!
Amazing video, thank you!
Glad you liked it!
crazy 🤩
Love how you did not edit out the little family intrusion...:) Super interesting stuff...thanks!! ( & subscribed...)
Thanks! I wouldn't be here without my family!
thank you !
Before I go to bed, I'll put the video speed to 0.25 (and volume to zero) and let it run again in the background, hopefully boosting you in the algorithm's eyes. Good luck & All the best!
Haha, hopefully that will help you sleep too!
Beautiful! Subscribed! ❤
This just kicked me back into UE and AI! This is so cool! Thank you for making this video. A quick question on it. Playing around for a few hours today, I found that I'd spent $5 on OpenAI API tokens. I'm wondering if there is some other way to send/receive messages back and forth. Currently, if the messages are stored in the array and sent multiple times for history and continuity, it looks like it uses more and more tokens just to keep it consistent. What I am currently trying to do is have a manual, like a document about a product, and ask GPT-4 questions related to that document, so every time I send the request, the document is sent again and again. Is there a way to have the document loaded on the OpenAI server, train the model on it, and then spend tokens only on requesting information back from it? Thanks!
Yeah that's a whole area to explore when using gen AI. You pay for tokens input (the context length essentially, which can get quite long when you're always feeding a whole chat in). And you also then pay for token output, essentially the length of the response. This is also related to the model you're using. There are methods for truncation and tokenization to reduce costs for bigger contexts for sure, but I'm not that experienced with that yet. With custom agents and GPTs you can pre-train them on your material which works great for PDFs and manuals like in your use case. The API is slightly different in that case though and I need to explore it further.
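The truncation idea mentioned above can be sketched very simply: keep the system prompt, then keep only the most recent messages that fit a token budget. This is a rough sketch, not the author's method; the ~4-characters-per-token figure is a heuristic for English text, and a real tokenizer (e.g. tiktoken) would give exact counts.

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text
    return max(1, len(text) // 4)

def trim_history(messages, budget=1000):
    """Keep the system prompt plus the newest messages that fit the budget.
    Older turns are dropped first, so recent context survives."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    used = sum(estimate_tokens(m["content"]) for m in system)
    kept = []
    for m in reversed(rest):                 # newest first
        cost = estimate_tokens(m["content"])
        if used + cost > budget:
            break
        kept.append(m)
        used += cost
    return system + list(reversed(kept))     # restore chronological order
```

For the manual/PDF use case in the question, retrieval (sending only the relevant excerpt per query) cuts costs further than trimming alone, at the price of extra plumbing.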
This is helpful, seeing some folks using a studio with the same rough use cases as we've been working on. Getting Unreal set up and working right has been our hangup. I think we want to live key using a Blackmagic Ultimatte.
Nice! Yeah I've looked at the Ultimattes a few times. In reality though the Aximmetry keyer is really great and we haven't needed to use anything different.
Cool, informative video, thank you! I had been experimenting with trying to get a custom GPT to lead a role-playing game by giving it an editable JSON file to use as a game-state file. I was never able to get it to push the story forward in a meaningful way. I'm curious if you have found a method that allows you to actually play out a full narrative arc without it losing context.
Yes! This is a big challenge but there are a number of approaches. The best one I've found so far is to use a system of "Agents". So you're not using one instance of GPT to do everything but break it down into different parts such as characters and story writers. You can get them to all talk to each other to kind of share the load and get things in the right format. I've been building a system for this which I'll share soon! The other option which I've explored less is using GPT functions. They are designed to call and receive data in specific formats which should be useful, especially for working with other APIs
This is so cool. Thanks for such an awesome breakdown! There really is limitless potential here. I've been wanting to make some kind of open world multiplayer game that uses ai so Im looking forward to seeing how you build more systems with this!
I'm open to suggestions!
@@cookseyyy I'd love to know how I could make it do voice-to-text and then respond. But I'm worried that if multiple people are talking to one NPC it'll get super garbled haha
Is there a live-input HDRI application that can feed log 360 images into a real-time rendering application to get GI-like lighting? I.e., for changes in lighting on set, to procedurally relight?
I know a company called Antilatency are working on something like this. But they are doing it in relation to your lighting setup too, so you have a virtual HDRI capture for your environment and then it sends that data to your lighting to match. Your idea is cool though, and would work really well for AR graphics.
@@cookseyyy Thanks for the lead. I can't see anything on their website, but I'll dig deeper. I found some papers from people at universities doing this, but no commercial application. A challenge when doing stage performances (incl. fake holograms) is syncing up the lighting on set to the CG. I see some people have developed DMX-to-Unreal bridges, but I can't find real-time HDRIs.
Love this: At [Your Company/Channel Name], we're pushing the boundaries of what's possible in virtual production, and we're excited to share our journey with you. Good video that approaches the problems and shows a few solutions. Interested in seeing where you are moving to from the Vive system.
Haha! Good spot! As for the tracking system we're going to try out the ReTracker Bliss system. It should be arriving next week so I'll do a setup video for sure!
@@cookseyyy Please make a video on it. I will say that it took me about 4-5 weeks to dial in my Vive Mars Camtrack system however after a few days of testing it seems to be spot on now. Can you share why you are moving away from Vive?
@@Justin_Allen It's fine, but we shoot a lot of content in a day, and it sometimes gets a bit jittery or goes offline randomly. It might be fine for weeks at a time, but then one day if we lose an hour of shoot time it can really affect our output. I just want to try another option at a similar price point to see if it's better/the same/worse!
@@cookseyyy I appreciate your experience. Please update us if it works. I am in the early stages, but it really took me 3-5 weeks to get mine where I feel it is "rock-solid". The longest, and hopefully last, issue was really only solved by a throwaway comment in the Unreal Engine documentation that I hadn't realized made a difference before. Since fixing that, at least over the last week, it's been good for me. I'll be aware of the outages you mentioned now, thanks!
Hi Matthew! Great video! I'm really intrigued by how you implemented custom masking for inputs in Aximmetry. Could you possibly create a tutorial on this? It would be immensely helpful to see the step-by-step process. Also, the Xbox controller feature caught my eye - it looks like a fantastic tool for navigation. I would love to see a detailed guide on how to set it up and use it in your workflow. Thanks for sharing these innovative techniques!
Sure thing! I've been meaning to do it for so long!
Hello there, I also really appreciate your effort, and I'm also interested in the masking-plane feature you built. Can you tell me if you built that in Unreal, or is it an Aximmetry feature?