
Showing posts from July, 2023

TouchOSC to Unreal 5

Playing with TouchOSC to create an external interface on an iPad that can control content in Unreal 5. OSC (Open Sound Control) sends simple float values across your network that can be used to trigger events, which opens up loads of potential for live experiences and virtual production. I just need a bit of time to research and experiment with some ideas. This was adapted from a great introductory tutorial by Aiden Wilson, found here - https://youtu.be/9CkKPCBys44

Blueprint of OSC Setup

TouchOSC can be downloaded from https://hexler.net/touchosc

For the OSC server in Unreal, set the IP address to 0.0.0.0 or leave it empty, listening on port 8000 (or a similar port not being used by other apps / hardware). In the TouchOSC HOST setup, enter the IP address of the PC - use the command prompt to find it; ipconfig should reveal the IPv4 address. The Blueprint listens for messages sent via OSC, which can be broken down into Events - in the event you can receive the float value ( ...
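If you want to sanity-check that the Unreal OSC server is listening before involving the iPad, a tiny script can fire a test float at it. This is only a sketch, assuming the python-osc package; the "/fader1" address and the target IP are placeholders - TouchOSC is the real sender, and the address should match whatever your Blueprint event dispatches on:

```python
# Minimal OSC test sender (assumes: pip install python-osc).
# "/fader1" and the IP are placeholders - use the address your TouchOSC
# layout sends and the PC's IPv4 address from ipconfig.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.20", 8000)  # Unreal OSC server IP / port
client.send_message("/fader1", 0.75)            # single float, like a fader value
```

If the Blueprint's OSC event fires when you run this, the network and port setup is fine and any remaining problems are on the TouchOSC side.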

360 Experiences - made for UnCommons #2

Blast to the past - on 8th December 2016, for UnCommons #2, I created several immersive 360 experiences using After Effects and 360 photos. These virtual Christmas baubles, which the audience could step into using their smartphones, were developed for an UnCommons event at the Delius Center / Artworks Creative in Bradford. This was an after-party event for the Wearable Tech event at the Bradford Media Museum. At the event - which was also a silent disco - audience members could launch the 360 videos via a QR code in the building for playback on their phones, or with a Google Cardboard headset. The sonic collaboration was with Dr. Marlo De Lara, who developed the synthetic soundscapes - some of the audio was used to create audio-reactive visuals in some of the baubles, or more generally to evoke the visual qualities created for them - elements of fragmentation, drifting, and loss of equilibrium. De Lara produces textural compositions which develop from micros...

Adobe Character Animator / Unreal 5 Mashup

Lots of potential here - having Adobe Character Animator puppets playing live into an Unreal environment opens up opportunities for cartoon digital puppets in live shows and immersive theatre. The setup here pipes a live feed generated by Adobe Character Animator - output via Mercury Transmit and picked up with NDI - into Unreal as a video texture with an alpha channel. NDI Tools is available to download here - https://ndi.video/type/ndi-tools/ - and you then need to install the NDI Unreal SDK; you will need to fill in an email form to receive a link. An alternative would be Off World Live's toolkit, which has NDI streaming - https://offworld.live/resources/download-center. Here's an old tutorial that shows the basics of setting up an NDI receiver and material in Unreal.
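Before wiring anything up inside Unreal, it can help to confirm that the Character Animator feed is actually visible as an NDI source on the network. Below is a rough sketch using the unofficial ndi-python bindings - not part of the workflow above, and the exact function names are an assumption based on that package's examples, so treat it as a starting point rather than a recipe:

```python
# Sketch: list NDI sources visible on the local network.
# Assumes the ndi-python package (pip install ndi-python); the function
# names mirror the NDI SDK and may differ between versions.
import NDIlib as ndi

if not ndi.initialize():
    raise RuntimeError("NDI runtime could not be initialised")

finder = ndi.find_create_v2()
sources = []
while not sources:
    ndi.find_wait_for_sources(finder, 1000)          # wait up to 1 second
    sources = ndi.find_get_current_sources(finder)

# The Character Animator / Mercury Transmit feed should show up here
for source in sources:
    print(source.ndi_name)

ndi.find_destroy(finder)
ndi.destroy()
```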

Playing with Unreal 5 - Motion Graphics

Over the year I have been dabbling with Unreal to get a sense of how to use it for Realtime Motion Graphics, and particularly Audio Reactive visuals. Niagara particles were a great area to investigate, with plenty of flexibility to explore and easy(ish) to set up with Blueprints. Above is a simple Niagara setup rendering meshes at each particle point - a great place to start to get my head around the power of this system in Unreal 5 and to see how far I could push my computer with realtime rendering. Using Sam Schreuder's Niagara particle tutorial example - which can be found here - https://www.youtube.com/watch?v=UETAS5g-q4M&t=0s - you can quickly set up the basics of sound reaction that affects properties of the particle emitter. It's a short tutorial and covers the fundamentals to investigate further. The only downside of this method is that it uses in-game audio rather than a live feed coming in from a line-in / microphone (that's on the to-do list to explore)...
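One way to work around the in-game audio limitation, at least while the Unreal-side capture is still on the to-do list, would be to sample a microphone outside Unreal and send the level in over OSC, as in the TouchOSC post above. A rough sketch, assuming the sounddevice, numpy and python-osc packages, and a hypothetical "/audio/level" address that you would bind to a float event in your own OSC Blueprint:

```python
# Sketch: stream microphone level to Unreal over OSC instead of in-game audio.
# Assumes: pip install sounddevice numpy python-osc. The "/audio/level"
# address is hypothetical - bind it to a float event in your OSC Blueprint.
import numpy as np
import sounddevice as sd
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)  # Unreal OSC server, as set up earlier

def callback(indata, frames, time, status):
    # RMS of the current audio block as a rough loudness value
    level = float(np.sqrt(np.mean(indata ** 2)))
    client.send_message("/audio/level", level)

# Open the default input device and keep sending levels for 60 seconds
with sd.InputStream(channels=1, samplerate=48000, blocksize=1024, callback=callback):
    sd.sleep(60_000)
```

The float arriving at "/audio/level" could then drive Niagara emitter properties through the same kind of Blueprint event used for the TouchOSC faders.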

MetaHuman 2.0 - Using Raw Camera Input

This quick exploration uses the FaceTime camera on an iPad Pro (2nd gen) to capture video and depth, which is then post-processed onto a MetaHuman face rig. The method isn't realtime, but it allows for more subtle performance in the facial muscles. The process doesn't take too long, makes the sequences easy to edit, and allows retargeting of the data onto other MetaHuman 2.0 rigs. The 2nd gen iPad Pro isn't listed as a supported Apple device for Unreal Live Link, but so far it seems to behave appropriately. In terms of online support, Epic's own development documentation gets you up and running quickly, which you can find here - https://dev.epicgames.com/documentation/en-us/metahuman/metahuman-documentation