Matt Baamonde is a multifaceted creative force who constantly pushes the boundaries of art and technology. A shredding guitar virtuoso with a degree from the renowned Berklee College of Music, Matt began his journey in music production and film scoring. After a decade in Hollywood working on hundreds of TV shows and movies in various creative roles, he ventured into the visual arts, specializing as an Unreal Engine VFX artist and character animator.
Driven by an insatiable passion for innovation, Matt thrives at the forefront of virtual world-building. He embraces the limitless possibilities that emerge from the fusion of cutting-edge technology and boundless creativity. As a pioneer in the new era of media and entertainment, Matt is excited to shape the future of immersive experiences.
With a diverse skill set encompassing visual effects, graphic design, 3D animation, photography, video editing, and music production, Matt brings a holistic perspective to his projects. His musical background lends him a unique understanding of rhythm and movement, essential elements that breathe life into his visual creations.
A visionary at heart, Matt's work is fueled by his ability to manifest the impossible. He continually seeks to expand the horizons of what's achievable, transforming his imagination into captivating realities. Through his art, Matt invites audiences to step into uncharted territories and experience the extraordinary.
UNREAL ENGINE | ANIMATION | VIRTUAL PRODUCTION | GAME DESIGN | AI
Degree: Contemporary Writing and Production
Expert in Unreal Engine 5 (& 4) for comprehensive environment creation and level design, including terrain, lighting, foliage, sky, and water surfaces. Skilled in 3D modeling for level design, static mesh optimization, and procedural content generation (PCG), enhancing interactive world realism and NPC AI integration.
Experienced in managing cinematic animation pipelines and rendering processes using Unreal Sequencer, Blender, and Maya, including post-production VFX work in After Effects, Nuke, and/or DaVinci Resolve. Specialized in motion capture cleanup and high-fidelity animation rendering, optimizing sequences for VFX and interactive media.
Proficient in Blueprint Visual Scripting within Unreal Engine, creating interactive applications and games. Expertise in live interactions and sequence activation, enabling rapid feature implementation and gameplay enhancement.
Skilled in advanced virtual production tools, managing the integration of physical and digital elements with technologies like nDisplay, Mo-Sys, and motion capture suits. Proficient in DMX lighting control and real-time sequence activation for enhanced production fidelity.
Experienced in designing custom materials using Unreal Engine, focusing on multi-layer blending and virtual texturing. Optimizes material logic for solid body, organic, and VFX surfaces, improving visual quality and performance.
Proficient in version and revision control using Perforce P4V and Git, ensuring project consistency and collaborative efficiency. Implements robust version control systems to manage project iterations and maintain code integrity across team contributions.
Unreal Engine 4 + Unreal Engine 5, Unity, Cinema 4D, Daz Studio, Maya, Blender, 3DS Max, Houdini
Blackmagic DaVinci Resolve + Fusion, Nuke, Sony Vegas
Photoshop, After Effects, Premiere, Illustrator, InDesign, Lightroom & Lightroom Classic, XD, Bridge, Acrobat, Animate, and Character Animator, Substance Suite
ChatGPT + API, Midjourney, DALL·E, Runway, Wonder Dynamics (Invited Beta Tester), ControlNet, Eleven Labs, Stable Diffusion, Deforum, AnimateDiff, ComfyUI, and Automatic1111
HTML & CSS, JavaScript, Python, C++
WordPress, Shopify, Wix, Squarespace, WooCommerce, and similar front-end web platforms.
Background in Hollywood scoring and music supervision with over 100 film and national TV credits. VES Global member.
Strength: Resourcefulness
Weakness: Coffee
Music, Photography, Guitar, Filmmaking, Fitness, Video Game Development
Raccoon

As part of one of my first projects at CNN, I helped kick off the CNN real-time XR Virtual Weather Set by redesigning the entire Blueprint and logic base, designing looks for the backgrounds, central monitors, skies, and foliage, as well as getting the project performant and ready to air under a fast deadline. Super proud of the incredible artists and engineers on the XR and production teams and all the work they did before and after I joined to get this project to the finish line.





I played a key role in bringing Microsoft's flagship events, Build, Inspire, and Celebrate, to life through cutting-edge virtual production techniques. Using Unreal Engine's nDisplay technology and LED backgrounds, we created immersive, dynamic environments that elevated these high-profile corporate events to new heights of visual engagement and technological innovation.
Unreal Engine Versions: Leveraged both Unreal Engine 4.27 and 5.1, utilizing the strengths of each version to meet diverse production needs.
nDisplay Implementation: Mastered Unreal's nDisplay feature to synchronize content across multiple LED screens, creating seamless, expansive virtual environments.
LED Integration: Seamlessly blended physical presenters with digital backgrounds on massive LED walls, creating a hybrid reality that enhanced the presenters' interactions with virtual content.
Microsoft Build
Designed and implemented interactive 3D visualizations of complex software architectures and cloud systems. Created dynamic, code-themed environments that responded in real-time to on-stage demonstrations.
Microsoft Inspire
Developed a virtual global marketplace, allowing presenters to seamlessly "travel" between different international settings. Implemented real-time data visualization tools to showcase Microsoft's global partner network.
Microsoft Celebrate
Crafted celebratory virtual environments that dynamically changed based on the achievements being highlighted. Designed and animated virtual fireworks and particle effects for key moments in the presentations.
Multi-Version Workflow: Developed a pipeline to efficiently work between Unreal Engine 4.27 and 5.1, leveraging each version's unique capabilities while maintaining consistency across events.
Real-time Rendering Optimization: Fine-tuned rendering settings to achieve high visual fidelity while maintaining smooth performance on LED walls.
Custom Blueprints: Created a library of reusable Blueprints for quick iteration of event-specific features and interactive elements.
Dynamic Lighting: Implemented complex lighting scenarios that could be adjusted in real-time to match the mood and theme of different segments within each event.
High FPS Asset Optimization: Implemented rigorous asset optimization techniques to ensure consistent high frame rates across all events. Developed a LOD (Level of Detail) system tailored for LED wall viewing distances, balancing visual quality and performance. Utilized texture atlasing to reduce draw calls and improve rendering efficiency. Implemented aggressive mesh optimization techniques, including polygon reduction and efficient UV mapping. Created a custom shader library optimized for high-performance rendering on LED displays.
Developed a custom content streaming system to manage the vast amount of assets required for multiple events. Created a bespoke color correction system to ensure consistent visual output across different LED panel types and environments. Implemented a real-time compositing solution within Unreal to blend live camera feeds with virtual elements seamlessly.
Performance Profiling Tools: Developed custom profiling tools within Unreal Engine to identify and address performance bottlenecks in real-time during live events.
Adaptive Resolution Scaling: Implemented an adaptive resolution scaling system that dynamically adjusted render resolution to maintain target frame rates during complex scenes.
Scale and Complexity: Managed the immense scale of these events by creating modular, reusable assets and implementing efficient level streaming techniques.
Version Compatibility: Developed a robust pipeline to ensure assets and features were compatible and consistent across different Unreal Engine versions.
Live Event Demands: Implemented fail-safe systems and real-time adjustment capabilities to handle the unpredictable nature of live events.
High FPS Requirements: Tackled the challenge of maintaining 60+ FPS for LED wall displays by implementing aggressive GPU and CPU optimizations. Developed a dynamic asset loading system that intelligently managed memory usage to prevent frame rate drops during complex scenes. Created a real-time performance monitoring system that alerted operators to potential FPS issues before they became visible to the audience.
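The adaptive resolution scaling and real-time FPS monitoring described above come down to a feedback loop between measured frame time and a render-scale multiplier. A minimal Python sketch of that loop; the thresholds, step size, and function name here are illustrative, not the values used on the actual events:

```python
def adapt_resolution_scale(scale, frame_ms, target_ms=16.6,
                           step=0.05, lo=0.5, hi=1.0):
    """Nudge the render-resolution multiplier toward the frame-time budget.

    scale     : current resolution multiplier (1.0 = native)
    frame_ms  : measured time for the last frame, in milliseconds
    target_ms : budget for 60 FPS (about 16.6 ms)
    """
    if frame_ms > target_ms * 1.05:   # over budget: render fewer pixels
        return max(lo, scale - step)
    if frame_ms < target_ms * 0.85:   # comfortable headroom: sharpen back up
        return min(hi, scale + step)
    return scale                      # inside the deadband: hold steady

# a heavy 22 ms frame nudges the scale down from native
scale = adapt_resolution_scale(1.0, 22.0)
```

In production this kind of controller typically sits behind temporal upsampling so the audience never perceives the resolution shift on the LED wall.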
Our virtual production work significantly enhanced the Microsoft events by providing visually stunning and immersive environments that reinforced Microsoft's position as a technology leader. Enabled presenters to interact with virtual 3D models and data in ways that made complex concepts more accessible and engaging. Allowed for quick transitions between diverse "locations" and themes without physical set changes. Created a consistent, high-quality visual experience for both in-person and remote attendees. Achieved and maintained high frame rates throughout the events, ensuring smooth motion on LED walls and eliminating visual artifacts that could distract from the presentations.
This project showcases not only the power of Unreal Engine in transforming corporate events into immersive, interactive experiences but also demonstrates the technical expertise required to optimize these complex virtual environments for flawless, high-performance delivery in a live setting. By leveraging cutting-edge virtual production techniques and implementing robust optimization strategies, we were able to create environments that not only wowed audiences but also enhanced the communication of Microsoft's key messages and technological innovations.

In the Spring of 2022, LEO Events hired XiteLabs to create content for Walmart's Annual Shareholders Conference and Associate Celebration Show. Taking place in Walmart's Arena, the hybrid event brought together an audience of 14,000 Walmart associates and shareholders while millions watched the live broadcast remotely. The conference celebrated Walmart's continued growth and company spirit with all the power of Xite's hybrid event workflows.
Xite was tasked with creating content to accompany Walmart President and CEO Doug McMillon's keynote presentation, and built an immersive world to match the 'movie magic' theme of the presentation. The creative team designed a life-size, mixed-reality movie theatre, incorporating finishing touches such as photos of the founding Walton family and Walmart's signature branding.
Xite created a palette of specific AR elements for the live broadcast portion of the conference, including props that floated above the stage and a virtual theatre marquee that protruded over it. These elements provided an immersive experience both for the in-arena audience, via large screens showing the AR magic, and for at-home viewers, who saw the full effect of Xite's hybrid AR show. In addition, building on McMillon's vision, the broadcast used augmented reality to display clips of his favorite movies, enhancing the narrative of the show.
A "Back to the Future" DeLorean rises from the stage and shoots out to orbit an enormous AR globe floating over the audience.
Augmented reality charts materialize onstage next to Doug as he speaks on the continued growth Walmart has seen over the past few years.
An AR drone launches from a tower, flies over the audience and delivers a package in a vibrant illustration of Walmart's new drone delivery capabilities.
A triumphant ending: stars appeared over the heads of associates receiving awards and recognition for their service, and their names were displayed across the movie theatre marquee. The moment brought to life the idea that McMillon sees his associates as bright, shining stars, celebrated like celebrities for their hard work and incredible spirit in Walmart's movie.
Custom real-time rendering (8 UHD synced outputs) displayed on the massive 15,000-pixel-wide LED screen and floor using Unreal Engine.
Additional augmented reality elements rendered live with StypeLand, for the thousands in the arena and millions of people watching worldwide.
RedSpy was utilized for AR tracking, while Blender was used to create the 3D models.

This passion project brought to life an intense battle between Link and Ganondorf from The Legend of Zelda series. Set in a breathtaking fantasy environment - a forest in the clouds surrounding a sacred shrine - this short film showcases a brutal and epic fight scene. The project demonstrates a comprehensive use of Unreal Engine's capabilities, blending motion capture technology with hand-crafted animation to create a visually stunning and action-packed fan tribute.
Unreal Engine Mastery: Leveraged advanced features of Unreal Engine for high-fidelity rendering and complex animation integration.
Motion Capture Integration: Seamlessly combined motion capture data with hand-keyed animation for fluid and dynamic character movements.
World Creation: Designed and built an original, immersive fantasy environment that captures the essence of The Legend of Zelda universe.
Advanced Animation Techniques: Utilized a mix of motion capture and hand-keyed animation to create realistic combat sequences and expressive character performances.
Created a lush, floating forest environment with intricate details: dynamic cloud systems surrounding the forest platforms, detailed sacred shrine with ornate Hyrulean architecture, responsive vegetation and environmental elements. Implemented atmosphere and lighting systems to enhance the magical ambiance of the setting.
Choreographed and animated an intense battle sequence between Link and Ganondorf. Utilized motion capture for base body movements to ensure realistic physics in combat. Enhanced with hand-keyed animations for weapon interactions and acrobatic moves. Implemented a dynamic camera system to capture the most impactful moments of the fight.
Developed highly detailed character models for Link and Ganondorf, faithful to their iconic designs. Created expressive facial animations with hand-keyed facial animations to convey emotion and intensity during the battle. Implemented blend shapes for nuanced expressions and lip-syncing.
Designed and implemented a range of visual effects using Unreal's Niagara system: magical effects for weapon clashes and special abilities, environmental particle systems for enhanced atmosphere. Created original sound design to complement the visuals with custom-designed sound effects for combat, magic, and environmental elements. Implemented 3D audio for an immersive viewing experience.
Animation Workflow: Recorded base movements via motion capture for complex fighting sequences. Refined motion capture data and blended with hand-keyed animations. Utilized control rigs for detailed facial expressions and lip-syncing. Fine-tuned fight sequences to balance realism with the fantastical elements of Zelda lore.
Environment Creation: Developed a custom foliage system for the dense, magical forest. Implemented dynamic lighting to create a sense of time and atmosphere. Used Unreal's landscape tools to craft the floating islands and cloud formations.
Visual Effects Pipeline: Created a library of Niagara-based effects for magic, combat impacts, and environmental details. Developed material functions for dynamic character interactions (e.g., cloth movement, armor reflections).
Performance Optimization: Implemented efficient LOD systems for the complex environment. Optimized character meshes and animations for smooth playback. Utilized Unreal's Sequencer for efficient scene composition and rendering.
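At its core, the mocap-plus-hand-keyed blending in the workflow above is a weighted interpolation between two pose sources. A toy Python sketch of the idea; a real rig blends quaternions per joint, but plain floats keep the concept visible, and all joint names here are invented:

```python
def blend_pose(mocap_pose, keyed_pose, weight):
    """Per-channel linear blend between a mocap pose and a hand-keyed pose.

    Poses are dicts of joint name -> rotation value. weight=0.0 is pure
    mocap, weight=1.0 is pure hand-keyed animation.
    """
    return {joint: (1.0 - weight) * mocap_pose[joint] + weight * keyed_pose[joint]
            for joint in mocap_pose}

mocap = {"spine": 10.0, "arm_r": 45.0}   # captured performance
keyed = {"spine": 10.0, "arm_r": 90.0}   # animator exaggerates the sword swing
halfway = blend_pose(mocap, keyed, 0.5)  # arm_r lands between the two sources
```

Animating the weight over time is what lets a hand-keyed flourish take over from the captured motion for a beat and then hand control back.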
Style Balance: Carefully balanced realism with the stylized aesthetics of The Legend of Zelda series.
Complex Animation Integration: Developed a robust system to seamlessly blend motion capture data with hand-keyed animations.
Performance Management: Optimized the highly detailed environment and complex VFX for smooth playback and rendering.
Garnered significant attention and praise from The Legend of Zelda fan community. Showcased as an example of high-quality fan animation using game engine technology. Demonstrated the potential of Unreal Engine in creating cinematic-quality animated shorts.
This Legend of Zelda fan animation project exemplifies the power of Unreal Engine in creating stunning, narrative-driven animated content. By combining technical expertise in animation, environment design, and visual effects with a deep appreciation for the source material, this short film pushes the boundaries of what's possible in fan-created content. The project not only pays homage to the beloved franchise but also serves as a testament to the capabilities of modern game engines in producing cinematic-quality animations.
Bringing the dystopian future of Neo-Gotham to life, this Batman Beyond animated pitch showcases the potential for a gritty, neon-soaked reimagining of the beloved animated series. Leveraging the power of Unreal Engine 5, we've created a visually stunning and atmospherically rich environment that captures the essence of Terry McGinnis' world.
Real-time Rendering: Utilized Unreal Engine 5's Lumen global illumination system to achieve dynamic, cinematic-quality lighting in real-time, bringing Neo-Gotham's bioluminescent underbelly to life.
Nanite Micropolygon Geometry: Employed UE5's Nanite virtualized geometry to create highly detailed cityscapes and character models without compromising performance.
Ray Tracing: Implemented ray-traced reflections and shadows to enhance the futuristic, glossy surfaces of Neo-Gotham's architecture and the sleek design of the Batsuit.
Niagara VFX: Crafted complex particle systems using Unreal's Niagara to simulate the neon haze, flying cars, and Batman's high-tech gadgetry.
MetaHuman Integration: Adapted MetaHuman technology to create hyper-realistic facial animations for Terry and other key characters, pushing the boundaries of digital performance.
Our vision was to blend the nostalgic appeal of the original series with cutting-edge graphics that today's audiences expect. By reimagining classic elements through a cyberpunk lens, we've created a Neo-Gotham that feels both familiar and freshly dangerous.
The pitch video strategically unveils key iconic elements - the Batcave, Terry's suit-up sequence, and a pulse-pounding flight through the city - each designed to evoke the series' core themes of legacy, technology, and the eternal battle against corruption.
Developed a custom material system to dynamically age and weather buildings, allowing us to showcase Neo-Gotham's decades of history in a single frame.
Created an AI-driven traffic system for flying vehicles, bringing a living, breathing quality to the city's skyways.
Implemented a unique "memory glitch" effect using Unreal's post-process volume, visually representing the interplay between Terry's actions and Bruce Wayne's mentorship.
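An AI-driven traffic system like the skyway one above can start from something very small: vehicles looping along lanes at individual speeds so the flow never looks repetitive. A deliberately simplified Python sketch, with all names and numbers invented for illustration:

```python
import random

class SkywayTraffic:
    """Toy version of the flying-car traffic idea: each vehicle loops
    along a lane of fixed length at its own speed."""

    def __init__(self, lane_length, n_vehicles, seed=0):
        rng = random.Random(seed)          # seeded for reproducible traffic
        self.lane_length = lane_length
        # each vehicle: (position along the lane, speed in units/sec)
        self.vehicles = [(rng.uniform(0.0, lane_length), rng.uniform(8.0, 20.0))
                         for _ in range(n_vehicles)]

    def tick(self, dt):
        """Advance every vehicle, wrapping around at the end of the lane."""
        self.vehicles = [((pos + speed * dt) % self.lane_length, speed)
                         for pos, speed in self.vehicles]

traffic = SkywayTraffic(lane_length=100.0, n_vehicles=3)
traffic.tick(1.0)
```

The real system layers lane-change logic and spline-following on top, but the loop-and-wrap core is what gives the city its constant background motion.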
This project not only demonstrates technical proficiency in Unreal Engine 5 but also showcases the potential for reviving beloved properties with state-of-the-art game engine technology. The Batman Beyond pitch stands as a testament to the power of real-time rendering in breathing new life into animated storytelling.

For BMW's groundbreaking presentation at the Consumer Electronics Show (CES), our team created a digital pre-visualization of the entire stage show. My role focused on animating a digital stand-in for Arnold Schwarzenegger, a key element in ensuring the seamless integration of the physical and digital aspects of this high-profile event.
Custom Character Creation: Worked with a detailed skeletal mesh of Arnold Schwarzenegger created in Blender, importing and optimizing it for use in Unreal Engine.
Real-time Animation: Leveraged Unreal Engine's animation tools to create lifelike movements for the digital double, matching the planned choreography of the live event.
Skeletal Mesh Rigging: Fine-tuned the character rig within Unreal Engine to ensure smooth, realistic movements specific to Schwarzenegger's physique and mannerisms.
Environment Interaction: Ensured accurate interaction between the digital character and the virtual stage environment, crucial for precise planning of camera angles and lighting setups.
Animated the custom skeletal mesh stand-in of Arnold Schwarzenegger, meticulously matching planned movements and gestures to aid in event choreography and timing. Collaborated closely with the event planning team to adjust animations based on script changes and stage direction. Created a library of reusable animation assets for quick iterations and last-minute changes. Optimized the character's performance to maintain high frame rates in real-time, ensuring smooth previsualization for the production team.
Developed a custom blend space system for smooth transitions between different poses and movements, allowing for rapid adjustments during pre-visualization sessions. Implemented a real-time lighting system that mimicked the planned stage lighting, allowing directors to preview how lighting changes would affect the presenter's appearance. Utilized Unreal's Sequencer to create a precise timeline of events, synchronizing the digital double's movements with other elements of the presentation.
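A blend space, as used for the digital double above, maps a control parameter (speed, lean, gesture intensity) to a weighted mix of neighboring animation clips. A toy 1D version in Python; the clip names and sample positions are purely illustrative:

```python
def blend_space_1d(samples, x):
    """Given animation samples placed along one axis as (position, clip)
    pairs, return the two clips to play and the blend weight of the
    second clip. A minimal sketch of a 1D blend space lookup."""
    samples = sorted(samples)
    if x <= samples[0][0]:                    # clamp below the first sample
        return samples[0][1], samples[0][1], 0.0
    if x >= samples[-1][0]:                   # clamp above the last sample
        return samples[-1][1], samples[-1][1], 0.0
    for (x0, a), (x1, b) in zip(samples, samples[1:]):
        if x0 <= x <= x1:                     # interpolate inside this segment
            return a, b, (x - x0) / (x1 - x0)

clips = [(0.0, "idle"), (150.0, "walk"), (400.0, "run")]
result = blend_space_1d(clips, 275.0)  # halfway between walk and run
```

Unreal's 2D blend spaces extend the same lookup to a grid of samples, which is what made pose adjustments fast during pre-visualization sessions.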
This pre-visualization work was instrumental in the success of BMW's CES presentation. By providing an accurate digital representation of Arnold Schwarzenegger on the virtual stage: event planners could fine-tune the show's pacing and visual impact before the physical stage was even built. Camera operators received invaluable insight into optimal shooting angles and movements. Lighting designers could pre-plan their setups, ensuring the celebrity presenter would be showcased effectively alongside the prototype vehicle. The production team could identify and resolve potential issues well in advance, leading to a smoother, more polished final presentation.
This project exemplifies the power of Unreal Engine in event pre-visualization, demonstrating how custom digital doubles and real-time animation can significantly enhance the planning and execution of major corporate showcases.

For the prestigious Ferrari NYC Gala, I was tasked with creating a mesmerizing pre-visualization of a prototype Ferrari model. The goal was to craft a series of abstract, reflective camera movements that would highlight the vehicle's sleek exterior finish, setting the stage for an unforgettable reveal at this high-profile event.
Unreal Engine 5 Rendering: Leveraged UE5's advanced rendering capabilities to achieve photorealistic results in real-time.
Lumen Global Illumination: Utilized Unreal's Lumen technology for dynamic, high-quality global illumination, enhancing the car's reflective surfaces with true-to-life lighting.
Path Tracing: Employed UE5's path tracing capabilities for ultra-realistic reflections and shadows, crucial for showcasing the Ferrari's glossy exterior.
Cinematic Camera Work: Crafted sophisticated camera movements using Unreal's Sequencer, creating fluid, abstract shots that accentuated the car's curves and finish.
Designed and implemented a series of abstract, reflective camera movements to showcase the prototype Ferrari's exterior in a visually stunning manner. Fine-tuned material properties to accurately represent the unique finish of the prototype vehicle, ensuring it responded realistically to various lighting conditions. Optimized rendering settings to achieve the highest visual quality while maintaining performance for real-time previews. Collaborated with the event planning team to ensure the visual narrative aligned with the gala's overall aesthetic and Ferrari's brand identity.
Developed a custom post-process material to enhance the reflective qualities of the car's surface, creating a unique visual signature for the reveal. Implemented a dynamic environment system that subtly changed lighting and reflections throughout the sequence, adding depth and interest to the abstract shots. Created a bespoke auto-exposure system to handle the challenging lighting scenarios, ensuring the car remained the focal point despite varying brightness levels.
Lumen Setup: Configured Lumen settings to achieve the perfect balance between accuracy and performance, paying special attention to indirect lighting bounces to capture the nuanced reflections on the car's surface.
Path Tracing Optimization: Fine-tuned path tracing parameters to achieve film-quality reflections and shadows while keeping render times manageable for iterative workflow.
Material Mastery: Crafted complex, layered materials in Unreal's Material Editor to accurately represent the prototype's unique paint job, including subtle flake and clear coat effects.
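The bespoke auto-exposure system mentioned earlier follows the classic eye-adaptation pattern: smooth the measured scene luminance over time, then expose so the smoothed value lands on a middle-grey key. A hedged Python sketch, with constants chosen for illustration rather than taken from the project:

```python
import math

def update_exposure(avg_lum, scene_lum, dt, speed=1.5, key=0.18):
    """One step of eye-adaptation style auto-exposure.

    avg_lum   : smoothed luminance carried between frames
    scene_lum : average luminance measured on the current frame
    dt        : frame delta time in seconds
    speed     : adaptation rate; key is the middle-grey target
    """
    # exponentially smooth toward the current frame's luminance
    avg_lum += (scene_lum - avg_lum) * (1.0 - math.exp(-dt * speed))
    # brighter smoothed scenes get a lower exposure multiplier
    exposure = key / max(avg_lum, 1e-4)
    return avg_lum, exposure
```

Running this per frame is what keeps the car the focal point: a sudden specular flare raises the smoothed luminance, which gently pulls the exposure down instead of blowing out the shot.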
The pre-visualization played a crucial role in the Ferrari NYC Gala: provided event organizers with a clear vision for the prototype reveal, informing decisions on stage design and lighting setups. Allowed for precise planning of the reveal sequence, ensuring maximum visual impact during the live event. Served as a reference for the video production team, guiding their approach to filming the actual vehicle during the gala. Created buzz and anticipation among Ferrari executives and VIP guests, setting the stage for an unforgettable unveiling.
This project showcases the power of Unreal Engine 5 in creating high-end automotive visualizations, demonstrating how cutting-edge real-time rendering can elevate prestigious events and product reveals.

"Star Wars: Souls of the Fallen" is an ambitious fan film that I wrote and brought to life using cutting-edge virtual production techniques. As the Unreal Engine artist for this project, I was responsible for creating all backgrounds and VFX animations, seamlessly blending live-action footage with digital environments to bring the Star Wars universe to life.
Unreal Engine 5 Integration: Leveraged UE5's advanced features to create photorealistic Star Wars environments, from sprawling space vistas to detailed interior shots of Star Destroyers.
Virtual Production Pipeline: Implemented a real-time virtual production workflow, allowing for immediate visualization of scenes combining live actors with digital backgrounds.
Dynamic Lighting: Utilized Unreal's Lumen global illumination system to create dynamic, cinematic-quality lighting that reacted realistically to both practical and digital elements.
VFX Animation: Created complex visual effects animations, including lightsaber battles, force powers, and space combat sequences, all rendered in real-time within Unreal Engine.
Adapted the script into a visual language, translating written descriptions into immersive 3D environments and dynamic visual effects. Designed and built iconic Star Wars locations, including the bridge of the Imperial Star Destroyer "The Dominion", a nebula-shrouded ocean planet with distinctive ring systems, and various Imperial corridors and chambers, each with unique atmospheric qualities.
Developed a custom shader system for lightsaber effects, allowing for real-time color changes and dynamic interactions with the environment. Created a procedural star field generator for space scenes, enabling quick iterations of different galactic backdrops. Implemented a modular set design system within Unreal, allowing for rapid prototyping and adjustment of Imperial interiors to match script requirements.
Green Screen Integration: Utilized Unreal's compositing tools to seamlessly blend live-action green screen footage with digital environments in real-time.
Performance Optimization: Balanced visual fidelity with performance to ensure smooth playback during live shoots, enabling directors and actors to react to the virtual environment in real-time.
Asset Creation Pipeline: Developed a streamlined workflow for creating and importing Star Wars-themed assets, ensuring consistency with the established visual language of the franchise.
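A procedural star field generator like the one described for the space scenes can be as simple as sampling uniform directions on a sphere and skewing the brightness distribution so a handful of stars dominate, the way real skies read. An illustrative Python sketch, not the project's actual implementation:

```python
import math
import random

def generate_star_field(n, seed=0):
    """Generate n stars as ((x, y, z) unit direction, brightness) pairs.

    Directions are uniform on the unit sphere; brightness is raised to a
    power so most stars are dim and a few are bright. Seeded so the same
    backdrop can be regenerated exactly between iterations."""
    rng = random.Random(seed)
    stars = []
    for _ in range(n):
        z = rng.uniform(-1.0, 1.0)               # uniform in z gives a
        theta = rng.uniform(0.0, 2.0 * math.pi)  # uniform sphere sampling
        r = math.sqrt(1.0 - z * z)
        direction = (r * math.cos(theta), r * math.sin(theta), z)
        brightness = rng.random() ** 4           # skew: mostly dim stars
        stars.append((direction, brightness))
    return stars
```

Because the generator is seeded, swapping galactic backdrops between takes is just a matter of changing one number, which is what made the quick iterations possible.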
Recreating the iconic Star Wars aesthetic while adding unique visual elements to our fan film universe. Balancing the need for high-quality visuals with the performance requirements of real-time rendering. Coordinating with the live-action team to ensure seamless integration between physical and digital elements.
This project showcases the power of Unreal Engine in bringing fan-created content to life with a level of visual quality that rivals professional productions. By leveraging cutting-edge virtual production techniques, we were able to create a Star Wars fan film that captures the essence of the franchise while adding our own unique vision.
The use of Unreal Engine not only allowed for stunning visuals but also provided the flexibility to make real-time adjustments during production, enhancing the collaborative process between the digital and physical aspects of filmmaking.




Content coming soon — check back for project details, images, and videos.
Explorations into Generative AI, Media Creation and Filmmaking
General language-based prompt development for controlled generative output. Skilled in nuanced language for stylistic, filmic, cinematic, artistic, and design-based detailing. Expert knowledge of various platforms and their specific tags, parameters, command lines, input adjustments, and referencing.
Expert in command-line parameters, cinematic and stylistic character creation, and upscaling results for final-pixel use.
Stable Diffusion, Midjourney, DALL·E, Microsoft Bing, ChatGPT, HeyGen, Eleven Labs