Matt Baamonde is a multifaceted creative force, constantly pushing the boundaries of art and technology. A shredding guitar virtuoso with a degree in music composition from the renowned Berklee College of Music, Matt began his journey in music production and film scoring. After a decade of working on hundreds of TV shows and movies in various creative roles in Hollywood, he ventured into the visual arts, specializing as an Unreal Engine VFX artist and character animator.
Driven by an insatiable passion for innovation, Matt thrives at the forefront of virtual world-building. He embraces the limitless possibilities that emerge from the fusion of cutting-edge technology and boundless creativity. As a pioneer in the new era of media and entertainment, Matt is excited to shape the future of immersive experiences.
With a diverse skill set encompassing visual effects, graphic design, 3D animation, photography, video editing, and music production, Matt brings a holistic perspective to his projects. His musical background lends a unique understanding of rhythm and movement, essential elements that breathe life into his visual creations.
A visionary at heart, Matt's work is fueled by his ability to manifest the impossible. He continually seeks to expand the horizons of what's achievable, transforming his imagination into captivating realities. Through his art, Matt invites audiences to step into uncharted territories and experience the extraordinary.
UNREAL ENGINE | ANIMATION | VIRTUAL PRODUCTION | GAME DESIGN | AI
Degree: Contemporary Writing and Production
Expert in Unreal Engine 5 (& 4) for comprehensive environment creation and level design, including terrain, lighting, foliage, sky and water surfaces. Skilled in 3D modeling for level design, static mesh optimization, and procedural content generation (PCG), enhancing interactive world realism and NPC AI integration.
Experienced in managing cinematic animation pipelines and rendering processes using Unreal Sequencer, Blender, and Maya, including post-production VFX work in After Effects, Nuke, and/or DaVinci Resolve. Specialized in motion capture cleanup and high-fidelity animation rendering, optimizing sequences for VFX and interactive media.
Proficient in Blueprint Visual Scripting within Unreal Engine, creating interactive applications and games. Expertise in live interactions and sequence activation, enabling rapid feature implementation and gameplay enhancement.
Skilled in advanced virtual production tools, managing the integration of physical and digital elements with technologies like nDisplay, Mo-Sys, and motion capture suits. Proficient in DMX lighting control and real-time sequence activation for enhanced production fidelity.
Experienced in designing custom materials using Unreal Engine, focusing on multi-layer blending and virtual texturing. Optimizes material logic for solid body, organic, and VFX surfaces, improving visual quality and performance.
Proficient in version and revision control using Perforce P4V and Git, ensuring project consistency and collaborative efficiency. Implements robust version control systems to manage project iterations and maintain code integrity across team contributions.
Unreal Engine 4 + Unreal Engine 5, Unity, Cinema 4D, Daz Studio, Maya, Blender, 3DS Max, Houdini
Blackmagic DaVinci Resolve + Fusion, Nuke, Sony Vegas
Photoshop, After Effects, Premiere, Illustrator, InDesign, Lightroom & Lightroom Classic, XD, Bridge, Acrobat, Animate, Character Animator, and the Substance Suite
ChatGPT + API, Midjourney, DALL·E, Runway, Wonder Dynamics (Invited Beta Tester), ControlNet, ElevenLabs, Stable Diffusion, Deforum, AnimateDiff, ComfyUI, and AUTOMATIC1111
HTML & CSS, JavaScript, Python, C++
WordPress, Shopify, Wix, Squarespace, WooCommerce, and similar front-end web platforms.
Background in Hollywood scoring and music supervision with over 100 film and national TV credits. VES Global member.
Strength: Resourcefulness
Weakness: Coffee
Music, Photography, Guitar, Filmmaking, Fitness, Video Game Development
Raccoon

As part of one of my first projects at CNN, I helped kick off the CNN real-time XR Virtual Weather Set by redesigning the entire blueprint and logic base, designing looks for backgrounds, the central monitors, skies, and foliage, as well as getting the project performant and ready to air under a fast deadline. Super proud of the incredible artists and engineers on the XR and production teams and all the work they did before and after I joined to get this project to the finish line.





I played a key role in bringing Microsoft's flagship events (Build, Inspire, and Celebrate) to life through cutting-edge virtual production techniques. Using Unreal Engine's nDisplay technology and LED backgrounds, we created immersive, dynamic environments that elevated these high-profile corporate events to new heights of visual engagement and technological innovation.
Unreal Engine Versions: Leveraged both Unreal Engine 4.27 and 5.1, utilizing the strengths of each version to meet diverse production needs.
nDisplay Implementation: Mastered Unreal's nDisplay feature to synchronize content across multiple LED screens, creating seamless, expansive virtual environments.
LED Integration: Seamlessly blended physical presenters with digital backgrounds on massive LED walls, creating a hybrid reality that enhanced the presenters' interactions with virtual content.
Microsoft Build
Designed and implemented interactive 3D visualizations of complex software architectures and cloud systems. Created dynamic, code-themed environments that responded in real-time to on-stage demonstrations.
Microsoft Inspire
Developed a virtual global marketplace, allowing presenters to seamlessly "travel" between different international settings. Implemented real-time data visualization tools to showcase Microsoft's global partner network.
Microsoft Celebrate
Crafted celebratory virtual environments that dynamically changed based on the achievements being highlighted. Designed and animated virtual fireworks and particle effects for key moments in the presentations.
Multi-Version Workflow: Developed a pipeline to efficiently work between Unreal Engine 4.27 and 5.1, leveraging each version's unique capabilities while maintaining consistency across events.
Real-time Rendering Optimization: Fine-tuned rendering settings to achieve high visual fidelity while maintaining smooth performance on LED walls.
Custom Blueprints: Created a library of reusable Blueprints for quick iteration of event-specific features and interactive elements.
Dynamic Lighting: Implemented complex lighting scenarios that could be adjusted in real-time to match the mood and theme of different segments within each event.
High FPS Asset Optimization: Implemented rigorous asset optimization techniques to ensure consistent high frame rates across all events. Developed a LOD (Level of Detail) system tailored for LED wall viewing distances, balancing visual quality and performance. Utilized texture atlasing to reduce draw calls and improve rendering efficiency. Implemented aggressive mesh optimization techniques, including polygon reduction and efficient UV mapping. Created a custom shader library optimized for high-performance rendering on LED displays.
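The LOD-by-viewing-distance idea described above can be sketched in Python. This is an illustrative, engine-agnostic approximation, not Unreal's actual screen-size LOD settings; the function name and threshold values are invented for the example:

```python
import math

def select_lod(bounds_radius_m, distance_m, fov_deg,
               thresholds=(0.5, 0.25, 0.1, 0.02)):
    """Pick an LOD index from the fraction of the vertical field of
    view an asset's bounding sphere covers at a given distance.

    thresholds: screen-height fractions at which LOD0..LOD3 apply
    (illustrative values tuned for LED wall viewing distances).
    """
    # Projected angular size of the bounding sphere, as a fraction
    # of the camera's vertical field of view.
    angular = 2.0 * math.atan2(bounds_radius_m, distance_m)
    fraction = angular / math.radians(fov_deg)
    for lod, cutoff in enumerate(thresholds):
        if fraction >= cutoff:
            return lod
    return len(thresholds)  # coarsest fallback LOD

# A large set piece near the camera stays at LOD0; the same mesh
# far upstage can drop to a much coarser LOD without visible loss.
print(select_lod(2.0, 5.0, 60.0))   # near: full detail
print(select_lod(2.0, 80.0, 60.0))  # far: coarse LOD
```

The same screen-coverage test also motivates texture atlasing: distant props sharing one atlas can be batched into a single draw call.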
Developed a custom content streaming system to manage the vast amount of assets required for multiple events. Created a bespoke color correction system to ensure consistent visual output across different LED panel types and environments. Implemented a real-time compositing solution within Unreal to blend live camera feeds with virtual elements seamlessly.
Performance Profiling Tools: Developed custom profiling tools within Unreal Engine to identify and address performance bottlenecks in real-time during live events.
Adaptive Resolution Scaling: Implemented an adaptive resolution scaling system that dynamically adjusted render resolution to maintain target frame rates during complex scenes.
Scale and Complexity: Managed the immense scale of these events by creating modular, reusable assets and implementing efficient level streaming techniques.
Version Compatibility: Developed a robust pipeline to ensure assets and features were compatible and consistent across different Unreal Engine versions.
Live Event Demands: Implemented fail-safe systems and real-time adjustment capabilities to handle the unpredictable nature of live events.
High FPS Requirements: Tackled the challenge of maintaining 60+ FPS on the LED wall displays by implementing aggressive GPU and CPU optimizations. Developed a dynamic asset loading system that intelligently managed memory usage to prevent frame rate drops during complex scenes, and created a real-time performance monitoring system that alerted operators to potential FPS issues before they became visible to the audience.
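The early-warning monitoring idea could be sketched along these lines; this is a hypothetical standalone monitor, not the production tool, with the window size and alert margin chosen for the example:

```python
from collections import deque

class FpsMonitor:
    """Rolling frame-rate monitor that flags trouble early.

    Averages frame times over a short window and raises an alert when
    the implied FPS dips below a safety margin above the hard 60 FPS
    floor, so operators can react before the audience notices.
    """
    def __init__(self, window=120, alert_fps=65.0):
        self.times = deque(maxlen=window)
        self.alert_fps = alert_fps

    def record(self, frame_ms):
        self.times.append(frame_ms)

    def fps(self):
        if not self.times:
            return float("inf")
        return 1000.0 / (sum(self.times) / len(self.times))

    def should_alert(self):
        return self.fps() < self.alert_fps

mon = FpsMonitor(window=10)
for _ in range(10):
    mon.record(14.0)          # ~71 FPS: healthy
print(mon.should_alert())     # prints False
for _ in range(10):
    mon.record(16.5)          # ~60.6 FPS: inside the danger margin
print(mon.should_alert())     # prints True
```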
Our virtual production work significantly enhanced the Microsoft events, providing visually stunning and immersive environments that reinforced Microsoft's position as a technology leader. It enabled presenters to interact with virtual 3D models and data in ways that made complex concepts more accessible and engaging, allowed for quick transitions between diverse "locations" and themes without physical set changes, and created a consistent, high-quality visual experience for both in-person and remote attendees. We achieved and maintained high frame rates throughout the events, ensuring smooth motion on the LED walls and eliminating visual artifacts that could distract from the presentations.
This project showcases not only the power of Unreal Engine in transforming corporate events into immersive, interactive experiences but also demonstrates the technical expertise required to optimize these complex virtual environments for flawless, high-performance delivery in a live setting. By leveraging cutting-edge virtual production techniques and implementing robust optimization strategies, we were able to create environments that not only wowed audiences but also enhanced the communication of Microsoft's key messages and technological innovations.

In the Spring of 2022, LEO Events hired XiteLabs to create content for Walmart's Annual Shareholders Conference and Associate Celebration Show. Taking place in Walmart's Arena, the hybrid event brought together an audience of 14,000 Walmart associates and shareholders while millions watched the live broadcast remotely. The conference celebrated Walmart's continued growth and company spirit with all the power of Xite's hybrid event workflows.
Xite was tasked with creating content to accompany Walmart President and CEO Doug McMillon's keynote presentation, and built an immersive world to accompany the 'movie magic' theme of the presentation. The creative team designed a life-size, mixed-reality movie theatre which incorporated finishing touches such as photos of the founding Walton family and Walmart's signature branding.
Xite created a palette of AR elements specifically for the live broadcast portion of the conference, including props that floated above the stage and a virtual theatre marquee that protruded over it. These elements provided an immersive experience both for those in the audience, via large screens showing the AR magic, and for at-home viewers, who saw Xite's hybrid AR show in full. In addition, to start building on McMillon's vision, the broadcast used augmented reality to display clips of his favorite movies, enhancing the narrative of the show.
A "Back to the Future" DeLorean rises from the stage and shoots out to orbit an enormous AR globe floating over the audience.
Augmented reality charts materialize onstage next to Doug as he speaks on the continued growth Walmart has seen over the past few years.
An AR drone launches from a tower, flies over the audience and delivers a package in a vibrant illustration of Walmart's new drone delivery capabilities.
A triumphant ending: stars appeared over the heads of associates receiving awards and recognition for their service, and their names were displayed across the movie theatre marquee. This brought to life the idea that McMillon sees his associates as bright shining stars worthy of being celebrated, much like celebrities, for their hard work and incredible spirit in Walmart's movie.
Custom real-time rendering (8 UHD synced outputs) displayed on the massive 15,000-pixel-wide LED screen and floor using Unreal Engine.
Additional augmented reality elements rendered live with StypeLand, for the thousands in the arena and millions of people watching worldwide.
RedSpy was utilized for AR tracking, while Blender was used to create 3D models.

This passion project brought to life an intense battle between Link and Ganondorf from The Legend of Zelda series. Set in a breathtaking fantasy environment - a forest in the clouds surrounding a sacred shrine - this short film showcases a brutal and epic fight scene. The project demonstrates a comprehensive use of Unreal Engine's capabilities, blending motion capture technology with hand-crafted animation to create a visually stunning and action-packed fan tribute.
Unreal Engine Mastery: Leveraged advanced features of Unreal Engine for high-fidelity rendering and complex animation integration.
Motion Capture Integration: Seamlessly combined motion capture data with hand-keyed animation for fluid and dynamic character movements.
World Creation: Designed and built an original, immersive fantasy environment that captures the essence of The Legend of Zelda universe.
Advanced Animation Techniques: Utilized a mix of motion capture and hand-keyed animation to create realistic combat sequences and expressive character performances.
Created a lush, floating forest environment with intricate details: dynamic cloud systems surrounding the forest platforms, a detailed sacred shrine with ornate Hyrulean architecture, and responsive vegetation and environmental elements. Implemented atmosphere and lighting systems to enhance the magical ambiance of the setting.
Choreographed and animated an intense battle sequence between Link and Ganondorf. Utilized motion capture for base body movements to ensure realistic physics in combat. Enhanced with hand-keyed animations for weapon interactions and acrobatic moves. Implemented a dynamic camera system to capture the most impactful moments of the fight.
Developed highly detailed character models for Link and Ganondorf, faithful to their iconic designs. Created expressive facial animations with hand-keyed facial animations to convey emotion and intensity during the battle. Implemented blend shapes for nuanced expressions and lip-syncing.
Designed and implemented a range of visual effects using Unreal's Niagara system: magical effects for weapon clashes and special abilities, environmental particle systems for enhanced atmosphere. Created original sound design to complement the visuals with custom-designed sound effects for combat, magic, and environmental elements. Implemented 3D audio for an immersive viewing experience.
Animation Workflow: Recorded base movements via motion capture for complex fighting sequences. Refined motion capture data and blended with hand-keyed animations. Utilized control rigs for detailed facial expressions and lip-syncing. Fine-tuned fight sequences to balance realism with the fantastical elements of Zelda lore.
Environment Creation: Developed a custom foliage system for the dense, magical forest. Implemented dynamic lighting to create a sense of time and atmosphere. Used Unreal's landscape tools to craft the floating islands and cloud formations.
Visual Effects Pipeline: Created a library of Niagara-based effects for magic, combat impacts, and environmental details. Developed material functions for dynamic character interactions (e.g., cloth movement, armor reflections).
Performance Optimization: Implemented efficient LOD systems for the complex environment. Optimized character meshes and animations for smooth playback. Utilized Unreal's Sequencer for efficient scene composition and rendering.
Style Balance: Carefully balanced realism with the stylized aesthetics of The Legend of Zelda series.
Complex Animation Integration: Developed a robust system to seamlessly blend motion capture data with hand-keyed animations.
Performance Management: Optimized the highly detailed environment and complex VFX for smooth playback and rendering.
Garnered significant attention and praise from The Legend of Zelda fan community. Showcased as an example of high-quality fan animation using game engine technology. Demonstrated the potential of Unreal Engine in creating cinematic-quality animated shorts.
This Legend of Zelda fan animation project exemplifies the power of Unreal Engine in creating stunning, narrative-driven animated content. By combining technical expertise in animation, environment design, and visual effects with a deep appreciation for the source material, this short film pushes the boundaries of what's possible in fan-created content. The project not only pays homage to the beloved franchise but also serves as a testament to the capabilities of modern game engines in producing cinematic-quality animations.
Bringing the dystopian future of Neo-Gotham to life, this Batman Beyond animated pitch showcases the potential for a gritty, neon-soaked reimagining of the beloved animated series. Leveraging the power of Unreal Engine 5, we've created a visually stunning and atmospherically rich environment that captures the essence of Terry McGinnis' world.
Real-time Rendering: Utilized Unreal Engine 5's Lumen global illumination system to achieve dynamic, cinematic-quality lighting in real-time, bringing Neo-Gotham's bioluminescent underbelly to life.
Nanite Micropolygon Geometry: Employed UE5's Nanite virtualized geometry to create highly detailed cityscapes and character models without compromising performance.
Ray Tracing: Implemented ray-traced reflections and shadows to enhance the futuristic, glossy surfaces of Neo-Gotham's architecture and the sleek design of the Batsuit.
Niagara VFX: Crafted complex particle systems using Unreal's Niagara to simulate the neon haze, flying cars, and Batman's high-tech gadgetry.
MetaHuman Integration: Adapted MetaHuman technology to create hyper-realistic facial animations for Terry and other key characters, pushing the boundaries of digital performance.
Our vision was to blend the nostalgic appeal of the original series with cutting-edge graphics that today's audiences expect. By reimagining classic elements through a cyberpunk lens, we've created a Neo-Gotham that feels both familiar and freshly dangerous.
The pitch video strategically unveils key iconic elements - the Batcave, Terry's suit-up sequence, and a pulse-pounding flight through the city - each designed to evoke the series' core themes of legacy, technology, and the eternal battle against corruption.
Developed a custom material system to dynamically age and weather buildings, allowing us to showcase Neo-Gotham's decades of history in a single frame.
Created an AI-driven traffic system for flying vehicles, bringing a living, breathing quality to the city's skyways.
Implemented a unique "memory glitch" effect using Unreal's post-process volume, visually representing the interplay between Terry's actions and Bruce Wayne's mentorship.
This project not only demonstrates technical proficiency in Unreal Engine 5 but also showcases the potential for reviving beloved properties with state-of-the-art game engine technology. The Batman Beyond pitch stands as a testament to the power of real-time rendering in breathing new life into animated storytelling.

For BMW's groundbreaking presentation at the Consumer Electronics Show (CES), our team created a digital pre-visualization of the entire stage show. My role focused on animating a digital stand-in for Arnold Schwarzenegger, a key element in ensuring the seamless integration of the physical and digital aspects of this high-profile event.
Custom Character Creation: Worked with a detailed skeletal mesh of Arnold Schwarzenegger created in Blender, importing and optimizing it for use in Unreal Engine.
Real-time Animation: Leveraged Unreal Engine's animation tools to create lifelike movements for the digital double, matching the planned choreography of the live event.
Skeletal Mesh Rigging: Fine-tuned the character rig within Unreal Engine to ensure smooth, realistic movements specific to Schwarzenegger's physique and mannerisms.
Environment Interaction: Ensured accurate interaction between the digital character and the virtual stage environment, crucial for precise planning of camera angles and lighting setups.
Animated the custom skeletal mesh stand-in of Arnold Schwarzenegger, meticulously matching planned movements and gestures to aid in event choreography and timing. Collaborated closely with the event planning team to adjust animations based on script changes and stage direction. Created a library of reusable animation assets for quick iterations and last-minute changes. Optimized the character's performance to maintain high frame rates in real-time, ensuring smooth previsualization for the production team.
Developed a custom blend space system for smooth transitions between different poses and movements, allowing for rapid adjustments during pre-visualization sessions. Implemented a real-time lighting system that mimicked the planned stage lighting, allowing directors to preview how lighting changes would affect the presenter's appearance. Utilized Unreal's Sequencer to create a precise timeline of events, synchronizing the digital double's movements with other elements of the presentation.
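Reduced to one axis, the blend-space idea above amounts to weighted interpolation between animation samples. A hedged sketch with hypothetical clip names and sample speeds (Unreal's blend spaces do the same thing per axis, plus bilinear blending in 2D):

```python
def blend_weights(speed, samples):
    """Weights for a 1D blend space.

    samples: list of (speed, clip_name) pairs sorted by speed.
    Returns {clip: weight}, linearly interpolating between the two
    neighboring samples that bracket the query speed.
    """
    lo_speed, lo_clip = samples[0]
    if speed <= lo_speed:
        return {lo_clip: 1.0}
    for (s0, c0), (s1, c1) in zip(samples, samples[1:]):
        if s0 <= speed <= s1:
            t = (speed - s0) / (s1 - s0)
            return {c0: 1.0 - t, c1: t}
    return {samples[-1][1]: 1.0}  # beyond the last sample

# Hypothetical pose samples for a presenter rig.
poses = [(0.0, "idle"), (150.0, "walk"), (450.0, "run")]
print(blend_weights(300.0, poses))  # halfway between walk and run
```

During previz sessions, moving the query value smoothly along the axis gives the seamless pose transitions described above.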
This pre-visualization work was instrumental in the success of BMW's CES presentation. With an accurate digital representation of Arnold Schwarzenegger on the virtual stage, event planners could fine-tune the show's pacing and visual impact before the physical stage was even built; camera operators received invaluable insight into optimal shooting angles and movements; lighting designers could pre-plan their setups, ensuring the celebrity presenter would be showcased effectively alongside the prototype vehicle; and the production team could identify and resolve potential issues well in advance, leading to a smoother, more polished final presentation.
This project exemplifies the power of Unreal Engine in event pre-visualization, demonstrating how custom digital doubles and real-time animation can significantly enhance the planning and execution of major corporate showcases.

For the prestigious Ferrari NYC Gala, I was tasked with creating a mesmerizing pre-visualization of a prototype Ferrari model. The goal was to craft a series of abstract, reflective camera movements that would highlight the vehicle's sleek exterior finish, setting the stage for an unforgettable reveal at this high-profile event.
Unreal Engine 5 Rendering: Leveraged UE5's advanced rendering capabilities to achieve photorealistic results in real-time.
Lumen Global Illumination: Utilized Unreal's Lumen technology for dynamic, high-quality global illumination, enhancing the car's reflective surfaces with true-to-life lighting.
Path Tracing: Employed UE5's path tracing capabilities for ultra-realistic reflections and shadows, crucial for showcasing the Ferrari's glossy exterior.
Cinematic Camera Work: Crafted sophisticated camera movements using Unreal's Sequencer, creating fluid, abstract shots that accentuated the car's curves and finish.
Designed and implemented a series of abstract, reflective camera movements to showcase the prototype Ferrari's exterior in a visually stunning manner. Fine-tuned material properties to accurately represent the unique finish of the prototype vehicle, ensuring it responded realistically to various lighting conditions. Optimized rendering settings to achieve the highest visual quality while maintaining performance for real-time previews. Collaborated with the event planning team to ensure the visual narrative aligned with the gala's overall aesthetic and Ferrari's brand identity.
Developed a custom post-process material to enhance the reflective qualities of the car's surface, creating a unique visual signature for the reveal. Implemented a dynamic environment system that subtly changed lighting and reflections throughout the sequence, adding depth and interest to the abstract shots. Created a bespoke auto-exposure system to handle the challenging lighting scenarios, ensuring the car remained the focal point despite varying brightness levels.
Lumen Setup: Configured Lumen settings to achieve the perfect balance between accuracy and performance, paying special attention to indirect lighting bounces to capture the nuanced reflections on the car's surface.
Path Tracing Optimization: Fine-tuned path tracing parameters to achieve film-quality reflections and shadows while keeping render times manageable for iterative workflow.
Material Mastery: Crafted complex, layered materials in Unreal's Material Editor to accurately represent the prototype's unique paint job, including subtle flake and clear coat effects.
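As a rough illustration of the auto-exposure concept mentioned above, the loop below smooths exposure toward the value that maps average scene luminance onto an 18% grey target, which prevents sudden "pumping" when a bright highlight sweeps across the car. This is a generic sketch with invented parameters, not the bespoke system built for the gala:

```python
def step_exposure(exposure, avg_luminance, target=0.18, speed=0.25,
                  e_min=0.02, e_max=8.0):
    """One tick of a simple auto-exposure loop.

    avg_luminance is the pre-exposure average scene luminance; the
    ideal exposure maps it onto the 18% grey target, and `speed`
    controls how quickly we converge, avoiding abrupt jumps.
    """
    ideal = target / max(avg_luminance, 1e-6)
    new = exposure + speed * (ideal - exposure)
    return max(e_min, min(e_max, new))

exposure = 1.0
for _ in range(20):
    # A sustained bright highlight pass: exposure settles smoothly
    # toward target / 0.9 ~ 0.2 instead of snapping there.
    exposure = step_exposure(exposure, 0.9)
```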
The pre-visualization played a crucial role in the Ferrari NYC Gala. It provided event organizers with a clear vision for the prototype reveal, informing decisions on stage design and lighting setups; allowed for precise planning of the reveal sequence, ensuring maximum visual impact during the live event; served as a reference for the video production team, guiding their approach to filming the actual vehicle during the gala; and created buzz and anticipation among Ferrari executives and VIP guests, setting the stage for an unforgettable unveiling.
This project showcases the power of Unreal Engine 5 in creating high-end automotive visualizations, demonstrating how cutting-edge real-time rendering can elevate prestigious events and product reveals.

"Star Wars: Souls of the Fallen" is an ambitious fan film that I wrote and brought to life using cutting-edge virtual production techniques. As the Unreal Engine artist for this project, I was responsible for creating all backgrounds and VFX animations, seamlessly blending live-action footage with digital environments to bring the Star Wars universe to life.
Unreal Engine 5 Integration: Leveraged UE5's advanced features to create photorealistic Star Wars environments, from sprawling space vistas to detailed interior shots of Star Destroyers.
Virtual Production Pipeline: Implemented a real-time virtual production workflow, allowing for immediate visualization of scenes combining live actors with digital backgrounds.
Dynamic Lighting: Utilized Unreal's Lumen global illumination system to create dynamic, cinematic-quality lighting that reacted realistically to both practical and digital elements.
VFX Animation: Created complex visual effects animations, including lightsaber battles, force powers, and space combat sequences, all rendered in real-time within Unreal Engine.
Adapted the script into a visual language, translating written descriptions into immersive 3D environments and dynamic visual effects. Designed and built iconic Star Wars locations, including the bridge of the Imperial Star Destroyer "The Dominion", a nebula-shrouded ocean planet with distinctive ring systems, and various Imperial corridors and chambers, each with unique atmospheric qualities.
Developed a custom shader system for lightsaber effects, allowing for real-time color changes and dynamic interactions with the environment. Created a procedural star field generator for space scenes, enabling quick iterations of different galactic backdrops. Implemented a modular set design system within Unreal, allowing for rapid prototyping and adjustment of Imperial interiors to match script requirements.
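The procedural star field idea can be illustrated with a small sketch. The function name and brightness distribution are invented for the example; the sampling trick (uniform longitude, uniform cos-latitude) is the standard way to place points evenly on a sky sphere without pole clustering:

```python
import math
import random

def star_field(n, seed=0):
    """Generate n stars on the unit sky sphere with dim-skewed brightness.

    Sampling z (the cosine of latitude) uniformly in [-1, 1] gives a
    uniform distribution over the sphere; cubing a uniform random
    value skews brightness toward faint stars, which reads as a more
    natural galactic backdrop.
    """
    rng = random.Random(seed)  # seeded for repeatable backdrops
    stars = []
    for _ in range(n):
        lon = rng.uniform(0.0, 2.0 * math.pi)
        z = rng.uniform(-1.0, 1.0)
        r = math.sqrt(1.0 - z * z)
        pos = (r * math.cos(lon), r * math.sin(lon), z)
        brightness = rng.random() ** 3
        stars.append((pos, brightness))
    return stars

sky = star_field(2000)
```

Regenerating with a new seed gives an instantly different galactic backdrop, which is what makes quick iteration between looks practical.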
Green Screen Integration: Utilized Unreal's compositing tools to seamlessly blend live-action green screen footage with digital environments in real-time.
Performance Optimization: Balanced visual fidelity with performance to ensure smooth playback during live shoots, enabling directors and actors to react to the virtual environment in real-time.
Asset Creation Pipeline: Developed a streamlined workflow for creating and importing Star Wars-themed assets, ensuring consistency with the established visual language of the franchise.
Recreating the iconic Star Wars aesthetic while adding unique visual elements to our fan film universe. Balancing the need for high-quality visuals with the performance requirements of real-time rendering. Coordinating with the live-action team to ensure seamless integration between physical and digital elements.
This project showcases the power of Unreal Engine in bringing fan-created content to life with a level of visual quality that rivals professional productions. By leveraging cutting-edge virtual production techniques, we were able to create a Star Wars fan film that captures the essence of the franchise while adding our own unique vision.
The use of Unreal Engine not only allowed for stunning visuals but also provided the flexibility to make real-time adjustments during production, enhancing the collaborative process between the digital and physical aspects of filmmaking.

"From Ashes" is an ambitious third-person fantasy action-adventure game that I single-handedly developed using Unreal Engine. The game follows a knight's journey to unravel the mysteries of a fallen kingdom, blending intense combat, puzzle-solving, and rich storytelling in a vast, atmospheric world.
Narrative: Crafted a compelling story of redemption and discovery, with branching dialogue options and multiple endings.
World Building: Designed a rich, interconnected fantasy world with diverse biomes, hidden areas, and a deep lore for players to uncover.
Progression System: Implemented a skill tree and equipment upgrade system, allowing players to customize their playstyle.
Combat System: Developed a fluid, responsive combat system inspired by games like Shadow of Mordor, featuring melee combat with combos, parries, and dodges; ranged attacks with various projectile types; and special abilities tied to equipment and player progression.
Puzzle Elements: Created Zelda-inspired environmental puzzles that test players' problem-solving skills and encourage exploration.
Open World Design: Implemented a seamless open world with dynamic events, side quests, and discoverable lore items.
Unreal Engine Mastery: Leveraged UE5's advanced features for stunning visuals and smooth performance — utilized Nanite for highly detailed environments without performance loss, implemented Lumen for dynamic global illumination and realistic lighting, and employed Niagara for complex particle effects in spells and environmental details.
AI Programming: Developed sophisticated enemy AI with varied behaviors and combat styles using Behavior Trees and EQS (Environment Query System).
Performance Optimization: Implemented LOD systems, occlusion culling, and efficient asset streaming for smooth gameplay across various hardware configurations.
Character Design: Created the main character, NPCs, and enemies, each with unique visual styles and animations.
Environment Art: Designed and modeled diverse landscapes, from crumbling castles to mystical forests, using a combination of hand-crafted assets and procedural generation techniques.
VFX: Developed a suite of visual effects for spells, combat impacts, and environmental interactions using Unreal's Niagara system.
Sound Effects: Created and implemented a library of sound effects for combat, movement, and environmental ambience.
Music Composition: Composed an original soundtrack that dynamically adapts to gameplay situations and environments.
Intuitive Interface: Designed a minimalist, immersive UI that provides necessary information without cluttering the screen.
Menu Systems: Created responsive and visually appealing menu systems for inventory management, skill trees, and game options.
Dynamic Weather System: Implemented a realistic weather system that affects gameplay and environment aesthetics.
Day/Night Cycle: Created a day/night cycle with dynamic lighting that influences NPC behaviors and certain game events.
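A day/night cycle of this kind typically maps a normalized time of day onto a sun angle that drives the directional light, with gameplay systems gating off the same clock. A minimal sketch, assuming a simple sinusoidal sun path (the function names are illustrative, not engine API):

```cpp
#include <cassert>
#include <cmath>

// Map a normalized time of day (0.0 = midnight, 0.5 = noon, 1.0 = midnight)
// to a sun pitch in degrees: -90 at midnight, 0 at sunrise/sunset, +90 at noon.
float SunPitchDegrees(float timeOfDay) {
    return 90.0f * std::sin((timeOfDay - 0.25f) * 2.0f * 3.14159265f);
}

// Gameplay gate: certain NPC behaviors and events run only after dark.
bool IsNight(float timeOfDay) {
    return SunPitchDegrees(timeOfDay) < 0.0f;
}
```

Driving both the light and the NPC logic from one clock keeps lighting and behavior changes in lockstep as the cycle advances.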
Adaptive Difficulty: Developed an AI Director that adjusts game difficulty based on player performance.
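The AI Director idea can be sketched as a feedback loop: measure recent player performance, then nudge a clamped difficulty multiplier toward harder or easier. This is a hypothetical minimal version (the `AIDirector` class and its tuning constants are assumptions for illustration):

```cpp
#include <cassert>
#include <algorithm>

// Minimal AI-director sketch: adjust a difficulty multiplier from a rolling
// performance score (0.0 = player struggling, 1.0 = player dominating).
class AIDirector {
public:
    float Difficulty() const { return difficulty_; }

    // Called periodically, e.g. after each combat encounter.
    void Update(float performance) {
        // Push toward harder when the player dominates, easier otherwise.
        difficulty_ += (performance - 0.5f) * 0.2f;
        // Clamp so the game never becomes trivial or hopeless.
        difficulty_ = std::clamp(difficulty_, 0.5f, 2.0f);
    }

private:
    float difficulty_ = 1.0f;
};
```

The multiplier would then scale enemy health, damage, or spawn rates; the small step size and clamp keep adjustments gradual enough to stay invisible to the player.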
Scope Management: As a solo developer, carefully prioritized features and used iterative development to create a polished, cohesive experience.
Performance Balancing: Optimized the open world and complex systems to run smoothly on a variety of hardware configurations.
Consistent Art Style: Maintained a consistent art style across diverse environments and characters through careful planning and asset reuse.
"From Ashes" showcases my ability to bring a complete game vision to life, from initial concept to final product. By leveraging the power of Unreal Engine and drawing inspiration from beloved titles in the genre, I've created an immersive fantasy world that invites players to lose themselves in a knight's epic quest.

As the Senior Environment Artist for the "King of Killers" limited television series, a sequel to the successful film, I played a crucial role in bringing the gritty, action-packed world to life through cutting-edge virtual production techniques. This project demanded the creation of highly detailed, performance-optimized environments that seamlessly blended real-world locations with digital enhancements, all while supporting intense action sequences and visual effects.
Unreal Engine Mastery: Leveraged Unreal Engine's latest features to create photorealistic environments that could be rendered in real-time for LED wall displays.
Virtual Production Integration: Collaborated closely with the cinematography team to ensure digital environments seamlessly integrated with practical sets and live-action footage.
Performance Optimization: Implemented advanced optimization techniques to maintain high frame rates and visual fidelity across complex, action-heavy scenes.
Digital Double Creation: Developed highly accurate digital doubles of real-world locations, allowing for seamless transitions between practical and virtual sets. Implemented photogrammetry techniques to capture and recreate intricate details of actual locations. Created a library of modular assets that could be quickly assembled and modified to represent various urban environments.
Action-Oriented VFX Integration: Designed environment assets with built-in destruction and particle systems to enhance action sequences. Collaborated with the VFX team to create reactive environments that responded dynamically to gunfire, explosions, and other action elements. Implemented a system for real-time environment damage and debris accumulation throughout action scenes.
Performance Optimization: Developed a custom LOD system tailored for the unique demands of virtual production on LED walls. Implemented advanced culling techniques to maximize rendering efficiency without compromising visual quality. Created a dynamic asset streaming system to manage memory usage and maintain consistent performance across long, complex shots.
Environment Creation Workflow: Worked closely with concept artists and the production designer to translate 2D designs into fully realized 3D environments. Developed a library of modular assets that could be rapidly assembled and customized to create diverse urban landscapes. Created a suite of physically-based materials that accurately represented various surfaces under different lighting conditions.
Optimization Techniques: Implemented extensive texture atlasing to reduce draw calls and improve rendering efficiency. Utilized advanced mesh optimization techniques, including LOD generation and polygon reduction. Developed performance-efficient shaders that maintained visual quality while reducing GPU load.
VFX Integration: Implemented a system for procedural building and object destruction that could be triggered in real-time during filming. Created dynamic systems for environmental elements like dust, debris, and weather effects that reacted to character actions and scripted events.
Scale and Detail: Balanced the need for expansive, detailed environments with the performance requirements of real-time rendering by developing smart LOD and culling systems.
Continuity Across Episodes: Maintained visual consistency across multiple episodes by creating a robust asset management system and style guide.
Rapid Iteration: Developed a pipeline for quick environment modifications and updates to accommodate last-minute script changes or directorial decisions.
Enabled the creation of expansive, visually rich environments that would have been prohibitively expensive or impossible to achieve with traditional filming methods. Provided directors and actors with immersive, reactive virtual sets that enhanced performance and facilitated more dynamic cinematography. Significantly reduced location shooting requirements, saving time and budget while expanding creative possibilities.
This project showcases the power of Unreal Engine and virtual production techniques in television series creation. By leveraging cutting-edge technology and implementing robust optimization strategies, we were able to create a visually stunning, action-packed series that pushed the boundaries of what's possible in TV production.

For the New Year's Eve 2023 celebration in downtown Los Angeles, I played a key role in creating a spectacular virtual production video montage that was projection mapped onto City Hall. This high-profile project, featured on all local news stations just before the ball drop and fireworks, showcased the spirit of Los Angeles through a series of animated environments. Despite challenging weather conditions, including pouring rain and a thunderstorm, our team delivered a memorable visual experience that captivated both in-person attendees and television viewers.
Unreal Engine Utilization: Leveraged Unreal Engine to create real-time rendered environments with dynamic lighting and effects.
Projection Mapping: Collaborated with projection specialists to optimize content for large-scale architectural projection onto City Hall.
Weather-Resistant Production: Adapted our workflow and technical setup to ensure flawless execution despite severe weather conditions.
Griffith Park Flythrough: Designed and animated a detailed recreation of Griffith Park, including the iconic Observatory and surrounding landscape.
Space Flight Sequence: Crafted a breathtaking journey from Earth's atmosphere into space, showcasing LA's position in the cosmos.
Stylized Los Angeles Drive: Developed a visually striking, stylized version of Los Angeles for a virtual drive-through experience, highlighting key landmarks and the city's unique character.
Environment Design Process: Gathered extensive photographic and video reference of Los Angeles locations to ensure accuracy and authenticity. Created highly detailed 3D models and textures, optimized for real-time rendering and large-scale projection. Implemented dynamic lighting systems to capture the essence of LA's diverse environments, from urban streets to the stars above.
Real-Time Rendering Optimization: Developed custom LOD systems to maintain visual quality while ensuring smooth performance for live projection. Implemented efficient particle systems for weather effects, city lights, and space elements. Optimized shaders and materials for maximum visual impact when projected onto the architectural surface of City Hall.
Weather Adaptation: Created real-time rain and lighting effects within the Unreal Engine environments to harmonize with the actual weather conditions. Implemented dynamic fog and atmospheric systems to enhance the visibility and impact of the projection through the rain.
Weather Resilience: Worked closely with the technical team to ensure all equipment was weather-proofed and that the projection remained visible despite the rain.
Real-Time Adjustments: Developed a system for making last-minute adjustments to brightness, contrast, and color to compensate for the unexpected weather conditions.
Seamless Integration: Ensured smooth transitions between real-world fireworks and digital effects, creating a cohesive experience for viewers.
Delivered a visually stunning centerpiece for the New Year's Eve celebration that became a talking point across local media. Successfully merged the physical and digital worlds, enhancing the traditional New Year's Eve experience with cutting-edge technology. Demonstrated the resilience and adaptability of virtual production techniques in challenging, live event scenarios.
This project showcases the power of Unreal Engine and virtual production in creating memorable, large-scale public experiences. By leveraging advanced real-time rendering techniques and creative environment design, we were able to transform City Hall into a canvas for a virtual journey through Los Angeles and beyond.

For Walmart's 2023 Annual Shareholders Conference and Associate Celebration Show, I played a key role in creating an unprecedented immersive digital experience. Building on the success of the 2022 event, we pushed the boundaries of virtual production and augmented reality to deliver a hybrid event that seamlessly blended real-world elements with cutting-edge digital environments. The event, held in Walmart's Arena, captivated an audience of 14,000 in-person attendees while millions more watched via live broadcast.
Unreal Engine Mastery: Leveraged advanced features of Unreal Engine for real-time rendering and complex environment creation.
Cesium Integration: Utilized Cesium for Unreal to create a geographically accurate digital twin of Bentonville, Arkansas.
Live Action Integration: Seamlessly blended live stunt performance with digital environments for a "Mission Impossible" inspired sequence.
Augmented Reality: Expanded on previous year's AR capabilities for enhanced broadcast and in-arena experiences.
Digital Twin of Bentonville: Created a highly detailed, geographically accurate recreation of Bentonville, Arkansas using Cesium for Unreal. Implemented real GPS data to ensure precision in the virtual environment. Optimized the digital city for real-time rendering and dynamic camera movements.
Mission Impossible Inspired Sequence: Designed and animated a digital helicopter for a high-stakes action sequence. Coordinated with a live stuntman to integrate real-world action with the digital environment. Collaborated with Tom Cruise to set up and cue the dramatic scene.
Corporate Command Center Hub: Developed a futuristic, interactive command center environment for the main corporate presentation. Integrated real-time data visualization and dynamic content display capabilities. Created a flexible space that could adapt to various presentation needs throughout the event.
Expanded AR capabilities to include more complex and interactive elements: floating props and virtual set extensions visible to both in-arena and broadcast audiences, dynamic data visualizations and charts materializing alongside presenters, and interactive AR elements responding to on-stage actions and presenter cues.
Unreal Engine Implementation: Utilized Unreal's Sequencer for complex, multi-layered visual sequences. Implemented custom shaders and materials for realistic city environments and futuristic command center aesthetics. Leveraged Niagara particle systems for enhanced visual effects in both AR and virtual environments.
Cesium Integration Workflow: Developed a pipeline to import and optimize large-scale geographical data into Unreal Engine. Created a level-of-detail system for the Bentonville digital twin to maintain performance across various shot scales. Implemented a dynamic time-of-day system for realistic lighting changes in the virtual city.
Live Action Integration: Developed a real-time compositing system to blend live stunt performance with the digital helicopter and city environment. Created a precise timing system to synchronize virtual camera movements with physical camera work. Implemented dynamic lighting and physics simulations to enhance the realism of the integrated sequence.
Broadcast and In-Arena Optimization: Developed a dual-output system to deliver tailored experiences for in-arena screens and broadcast feed. Implemented efficient LOD systems to maintain high visual fidelity across diverse viewing scenarios. Created a robust failover system to ensure uninterrupted visual delivery throughout the live event.
Scale and Detail: Balanced the need for a vast, detailed digital Bentonville with real-time performance requirements through advanced optimization techniques.
Live Integration: Developed precise synchronization methods to seamlessly blend live stunt work with digital environments.
Diverse Audience: Created visual experiences that were impactful both for in-person attendees and remote viewers, tailoring content for different viewing mediums.
Delivered a groundbreaking visual experience that elevated Walmart's innovative image and corporate vision. Created memorable, cinematic moments that resonated with both associates and shareholders. Provided a flexible digital platform that enhanced the delivery of key corporate messages and data presentations. Set a new standard for corporate events by blending Hollywood-level production with cutting-edge virtual technologies.
This project showcases the pinnacle of virtual production and augmented reality in corporate events. By creating a digital twin of Bentonville, integrating live action sequences, and developing immersive AR experiences, we delivered an unparalleled visual spectacle that not only entertained but also effectively communicated Walmart's corporate vision and achievements.
For Sofi Tukker's groundbreaking Coachella 2023 performance, I was responsible for creating a series of captivating, interactive Unreal Engine environments that transformed the stage into multiple dynamic worlds. This project pushed the boundaries of live performance visuals, seamlessly blending music with real-time rendered environments, animated creatures, and responsive visual effects.
Unreal Engine Mastery: Leveraged Unreal Engine's cutting-edge features, including Nanite, to create highly detailed, performant environments.
DMX Integration: Implemented DMX control for real-time manipulation of environments, lighting, and camera movements.
Dynamic World-Building: Created diverse ecosystems from African savannas to underwater realms, each with unique characteristics and animations.
African Savanna: Designed a vast savanna landscape with day-night cycles. Animated realistic giraffes that interacted with the environment.
Underwater World: Crafted an immersive underwater environment with dynamic lighting. Animated jellyfish and implemented Blueprint-controlled fish behaviors.
Swamp Ecosystem: Developed a detailed swamp environment with realistic water effects. Animated alligators that brought the scene to life.
Sofi Tukker Themed World: Created a fantastical world inspired by the artists' aesthetic. Implemented dense Nanite foliage for unprecedented detail and performance.
DMX-Controlled Cameras: Developed a system for real-time camera control via DMX, allowing for dynamic shot changes synchronized with the music.
Animated VFX: Created responsive lighting systems that reacted to the music, underwater caustics and particle effects, and Alembic-driven animated lotus flowers.
Day-Night Cycle: Implemented a dynamic time-of-day system that transformed environments throughout the performance.
Environment Optimization: Utilized Nanite technology for high-detail, performance-efficient foliage and landscapes. Implemented LOD systems for animated elements to maintain smooth performance. Optimized materials and shaders for real-time rendering in a live performance context.
Animation Systems: Developed custom animation blueprints for various creatures, ensuring lifelike movements and interactions. Created a modular system for blending between different animation states based on performance cues.
DMX Integration: Designed a robust DMX control system for real-time environment switching, dynamic lighting adjustments, and triggered camera movements and effects. Implemented failsafes and backup systems to ensure continuous operation during the live performance.
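At the protocol level, DMX512 delivers one unsigned byte (0-255) per channel, so driving a virtual parameter from a lighting console comes down to remapping that byte into the parameter's range. A hedged sketch of the two common cases, 8-bit and 16-bit (coarse + fine) channels; the helper names are illustrative, not Unreal's DMX plugin API:

```cpp
#include <cassert>
#include <cstdint>

// Remap an 8-bit DMX channel value into an arbitrary parameter range,
// e.g. a virtual camera's field of view or a light's intensity.
float DmxToRange(uint8_t value, float outMin, float outMax) {
    return outMin + (outMax - outMin) * (static_cast<float>(value) / 255.0f);
}

// 16-bit "fine" control spreads one parameter across two channels
// (coarse + fine bytes), giving 65536 steps instead of 256 -- useful
// for smooth camera moves where 8-bit stepping would be visible.
float Dmx16ToRange(uint8_t coarse, uint8_t fine, float outMin, float outMax) {
    uint16_t combined = static_cast<uint16_t>(coarse) << 8 | fine;
    return outMin + (outMax - outMin) * (static_cast<float>(combined) / 65535.0f);
}
```

Mapping camera moves to 16-bit channel pairs is what makes console-driven shots feel smooth rather than stepped.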
VFX Development: Crafted a library of performant visual effects using Unreal's Niagara system. Developed a system for synchronizing VFX with musical beats and performance cues.
Performance Optimization: Balanced visual fidelity with real-time performance requirements through aggressive optimization and efficient asset management.
Live Integration: Worked closely with the performance team to ensure seamless integration of digital elements with the physical stage and performers.
Rapid Iteration: Developed a pipeline for quick updates and modifications to accommodate last-minute creative changes and technical requirements.
Delivered a visually stunning, immersive experience that elevated Sofi Tukker's music and stage presence. Received widespread acclaim for pushing the boundaries of what's possible in live music visual production. Set a new standard for the integration of real-time rendered environments in music festival performances.
This project showcases the power of Unreal Engine in live entertainment, demonstrating how cutting-edge game engine technology can be adapted to create unforgettable music experiences. By blending technical expertise with creative vision, we were able to transport the Coachella audience into multiple fantastical worlds, perfectly complementing Sofi Tukker's unique musical style.

For the launch of Pacific Biosciences' (PacBio) new product at the American Society of Human Genetics (ASHG) conference, I developed a groundbreaking interactive presentation system using Unreal Engine. This award-winning show, held at the Novo Theater in Downtown Los Angeles, captivated a massive live audience with its innovative approach to product demonstration.
Gesture-Controlled Presentation: Created a custom Unreal Engine blueprint system that allowed the CEO to navigate through product slides using hand gestures, reminiscent of the technology in "Minority Report."
DMX Integration: Implemented a DMX-controlled Unreal Engine system for real-time manipulation of environments and visual effects.
Interactive 3D Environments: Developed a series of morphing worlds and interactive environments, including a vast, explorable universe.
Gesture Control System: Designed and implemented a cutting-edge gesture recognition system using Unreal Engine blueprints. Integrated motion capture technology to accurately track the presenter's hand movements. Created intuitive gesture mappings for slide navigation, zoom functions, and interactive element manipulation.
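The slide-navigation gestures described above reduce, at their simplest, to classifying tracked hand motion over a short window. A toy sketch under stated assumptions (a single tracked horizontal coordinate and a dead-zone threshold to suppress false positives; the `ClassifySwipe` helper is hypothetical):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Classify a horizontal swipe from a tracked hand's x-positions over a
// short time window. Movement smaller than `threshold` is ignored so
// incidental hand motion doesn't trigger slide changes.
std::string ClassifySwipe(const std::vector<float>& xPositions,
                          float threshold) {
    if (xPositions.size() < 2) return "none";
    float delta = xPositions.back() - xPositions.front();
    if (delta > threshold) return "next";       // hand swept right
    if (delta < -threshold) return "previous";  // hand swept left
    return "none";  // too small: likely noise, not an intentional gesture
}
```

A production system layers smoothing, per-presenter calibration, and (as noted below) a learned model on top of this kind of geometric baseline.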
DMX-Controlled Environments: Developed a robust DMX control system for real-time environment transitions synchronized with the presentation flow, dynamic lighting adjustments to highlight key product features, and triggered visual effects to emphasize important points.
3D Universe Showcase: Created a vast, explorable 3D universe that served as an overarching metaphor for the product's capabilities. Implemented interactive elements within the universe to represent different aspects of the product.
Morphing Worlds: Designed a series of transforming environments that visually represented the product's various applications and benefits. Developed smooth transition effects between different "worlds" to maintain audience engagement.
Product Visualization: Created detailed, scientifically accurate 3D models of the PacBio product and its components. Implemented interactive exploded views and cutaways to showcase the product's internal workings.
Gesture Control Implementation: Utilized Unreal Engine's VR framework as a base for the gesture recognition system. Developed custom blueprints to interpret and translate hand movements into presentation controls. Implemented a machine learning model to improve gesture recognition accuracy and reduce false positives.
Real-Time Rendering Optimization: Balanced high-fidelity visuals with the need for smooth, real-time performance in a live presentation setting. Implemented dynamic LOD systems for complex 3D models and environments. Optimized lighting and post-processing effects for maximum visual impact with minimal performance overhead.
DMX Integration: Created a custom DMX interface within Unreal Engine to allow for precise control of virtual elements. Developed a cue system that synchronized virtual environment changes with lighting and audio cues in the physical space.
Presenter Training: Worked closely with the CEO to ensure comfort and proficiency with the gesture control system, including developing an intuitive training program.
Fail-Safe Mechanisms: Implemented redundant control systems and quick recovery options to mitigate any potential technical issues during the live presentation.
Complex Data Visualization: Developed innovative ways to represent complex scientific data in visually appealing and easily understandable formats.
The presentation was met with overwhelming positive reception, setting a new standard for product launches in the medical device industry. Received an award for innovative use of technology in corporate presentations. Significantly enhanced audience understanding and engagement with PacBio's new product, contributing to the launch's success.
This project showcases the power of Unreal Engine in creating immersive, interactive presentations for complex scientific products. By blending cutting-edge technology with intuitive design, we were able to transform a traditional product launch into an unforgettable experience.

For the Formula E Racing Concert Series, a high-profile event celebrating electric car racing in the Middle East, I created a suite of captivating VJ visuals. These visuals featured looping car race tracks set in abstract, fantasy sci-fi worlds, blending the excitement of Formula E racing with futuristic, otherworldly environments. The project demanded a perfect fusion of high-octane racing aesthetics with imaginative, cutting-edge visual design.
Unreal Engine Mastery: Leveraged Unreal Engine's advanced features to create high-fidelity, real-time rendered environments suitable for live VJing.
Looping Track Design: Developed seamlessly looping race tracks that could play indefinitely during performances.
Sci-Fi World Building: Created diverse, abstract environments that pushed the boundaries of imagination while maintaining a connection to racing themes.
Real-Time VJ Capabilities: Implemented a system for live manipulation and mixing of visual elements during concerts.
Developed a variety of race track layouts inspired by Formula E circuits but reimagined in fantastical settings: anti-gravity tracks weaving through neon-lit cityscapes, quantum tunnels with light-speed effects and time distortions, biomorphic circuits integrating alien flora and fauna, and energy field tracks with dynamic, reactive surfaces.
Created immersive, otherworldly backdrops for the race tracks: nebula-filled space vistas with surreal celestial bodies, microscopic worlds where tracks weave through molecular structures, cybernetic landscapes merging technology and organic forms, and dimension-bending realms with impossible geometries.
Implemented a range of VFX to enhance the racing atmosphere: energy trails following the path of imaginary racers, particle systems simulating exotic fuels and propulsion methods, holographic overlays displaying race data and track information, and reality-warping effects suggesting high-speed travel through alternate dimensions.
Unreal Engine Implementation: Utilized Unreal's Sequencer for creating complex, layered visual sequences. Employed Blueprint visual scripting for real-time parameter control and effect triggering. Leveraged Niagara particle systems for creating dynamic, reactive visual elements. Implemented custom shaders for unique surface effects on tracks and environments.
VJ System Development: Created a custom VJ interface within Unreal Engine for live performance control. Developed a modular system allowing for real-time mixing and blending between different track and environment combinations. Implemented MIDI control mapping for intuitive, hands-on manipulation of visual elements.
Optimization: Employed aggressive LOD techniques to maintain high frame rates. Developed an efficient asset streaming system to handle rapid transitions between diverse visual themes. Optimized post-processing effects for real-time application without compromising visual quality.
Cultural Sensitivity: Carefully balanced futuristic designs with elements that resonated with Middle Eastern aesthetics and culture.
Performance Demands: Optimized complex sci-fi environments for smooth playback on live event hardware.
Coherent Variety: Developed a visual language that allowed for diverse sci-fi interpretations while maintaining a cohesive link to Formula E racing.
Created a unique visual identity for the Formula E concerts, setting them apart from traditional racing events. Enhanced the futuristic, high-tech image of Formula E through cutting-edge visual representations. Provided an immersive, otherworldly experience that complemented the revolutionary nature of electric racing. Contributed to the cultural significance of the event in the Middle East by showcasing innovative, forward-thinking artistry.

In the Spring of 2022, LEO Events hired XiteLabs to create content for Walmart's Annual Shareholders Conference and Associate Celebration Show. Taking place in Walmart's Arena, the hybrid event brought together an audience of 14,000 Walmart associates and shareholders while millions watched the live broadcast remotely. The conference celebrated Walmart's continued growth and company spirit with all the power of Xite's hybrid event workflows.
Xite was tasked with creating content to accompany Walmart President and CEO Doug McMillon's keynote presentation, and built an immersive world to match its "movie magic" theme. The creative team designed a life-size, mixed-reality movie theatre, complete with finishing touches such as photos of the founding Walton family and Walmart's signature branding.
Xite created a palette of AR elements purpose-built for the live broadcast portion of the conference, including props that sat on or floated above the stage and a virtual theatre marquee protruding over it. These elements delivered an immersive experience both to the in-arena audience, who watched the AR on large screens, and to at-home viewers, who saw the full scope of Xite's hybrid AR show. To further build on McMillon's vision, the broadcast also used augmented reality to display clips of his favorite movies, reinforcing the narrative of the show.
A "Back to the Future" DeLorean rises from the stage and shoots out to orbit an enormous AR globe floating over the audience.
Augmented reality charts materialize onstage next to Doug as he speaks on the continued growth Walmart has seen over the past few years.
An AR drone launches from a tower, flies over the audience and delivers a package in a vibrant illustration of Walmart's new drone delivery capabilities.
A triumphant ending: stars appear over the heads of associates receiving awards and recognition for their service, their names displayed across the movie theatre marquee. The moment brought to life the idea that McMillon sees his associates as bright, shining stars in Walmart's movie, worthy of being celebrated, much like celebrities, for their hard work and incredible spirit.
Custom real-time content, rendered in Unreal Engine across eight synchronized UHD outputs, was displayed on the massive 15,000-pixel-wide LED screen and floor.
Additional augmented reality elements were rendered live with StypeLand for the thousands in the arena and the millions watching worldwide.
RedSpy was used for AR camera tracking, while Blender was used to create the 3D models.

"Movie Box" was a groundbreaking series of NFTs that reimagined storytelling in the digital art space. These unique creations featured action-packed mini short films presented within a 3D "box" reminiscent of action figure packaging. Each NFT showcased dynamic, rotating environments where characters came to life, engaging in fights, narratives, or spectacular visual effects displays. The project was a resounding success, with all NFTs in the series selling out.
Unreal Engine 5 Mastery: Leveraged the cutting-edge features of UE5 to create high-fidelity, real-time rendered environments and characters.
Lumen Global Illumination: Utilized UE5's Lumen technology for dynamic, photorealistic lighting that enhanced the depth and realism of each scene.
Niagara VFX: Implemented complex particle systems for stunning visual effects that brought each mini-story to life.
Advanced Animation Techniques: Combined skeletal mesh motion capture with hand-keyed animations for fluid, expressive character movements.
Unreal Engine Sequencer: Crafted intricate, cinematic sequences to tell compelling stories within the confined space of each "box."
Innovative Presentation Format: Designed a unique "box" format that simulated the experience of opening an action figure package. Implemented a rotation mechanism that allowed viewers to explore the scene from multiple angles.
Diverse Storytelling: Created a variety of narratives ranging from intense fight scenes to character-driven stories. Developed distinct visual styles for each NFT, ensuring a unique experience for every piece.
High-Quality Character Animation: Utilized motion capture technology to achieve realistic base movements. Enhanced animations with hand-keyed details to add personality and style to each character.
Immersive Environments: Crafted detailed, miniature environments that served as dynamic backdrops for each story. Implemented interactive elements within the environments to enhance storytelling.
Animation Pipeline: Recorded base movements using state-of-the-art motion capture technology. Created flexible, efficient character rigs in Unreal Engine. Enhanced motion-captured data with hand-keyed animations for nuanced performances. Composed complex animation sequences using Unreal's Sequencer tool.
Visual Effects Creation: Developed a library of Niagara-based visual effects, including combat impacts, environmental particles, and magical elements. Optimized VFX for performance within the confined space of the "box" format.
Lighting and Rendering: Leveraged Lumen global illumination for dynamic, realistic lighting that adapted to the rotating box mechanic. Implemented ray tracing for enhanced reflections and shadows. Optimized rendering pipeline for high-quality output suitable for NFT platforms.
Performance Optimization: Balanced high-fidelity visuals with the need for smooth playback by implementing efficient LOD systems and optimizing assets.
Narrative Compression: Developed techniques to tell compelling stories within the limited time and space constraints of the "box" format.
NFT Integration: Created a seamless pipeline for exporting and minting high-quality animated NFTs while preserving visual fidelity.
Achieved a 100% sell-out rate for all NFTs in the "Movie Box" series. Received critical acclaim for innovation in the NFT space, blending traditional storytelling with cutting-edge technology. Established a new benchmark for animated NFTs, demonstrating the potential of Unreal Engine 5 in digital art creation.

For "The Grappling Network," a Brazilian Jiu-Jitsu-focused TV/streaming network covering various martial arts, I developed a diverse range of digital environments using Unreal Engine. These environments served as backdrops for sparring matches, UFC-style tournaments, and fantasy fight scenarios. The project spanned from creating highly accurate digital doubles of real-world arenas to designing fantastical fighting venues inspired by popular video game franchises.
Unreal Engine Mastery: Leveraged advanced features of Unreal Engine to create high-fidelity, real-time rendered environments.
Photorealistic Rendering: Utilized state-of-the-art rendering techniques to achieve lifelike representations of real-world arenas.
Creative World-Building: Designed imaginative fighting venues drawing inspiration from iconic games like Mortal Kombat, Street Fighter, Tekken, and Dead or Alive.
Performance Optimization: Ensured smooth real-time performance for live broadcast and streaming applications.
Developed detailed digital doubles of famous MMA and BJJ arenas, including accurate lighting setups mimicking live event conditions, dynamic crowd systems for enhanced atmosphere, and precise recreations of arena layouts and features.
Created a series of fantastical fighting venues: mystical temple arenas with dynamic elemental effects, futuristic cityscapes with neon-lit fighting platforms, ancient colosseum-inspired settings with mythological themes, and otherworldly landscapes defying real-world physics.
Implemented reactive environmental features such as destructible objects and surfaces that respond to fight actions, dynamic lighting changes reflecting the intensity of matches, and interactive backgrounds that react to significant moments in fights.
Environment Creation: Extensive research into real-world venues and fighting game aesthetics. High-detail modeling of arena elements and custom texture creation. Advanced lighting techniques including dynamic global illumination and real-time shadows. Complex, physically-based materials for realistic and fantastical surfaces.
Performance Optimization: Implemented efficient LOD systems for maintaining visual quality at various camera distances. Utilized Niagara for optimized particle effects. Developed a custom culling system to manage complex scene elements in real-time.
Broadcast Integration: Created a flexible camera system for dynamic shot composition during live broadcasts. Implemented a real-time compositing solution for blending live footage with digital environments.
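The distance-based LOD selection mentioned in the optimization notes above can be illustrated with a minimal sketch. The level names, band count, and distance thresholds below are hypothetical stand-ins, not values from the project, which ran inside Unreal Engine's own LOD system.

```python
# Minimal sketch of distance-based LOD selection (illustrative thresholds).
from dataclasses import dataclass

@dataclass
class LODLevel:
    mesh_name: str       # hypothetical asset identifier
    max_distance: float  # use this level while camera distance <= max_distance

def select_lod(levels: list[LODLevel], camera_distance: float) -> LODLevel:
    """Return the first LOD whose distance band covers the camera distance."""
    for level in levels:
        if camera_distance <= level.max_distance:
            return level
    return levels[-1]  # beyond the last band, fall back to the coarsest mesh

arena_lods = [
    LODLevel("arena_hi", 1500.0),        # full-detail mesh for close shots
    LODLevel("arena_mid", 5000.0),       # reduced mesh for mid-range cameras
    LODLevel("arena_lo", float("inf")),  # impostor/billboard for wide shots
]

print(select_lod(arena_lods, 800.0).mesh_name)   # arena_hi
print(select_lod(arena_lods, 9000.0).mesh_name)  # arena_lo
```

The same band structure extends naturally to culling: any object whose distance exceeds its final band can simply be skipped for the frame.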
Balancing Realism and Fantasy: Developed a cohesive visual style that allowed realistic and fantastical elements to coexist believably.
Real-Time Performance: Optimized environments for live broadcast demands, ensuring high visual fidelity without compromising on frame rate.
Rapid Iteration: Created a modular design system allowing for quick customization and themed variations of fighting arenas.
Elevated the visual quality of The Grappling Network's broadcasts, setting a new standard in martial arts sports presentation. Provided versatile virtual environments that enhanced storytelling and viewer engagement across various fight formats. Enabled the creation of unique, branded fight experiences that distinguished The Grappling Network from traditional sports broadcasts.

This portfolio showcases a diverse range of music-related projects, including pre-rendered music videos, live concert visuals, and interactive VJing experiences. Leveraging the power of Unreal Engine and integrating advanced audio-reactive technologies, I've created immersive, dynamic visual experiences that synchronize perfectly with music across various genres and performance settings.
Unreal Engine Utilization: Mastered a wide array of Unreal Engine features to create visually stunning and performance-optimized content.
Audio Reactivity: Implemented sophisticated audio-reactive systems for real-time visual responses to music.
Live Integration: Developed robust solutions for live VJing and concert visuals using OSC and Ableton Live.
Beat-Synced Animations: Created precise, rhythm-matched animations for both pre-rendered and real-time content.
Visual Fidelity: Nanite for incredibly detailed environments, Lumen for dynamic global illumination, and Ray Tracing for high-fidelity reflections and shadows.
Performance and Optimization: Custom HLSL shaders for unique visual effects and Niagara for complex, music-reactive particle systems.
Animation and Sequencing: Control Rig for flexible character rigs and Sequencer for intricate, beat-synced sequences.
Interactivity: Blueprint Visual Scripting for responsive, audio-reactive systems and real-time visual effects.
OSC Implementation: Developed a robust OSC interface between Unreal Engine and audio software for real-time parameter control. Created a flexible mapping system to link audio features to visual elements.
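To illustrate the kind of message such an OSC bridge carries, here is a minimal standard-library sketch that packs a single float parameter into a raw OSC message (null-padded address, ",f" type tag, big-endian float32) and sends it over UDP. The address, host, and port are hypothetical stand-ins, not the project's actual mapping.

```python
# Sketch of an OSC float message built by hand (stdlib only, per the
# OpenSoundControl 1.0 wire format); address/host/port are illustrative.
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_float_message(address: str, value: float) -> bytes:
    """Build a minimal OSC message carrying one float32 argument."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

def send_param(address: str, value: float, host="127.0.0.1", port=8000):
    """Fire-and-forget UDP send, e.g. a bass-energy value driving a visual."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_float_message(address, value), (host, port))

msg = osc_float_message("/visuals/bass_energy", 0.75)
print(msg[:20])  # b'/visuals/bass_energy'
```

In practice a library such as python-osc (or Unreal's built-in OSC plugin) handles this encoding; the sketch just shows what travels over the wire.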
Ableton Live Integration: Established a bi-directional communication system between Ableton Live and Unreal Engine for precise audio-visual synchronization. Implemented MIDI mapping for hands-on control of visual parameters during live performances.
Beat Detection and Analysis: Developed a real-time beat detection system for accurate rhythm-based visual triggering. Implemented frequency analysis to drive dynamic visual responses across the audio spectrum.
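A spectral-flux onset detector is one common way to implement this kind of rhythm-based triggering. The sketch below (NumPy only, illustrative frame size and threshold, not the project's actual in-engine pipeline) rectifies the frame-to-frame increase in FFT magnitudes and flags frames where that flux spikes.

```python
# Sketch of spectral-flux onset detection for beat-triggered visuals.
import numpy as np

def spectral_flux(signal: np.ndarray, frame_size: int = 1024) -> np.ndarray:
    """Sum of positive magnitude increases between consecutive FFT frames."""
    n_frames = len(signal) // frame_size
    frames = signal[: n_frames * frame_size].reshape(n_frames, frame_size)
    mags = np.abs(np.fft.rfft(frames, axis=1))
    diff = np.diff(mags, axis=0)          # change per bin between frames
    return np.maximum(diff, 0.0).sum(axis=1)  # keep only energy increases

def detect_onsets(flux: np.ndarray, threshold_ratio: float = 2.0) -> np.ndarray:
    """Frame indices where flux jumps above a multiple of its mean."""
    return np.flatnonzero(flux > threshold_ratio * flux.mean()) + 1

# Synthetic check: silence, then a burst of noise -- one clear onset.
rng = np.random.default_rng(0)
signal = np.zeros(8192)
signal[4096:] = rng.standard_normal(4096)
flux = spectral_flux(signal)
print(detect_onsets(flux))  # expected: [4], the first noise frame
```

Each detected frame index would then trigger a visual event, while the per-bin magnitudes feed the frequency-band responses described above.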
Pre-rendered Music Videos: Created narrative-driven visual experiences with cinema-quality rendering. Developed unique visual styles tailored to each artist's aesthetic and music genre. Implemented complex character animations and environmental interactions synced to musical elements.
Live Concert Visuals: Designed adaptable visual systems capable of responding to live musical performances. Created immersive stage environments with real-time lighting and effects. Developed failsafe systems to ensure consistent visual quality during live shows.
Interactive VJing Experiences: Built a customizable VJing toolkit within Unreal Engine for on-the-fly visual mixing. Implemented layer-based compositing for complex, multi-source visual performances. Created a library of beat-synced loops and effects for diverse musical styles.
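Layer-based compositing of the kind a VJ toolkit performs can be sketched as a fader-weighted "over" blend. The layer contents and fader values below are illustrative; the actual toolkit composited render targets inside Unreal Engine rather than NumPy arrays.

```python
# Sketch of fader-weighted layer compositing for live visual mixing.
import numpy as np

def composite(layers: list[np.ndarray], faders: list[float]) -> np.ndarray:
    """Blend each RGB layer over the running output, scaled by its fader (0..1)."""
    out = np.zeros_like(layers[0], dtype=float)
    for layer, fader in zip(layers, faders):
        out = out * (1.0 - fader) + layer * fader  # simple alpha blend
    return out

red = np.tile([1.0, 0.0, 0.0], (4, 4, 1))   # stand-in 4x4 solid-color frames
blue = np.tile([0.0, 0.0, 1.0], (4, 4, 1))
frame = composite([red, blue], [1.0, 0.5])  # blue at half opacity over red
print(frame[0, 0])  # [0.5 0.  0.5]
```

Mapping each fader to a MIDI controller knob (as in the Ableton Live integration described earlier) turns this into hands-on, on-the-fly visual mixing.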
Content coming soon — check back for project details, images, and videos.
Explorations into Generative AI, Media Creation and Filmmaking
Natural-language prompt development for controlled generative output. Skilled in nuanced phrasing for stylistic, filmic, cinematic, artistic, and design-based detailing. Expert knowledge of various platforms and their specific tags, parameters, command-line options, input adjustments, and referencing.
Expert in command-line parameters, cinematic and stylistic character creation, and upscaling results for final-pixel use.
Stable Diffusion, Midjourney, DALL·E, Microsoft Bing, ChatGPT, HeyGen, ElevenLabs