INTRO
Technodramatists is an industry-leading AR tech and entertainment company that incorporates emerging technologies into live performance and art installations. By tracking performers’ movement and expression, captivating virtual artwork merges with reality and adds a new dimension to the theatre experience. We built two systems, Absolute Motion and weARlive, which enable 3D sets and props, avatars, VFX (visual effects), and lens filters to respond interactively to a performer’s motion, sound, surroundings, and facial expressions.
MY ROLE :
Lead AR/VR Artist, UX/UI Designer
Led art and design, created interactive 3D assets and UX/UI, and directed the artistic vision
TEAM :
Lorne Svarc, Justina Yeung, Yashwanth Iragattapu
TOOLS :
Blender, Maya, Cinema 4D, Unity, Adobe XD, Photoshop, Xcode
METHODOLOGIES :
3D Modeling, 3D Animation, Avatar Face & Body Rigging, UX/UI, Usability Testing, Interface Design, VFX, Particle Effects, Lens Filters, Art Direction
PRODUCTION 1
Off-Broadway show
Alibi is one of our Augmented Reality theatre productions using Absolute Motion and weARlive; it went live on Feb 6, 2021 at the Gene Frankel Theatre, NYC. Inspired by Tristan Tzara's 1924 French Dadaist play Handkerchief of Clouds, Alibi immerses the audience in a story of a love triangle, crime, art, and materialism, told through fragments of movement, music, and absurdity combined with AR.
VISUAL ART PROCESS
KEYWORDS : Absurdity, Dada, Contrast between 1920s and modern, Steampunk, German Expressionism, Red, Organic, Tethering
I created 3D assets (objects, particle effects, animations, etc.) according to the artistic vision. Below is how they merged with the stage and performers as AR effects.
PRODUCTION 2
Live AR Music Performance
Gioco: The Abstractnaut is a livestreamed music concert with Augmented Reality using Absolute Motion, the iOS system we built for real-time AR visual effects. It was livestreamed on Apr 15, 2021 from Montreal. Gioco, the performer, searches for meaning in sex, substances, culture, fame, and love in a chaotic universe.
VISUAL ART PROCESS
KEYWORDS : Psychedelic, Hallucinating, Color spectrum, Science fiction, Illusion
According to the artistic vision, I created materials & shaders for models and props.
ASTRONAUT
PLANET
SPACESHIP
I created AR effects (rigged avatars, VFX, animated 3D-scan models, objects, camera filters), and they were integrated seamlessly into the show.
INTERACTIVE ASSETS CREATION
Cloth tag and wind simulation in Cinema 4D > Texture sheet animation in Unity particle effects
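Unity's Texture Sheet Animation module consumes a flipbook: a grid of frames packed into a single texture. Below is a minimal sketch of that packing step, assuming the Cinema 4D cloth simulation has been rendered out as a PNG frame sequence; the folder, filenames, and 8 x 8 grid are placeholders for illustration.

# Minimal sketch: pack a rendered PNG frame sequence (e.g. from the Cinema 4D
# cloth/wind simulation) into one flipbook texture for Unity's particle
# Texture Sheet Animation module. Filenames and grid size are assumptions.
from PIL import Image
import glob

frames = sorted(glob.glob("cloth_render/frame_*.png"))  # hypothetical render output
cols, rows = 8, 8                                        # 64-frame flipbook grid
cell_w, cell_h = Image.open(frames[0]).size

sheet = Image.new("RGBA", (cell_w * cols, cell_h * rows), (0, 0, 0, 0))
for i, path in enumerate(frames[: cols * rows]):
    x = (i % cols) * cell_w
    y = (i // cols) * cell_h
    sheet.paste(Image.open(path), (x, y))

sheet.save("cloth_flipbook.png")  # import into Unity and set Tiles to 8 x 8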
3D modeling of the flower bud and blooming flower in Blender > Vertex animation > Extract the animation data as a texture (generate an EXR texture of vertex positions for each frame of animation) > Apply it in a Unity shader > Link to the AR body
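As a rough illustration of the extraction step, the sketch below uses Blender's Python API to sample the evaluated mesh on every frame and write the vertex positions into a float EXR, one row per frame and one column per vertex. The object name and frame range are assumptions; a production vertex-animation-texture pipeline would also normalize the positions and pass the bounds to the Unity shader.

# Minimal Blender (bpy) sketch: bake the blooming-flower vertex animation into
# an EXR where each row is one frame and each column is one vertex (XYZ in RGB).
import bpy

obj = bpy.data.objects["FlowerBloom"]        # hypothetical object name
scene = bpy.context.scene
start, end = scene.frame_start, scene.frame_end
num_frames = end - start + 1
num_verts = len(obj.data.vertices)

img = bpy.data.images.new("FlowerVAT", width=num_verts, height=num_frames,
                          alpha=True, float_buffer=True)
pixels = [0.0] * (num_verts * num_frames * 4)

for row, frame in enumerate(range(start, end + 1)):
    scene.frame_set(frame)
    depsgraph = bpy.context.evaluated_depsgraph_get()
    eval_obj = obj.evaluated_get(depsgraph)
    mesh = eval_obj.to_mesh()                # evaluated mesh for this frame
    for col, v in enumerate(mesh.vertices):
        i = (row * num_verts + col) * 4
        pixels[i:i + 4] = [v.co.x, v.co.y, v.co.z, 1.0]
    eval_obj.to_mesh_clear()

img.pixels[:] = pixels
img.filepath_raw = "//FlowerVAT.exr"
img.file_format = "OPEN_EXR"
img.save()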
Rig the avatar to the AR skeleton in Cinema 4D > Export the Astronaut's mesh and the AR skeleton together > Import into Unity
Blendshapes are an ARKit feature that enables face sync via the camera of iOS devices. Model the 52 different facial positions (e.g. eyeBlinkLeft, jawRight) in Maya > Name each blendshape node properly with a prefix > Import into Unity > Refine textures and materials
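For the naming step, a small Maya script can keep the 52 target names consistent before export. The sketch below (maya.cmds) prefixes the sculpted target meshes and wires them into a single blendShape deformer so the names survive the FBX round trip to Unity; the base mesh, group, prefix, and node names are assumptions for illustration.

# Minimal Maya (maya.cmds) sketch: prefix the 52 ARKit face targets and bind
# them to the neutral head with one blendShape node.
import maya.cmds as cmds

BASE_MESH = "head_neutral"          # hypothetical neutral face mesh
TARGET_GROUP = "arkit_targets_grp"  # hypothetical group holding the 52 sculpts
PREFIX = "ARKit_"

targets = cmds.listRelatives(TARGET_GROUP, children=True, type="transform") or []

renamed = []
for t in targets:
    new_name = t if t.startswith(PREFIX) else cmds.rename(t, PREFIX + t)
    renamed.append(new_name)

# One blendShape deformer with all 52 targets (eyeBlinkLeft, jawRight, ...);
# the base object is listed last.
cmds.blendShape(renamed + [BASE_MESH], name="ARKitFace_blendShape")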
TECHNICAL ART STRATEGY
To achieve the best performance on each target device, I applied several optimization tactics.
Texture Atlas
Baking Textures
LOD (Level of Detail)
Light Baking, Polygon Count Optimization, etc.
A texture atlas maps many separate textures into a single texture. It improves FPS because objects sharing the atlas can be drawn in a single draw call. First, check whether any object uses multiple materials; combine them into one by creating a new texture containing all the parts and redoing the UV mapping to match. Then repeat the same process across multiple objects to create one texture containing all of their materials.
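As a simple illustration of the combining step, the sketch below packs four per-material textures into a 2 x 2 atlas with Pillow and prints the offset and scale each part's UVs need so they land on the correct tile. The file names and atlas layout are placeholders.

# Minimal sketch of the atlasing step: paste several per-material textures into
# one 2x2 atlas and compute the UV transform each mesh part needs.
from PIL import Image

TILE = 1024
sources = ["body.png", "props.png", "cloth.png", "metal.png"]  # hypothetical maps
atlas = Image.new("RGBA", (TILE * 2, TILE * 2))

uv_transforms = {}
for i, path in enumerate(sources):
    col, row = i % 2, i // 2
    tex = Image.open(path).resize((TILE, TILE))
    atlas.paste(tex, (col * TILE, row * TILE))
    # new_uv = offset + old_uv * 0.5  (image row 0 is the top, UV v = 0 is the bottom)
    uv_transforms[path] = {"offset": (col * 0.5, (1 - row) * 0.5), "scale": 0.5}

atlas.save("atlas.png")
print(uv_transforms)  # apply these offsets/scales when redoing the UV mapping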
Texture baking means precomputing shading into textures ahead of real-time rendering. Rendering is much faster because the baked textures do not have to be recomputed every frame. In Blender, the bake target is set up by linking an image texture node into each material's node tree.
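A minimal Blender Python sketch of that setup is below: it adds an image texture node to each of the object's materials, makes it the active node (the bake writes into the active image node), and bakes the diffuse color. The object name, image size, and bake type are assumptions for illustration.

# Minimal Blender (bpy) sketch of the bake setup and bake call.
import bpy

obj = bpy.data.objects["StageProp"]              # hypothetical object
bake_img = bpy.data.images.new("StageProp_bake", 2048, 2048)

for slot in obj.material_slots:
    slot.material.use_nodes = True
    nodes = slot.material.node_tree.nodes
    tex_node = nodes.new("ShaderNodeTexImage")
    tex_node.image = bake_img
    nodes.active = tex_node                      # bake target for this material

bpy.context.view_layer.objects.active = obj
obj.select_set(True)
bpy.context.scene.render.engine = "CYCLES"
bpy.ops.object.bake(type="DIFFUSE", pass_filter={"COLOR"})

bake_img.filepath_raw = "//StageProp_bake.png"
bake_img.save()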
LOD rendering reduces the number of triangles rendered for an object as its distance from the camera increases. LOD reduces the load on the hardware and improves rendering performance.
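One way to prepare the LOD meshes offline is sketched below in Blender Python: duplicate the high-poly object and apply a Decimate modifier at decreasing ratios, following Unity's _LOD0/_LOD1/_LOD2 naming convention so an LOD Group can pick the meshes up on import. The object name and ratios are assumptions.

# Minimal Blender (bpy) sketch: generate LOD meshes via the Decimate modifier.
import bpy

src = bpy.data.objects["Spaceship"]          # hypothetical high-poly source
ratios = {"LOD0": 1.0, "LOD1": 0.5, "LOD2": 0.2}

for suffix, ratio in ratios.items():
    lod = src.copy()
    lod.data = src.data.copy()
    lod.name = f"{src.name}_{suffix}"
    bpy.context.collection.objects.link(lod)
    if ratio < 1.0:
        mod = lod.modifiers.new("decimate", "DECIMATE")
        mod.ratio = ratio                    # fraction of triangles to keep
        bpy.context.view_layer.objects.active = lod
        bpy.ops.object.modifier_apply(modifier=mod.name)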
UX/UI DESIGN
ABOUT
weARlive is Technodramatists' AR platform for controlling live theatrical face-sync performances, blending real actors, AR, and real and virtual backgrounds into one combined performance. The system enables actors to control, animate, and impersonate virtual characters; streams the actor's facial features and emotions onto the avatar; and provides general oversight of the theatrical process across multiple output screens. weARlive consists of two apps: one for iOS devices that captures the actor's face, and one for PC that syncs avatars to the actors using the data from the iOS app. I led the UX/UI redesign of the PC app.
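As an illustrative sketch only, not the actual weARlive protocol: one plausible way the iOS capture app could stream per-frame ARKit blendshape coefficients to the PC app is a small JSON packet over UDP, keyed by the standard ARKit blendshape names. The address, port, and packet layout below are assumptions.

# Hypothetical face-data stream from the iOS capture side to the PC app.
import json
import socket
import time

PC_ADDR = ("192.168.0.10", 9000)   # hypothetical address of the PC app
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_face_frame(coefficients):
    """coefficients: ARKit blendshape name -> weight in [0, 1]."""
    packet = {"t": time.time(), "blendshapes": coefficients}
    sock.sendto(json.dumps(packet).encode("utf-8"), PC_ADDR)

# Example frame with two of the 52 coefficients
send_face_frame({"eyeBlinkLeft": 0.85, "jawRight": 0.1})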
CHALLENGES
How might we improve the user experience of the original weARlive PC app, enhance its intuitive usability, and provide users with a prompt and seamless workflow in live theatre production?
TARGET USERS & SETTINGS
Directors, associate directors, artistic directors, set designers, lighting designers, stage technicians, and others responsible for the art and technology of a live theatre production.
MAJOR FEATURES
Building virtual 3D scenes composed of selected avatars, props, lighting, and backgrounds.
Saving multiple camera presets based on each camera's position and angle.
Switching between saved scenes and cameras in real time according to the narrative of the show.
Controlling each 3D element's properties - position, rotation, scale, locked, enabled, etc.
To define the design challenges, I performed usability testing of the original interface with a stage director, an art director, an associate designer, and a software engineer. The key takeaways are as follows.
MAINTAIN SUFFICIENT WORKING SPACE
Reserve the maximum working space (the 3D area) for a seamless, uninterrupted experience. Functions can be organized tactically in a compact area, and a hide / show toggle can be applied to the character list.
INTUITIVE AND STREAMLINED FUNCTION AREA
Utilize icons / arrows / toggles / sliders according to their affordances. Allow users to make prompt decisions at every step without second-guessing; time is critical in live theatre production.
CASE STUDY OF SUCCESSFUL DESIGN SOFTWARE
I did a case study of the interface design of Adobe XD, Sketch, and Blender to research how they make the best use of limited space while exposing as many functions as possible.
PROVIDE THE PROPER TITLES / TERMS
Testers had difficulty understanding several terms, such as “Menu,” “Play Event,” and “Shot.” Replace them with clearer titles, pair them with icons, or relocate them to facilitate understanding.
I built an aesthetic strategy based on the key insights from the usability testing and the case study.
I designed the UI, visuals, and user flow around 3 key screens - Home, Scene Editor, and Play Mode.