Min Kwak

 
 

INTRO

The First Augmented Reality Visual Effects Systems for Live Theatre Production

Technodramatists is an industry-leading AR technology and entertainment company that incorporates emerging technologies into live performance and art installation. By tracking performers’ movement and expression, a variety of captivating virtual artworks unite with reality and add a new dimension to the theatre experience. We built two systems, Absolute Motion and weARlive, which enable 3D sets and props, avatars, VFX (visual effects), and lens filters to respond interactively to a performer’s motion, sound, surroundings, and facial expressions.

 
 
 

MY ROLE :

Lead AR/VR Artist, UX/UI Designer

Leading the art and design, creating 3D interactive assets and UX/UI, and directing artistic vision

TEAM :

Lorne Svarc, Justina Yeung, Yashwanth Iragattapu

 

TOOLS :

Blender, Maya, Cinema 4D, Unity, Adobe XD, Photoshop, Xcode

METHODOLOGIES :

3D Modeling, 3D Animation, Avatar Face & Body Rigging, UX/UI, Usability Testing, Interface Design, VFX, Particle Effects, Lens Filters, Art Direction

 
 
 

PRODUCTION 1

Alibi

Off-Broadway show

Alibi is one of our Augmented Reality theatre productions using Absolute Motion and weARlive; it went live on Feb 6, 2021 at the Gene Frankel Theatre, NYC. Inspired by Tristan Tzara's 1924 French Dadaist play Handkerchief of Clouds, Alibi immerses the audience in a story of a love triangle, crime, art, and materialism, told through fragments of movement, music, and absurdity combined with AR.

 
 
 

VISUAL ART PROCESS

Artistic Reference

KEYWORDS : Absurdity, Dada, Contrast between 1920s and modern, Steampunk, German Expressionism, Red, Organic, Tethering

 
[Reference images: Ernst Ludwig Kirchner's Street, Dada artworks, Karl Waldmann Museum, Bauhaus costumes (courtesy of Charnel House), Malin Bülow]
 
 
 

3D Assets Creation

I created 3D assets for the show's AR effects, including 3D objects, animation, VFX, and avatars.

 
 
 

Show Visuals

I created 3D assets (objects, particle effects, animation, etc.) according to the artistic vision. Below is how they merge with the stage and performers as AR effects.

 
 
 

PRODUCTION 2

Gioco: The Abstractnaut

Live AR Music Performance

Gioco: The Abstractnaut is a livestreamed music concert with Augmented Reality, using Absolute Motion, the iOS system we built for real-time AR visual effects. It was livestreamed on Apr 15, 2021 from Montreal. Gioco, the performer, searches for meaning in sex, substances, culture, fame, and love in a chaotic universe.

 
 
 

VISUAL ART PROCESS

Artistic Reference

KEYWORDS : Psychedelic, Hallucinating, Color spectrum, Science fiction, Illusion

 
 

3D Assets Artwork

Following the artistic vision, I created materials and shaders for the models and props.

  • ASTRONAUT

  • PLANET

  • SPACESHIP

 
 
 

Show Visuals

I created AR effects (rigged avatars, VFX, animated 3D-scanned models, objects, camera filters), and they were integrated seamlessly into the show.

 
 
 

INTERACTIVE ASSETS CREATION

Floating Letter

Cloth tag and wind simulation in Cinema 4D > Texture sheet animation in Unity's particle system

 
 
Texture sheets (300 and 600 frames)
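The texture-sheet playback amounts to mapping a particle's age to a frame of the sheet, then mapping that frame to a UV offset into the grid. A minimal sketch of the idea (the grid layout and top-to-bottom reading order follow Unity's convention; the function names are my own):

```python
def sheet_frame(age, lifetime, total_frames):
    """Pick which frame of the texture sheet a particle shows,
    based on its normalized age (0 at birth, 1 at death)."""
    t = min(max(age / lifetime, 0.0), 1.0)
    return min(int(t * total_frames), total_frames - 1)

def frame_uv_offset(frame, cols, rows):
    """Convert a frame index into the (u, v) offset of its grid cell.
    Row 0 is the top of the sheet; UV origin is bottom-left."""
    col = frame % cols
    row = frame // cols
    return (col / cols, 1.0 - (row + 1) / rows)
```

A particle halfway through its life on a 300-frame sheet would sample frame 150.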
 
 
 

Blooming Flowers

3D modeling of a flower bud and blooming flower in Blender > Vertex animation > Extract the animation data as a texture (generate an EXR texture of vertex positions for each frame of animation) > Apply it in a Unity shader > Link to the AR body
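The extraction step above bakes per-frame vertex positions into a texture the shader can sample. A simplified NumPy sketch of that packing and the shader-side decode (a stand-in for the actual Blender/EXR export; names and layout are assumptions):

```python
import numpy as np

def bake_vertex_animation_texture(frames):
    """Pack per-frame vertex positions into a float texture.

    frames: array of shape (n_frames, n_verts, 3).
    Returns the texture (one row per frame, one column per vertex)
    plus the bounds needed to un-normalize positions in the shader.
    """
    frames = np.asarray(frames, dtype=np.float32)
    lo = frames.min(axis=(0, 1))                   # per-axis minimum
    hi = frames.max(axis=(0, 1))                   # per-axis maximum
    span = np.where(hi > lo, hi - lo, 1.0)         # avoid divide-by-zero
    tex = (frames - lo) / span                     # normalize to [0, 1]
    return tex, lo, span

def decode(tex, lo, span, frame, vert):
    """What the Unity shader does per vertex: sample and un-normalize."""
    return tex[frame, vert] * span + lo
```

The shader then advances the sampled row over time, so the mesh animates without any skeleton.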

 
 

Astronaut Avatar

Rig the avatar to the AR skeleton in Cinema 4D > Export the Astronaut’s mesh and the AR skeleton together > Import into Unity

 
 
 
 

Face Sync with Blendshapes

Blend shapes are an ARKit feature that enables face sync via the camera of iOS devices. Make 52 different facial positions (e.g. eyeBlinkLeft, jawRight) in Maya > Name each blend shape node properly with its prefix > Import to Unity > Refine textures and materials
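Under the hood, a blend-shape rig is a linear combination of the neutral mesh and per-target offsets, with ARKit supplying the 52 coefficients every frame. A minimal sketch (array shapes and the function name are illustrative):

```python
import numpy as np

def apply_blendshapes(base, targets, weights):
    """Blend a face mesh from ARKit-style coefficients.

    base:    (n_verts, 3) neutral face mesh.
    targets: dict mapping blend-shape name -> (n_verts, 3) target mesh.
    weights: dict mapping blend-shape name -> coefficient in [0, 1],
             e.g. {"eyeBlinkLeft": 1.0, "jawRight": 0.3}.
    """
    result = base.astype(np.float32).copy()
    for name, w in weights.items():
        result += w * (targets[name] - base)   # add weighted offset
    return result
```

With all weights at 0 the avatar shows the neutral face; a weight of 1 morphs fully into that target.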

 
 
 
 
 
 

TECHNICAL ART STRATEGY

Optimization for Real-Time Rendering

To achieve the best performance on each medium and machine, I applied several optimization techniques.

  • Texture Atlas

  • Baking Textures

  • LOD (Level of Detail)

  • Light Baking, Polygon Count Optimization, etc.

Texture Atlas

A texture atlas maps many separate textures into a single texture. It improves FPS because everything that shares the atlas can be rendered in a single draw call. First, check whether any objects use multiple materials. Combine those materials into one by creating a new texture containing all the parts and redoing the UV mapping accordingly. Then repeat the same process to create one texture containing the materials of multiple objects.

 
 

Baking Textures

Texture baking means rendering lighting and shading into textures ahead of time, before real-time rendering. Real-time rendering becomes much faster because the baked textures do not have to be recomputed every frame. In Blender, use nodes to link each material to its bake target image.
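Conceptually, the bake precomputes the shaded color per texel so the runtime shader only samples one texture instead of evaluating lighting. A deliberately simplified sketch of that idea (real bakes account for many more terms):

```python
import numpy as np

def bake_lighting(albedo, irradiance):
    """Precompute per-texel shaded color: surface color times the
    incoming light, clamped to the displayable range."""
    return np.clip(albedo * irradiance, 0.0, 1.0)
```

At runtime the shader reads the baked result directly, which is why baked scenes render so much faster than fully dynamic ones.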

 

LOD (Level of Detail)

LOD rendering reduces the number of triangles rendered for an object as its distance from the camera increases, reducing the load on the hardware and improving rendering performance.
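Unity's LOD Group component handles this selection automatically, but the core idea can be sketched as a threshold lookup (the distances here are illustrative, not the production values):

```python
def select_lod(distance, thresholds=(5.0, 15.0, 40.0)):
    """Return the LOD index (0 = full detail) for a camera distance.

    thresholds: distances at which to step down to the next, simpler
    mesh; beyond the last threshold the object can be culled entirely.
    """
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)  # lowest detail (or culled)
```

Nearby objects get the dense mesh; distant ones get progressively decimated versions the viewer cannot distinguish anyway.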

 
 
 

UX/UI DESIGN

“weARlive” PC App Experience & Interface Design

ABOUT

weARlive is Technodramatists’ unique AR platform that lets users control live face-synced theatrical performances, blending real actors, AR, and real and virtual backgrounds into one combined performance. The system enables actors to control, animate, and impersonate virtual characters; streams an actor’s facial features and emotions onto an avatar; and provides general oversight of theatrical processes across multiple output screens. weARlive consists of two apps: an iOS app that captures the actor’s face, and a PC app that face-syncs avatars to the actors using the data from the iOS app. I executed the UX/UI redesign of the PC app.
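The iOS-to-PC link isn't detailed here, but streaming the face data per frame amounts to serializing the blend-shape coefficients with a timestamp. A hypothetical sketch of such a message format (my own assumption, not weARlive's actual protocol):

```python
import json
import time

def encode_face_frame(coefficients, timestamp=None):
    """Serialize one frame of ARKit-style blend-shape coefficients
    (e.g. {"eyeBlinkLeft": 0.9}) for streaming to the PC app."""
    return json.dumps({
        "t": timestamp if timestamp is not None else time.time(),
        "blendshapes": coefficients,
    })

def decode_face_frame(message):
    """Parse a frame on the PC side: returns (timestamp, coefficients)."""
    data = json.loads(message)
    return data["t"], data["blendshapes"]
```

The timestamp lets the receiving app drop stale frames so the avatar never lags behind the live actor.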

 
 
 

Project Analysis

CHALLENGES

How might we improve the user experience of the original weARlive PC app, enhance its intuitive usability, and provide users with a prompt, seamless workflow in live theatre production?

TARGET USERS & SETTINGS

Directors, associate directors, artistic directors, set designers, lighting designers, stage technicians, and others responsible for the art and technology of a live theatre production.

MAJOR FEATURES

  • Building virtual 3D scenes consisting of selected avatars, props, lighting, and backgrounds.

  • Saving multiple camera properties based on each camera’s position and angle.

  • Switching between saved scenes and cameras in real-time according to the narrative of the show.

  • Controlling each 3D element’s properties: position, rotation, scale, locked state, enabled state, etc.

 

Usability Testing

To define the design challenges, I performed usability testing of the original interface with a stage director, an art director, an associate designer, and a software engineer. The key takeaways are as follows.

[Image: the original weARlive interface]
 

Key Insights

  • MAINTAIN SUFFICIENT WORKING SPACE

Reserve as much working space (the 3D area) as possible for a seamless, uninterrupted experience. Functions can be organized tactically in a compact area. Apply a hide/show toggle to the character list.

  • INTUITIVE AND STREAMLINED FUNCTION AREA

Utilize icons, arrows, toggles, and sliders according to user affordances. Allow users to make prompt decisions at every step without second-guessing; time is critical in live theatre production.

  • CASE STUDIES OF SUCCESSFUL DESIGN SOFTWARE

I conducted case studies of the interfaces of Adobe XD, Sketch, and Blender to research how they make the best use of limited space while offering as many functions as possible.

  • PROVIDE THE PROPER TITLES / TERMS

Testers had difficulty understanding several terms such as “Menu,” “Play Event,” and “Shot.” Replace them with clearer titles, pair them with icons, or relocate them to facilitate understanding.

 
 
 

Aesthetic Strategy

I built an aesthetic strategy based on the key insights from the usability testing and case studies.

 
 
 

Design Outcomes

I designed the UI, visuals, and user flow around 3 key screens: Home, Scene Editor, and Play Mode.

[Images: original vs. redesigned interface]
 
