
Ello! 

Internet Friends,

Nice to meet you

Just call me Jesse. 

I'm a technical User Experience designer based in Chicago, passionate about XR design and other tangible interaction design (IXD).


What if there were a system in place to support hybrid living?

HMW connect hybrid workers with the right resources in their neighborhoods?

When WebXR becomes mainstream, how should designers prepare for this new design space?

Getting more people involved in Translational Research is the key to saving more lives.

In the new normal, HMW bring back daily socialization for remote workers while protecting their boundaries?

In a post-pandemic world, are cloud drinking and in-home dancing possible? MR offers a solution.

What if facial expressions became a new input for games? What life looks like in AR.

Summer Intern at iMotions, implementing the eye-tracking SDK for the Varjo XR-3 and creating tutorials for clients and the sales team.

What's the user experience when a Level 5 AV comes to campus? An end-to-end system for a self-driving shuttle.

Using COVID-19 daily death data, a new monument tells one thing: the lives lost are not numbers, and we should remember them.

AR, Game, Interaction design

Type

Independent project

Team

Time

2021 Spring

An AR face-filter rhythm game that uses the player's facial expressions as controls, built with Spark AR.

Facial expressions filter game

DEVIL

Project Challenge

For designers, the fastest way to explore how face tracking works is to try building a face-filter game: there is no need to write Unity code or figure out where to deploy it. This experimental game was created in TypeScript with the scripting API provided by Facebook's Spark AR. Its purpose is to explore whether human facial expressions can be a meaningful input source. Current uses of facial recognition in games typically map a single feature to a single trigger, such as a blink making the player jump. In this game, to strengthen the connection between the human and their digital presence, and inspired by the "face dance" trend on TikTok, the phone uses facial data to drive a synchronized skull head, letting the player play a rhythm game in a new way.
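The core idea above, turning continuous face-tracking signals into discrete game inputs, can be sketched in plain TypeScript. This is not the actual Spark AR API; the signal names, ranges, and thresholds are all hypothetical placeholders for whatever the face tracker reports.

```typescript
// Sketch: map continuous facial signals (each assumed to be in 0..1)
// to a single discrete expression input, or null if nothing is strong enough.
type Expression = "surprise" | "smile" | "laugh" | "angry";

interface FaceSignals {
  eyebrowRaise: number;  // how far the eyebrows are raised
  smileAmount: number;   // mouth-corner lift
  mouthOpenness: number; // jaw opening
  browFrown: number;     // eyebrow lowering
}

function detectExpression(s: FaceSignals): Expression | null {
  // Each candidate pairs a signal with a hypothetical activation threshold.
  const candidates: [Expression, number, number][] = [
    ["surprise", s.eyebrowRaise, 0.5],
    ["laugh", s.mouthOpenness, 0.6],
    ["smile", s.smileAmount, 0.5],
    ["angry", s.browFrown, 0.5],
  ];
  let best: Expression | null = null;
  let bestScore = 0;
  for (const [name, value, threshold] of candidates) {
    // Keep the strongest signal that crosses its threshold.
    if (value >= threshold && value > bestScore) {
      best = name;
      bestScore = value;
    }
  }
  return best;
}
```

Picking the single strongest signal avoids firing two expressions at once when, say, a laugh also registers as a smile.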

Try the effect on your Instagram

Game Mechanism

As Katherine Isbister argues in her book "How Games Move Us: Emotion by Design", movement design can create emotion and connection. It is crucial to build meaningful interactions around new technology rather than treat it as a mere tool, especially now that people have spent about a year facing their camera screens. I therefore chose the four most common facial expressions (Surprise, Angry, Smile, and Laugh) as input methods for players to express themselves. Leveraging the rhythm-game mechanism, I slightly shifted each expression's meaning to one that stands for the right feeling when we hear the beats. The score shows the number of beats you hit. Each level has a target score; once you reach it, the next level and a new expression are unlocked. A fire effect also shows the player's overall status: the higher your score, the stronger the fire.
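The progression described above, hitting beats to raise the score, unlocking a new level and expression at each target, and scaling the fire with the score, can be sketched as follows. The per-level targets and the fire-strength formula are made-up values, not taken from the actual game.

```typescript
// Sketch of the scoring loop: each level has a target score; reaching it
// unlocks the next level (and with it a new expression).
interface GameState {
  score: number; // number of beats hit so far
  level: number; // 0-based current level
}

const LEVEL_TARGETS = [5, 12, 20, 30]; // hypothetical per-level score goals

function registerHit(state: GameState): GameState {
  const score = state.score + 1;
  let level = state.level;
  // Unlock the next level once the current target is reached.
  if (level < LEVEL_TARGETS.length && score >= LEVEL_TARGETS[level]) {
    level += 1;
  }
  return { score, level };
}

// Fire strength grows with the score, capped at 1.0 for the visual effect.
function fireStrength(score: number, maxScore: number): number {
  return Math.min(1, score / maxScore);
}
```

Keeping the state immutable (returning a new object per hit) matches the reactive, signal-driven style that filter runtimes tend to encourage.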

SURPRISE--Intoxicated

Raise your eyebrows to match the incoming beats, showing you are intoxicated.

SMILE--Comfortable

Smile to match the incoming beats, showing you feel comfortable.

LAUGH--Happy

Open your mouth to match the incoming beats, showing you are laughing out loud and feeling happy.

ANGRY--Enjoyable

Frown to match the incoming beats, showing you are enjoying yourself.
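The "match the incoming beats" judgment above boils down to a timing-window check: a beat counts as hit when the matching expression is detected close enough to the beat's target time. A minimal sketch, where the window width is an assumption rather than the game's real tuning:

```typescript
// Sketch of rhythm-game hit judgment: the detected expression must match
// the beat's requested expression within a fixed timing window.
type Expression = "surprise" | "smile" | "laugh" | "angry";

interface Beat {
  time: number;           // target time in milliseconds
  expression: Expression; // which expression this beat asks for
}

const HIT_WINDOW_MS = 150; // hypothetical tolerance on either side of the beat

function isHit(beat: Beat, detected: Expression | null, now: number): boolean {
  return (
    detected === beat.expression &&
    Math.abs(now - beat.time) <= HIT_WINDOW_MS
  );
}
```

A real build would likely widen or narrow the window per difficulty level, but the match-plus-tolerance structure stays the same.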

Game Design 

Used Blender's editing tools to make shape keys for each part of the skull so they could be imported into Spark AR.

Created a color palette for the beat tracks, inspired by ring designs.

Used shader nodes to create a sci-fi material for the skull, then rebuilt it with Spark AR's own shader patches.