AR, Game, Interaction design
Type: Independent project
Team:
Time: 2021 Spring
An AR face filter rhythm game that uses the player's facial expressions as controls, built with Spark AR.
Facial expression filter game
DEVIL
Project Challenge
For designers, the fastest way to explore how face tracking works is to try building face filter games: there is no need to write low-level code in Unity or figure out where to deploy it. This experimental game was created in TypeScript with the scripting API provided by Facebook's Spark AR. Its purpose is to explore whether human facial expressions can become a meaningful input source. Current uses of facial recognition in games tend to map a single feature to a single trigger, such as blinking to make the character jump. In this game, inspired by the "face dance" trend on TikTok and aiming to strengthen the connection between human and digital presence, the phone uses the player's facial data to generate a synchronized skull head, so a rhythm game can be played in a new way.
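A minimal sketch of what the script side looks like, using the standard Spark AR FaceTracking, Scene, and Diagnostics modules; the 'skull' object name is just a stand-in for the skull model in this project's scene, not the exact setup that shipped.

```typescript
// Sketch: drive a scene object from face-tracking data in a Spark AR script.
// FaceTracking, Scene, and Diagnostics are standard Spark AR modules;
// the 'skull' object name is an example for this project's scene.
const FaceTracking = require('FaceTracking');
const Scene = require('Scene');
const Diagnostics = require('Diagnostics');

(async function () {
  // First tracked face in the camera feed.
  const face = FaceTracking.face(0);

  // Hypothetical scene object representing the skull head.
  const skull = await Scene.root.findFirst('skull');

  // Bind the skull's rotation to the face so it stays synchronized
  // with the player's head movement.
  skull.transform.rotationX = face.cameraTransform.rotationX;
  skull.transform.rotationY = face.cameraTransform.rotationY;
  skull.transform.rotationZ = face.cameraTransform.rotationZ;

  // Watch a continuous signal for debugging, e.g. how wide the mouth is open.
  Diagnostics.watch('mouth openness', face.mouth.openness);
})();
```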
Try the effect on your Instagram
Game Mechanics
As Katherine Isbister notes in her book "How Games Move Us: Emotion by Design", movement design can create emotion and connection. It is crucial to build meaningful interactions around new technology rather than treat it as just another tool, especially after people have spent about a year facing their camera screens. Therefore, I chose the four most common facial expressions, Surprise, Angry, Smile, and Laugh, as input methods for players to express themselves. Thanks to the rhythm-game framing, I slightly shift the meaning of each expression to a new one that stands for the feeling we have when we hear the beat. The score counts the number of beats you hit. Each level has a target score; once you reach it, the next level and a new expression are unlocked. The fire shows the player's overall status: the higher your score, the stronger the fire. The mapping from expressions to inputs is sketched in code after the list below.
SURPRISE--Intoxicated
Raise your eyebrows to match the incoming beats, showing you are intoxicated by the music.
SMILE--Comfortable
Smile to match the incoming beats, showing you feel comfortable.
LAUGH--Happy
Open your mouth to match the incoming beats, showing you are laughing out loud and feeling happy.
ANGRY--Enjoyable
Frown to match the incoming beats, showing you are enjoying it.
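In script terms, the mapping looks roughly like the sketch below, assuming the standard Spark AR FaceGestures module. The beat map, the 600 ms interval, the level target, and the 'fireStrength' patch name are simplified placeholders rather than the values used in the shipped filter.

```typescript
// Sketch of the expression-to-input mapping and scoring loop.
const FaceTracking = require('FaceTracking');
const FaceGestures = require('FaceGestures');
const Patches = require('Patches');
const Time = require('Time');
const Diagnostics = require('Diagnostics');

const face = FaceTracking.face(0);

// The four expressions used as inputs, each paired with its in-game meaning.
const gestures: { [name: string]: any } = {
  surprise: FaceGestures.hasEyebrowsRaised(face), // "intoxicated"
  smile: FaceGestures.isSmiling(face),            // "comfortable"
  laugh: FaceGestures.hasMouthOpen(face),         // "happy"
  angry: FaceGestures.hasEyebrowsFrowned(face),   // "enjoying it"
};

let score = 0;
const levelTarget = 10; // example threshold; each level defines its own

// Hypothetical beat map: which expression is expected on each beat.
const beatMap = ['smile', 'smile', 'surprise', 'laugh', 'angry'];
let beatIndex = 0;

Time.setInterval(() => {
  const expected = beatMap[beatIndex % beatMap.length];
  beatIndex++;

  // A hit = the expected expression is active when the beat arrives.
  if (gestures[expected].pinLastValue()) {
    score++;
    // Drive the fire effect from the score via a patch bridge (name is illustrative).
    Patches.inputs.setScalar('fireStrength', score / levelTarget);
    Diagnostics.log('Hit! score = ' + score);
  }

  if (score >= levelTarget) {
    Diagnostics.log('Level complete: next expression unlocked');
  }
}, 600);
```

Checking pinLastValue() only at the moment a beat fires keeps the gesture signals from being bound to extra variables, which connects to the note about updating signals further down.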
Game Design
Used Blender's shape key editor to create shape keys for each part of the skull so that they could be imported into Spark AR as blendshapes.
Created the color palette; the beat tracks were inspired by ring designs.
Used shader nodes to build a sci-fi material for the skull in Blender, then rebuilt it with Spark AR's own shader patches.
Different users generate different weights for each part of the skull based on their own facial data, so everyone gets a comparable gaming experience regardless of face shape.
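As a sketch of the idea: sample this user's own range during a short calibration window, then rescale the live signal before it drives the skull. The 'jawWeight' patch name and the two-second window below are assumptions made for illustration.

```typescript
// Illustrative per-user calibration: different faces produce different raw ranges,
// so each raw signal is rescaled to a 0..1 blendshape weight using values sampled
// from that user.
const FaceTracking = require('FaceTracking');
const Patches = require('Patches');
const Reactive = require('Reactive');
const Time = require('Time');

const face = FaceTracking.face(0);
const rawOpenness = face.mouth.openness; // raw value differs between users

let observedMax = 0.01; // avoid division by zero before calibration

// Sample the user's own maximum during a short calibration window.
const sampler = Time.setInterval(() => {
  const current = rawOpenness.pinLastValue();
  if (current > observedMax) {
    observedMax = current;
  }
}, 100);

Time.setTimeout(() => {
  Time.clearInterval(sampler);
  // Normalize the live signal by this user's own range and clamp to 0..1,
  // then send it to the patch editor to drive the skull's jaw blendshape.
  const weight = Reactive.clamp(rawOpenness.div(observedMax), 0, 1);
  Patches.inputs.setScalar('jawWeight', weight);
}, 2000);
```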
Updating signals is tricky the first time you use the Spark AR scripting API; what worked best for me was combining them with persisted data and not binding multiple variables to the same signal.
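The pattern that worked for me looks roughly like this: keep a single subscription per signal and mirror the important values into Persistence. The 'gameData' storage key and the 'score' patch output name below are illustrative, and the key would need to be whitelisted in the project's Persistence capability.

```typescript
// Sketch: one subscription per signal, paired with persisted data.
const Persistence = require('Persistence');
const Patches = require('Patches');
const Diagnostics = require('Diagnostics');

(async function () {
  const storage = Persistence.userScope;

  // Load the previously saved best score (or start from zero).
  let best = 0;
  try {
    const saved = await storage.get('gameData');
    if (saved && typeof saved.best === 'number') {
      best = saved.best;
    }
  } catch (e) {
    Diagnostics.log('No saved data yet');
  }

  // One signal, one subscription: the live score arrives from the patch editor
  // through a single ScalarSignal ('score' is an assumed patch output name).
  const scoreSignal = await Patches.outputs.getScalar('score');

  scoreSignal.monitor().subscribe((event: { newValue: number }) => {
    if (event.newValue > best) {
      best = event.newValue;
      // Persist the new best; Persistence stores plain serializable objects.
      storage.set('gameData', { best: best });
    }
  });
})();
```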
Next Steps & Reflection
- Make the indicators more visible and give them functions, such as matching long beats.
- Improve the setup, especially step 1 and step 2, which sometimes happen at the same time.
- Add an end-of-level effect to show that a level is completed; currently only the fire shows the player's status.
- Make the game less symmetrical.
- Generate beat maps automatically.