Secret Passageways with Avatar Faces
Term: Spring 2018, Rochester Institute of Technology
Inspired by these secret corridors, this exhibit presents virtual passageways connecting various locations on campus and allows visitors to talk with one another through specially created avatar faces. Using facial motion capture technology, a user's facial expressions are captured and used to drive these avatar faces, which are in turn viewed by others in the Golisano College of Computing and Information Sciences (GCCIS), the College of Imaging Arts and Sciences (CIAS), and the MAGIC Center.
My Role: Technical Artist, Programmer and 3D Modeling Artist
My primary work involved reworking existing facial models, originally developed by the CIAS department, to meet the requirements of Faceware LIVE. This included creating the required blend shapes in Maya and interfacing the 3D model with Faceware LIVE in Unity. All of my work is detailed extensively in the report above, available for download. I was also responsible for setting up the final project and exhibiting it at ImagineRIT 2018.
Technical: Unity3D, Autodesk Maya, Faceware LIVE, Photon Voice Unity Asset.
Exhibited at ImagineRIT 2018 and at several RIT Open Houses for incoming students in 2018.
Term: Summer 2018, Rochester Institute of Technology
Karaoke with a twist! You sing in the real world while a virtual puppet – which you control using facial motion capture – performs the song on a virtual stage. Fun for all budding pop stars not yet ready to show their real faces to the world.
My Role: Technical Artist and Programmer
This project extended Secret Passageways; my primary work was to develop the user-facing menu and integrate it with Faceware LIVE. I developed the user interface in C#, allowing the user to select a character and a scene. Based on these choices, the desired environment is presented to the user by dynamically loading the corresponding prefabs and panel image. The user's facial expressions are captured and mapped onto the character using Faceware LIVE, providing real-time animation for the live performance. The main learning from this project was C# scripting in Unity3D.
Technical: Unity3D, Faceware LIVE, Photon Voice Asset in Unity3D
Exhibited at Rochester Fringe Festival 2018.
Term: Fall 2018, Rochester Institute of Technology
Unabridged Emotions is a mixed reality dance performance that explores the expression of basic emotions through poetry, dance, and facial expression. During the piece, facial avatars, controlled by users off-stage, are projected on stage while each dancer portrays a particular basic emotion. The users controlling the faces experience the performance from the stage's point of view via networked cameras connected to Raspberry Pis. A 10-minute preview of the piece was performed.
My Role: Technical Artist and 3D Modeling Artist
My primary work, as a Research Assistant experienced with 3D face models, was to ensure that the dancers' 3D face models, developed by the CIAS department, met the requirements of the Faceware LIVE client in Unity and functioned correctly when interfaced in Unity.
Technical: Autodesk Maya, Unity3D and Faceware LIVE.
Exhibited at 3rd Annual Frameless Symposium at RIT.
Still Life with GL
Term: Spring 2017, Rochester Institute of Technology
Imitated a still life scene using OpenGL in C++. This required implementing graphics pipeline stages including shading, lighting, and texture mapping, as well as a loader for the 3D models' object file formats. The objects in the scene were modeled in Blender.
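The shading and lighting stage mentioned above can be illustrated by the Lambertian diffuse term. This is a generic sketch, written here in Java (the project itself used C++ and OpenGL), and the names are mine rather than taken from the project's code:

```java
// Illustrative sketch of a per-fragment Lambertian diffuse computation:
// intensity = lightColor * surfaceColor * max(0, N.L).
// Not the project's actual shader; names and structure are hypothetical.
public class Lambert {

    // Returns the RGB diffuse contribution for one light.
    static double[] diffuse(double[] normal, double[] toLight,
                            double[] lightColor, double[] surfaceColor) {
        double[] n = normalize(normal), l = normalize(toLight);
        double nDotL = Math.max(0, n[0]*l[0] + n[1]*l[1] + n[2]*l[2]);
        return new double[]{
                lightColor[0] * surfaceColor[0] * nDotL,
                lightColor[1] * surfaceColor[1] * nDotL,
                lightColor[2] * surfaceColor[2] * nDotL};
    }

    static double[] normalize(double[] v) {
        double len = Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        return new double[]{v[0]/len, v[1]/len, v[2]/len};
    }

    public static void main(String[] args) {
        // Light direction at 45 degrees to the surface normal: N.L = sqrt(2)/2.
        double[] c = diffuse(new double[]{0, 0, 1}, new double[]{0, 1, 1},
                             new double[]{1, 1, 1}, new double[]{1, 0.5, 0.25});
        System.out.printf("%.4f %.4f %.4f%n", c[0], c[1], c[2]);
    }
}
```

Clamping the dot product at zero keeps surfaces facing away from the light unlit rather than negatively lit.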
Language: C++ and OpenGL
Euclidean Shortest Path Problem
Term: Fall 2017, Rochester Institute of Technology
Implemented a solution in Java to the Euclidean Shortest Path Problem. The problem space contained a set of polygonal obstacles in Euclidean space, a starting point, and a destination point. Java Swing was used to develop the user interface, which lets the user draw obstacles, plot the start and destination points, and view a visualization of the traced shortest path.
I was responsible for creating the user interface in Java Swing, which takes in the obstacles and the start and destination points and passes the collected points to the subsequent steps. Once the shortest-path points were computed, I was responsible for retrieving them and visualizing the shortest path as the end result.
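The standard approach to this problem is a visibility graph searched with Dijkstra's algorithm: the nodes are the start, the goal, and every obstacle vertex, and two nodes are connected when the segment between them neither crosses an obstacle edge nor cuts through an obstacle's interior. The sketch below, in Java, illustrates that idea under my own assumptions; it is not the project's code, and all names are hypothetical:

```java
import java.awt.geom.Line2D;
import java.awt.geom.Point2D;
import java.util.*;

// Visibility-graph sketch for the Euclidean Shortest Path Problem.
public class VisibilityPath {

    // Strict point-in-polygon test by ray casting.
    static boolean inside(Point2D p, List<Point2D> poly) {
        boolean in = false;
        for (int i = 0, j = poly.size() - 1; i < poly.size(); j = i++) {
            Point2D a = poly.get(i), b = poly.get(j);
            if ((a.getY() > p.getY()) != (b.getY() > p.getY())
                    && p.getX() < (b.getX() - a.getX()) * (p.getY() - a.getY())
                                  / (b.getY() - a.getY()) + a.getX())
                in = !in;
        }
        return in;
    }

    // A segment is blocked unless it is itself an obstacle edge, crosses no
    // edge it does not share a vertex with, and its midpoint lies outside
    // every obstacle (the midpoint test rejects chords through the interior).
    static boolean blocked(Point2D a, Point2D b,
                           List<Point2D[]> edges, List<List<Point2D>> obstacles) {
        for (Point2D[] e : edges)
            if ((a.equals(e[0]) && b.equals(e[1])) || (a.equals(e[1]) && b.equals(e[0])))
                return false;                    // the segment IS an obstacle edge
        for (Point2D[] e : edges) {
            if (a.equals(e[0]) || a.equals(e[1]) || b.equals(e[0]) || b.equals(e[1]))
                continue;                        // touching at a shared vertex is fine
            if (Line2D.linesIntersect(a.getX(), a.getY(), b.getX(), b.getY(),
                    e[0].getX(), e[0].getY(), e[1].getX(), e[1].getY()))
                return true;
        }
        Point2D mid = new Point2D.Double((a.getX() + b.getX()) / 2,
                                         (a.getY() + b.getY()) / 2);
        for (List<Point2D> poly : obstacles)
            if (inside(mid, poly)) return true;
        return false;
    }

    // Dijkstra over the visibility graph; returns the path start..goal.
    static List<Point2D> shortestPath(Point2D start, Point2D goal,
                                      List<List<Point2D>> obstacles) {
        List<Point2D> nodes = new ArrayList<>(Arrays.asList(start, goal));
        List<Point2D[]> edges = new ArrayList<>();
        for (List<Point2D> poly : obstacles) {
            nodes.addAll(poly);
            for (int i = 0; i < poly.size(); i++)
                edges.add(new Point2D[]{poly.get(i), poly.get((i + 1) % poly.size())});
        }
        Map<Point2D, Double> dist = new HashMap<>();
        Map<Point2D, Point2D> prev = new HashMap<>();
        dist.put(start, 0.0);
        PriorityQueue<Point2D> pq =
                new PriorityQueue<>(Comparator.comparingDouble(dist::get));
        pq.add(start);
        while (!pq.isEmpty()) {
            Point2D u = pq.poll();
            for (Point2D v : nodes) {
                if (v.equals(u) || blocked(u, v, edges, obstacles)) continue;
                double d = dist.get(u) + u.distance(v);
                if (d < dist.getOrDefault(v, Double.POSITIVE_INFINITY)) {
                    dist.put(v, d);
                    prev.put(v, u);
                    pq.add(v);   // lazy re-insertion keeps relaxation correct
                }
            }
        }
        LinkedList<Point2D> path = new LinkedList<>();
        for (Point2D p = goal; p != null; p = prev.get(p)) path.addFirst(p);
        return path;
    }
}
```

For example, with a unit-height square obstacle straddling the straight line from (0, 0) to (10, 0), the returned path detours around two adjacent corners of the square, giving four path points in total.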
Please see the detailed report at
For code samples please see
1. PointO.java
Rendering Lights and Shadows in Augmented Reality