2015 - 2019
Automated Colon Segmentation
This is Helmsley-funded research at the Qualcomm Institute at UCSD. Using U-Net and volume-rendering algorithms, we developed a system that accurately segments the colon from MRI scans and generates a manipulable 3D volume in virtual reality and the CAVE to help study Crohn's disease. The last milestone of this project is documented in this Research Paper. Also check out our Demo and our GitHub repository.
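Segmentation quality in pipelines like this is commonly scored with the Dice coefficient, the overlap between the predicted mask and the ground-truth mask. A minimal NumPy sketch (illustrative only; the toy volumes and function names here are assumptions, not the project's actual evaluation code):

```python
import numpy as np

def dice_coefficient(pred, truth, eps=1e-7):
    """Dice overlap between two binary masks (1.0 = perfect match)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps)

# Tiny 3D volumes standing in for a predicted and a ground-truth colon mask
pred = np.zeros((4, 4, 4), dtype=np.uint8)
pred[1:3, 1:3, 1:3] = 1                     # 8 voxels predicted
truth = np.zeros((4, 4, 4), dtype=np.uint8)
truth[1:3, 1:3, :] = 1                      # 16 voxels in the ground truth

print(round(dice_coefficient(pred, truth), 3))  # 0.667
```

The same score works slice-by-slice on 2D masks or on the full 3D volume, which is why it is a standard metric for U-Net-style segmentation.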
Bodylogical AR is a 3D visual analytics tool on HoloLens that I developed at the Qualcomm Institute for the Bodylogical team from PricewaterhouseCoopers (PwC). It supports rendering, manipulating, and analyzing 10,000+ data points with high-dimensional features, along with a multi-user shared experience and dynamic file loading/exporting. Check out this quick overview, or read more in my article on Medium. Alternatively, you can watch this YouTube demo on my channel.
Calcflow AR is an augmented reality application for the Magic Leap One that I developed at Nanome. Calcflow was originally a virtual reality application in which users can study and visualize vector calculus in an interactive environment. During my internship at Nanome, I was the principal developer responsible for porting the application's core features to the Magic Leap One AR platform using Unity. In the app, users can generate and manipulate 3D graphs based on input parametric functions.
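Under the hood, graphing a parametric function comes down to evaluating f(u, v) over a grid and building a mesh from the resulting points. A minimal NumPy sketch of that sampling step (the torus here is just an example input, not taken from Calcflow):

```python
import numpy as np

def sample_parametric(f, u_range, v_range, n=32):
    """Evaluate a parametric surface f(u, v) -> (x, y, z) on an n x n grid."""
    u = np.linspace(u_range[0], u_range[1], n)
    v = np.linspace(v_range[0], v_range[1], n)
    uu, vv = np.meshgrid(u, v)
    x, y, z = f(uu, vv)
    return np.stack([x, y, z], axis=-1)  # (n, n, 3): one vertex per grid cell

# Example input: a torus with major radius 2 and minor radius 0.5
torus = lambda u, v: ((2 + 0.5 * np.cos(v)) * np.cos(u),
                      (2 + 0.5 * np.cos(v)) * np.sin(u),
                      0.5 * np.sin(v))

verts = sample_parametric(torus, (0, 2 * np.pi), (0, 2 * np.pi), n=32)
print(verts.shape)  # (32, 32, 3)
```

In an engine like Unity, adjacent grid cells would then be joined into triangles to form the renderable mesh.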
ARTEMIS - Remote Surgery
ARTEMIS stands for Augmented Reality Technology Enabled reMote Integrated Surgery, a research project to help the Naval Medical Center develop an augmented reality telementorship platform. I work on 3D localization and tracking of users' hands, heads, and eyes. Check out our Research Symposium for more information. I also work on the user-interaction design of the project. View our Design Site to learn more about our current progress.
DeepMotion is a novel solution to the handwriting-to-text task: the system recognizes handwritten letters based on pen motion. We built it with an Arduino Uno R3, MPU9250 motion sensors, and our own 3D-printed molds. With 5,000+ self-collected temporal motion samples, we explored different neural network models using TensorFlow, including a CNN and an RNN with LSTM. The details of this project are documented in this Research Paper, or you can check it out on our GitHub repository.
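Before any of those models can run, the raw MPU9250 stream (3-axis accelerometer, gyroscope, and magnetometer, so 9 channels) has to be cut into fixed-length windows. A hedged NumPy sketch of that preprocessing step (the window size, step, and channel layout are assumptions for illustration, not DeepMotion's exact values):

```python
import numpy as np

def window_stream(stream, window, step):
    """Slice a (time, channels) sensor stream into overlapping windows."""
    starts = range(0, len(stream) - window + 1, step)
    return np.stack([stream[s:s + window] for s in starts])

# Fake 9-channel IMU stream: 100 timesteps of accel + gyro + mag readings
rng = np.random.default_rng(0)
stream = rng.normal(size=(100, 9))

batch = window_stream(stream, window=50, step=25)
print(batch.shape)  # (3, 50, 9): three windows, each ready for a CNN or LSTM
```

A CNN treats each window as a 50x9 "image" of the stroke, while an LSTM consumes the 50 timesteps sequentially; both fit this (batch, time, channels) layout.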
Holo Bear Tour is an augmented reality experience that guides you through UCSD's Stuart Collection statue, Hawkinson's Bear. Its purpose is to explore embedded visualization in mixed-reality environments, research I conducted with Dr. Ali Sarvghad at the Ubiquitous Computing Lab. I have published a more detailed introduction to this project on Medium. You may also check out this Demo Video to get a taste of how it actually works. Find this project on GitHub.
Re: Art is an award-winning web app that uses a convolutional neural network to transform a user-uploaded video into an animated artwork, as if it were painted by a great artist like Van Gogh. We built this app at SD Hacks 2016, making it the first video style-transfer web app on the internet, and won the Top Prize at SD Hacks. Check out our project on Devpost or my GitHub to see how it works! We also had a press mention on CSE News.
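The style half of neural style transfer boils down to matching Gram matrices, the channel correlations of convolutional feature maps, between the style image and each generated frame. A minimal NumPy sketch of that core computation (the feature-map shapes are illustrative, not our actual network's):

```python
import numpy as np

def gram_matrix(features):
    """Channel-correlation (Gram) matrix of an (H, W, C) feature map."""
    h, w, c = features.shape
    flat = features.reshape(h * w, c)   # one row per spatial position
    return flat.T @ flat / (h * w)      # (C, C), averaged over positions

def style_loss(gen_feats, style_feats):
    """Mean squared difference between the two Gram matrices."""
    return np.mean((gram_matrix(gen_feats) - gram_matrix(style_feats)) ** 2)

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 8, 16))
print(gram_matrix(feats).shape)     # (16, 16)
print(style_loss(feats, feats))     # 0.0 — identical features, zero style loss
```

Because the Gram matrix discards spatial layout and keeps only which channels co-activate, minimizing this loss transfers texture and color without copying the style image's composition.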
LowPoly ZenGarden is an interactive game demo with beautiful graphics and meditative music to help players relax and find peace of mind. We developed this game using C++ and OpenGL with custom shaders. Riding a lonely boat on limpid low-poly water, players can explore the procedurally generated terrain, appreciate the snow particle effects, and capture great moments with the depth-of-field camera. Feel free to check out our Demo and enjoy a peaceful break. See the GitHub page for details.
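One common way to procedurally generate terrain like this is a fractal heightmap: sum smoothed random grids at several octaves, halving the amplitude each time. A small NumPy sketch of the idea (not our actual C++ generator; the algorithm choice and parameters here are assumptions):

```python
import numpy as np

def fractal_heightmap(size, octaves=4, seed=0):
    """Fractal value noise: sum bilinearly upsampled random grids."""
    rng = np.random.default_rng(seed)
    height = np.zeros((size, size))
    for o in range(octaves):
        cells = 2 ** (o + 1)                        # coarse grid resolution
        coarse = rng.random((cells + 1, cells + 1))
        # Bilinearly upsample the coarse grid to the full heightmap size
        idx = np.linspace(0, cells, size)
        i0 = np.floor(idx).astype(int).clip(0, cells - 1)
        t = idx - i0
        row = coarse[i0] * (1 - t)[:, None] + coarse[i0 + 1] * t[:, None]
        layer = row[:, i0] * (1 - t) + row[:, i0 + 1] * t
        height += layer / 2 ** o                    # halve amplitude per octave
    return height

h = fractal_heightmap(64)
print(h.shape)  # (64, 64)
```

Low octaves shape the broad hills and valleys, while higher octaves add fine detail; in a low-poly style, the heightmap is then sampled onto a coarse triangle mesh with flat shading.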
PixelBase is a blockchain-based web app that provides the public with a verifiable record of image copyright. We designed the app to make full use of blockchain's decentralization so that copyright records cannot be modified. I also extended the verification process by implementing similar-image recognition using OpenCV. We developed this app at Cal Hacks 2.0 at UC Berkeley. You can find more about PixelBase in our submission on Devpost.
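The idea behind a similar-image check can be illustrated with a perceptual "difference hash": downscale the image, compare neighboring pixels to get a bit string, and compare bit strings by Hamming distance. This sketch uses a pure-NumPy dHash for illustration rather than the project's actual OpenCV pipeline:

```python
import numpy as np

def dhash(gray, hash_size=8):
    """Difference hash: compare adjacent pixels of a downscaled grayscale image."""
    h, w = gray.shape
    # Crude nearest-neighbor downscale to (hash_size, hash_size + 1)
    rows = np.arange(hash_size) * h // hash_size
    cols = np.arange(hash_size + 1) * w // (hash_size + 1)
    small = gray[np.ix_(rows, cols)]
    return (small[:, 1:] > small[:, :-1]).flatten()   # 64-bit fingerprint

def hamming(a, b):
    """Number of differing bits; a small distance means visually similar."""
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(0)
img = rng.random((64, 64))
print(hamming(dhash(img), dhash(img)))        # 0 — identical images
print(hamming(dhash(img), dhash(img * 0.5)))  # 0 — robust to brightness scaling
```

Because the hash encodes only relative brightness between neighbors, re-encoded or slightly edited copies of a registered image land within a small Hamming distance of the original record.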
GroundCrew VR is a virtual reality game for the HTC Vive that we designed for the San Diego Air & Space Museum, a project I collaborated on with the VR Club at UCSD. Although it is still in development, players can either explore freely in the Gallery scene to interact with real aircraft, or jump into the Ground Crew scene to challenge the levels shown in our Game Play Demo. We plan to deploy the application at the museum once it is complete. Check out the GitHub page for more.
DreamJournal is a web app that creates multimedia AI-generated journals using Microsoft Cognitive Services, the Facebook API, and Amper Music. It imports users' photos from their Facebook timelines and generates a conversational interaction to understand the story behind the pictures. It then uses a pre-trained deep learning model to generate compelling media that preserve the user's memories. Feel free to view our submission on Devpost for LA Hacks 2017.
SaitoGemu VR is an action-packed virtual reality horror game inspired by the "Playtest" episode of Black Mirror. You are trapped in a graveyard full of different kinds of skeleton zombies with nothing in your hands, and you must do your best to find all the components needed to solve the puzzle. There are guns left behind by a mysterious person that you may find, but your fists can be weapons as well. Can you escape in the end? See our Game Play Demo on YouTube or check out the GitHub page.