about me
I design and develop immersive virtual experiences, and I love what I do.
Hi there! My name is Jack Yang. I'm a mission-driven Virtual Reality developer specializing in XR interactions and virtual experience simulation.
I have been in the XR industry since 2015, and I am passionate about changing how people interact with both the physical and virtual worlds through this fantastic technology.
Moreover, thanks to my background in communication arts, I am very interested in Social Computing and how XR could impact interpersonal communication.
• Spearhead the development of an interactive, networked AR/VR content management and training simulation system featuring hand tracking with integrated gesture recognition and simulated physics.
• Create prototypes and deploy key features, including multiplayer networking, 3D model processing, and a virtual object manipulation interface; this work resulted in a $750,000 research contract with the U.S. Air Force and acceptance into the TechStars program.
• Formulate and implement new design decisions and product directions based on user testing observations.
• Add and maintain Ultraleap (Leap Motion) hand tracking support and demonstration projects for Microsoft’s Unity Mixed Reality Toolkit (MRTK) project by contributing 4,000+ lines of code.
• Collaborate with other developers on bug fixes and repository documentation.
• Devised chromatic adjustment algorithms using computer vision techniques on hyperspectral images to simulate color blindness.
• Developed an artificial intelligence algorithm to replicate human behavior during color vision deficiency tests such as the Farnsworth-Munsell 100 Hue Test and the D-15 Test; subsequently examined the accuracy of the chromatic adjustment at 90% confidence.
• Programmed virtual reality simulations to visualize research findings through a color-calibrated Oculus head-mounted display (HMD); this resulted in practical design implications for potential human vision enhancement optical lenses.
• Constructed a virtual reality teleoperation system where users could remotely control robots via hand and arm gestures, passing ROS data between Unity and the robot through a network socket with minimal latency.
• Crafted a motion playback system with an intuitive user interface to dynamically replicate virtual robot arm movement by interpolating robotic data in Unity; the system was used to analyze more than 15 lab experiments.
• Studied game design and development at CMU Entertainment Technology Center
• Developed a VR game with the Oculus Rift DK2 called Kitchen Kraving (published on the Oculus Store) under the instruction of Professor Chris Klug
Interaction Design Specialization UC San Diego
• In Progress
Human-Computer Interaction Professional Certificate Georgia Tech
• Completed a 4-course program covering HCI topics ranging from design principles and feedback cycles to agile development
VR Development Professional Certificate UC San Diego
• Completed a 3-course series covering topics from computer graphics to virtual reality development, with a Unity VR application as the final project
Virtual Reality Specialisation University of London
• Completed a 5-course specialization covering VR modeling, VR interaction design, social VR, and 3D modeling, along with an interactive VR capstone project
Computer Vision Nanodegree Udacity
• Learned computer vision and deep learning techniques from image processing to building CNNs and developed automatic image captioning, object tracking, and SLAM projects
Z. Yang*, B. Rubio-Perez*, J. Salman, M. Frising, M. A. Kats, “Monte Carlo Simulations of the Farnsworth-Munsell 100 Hue Color Vision Test for Anomalous Trichromatic and Dichromatic Observers”, (In Progress, Spring 2022)
Z. Yang, B. Rubio-Perez, M. A. Kats, “Breaking Binocular Redundancy Through Virtual Reality”, (In Progress, Winter 2021)
J. Salman, M. Gangishetty, B. Rubio-Perez, D. Feng, Z. Yu, Z. Yang, C. Wang, A. Shahsafi, D. Congreve, M. A. Kats, “Passive frequency conversion of ultraviolet images into the visible using perovskite nanocrystals” , Journal of Optics (2021)
Featured in: Cameron, Mike, “Effective Leaders: Four Attributes That Underpin The Core Characteristics of Effective Leadership”, SpiritCast Network (2021)
my portfolio
A brief collection of my work. Contact me if you would like to see more.
Color Vision Deficiency Simulation
Vision, Optics, VR
Binocular Rivalry Through Virtual Reality
Robotic Mimicry Control in Virtual Reality
MRTK for Unity
MRTK, Ultraleap, Unity
VR, Unity, Video Production
Poly Space VR
Social VR, Unity
Entrepreneurship & More
Color Vision Deficiency Simulation
- Organization : Kats Laboratory of Applied Physics
- Professor : Dr. Mikhail Kats
- Duration : Ongoing (since December 2017)
- Technologies : Matlab, Hyperspectral Imaging, Unity, Oculus
- Overview : This project explores color vision deficiency (CVD) and the potential to correct it through optical lenses.
- Description : In the first phase, I devised chromatic adjustment algorithms using computer vision to simulate the condition. I then confirmed my hypothesis by combining optics with AI to develop a system that mimics human interactions during a colorblindness test with 90% accuracy. In the second phase, I programmed simulation models into VR environments to visualize my research findings.
- Application : This technique enables mathematical optical simulation of color vision deficiency and provides a realistic simulation through VR. The process can potentially be reverse-engineered into a vision enhancement algorithm that allows humans to see more colors.
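As a toy illustration of the cap-arrangement idea behind tests like the Farnsworth-Munsell 100 Hue Test, the sketch below models an observer who greedily sorts caps by perceived hue. The "folded hue" observer and all values here are hypothetical, chosen only to show how impaired discrimination scrambles the arrangement; this is not the algorithm used in the research.

```python
# Toy sketch of a Farnsworth-Munsell-style cap arrangement test: an
# observer greedily sorts color caps by perceived hue. The "folded hue"
# observer below (hues collapsed around a confusion point) is a made-up
# model for illustration only.

def arrange_caps(perceived_hues):
    """Start from the pilot cap (index 0) and repeatedly pick the
    remaining cap whose perceived hue is closest to the last one."""
    remaining = set(range(1, len(perceived_hues)))
    order = [0]
    while remaining:
        last = perceived_hues[order[-1]]
        nxt = min(sorted(remaining), key=lambda i: abs(perceived_hues[i] - last))
        order.append(nxt)
        remaining.remove(nxt)
    return order

def error_score(order):
    """FM-100-style error: sum of index jumps between adjacent caps
    (a perfect arrangement of 8 caps scores 7)."""
    return sum(abs(order[i + 1] - order[i]) for i in range(len(order) - 1))

true_hues = list(range(0, 80, 10))           # 8 evenly spaced caps
confused = [abs(h - 40) for h in true_hues]  # hues folded around 40

print(arrange_caps(true_hues), error_score(arrange_caps(true_hues)))
# [0, 1, 2, 3, 4, 5, 6, 7] 7
print(arrange_caps(confused), error_score(arrange_caps(confused)))
# [0, 1, 7, 2, 6, 3, 5, 4] 22
```

A real observer model would work in a perceptual color space rather than on scalar hues, but the greedy arrangement and error scoring follow the same pattern.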
Passive Frequency Conversion Paper
Vision Deficiency Simulation Paper (In Progress)
Robotic Mimicry Control in VR
- Organization : UW Graphics Group
- Professor : Dr. Michael Gleicher
- Duration : 1 year and 3 months
- Technologies : Unity, ROS, Linux, SteamVR, IK Solver
- Overview : This project explores remote manipulation of robots through virtual reality and inverse kinematics. There are two parts: mimicry control of robot arms, and re-creation of robot motion from data.
- Mimicry Control : In the first half of the research, I implemented a virtual reality system that allows users to remotely control robots with their hand and arm gestures by passing ROS (Robot Operating System) data between Unity and the robot through a network socket with minimal latency. The video below demonstrates this capability in action.
- Recreate Robot Movement : During the second half of the research, COVID-19 restrictions kept me out of the lab and away from the robot, so I switched gears to replicating movement by interpolating data from actual robotic experiments in Unity. Given a spreadsheet of joint angles with timestamps, the interpreter I designed creates animation clips that replicate the exact robot arm movement. Here is a demo of what it looks like.
- Application : This system allows us to perform robotic manipulation remotely, which can benefit many fields, such as the operation of dangerous equipment and more delicate robot motions, while reducing the cost of robot maintenance.
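The core of the motion-playback idea can be sketched as resampling timestamped joint angles by linear interpolation. The real system builds Unity animation clips from spreadsheet data; this standalone sketch, with made-up keyframe values, just produces resampled keyframes at a fixed playback rate.

```python
# Minimal sketch of motion playback from recorded robot data: given
# timestamped joint angles (e.g. rows of a spreadsheet), linearly
# interpolate to resample the motion at a fixed rate. The keyframe
# values below are made up for illustration.

def interpolate_joint(times, angles, t):
    """Linearly interpolate a single joint angle at time t."""
    if t <= times[0]:
        return angles[0]
    if t >= times[-1]:
        return angles[-1]
    for i in range(len(times) - 1):
        if times[i] <= t <= times[i + 1]:
            frac = (t - times[i]) / (times[i + 1] - times[i])
            return angles[i] + frac * (angles[i + 1] - angles[i])

def resample(times, angles, rate_hz):
    """Resample the recorded motion at a fixed rate for smooth playback."""
    step = 1.0 / rate_hz
    n = int((times[-1] - times[0]) / step) + 1
    return [interpolate_joint(times, angles, times[0] + k * step)
            for k in range(n)]

# Recorded keyframes: time (s) -> one joint's angle (degrees)
times = [0.0, 1.0, 2.0]
angles = [0.0, 90.0, 45.0]
print(interpolate_joint(times, angles, 0.5))  # 45.0
print(resample(times, angles, 2))             # [0.0, 45.0, 90.0, 67.5, 45.0]
```

In Unity the resampled values would become keyframes on an animation curve per joint, which is what lets a recorded experiment replay as a clip.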
Binocular Rivalry Through Virtual Reality
- Organization : Kats Laboratory of Applied Physics
- Professor : Dr. Mikhail Kats
- Duration : 6 months
- Technologies : Unity, SteamVR
- Overview : This project explores and simulates visual conditions such as binocular disparities.
- Description : The human visual system perceives the world by combining information from the two eyes and inferring three-dimensional cues such as depth. Here, we demonstrate a virtual reality system that breaks this inherent binocular redundancy by projecting different dynamic and static content to each eye.
- Application : By breaking binocular redundancy in virtual reality, our method aims to provide new ways to customize and simulate binocular rivalry without the need for physical equipment such as a mirror stereoscope. Building on this technique, it should be possible to create more complex and personalized binocular disparities. This technique has implications for binocular disparity experiments, virtual reality accessibility design, and potential human vision enhancement.
Kitchen Kraving
- Organization : Carnegie Mellon ETC
- Position : Project Manager & Developer
- Duration : 4 months
- Technologies : Unity, Maya, Oculus, C#
- Overview : Kitchen Kraving is a fast-paced kitchen-themed VR game.
- Description : Kitchen Kraving is a virtual reality game set in a kitchen, where users have to make food while stealing bites to eat without being caught by the boss, who randomly checks in on them. The game was first developed at Carnegie Mellon ETC with a group of artists, programmers, and sound designers, and was later published on the Oculus Store.
Neon Painter
- Position : Developer
- Duration : 8 months
- Technologies : Unreal Engine, Blender
- Overview : Neon Painter is a space-themed VR game for self-expression and contemplation.
- Description : Neon Painter is a 3D painting VR application where users stand on top of a lone planet in a randomly generated space scene. With gorgeous graphics and an intuitive painting system, it offers an immersive and expressive VR experience.
- Inspiration : The theme is inspired by the "overview effect," the euphoria astronauts often experience during spaceflight, while the visuals are inspired by Cyberpunk 2077. I intend the game to capture that shift in awareness and use painting as a way for users to express themselves, almost as a meditative experience.
MRTK for Unity
- Organization : Microsoft Mixed Reality Toolkit (MRTK)
- Position : Open Source Developer
- Duration : Ongoing (since March 2021)
- Technologies : Unity, Ultraleap, MRTK
- Overview : MRTK is a Unity toolkit used for developing XR content.
- Description : In this open-source project, I work with Microsoft developers and other open-source contributors on MRTK for Unity. Some of the features I have worked on are integrating Leap Motion hand tracking with MRTK (see demo video above) and adding new features for the upcoming MRTK 2.7.
Poly Space VR
- Position : Creator & Developer
- Duration : 1 Year
- Technologies : Unity, Blender, Photon Engine
- Overview : PolySpaceVR is a virtual reality social platform for customizability and small group gatherings.
- Inspiration : When I played VRChat, I noticed that the giant platform is great for meeting random people but not so much for intimate or customized experiences. After some research, I realized there was no VR social platform targeted at small groups, so I created one myself. Given the size of the project, I chose to make it open source and lightweight so that other developers who share the vision can contribute.
- Description (from Oculus Store): Poly Space VR is a low poly, lightweight, open-source social platform where you can meet and chat with your friends in VR. You can create and share your own Poly Spaces by following the guidelines. Each month, an incredible version of Poly Space VR will be selected and uploaded to Oculus Store so everyone can join in on the fun!
BubbleVR
- Position : Founder & Developer
- Duration : Ongoing (Since July 2021)
- Technologies : 360 Camera, Unity, VRTK
- Overview : BubbleVR is a VR platform for users to upload and relive their 360 experiences.
- Description : Ever since I started my VR YouTube channel The 360 Studio, I have constantly revisited my old videos in VR, but I always had to go through the file selection process on my 2D computer screen, which inspired me to create BubbleVR. BubbleVR is essentially a VR experience management system, where every 360 experience is a bubble that people can rearrange and visit. I chose the bubble as a representation of memories because of its fragile yet beautiful nature.
- Outlook : Though BubbleVR is currently a local 360 video content management system, I see it as the future of VR social media, where people can visit each other's memories and experiences, similar to the Stories feature on Instagram.
Below are some 360 videos from my YouTube channel:
Project Virtualso
- Position : Creator & Developer
- Duration : Ongoing (since January 2020)
- Technologies : NLP, Unity, VR, AI
- Overview : Project Virtualso combines artificial intelligence (AI), natural language processing (NLP), and virtual reality to create a customizable conversational humanoid agent capable of making emotion-driven facial expressions and body gestures.
- Inspiration : This project combines my knowledge of communication arts and VR to create realistic human interaction within VR. Due to COVID, many people lack access to in-person interactions and cannot develop the necessary soft skills, so I created several simulations, such as job interviews and public speaking.
- Virtual Interview : The user is interviewed by a conversational humanoid agent. Thanks to natural language processing (NLP) techniques, the virtual interviewer can hold basic conversations and engage in realistic emotion-driven interactions such as facial expressions and body gestures.
- Virtual Presentation : The user presents their slides in VR in front of a room of AI audience members capable of reacting to the speech. This project aims to help those with a fear of public speaking; it has been tested by former employees from several companies and received very positive feedback.
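The emotion-driven expression pipeline can be sketched as two steps: classify an utterance into a coarse emotion, then map that emotion to facial blendshape weights a humanoid rig could consume. The keyword lexicon, emotion labels, and blendshape names below are all made-up placeholders; the actual project uses NLP models rather than keyword matching.

```python
# Toy sketch of emotion-driven expressions: a (hypothetical) keyword
# lexicon classifies the utterance, and each emotion maps to facial
# blendshape weights. All names and values here are illustrative.

LEXICON = {
    "great": "joy", "excited": "joy", "love": "joy",
    "worried": "fear", "nervous": "fear",
    "sorry": "sadness", "failed": "sadness",
}

# Emotion -> blendshape weights in [0, 1] (names are illustrative)
EXPRESSIONS = {
    "joy":     {"mouthSmile": 0.9, "browUp": 0.4},
    "fear":    {"browUp": 0.8, "eyeWide": 0.7},
    "sadness": {"mouthFrown": 0.8, "browDown": 0.5},
    "neutral": {},
}

def detect_emotion(utterance):
    """Return the first lexicon emotion found in the utterance, else 'neutral'."""
    for word in utterance.lower().split():
        word = word.strip(".,!?")
        if word in LEXICON:
            return LEXICON[word]
    return "neutral"

def expression_for(utterance):
    """Blendshape weights the avatar should blend toward for this utterance."""
    return EXPRESSIONS[detect_emotion(utterance)]

print(detect_emotion("I am so excited about this role!"))  # joy
print(expression_for("I'm a bit nervous."))  # {'browUp': 0.8, 'eyeWide': 0.7}
```

In the VR application, the returned weights would be fed each frame into the avatar's skinned mesh so the face eases toward the target expression.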
Gloat
- Position : Engineer, Co-Designer
- Duration : 7 months
- Technologies : Arduino, Circuit Playground
- Overview : Gloat is an interactive wearable designed by me and designer/musician Basi.
- Description : Gloat is a transparent bubble coat jacket embedded with colorful lights that react to light and sound. This piece sits at the intersection of fashion design and technology. Gloat has been used in settings ranging from art installations to music videos and has received overwhelming critical acclaim.
Collect
- Position : Co-Founder, CTO
- Duration : 1 Year
- Technologies : Flutter, Adobe XD, Firebase
- Overview : Collect is a social media startup aiming to help people collect experiences.
- Description : Collect is a platform that allows people to collect and store experiences as pictures. Saved images are stored and sorted by collection type, making it easy for users to build unique collections, look back at them, and share their photographs with users who have similar interests.
Upvote
- Position : Founder
- Duration : 3 Years
- Technologies : Flutter, AdobeXD
- Overview : Upvote is a startup aiming to democratize music for bar goers.
- Description : Upvote’s B2B and B2C platform allows bars, restaurants, event organizers, and individuals to democratize music playlists and capture data on music preferences. Upvote enables individuals to nominate songs through integration with their music streaming service of choice (e.g., Spotify or Apple Music). Once a song is nominated, other users in the same location can up- or down-vote each selection to determine which will play next.
Check out our pitch below:
- Note : Unfortunately, the project was cut short due to COVID-19 and bar shutdowns.
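The nominate-and-vote flow described above can be sketched as a small queue: patrons nominate songs, others vote them up or down, and the highest-scoring nomination plays next (earlier nominations winning ties). The class name, song names, and tie-breaking rule below are illustrative assumptions, not the product's actual implementation.

```python
# Minimal sketch of the playlist-voting idea: nominations accumulate
# up-/down-votes, and the highest score plays next. Names and the
# tie-break rule (earlier nomination wins) are illustrative only.

class VenueQueue:
    def __init__(self):
        self._nominations = []          # [song, score], in nomination order

    def nominate(self, song):
        self._nominations.append([song, 0])

    def vote(self, song, delta):
        """delta is +1 for an upvote, -1 for a downvote."""
        for entry in self._nominations:
            if entry[0] == song:
                entry[1] += delta
                return

    def play_next(self):
        """Pop and return the highest-scoring nomination, or None if empty."""
        if not self._nominations:
            return None
        best = max(self._nominations, key=lambda e: e[1])
        self._nominations.remove(best)
        return best[0]

q = VenueQueue()
for song in ["Song A", "Song B", "Song C"]:
    q.nominate(song)
q.vote("Song B", +1)
q.vote("Song B", +1)
q.vote("Song C", -1)
q.vote("Song A", +1)
print(q.play_next())  # Song B
print(q.play_next())  # Song A
```

A production version would scope queues per venue and authenticate votes, but the selection logic is the same.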
get in touch
I’m always open to discussing ideas and collaboration!
email
jackyangzzh [at] gmail [dot] com
Want to connect? Please fill out the form below and I will reply shortly.
My Blog
A Collection of My Latest Blogs
I enjoy writing blogs on Medium, whether it is a book review, my thoughts on the current state of technology, or just some general reflections. I usually write a new blog post every month, and you can check out all the content on My Medium Page