Stuff by Brad

A page dedicated to interesting projects I've worked on.

Adventures in Flocking


So, about a year ago I worked on a couple of really fun projects with some classmates for our animation class. One of these projects was loosely defined as anything that leveraged a flocking algorithm, so we made a toy-like game. As with many of my small game projects, we implemented this one in Unity to keep it simple. The game lets you release ants into an apartment, where they flock around in groups, and you can choose from a variety of objects to drop on them or put in their way. To help us learn about flocking algorithms, we put controls for all of the coefficients in the GUI so that we could tweak the algorithm on the fly.
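
If you've never seen one, the core of a flocking (boids) algorithm is just a weighted sum of three steering urges, and those weights are exactly the coefficients we exposed. Here's a minimal sketch of that update in Unity C#; the field names and the Step() method are illustrative, not our actual project code:

    using System.Collections.Generic;
    using UnityEngine;

    public class AntFlocking : MonoBehaviour
    {
        public float cohesionWeight = 1f;     // pull toward the group's center
        public float alignmentWeight = 1f;    // match the group's average heading
        public float separationWeight = 1.5f; // keep some personal space
        public float maxSpeed = 2f;

        public Vector3 velocity;

        // Called once per frame; neighbors would come from a radius query.
        public void Step(List<AntFlocking> neighbors, float dt)
        {
            if (neighbors.Count == 0) return;

            Vector3 center = Vector3.zero, heading = Vector3.zero, push = Vector3.zero;
            foreach (var n in neighbors)
            {
                center += n.transform.position;
                heading += n.velocity;
                push += transform.position - n.transform.position; // away from each neighbor
            }
            center /= neighbors.Count;
            heading /= neighbors.Count;

            // The three coefficients below are the knobs we put in the GUI.
            Vector3 steer = cohesionWeight * (center - transform.position)
                          + alignmentWeight * (heading - velocity)
                          + separationWeight * push;

            velocity = Vector3.ClampMagnitude(velocity + steer * dt, maxSpeed);
            transform.position += velocity * dt;
        }
    }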

I developed our collision detection between the ants and the walls, because an ant needs to be able to tell whether a wall is on its right or its left as it approaches in order to avoid it properly. We also needed models for objects to drop on the ants, so I made a few of them myself in Blender and created the prefabs for these objects in Unity. Finally, I also incorporated the stereoskopix scripts so that we could support the 3D functionality of the displays we had in the classroom.
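
The left/right part of that wall test boils down to a single cross product. A small sketch, assuming a raycast has already found the wall hit point (the helper name is made up):

    using UnityEngine;

    public static class WallSide
    {
        // Returns +1 if the wall hit point is to the ant's right, -1 if to its left.
        // In Unity's left-handed, y-up coordinates, Cross(forward, toWall).y is
        // positive when toWall points to the right of the ant's forward vector.
        public static int Side(Transform ant, Vector3 wallHitPoint)
        {
            Vector3 toWall = wallHitPoint - ant.position;
            float y = Vector3.Cross(ant.forward, toWall).y;
            return y >= 0f ? 1 : -1;
        }
    }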

If you’d like to play a browser-friendly version of the game, check it out here.

The project website can be found here.

GraviDarts

GraviDarts is a game I developed for a Virtual Reality course. It was built in Unity and designed to run on a 2x2 array of passive stereoscopic LCD displays with a Vicon Bonita tracking system. The tracking system followed the player’s head and a Wiimote. The head tracking was used to maintain a correct viewer-centered perspective, and the tracked Wiimote was used to capture the throwing hand’s movements.

The concept for the game was that the player would play darts normally, but the gravity could change in both its direction and magnitude. I wondered how the player would adapt to the new gravity while all the physical cues in reality told them otherwise. I was curious about this after we had read studies showing that subjects would fall over if you slowly rotated a virtual room in a VR system, because the visual cues were stronger than their sense of balance.
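
Prototyping that kind of shift is cheap in Unity, since Physics.gravity is a global you can rewrite at runtime and every rigidbody, dart included, follows it. A minimal sketch with illustrative values:

    using UnityEngine;

    public class GravityShifter : MonoBehaviour
    {
        public Vector3 direction = Vector3.down; // could point anywhere
        public float magnitude = 9.81f;          // could be cranked up or down

        void Update()
        {
            // Rewriting the global gravity redirects every rigidbody in flight.
            Physics.gravity = direction.normalized * magnitude;
        }
    }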

I had some problems with throwing the dart because I was using an interface for transporting tracking data into Unity that someone else had developed. Apparently their code read in the tracking info in Unity’s Update() function, which is called once per frame. The problem was that the buttons from the Wiimote were read in a FixedUpdate() function, which runs on a fixed physics timestep, so its rate generally differs from the frame rate. So when you released the button you weren’t guaranteed to have the latest possible position.

I tried to use the last few known positions and the position at the time of button release to figure out the velocity/acceleration the player’s hand had at the moment of release. However, since the tracking data and the button data were being read at different rates, I would sometimes get the same position for the last position and the current position. This, of course, killed my ability to calculate a meaningful velocity at the time of release, so I worked around it by calculating the last known velocity for each piece of tracking data, letting me fall back on an old velocity if the new one was bad.
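
Here's a sketch of that fallback, assuming a fresh tracker sample arrives once per Update() and the button release is caught in FixedUpdate(); the class and method names are made up for illustration:

    using UnityEngine;

    public class ReleaseVelocity : MonoBehaviour
    {
        Vector3 lastPosition;
        Vector3 lastGoodVelocity;
        float lastSampleTime;

        // Called whenever a tracker sample arrives (once per Update()).
        public void AddSample(Vector3 position, float time)
        {
            Vector3 delta = position - lastPosition;
            float dt = time - lastSampleTime;

            // Only trust the sample if the tracker actually moved on; a repeated
            // position would produce a zero, meaningless velocity.
            if (dt > 0f && delta.sqrMagnitude > 1e-8f)
                lastGoodVelocity = delta / dt;

            lastPosition = position;
            lastSampleTime = time;
        }

        // Called from FixedUpdate() when the Wiimote button is released; falls
        // back on the last velocity that was computed from real movement.
        public Vector3 VelocityAtRelease()
        {
            return lastGoodVelocity;
        }
    }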

Dynallax

For this project I worked with scientists from Argonne National Laboratory. I had two main tasks. The first was helping to develop a visualization tool for nano-scale materials scientists to run on a custom-built auto-stereoscopic workstation (3D without glasses). The second was to extend the libraries used for application development on that display to help future developers. It was a great learning experience and the project I spent the majority of grad school working on. I learned about all forms of stereoscopic displays and how they produce a three-dimensional image, both to ensure the visualization would appear correctly and so I could debug calibration issues in the display’s hardware. The work on the library, however, involved a lot of network programming and incorporating multitouch support.

In addition to library and viz app development, I also helped develop procedures for calibrating the workstation and helped analyze it as a 3D display. The calibration dealt chiefly with the head tracking required to maintain the 3D image as the user moved. If the camera was knocked or bumped in the slightest, the quality of the 3D image would quickly diminish, so the calibration had to be quick and easy for users to perform. The analysis included tasks like creating test patterns and applications that let us measure the amount of crosstalk between the right- and left-eye images, and taking light-meter readings under different conditions. It helped increase my understanding of how a product development cycle flows.
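
For the curious, the usual crosstalk figure those test patterns feed into is the light leaking through from the wrong eye's image, expressed as a fraction of the intended image with the display's black level subtracted out. A back-of-the-envelope sketch of that arithmetic (a common definition, not necessarily the exact procedure we used):

    public static class Crosstalk
    {
        // leakage: luminance measured with only the *other* eye's image lit.
        // signal:  luminance measured with this eye's own image lit.
        // black:   luminance measured with both images black.
        // All three would come from light-meter readings of test patterns.
        public static double Percent(double leakage, double signal, double black)
        {
            return 100.0 * (leakage - black) / (signal - black);
        }
    }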

Robot vs. Zombots

This is a game I worked on for the game development course I took last spring. The goal was to build an open world with a destructible environment, where the destruction was an essential element of the gameplay. In the game you attempt to protect the humans at the center of the map from the evil robots entering from the outskirts of town for five minutes. The only way to stop them is by blocking their path or crushing them with buildings you knock over.

An interesting problem we were tasked with for this course was to develop the game to be played on a wall-sized array of 42” LCD displays. This informed several design choices for the game and presented problems for implementation. I tried to make the game immersive by giving the player gesture-based controls using a Wiimote. I also tried to implement controls using the Kinect, but at the time there were no official drivers released and I was trying to use a community-developed library to integrate it into Unity. I had it working, but the lag and noise in the data stream made it far less playable than the Wiimote implementation, so the Kinect functionality was left out of the final release.

The team I was part of was made up of artists and computer scientists from Chicago, IL, and Baton Rouge, LA, so we had to stick to a strict schedule and meet regularly via phone and video conferencing. We developed the game in Unity and created almost all of the assets ourselves, which was quite an awesome learning experience.