Over the last year I've been working on an indoor location system to track firefighters while they do their training. It has been fun and excruciating in equal measure: indoor location is a really difficult problem. In short, the higher your resolution, the more difficult it is to accurately say where someone is. For this project we needed accuracy down to 0.5m, with the ability to distinguish between multiple people huddled together.
Without going into the gritty details (NDA), we kitted out the test site with an array of sensors that track people moving through it; proximity and pressure sensors gave us these locations. This style of location system has been done before, but its problem is that it cannot distinguish between people. We've solved that optically, using a Kinect-style approach, and not to toot my own horn, I'm pretty damn happy with the results: we locate and ID people within the test grid with perfect accuracy for up to four people. Any more than that and the optical system doesn't hold up, but that's something we are working on. The main limiting factor is the sheer amount of data generated by the optical system, so the more machines you have to crunch the numbers, the higher your resolution can go.
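To give a flavour of the sensor side (everything here is a hypothetical illustration, not the actual NDA'd setup: the grid layout, 0.25m cell pitch, and threshold are all made up), a floor grid of pressure sensors can be reduced to a position estimate by taking the pressure-weighted centroid of the activated cells. It also shows why this kind of system can't tell people apart on its own: two people on adjacent cells just merge into one blob.

```python
# Hypothetical sketch: estimate a position from a grid of pressure sensors
# by taking the pressure-weighted centroid of activated cells. Cell pitch
# and threshold are illustrative values, not the real system's.

def weighted_centroid(readings, cell_pitch_m=0.25, threshold=0.1):
    """readings: dict mapping (row, col) grid cells to pressure values.

    Returns an (x, y) estimate in metres, or None if no cell exceeds
    the noise threshold.
    """
    # Ignore cells below the noise floor.
    active = {cell: p for cell, p in readings.items() if p > threshold}
    if not active:
        return None
    total = sum(active.values())
    # Weight each cell's grid coordinate by its pressure, then scale to metres.
    x = sum(col * p for (row, col), p in active.items()) / total * cell_pitch_m
    y = sum(row * p for (row, col), p in active.items()) / total * cell_pitch_m
    return (x, y)

# A footstep spread evenly over two neighbouring 0.25 m cells, plus one
# noisy cell that gets filtered out:
print(weighted_centroid({(4, 4): 2.0, (4, 5): 2.0, (0, 0): 0.05}))
# → (1.125, 1.0)
```

A real system would segment the grid into connected blobs and track them over time, but even then two huddled people collapse into a single blob, which is exactly the gap the optical system fills.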
One of the other projects we're working on, and one we're really happy with, is a multitouch exhibit for a space/science museum. The emphasis is on collaboration, with several users interacting at once, something that isn't often done well with these exhibits.
Pictures to come.