AUIC 2012 Roundup

So the Australasian User Interface Conference for 2012 has been and gone. The Wearable Computer Lab presented two full papers and two posters, and I was an author on one of them 🙂

The papers we presented are listed below, and the publication page has been updated so you can get the PDFs. Cheers!

E. T. A. Maas, M. R. Marner, R. T. Smith, and B. H. Thomas, “Supporting Freeform Modelling in Spatial Augmented Reality Environments with a New Deformable Material,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012. (pdf) (video)

T. M. Simon, R. T. Smith, B. H. Thomas, G. S. Von Itzstein, M. Smith, J. Park, and J. Park, “Merging Tangible Buttons and Spatial Augmented Reality to Support Ubiquitous Prototype Designs,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.

S. J. O’Malley, R. T. Smith, and B. H. Thomas, “Poster: Data Mining Office Behavioural Information from Simple Sensors,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.

T. M. Simon and R. T. Smith, “Poster: Magnetic Substrate for use with Tangible Spatial Augmented Reality in Rapid Prototyping Workflows,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.

How OpenNI Nearly Spoiled The Show

So, for the last few months I've taken a break from the PhD to do some work on Half Real, a theatre show by The Border Project.

There's a lot of technology in the show. In particular, most of the set is projected, and we are using a Microsoft Kinect to track the actors on stage and modify the projections based on their locations.

I’m working on Linux, and using OpenNI for interfacing with the Kinect. Things almost worked perfectly. In this post I will document the trials and tribulations of getting the Kinect to work for Half Real.

Continue reading How OpenNI Nearly Spoiled The Show

Kinect on Ubuntu with OpenNI

UPDATE October 2015: Verified working in Ubuntu 14.04 LTS and 15.04!

I've spent all this morning trying to talk to the Microsoft Kinect using OpenNI. As it turns out, the process is not exceptionally difficult; it's just that there doesn't seem to be any up-to-date documentation on getting it all working. So, this post should fill the void. I describe how to get the Kinect working using Ubuntu 12.04 LTS, OpenNI 1.5.4, and NITE 1.5.2.
Continue reading Kinect on Ubuntu with OpenNI

LaTeX, Texlipse, and EPS Figures

I’m currently in the early stages of writing my PhD thesis. I’m writing it using LaTeX, and I’m trying to get the perfect build system and editing environment going. Yesterday I had a look at Texlipse, a plugin for Eclipse. There was one problem: EPS figures didn’t work.

In recent versions of LaTeX, if you use the epstopdf package, your EPS figures are converted to PDF on the fly, but this wasn't working in Texlipse. Luckily the fix is easy, and the rest of this post explains what to do.
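For context, the on-the-fly conversion that epstopdf provides needs very little in the document itself. Here's a minimal sketch (figure.eps is a placeholder name, and depending on your TeX distribution, pdflatex may need shell escape enabled for the conversion step):

```latex
\documentclass{article}
\usepackage{graphicx}
% epstopdf converts .eps figures to .pdf on the fly under pdflatex
\usepackage{epstopdf}
\begin{document}
% \includegraphics finds figure.eps and transparently uses the converted PDF
\includegraphics[width=\linewidth]{figure}
\end{document}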
Continue reading LaTeX, Texlipse, and EPS Figures

Quimo: a deformable material to support freeform modelling in spatial augmented reality environments

Hello Everyone

3DUI has wrapped up for the year, so here is our second publication. We introduce a new material for freeform sculpting in spatial augmented reality environments. Please read the paper, and have a look at the video below.

Adaptive Color Marker for SAR Environments

Hey Everyone

So right now I am at the IEEE Symposium on 3D User Interfaces in Singapore. We have a couple of publications which I’ll be posting over the next few days. First up is Adaptive Color Marker for SAR Environments. In a previous study we created interactive virtual control panels by projecting onto otherwise blank designs. We used a simple orange marker to track the position of the user’s finger. However, in a SAR environment, this approach suffers from several problems:

  • The tracking system can't track the marker when we project the same colour as the marker.
  • Projecting onto the marker changes its appearance, causing tracking to fail.
  • Users could not tell when they were pressing virtual controls, because their finger occluded the projection.

We address these problems with an active colour marker. We use a colour sensor to detect what is being projected onto the marker, and change the colour of the marker to an opposite colour, so that tracking continues to work. In addition, we can use the active marker as a form of visual feedback. For example, we can change the colour to indicate a virtual button press.
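The paper describes the actual colour-selection method; as a rough illustration of the idea only (the function name is mine, not from the paper), one simple notion of an "opposite" colour is the RGB complement of what the sensor reads:

```python
def opposite_colour(rgb):
    """Return the RGB complement of a sensed colour (0-255 channels).

    A naive stand-in for the marker's colour selection: a marker lit in
    the complement of the projected colour stays distinguishable from it.
    """
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

# e.g. if the projector floods the marker with orange light,
# switch the marker LED to the complementary blue
print(opposite_colour((255, 128, 0)))
```

The same channel, driven deliberately rather than reactively, gives the visual feedback mentioned above, such as flashing a distinct colour on a button press.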

I've added the publication to my publications page, and here's the video of the marker in action.

Behaviours Demo – Android Programming

Hey Everyone

So this week I became a member sponsor on www.3dbuzz.com. The first thing I had a look at was their XNA Behaviour Programming videos, which are the first in their set on AI programming. However, not being particularly interested in XNA, I implemented the algorithms presented in the videos for Android.

Here’s a video of the demo running on my Nexus One:

Since I was on Android and only using the Android and OpenGL ES libraries, I had to write a lot of low-level code to replace the XNA functionality that 3DBuzz's videos rely on. I also had to implement an on-screen joystick. I might write up a couple of posts on the more interesting parts of the code (the parts not covered in the videos) soon.
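The behaviours covered in those videos are the classic Reynolds-style steering behaviours (seek, flee, arrive, and so on). As a rough language-neutral sketch of "seek", independent of both XNA and my Android port (function name and parameters are mine):

```python
import math

def seek(pos, vel, target, max_speed, max_force):
    """Steering force that turns an agent toward a target.

    Classic 'seek': the desired velocity points straight at the target
    at max_speed; the steering force is (desired - current), clamped
    to max_force so the agent turns gradually rather than snapping.
    """
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)  # already on target, no steering needed
    desired = (dx / dist * max_speed, dy / dist * max_speed)
    sx, sy = desired[0] - vel[0], desired[1] - vel[1]
    mag = math.hypot(sx, sy)
    if mag > max_force:  # clamp the steering force
        sx, sy = sx / mag * max_force, sy / mag * max_force
    return (sx, sy)
```

Each frame the force is added to the agent's velocity (scaled by the timestep), and the velocity to its position; in a demo like mine, the on-screen joystick just supplies the target.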

Thanks
Michael