So I’ve been messing around with DIY guitar effects. In this post I’m going to talk about my variant of the Bulk Fuzz, a simple but crazy fuzz circuit from Joe Gore at Tonefiend.
So the Australasian User Interface Conference for 2012 has been and gone. The Wearable Computer Lab presented two full papers and two posters, one of which I co-authored 🙂
The papers we presented are listed below, and the publication page has been updated so you can get the PDFs. Cheers!
E. T. A. Maas, M. R. Marner, R. T. Smith, and B. H. Thomas, “Supporting Freeform Modelling in Spatial Augmented Reality Environments with a New Deformable Material,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012. (pdf) (video)
T. M. Simon, R. T. Smith, B. H. Thomas, G. S. Von Itzstein, M. Smith, J. Park, and J. Park, “Merging Tangible Buttons and Spatial Augmented Reality to Support Ubiquitous Prototype Designs,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.
S. J. O’Malley, R. T. Smith, and B. H. Thomas, “Poster: Data Mining Office Behavioural Information from Simple Sensors,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.
T. M. Simon and R. T. Smith, “Poster: Magnetic Substrate for use with Tangible Spatial Augmented Reality in Rapid Prototyping Workflows,” in Proceedings of the 13th Australasian User Interface Conference, Melbourne, Victoria, Australia, 2012.
So over on YouTube someone asked me about doing exponents in Java. I didn’t cover this in episode 3 of my Java tutorial, so I have created a short supplementary video looking at some of the more advanced mathematical functions in Java. Short, and to the point. Here it is:
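For anyone who just wants the gist of the video, the key point is that Java has no power operator; exponents and friends live in `java.lang.Math`. A minimal sketch:

```java
public class MathDemo {
    public static void main(String[] args) {
        // Math.pow handles exponents; there is no ** or ^ power operator in Java
        double cube = Math.pow(2, 3);    // 8.0
        double root = Math.sqrt(16);     // 4.0
        double ln   = Math.log(Math.E);  // natural logarithm, approximately 1.0
        System.out.println(cube + " " + root + " " + ln);
    }
}
```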
There’s a lot of technology in the show. In particular, most of the set is projected, and we are using a Microsoft Kinect to track the actors on stage, and modifying the projections based on their location.
I’m working on Linux, and using OpenNI for interfacing with the Kinect. Things almost worked perfectly. In this post I will document the trials and tribulations of getting the Kinect to work for Half Real.
UPDATE October 2015: Verified working in Ubuntu 14.04 LTS and 15.04!
I’ve spent all this morning trying to talk to the Microsoft Kinect using OpenNI. As it turns out, the process is not especially difficult; it’s just that there doesn’t seem to be any up-to-date documentation on getting it all working. So, this post should fill the void. I describe how to get access to the Kinect working using Ubuntu 12.04 LTS, OpenNI 1.5.4, and NITE 1.5.2.
I’m currently in the early stages of writing my PhD thesis. I’m writing it using LaTeX, and I’m trying to get the perfect build system and editing environment going. Yesterday I had a look at Texlipse, a plugin for Eclipse. There was one problem: EPS figures didn’t work.
In recent versions of LaTeX, the epstopdf package converts your EPS images to PDF on the fly, but this wasn’t working in Texlipse. Luckily the fix is easy, and the rest of this post explains what to do.
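For context, the usual on-the-fly setup looks something like the sketch below (the figure path is a made-up example; the Texlipse-specific fix is in the full post). The important details are loading epstopdf after graphicx and omitting the file extension in \includegraphics:

```latex
\documentclass{article}
\usepackage{graphicx}
\usepackage{epstopdf} % converts .eps figures to .pdf on the fly under pdflatex

\begin{document}
% Omit the extension so LaTeX can pick the converted PDF automatically
\includegraphics[width=\linewidth]{figures/system-diagram}
\end{document}
```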
3DUI has wrapped up for the year, so here is our second publication. We introduce a new material for freeform sculpting in spatial augmented reality environments. Please read the paper, and have a look at the video below.
So right now I am at the IEEE Symposium on 3D User Interfaces in Singapore. We have a couple of publications which I’ll be posting over the next few days. First up is Adaptive Color Marker for SAR Environments. In a previous study we created interactive virtual control panels by projecting onto otherwise blank designs. We used a simple orange marker to track the position of the user’s finger. However, in a SAR environment, this approach suffers from several problems:
- The tracking system can’t track the marker if we project the same colour as the marker.
- Projecting onto the marker changes its appearance, causing tracking to fail.
- Users could not tell when they were pressing virtual controls, because their finger occluded the projection.
We address these problems with an active colour marker. We use a colour sensor to detect what is being projected onto the marker, and change the colour of the marker to an opposite colour, so that tracking continues to work. In addition, we can use the active marker as a form of visual feedback. For example, we can change the colour to indicate a virtual button press.
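The core idea of choosing an "opposite" colour can be sketched as a simple RGB complement (my own minimal illustration, not the exact method from the paper):

```java
public class AdaptiveMarker {
    // Return the RGB complement of the colour the sensor reports being
    // projected onto the marker, so the marker remains distinguishable.
    static int[] oppositeColour(int r, int g, int b) {
        return new int[] { 255 - r, 255 - g, 255 - b };
    }

    public static void main(String[] args) {
        int[] marker = oppositeColour(255, 128, 0); // orange projected
        System.out.println(marker[0] + "," + marker[1] + "," + marker[2]); // 0,127,255
    }
}
```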
I’ve added the publication to my publications page, and here’s the video of the marker in action.
After over a year, here’s the next instalment of my Git tutorial! In this video we look at the difference between rebasing and pulling from remote repositories. Sorry it took so long!
So this week I became a member sponsor on www.3dbuzz.com. The first thing I had a look at was their XNA Behaviour Programming videos, which are the first in their set on AI programming. However, not being particularly interested in XNA, I implemented the algorithms presented in the videos for Android.
Here’s a video of the demo running on my Nexus One:
Since I was on Android and only using the Android and OpenGL ES libraries, I had to write a lot of low-level code to replace the XNA functionality that 3DBuzz’s videos rely on. I also had to implement an on-screen joystick. I might write up a couple of posts on the more interesting parts of the code (the parts not covered in the videos) soon.
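To give a flavour of the behaviour code, here is a minimal sketch of the classic "seek" steering behaviour (my own illustration of the general technique, not 3DBuzz’s code): steering is the desired velocity toward the target minus the current velocity.

```java
public class Seek {
    // Compute a seek steering force for an agent at (px, py) moving at (vx, vy),
    // heading toward target (tx, ty) at maxSpeed.
    static float[] seek(float px, float py, float vx, float vy,
                        float tx, float ty, float maxSpeed) {
        float dx = tx - px, dy = ty - py;
        float len = (float) Math.sqrt(dx * dx + dy * dy);
        if (len == 0) return new float[] { 0, 0 }; // already at the target

        // Desired velocity points straight at the target, at full speed
        float desiredX = dx / len * maxSpeed;
        float desiredY = dy / len * maxSpeed;

        // Steering force corrects the current velocity toward the desired one
        return new float[] { desiredX - vx, desiredY - vy };
    }

    public static void main(String[] args) {
        float[] s = seek(0, 0, 0, 0, 10, 0, 5); // agent at rest, target to the right
        System.out.println(s[0] + "," + s[1]); // 5.0,0.0
    }
}
```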