In any industry, each competitor seeks to distinguish itself from the others by offering something none of them can. This is generally known as the ‘special sauce’, and I think I’ve found mine.

Most people have heard of the Arduino, but if you haven’t, it is a very cheap and very capable electronic device that hooks up to a computer via a USB cable and lets software read real-world sensors – temperature, pressure, motion and more – and control real-world devices like motors, heaters, light dimmers and electrical switches.

My education is in process control, which is precisely the business of reading such sensors and controlling such devices, so Arduinos are definitely my thing. But what is a Uniduino?

Well, a Uniduino is a library of C# code that allows the Arduino’s functionality to be controlled from within the Unity game engine.
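
To make that concrete, here’s roughly what the Unity side of a ‘hello world’ looks like – a minimal sketch, assuming Uniduino’s Firmata-style calls (Arduino.global, Setup, pinMode, digitalWrite) work the way its examples suggest; treat the exact names and the pin number as illustrative:

```csharp
using UnityEngine;
using Uniduino;

// Minimal Uniduino test: blink the Arduino's onboard LED from Unity.
public class BlinkTest : MonoBehaviour
{
    Arduino arduino;
    public int ledPin = 13; // onboard LED on most Arduino boards
    bool on;

    void Start()
    {
        arduino = Arduino.global; // shared serial connection over USB
        arduino.Setup(() => arduino.pinMode(ledPin, PinMode.OUTPUT));
        InvokeRepeating("Toggle", 1f, 1f); // flip the LED once a second
    }

    void Toggle()
    {
        on = !on;
        arduino.digitalWrite(ledPin, on ? Arduino.HIGH : Arduino.LOW);
    }
}
```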

What that means is that I can create a game world that gets information from real-world devices and uses that information to control how something in the game looks and acts. And I can take input from the person playing the game and send it back through Uniduino to make something happen in the real world.
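
Here’s a sketch of that two-way loop, with the same caveats as above (the pin numbers and the reportAnalog/analogRead calls are assumptions based on Uniduino’s Firmata heritage): a sensor on an analog pin drives the size of an object in the scene, while the player’s space bar switches a real output pin:

```csharp
using UnityEngine;
using Uniduino;

// Two-way loop: a real sensor drives this game object's scale,
// and the player's key press drives a real output pin.
public class SensorBridge : MonoBehaviour
{
    Arduino arduino;
    public int sensorPin = 0; // analog pin wired to e.g. a potentiometer
    public int relayPin = 8;  // digital pin wired to a relay or LED

    void Start()
    {
        arduino = Arduino.global;
        arduino.Setup(() =>
        {
            arduino.pinMode(relayPin, PinMode.OUTPUT);
            arduino.reportAnalog(sensorPin, 1); // stream readings to Unity
        });
    }

    void Update()
    {
        // Real world -> game: map the 0..1023 analog reading onto scale.
        float reading = arduino.analogRead(sensorPin) / 1023f;
        transform.localScale = Vector3.one * (0.5f + reading);

        // Game -> real world: hold the space bar to energize the relay.
        if (Input.GetKeyDown(KeyCode.Space))
            arduino.digitalWrite(relayPin, Arduino.HIGH);
        if (Input.GetKeyUp(KeyCode.Space))
            arduino.digitalWrite(relayPin, Arduino.LOW);
    }
}
```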

The game world can be running on a remote server, and the people in it could be logged in from anywhere, so this is the core of some interesting telepresence applications. Throw in some remote video cameras and you could do some serious work at a distance.

Remote health care is already being explored with video technology. How could it benefit from the added element of a virtual world? What kind of devices could we make for chronic care patients to allow them to live their own lives while being connected to and within reach of health care intervention when they need it?

Remote monitoring of many industrial systems is already commonplace. How could this be extended by adding virtual or augmented reality? Fixed cameras monitor a lot of installations – what about a 3D overlay (or underlay) on the video that shows the operator where things *should* be when everything is running normally?

If you’ve played any recent first-person shooter video games, you know how far we have come in recreating experiences virtually. Meanwhile, in industry, we get by with non-immersive flat panels crammed with abstracted sensor readings that try to tell us what is going on with our machines somewhere else.

The next generation of workers will be able to walk around virtually inside those environments and see, touch, feel and hear what is going on.

As any old-time engineer on a ship or a train will tell you, they can ‘feel’ when something isn’t right with the machine. We’re not going to ‘feel’ anything about our machines until we make the man-machine interface much more immersive than a bunch of numbers on a touchscreen.
