This year’s theme for Microsoft Research’s TechFest research showcase is natural user interfaces (NUIs).
While Microsoft is showing off some non-NUI-themed research at this year’s event, most of the TechFest 2011 projects the company is highlighting most visibly are those focused on using gestures, touch, computer vision and speech to interact with PCs and other computing devices.
While Microsoft’s most celebrated example of how the company successfully commercialized its early NUI work is the Kinect Xbox sensor, there’s also quite a bit of focus by Microsoft’s NUI researchers on the intersection between NUI and healthcare. I blogged earlier this year about how Microsoft is thinking about incorporating Surface 2, Xbox, Xbox Live and Kinect sensors into healthcare applications. There are a number of Microsoft Research projects exploring the NUI-health connection.
One of these projects, known as InnerEye, is focused on “the automatic analysis of patients’ (medical) scans using modern machine learning techniques,” like semantic navigation and visualization. The Microsoft researchers on the project are working with the Microsoft Amalga team on InnerEye, which is one of the demos being showcased at TechFest 2011.
Another area of NUI exploration, which doesn’t seem to be on the TechFest 2011 agenda, is interacting without touching. Microsoft researchers are publishing a couple of new papers on this topic this year: one on image-guided interventional radiology (a PDF of which is available now), and another on brain-computer interaction.
Here’s a list of the TechFest 2011 projects, NUI and non-NUI alike, that the Softies are playing up this year.
Today, March 8, is the semi-public TechFest day, when certain invited guests get to peek at some of the projects that will be on display for Microsoft employees from March 8 to 10.