The most recent wearable computing Kickstarter success (having reached its goal in only 3 days!) is the NFC Ring designed by John McLear from the UK. NFC, for those of you unfamiliar with the acronym, stands for Near Field Communication, a wireless protocol similar to RFID (Radio Frequency Identification, used among many other things to security-tag products in shops). NFC is becoming increasingly popular for particular tasks because of certain qualities that distinguish it from RFID, and for this reason it now comes as standard in most new high-end smartphones. In fact you may already have been using NFC if you are one of the early adopters of a contactless credit card (see here for some more (scary) information on contactless credit cards).
What distinguishes NFC from its now ubiquitous ancestor RFID is that it allows two-way communication (an RFID chip can only be read from) and, as the name suggests, it works over a very short range (a maximum of 4 inches or 10 centimetres). Both of these qualities make it particularly suitable for smartphones and for tasks that require more security (like electronic payment). So how does John McLear propose to utilize this technology?
The NFC Ring can be used to unlock doors and mobile phones, and to transfer information, link people, or even transfer accessibility preferences or login details. Have a look at the promo video below for more details.
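To give a flavour of what "transferring information" from a ring actually involves: data on NFC tags is usually stored as NDEF records, and a stored link uses a one-byte prefix code (defined in the NFC Forum URI Record Type Definition) to save space on the tiny chip. The minimal Python sketch below decodes such a payload; the record contents shown are hypothetical examples, not data from an actual NFC Ring.

```python
# Minimal decoder for an NDEF URI record payload, the kind of data an
# NFC ring typically stores (a link, contact number, or similar).
# Prefix codes are from the NFC Forum URI Record Type Definition;
# only the first few are listed here.
URI_PREFIXES = {
    0x00: "",
    0x01: "http://www.",
    0x02: "https://www.",
    0x03: "http://",
    0x04: "https://",
    0x05: "tel:",
    0x06: "mailto:",
}

def decode_uri_payload(payload: bytes) -> str:
    """Expand the one-byte prefix code and return the full URI."""
    prefix = URI_PREFIXES.get(payload[0], "")
    return prefix + payload[1:].decode("utf-8")

# A hypothetical ring storing b'\x01example.com' resolves to:
print(decode_uri_payload(b"\x01example.com"))  # http://www.example.com
```

The prefix table is why a ring with only a hundred or so bytes of storage can still hold a usable web link or phone number.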
I’m sure you’ll agree this is a great-looking product at a fantastic price (under €30 including delivery) whose full usefulness is probably not yet completely evident. In addition to that, there are a couple of other features that make this an outstanding Kickstarter project.
First of all, the detailed video (below) in which John outlines the design iterations and technological barriers the team overcame to arrive at the final product will prove very interesting and informative to any potential product designers out there. Also, their equal weighting of aesthetics, security and functionality could be considered a blueprint for the design of wearable technology. Releasing the SDK (Software Development Kit) as open source should ensure a steady stream of user-generated apps and innovation at a rate that just wouldn’t be possible even with a large team of developers. Finally, allowing people the option of customizing the ring to their own individual preferences or creating unique designs gives creative and artistic individuals the opportunity to resell their designs. You can even just buy the NFC chips and use a 3D printer to print your own ring!
This is true user-driven design in the sense that although this is a product in its own right, it is also a platform for users to create their own unique product with the functionality they need and the aesthetic they desire… with the 3D printing option it can be anything they want, as long as it’s a ring!
Anybody got any ideas for useful Assistive Technology (or any other) applications for this technology? Please comment below (as long as it’s not spam about rip-off Oakley sunglasses).
Summary: When you touch your own body, you feel exactly what you touch — better feedback than any external device. And you never forget to bring your body.
Usability expert Jakob Nielsen discusses the future of HCI (Human Computer Interaction) and ubiquitous user interfaces in the latest instalment of his blog, Alertbox. Specifically, he looks at two concepts that use human body parts as user interfaces: Sean Gustafson‘s hand-based interface (pictured) and EarPut, an ear-based input system being developed by Roman Lissermann and colleagues from the Technical University of Darmstadt. One very interesting discovery to come out of this work is that when blindfolded, users were almost twice as fast using the hand interface as they were using a regular glass touch screen. Read the full article here: http://www.nngroup.com/articles/human-body-touch-input/
Anybody with even a passing interest in technology will probably have heard the rumours that have abounded over the last few days about Apple’s supposed new product, the iWatch. Even though it is pure speculation at this stage and has been greeted by Apple with their usual stoic silence, it’s a great opportunity to look at the whole area of wearable computing. Wearable computing, long a mainstay of science fiction, is about to become a reality, with many tech evangelists claiming it will be the next big thing. Tech analyst Juniper Research estimates that wearable computing will generate €600m in revenue this year and €1.25bn in 2014, with annual unit sales rising from 15m in 2013 to 70m by 2017. The demand certainly seems to be out there: in a previous post we mentioned Pebble, the watch-like smartphone accessory that raised over $10m in their Kickstarter campaign (100 times their goal of $100,000). Numbers like this will surely encourage manufacturers to consider similar designs. Rather than wasting time discussing a product that (at least for the moment) doesn’t exist, the remainder of this article (and indeed my next couple of posts) will concentrate on products that have got at least as far as the prototype stage and in some cases are already available to buy (for more on the mythical iWatch see http://www.guardian.co.uk/technology/2013/feb/18/iwatch-apple-tv).
Glasses – Google Project Glass
In April last year Google released the video below demoing “Project Glass”, their Augmented Reality (AR) glasses that allow the user to access information and interface with their smartphone. Although not a new idea (portable computing with a heads-up display (HUD) has been around for a number of years; see the work of Thad Starner and Steve Mann), this is the first time something like this has even vaguely resembled a mainstream product.
There is more going on in the video above than AR, of course. Like other mobile technologies (particularly the smartphone to which they are connected), the glasses are context aware and the user interacts with the technology using natural language. Some commentators have accused Google of completely missing the point with Project Glass, however, claiming that instead of mediating and augmenting your connection with reality, the glasses mediate and augment your connection to Google services. Since the project was announced in April, the glasses have made an appearance at New York fashion week, jumped out of an aeroplane on skydivers, and even had a brief cameo being worn by Google founder Sergey Brin on a subway. The latest sneak peek into the possibilities offered by Project Glass, now simply called Glass, is the video below, which was released on February 20th.
Despite all the publicity, whether this is something real people actually want is still uncertain. If it is, however, it could mean a huge change in how we interact with mobile technologies. Large touch screens would not be necessary on smartphones; in fact we wouldn’t have to take them from our pockets. Any surface could be a keyboard, or we could just speak text (when appropriate, of course). How might this affect visually impaired users? Is there any alternative to AR for them, or do they just stick with legacy touch screens until they are no longer supported? These negatives aside, there are plenty of positives in terms of possible AT applications for this technology. Any ideas for AT uses?
It’s worth mentioning that Google aren’t the only ones looking at this technology; other companies hoping to introduce smart glasses include Vuzix and the camera manufacturer Olympus.