Ah, the patent application process. Without it, we wouldn’t have had early warning of motion controllers, Sony’s streaming ambitions or the weirdly ugly people Microsoft expects to be playing its next console. Well, now it’s turned up a slightly odd application from Sony Computer Entertainment. SCE, it seems, wants to monitor your nervous system. The application claims Sony has come up with a way to measure nerve activity in “one or more body parts” and predict your next course of action.
Here’s the application introduction:
An electronic device may be controlled using nerve analysis by measuring a nerve activity level for one or more body parts of a user of the device using one or more nerve sensors associated with the electronic device. A relationship can be determined between the user’s one or more body parts and an intended interaction by the user with one or more components of the electronic device using each nerve activity level determined. A control input or reduced set of likely actions can be established for the electronic device based on the relationship determined.
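The flow the abstract describes — measure activity levels per body part, relate them to intended interactions, then narrow down a “reduced set of likely actions” — could be sketched very loosely like this. Everything here (the sensor readings, the threshold, the action map) is invented for illustration and isn’t taken from the filing:

```python
def likely_actions(nerve_levels, action_map, threshold=0.5):
    """Return the reduced set of actions whose associated body parts
    show nerve activity above the threshold."""
    active_parts = {part for part, level in nerve_levels.items()
                    if level >= threshold}
    return [action for action, parts in action_map.items()
            if parts & active_parts]

# Hypothetical readings from "one or more nerve sensors"
readings = {"right_thumb": 0.9, "left_thumb": 0.1, "right_wrist": 0.7}

# Map each possible control input to the body parts that drive it
actions = {
    "press_x": {"right_thumb"},
    "press_dpad": {"left_thumb"},
    "flick_controller": {"right_wrist"},
}

print(likely_actions(readings, actions))  # ['press_x', 'flick_controller']
```

In other words, the console wouldn’t need to read your mind, just rule out the inputs your nerves clearly aren’t preparing to make.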
Those of you with really good memories might recall that Microsoft is also trying to do something similar with biometrics. Will the next generation of consoles interact with us on an instinctive level?
Sony Computer Entertainment has also filed an application for software to aid in the development of heads-up displays. That one says it is for “presenting a graphic display of information as it is normally produced by a process implemented with computer software; selecting an arbitrary range of objects within the graphic display; applying one or more filters to the processing of the objects in the arbitrary range; and changing the processing of the objects dynamically in response to the filters.” That sounds as if it has more to do with the headset technology Sony is still perfecting, but it seems equally well suited to the next generation of home consoles.
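Translated out of patent-speak, that quote amounts to: pick an arbitrary range of on-screen objects and run one or more filters over just those objects. A rough, purely illustrative sketch, with a made-up object model and made-up filters:

```python
def apply_filters(objects, start, end, filters):
    """Apply each filter, in order, to the objects in the range [start, end)."""
    for i in range(start, end):
        for f in filters:
            objects[i] = f(objects[i])
    return objects

# Hypothetical HUD objects, each with a brightness attribute
hud = [{"id": n, "brightness": 1.0} for n in range(5)]

# Two example filters: dim an object, or flag it as highlighted
dim = lambda obj: {**obj, "brightness": obj["brightness"] * 0.5}
highlight = lambda obj: {**obj, "highlight": True}

apply_filters(hud, 1, 4, [dim, highlight])
print(hud[2])  # {'id': 2, 'brightness': 0.5, 'highlight': True}
```

The “dynamically in response to the filters” part would presumably mean re-running something like this every frame as the selected range or the filters change.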
Finally, for today at least, SCE has also filed an application for a “Gaze-Assisted Computer Interface.” That system can detect where a user is “gazing” and process that as a selection within software. It can also, interestingly, use the detected gaze location to predict where a user’s physical interaction was intended to end up. This might be a way to refine the calibration of motion controllers, but it also sounds like it could be a boon for assisted computing for people with disabilities.
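The “predict where the interaction was intended to end up” idea can be sketched as a simple snap-to-gaze correction: if the eyes are resting on a target, assume that target was intended even when the physical pointer lands slightly off. This is a hypothetical illustration, not the method in the filing:

```python
import math

def predict_target(gaze_point, cursor_point, elements, gaze_radius=50):
    """Return the element nearest the cursor among those close to the
    gaze point, or None if nothing lies within gaze_radius of the gaze."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    near_gaze = [e for e in elements if dist(e["pos"], gaze_point) <= gaze_radius]
    if near_gaze:
        return min(near_gaze, key=lambda e: dist(e["pos"], cursor_point))
    return None

buttons = [{"name": "OK", "pos": (100, 100)}, {"name": "Cancel", "pos": (300, 100)}]
# The gaze rests on OK while the physical pointer overshoots slightly
target = predict_target(gaze_point=(105, 98), cursor_point=(130, 110), elements=buttons)
print(target["name"])  # OK
```

For a motion controller that drifts out of calibration, or for a user with limited fine motor control, that kind of correction is exactly where a gaze signal would earn its keep.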