
Headset Brain Computer Can Help Occupants Control Their Environment

Maria Lorena Lehman


So often, interactive and adaptive architectural interfaces must rely on cues picked up either from occupant behaviors or from objects within an environment that move, change, or transmit other real-time information. And with these types of cues comes concern from building occupants about how “control” will be established between them and their surrounding built environment. For if a building is indeed adaptive, where are the control points? Who sets the rules? And how can the resulting transient architectural behavior feel seamless to both the building system and its occupant?

Well, an exciting new brain-computer interface has been demonstrated as a fresh way for users to interact with their machines, and I think such technology can serve as a liaison between occupants and their buildings. Created by Emotiv Systems, this head-worn device literally allows one to signal change simply by using one’s own thinking power. The wireless interface takes only a few minutes to put on, and suddenly there is so much that can potentially be done to alleviate the problem points with which many of today’s interface technologies struggle.

Within an adaptive building, such technology could greatly ease the way a building and its occupants communicate. While privacy is indeed a concern, there is also an element of control here: the wearer of the interface must actively visualize a change in order to create the change they wish to experience.

As you will see in the video (at the end of this article), this head-worn device may seem a bit clunky by today’s standards, but if you imagine where such technology might take us, you will see that the rippling effects on usability can be far-reaching. Not only can such a device impact the many uses for augmented reality, where someone using the technology can simply visualize an action through thought and thus create consequential behaviors in a virtual world, but it can also improve interactions in real-life applications by enhancing a user’s thinking power as they engage with their surroundings. Think smart buildings here.

Just imagine that within your own home you could change a certain lighting condition, shift the transparency of window glass, or adjust the temperature of a room simply by visualizing the action you would like to see carried out. Although at this point some of this may seem quite “magical”, there is very real potential for this not only to work, but to have profound and positive life-changing benefits for those who not only use it, but need it. (What this can do for accessibility within buildings could change the landscape of where we are today.)
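To make this concrete, here is a minimal sketch of how visualized commands from a headset might be mapped onto building actions. It is purely illustrative and assumes a hypothetical command format: the MentalCommand structure, the command labels, the confidence threshold, and the building-control functions are stand-ins of my own, not part of Emotiv’s actual software.

```python
# A minimal, hypothetical sketch: none of these names come from the Emotiv SDK.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class MentalCommand:
    label: str         # e.g. a visualization the wearer has trained, like "dim_lights"
    confidence: float  # how strongly the headset detected that visualization


def set_lighting(level: float) -> None:
    print(f"Dimming lights to {level:.0%}")


def set_glass_opacity(opacity: float) -> None:
    print(f"Setting window glass opacity to {opacity:.0%}")


def set_temperature(celsius: float) -> None:
    print(f"Setting room temperature to {celsius} °C")


# Map each trained visualization onto a concrete building action.
ACTIONS: Dict[str, Callable[[], None]] = {
    "dim_lights": lambda: set_lighting(0.3),
    "clear_glass": lambda: set_glass_opacity(0.0),
    "warm_room": lambda: set_temperature(22.5),
}


def handle_command(cmd: MentalCommand, threshold: float = 0.8) -> None:
    """Act only on commands the headset detected with high confidence."""
    action = ACTIONS.get(cmd.label)
    if action and cmd.confidence >= threshold:
        action()


if __name__ == "__main__":
    handle_command(MentalCommand("dim_lights", confidence=0.92))  # acts
    handle_command(MentalCommand("warm_room", confidence=0.55))   # ignored
```

Acting only on strongly detected visualizations is one simple way to keep stray thoughts from accidentally reconfiguring a room.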

Getting this “Magical” Headset to Sync with Other Building Systems

However, such a brain computer does not eliminate the need for transient architecture to look for other cues and to continue developing mechanisms within its own systems for making sense of incoming data. And of course, an adaptive building system must take into account more cues than just those found in an occupant’s visualizations and in the behavior of the objects within the building environment. Adaptive architecture must also account for things like external environmental conditions, and for the needs of both a collective body of occupants and those of the individual. It must synchronize with all of these.
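As a thought experiment, the sketch below shows one way such a building controller might reconcile an individual’s headset-driven request with collective and environmental cues. The cue structure, the clamping band, and the weather adjustment are assumptions made for illustration, not a description of any real building system.

```python
# A hypothetical reconciliation of individual, collective, and environmental cues.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Cues:
    occupant_setpoint: Optional[float]  # °C requested via the headset, if any
    collective_setpoint: float          # comfort target for the shared space
    outdoor_temp: float                 # external environmental condition


def resolve_temperature(cues: Cues, max_individual_offset: float = 1.5) -> float:
    """Honor an individual's visualized request, but only within a band
    around the collective setpoint, and ease the target slightly when it
    fights very hard against the outdoor conditions."""
    target = cues.collective_setpoint

    if cues.occupant_setpoint is not None:
        low = cues.collective_setpoint - max_individual_offset
        high = cues.collective_setpoint + max_individual_offset
        target = min(max(cues.occupant_setpoint, low), high)

    if abs(target - cues.outdoor_temp) > 15:
        target += 0.5 if cues.outdoor_temp > target else -0.5

    return round(target, 1)


if __name__ == "__main__":
    cues = Cues(occupant_setpoint=25.0, collective_setpoint=22.0, outdoor_temp=2.0)
    print(resolve_temperature(cues))  # clamped to 23.5, then eased to 23.0
```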

Taking all of this into account, I do think this Emotiv Systems brain-computer interface is taking real steps toward thinking outside the box. As more ways for occupants to interact with their built environment come to the forefront, adaptive architecture will be that much better for it, because it will be able to make more sense of a building occupant’s goals, and take a more seamless approach to getting them there.

The following is a video in which Tan Le, the head of Emotiv Systems, explains how this brain-computer interface technology works. Within the video you will see a live demonstration that is quite amazing to watch, and no doubt you will immediately be struck by other ideas on how this headset technology could impact architectural design through more far-reaching applications.

[Embedded video: Tan Le demonstrates the Emotiv brain-computer interface]

Image Credit: © on_the_wings | Flickr
