Let's talk about Dr. Lennart Nacke's favorite thing, as he stated in class: biometric controllers. A biometric controller is one that reads impulses sent by your brain, or more familiar signals such as your heart rate, your facial expressions, and your eye or head movements. These controllers allow more intuitive inputs to be sent into the game, including some inputs you weren't even aware of, and that can change gameplay in ways never imagined before. A horror game that measures your heartbeat would know what scares you the most, when to give you a break, and when to throw in the monsters. Facial expression technology could show up in MMORPGs, making games like Second Life even more realistic, since players could communicate with more than just text: their real-life facial expressions. You might have to fake a laugh in the real world after typing "lol" at a stale joke. Dating deception on all levels of reality! Eye tracking and head tracking could make first-person shooters much easier to control. There would be no need for that awkward second analog stick used to turn the camera in-game; just turn your head, or focus your eyes on the thing you want to see. As Gabe Newell put it, biometric inputs allow more bandwidth between the player and the game, because these things come naturally and don't interrupt your play (Sottek & Warren, 2013).
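To make that horror-game idea concrete, here's a minimal sketch of heart-rate-driven pacing. Everything in it is hypothetical: read_bpm() is a stand-in for a real sensor API (simulated here with random numbers), and the baseline and panic threshold are values I made up; a real game would calibrate them per player.

```python
import random
import time

RESTING_BPM = 70     # assumed baseline; a real game would calibrate this per player
PANIC_RATIO = 1.4    # 40% above resting: the player is already stressed

def read_bpm():
    # Stand-in for a real heart-rate sensor API (hypothetical); here we simulate.
    return random.gauss(RESTING_BPM * 1.2, 15)

def choose_pacing(bpm):
    """Map the player's heart rate to a pacing decision."""
    if bpm > RESTING_BPM * PANIC_RATIO:
        return "breather"   # terrified player: ease off, no monsters
    return "ambush"         # calm player: time to throw in the monster

if __name__ == "__main__":
    for _ in range(5):
        bpm = read_bpm()
        print(f"{bpm:5.1f} bpm -> {choose_pacing(bpm)}")
        time.sleep(0.2)     # a real game would poll the sensor continuously
```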
I actually got to try NeuroSky's MindBand (pictured below) during the summer, when the development team for AntiMatter was integrating it with their game. The readings were hard to interpret: the team tied the "concentration" reading to the maximum number of bullets the player could shoot per second. I'm not sure what constitutes concentration, but thinking about nothing seemed to ramp it all the way up to 100 for me.
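I never saw the AntiMatter team's actual code, so this is purely my guess at the mapping, assuming the headset reports concentration on a 0 to 100 scale and the rate cap is a tuning choice:

```python
MAX_BULLETS_PER_SEC = 10   # hypothetical cap; the real value was the team's call

def fire_rate(attention):
    """Scale a 0-100 'concentration' reading to bullets per second."""
    attention = max(0, min(100, attention))   # clamp noisy sensor readings
    return MAX_BULLETS_PER_SEC * attention / 100

# Thinking about nothing apparently pegged my reading at 100:
print(fire_rate(100))   # -> 10.0 bullets per second
```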
Picture of the MindBand (NeuroSky, 2011)
Judging from my experience with the MindBand, biometric controllers still seem a long way from reliably giving good readings. However, VentureBeat believes that 2013 will be the year these technologies coalesce and build the foundation of what it calls "NeuroGaming" (Lynch, 2013). The success of these technologies will be up to game designers, as they will be the ones tweaking the parameters that make the technology actually usable. I for one am most excited about head tracking, a controller that probably can't go wrong.
Lynch, Z. (2013, January 17). Let the neurogames begin. Retrieved from http://venturebeat.com/2013/01/17/let-the-neurogames-begin/
NeuroSky. (2011). NeuroSky MindBand Europe. Retrieved from http://www.home-of-attention.com/en/shop/1/flypagetpl/shopproduct_details/4/itemid-12
Sottek, T. C., & Warren, T. (2013, January 8). Exclusive interview: Valve's Gabe Newell on Steam Box, biometrics, and the future of gaming. Retrieved from http://www.theverge.com/2013/1/8/3852144/gabe-newell-interview-steam-box-future-of-gaming
Please turn your attention to this short sci-fi film called R'ha.
The premise is one we've seen many times before in movies like Terminator, The Matrix, and TRON: Legacy: a sentient AI decides to take over and/or destroy all of humanity, with varying degrees of cruelty and success. So what makes the short film above different? It was made entirely by a single student, 22-year-old Kaleb Lechowski, in six to eight months (sources vary). Another thing that stands out is that the protagonist is an alien, which brings an idea to mind: is it possible advanced alien races have already destroyed themselves? Will our reliance on technology bring about the same fate? But now that's getting off-topic; we're here to talk about Human-Computer Interaction (HCI).
In the short, the alien is able to communicate with the robot through speech, so the artificial intelligence (AI) has speech recognition. This may simply be because it would have been less dramatic and extremely awkward if the captive had to type answers on a keyboard while the computer prompted with text on a screen (though The Matrix pulls this off to great dramatic effect near the beginning of the movie). In any case, speech recognition allows speedy communication between alien and machine. The AI also shows a deep understanding of the alien's thought process, which in HCI is actually very desirable. Wouldn't it be nice for a computer to continuously learn from your mistakes, adapt to the common ones, quickly correct them for you, and thereby speed up productivity? Of course, in the short film the AI uses this advantage to trick the protagonist and lead the machines straight to his race's refuge point. (Though I have to wonder: with such advanced technology, wouldn't the AI have access to mind reading, making this whole charade kind of pointless?)
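To make that "learn from your mistakes" idea concrete, here's a toy sketch of my own (not any real product's algorithm): an autocorrector that only starts fixing a typo after it has watched you make the same correction a few times.

```python
from collections import Counter

class AdaptiveCorrector:
    """Learns a user's habitual typos and fixes them automatically."""

    def __init__(self, min_evidence=3):
        self.corrections = Counter()   # (typo, fix) -> times the user made this fix
        self.min_evidence = min_evidence

    def observe_fix(self, typo, fix):
        # Called whenever the user manually corrects a word.
        self.corrections[(typo, fix)] += 1

    def autocorrect(self, word):
        # Apply the most common learned fix, once it looks habitual.
        candidates = [(n, fix) for (typo, fix), n in self.corrections.items()
                      if typo == word and n >= self.min_evidence]
        return max(candidates)[1] if candidates else word

corrector = AdaptiveCorrector()
for _ in range(3):
    corrector.observe_fix("teh", "the")
print(corrector.autocorrect("teh"))   # -> "the"
```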
Let's now move to the virtual world of Oz within the movie Summer Wars, in a much more familiar solar system.
Oz is accessible from multiple platforms, from computers to cellphones to a Nintendo DS clone, making the virtual world extremely portable; the controls must be translated to these very different interfaces. The controls are very important, as there is also a gaming community within Oz. However, it may be impossible to translate all the in-game controls, and many hardcore gamers in Oz seem restricted to a keyboard and mouse (I assume so, since the only gamer shown uses a keyboard throughout the movie). The rendering must be done in the cloud, because smartphones and other portable systems probably don't have enough processing power to render millions of avatars. (Spoiler: that happens to be required during the movie, when 20 million gather in a certain place for a certain awesome reason.) This means the player's view into the world of Oz is rendered on the server and streamed to them. (Or it may be the case that it's the future and all cellphones have an nVidia GeForce GTX+ 9999.) The world of Oz also has instant language translation, allowing people all over the world to access the community. Streamed graphics and instant language translation are extremely accommodating features that let Oz be reached by anyone on any platform, regardless of hardware specs and language barriers. This universal usability may be one of the reasons Oz has so many users from all over the planet (or because it's just the setting for the movie).
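Here's a tiny sketch of that thin-client idea, under my own assumptions (the render function is a hypothetical stand-in for the expensive server-side GPU work): the server draws the scene and ships a compressed stream, so the client's hardware barely matters.

```python
import zlib

def render(viewpoint, avatars):
    # Stand-in for the server-side renderer: in reality this is the heavy
    # GPU work that a phone or DS clone could never do for millions of avatars.
    return f"frame@{viewpoint} with {len(avatars)} avatars".encode()

def stream_frame(viewpoint, avatars):
    frame = render(viewpoint, avatars)
    return zlib.compress(frame)   # the client only decodes video; it never renders

packet = stream_frame("plaza", avatars=range(20_000_000))
print(len(packet), "bytes downstream")   # cheap for the client, on any platform
```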
The trailer above is misleading: Summer Wars also features a program that was created by humans and accidentally ends up hell-bent on destroying them. The program is a genetic algorithm that continuously learns from its surroundings by playing games; it probably uses some form of reinforcement learning, but now we're getting off-topic again and talking about AI. Then again, I believe AI is a very important part of HCI, and it will become even more important as the technology improves. Not only does AI allow easier, faster, and more efficient ways of interfacing with humans, it has the potential to build upon itself. Take voice recognition: it lets speech be recognized and a set of instructions be given to a program using only speech. This is a very fast and intuitive way for humans to interact with computers, because it is so similar to interacting with other humans. AI learning also allows instructions to be implied, with the computer doing things you never directly asked for but that it has learned are regular occurrences. For example, suppose you command your computer to make you some coffee. After you do this for a week at around the same time, say 6 PM, the computer infers that coffee is probably needed around 6 PM every day and makes it anyway, even though you didn't ask. "Why, thank you," you'd probably say to your computer as it kindly produces a mug of coffee when you arrive home. As technology improves, human-to-computer interaction becomes human-to-robot interaction and then human-to-pseudo-human interaction.
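That coffee scenario is easy to sketch. Here's a toy version of my own (the week-long threshold and hour-level granularity are made up) that promotes a command to a standing order after seeing it at the same hour for enough days:

```python
from collections import defaultdict
from datetime import datetime

class HabitLearner:
    """Toy sketch of implied instructions: after seeing the same command
    at roughly the same hour enough times, do it unprompted."""

    def __init__(self, days_needed=7):
        self.log = defaultdict(int)   # (command, hour) -> times observed
        self.days_needed = days_needed

    def record(self, command, when):
        self.log[(command, when.hour)] += 1

    def implied_commands(self, now):
        # Anything requested at this hour often enough is now a standing order.
        return [cmd for (cmd, hour), n in self.log.items()
                if hour == now.hour and n >= self.days_needed]

learner = HabitLearner()
for day in range(1, 8):   # a week of 6 PM coffee requests
    learner.record("make coffee", datetime(2013, 2, day, 18, 0))
print(learner.implied_commands(datetime(2013, 2, 8, 18, 5)))   # -> ['make coffee']
```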
As we continuously improve AI, we may reach a point where AI can itself propagate better AI, which is what creates the technological singularity. The singularity may be the end of mankind itself, or the beginning of an infinite expansion of knowledge and understanding: computers that understand humans even better than humans understand themselves, something that can answer all of our questions. That would be an infinitely more awesome way for humans to interact with computers.