Tuesday, April 19, 2011

Thoughtful Chunk 11: Adaptive House or Adaptive Human?

The Adaptive House as a Schoolhouse around 1926

Professor Michael Mozer, of the Department of Computer Science and the Institute of Cognitive Science at the University of Colorado, Boulder, created a so-called adaptive house. That is, Mike Mozer bought an old building and redesigned it to be alert and responsive to his activities within it: turning lights on and off and adjusting heating as it monitored his movements, for example. The following video was hosted on Extreme Homes, and in it Mike Mozer chronicles the transformation. He tells the history behind his decision to attempt the creation of this "Neural Network House", a type of computer system that learns.


At Mike Mozer's website you can view several videos that demonstrate the networked system in action. According to Norman (2007), the neural network operating system is "designed to mimic the pattern-recognition and learning abilities of human neurons, thus the human brain" (p. 119). Norman asks whether this house is actually smart or intelligent, but Mike Mozer says that adaptive is a better descriptor, as the house changes or "improves" its behavior in response to preferences that Mozer indicates: it "learns" what Mozer will expect. Some of these adaptive behaviors can be viewed on the linked site above.
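To give a flavor of what "learning a preference" means here, the following is a minimal toy sketch in Python. It is emphatically not Mozer's actual control system, just a single artificial neuron (a logistic unit) that learns a hypothetical occupant's lighting preference from the hour of day; the training data (lights wanted from 18:00 to 23:00) is invented for illustration.

```python
import math
import random

# Toy sketch, NOT Mozer's actual system: one logistic neuron that
# learns when a hypothetical occupant wants the lights on, given
# only the hour of day.
random.seed(0)

def features(hour):
    # Encode the hour cyclically so 23:00 and 0:00 count as "close".
    angle = 2 * math.pi * hour / 24
    return [math.sin(angle), math.cos(angle), 1.0]  # last entry = bias

def predict(weights, x):
    z = sum(w * xi for w, xi in zip(weights, x))
    return 1 / (1 + math.exp(-z))  # probability the light should be on

# Invented occupant preference: lights on from 18:00 through 23:00.
data = [(h, 1 if 18 <= h <= 23 else 0) for h in range(24)]

weights = [0.0, 0.0, 0.0]
lr = 0.5
for _ in range(5000):  # plain stochastic gradient descent
    hour, label = random.choice(data)
    x = features(hour)
    error = predict(weights, x) - label
    weights = [w - lr * error * xi for w, xi in zip(weights, x)]

print(round(predict(weights, features(20)), 2))  # evening: should be high
print(round(predict(weights, features(9)), 2))   # morning: should be low
```

The real house monitored many more signals (motion, temperature, and so on) and had to balance comfort against energy cost, but the core idea is the same: adjust internal weights from observed behavior until the system's predictions match the occupant's habits.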


Interestingly, Norman mentions that Mozer would often remember that his house was anticipating his arrival at a certain time and would dutifully be preparing the appropriate ambiance, as it were: Mozer would consequently feel somewhat obliged to get home! Hence my title question, which asks which is adapting here: the house or the human?


Does this example signify what we could expect as more and more "adaptive" paraphernalia enters society? Imagine excusing yourself from further conversation with buddies because your "Assistant Robot" is running your bath at that very moment! Must get home! As Norman recognizes, this would become onerous: he exclaims, "This house sounds like a real nag" (p. 121). In truth, this conundrum occurs because this house and other such "robotized" systems are actually unable to read the human mind, which can be unpredictable and even nonsensical at times. Norman would describe these incidents as exemplifying the lack of "Common Ground" (pp. 49-55). In his words:


The lack of common ground is the major cause of our inability to communicate with machines. People and machines have so little in common that they lack any notion of common ground. People and people? Machine and machine? That's different: those pairs function quite well. People can share with other people. Machines can share with other machines. But people and machines? Nope. (Norman, 2007, pp. 50-51) 

I'm just not sure what choices we will make, or even should make, as we continue to advocate for, and develop, supposedly intelligent or smart technologies: will we opt for adaptive technologies or adaptive humans?
