The MMI (man-machine interface) field is both new and old. We have been interacting with our tools since we threw our first stone weapon to hunt for food. The difference in 2004 is that machines are now capable of intelligence. Compared to the human brain, which is millions of years old, computers are still at the single-cell stage. As the human brain has evolved, it has adapted itself well to a changing environment. The brain senses five main inputs: light (vision), sound (hearing), chemical molecules in gas form (smell), chemical molecules in liquid form (taste) and touch (basic sensor input).
Light is perceived as presence/absence, tone, brightness and colour (frequency).
Data, information, knowledge: that is how processing occurs, with raw data refined into information and information consolidated into knowledge.
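The data-information-knowledge hierarchy can be made concrete with a toy sketch; every value, name and threshold below is hypothetical, chosen purely for illustration:

```python
# Toy illustration of the data -> information -> knowledge hierarchy.
# All values and thresholds are hypothetical.

# Data: raw light-sensor readings (arbitrary units).
readings = [90, 95, 98, 97, 92, 96]

# Information: data summarised into something meaningful.
average_brightness = sum(readings) / len(readings)

# Knowledge: information interpreted so it can guide action.
state = "daylight" if average_brightness > 50 else "darkness"

print(average_brightness, state)
```

The same pattern scales up: sensors deliver data, processing extracts information, and interpretation yields knowledge that can drive behaviour.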
If data is exploding in size, will the brain evolve to process all of it, or simply filter the information selectively and use what it needs? We sense the world through vision (light), hearing (sound), smell (chemicals as gas), taste (chemicals as liquid) and touch (attributes of objects such as temperature, three-dimensional shape, size, etc.).
Humans predominantly use vision to take in data. Modern machines present data mainly through sight and hearing, with some use of smell.
Must data be processed by the sensory end organs before it travels to the brain? Cochlear implants prove that it need not be. Can we improve the efficiency of data acquisition? Does the brain work better when cooled, as PCs do? How does the brain learn? What is memory? Can we read from and write to the brain's memory directly? Is magnetism a way to reach the brain non-invasively and interact with it?
In ancient times hearing was the main means of data acquisition (in India); today vision is primary. What is the real world like when it is not interpreted through a human brain? Does perception affect our basic laws of science? Einstein's theory of relativity says time is relative to the observer. Is sensation also relative to the observer?