A great example of this is Apple: Steve Jobs ensured a consistent, simple and familiar Graphical User Interface (GUI). On the iPhone, for example, you slide your finger quickly across the screen to browse, much like flipping pages in a book: my four-year-old niece can even search YouTube with ease.
However, academics focused on algorithms argue that HCI is a waste of time that dumbs down our technology and, consequently, the people who use it. To me this opinion is incredibly shortsighted: it disregards the general consumer of computer technology and, more importantly (for the crudely business-minded), the potential consumer.
How many times have you struggled to teach a GUI, most likely Windows, to someone over 50? Next time, try them out on an iPad and see if it makes a difference.
Much Western, English-language HCI, especially the more affordable kind, assumes a more-than-basic understanding of computers that can only be learnt through the osmosis of ubiquitous computing. When this kind of practical technological education is not readily accessible to the general public, it hinders their access to the technology, because it makes the task of learning too cumbersome and tedious. Add to that the expectation of our socially constructed idioms, riddled throughout the OS, and projects are set up to fail.
How does this change occur? Certainly not with One Laptop Per Child, which has proved a failure in its implementation. The idea is not to dumb down the technology but to create HCI for a different user. As Microsoft did with Windows 7, maybe NGOs like One Laptop Per Child should follow up on their own projects and have their users be the testers of new GUIs. Maybe even check how ineffective the project itself has been?
Let my prediction stand: One Tablet Per Child will fail just the same.