How Applications Learn
I'm currently reading an excellent book called How Buildings Learn by Stewart Brand. In it he considers the ways buildings evolve over time, changing in ways architects don't anticipate, and draws conclusions about how architecture must change to address this evolution. Brand argues for a more user-centered architecture that cares how a building's occupants will use the space for years to come. It sounds obvious, but studies of how people actually use a building are rarely done, and architecture awards are given based on photographs, not interviews with a building's inhabitants. It's timely reading for me because we've been looking at houses to buy in Corvallis, and this book is changing the way I think about buildings. It's also changing the way I think about Web development.
I'd like to see someone put together a similar study called How Applications Learn. Using Brand's book as a template, I bet you could take many of his theories and see if they apply to application development. Showing screenshots of the first version of Word, and each version leading up to the latest, would be like showing a house with floors, rooms, and inexplicable hallways added over the years. If cathedrals are "High Road" development and warehouses "Low Road" (as Brand labels them), there must be application equivalents. What are the '70s domes of coding? Are some applications considered an "investment" while not actually solving a real-world problem? How does the structure of MMORPGs limit or embrace the ways their players can add to the environment? Which programs are truly complex and which are "decorated sheds"? User research is often done early in the design process, but what are some methods for ensuring user-centered design well into the life of the application, after features have been added where needed?
There wouldn't be as much history and tradition to draw on for How Applications Learn, but it's never too early to think about how time affects our virtual structures.