Tuesday, September 15, 2009

NML: Neighbourhood Markup Language - David Rokeby



PROJECT DESCRIPTION

I am not, by nature, a utopian dreamer. But I felt compelled in this instance to propose a constructive project with a positive slant. I decided to try to imagine a new kind of social frame, a way of reactivating the shared experience of public space, enabled by the properties of convergent devices. To my surprise, and despite my intent to dream well beyond implementability, I have ended up with a project that poses more social complications than technical ones.

In considering the issues raised by the project themes, particularly the implications of communications networks and devices with respect to locality, I found myself returning again and again to the persistent, spatial, scent-based animal communication networks, like the rich (to dogs) olfactory mappings of urine-marked trees and hydrants, and the marker trails that lead ants to food and back to the nest. And so I am proposing a system for annotating physical space.

All content in the proposed system is addressed (for both saving and accessing) using the GPS-determined location of the reader/writer. A message, image or sound saved from a certain location is only retrievable by someone physically located within a certain radius of that same position... it can only be summoned up in its physical and social context.
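The core retrieval rule described above can be sketched in a few lines. This is a minimal illustration, not part of the proposal: the function names, the 30-metre default radius, and the use of the haversine formula are my assumptions, chosen only to make the location-addressed access concrete.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (haversine formula)."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_range(reader_pos, annotation_pos, radius_m=30.0):
    """An annotation is retrievable only when the reader is physically
    within radius_m of the position from which it was saved."""
    return haversine_m(*reader_pos, *annotation_pos) <= radius_m
```

For example, an annotation saved at one street corner would be invisible from a corner a kilometre away, since `in_range` returns False the moment the reader's fix falls outside the radius.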

Annotations might include reminder notes left for oneself or friends (perhaps encrypted), personal observations, public notices, sounds, historical or architectural notes (official and personal), memorials, recommendations of interesting side-streets for the casual walker, exchanges of views on local issues, a story, a game, personal tours, markers for a treasure hunt.

Access is possible from any wireless networked portable computing device with a GPS unit.
The user would be able to configure the device to continuously scan the content attached to the immediate vicinity for the presence of annotations, with customizable filters that reduce local data clutter to the annotations of greatest interest to the user. Things already accessed would be marked as read and filtered out as well, unless intentionally called up. As the aim is not to further fragment public space by encouraging people to walk around with faces glued to small LCD screens, audio would be a preferred format for the annotations.
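The scanning loop described above, vicinity check, user-configured filters, and read-marking, might look something like the following sketch. Everything here is illustrative: the `Annotation` fields, the `Scanner` name, the filter-by-kind scheme, and the equirectangular distance shortcut are my assumptions, not part of the proposal.

```python
import math
from dataclasses import dataclass, field

def distance_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation: adequate at neighbourhood scale.
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6371000.0 * math.hypot(x, y)

@dataclass
class Annotation:
    id: str
    lat: float
    lon: float
    kind: str  # e.g. "note", "audio", "memorial" (hypothetical categories)

@dataclass
class Scanner:
    radius_m: float = 30.0
    kinds_of_interest: set = field(default_factory=lambda: {"audio", "note"})
    read_ids: set = field(default_factory=set)

    def scan(self, lat, lon, annotations):
        """Return unread annotations of interest within range of (lat, lon)."""
        hits = [a for a in annotations
                if a.id not in self.read_ids          # already-read items filtered out
                and a.kind in self.kinds_of_interest  # user-configured filter
                and distance_m(lat, lon, a.lat, a.lon) <= self.radius_m]
        for a in hits:
            self.read_ids.add(a.id)  # mark as read so it is skipped on the next pass
        return hits
```

Run periodically against the device's current fix, `scan` surfaces each annotation at most once, unless the user intentionally calls it back up by clearing it from `read_ids`.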

The device would indicate, perhaps through vibration, when data comes into range. On the other hand, a discreet but distinctive audible indicator (the social calls of crickets or frogs?) might be interesting as a signifier of data reception. Having a sound that is not personalizable might result in a positive confusion: "It was not my device, but then what is here that someone else is interested in..." Browsing or searching the entire set of annotations for one's current position would be possible through a familiar web-style interface.
