
ETCON: New thinking and ideas

February 14, 2004 :: by Lee

Amidst the technologies, research and demos of new tools and products that were showcased at ETCON, a few ideas and snippets of thinking caught my eye:

Language and Emergent Metadata
Ambiguity and diversity
Emotional Design and self-expression
Distributed intelligence and evolving applications

Language and Emergent Metadata

I had some interesting conversations about trying to support emergent metadata in community-based online systems, which is an issue of immediate concern for us and some of the projects we are involved in. People working on some potentially very interesting projects are thinking about this as a way of enabling users to construct their own views of content, and navigation systems that support their own conceptual models as well as their basic interests. George Lakoff writes about the way issues are framed in political discourse in the United States (which reminds me: I will *really* miss the extraordinary Fox News channel...) and seeks to redress the imbalances created by conservative think tanks in the 1980s, as we have discussed previously. One approach discussed is a fantastic example of this: the metadata required is as simple as a single free-text tag that helps organise personal content, but if this happens to match what other people are using then - bingo - instant emergent metadata associations, and the most popular terms get promoted to the front page, creating a neat positive feedback loop. We have also copied this approach recently in a local online community content aggregation experiment, which we will release next week for beta testing.
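The mechanism is simple enough to sketch in a few lines. This is a minimal illustration of the idea described above, not any particular site's implementation; the data and names are invented:

```python
from collections import Counter

# Each item of user content carries a single free-text tag,
# chosen independently by each user to organise their own content.
posts = [
    {"user": "alice", "tag": "folksonomy"},
    {"user": "bob",   "tag": "folksonomy"},
    {"user": "carol", "tag": "emergence"},
    {"user": "dave",  "tag": "folksonomy"},
]

# Emergent metadata: tags that several users happen to share form
# associations, and the most popular get promoted to the front page.
tag_counts = Counter(post["tag"] for post in posts)
front_page = [tag for tag, count in tag_counts.most_common() if count >= 2]

print(front_page)  # only tags chosen by two or more users survive
```

The point is that no taxonomy is imposed up front: the shared vocabulary, and the front-page promotion that reinforces it, falls out of individual tagging choices.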

I think this is a tremendously important idea for sites that seek to involve users in the way they operate. There are clearly areas where top-down metadata works well, and where there is broad agreement over terms, but there are other areas where users should themselves be able to define their own terms, language and linkages to reflect their own sense of identity and perspectives. Connecting both ends in a meaningful way is a real challenge. The key factor in whether this can work, in my view, is that it must not impose an overhead on content creation.

None of this is strictly new, I suppose, but talking to people who are also wrestling with its implementation is useful.

Ambiguity and diversity

Matt Webb was invited to present a tiny, inconsequential application that he has been working on called 'Glancing'. The app puts an eye in the Mac menu bar which connects the user vaguely and loosely with a group of friends. The eye starts closed, but as users choose to use the menu behind the eye to 'glance' at the group, it starts to open. It is based on the idea that in public (and especially in cultures such as the UK) people tend to exchange subtle but minimal glances prior to approaching somebody to engage them in conversation. These glances are ambiguous enough to cause no loss of face if they are ignored, and act as a kind of handshake mechanism to establish that it is OK to talk. Matt's application, deliberately very slow (so as not to attract too much attention from the user), is a simple way of achieving the same thing among online connected friends. Matt also related the tool to the idea of a 'stroke' (as in stroking a cat or a dog), which has been cited as the lowest level of interaction in transactional analysis applied to social encounters. Users can stroke their friends, virtually of course, by glancing at them, and this has a function akin to grooming among groups of primates.

His rationale for implementing it in the way he did was that eye contact is unconscious/involuntary, yet visible to others and requires presence. The application itself is designed to be polite and to require a low cognitive overhead. Matt feels that in cyberspace there is not enough ambiguity of interaction, partly because nothing is physically visible (except perhaps videoconferencing in a very limited way) and we have a tiny sensory surface with which to touch each other.
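A toy model makes the "slow, ambiguous signal" design concrete. This is a hypothetical sketch of the kind of state Glancing might maintain (the class, constants and decay rate are all invented for illustration, not taken from Matt's code):

```python
import time

class GlanceState:
    """Hypothetical model of the eye: glances nudge it open,
    and it drifts slowly closed again over time."""

    def __init__(self, decay_per_sec=0.01):
        self.openness = 0.0          # 0.0 = fully closed, 1.0 = wide open
        self.decay = decay_per_sec   # deliberately slow, polite decay
        self.last_update = time.monotonic()

    def _age(self):
        # Apply the slow drift back towards closed.
        now = time.monotonic()
        elapsed = now - self.last_update
        self.openness = max(0.0, self.openness - self.decay * elapsed)
        self.last_update = now

    def glance(self):
        # A glance is a low-cost, ambiguous signal: it opens the eye
        # a little rather than sending anyone an explicit message.
        self._age()
        self.openness = min(1.0, self.openness + 0.25)

    def level(self):
        self._age()
        return self.openness

state = GlanceState()
for _ in range(3):
    state.glance()
print(round(state.level(), 2))  # close to 0.75 just after three glances
```

The ambiguity lives in the representation: a half-open eye tells you the group has been glancing, but not who, when exactly, or why, so ignoring it costs nobody any face.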

He went on to talk about tenuously related (but extremely interesting) ideas about ubiquitous computing and our desire to create the physical in cyberspace. However, he leaves us with this thought:

Computing emerged from 1950s first order cybernetics, based on control systems and a binary object/message infrastructure, which is the basis of our current approach to programming. Second order cybernetics from the 1970s onwards talks about systems and self-organisation - it is about shape and binding. In order to pursue the benefits of second order cybernetics, we should look at the ideas behind phenotropic computing as espoused by Jaron Lanier.

Lanier's ideas are part of a search for a new model beyond objects/methods that is less binary and more forgiving of errors, noise and imprecision, much like the real world. We should look at pattern recognition - e.g. Bayesian spam filtering - as an approach to programming and data storage, so that when you update something in one place the system learns elsewhere.
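To make the Bayesian spam filtering reference concrete, here is a minimal naive Bayes classifier over a toy corpus. The training data is invented and the model is deliberately crude; it just shows the pattern-recognition style of computation being contrasted with brittle object/message programming:

```python
import math
from collections import Counter

# Tiny invented training corpora.
spam_docs = ["buy cheap meds now", "cheap meds cheap deals"]
ham_docs = ["meeting notes for tuesday", "notes on the project plan"]

def train(docs):
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

spam_counts, spam_total = train(spam_docs)
ham_counts, ham_total = train(ham_docs)
vocab = set(spam_counts) | set(ham_counts)

def log_score(word_counts, total, message):
    # Laplace smoothing stops unseen words from zeroing the score:
    # the system degrades gracefully in the face of noise.
    return sum(
        math.log((word_counts[w] + 1) / (total + len(vocab)))
        for w in message.split()
    )

def classify(message):
    spam_score = log_score(spam_counts, spam_total, message)
    ham_score = log_score(ham_counts, ham_total, message)
    return "spam" if spam_score > ham_score else "ham"

print(classify("cheap meds"))       # spam
print(classify("project meeting"))  # ham
```

Nothing here is an exact match on a rule; the classification is a statistical judgement that tolerates imprecision, which is the quality Lanier wants programming in general to have.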

Matt suggests this implies a need for the development of a new ethical sensibility analogous to environmentalism in the real world, and he recommends the work of Luciano Floridi on CyberEthics at Oxford University:

"What Floridi points out is that cyberspace is still relatively simple. The actions of a single individual can disproportionately affect the composition or evolution of the society that exists online. What's more, the composition of the environment quite directly affects the kinds of actions people can perform: the existence of the email protocol allows a new form of interpersonal communication."
"This combination - of being powerful and having clear consequences - puts us in a similar situation to what's happening in the real world with the environment. When humans became powerful enough to affect the environment on a global scale, a new kind of ethics emerged, one that gave value to things which might inadvertently be damaged ... In the context of cyberspace, Floridi calls this cyberethics."
"From Floridi's environmental cyberethics, wiki gardening and free software are the cyberspace equivalents of respecting rainforests and biodiversity."

Great stuff!

Emotional Design and supporting self-expression

Don Norman threw off the shackles of his past work in usability to give an entertaining keynote talk on emotional design, which Matt Jones covered and others wrote up as notes. The talk was basically a summary of his new book, Emotional Design: Why We Love (Or Hate) Everyday Things, and looked at the basic aspects of design - visceral, behavioural and reflective - and how even highly unusable products can make us fall in love with them.

This theme of emotional design and self-expression was present in (or at least relevant to) a number of other presentations such as Art-of-Logic, Experience Making, the Nokia Way, IVREA's Fluidtime, Matt Webb's Glancing and, in particular, Ludicorp's Virtual Worlds, Distributed Interaction.

This is an area of direct relevance to the development of new, more intuitive and personal social software. If we are looking for evidence to support Norman's thesis about the power of the emotional components surrounding our interaction choices, then perhaps Edward Castronova's presentation The Future of Cyberspace Economies provides some emerging economic indices.

Castronova has studied the economics of Massively Multiplayer Online Gaming (MMOG) and found that with 10-20m participants it is worthy of attention as an economy in its own right. He looks at the trend of selling virtual gaming elements (characters, weapons, powers, etc) in the real world and estimates that there is approximately $1 billion annual revenue in this trading system, giving the economy a per-capita GDP of around $2k. He foresees the growth of synthetic economies based on virtual products and concludes that whilst synthetic worlds will probably occupy more of our time, diverting resources from the real world economy to the virtual, this can potentially raise our average material well-being (see Trevor Smith's collaborative notes for more details).

Distributed intelligence and evolving applications

Eric Bonabeau's presentation Evolving the Bad Guy looked at the role of evolving applications and algorithms in solving unpredictable technological problems, and in finding loopholes in systems (hence the 'bad guy' in the title). He demonstrated that every system has loopholes (the tax system, software, frequent flier programmes, elections, etc) and simple algorithms armed with just a few rules can evolve to find and exploit these loopholes. He quoted the example of research into potential vulnerabilities caused by battle damage to warships, which showed that a software algorithm based on evolutionary principles had managed to identify potential combinations of 'hits' that could destroy a warship even though each of the individual 'hits' was non-lethal, and this helped its designers build in additional protection to avoid this scenario. In areas such as combating hackers, where software engineers are engaged in a war of attrition, there is a danger of the so-called Red Queen Effect - i.e. running to stand still. Bonabeau suggests that evolutionary computing is one approach to finding otherwise obscure small changes that can have big effects in such work, rather than continuing to rely on expensive over-engineered solutions.
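The warship example can be sketched as a toy evolutionary search. The damage model below is entirely invented (Bonabeau did not publish one here): each hit alone is non-lethal and costly, but one hidden pair of hits interacts catastrophically, and a simple select-and-mutate loop discovers it:

```python
import random

random.seed(42)

N_HITS = 12  # candidate hit locations, each non-lethal on its own

def damage(combo):
    # Hypothetical model: hits 2 and 7 together flood adjacent
    # compartments (the loophole the algorithm should discover);
    # otherwise extra hits are just wasted effort.
    lethal_bonus = 10 if (combo[2] and combo[7]) else 0
    return lethal_bonus - sum(combo)  # prefer few, well-chosen hits

def mutate(combo, rate=0.1):
    # Flip each bit with a small probability.
    return [bit ^ (random.random() < rate) for bit in combo]

def evolve(generations=100, pop_size=30):
    pop = [[random.randint(0, 1) for _ in range(N_HITS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=damage, reverse=True)
        survivors = pop[: pop_size // 2]  # selection: keep the best half
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in survivors]
    return max(pop, key=damage)

best = evolve()
print(best)  # the evolved combination of hits
```

The search converges on the sparse combination containing just the two interacting hits, which is exactly Bonabeau's point: evolution finds obscure small changes with big effects that exhaustive over-engineering would miss.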

There was also a fun talk from Scott Draves of Dreamworks about the development of the Electric Sheep screensaver, which uses distributed computing power to evolve virtual creatures through a shared evolutionary process.

What do you think?

