[Sigia-l] Self organising Information (Was Data as Information)

Reinoud Bosman Reinoud.Bosman at mediacatalyst.com
Tue Jun 28 11:41:07 EDT 2005


In its early growth stages, the brain/nervous system works in a way that
maps very naturally onto hyperlinked information.

When a neural path grows, it initially creates a lot of connections with
other cells. Over time the connections that are used most grow stronger;
the others die off. This is (very simplified) the 'hardware' side of how
memories are stored in our nervous system. A very Darwinian process: the
fittest connections survive.

The same principle can be applied to hyperlinks: the server keeps track
of the number of page views each link leads to, and links that are never
followed are eventually removed.
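A rough sketch of what the server could do (hypothetical Python, with
invented names and thresholds - just to make the mechanism concrete):

    DECAY = 0.99        # applied on every pass, so unused links fade
    PRUNE_BELOW = 0.2   # links weaker than this are removed

    link_strength = {}  # url -> decayed click weight

    def add_link(url):
        # every new link starts with a fighting chance
        link_strength.setdefault(url, 1.0)

    def record_click(url):
        # a followed link grows stronger, like a well-used neural path
        link_strength[url] = link_strength.get(url, 1.0) + 1.0

    def decay_and_prune():
        # run periodically: weaken everything, drop what nobody follows
        for url in list(link_strength):
            link_strength[url] *= DECAY
            if link_strength[url] < PRUNE_BELOW:
                del link_strength[url]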

A site that has implemented this adaptive linking is newsmap
(http://www.marumushi.com/apps/newsmap/newsmap.cfm)

The font size of the headlines of popular articles increases. Headlines
that aren't being followed shrink (until they are so small nobody can
see them any more - *pop* goes the evolutionary web ;)
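In the same hypothetical sketch, the headline sizing could work something
like this (made-up numbers, not how newsmap is actually built):

    import math

    def font_size(clicks, base=8, scale=4):
        # popular headlines grow logarithmically;
        # ignored ones drop to nothing - *pop*
        if clicks < 1:
            return 0
        return base + scale * math.log(clicks, 2)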

r.



On 6/28/05 2:33 PM, "Stewart Dean" <stew8dean at hotmail.com> wrote:

> Hi Jan,
> 
> I read your message and it's reminded me of a theory I'm still fleshing out.
> This gets a bit nebulous so forgive me if it appears a bit, well, strange.
> 
> In many ways information and energy are related, which comes through
> in Shannon's work. Life, for example, is governed by what is best
> described as 'rate of change' - something covered by the field of
> artificial life, or complex systems / complexity / self-organising
> systems, pick the term you're happiest with.
> 
> Life requires the right amount of change to work, and it needs a
> particular form of organisation, one where it can organise energy (and
> information) using existing energy/information. Too low a rate of
> change leads to either a static or cyclical system, like sailing a
> ship with no wind or tide; too high and things just break down into
> chaos, like trying to sail a ship in an extreme storm.
> 
> Computers are mostly static and cyclical - they need to be poked with
> a stick to do anything, like the web sites most of us build (updating
> prices isn't really dynamic, it's just a simple reaction to a feed).
> 
> You brought up entropy. Yep, we're fighting entropy - fighting things
> getting too static or too chaotic. The reason we're all in a job is
> that this is a never-ending fight: as long as things change,
> regardless of the rate of change, things need to be updated.
> 
> So comes my theory - how about self-organising sites? I'm not really
> talking about personalisation or collaborative filtering (you have to
> admit it doesn't work that well for the effort users have to put into
> it), nor am I talking about wikis - which start getting close. I'm
> talking about information that feeds off the energy of user
> interaction to organise itself and remain usable. In essence, a
> library that needs no librarians, only readers. I think it's possible
> but am time poor, so to speak.
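> 
> To make that less nebulous, here's a toy sketch (Python, invented
> names, untested) of the kind of mechanism I mean - pages that readers
> visit in the same session organise themselves into neighbours, with no
> librarian and no taxonomy involved:
> 
>     from collections import defaultdict
>     from itertools import combinations
> 
>     affinity = defaultdict(float)   # (page_a, page_b) -> strength
> 
>     def end_session(pages_visited):
>         # every pair of pages seen together gets a little stronger
>         for a, b in combinations(sorted(set(pages_visited)), 2):
>             affinity[(a, b)] += 1.0
> 
>     def related(page, top=5):
>         # a page's neighbours simply emerge from reader traffic
>         scores = [(b if a == page else a, s)
>                   for (a, b), s in affinity.items() if page in (a, b)]
>         return sorted(scores, key=lambda x: -x[1])[:top]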
> 
> I've heard people use the term 'bottom-up IA', but these approaches
> fall short of true bottom-up systems (the BBC's approach, for example,
> is fundamentally top-down due to their use of an evolved Dublin Core).
> 
> I can't help thinking someone has tried to do this before - can
> anyone think of examples? Or do you feel I still have some explaining
> to do?
> 
> Cheers
> 
> Stewart Dean
> 
> 
> 
>> From: "Jursa, Jan (init)" <Jan.Jursa at init.de>
>> To: "sigia l" <sigia-l at asis.org>
>> Subject: [Sigia-l] data as information?
>> Date: Tue, 28 Jun 2005 09:29:37 +0200
>> 
>> 
>> Though nullius in verba seems to be the motto of this list :-) -
>> there is nothing wrong with familiarising oneself with elementary
>> theories before renaming given terms.
>> 
>> See for example "The Mathematical Theory of Communication",
>> published in 1949 by Shannon and Weaver (the fathers of information
>> theory).
>> 
>> Some statements from Shannon and Weaver (as I learned them):
>> "information turns out to be exactly that which is known in
>> thermodynamics as entropy"
>> "In particular, information must not be confused with meaning"
>> 
>> So, as I understand it, data holds three kinds of things:
>> information, redundancy and noise.
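>> 
>> A quick sketch of Shannon's measure itself (Python, just for
>> illustration):
>> 
>>     import math
>> 
>>     def entropy(probs):
>>         # Shannon's H = -sum(p * log2(p)), in bits per symbol
>>         return -sum(p * math.log(p, 2) for p in probs if p > 0)
>> 
>>     entropy([0.5, 0.5])  # 1.0 bit: a fair coin, pure information
>>     entropy([1.0])       # 0 bits: fully predictable, pure redundancy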
>> 
>> Regarding the fundamental definition of "information" in information
>> theory, I'd like to think of an IA as someone who tries to reduce the
>> amount of entropy in a given environment...
>> 
>> What do you think?
>> 
>> Cheers,
>> Jan



