Algorithmic Almanacs: Lots, Bots and Nots

Stevan Harnad amsciforum at GMAIL.COM
Tue Sep 16 09:30:47 EDT 2014

“*The world's most prolific writer: Sverker Johansson
<> has created more than three million
Wikipedia articles, around one tenth of the entire content of the site.
How, and why, does he do it?*” -- Norwegian Inflight Magazine

The question's interesting, though the right description is not that
Sverker Johansson “wrote” millions of Wikipedia articles but that he wrote
an online search algorithm (a “bot”) that generated them automatically
(otherwise the writer of a payroll algorithm would be the “author” of
gazillions of paychecks…).
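To make the distinction concrete, here is a minimal, hypothetical sketch of how such a bot might generate stub articles by filling a fixed template from a structured database record; the field names and the record itself are invented for illustration, not taken from Johansson's actual bot.

```python
# Hypothetical template-filling "bot": the writer authors the template
# and the code, not the individual articles it churns out.
TEMPLATE = ("{name} is a species of {group} in the family {family}. "
            "It was described by {author} in {year}.")

# Invented example record standing in for a row of a structured database.
records = [
    {"name": "Examplea fictiva", "group": "beetle",
     "family": "Examplidae", "author": "Smith", "year": 1901},
]

def make_stub(record):
    """Generate one stub article by substituting record fields into the template."""
    return TEMPLATE.format(**record)

for r in records:
    print(make_stub(r))
```

One template plus a large enough database yields millions of "articles" while the human authorship remains a few lines of code.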

How and why indeed! It’s interesting, though, that so many Wikipedia
entries can be generated algorithmically. The boundary between an
encyclopedia and an almanac or even a chronicle of events has long been
blurred by Wikipedia.

But it is a good idea to keep in mind that what is easy to generate
algorithmically (and via its close cousin, *crowdsourcing*) in the
googlized digital era consists of simple, first-order facts: answers to
what/when/where questions: apples are red, the sky is blue, it rained in
Burma on Tuesday, Arsenal beat Manchester United 4-1.

The real source of all this factual information is Google’s global digital
database, and the fact that Google has reverse-indexed it all, making it
searchable by Boolean (and/or/not) search algorithms as well as by more
complex computational (Turing) algorithms.
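The mechanics of such a reverse index and its Boolean (and/or/not) queries can be sketched in a few lines; the toy documents and function names below are invented for illustration.

```python
# Toy document collection (contents invented for illustration).
docs = {
    "d1": "apples are red",
    "d2": "the sky is blue",
    "d3": "red sky at night",
}

# Reverse (inverted) index: map each word to the set of documents
# containing it, so queries become cheap set operations.
index = {}
for doc_id, text in docs.items():
    for word in text.split():
        index.setdefault(word, set()).add(doc_id)

def search_and(*terms):
    """Documents containing ALL terms (Boolean AND = set intersection)."""
    sets = [index.get(t, set()) for t in terms]
    return set.intersection(*sets) if sets else set()

def search_or(*terms):
    """Documents containing ANY term (Boolean OR = set union)."""
    return set().union(*(index.get(t, set()) for t in terms))

def search_not(term):
    """Documents NOT containing the term (Boolean NOT = set difference)."""
    return set(docs) - index.get(term, set())

print(search_and("red", "sky"))    # -> {'d3'}
print(search_not("red"))           # -> {'d2'}
```

At global scale the same idea, plus more complex Turing-computable ranking on top, is what makes first-order fact retrieval so tractable.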

The questions that it’s much harder to answer algorithmically, even with
all of the Google database and all the Boolean/Turing tools, are the
higher-order how/why questions. If those could all be answered
algorithmically, most of theoretical (i.e., non-experimental) science would
be finished by now.

And the reason for that is probably that our brains don’t find all those
how/why answers just algorithmically either, but also via dynamical
(analog, sensorimotor) means that may prove accessible to future
Turing-Test-scale *sensorimotor robots*, but certainly not to today's
purely *symbolic bots*, operating on purely symbolic databases.

In other words, it’s down to the symbol grounding problem
<> again…

More information about the SIGMETRICS mailing list