[![Martin Belam's Audience](https://i1.wp.com/www.onemanandhisblog.com/content/images/2009/07/IMG_0784-thumb-250x187-1355.jpg?resize=250%2C187)](https://i2.wp.com/www.onemanandhisblog.com/content/images/2009/07/IMG_0784.jpg)
Computers are very good at repetitive tasks – *once* human beings tell them what the task is. This means: metadata.
However, it needs to be done more intelligently. Martin cites the Daily Mail: a search for Gordon Brown brought up a caravan review, simply because the article mentioned him in passing.
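A minimal sketch of the failure mode Martin describes – the article texts, titles, and tags below are invented for illustration, not from the Daily Mail's actual system. Naive full-text search matches any article whose body mentions the query, while a single piece of human-curated metadata (a list of subject tags) keeps the caravan review out of the results:

```python
# Invented example articles: one genuinely about Gordon Brown, one that
# merely mentions him. The "subjects" tags stand in for human-curated metadata.
articles = [
    {"title": "Budget 2009: Brown defends spending plans",
     "text": "Gordon Brown told MPs the budget would protect jobs.",
     "subjects": ["Gordon Brown", "politics"]},
    {"title": "Caravan review: the Swift Challenger",
     "text": "Roomier than the van Gordon Brown was once photographed in.",
     "subjects": ["caravans", "travel"]},
]

def fulltext_search(query, articles):
    """Naive search: match the query anywhere in the body text."""
    return [a["title"] for a in articles
            if query.lower() in a["text"].lower()]

def metadata_search(query, articles):
    """Metadata search: match only articles a human tagged with the subject."""
    return [a["title"] for a in articles if query in a["subjects"]]
```

Full-text search returns both articles for "Gordon Brown"; the metadata search returns only the budget story. The trade-off, of course, is that someone has to do the tagging – which is exactly the librarian's job Martin mentions next.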
We used to have librarians, who managed and curated our data. We stopped that, and brought in CMSes that shove everything into reverse-chronological order.
All journalists have the same CMS – it should instead be lots of small CMS chunks, loosely joined, rather than one screen for all. We should build them Firefox addons that let them quickly access the data they need for their work, and build custom search engines tailored to journalists' needs.
There's a lot of anxiety in the industry right now, but Martin is a glass-half-full guy – we have more opportunities to reach people with our stories than ever before, and a bigger reach: a global audience.
However bad it is, we are not celibate monks having to write the news out by hand. 🙂
Lots of discussion afterwards. Metadata needs to be manually managed; we can't pass it all to machines. And we might need to revisit it later, as meaning shifts over time. Human intervention will also prevent spammers polluting pure machine aggregation, if there's some mechanism of trust at work.