Big Data

October 28, 2010

Jeff Jonas is a very impressive thinker and doer. Today he serves as chief scientist of the IBM Entity Analytics group and is an IBM Distinguished Engineer. Yesterday I came across an interview with him by Andrew Keen, on the subject of Big Data.

Some highlights:

  • Data are observations.
  • The more data we have, the more ignorant we become.
  • Organisations make sense of only about 7% of all the relevant data (in their enterprise and in cyberspace).
  • This gap between the amount of data and the making sense of it is widening.
  • GOOG is an enormous pixel sorter. It does not provide intelligence and/or awareness.

Watch the interview here.

The King is dead, long live the New King

October 26, 2010

Last week I concentrated on examining the popularity of services created to help me find interesting info, like tagging, digging, sharing and adding. Some of them work, but most of these services don’t deliver in the end. They come up, they grow, and then they seem to bump into some kind of obstacle. Why is this? I tried to analyze these services through the lens of diffusion-of-innovation theory. Most of these new services never move from innovators to early adopters. Not one is widely used by the people formerly known as the early majority.

Looking at Gartner’s Hype Cycle 2010, I again noticed that, however great their work, it’s only about technology and emerging technologies. Combining trends and hypes in the rise and fall of new emerging technologies with observations on the way real people actually behave is in my view far more interesting.

Take Search. Almost everyone starts at the homepage of a search engine, right? Use of this ‘technology’ is almost 100%. According to Gartner, Search would be at the Plateau of Productivity, where (in their words) ‘…a sharp uptick in adoption begins, and penetration accelerates rapidly as a result of proven useful value….’

OK, one must agree. But let us examine the usefulness and the value more closely.

Most people use only one word in their search. Far too many search results are presented in less than a second. About half of the people click only on the first result. The third result is used by just 10%. And when people finally jump to a website, they are gone before you can count them.

I don’t know, but it seems to me that this is a perfect example of Filter Failure. Search has indeed been a critical solution to bringing audience to the web, but that’s about it. People don’t want to search, they want to find.

Sharing, adding, tagging and digging were launched because finding is not searching. Web 2.0 led to more and more content. Content is King no more. I think the time has come for a new king.

It can be done

October 21, 2010

This morning I received a tweet that pointed me to a debate organized by The Economist. This magazine is very successful as a paper product, with great content and a huge circulation. At the same time it is a prime example of a media company that tries to understand the impact of the move to bits and actually produces new forms of digital content. Content that matters.

The debate is on the question of whether computing is the most important technology to come out of the last century. Two opposing views are expressed by top-of-the-bill experts. Everyone can vote and comment. Very interesting views and opinions from readers are posted. The engagement runs deep: I stayed for at least an hour and it had my full attention. I will certainly come back.

Of course I had to register and of course I will receive marketing materials from The Economist, but I don’t mind. I signed up for it.

The debate is sponsored by Intel. I did not object to this at all, on the contrary, I liked it. When was the last time you admired an advertiser for bringing you something on a screen?

So it can be done. Watch and judge for yourself.

Enter the debate at the middle right.

Science Fiction

October 20, 2010

In my last post I described the feeling of information overload. A feeling most people know very well. There is a tsunami of opinions, facts, news, media blah blah, data, info, and it’s growing by the minute. In my blog I expressed my renewed hope for better info, because the number of professionals in information architecture, information design and user experience (design) is on the rise and their field is maturing.

I received a comment. A brilliant one. Did I not know of Clay Shirky’s speech at the Web 2.0 Expo in 2008, where he talked about information overload? In it Shirky argued that our problem is not information overload, but filter failure. The web led to a big change in the business dynamics of publishing. Since the cost of producing content is practically zilch, there is no economic necessity for a filter function anymore. Before the web, publishers had to filter for quality, since they had to pay upfront to produce the content and bore the risk of not selling poor quality.

No, I did not know Shirky had shared his thoughts on this subject with us. So I checked it out, watched ten minutes of video, was impressed, searched for comments and analysis, and learned a lot.

And in a funny sort of way, that’s exactly the point I tried to make. How did I not find this quality content?

And it made me think. If there is no quality, why try to search for it? What’s quality anyway? Who decides quality? Peers, friends? Experts? People, machines, algorithms?

When the web started, people made homepages and linked to other homepages. We pointed each other to interesting stuff. Some people grouped interesting links into what later became known as portals. Homepages became websites. Search was hell, as more and more websites were launched. The engines did crawl and count, but ‘quality’ was not judged.

A very interesting startup, later to be known as GOOG, came along and counted the links going out of and coming into websites, and thus PageRank was born. Based upon the longtime academic practice of peer review, quoting and citing, links became the way to be noticed among the growing number of websites. A brilliant idea at the time, since the quality of a website (its material, opinions, design, webmaster) was judged by other humans. Poor quality, no link. No link, no PageRank.
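The intuition behind PageRank can be shown in a few lines: a page’s score is fed by the scores of the pages linking to it, repeated until the scores settle. This is a minimal power-iteration sketch over a made-up toy graph, not Google’s actual implementation.

```python
# Minimal sketch of the PageRank idea: each page repeatedly passes a
# damped share of its score to the pages it links to. Toy example only.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping a page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                     # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # split score over outgoing links
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical toy web: almost everyone links to 'hub'.
toy_web = {
    "hub": ["a"],
    "a":   ["hub"],
    "b":   ["hub"],
    "c":   ["hub", "a"],
}
ranks = pagerank(toy_web)
print(max(ranks, key=ranks.get))  # the most-linked page ranks highest
```

The point matches the post: a page nobody links to ends up with almost no score, no matter what is on it.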

GOOG became a huge success and the de facto starting point for a journey into cyberspace. Eyeballs were attracted, money could be made through advertising, and search results were manipulated. More and more content could be found quickly. It takes a few milliseconds to get millions of results. People normally click on one or two, maybe three, links. They land on a website, look around for a very short time and disappear.

Finding interesting and useful stuff became harder and harder again. New solutions to the problem popped up. Web 2.0 came along and people started to produce and publish more and more on the web. New ways to find and share interesting stuff arrived as a new tsunami: social bookmarking, adding, sharing, tweeting, digging, blogging, pointing, pushing. Nowadays it takes me too much time and effort even to follow all the new approaches to sharing. That’s why I missed the point Shirky tried to make.

And he is right. It is all about the failure of the filters.

That brings us to the question of what sort of filters we need. Should we return to the time and practice of the librarian and the curator? Should we integrate search with sharing, as GOOG introduced this week? Should we abandon machines and algorithms? Should we, as humans, do the filtering? Is there a way to use our collective judgement to judge quality? Let us minimize the junk and install some filters.

Around the time the web started to grow into a mass medium, I enjoyed the movie Johnny Mnemonic (1995), based upon a short story by William Gibson. This cult classic is set in 2021, when the whole world is connected by a gigantic Internet. Almost half of the population suffers from a strange disease called Nerve Attenuation Syndrome, a.k.a. ‘The Black Shakes’.

Let this scenario remain science fiction! 

More focus. Finally!

October 19, 2010

Working on a new project, I had to come up with some predictions for the next 2–3 years, in corporate media and publishing that is.

One of my certain bets is of course the rise of the tablets, as it is an expression of the more general migration to more mobility. The second sure bet is the rise of corporate publishing, since traditional publishers are losing their ability to create value. The third trend is more bottom-up: users want to experience the same satisfaction in their work as they do at home.

In my view the basic underlying force driving all these trends is the Data Deluge and the incapacity to deliver proper results. There is too much data and not enough quality info.

Therefore we will see more focus on information architecture, information design and user experience (design).


Open or closed?

October 19, 2010

Last week the semi-closed system of Facebook turned out to be not that closed, because people other than your friends can follow you, at least if and when they pay. Another major blunder. When will it stop, and who will stop it? When the Wall Street Journal invests huge amounts of time and effort to dig into your operation, I would certainly call it a very serious wake-up call.

GOOG, as always telling us they don’t want to be evil, published some nice figures. Revenue in Q3 was up 23% to $7.29 billion. Profits were up 32% to $2.17 billion. More astounding than these figures is the simple fact that almost a third of revenue is pure profit. No wonder they don’t know what to do with that pile of money!

But the same week a study showed that GOOG is in fact still a one-trick pony.

That cannot be said of Apple. They presented another great performance in their last fiscal quarter: revenues of $20.34 billion and profits of $4.31 billion. That’s 21% pure profit, but then again, they actually make and ship things.
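The margin comparison follows directly from the quarterly figures quoted in these two posts; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the profit margins quoted above,
# using the quarterly figures from the posts (billions of dollars).

def margin(profit, revenue):
    """Profit as a fraction of revenue."""
    return profit / revenue

google = margin(2.17, 7.29)    # roughly 0.30 -- almost a third
apple  = margin(4.31, 20.34)   # roughly 0.21 -- about 21%
print(f"Google: {google:.0%}, Apple: {apple:.0%}")
```

Google earns a higher margin on advertising than Apple does on hardware, which is exactly the contrast the two posts draw.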

Steve Jobs had a good time, claiming that the much-debated battle between empires based on open or closed approaches really is not an issue at all. The real question should be what the customer wants.

Hype or Myopia?

October 11, 2010

Last week Gartner published the 2010 version of its Hype Cycle for Emerging Technologies. Hypes are hypes, I agree. The acceptance of emerging technologies, however, cannot be judged on technology alone. User behaviour and, even more important these days, consumer confidence are in the driver’s seat.

In my country a new government unveiled an uninspiring vision.

It certainly was a good week for myopia.  Let’s close the curtains, lock the door and wait until this storm is over. Which storm?  How about this one?

Can myopia be a hype?