Of databases and narratives

These days I might describe myself as a (somewhat) old-school historian trying to better understand the implications of digital technology and social media. As such, I found this piece by Lev Manovich (part 1 of a longer article) on different “information design patterns” or “forms of social media” to be analytically helpful. He discusses three such patterns – the database, the timeline and the data stream – and argues that we have now arrived in the age of the latter:

I want to suggest that in social media, as it developed until now (2004–2012), database no longer rules. Instead, social media brings forward a new form: a data stream. Instead of browsing or searching a collection of objects, a user experiences the continuous flow of events. These events are typically presented as a single column. New events appearing on top push the earlier ones from the immediate view. The most important event is always the one that is about to appear next because it heightens the experience of the “data present.” All events below immediately become “old news” – still worth reading, but not in the same category.

In the light of this, a previous point in Manovich’s argument was especially interesting: his contention that the database pattern of 1990s web sites represented a break with the narrative as “the historically dominant way of organization [sic] information”. While I’m sure one might qualify or challenge this conclusion in various ways, it seems generally valid to me. But if so, what is the data stream if not a merger of these opposing forms, a database-driven narrative or a narrative database? Put differently: “The Narrative Strikes Back”, anyone?

While Manovich does not explicitly make that connection in this installment of his article, maybe he will get back to it in the next. Whether he does or not, I am looking forward to reading it.

How revolutions are made

Speaking of the digital revolution, a few weeks ago Andrew Prescott gave a keynote address at the Digital Humanities Congress in Sheffield. I wasn’t there, but the text has been posted online. In the keynote Prescott questions some widely held beliefs about revolutions, whether industrial, social or digital:

The Arab Spring, the arrival of printing and the Industrial Revolution all show us how change is not necessarily revolutionary or disruptive. The processes we think of as revolutionary can be lengthy, patchy in character, amorphous, difficult to measure and unpredictable, and there is no reason to think that the digital will be any different. It’s the continuities and the parallels that are often as striking as the disruptions.

In an age of rapid change where many bold claims are made, a thoughtful analysis such as this is what I call putting history to good use.

Making a difference

Speaking of 9/11, Rachel Herrmann wrote a blog post last year about how she decided to become a historian because of how that day unfolded in her classroom:

So what has stuck with me from that day has been Dr. Maskin’s behavior in our history class. Even in the face of the attacks, he retained the cool, analytical poise of a historian. On September 11th, I learned how historians have to ask those difficult questions, even when present events are shrouded in uncertainty. He made us aware that we were witnessing history in the making, an event akin to our parents’ watching the moon landing or hearing about JFK’s assassination. It was a horrible, devastating event, but it was history nonetheless, and we had to engage with it. Dr. Maskin hadn’t planned it, but that was one of the best teaching moments that I’ve seen, ever.

It is a terrific story about the difference that teachers (and historians) can make.

The nature of empathy

By way of a link from Bill Cronon on Twitter I came to read this essay by Ian McEwan, written in the wake of the 9/11 attacks. It is less a story about the events themselves than a meditation on what they say about us as human beings:

This is the nature of empathy, to think oneself into the minds of others. These are the mechanics of compassion: you are under the bedclothes, unable to sleep, and you are crouching in the brushed-steel lavatory at the rear of the plane, whispering a final message to your loved one. There is only that one thing to say, and you say it. All else is pointless. […]

If the hijackers had been able to imagine themselves into the thoughts and feelings of the passengers, they would have been unable to proceed. It is hard to be cruel once you permit yourself to enter the mind of your victim. Imagining what it is like to be someone other than yourself is at the core of our humanity. It is the essence of compassion, and it is the beginning of morality.

Cronon calls it an “extraordinary piece”. It is. If you haven’t read it you should.

Simplicity vs. consistency

Mike Isaac at All Things D in a convincing, but not exactly reassuring, analysis of where Twitter is heading:

The direction in which tweets are evolving is a deviation from Twitter’s modus operandi. The company has prided its service on its simplicity: Stripped-down, text-only messages. And, for years, Twitter has resisted doing anything that would complicate the simplistic appeal. For the company to give an about-face and turn toward media is a major sea change — and if Twitter can’t be as simple as it always has been, staying consistent is the next best sort of insurance.

I don’t mind consistency, but simplicity is the whole point of using Twitter. If they lose that, they have lost everything.

Understanding semicolons

Mary Norris writes about the meanings and usage of semicolons, always a fascinating topic, in The New Yorker:

So the semicolon is exactly what it looks like: a subtle hybrid of colon and comma. Actually, in ancient Greek, the same symbol was used to indicate a question.

And it still seems to have a vestigial interrogative quality to it, a cue to the reader that the writer is not finished yet; she is holding her breath.

If you find this to be the least bit interesting (which is not a given, I admit), be sure not to miss Ben Dolnick’s article in The New York Times that Norris refers to at the outset: “Semicolons: A Love Story”.

Predicting usefulness

Robert McMillan of Wired has interviewed Robert Taylor, formerly of the U.S. Department of Defense Advanced Research Projects Agency (DARPA) and later Xerox PARC, regarding the recent discussion about the early history of the internet:

“The origins of the internet include work both sponsored by the government and Xerox PARC, so you can’t say that the internet was invented by either one alone,” [Taylor] says.

So would the internet have been invented without the government? “That’s a tough question,” he says. “Private industry does not like to start brand new directions in technology. Private industry is conservative by nature. So the ARPAnet probably could not have been built by private industry. It was deemed to be a crazy idea at the time.”

In other words: What is a mere curiosity (or “crazy idea”) today may turn out to be a very useful innovation tomorrow, but there is no way of telling in advance.

This is why publicly funded research should be allowed to be experimental and free-ranging, not driven by short-term commercial pressures. An obvious and very simple truth that politicians everywhere seem unable to accept, as they cut funding for research not deemed “useful” or “profitable” enough. The internet has turned out to be quite useful, don’t you think?

Universities as brands

Siva Vaidhyanathan in the Chronicle of Higher Education:

Once you consider a university a “brand,” you have lost. I suppose university presidents lapse into such language to placate the MBA’s on their boards. But the challenges and duties of private firms do not in any way resemble the challenges or duties of universities. So we must stop using business language to describe universities. It’s not only misguided and inaccurate, but it also sets up bad incentives and standards. The University of Virginia is a wealthy and stable institution, a collection of public services, a space for thought and research, a living museum, a public forum, a stage for athletics competition, and an incubator of dreams and careers. But it’s not a firm, so it’s certainly not a “brand.”

Lessons from Norway

Two days ago the foreign minister of Norway, Jonas Gahr Støre, had an op-ed in the New York Times about the lessons to be learned from the tragedy on Utøya and in Oslo on 22 July 2011. He believes there are implications here for the global war on terrorism:

Osama bin Laden successfully provoked the West into using exceptional powers in ways that sometimes have been in conflict with its commitment to human rights and democracy. This only strengthened the case of extremists, and it shows that we should try to avoid exceptionalism and instead trust in the open system we are defending.

This is not a soft approach. It requires and allows for tough security measures. But it is firmly anchored in the rule of law and the values of democracy and accountability.

I agree. We all have much to learn from the manner in which Norwegians have handled the aftermath of last year’s horrific events. It has been dignified, determined and awe-inspiring in every conceivable way.

Twitter’s path

Like many others, myself included, Orian Marx is worried about the direction in which Twitter seems to be heading. In a comprehensive blog post (via @ayjay) that is a good, if somewhat depressing, read for anyone who cares about the service, he writes about its past, present and (potential) future:

I have had a love / hate relationship with Twitter for four years. As a technologist, it is impossible not to be enamored with the transformative effect Twitter has had not just within my industry but the world at large. As an entrepreneur and perhaps an idealist, it is impossible not to be embittered by the trajectory upon which Twitter has set itself as a company. […]

I think Twitter will continue to spread FUD until what’s left of the ecosystem remains wilting in the carefully arranged flower beds of its walled garden, foregoing the legacy of all the good ideas that got it to where it is today.

While I hope he is wrong, I am afraid he will be proven all too right. Twitter has now reached the point where they need to start making a profit, but like so many other web services they have chosen to make their users into the product they sell rather than the customers they serve.

Along with Facebook’s failed IPO and the serial privacy violations committed by both Facebook and Google, Twitter’s recent actions are just the latest indication that we are entering a critical new phase in the (admittedly short) history of social media companies. After several years of explosive user growth, which has also brought with it large amounts of investor funding, many of them now face increasing pressure to generate revenue in a market where “free” is the norm. Rather than giving users the chance to pay for their services, they try to build their business exclusively on ad networks.

When that happens, openness quickly gives way to attempts at control in a modern enclosure movement. Except that today, the sheep being enclosed — or shut out — are you and me.