These days I might describe myself as a (somewhat) old-school historian trying to better understand the implications of digital technology and social media. As such, I found this piece by Lev Manovich (part 1 of a longer article) on different “information design patterns” or “forms of social media” to be analytically helpful. He discusses three such patterns – the database, the timeline and the data stream – and argues that we have now arrived in the age of the latter:
I want to suggest that in social media, as it developed until now (2004–2012), database no longer rules. Instead, social media brings forward a new form: a data stream. Instead of browsing or searching a collection of objects, a user experiences the continuous flow of events. These events are typically presented as a single column. New events appearing on top push the earlier ones from the immediate view. The most important event is always the one that is about to appear next because it heightens the experience of the “data present.” All events below immediately become “old news” – still worth reading, but not in the same category.
In the light of this, a previous point in Manovich’s argument was especially interesting: his contention that the database pattern of 1990s web sites represented a break with the narrative as “the historically dominant way of organization [sic] information”. While I’m sure one might qualify or challenge this conclusion in various ways, it seems generally valid to me. But if so, what is the data stream if not a merger of these opposing forms, a database-driven narrative or a narrative database? Put differently: “The Narrative Strikes Back”, anyone?
While Manovich does not explicitly make that connection in this installment of his article, maybe he will get back to it in the next. Whether he does or not, I am looking forward to reading it.
Speaking of the digital revolution, a few weeks ago Andrew Prescott gave a keynote address at the Digital Humanities Congress in Sheffield. I wasn’t there, but the text has been posted online. In the keynote Prescott questions some widely held beliefs about revolutions, whether industrial, social or digital:
The Arab Spring, the arrival of printing and the Industrial Revolution all show us how change is not necessarily revolutionary or disruptive. The processes we think of as revolutionary can be lengthy, patchy in character, amorphous, difficult to measure and unpredictable, and there is no reason to think that the digital will be any different. It’s the continuities and the parallels that are often as striking as the disruptions.
In an age of rapid change where many bold claims are made, a thoughtful analysis such as this is what I call putting history to good use.
Turning to 9/11: Rachel Herrmann wrote a blog post last year about how she decided to become a historian because of how that day unfolded in her classroom:
So what has stuck with me from that day has been Dr. Maskin’s behavior in our history class. Even in the face of the attacks, he retained the cool, analytical poise of a historian. On September 11th, I learned how historians have to ask those difficult questions, even when present events are shrouded in uncertainty. He made us aware that we were witnessing history in the making, an event akin to our parents’ watching the moon landing or hearing about JFK’s assassination. It was a horrible, devastating event, but it was history nonetheless, and we had to engage with it. Dr. Maskin hadn’t planned it, but that was one of the best teaching moments that I’ve seen, ever.
It is a terrific story about the difference that teachers (and historians) can make.
By way of a link from Bill Cronon on Twitter I came to read this essay by Ian McEwan, written in the wake of the 9/11 attacks. It is less a story about the events themselves than a meditation on what they say about us as human beings:
This is the nature of empathy, to think oneself into the minds of others. These are the mechanics of compassion: you are under the bedclothes, unable to sleep, and you are crouching in the brushed-steel lavatory at the rear of the plane, whispering a final message to your loved one. There is only that one thing to say, and you say it. All else is pointless. [- – -]
If the hijackers had been able to imagine themselves into the thoughts and feelings of the passengers, they would have been unable to proceed. It is hard to be cruel once you permit yourself to enter the mind of your victim. Imagining what it is like to be someone other than yourself is at the core of our humanity. It is the essence of compassion, and it is the beginning of morality.
Cronon calls it an “extraordinary piece”. It is. If you haven’t read it you should.
Mike Isaac at All Things D in a convincing, but not exactly reassuring, analysis of where Twitter is heading:
The direction in which tweets are evolving is a deviation from Twitter’s modus operandi. The company has prided its service on its simplicity: Stripped-down, text-only messages. And, for years, Twitter has resisted doing anything that would complicate the simplistic appeal. For the company to give an about-face and turn toward media is a major sea change — and if Twitter can’t be as simple as it always has been, staying consistent is the next best sort of insurance.
I don’t mind consistency, but simplicity is the whole point of using Twitter. If they lose that, they have lost everything.
Mary Norris writes about the meanings and usage of semicolons, always a fascinating topic, in The New Yorker:
So the semicolon is exactly what it looks like: a subtle hybrid of colon and comma. Actually, in ancient Greek, the same symbol was used to indicate a question.
And it still seems to have a vestigial interrogative quality to it, a cue to the reader that the writer is not finished yet; she is holding her breath.
If you find this to be the least bit interesting (which is not a given, I admit), be sure not to miss Ben Dolnick’s article in The New York Times that Norris refers to at the outset: “Semicolons: A Love Story”.
Robert McMillan of Wired has interviewed Robert Taylor, formerly of the U.S. Department of Defense Advanced Research Projects Agency (DARPA) and later Xerox PARC, regarding the recent discussion about the early history of the internet:
“The origins of the internet include work both sponsored by the government and Xerox PARC, so you can’t say that the internet was invented by either one alone,” [Taylor] says.
So would the internet have been invented without the government? “That’s a tough question,” he says. “Private industry does not like to start brand new directions in technology. Private industry is conservative by nature. So the ARPAnet probably could not have been built by private industry. It was deemed to be a crazy idea at the time.”
In other words: What is a mere curiosity (or “crazy idea”) today may turn out to be a very useful innovation tomorrow, but there is no way of telling in advance.
This is why publicly funded research should be allowed to be experimental and free-ranging, not driven by short-term commercial pressures. An obvious and very simple truth that politicians everywhere seem unable to accept, as they shut down funding for research not deemed “useful” or “profitable” enough. The internet has turned out to be quite useful, don’t you think?
Siva Vaidhyanathan in the Chronicle of Higher Education:
Once you consider a university a “brand,” you have lost. I suppose university presidents lapse into such language to placate the MBA’s on their boards. But the challenges and duties of private firms do not in any way resemble the challenges or duties of universities. So we must stop using business language to describe universities. It’s not only misguided and inaccurate, but it also sets up bad incentives and standards. The University of Virginia is a wealthy and stable institution, a collection of public services, a space for thought and research, a living museum, a public forum, a stage for athletics competition, and an incubator of dreams and careers. But it’s not a firm, so it’s certainly not a “brand.”