In 2008, the year that I wrote most of Say Everything, blogging was already being declared dead. But its ostensible replacements weren't quite able to fill its shoes: Twitter and Facebook were both on a curve of rising popularity, but neither had yet become the mass-cultural phenomena each is today.
As I write now, in May 2010, the industry's prognosticators have moved on from writing blogging's obituary to declaring the imminent mortality of the Web itself. According to them, these new networks aren't just popular; they are eclipsing the Internet As We Have Known It -- blogging included.
Silicon Valley's legendary angel investor, Ron Conway, recently told TechCrunch, "Facebook is becoming the Web. Everything you need is there... it is the universe." According to another Silicon Valley insider, "Twitter is exactly what the Internet was around 1996. It represents nothing less than the New Internet. It is the game-changer."
On one level, these model instances of digital hype are easy to dismiss. Obviously, Facebook and Twitter offer us no more than a fraction of the possibilities that the open Internet affords us today. Their growth may be phenomenal, but the enthusiasm of Internet users is fickle and easily transferred, and today's market leader can easily become tomorrow's has-been.
These predictions of the eclipse of today's Internet are less valuable as forecasts than as evidence of a cultural reaction. They seem, in their embrace of social networking as a Hot New Thing, to be forward-looking. (Forget that software-driven social networking has been around nearly as long as the Internet itself.) But for the people who make such blanket statements, they represent a wish to turn back the clock -- a postlapsarian yearning for a return to a digital communications regime that is closed, manageable and monetizable. This is nostalgia masquerading as futurism.
The Internet, as it has evolved in the nearly two decades since its emergence from the public sector that incubated it, is a messy, sometimes anarchic commons. The same openness that allows myriad novelties, including blogging, to prosper also leaves it vulnerable to con artists, junk peddlers and spam. Businesspeople from Steve Jobs on down dream of reasserting control over the environment, cleaning up the mess, banishing the hackers and cranks and porn merchants, and figuring out how to reinflate profit margins that the Web has, for the majority of industries, decimated.
For this vision to be realized, the legions of bloggers whose ascent Say Everything chronicled must drop their keyboards and docilely accept losing all the autonomy, bonhomie and voice that their posts have provided for them. No one should hold their breath waiting for that to happen. Here is why -- via an examination of four disruptive forces shaping today's Web.
Probably the single question I'm most often asked as I talk to people about Say Everything is: How has Twitter changed blogging? In the zero-sum mindset of technology journalism, Twitter's rise must mean blogging's fall. Twitter's rapid growth is altering the landscape. But I think the result is auspicious in the long run, both for Twitter-style communication and for traditional blogging.
If you look back to the roots of blogging you find that there has always been a divide between two styles: One is what I'd call substantial blogging -- posting longer thoughts, ideas, stories, in texts of at least a few paragraphs; the other is "Twitter-style" -- briefer, blurtier posts, typically providing either what we now call "status updates" or recommended links. Some bloggers have always stuck to one form or the other: Glenn Reynolds is the classic one-line blogger; Glenn Greenwald and Jay Rosen are both essay-writers par excellence. Other bloggers have struggled to balance their dedication to both styles: Just look at how Jason Kottke has, over the years, fiddled with how to present his longer posts and his linkblog: Together in parallel, interspersed in one stream, or on separate pages?
A historical footnote: Twitter's CEO is Evan Williams, who was previously best known as the father of Blogger. You find a style of blogging that's remarkably Twitter-like on the blogs that became the prototypes for Blogger -- a private weblog called "stuff" that was shared by Williams and Meg Hourihan at their company, Pyra, and a public blog of Pyra news called Pyralerts (here's a random page from July 1999). The same style later showed up in many early Blogger blogs: brief posts, no headlines, lots of links -- it's all very familiar. In some ways, with Twitter, Williams has just reinvented the kind of blogging he was doing a decade ago.
Today, the single-line post and the linkblog aren't dead, but certainly, much of the energy of the people who like to post that way is now going into Twitter. It's convenient, it's fun, it has the energy of a shiny novelty, and it has the allure of a social platform.
But there's a nearly infinite universe of things you might wish to express that simply can't fit into 140 characters. It's not that the Twitter form forces triviality upon us; it's possible to be creative and expressive within Twitter's narrow constraints. But the form is by definition limited. The haiku is a wonderful format for a poem, but most of us wouldn't choose to adopt it for all of our verse. Yes, there are rare examples of long-form bloggers who have largely traded in that form for Twitter: Rosen, for instance, puckishly explains his enthusiasm for Twitter "mindcasting" by describing it as a response to the criticism he used to get that his blog postings were too damn long. But you'll hunt for a long time before you find other Twitter users who put as much substance into their streams as Rosen does into his. Most people who want to cast their complex thoughts into the online mix choose the longer form.
From their earliest days, blogs were dismissed as a mundane form in which people told us, pointlessly, what they had for lunch. In fact, of course, as I reported in Say Everything's first chapter, the impulse to tell the world what you had for lunch appears to predate blogging, stretching back into the primordial ooze of early Web publishing.
Today, at any rate, those who wish to share quotidian updates have a more efficient channel for doing so. This clarifies the place of blogs as repositories for our bigger thoughts and ideas, and for more lasting records of our own experiences and observations.
Twitter does have some serious deficiencies as a substitute for linkblogging and short-form blog posts. It fails at one of the most basic functions that blogging serves: the creation of a personal archive.
Blogs privilege the "now." New stuff always goes on top. But they also create a durable record of "then" -- as I learned in researching this book. One of the great contributions of blogging software is to organize the past for anyone who writes frequently online. Before blogs, with each new addition to a website we had to think, where does this go, and how will I find it later? Blog tools, as personal content management systems, ended that era.
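The organizing work that blog tools automated can be sketched in a few lines: each new post gets filed under a stable, dated path without the author having to think about it. (This is a hypothetical illustration of the general pattern, not any particular blogging engine's actual scheme.)

```python
from datetime import date

def permalink(slug: str, posted: date) -> str:
    """File a post under a stable, dated path -- the kind of
    reverse-chronological archive blog software generates for free."""
    return f"/{posted.year:04d}/{posted.month:02d}/{posted.day:02d}/{slug}.html"

# The newest post goes on top of the index, but the path never changes,
# so "then" stays findable long after "now" has moved on.
print(permalink("why-i-blog", date(1999, 7, 14)))  # /1999/07/14/why-i-blog.html
```

Because the tool, not the writer, answers "where does this go, and how will I find it later?", frequent posting stops being an archiving chore.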
Twitter is great at "now." But so far, it's lousy at "then." It offers no interface to the past. You can't easily navigate your way backwards in time.
Each tweet is timestamped and lives at a unique URL. So it should be possible to build the machinery to organize one's tweets into a more coherent record. But we don't really have a clear sense, or commitment from Twitter the company, of how long these URLs are going to be around. Twitter is providing the Library of Congress with a full archive of its public data so historians can access it. That's great. But what about the rest of us?
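Since every tweet carries a timestamp and a permanent URL, that machinery is at least feasible to sketch. A minimal version (assuming tweets as simple dicts with hypothetical field names, not the real Twitter API) might group them into the month-by-month archive pages blog software provides automatically:

```python
from collections import defaultdict
from datetime import datetime

def archive_by_month(tweets):
    """Group tweets into reverse-chronological monthly pages, the way
    blog software organizes posts. Each tweet is a dict with 'url',
    'text' and 'at' (a datetime) -- a hypothetical shape for this sketch."""
    pages = defaultdict(list)
    for t in tweets:
        pages[t["at"].strftime("%Y-%m")].append(t)
    # Newest month first; within a month, newest tweet first.
    return {m: sorted(pages[m], key=lambda t: t["at"], reverse=True)
            for m in sorted(pages, reverse=True)}

tweets = [
    {"url": "https://twitter.com/u/status/1", "text": "hello", "at": datetime(2010, 4, 2)},
    {"url": "https://twitter.com/u/status/2", "text": "again", "at": datetime(2010, 4, 9)},
    {"url": "https://twitter.com/u/status/3", "text": "later", "at": datetime(2010, 5, 1)},
]
archive = archive_by_month(tweets)
print(list(archive))  # ['2010-05', '2010-04']
```

The catch, of course, is that such an archive is only as durable as the URLs it points to -- which is exactly the commitment Twitter has not made.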
The other big weakness of Twitter as a sort of universal microblogging platform is that all its interaction is happening on one company's server, in that company's database. That poses some fierce technical problems if the Twitterverse keeps scaling up. (See for instance this comment by Chuck Shotton at Scripting News: "IMO, Twitter is a toy to be experimented with until it breaks and is replaced by a properly implemented solution that will persist, scale, and be as open as the protocols above.")
Even if Twitter can engineer its way out of this scaling dilemma, it remains a single-owner platform, which means that users and developers will always be subject to sudden changes dictated by the company's needs. One of Dave Winer's original messages as a proto-blogger in the mid-'90s was to warn us about such platform control and to celebrate the arrival of the Web as the platform that nobody owns. Today Winer is sounding the same alarms about Twitter, and they are worth weighing.
Any service owned by a for-profit company will sooner or later face the pressure from investors to start "monetizing." (As I write this, in fact, Twitter is beginning to change the terms by which it shares data and splits ad revenue with its third-party developers.) Twitter may handle this transition sensitively or crudely. Either way, the results will likely remind us of the disadvantages of publishing our words using someone else's virtual printing press.
More so than Twitter, Facebook can serve, for its most avid users, as a sort of alternative-universe version of the open Internet. With its nearly half a billion users and its phenomenal leveraging of what it calls "the social graph," or network of human relationships, Facebook has managed to rise from its roots as a networking tool for college students to become the single most popular service for people to share messages and photos with their friends.
For a long time, Facebook's appeal lay in its simplicity. Facebook made it easy for new users to build a network of friends and distribute pictures and words to them. Just as America Online had once been popular as "training wheels for the Internet," Facebook served millions as training wheels for social networking. As a result, to many observers it also began to look like a natural successor to, and replacement for, the personal blogging movement. If you had been using a blog just to keep friends and relatives up to date on your doings, rather than to build a career as a writer or public expert, then wasn't it easier just to use Facebook?
In some ways, yes. Of course, Facebook has the same problem as Twitter with older material. On Facebook, you live in an eternal now, and the past just gently vanishes from view. No one who actually cares about the long-term fate of what they've written would contribute it to Facebook; you can't save or export your contributions -- you have no chance to curate the record of your life. Still, for anyone who felt public blogging was a little too exposed -- or who saw limiting communications to a smaller circle of friends as a feature rather than a bug -- Facebook's appeal was and is undeniable.
The trouble with Facebook as a blog replacement is that it has recently chosen to discard its most attractive characteristics: its simplicity and privacy. The company's speedy growth and constant tinkering have given it a classic case of "featuritis": it has added so many options and add-ons and wrinkles that it has lost the simplicity that made it attractive. Its mania for innovation, some of it ill-conceived, leaves its population in a state of perpetual semi-confusion. And as it has set out to turn a big profit, it has begun relentlessly pushing its users to take material they'd once considered private, to be seen by friends only, and expose it to the open, public Web.
For Facebook, this means more ads on more pages and more pages in search-engine listings. But for the users who originally embraced the service precisely because it let them limit access to their content to a select few friends, the changes have been both confusing and off-putting. Recently, Facebook has begun to look like one giant Skinner-box experiment to see how far you can push loyal users by changing the rules on them before they turn and run. The answer is: really far, but not forever.
This is where we glimpse another facet of Facebook's resemblance to AOL. Just as people fled AOL for the open Internet once they discovered its advantages and no longer needed AOL's hand-holding, it seems inevitable that users weaned on social networking at Facebook will gradually migrate away from the service as they realize its drawbacks: its unpredictable and arbitrary rule changes, its failure as a steward of its users' content, its casual betrayals of their trust.
Some observers argue that Facebook has achieved so powerful a "lock-in" that its users will put up with anything rather than face the pain of emigration to a different service. But the short history of social networking suggests exactly the opposite: users readily move from one service to another as long as they're motivated, and as long as they see their friends moving with them. The landscape is littered with the carcasses of once-hot and now eclipsed social networks, from Friendster to Orkut to MySpace. These networks rarely disappear -- they find and keep some population to serve -- but they have each in turn lost the mantle of leadership in their business.
There is no reason to think Facebook is invulnerable to the same dynamic. In the meantime, its impact on the universe of blogging is similar to Twitter's: far from killing off the appeal of starting a blog, it simply gives ambitious bloggers another way to build and feed a following.
Twitter and Facebook each represent further extensions of the idea of blogging -- the notion that everyone now has the opportunity to publish and share information, thoughts and updates about themselves and things they care about. The new social networks can be framed as threats to the healthy future of blogging because they can claim that they do what blogs do, only better.
In another precinct entirely, there are still media reactionaries who, rather than seeking to improve blogging, aim to reverse its influence. They dream about rolling back the Web-borne tide of personal publishing, interactivity, and peer-to-peer self-expression. They wait for "things to return to normal," for the public to go back to being content with receiving news and information in tidy bundles from a small number of profitable outlets. This dejected faction has faced one setback after another over the last decade. But recently it has been heartened by the rise of Apple's App Store model of media distribution -- which it imagines as the reconstitution of an old media order in the very heart of technological novelty.
Here, for instance, is media-biz consultant Ken Doctor -- writing, of course, on his own blog, about the allure to publishers of a "digital do-over":
Web browsing -- desktop and laptop -- has been a quicksand for publishers. Alluring, it drew them in and then got them stuck. Maybe, they hope, this next generation of reading devices -- tablets (and maybe mobile) can re-start the engines, putting them on the curve (if not ahead of it) with readers and advertisers, instead of being sent to the dustbin of history as dowdy, old, dying media.
As they rush to restart their engines inside Apple's shiny new vehicles, though, many publishers and journalists seem to be ignoring some crippling drawbacks the Apple world poses to their businesses and traditions.
Beginning with the iPhone and extending further now to the new iPad platform, Apple has built a regulated publishing environment in which every producer of media must win approval from Apple itself before being allowed to post its material in the App Store. In doing so, Apple is simply adopting a time-tested software distribution model, in which a technical platform's keeper vets products that add new functions or capabilities to make sure that they "play nicely" with consumers' existing hardware and software.
But media -- news and information and messages that people want to distribute widely or share with specific audiences -- has always been governed by different norms, socially and legally, than software tools. Apple seems to want both to build a vast digital newsstand and also to serve as editor, ratings agency and standards cop for every publication it hosts there. And so what once looked like vigilant stewardship of technical assets very quickly turns into something that looks like Big Brotherism, if not outright censorship.
Consider the bind Apple got itself into when it turned down a proposed app that political cartoonist Mark Fiore submitted to its review process in late 2009. Fiore, one of the pioneers of animated political cartooning online, found his work rejected -- apparently because it violated one of Apple's lawyerly guidelines that prohibited the ridicule of public figures. Gee, Fiore protested, "That's what I do. That's my life!" It wasn't until Fiore was awarded a Pulitzer Prize several months later that Apple reversed itself and deigned to allow his work to be distributed through the App Store.
You shouldn't have to win a Pulitzer to publish your cartoons. But it's hard to imagine any future for the App Store that does not involve an escalating number of Fiore fiascos. In the U.S. market, every socio-political flashpoint will spark another fight; internationally, Apple will have to make regular tough calls between its business interests and the efforts of political activists to use its platform. Meanwhile, Steve Jobs' promise to deliver "freedom from porn" means that Apple faces the daunting prospect of drawing its own line between the pornographic and the wholesome. There are already "pro-life" apps; will Apple also allow the distribution of information about birth control and abortion services? Why would a technology company ever put itself in the position of having to make such choices?
Back when Steve Jobs was planning the iPhone, he famously referred to the four major US cellphone carriers as "orifices" -- at once childishly mocking them and memorably characterizing their chokepoint placement in our communications networks. The App Store has turned Apple itself into an "orifice."
By making itself this sort of gatekeeper, Apple is asking for an endless stream of conflict and controversy. Yet that prospect hasn't cooled the ardor of those media executives who welcome the App Store approach, seeing it as a genie-bottling move that might allow them, once more, to package and sell media products the old way. Corporate media offerings have lost market value because of oversupply; so instead of greeting the App Store with a journalist's traditional hostility to information-control regimes, these executives welcome its limits as a means to reverse that oversupply and raise prices.
But this is their dream, not Apple's, and certainly not their customers'. If Apple cared about any of this it would never have placed a Web browser alongside the apps on the iPhone and iPad. But it did. That means all of Apple's devices serve as portals to the entire Web -- passable ones, in the case of the iPhone, and excellent ones, in the case of the iPad.
As long as that's the case, there can be no new monopoly, no reversal of the information over-supply. Instead, what we'll see is efforts to reproduce print design for the iPad, festooned with bits of pseudo-interactivity inherited from the CD-ROM era and then offered for sale at prices very few users will be willing to pay. These App Store-based publications will try to keep the user on their own turf, minimize links to other publishers' content, fail to connect readers with one another very well -- and thereby actually provide less value to many users than an equivalent website.
Meanwhile, starting a blog continues to require no one else's permission. No App Store approval process sits athwart your path. And anyone with a Web browser can read your work. App Store offerings can tap into some capabilities of Apple's device hardware that a browser can't. But today any decent browser taps us into a teeming multitude of other people's minds. This choice isn't even a close call.
Say Everything chronicles the rise of a form of media driven by the passion of individuals. Blogs emerged from the work of unconventional pioneers like Justin Hall, Dave Winer and Jorn Barger and spread quickly and widely because they provided an outlet for all manner of obsession. To this day, every successful blog is fueled by some blogger's unreasonable dedication to his or her subject. Such bloggers are true amateurs -- writing out of love, and typically for love.
The arrival of professional blogging, first at Nick Denton's Gawker Media and later from thousands of one-person shops and small startup companies, seemed to turn this model inside out. The writing was now for pay, and blogs began to measure their success using conventional media metrics -- page views and ad impressions. But the work was still driven by the intersection of the bloggers' and the readers' passion for the blog's topic. Denton described his method as "to take an obsession...and feed it."
Today a new approach to personal publishing online is emerging at companies like Demand Media, Associated Content and similar competitors. These outfits -- derisively dubbed "content farms" -- scour the data from search engines and their advertising adjuncts (chiefly Google), determine what small sum can be made from a quickly produced article on some topic, commission that article, and turn a profit by paying its writer less than that sum.
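The content-farm arithmetic amounts to a back-of-the-envelope calculation: commission the piece only when projected search-driven ad revenue exceeds the writer's fee. (All numbers and names below are illustrative assumptions, not any company's actual formula.)

```python
def should_commission(monthly_searches, click_share, revenue_per_click, writer_fee):
    """The content-farm model in miniature: an article is worth
    commissioning only if its projected ad revenue from search traffic
    beats what the writer is paid. All inputs are hypothetical estimates."""
    projected_revenue = monthly_searches * click_share * revenue_per_click
    return projected_revenue > writer_fee, projected_revenue

# e.g. 20,000 searches a month, 1% land on the article, $0.10 per ad click,
# and a writer paid a flat $15 for the piece.
ok, revenue = should_commission(20_000, 0.01, 0.10, writer_fee=15.00)
print(ok, round(revenue, 2))  # True 20.0
```

Note that nothing in the calculation asks whether anyone wants to read the article -- only whether the spread between search revenue and the writer's fee is positive.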
On one level, this harnessing of search-engine optimization (SEO) analysis represents a brilliant innovation. It instantly solves the business-model conundrum that has plagued online publishing from its infancy. Content-farm operators can justifiably crow: Who says you can't make money on content online?
But their idea also represents a sharp break from the past. By banishing personal taste and enthusiasm from the publishing equation, they abandon both the editorial tradition and the blogging tradition. The writers for these companies superficially resemble their blogging counterparts: They write at home, earn very little money for their work, and don't typically reach a very large audience.
But where bloggers have always written about what they know and love, content-farm authors are the ultimate pens for hire. Rather than feeding the obsessions of human readers, who can applaud and argue with them, they are producing their posts and pages with a different, impassive audience in mind: the Googlebot.
For the moment, this turns out to be good business. But it is a business that's entirely tied to Google's fortunes -- and thereby vulnerable to technical, economic and social changes in the Google ecosystem. Those who live by the algorithm can also die by it.
More importantly, there is little evidence that the material produced by the content farms holds any value outside of Google. These articles are good at generating click-throughs from search results. But, having clicked on the story's headline, is anyone ever happy to read the body? Does such writing ever inspire anyone to stay up late, laugh out loud, or feel for the author? Does anyone care enough about stories generated by SEO-driven businesses to save them, reread them, or share them?
The passion that drives a good blog is no guarantee of polished writing or accurate information. But it remains the best possible indicator we have of lasting value. Every entry on a decent blog comes with an implicit assurance: Someone cared about this enough to post it.
That's why so much personal blog content survives for so long, as bloggers painstakingly ferry their archives from one publishing platform to another, while the back catalogs of so many commercial publishers vanish without a trace. And that is why blogging will continue to thrive, as more ephemeral schemes for Web domination and profit-eking come and go.
-- Scott Rosenberg, May 2010