Chapter Nine
Journalists vs. Bloggers
In 2003, friends and strangers started telling John Markoff, the New York Times’s senior Silicon Valley correspondent, that he should start a blog. He’d been covering the computer industry since the early 1980s and couldn’t get that excited about this latest fad. “I’d seen things come and go in waves,” he says today. “The form of expression of participation on the Internet is always changing.” When Markoff’s friend Joi Ito, a social-media entrepreneur and a veteran blogger in Japan, urged him to take up blogging, he responded tartly, “Oh, I already have a blog. It’s at www.nytimes.com. Don’t you read it?”
Over the next few years, Markoff trotted out that line whenever he was told, “You oughta be in blogs!” The jibe never failed to elicit an enraged reaction from true believers in blogging. “I was just being a smart aleck,” Markoff admits now. But he was also expressing a genuine puzzlement about the medium that was common among those who made a living by writing. Blogging just didn’t seem like that big a deal. People had been publishing personal writing on the Web for years. Readers had been interacting with writers online for just as long. Magazines and newspapers had always had “reporter’s notebooks” and “diaries” and other regular features designed to present short, chatty items. There just wasn’t a lot about blogs that was new or worth getting excited about, was there?
This reasonable attitude failed to take one simple change into account: all these experiences and opportunities that were old-hat to professional writers had become available to virtually anyone. Getting your work published used to require jumping many hurdles; now it required next to nothing. This was what excited so many newcomers about blogging. But how could it excite those who already had access to a publishing outlet? Most journalists take their soapboxes for granted; that’s why the resentment of the soapbox-less multitude so often blindsides them.
Animus against the press had been a driving force in blogging from the start. Dave Winer wanted to seize his own story, and the technology industry’s, back from the media middlemen. Political bloggers of the left and right took fact-checking arms against a sea of error and bias. Everywhere you looked, the litany of grievances against the media was the same: You don’t get it. Or, you got it wrong. Or, you need to come clean. It was as though the pent-up pressure of a century’s worth of unpublished letters to the editor had suddenly exploded online in a fury of indignation and complaint.
But the media did not, at first, respond in kind. Newspapers and magazines published their first handful of articles about blogging (including Rebecca Mead’s New Yorker piece) in 1999 and 2000. Written mostly by Web-savvy technology reporters, these pieces noted the arrival of this new Web form with approval or, at worst, amusement. In the Chicago Tribune, for instance, Julia Keller wrote that weblogs were her pick for “most promising new genre,” and added that she’d found “precious little hype in the weblog world.” And in Slate, Rob Walker wrote that though he enjoyed weblogs, they hadn’t cut into his reading of “newspapers or ‘real’ news sites.” Some bloggers might rage against the media, but there was no reason for the pros to be alarmed — because, really, how many people actually read blogs? Or would ever?
There was one big hole in this reasoning: these same pros had already fallen hard for this new form themselves. If you peeked at the bookmark lists on the Web browsers in every newsroom in the United States, on any day from late 1999 on, you’d find that a rapidly growing percentage of the journalism profession shared an addiction to a weblog produced by Jim Romenesko called MediaNews. Founded in May 1999 as mediagossip.com and given its more sober name when it got acquired by the Poynter Institute’s website a few months later, “Romenesko,” as it was universally referred to, served as the journalism profession’s virtual water cooler. It was a simple weblog, collecting links each day to stories and columns about the media; Romenesko would also publish a smattering of his email, ranging from arguments about copyediting to disputes about business strategy. And as disaffected or mischievous employees started sending him copies of newsroom memos on sensitive subjects like budget cuts and layoffs, he would repost them verbatim. This made for irresistible reading for anyone in the business.
In other words, the same journalists who were reassuring themselves that they could ignore the anger among bloggers at their profession because “nobody reads blogs” were choosing to get a good chunk of their information about their own field from a blog. Many admitted, when pressed, that they were in fact obsessed with Romenesko — they’d keep returning to the site for updates at idle moments during their workday.
This was perfectly understandable. Journalists love to know things ahead of everyone else. Continually reloading Romenesko meant that the next time your editor mentioned some column to you, you could say, “Oh, yeah, I read that yesterday.” The trouble was that so many journalists failed to notice that their own behavior was a harbinger of a much wider shift that was under way in patterns of information consumption online. They couldn’t imagine Romenesko as a model for coverage of other fields. The site was too specialized. It was just their profession’s inside baseball. And it was all links to material on other sites. In the new vocabulary of online media, it was an “aggregator,” not a producer of “original content.” In the newsroom culture, aggregation was a low-status activity. Yes, a lot of journalists had bookmarked Romenesko. And getting your story mentioned or linked from his site was usually welcome. But few hankered after Jim Romenesko’s job.
It was only slowly that the attitude toward blogging among professional journalists, and the bent of the coverage they gave it, shifted from detached interest to a more active belittlement. This stance changed, not at all coincidentally, at a moment of unprecedented stress in the news industry. In the early 2000s, a long-term slow decline in circulation began to steepen, and profits that had been fattened by dotcom-bubble-fueled advertising began to vanish. At the same time, the professional self-respect of journalists was suffering a series of self-inflicted blows. Jayson Blair’s deceits discredited the august New York Times; angry readers blamed the entire Washington press corps for swallowing the Bush administration’s line on Saddam Hussein’s weapons of mass destruction and the urgent need to invade Iraq. Then, at the very moment when newspapers began to eye their websites as a source of revenue that might begin to replace lost profits from the declining “dead trees” edition, editors and reporters found their authority questioned and their work challenged by bloggers.
In their eyes, this self-righteous mob was kicking hardworking journalists when they were already down. Naturally, many reporters and editors responded with hostility. Their brush-offs, tinged with ridicule and contempt, kicked off an interminable debate — Journalists versus Bloggers — that still smolders in many quarters. There was an old saying that advised, “Never pick a fight with someone who buys ink by the barrel.” But this was something new: a fight between those who bought ink by the barrel and those who published without any ink at all. That meant there would, practically speaking, be no limits to how long the argument might last. In one lengthy public correspondence between blogger Jeff Jarvis and New York Times editor Bill Keller, Keller, apparently exasperated by Jarvis’s dogged, detailed replies, distilled his sense of frustration with the open-ended nature of the Journalists-versus-Bloggers dispute: “There seems to be no end to any argument in your world.”
Of course, editors are busy people. And one prerogative of an editor has always been the ability to declare, “This argument is at an end.” The job of a news editor is to say, “And now this.” The news cycle has turned! Time to move on. The trouble was, bloggers were under no obligation to pay attention to such marching orders. If you ran a blog that obsessively tracked the fluctuation of oil prices or the rise and fall of hemlines — or, for that matter, the arguments between bloggers and journalists — then nothing was going to stop you from continuing to post about it. You followed your own news cycle — just as Josh Marshall and his peers did in keeping the Trent Lott story alive after the newspapers and networks had left it behind. This characteristic of blogging became a profound irritant to editors who were accustomed to being able to set the agenda of public dialogue. The bloggers had said their piece, and the editors had responded; couldn’t everyone just move along now?
As the volleys flew back and forth between journalists and bloggers, many in the two groups came to treat each other as bitter enemies. But there were at least as many similarities between their respective identities and activities as there were differences. They were more like feuding cousins, squabbling over a family legacy: Who gets to call himself a journalist? Who should readers trust? Which group was meeting democracy’s need for reliable public information? If the Web was killing newspapers, could the new medium fill the void?
Journalism was only one of many kinds of writing that bloggers engaged in. But it was the one that attracted the most attention from professional journalists, who, like everyone, saw the opportunities of blogging through the lens of their own experience. When newspapers and TV started covering blogging and introduced it to the majority of the public that hadn’t already discovered it on their own, they inevitably shaped its story around their own concerns and fears.
The question “Is blogging journalism?” was not one typically raised from the blogger camp; it was posed overwhelmingly by journalists, who made it the theme of countless columns and the agenda of innumerable journalism-school panel discussions. The answer has always seemed simple and obvious: writing a blog neither qualified nor disqualified you for the “journalist” label. Blogging could be journalism anytime the person writing a blog chose to act like a journalist — recording and reacting to the events of the day, asking questions and seeking answers, checking facts and fixing errors. Similarly, journalists could become bloggers anytime they adopted the format of a blog as a vessel for their work.
These realities tended to be obscured by the tug of group loyalties. Blogger and journalist ought to have served as simple names for straightforward activities; too often they were used instead as badges of tribal fealty. On the surface, the argument was about accuracy, objectivity, and similar matters of practice; underneath, the conflict was over standing, rights, and respect — matters of identity. Once bloggers began to find a following as sources of information, journalists pointed at them and said, “Who appointed you?” Bloggers wheeled around at journalists and pointed right back, asking, “Well, who appointed you?”
Journalism has long straddled the line between craft and profession. On the hurly-burly frontier of the Colonial era and through the nineteenth century, anyone with a printing press — or in the employ of someone with one — could call himself a journalist. Over the last half-century, journalism schools have flourished in the United States, but newsrooms are still full of practitioners who learned on the job instead of in the classroom. The First Amendment’s protection of the press from government regulation has prevented the development of any comprehensive accreditation scheme for American journalists. What we have instead are ad-hoc rules for the rationing of scarce resources — primarily, access to powerful people and important events. For instance, political reporters who work for major media outfits get press passes to cover presidential trips and press conferences, while small-town papers and bloggers usually don’t. So journalists didn’t have much of an answer to “Who appointed you?” beyond “My boss.”
Most journalists view themselves as quick studies and generalists; part of the job’s appeal is that you’re always in a position to be learning something new. Good journalism required a variety of skills, from speedy research and source evaluation to interviewing technique and explanatory storytelling. Certainly, mastery of these skills often gave seasoned veterans and ace investigative reporters a valuable edge over less well-trained competitors. But these skills weren’t exactly particle physics. And now anyone who started a blog had at least the opportunity, if so inclined, to try to practice them.
That left the legal system in a quandary. It had always used a shortcut to deal with the troublesome problem of defining who was a journalist: if your paycheck came from a journalism organization, you were in. Now that shortcut was being undermined by people like Josh Wolf. Wolf was a San Francisco–based blogger who recorded videos of an anti-globalization protest in which a police officer was injured; he spent more than six months in jail in 2006 and 2007 for refusing to turn over his tapes to a grand jury investigation. The court decided Wolf was an activist, or, as a U.S. attorney put it, “simply a person with a video camera who happened to record some public events.” He maintained he was a journalist and as deserving of First Amendment protection as any other reporter seeking to protect documents or sources’ identities from a prying judiciary: “The notion that I needed to be under contract by a major media outlet is preposterous,” he told the San Francisco Chronicle. Organizations like the Reporters Committee for Freedom of the Press and the Society of Professional Journalists eventually rallied behind Wolf, and the Newspaper Guild gave him a press freedom award. (Ironically, he finally won his freedom through a compromise in which he presented the videos publicly on the Web rather than turning them over to the court.)
In another “who gets to be a journalist?” controversy, Apple sued a Harvard freshman named Nicholas Ciarelli in 2005, claiming that his blog — Think Secret — was publishing trade secrets. The legal questions in each case were different, but the root issue was the same: Now that anyone could commit acts of journalism, who could claim protection under laws that privileged a special class of journalists?
The rise of blogging exposed just how porous the line between “journalist” and “non-journalist” really was. Some observers began to use the term “citizen journalism” to describe the resulting profusion of new forms of amateur reporting and experiments in community-based information-gathering. The label was embraced by journalists and educators like Dan Gillmor and Jay Rosen, a professor at New York University, who defined it thus: “When the people formerly known as the audience employ the press tools they have in their possession to inform one another, that’s citizen journalism.” Walt Mossberg, the Wall Street Journal’s popular personal technology columnist, liked to make fun of citizen journalism by likening it to “citizen surgery,” and the joke always won him a laugh. But it was a poor analogy. It suggested that journalism was a field like medicine, one that required an elaborate training regime and rigorously policed professional standards. That has never been the case. And if it were, if our lives really did depend on the quality of journalists’ work, then in recent years much of the profession lay open to charges of malpractice.
@ @ @
Anger at the American media may have been stoked by a legion of bloggers, but they didn’t spark it. It has a long history, encompassing political partisans outraged by the perception of bias, businesspeople unhappy with the difficulty of obtaining coverage for their products, and the general population of all those interview subjects who once spoke to a reporter for an hour but came out with only a three-word quote in the final article (and felt they’d been misquoted). To some, journalists are elitists who look down on the rest of humanity; to others, they are morons who can’t get anything right.
Such gripes have always set a baseline for discontent with the media. But in the early 2000s, the level spiked sharply, thanks to the Iraq debate. The Bush administration made its case for an invasion of Iraq — introduced in the spring and summer of 2002, and intensified the following autumn and winter — by blanketing news channels with information about ties between Saddam Hussein and al-Qaeda and imminent threats to the American people from the Iraqi dictator’s secret stash of chemical, biological, and nuclear weapons. With just a little bit of effort you could find reasonably abundant counterevidence in the foreign press, in some small journals, and on specialty websites (like Josh Marshall’s). But with rare exceptions, big media outlets in the United States, and the journalists they employed, essentially repeated or reinforced the administration’s claims. Reports by the New York Times’s Judith Miller on Iraq’s efforts to acquire weapons of mass destruction, which later proved erroneous, attracted the most venom from online critics. But the failure was far more widespread.
The evaporation of the case for war, once American forces failed to turn up any of those WMDs, discredited not just the Bush administration but the media outlets that had served as its enablers. If journalists’ role in a democracy was to make sure that the public and its leaders worked from good information as they weighed life-and-death decisions, then the record of the Iraq war debate represented a system breakdown of imposing proportions. Here’s the entire story boiled down to a tongue-in-cheek instant-message exchange posted on the feminist blog Jezebel, between one of its bloggers, Megan Carpentier, and Wonkette founder Ana Marie Cox:
MEGAN: Like, oh my God, Ana, when are bloggers going to get ethics like real journalists?
ANA MARIE: As soon as we gain enough power to mislead a country into a stupid war.
The judgments American journalists passed on themselves, like this conclusion from a report on a 2008 panel at Harvard’s Nieman Foundation, were often as harsh as the attacks of the most livid bloggers: “Covering one of the most important stories of our time — the run-up to war in Iraq — our nation’s top reporters and editors blew it. Badly. Their credulous, stenographic recitation of the administration’s deeply flawed arguments for war made them de facto accomplices to a war undertaken on false pretenses.” A survey of Nieman Fellows gave their profession a grade of “D” for its performance on the Iraq story, in which, as one participant put it, “the national media simply reported what national leaders wanted to do without scrutiny or challenge.”
To be sure, the post-Iraq breakdown of trust in the media had a partisan element. Opponents of President Bush and his war naturally felt the betrayal the deepest. Many liberal Democrats already believed the press had helped deliver the White House to George W. Bush in 2000 by relentlessly focusing on trivial blemishes in Al Gore’s record; they blamed the media for abandoning its adversarial role. Their view was summarized by blogger Glenn Greenwald: “Propaganda thrives — predominates — in our democracy for many reasons, the principal reason being that we don’t have the sort of journalist class devoted to exposing it.”
But it wasn’t as if everyone on the right was happy with the press. If anything, conservatives’ belief in “liberal media bias” was more deeply entrenched than any equivalent belief on the other side of the political spectrum. For many conservatives, the watershed moment for giving up on the media came not during the Iraq debate, but later, during the 2004 election cycle, with the event they came to call “Rathergate.” On a 60 Minutes show that aired September 8, 2004, at the height of that year’s political campaign, CBS’s iconic anchorman, Dan Rather, presented a story about President George W. Bush’s time in the Texas Air National Guard in the early 1970s that seemed to confirm widely circulating rumors that Bush had neglected his service obligations. Rather’s story was based on a series of memos that CBS said had been written by Bush’s commander at the time.
Bloggers on the right, led by Powerline and Little Green Footballs, immediately painted the 60 Minutes story as a political hit piece from a network they’d long believed was a den of liberals. Hoping to douse the controversy, CBS posted scans of the disputed memos on its website. Within hours, a poster at the conservative Free Republic forum who called himself “Buckhead” declared they were forgeries: they were typed in a modern, proportionally spaced font and had other typographic flourishes atypical of early-1970s office typewriters. “Buckhead” was hardly a neutral observer, nor did he claim to be; he was an Atlanta lawyer named Harry MacDougald with a Republican Party pedigree who had helped draft an Arkansas Supreme Court petition for the disbarment of President Bill Clinton. His analysis leaped from Free Republic to the conservative blogs to the Drudge Report; Drudge served as a bridge to the rest of the media, which soon swarmed over the story. Probably the most arresting evidence was a simple animated image, created by Charles Johnson of the Little Green Footballs blog, that flipped back and forth between CBS’s original memo and a present-day retyping of it using Microsoft Word’s default settings. Johnson’s image made a dramatic case for Buckhead’s speculation that the memo was simply a modern document that had been run through a photocopier multiple times to blur it.
CBS responded the way newsrooms under attack so often have: it stood by its story. The memos had been “thoroughly investigated by independent experts.” The venerable network, proud inheritor of the mantle of Edward R. Murrow and Walter Cronkite, didn’t need to justify itself to some mob on the Internet waving pitchforks. Former CBS executive Jonathan Klein described the flap in terms that distilled the old guard’s disbelief at the upstarts’ challenge: “You couldn’t have a starker contrast between the multiple layers of checks and balances and a guy sitting in his living room in his pajamas, writing what he thinks.”
The trouble was that the guys in pajamas were making more sense than CBS was. CBS stonewalled at first, working from the playbook of a politician engulfed by scandal. Noticeably more determined to protect its reputation than to ascertain the truth, the network resisted suggestions that it launch its own investigation into the documents. In a follow-up broadcast a week after the first questions arose, Rather said, “If the documents are not what we were led to believe, I’d like to break that story.” But it was far too late for that — the Web had already broken it wide open. As more details emerged, a consensus formed, not just on the blogs but among the network’s journalistic peers, that CBS had blown it: The experts CBS relied on hadn’t solidly authenticated the memos, and the 60 Minutes team had made several leaps of trust in sources that could not be justified in retrospect. The rest of Rather’s story about George W. Bush’s service might well have been accurate, for all we knew; but in recklessly promoting the suspect documents, CBS had damaged its reputation and discredited the entire piece.
One step behind its critics, CBS eventually reached the same conclusion. On September 20, CBS News president Andrew Heyward issued a statement admitting that “CBS News cannot prove that the documents are authentic. . . . We should not have used them.” In January 2005, after an independent panel delivered its report, CBS fired the report’s producer, Mary Mapes, and asked several others involved in the story to resign. Rather himself stepped down from the anchor chair in March 2005 — a year earlier than he’d once planned, and under a considerable cloud. In 2007 he sued his former network for $70 million for harming his reputation in the affair.
To this day, Rather and Mapes both maintain that nobody has proven their documents were forgeries. But the vaunted checks-and-balances of the CBS newsroom were supposed to meet a higher standard than that. Surely the story ought to have been nailed down before it aired. Wasn’t that what separated the journalists from the bloggers? In defending their story by insisting it has never been 100 percent debunked, Rather and Mapes sound nothing like the model newsroom pro — that eternal skeptic who lives by the motto “If your mother tells you she loves you, check it out.” Instead they resemble the caricature of the irresponsible basement-dwelling blogger — who never lets the facts interfere with a dearly held belief.
Anyone can make a factual error or misfire on a big story. But CBS’s missteps with the Bush documents exposed deeper flaws in its internal processes, the self-regulatory checks-and-balances it proudly brandished as evidence of its superiority to the pajama-clad Web horde. The network had become impermeable to legitimate criticism. When credible external voices raised reasonable questions about its story, a healthy news organization would have evaluated them responsibly. CBS reacted instead with what one of its former executives called “insufferable hubris and self-righteousness.”
Just as the Iraq story confirmed liberals in their belief that the conservative White House held the media in its thrall, the Rather story confirmed conservatives in their belief that the press promoted a liberal agenda. The newsroom was taking flak from both ends of the political spectrum. That’s a situation that actually reassures many newsroom veterans. Everybody’s mad at me, goes the thinking, so I must be doing something right! But there’s always another plausible explanation: you could be wrong all around.
It was painful for dedicated journalists to contemplate this possibility, but the more you looked at the field in the middle of the decade of the 2000s, the less confidently you could dismiss it. No matter what your beat was, if you were writing regularly about any topic, you now had to contend with a welter of competing voices on the Web. Some were ill-informed and unlikely to threaten a professional journalist’s standing. But many others were experts or self-taught obsessives who were willing to post about their fields around the clock and in far greater depth than any commercial publication would ever provide. It wasn’t just fan behavior — Harry Potter devotion or American Idol-ization — that inspired this sort of blog. You could find heavy hitters in the legal and health professions, technologists and economists, linguists and classical-music archivists, butchers and bakers and somewhere, no doubt, candlestick makers — initiates in every conceivable realm of arcana, publicly displaying their expertise, many hoping to boost their professional reputation or sell stuff, but many others motivated simply by the sheer delight of it. Some of these experts were ensconced in their fields, securely employed in academia or private industry (you couldn’t get much more credentialed than U.S. Appeals Court judge Richard Posner, who blogs voluminously); others were self-appointed. In either case, all that mattered was the expertise revealed in their writing. The public nature of the work meant that their credentials were earned and re-earned, post by post.
Many journalists, content in the penumbra of respect and entrée conferred by the institutions that employed them, had complacently accepted an ex officio basis for their authority. Now they faced discomforting challenges to that authority in a new environment where who you worked for mattered less than how good you were, and how good you were had become a question anyone could argue. “A passionate amateur almost always beats a bored professional,” wrote Chris Anderson (the professional who edited Wired magazine). Here and there, of course, you could still find passionate professionals, and they were priceless. But the bored pros found themselves outclassed and outgunned as never before.
For instance, if you were an American voter mildly interested in the 2008 election, you could follow the ups and downs of the polls the way you always had, in the brief stories your newspapers and magazines provided and the headlines you heard on TV and radio. But if you were an American voter who was frantically, unreasonably interested in the election and its day-to-day, state-by-state poll results, the Web now fed you as much detail as you could stand. This process had started in 2000, during the overtime-innings election that birthed Talking Points Memo, and gained real traction in 2004 with the rise of a site called Electoral-vote.com. Produced by an expatriate American computer scientist living in the Netherlands, Electoral-vote.com aggregated poll results to provide daily tracking of each state’s leaning in the closely fought Bush-Kerry contest. Similarly, in September 2004, a blog named Mystery Pollster, by veteran pollster Mark Blumenthal, began providing in-depth analysis of poll results.
In 2008 the evolutionary process of this sort of site took another giant leap with the arrival of a blog named Fivethirtyeight.com (after the total number of U.S. electoral votes). Fivethirtyeight was the brainchild of Nate Silver — a sports-statistics geek who had helped devise a complex, scenario-based modeling approach to predicting baseball season outcomes, and who decided, early in 2008, to apply the same methodology to the U.S. election. Fivethirtyeight provided an imposing volume of data and an impressive range of poll-related commentary, updated several times daily. Silver was always transparent about his own methodology and ever ready to point out the limitations and flaws in other people’s numbers. If you wanted a quick-immersion education in the vagaries of political polling, here it was, for free, and entertaining to boot. Anyone who was reading Fivethirtyeight.com daily during the summer and fall of 2008 was never going to be able to read a daily newspaper’s poll stories the same way again. Something similar was happening across the board: in field after field, the new brigades of blog-based specialists were offering devastating, and in many cases unimpeachable, critiques of mainstream media coverage, exposing it as at best shallow and at worst entirely unreliable.
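Mechanically, what these sites did was turn a noisy pile of state polls into a single headline probability. Here is a minimal sketch of the idea in Python; the states, poll figures, plain averaging, and normal-error model are all invented assumptions for illustration, not a reconstruction of Silver’s actual methodology.

```python
# Toy poll aggregation and scenario simulation, in the spirit of
# Electoral-vote.com and Fivethirtyeight. All numbers are made up;
# real models also weight polls by recency, sample size, and
# pollster track record, which this sketch omits.
import random

# state -> (electoral votes, recent poll margins for candidate A,
# in percentage points; positive numbers mean A leads)
polls = {
    "Ohio":         (20, [1.0, -0.5, 2.0]),
    "Florida":      (27, [-1.5, -2.0, 0.5]),
    "Pennsylvania": (21, [3.0, 4.5, 2.5]),
}

def average_margin(margins):
    """Naive aggregation: a plain mean of the recent poll margins."""
    return sum(margins) / len(margins)

def simulate_election(polls, polling_error=3.0):
    """One scenario: perturb each state's polling average by a random
    error and award its electoral votes to whoever comes out ahead."""
    ev_for_a = 0
    for ev, margins in polls.values():
        if random.gauss(average_margin(margins), polling_error) > 0:
            ev_for_a += ev
    return ev_for_a

def win_probability(polls, needed=35, trials=10000):
    """Share of scenarios in which candidate A reaches the winning
    threshold (35 of the 68 electoral votes in this toy map)."""
    wins = sum(simulate_election(polls) >= needed for _ in range(trials))
    return wins / trials

print(f"Candidate A win probability: {win_probability(polls):.1%}")
```

Running thousands of scenarios rather than eyeballing a single average is the heart of the approach: it converts a stack of noisy state-level snapshots into one probability, which is exactly the kind of number daily poll stories rarely provided.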
There was nothing new about the “Oh God, they got it wrong again” experiences shared by so many subjects of newspaper articles or journalists’ interviews. Dave Winer, for one, had been complaining about them for years; as he put it in 2004: “No matter how diligent Our Good Reporter is, something is lost in translation. This is observable.” But until recently, these instances of garbled information or distorted messages took place in isolation: each of us knew of the particulars of shoddy coverage only in our own fields of expertise. Now it was possible to connect all these dots — to piece these dismaying fragments into a comprehensive mural of media mediocrity and error.
Since many bloggers provided both an alternative to the work of journalists and a channel for criticism of that work, it was little wonder that blogs acquired an unsavory reputation among media pros — becoming “synonymous with damp mold and scurrilous invective,” in the waggish words of Vanity Fair critic James Wolcott, a dedicated blogger himself since 2004. Journalists found many reasons to detest bloggers, but their most consistently irksome trait was their relentlessness. As the Times’s Keller had said, they just didn’t know when to stop. The world of the newsroom is a world of constrained resources — there are only so many reporters on staff, so many hours in the day, so many column inches to fill — and editors spend their workdays making choices within those limits. But bloggers lived outside these constraints. They seemed to have all the time in the world to pursue their obsessions. They provided their readers with a feast of minutiae no traditional publication would ever dream of delivering. And they were just beginning to disprove the charge that bloggers only offered opinion or commentary and never pounded the pavement to provide original reporting.
When Scooter Libby, Vice President Dick Cheney’s lieutenant, went on trial in January 2007 for his role in the leak of classified information about CIA operative Valerie Plame, among the reporters who descended on the Washington courthouse was a small swarm of bloggers. The controversy was one that liberal bloggers had helped fan in the first place, and their readers were hungry for more than the occasional terse summaries they were going to get from newspapers and broadcast outlets. So Firedoglake, a liberal blog operated by former Hollywood producer Jane Hamsher, sent a half-dozen volunteers to the trial and tag-teamed their coverage. The result was something new: a national event where the best on-the-scene reporting came not from professional press articles but from blog posts by volunteers. As Jay Rosen wrote: “If you wanted to keep up with the trial, and needed something approaching a live transcript, with analytical nuance, legal expertise, courthouse color, and recognizably human voices, Firedoglake was your best bet.” A New York Times piece on the blog’s trial coverage reported what attentive readers already could see: that the mainstream journalists covering the trial were depending on Firedoglake’s reports, too.
The Libby trial was, for the moment, an exceptional story. But there was a growing list of topics about which Web-based amateurs were providing more-comprehensive information than the traditional pros. Increasingly, journalists found that when they strayed onto turf the blogosphere held dear, they could no longer get away with the sort of hasty summarizing and overgeneralization that too many of them had long practiced.
Joe Klein, Time’s veteran political analyst, discovered this in November 2007, when he wrote some careless sentences about Congress’s deliberations on a bill designed to legalize the Bush administration’s secret wiretaps. Klein erroneously reported details of the provisions of a Democratic Party–sponsored version of the bill placing the wiretap program under the supervision of the FISA (Foreign Intelligence Surveillance Act) courts. When blogger Glenn Greenwald — a lawyer with a hearty appetite for rhetorical trench warfare — began pointing these mistakes out, Klein responded with a series of increasingly irritated posts on his own blog at Time’s website. It’s clear from these pieces that Klein saw the blogger as a painful distraction — some single-minded animal that had sunk its teeth into his ankles and would not let go. Couldn’t Greenwald see he had other stories to write, press conferences to attend, speeches to cover? The political beat couldn’t be all FISA, all the time! How could anyone expect him to master every detail of every issue he covered?
Klein eventually admitted, “I made a mistake by not reporting this more thoroughly.” Then he added a sentence that seemed to encapsulate the capitulation of the weary professional before the onslaught of the tireless blogging masses: “I have neither the time nor legal background to figure out who’s right.” Later he amended the line, adding the words “about this minor detail of a bill that will never find its way out of the Congress.” Klein insisted that his errors didn’t matter since the Democrats’ bill had no future — but they plainly mattered to the throng of commenters on his and Greenwald’s posts. And his dismaying admission of journalistic impotence seemed to cede to his opponents the very ground of credibility on which his profession was taking its stand.
@ @ @
Like many of his colleagues and peers, Klein found himself writing for a blog in the latter part of the 2000s because his employer wanted to boost traffic to its website. Blogs, with their frequent updates and comments and links, had proven more effective at attracting online readers and earning search-engine rank than the no-value-added repurposing of articles from print. As one part of a “throw everything at the wall and see what sticks” frenzy of Web experimentation, newspapers belatedly began handing out blogs to the newsroom staff. In 2003 there was only one New York Times writer (op-ed columnist Nicholas Kristof) doing anything that might loosely be considered blogging; by 2008 the Times published roughly seventy blogs. (Ironically, John Markoff found himself drafted by the Times to contribute to a group technology blog. His joke had come true; he really was blogging at nytimes.com.)
For these journalists, blogging was typically an added duty — something heaped on top of an already groaning pile of work. They were up against competition from self-motivated bloggers whose daunting productivity was fueled by personal obsession rather than institutional edict. Some of the pros embraced their new blog work, reveling in the freedom, the informality, and the lighter or absent editing. But it was little wonder that many others resented blogging: it was making everything about their work harder.
The publishers’ new embrace of blogging was less a matter of enthusiasm than of desperation. The decade in which blogging emerged was one in which the entire offline media industry in the United States, and particularly the newspaper business, had gone into a tailspin. Newspaper publishing had been in gentle decline for several decades; circulation numbers were drifting slowly down well before the Web’s arrival. Profits fluctuated but mostly depended on the local monopolies that most papers had in their cities. Newspapers made money by bundling a variety of disparate forms of information together — news reports and opinion columns, arts reviews and movie listings, stock prices and sports scores — and then selling that bundle to subscribers and advertisers. The Web dissolved the threads that held this bundle together. The stock prices and scores and listings no longer needed to fill pages with tiny type; the Web delivered this information in a timelier and more convenient way than paper ever could. Classified ads, too, worked infinitely more efficiently online than in print — and would end up costing advertisers much less, too. As the downhill slide of the industry gathered speed, insiders could no longer pretend that this was anything but a historic, structural transition in the news media. Newspapers were not going to vanish overnight, but their lifespans weren’t going to extend beyond those of the aging people who still read them out of habit.
It was easy to see the inevitability of this from the moment the Web went mainstream in the mid-nineties. (It was a big factor in my own decision in 1995 to forsake a job at the San Francisco Examiner for the Web, in the face of incredulous colleagues who clung to their union jobs as if they were tenured chairs.) When Intel chairman Andy Grove told the American Society of Newspaper Editors in April 1999 that they had three years to “adapt or die,” the editors’ eyes “rolled in ‘not this again’ circles,” according to a New York Times account. But as the decade wore on, it was only Grove’s number that seemed unduly pessimistic, not his general prediction.
For newspaper publishers, the Web was an attractive new distribution medium in some respects: it would let them reduce costs on trucks and “dead tree” materials. Their hope was that the rising line on their graphs that represented online revenue would cross the falling line representing print revenue before things got too dire — and before they had to scuttle so much of their operation that it couldn’t survive. But media companies were having a hard time winning their share of the booming Web advertising market; the online environment favored Google’s search-driven advertising over the sort of traditional display advertising that had always fueled media profits. So far, it had proven impossible to match the rich monopoly income of the old publishing model on the still-evolving and far more challenging and competitive Web. Maybe, many analysts (and investors in newspaper stocks) reluctantly began to conclude, that would never happen.
As changes in the news business accelerated, newsroom veterans and their business counterparts kept finding themselves at least a step behind. Newspaper executives realized they had let the portals — big, all-purpose Web services like America Online and Yahoo and MSN — usurp their role as gateways to useful information. The new-media outfits had stolen a march in the 1990s, while their old-media counterparts were still arguing about whether to go online. By 2000, it was already too late to win that fight. So when the dotcom bubble collapsed in 2000–2001, some news organizations tried to impose subscription fees on their Web users. Insiders had long been fretting that newspapers had “given away the store” on the Web (the phrase was used by Los Angeles Times business columnist David Lazarus, among many others) by failing to charge for access to their websites. It was “self-inflicted cannibalism,” in the words of New York Times veteran John Darnton. Now, the argument went, with the Web industry on the ropes and online advertising temporarily depressed, it was time to lower the toll gates and extract a price for the valuable information journalists collected. But with rare exceptions, efforts to require payment for newspaper content on the Web failed, reducing traffic and ad revenue without attracting much new revenue in subscription fees. There was no shortage of information and diversion online; readers simply went elsewhere. They divided their loyalty among an ever-widening set of options, spreading their eyeball time indiscriminately among sites produced by paid professionals and sites created out of obsession or love.
Many journalists eyed the new army of unpaid or underpaid competitors — the contributors at Huffington Post, the citizen journalists, the bloggers, and all the other providers of what media companies called “user-generated content” — with disbelief. Their resentment now went beyond simple puzzlement over why the blockheads were writing for free. Their volunteer rivals, they felt, were more than just suckers; they were tantamount to scabs, undermining secure high-paying jobs by choosing to labor under unfair conditions. (“‘Citizen journalist’ is just the pretty new construct for ‘unpaid freelancer,’” New Orleans–based journalist Kevin Allman complained in a 2007 letter to Romenesko.)
The labor-movement analogy was apt. The entire newspaper industry was entering the sort of vast technology transition that creates genuine labor crises. (While newspapers were first, magazine publishers and broadcast-license owners understood that they were probably next in line.) As online information consumption trended upward, newspaper circulation declines grew steeper, and waves of layoffs, buyouts, and consolidation convulsed the industry. Proud journalists found themselves cast in the role of the handloom weavers of nineteenth-century Britain, rendered unemployable by infernal new machines. Or maybe, as Clay Shirky wrote, they were like monastic scribes idled by the arrival of Gutenberg’s press.
All these comparisons cast them as victims, and that was exactly how many newspaper employees felt. Thanks to the newsroom ethic of church and state separation — in which the work of newsgathering was kept strictly cordoned off from the business that paid for it — journalists had little professional tradition of scrappy entrepreneurialism. Here and there, newsroom exiles experimented with “hyper-local” blogging, demonstrating how small Web outposts might begin to replace the community coverage so many papers were abandoning. But mostly the newsroom veterans disavowed any responsibility for their economic fate, and sat on the sidelines of the dismemberment of their own industry.
@ @ @
As the decade advanced, the press found its credibility and popularity among the public — its customers — at a low ebb, its financial prospects clouded, and its leaders and employees dispirited. “The news business — our crowd of overexcited people narrating events as they happen — is going out of business,” Michael Wolff hyperventilated in Vanity Fair in 2008. This environment of perpetual crisis proved hospitable terrain for the flourishing of a new subspecies of journalist: the newsroom curmudgeon.
The curmudgeons were typically, but not exclusively, newsroom veterans certain that the Web was destroying their profession but unable to propose any practical program to stem the tide of bloggers. They offered up familiar debating points from a now decade-old argument about the nature of news online as if such observations were fresh and urgent. Blogs, the curmudgeons protested, lacked rigorous standards. NPR’s distinguished correspondent, Daniel Schorr, cast his lot with them when he wrote, “A person like me who believes in the tradition of a discipline in journalism can only rue the day we’ve arrived at where we don’t need discipline or anything. All you need is a keyboard.” Blogging would debase any real journalist’s career. Pete Hamill told his journalism students at NYU, “Don’t waste your time with blogs.” Bloggers — unlike the professionals, with their legalistic newsroom codes — had no ethics; you couldn’t trust them. The attributes that made blogs such an appealing alternative to many readers, their informality and personality, made them anathema to the curmudgeons, who saw these traits as betrayals of their ideal of neutrality.
The curmudgeons’ arguments all shared a starting point in the tenets of professional journalism as practiced in mid-twentieth-century America: political impartiality; on-the-one-hand-but-on-the-other “balance”; impersonal voice. The whole bundle of “objective” attributes — what Jay Rosen called “the view from nowhere” — was etched into the journalism school curriculum. These values were held out as timeless verities, but in fact they were of relatively recent vintage. They had been shaped by the specific business needs of the publishing and broadcasting industries. As they consolidated markets and sought to sell advertising that might reach vast agglomerations of consumers, the peddlers of news found they couldn’t afford to alienate partisan populations of any stripe; neutrality was a prerequisite for profits. Yet vibrant journalism had existed without the benefit of such values — for example, in the pamphlet culture of late seventeenth-century and eighteenth-century Britain and Colonial-era America, or in the raucous partisan newspaper competition of the late nineteenth- and early-twentieth-century urban United States. And vibrant journalism could plausibly survive their demise.
The brigade of curmudgeons behaved like defenders of a faith. They were true believers in belieflessness. And they always won a lot of head-nods wherever journalists gathered. They found support outside their profession, too: as one interviewee put it, in a California public-radio report on the decline of newspapers, “Any idiot with a laptop can post his ramblings. And there’s no editor. That’s the beauty of a newspaper — there’s an editor!” But the curmudgeons often got their facts wrong, which tended to take the air out of their arguments.
In one widely followed exchange, a journalism professor and Pulitzer Prize winner named Michael Skube took the blogosphere to task in an op-ed piece in the Los Angeles Times. What our society desperately needs from journalists, he wrote, is “the patient sifting of fact, the acknowledgment that assertion is not evidence and, as the best writers understand, the depiction of real life”; what we get from bloggers, instead, is endless opinion and commentary. As evidence, Skube named a number of prominent blogs, including Josh Marshall’s Talking Points Memo.
As it happened, when Skube published his piece in August 2007, Attorney General Alberto Gonzales was preparing his resignation in the face of a scandal involving the politicization of the U.S. attorneys’ offices — a story that had been reported first, and most thoroughly, on Marshall’s blog, which had pursued it by “patiently sifting facts” that the political journalism establishment had chosen to disregard. So for Skube to cite Talking Points Memo was at best a poor choice, and at worst an admission of deep ignorance about the subject on which he was pontificating.
Peeved at seeing his blog knocked for lacking what was in fact one of its proudest attributes, Marshall contacted Skube. First, as Marshall reported the exchange, Skube made the perplexing claim that he had not named Marshall in the piece. Then he admitted he had — but explained that it wasn’t his problem, because he hadn’t written that part of his article: the list of bloggers had been inserted by an editor! “And this is from someone who teaches journalism?” Marshall wrote. “I grant you that the blogosphere needs better bloggers. But, as usual, the need for better critics seems even more acute.”
Whatever intelligent points the curmudgeons wished to make, they had a remarkable propensity for undercutting themselves. In April 2008, Buzz Bissinger, a veteran sportswriter, appeared on Bob Costas’s HBO talk show opposite Will Leitch, an editor of the Gawker Media sports blog Deadspin. As Costas amiably interviewed Leitch, Bissinger broke in and snapped, “I think you’re full of shit.” He proceeded to denounce bloggers in blistering terms for their “despicable” writing quality and “abusive” tone. “I think that blogs are dedicated to cruelty, they’re dedicated to journalistic dishonesty, they’re dedicated to speed,” he said. “It really pisses the shit out of me.” Bissinger later apologized — apparently figuring out, after the excitement had ended, that the crudity of his own bluster had overshadowed any point he hoped to make about nastiness among bloggers.
The curmudgeons’ rhetorical volleys achieved little, partly because of their own misfires, but even more because there was a deep confusion at the heart of their beliefs. These journalists were looking, as those who feel victimized will do, for someone to blame for their displacement. In the nineties, their predecessors had blamed AOL or Yahoo for all that was going wrong in their industry; now they yearned to blame bloggers. But the curmudgeons found it difficult to establish any cause-and-effect relationship between the rise of blogging and the decline of newspapers. In fact, although these two trends emerged simultaneously, each was proceeding under its own steam. If the curmudgeons could somehow have waved a wand and made all the blogs go away, they might have felt better, but their employers would still have been in trouble.
“Is blogging journalism?” was no longer an active controversy for the curmudgeons; to them, the answer was obviously no. The argument they now favored was “Blogging can never replace real journalism!” That claim stood at the heart of countless thumbsucking columns by journalism pundits who, in the latter part of the 2000s, seemed to wake up suddenly to the realization that their house was on fire. A 2006 New Yorker piece by Columbia Journalism School dean Nicholas Lemann was typical in its conclusion: “None of [Internet journalism] yet rises to the level of a journalistic culture rich enough to compete in a serious way with the old media — to function as a replacement rather than an addendum.” That was a defensible position; it was also a straw man. Lemann failed to acknowledge that the overwhelming preponderance of bloggers neither desired to replace journalists nor claimed they could serve as an acceptable substitute. Most bloggers had a visceral understanding that they participated in a complex informational ecosystem, in which their relationship with the traditional media was essentially symbiotic. Maybe they were like the profusion of forest-floor flora, dependent on the environment shaped by the big trees, but also fighting for their own patches of light. Or maybe, as the big institutions began failing, they were like the termites feeding on the fallen media tree trunks.
Ultimately, “can bloggers replace journalists?” was less a question about the unlikely prospect of a wholesale changing of the media guard than a cry from the beleaguered journalists’ hearts. Insecure and buffeted by forces outside their control, they hoped to hear the reassuring answer that they were, in fact, irreplaceable. And yet bloggers, for the most part, refused to provide that solace. They might not be trying to steal journalists’ jobs, but they weren’t going to try to save them, either. The bloggers’ indifference to the unfolding turmoil in the journalism business, their unwillingness to view it as a tragedy or civic disaster, appalled the curmudgeons and many of their colleagues. Didn’t the bloggers understand what was at risk? Didn’t they see the looming Dark Age that would envelop American democracy and culture if the institutions of media collapsed?
Apparently not. It was left to defenders of the journalistic tradition themselves to sketch out the dire consequences of the media meltdown.
@ @ @
Traditionalists offered three arguments to champion journalism’s good old ways. First, they asked, without media companies subsidizing it, who would undertake the expensive and politically perilous work of investigative journalism? Next they suggested that the proliferation of online news sources and the multiplicity of partisan blogospheres meant the triumph of the “echo chamber,” in which we only learn what we already know about, and only hear those with whom we already agree. Finally, they argued, the collapse of big media would cause the very unity of our culture to disintegrate, leaving us without a central narrative for our national life. Each of these arguments, unlike the most nostalgic carpings of the curmudgeons, was serious and substantive. And each provoked an important debate online.
The future of investigative journalism was indeed in some doubt. That had been true even before the accelerated collapse of the newspaper industry. Investigative reporters typically worked on extended assignment with little guarantee of results. At best they exposed wrongdoing and won prizes; at worst they had nothing to show for months of labor. In civic terms, their work was a high calling, the raison d’être for the institution of independent journalism in a democracy; but in business terms, it was costly to the payroll as well as troublesome for the legal budget, and in hard times it was usually the first thing to get the ax.
So it wasn’t as if the old media model had the problem of supporting investigative journalism licked. But could the low- or no-profit online world support it at all? Probably not in the traditional manner, with teams of high-paid veteran reporters free to follow their noses for months at a time in search of dirt. That model might now only be possible in the nonprofit sector — where ProPublica, a new outfit led largely by exiles from the Wall Street Journal, had pitched its tent. Of course, there were other ways to pursue original investigative journalism that might be more suited to the online world. You could see some of them in embryonic form at Talking Points Memo and other lean, enterprising blogs that had begun to find investigative niches for themselves.
The fate of investigative reporting, no matter how vital to the health of the republic, was largely a concern for journalism insiders. The fear that online news consumption might lock us into an echo chamber was an issue with a wider constituency. Critics of the Web like the legal scholar Cass Sunstein and the sociologist Robert Putnam (Bowling Alone) had long held that isolation and closed minds were the inevitable byproducts of online communication. News consumption was moving from a passive mode, in which editors assembled a batch of newsworthy stories at regular intervals for us, toward an active pursuit, in which we followed only the links that interested us and assembled our own patchwork quilt of daily news from multiple sources. This led one set of pessimists to worry that we’d gradually lose the experience, much cherished by newspaper lovers, of serendipitously stumbling upon little stories that we didn’t know we wanted to read. This initial line of argument quickly dissolved as these critics spent more time online themselves and discovered that the Web, far from banishing serendipity, actually generated an oversupply of fascinating novelties and distractions.
Another, more observant set of pessimists was less easily answered: they suggested that the Web’s design meant that it would foster self-reinforcing feedback loops among like-minded people, who would dedicate increasing amounts of their news consumption to reinforcing their existing preconceptions. Call it homophily, the sociologist’s term for a natural preference for people who are like ourselves; or call it confirmation bias, the psychologist’s term for our tendency to trust information that confirms what we already believe. Whatever you called it, it looked dangerous. Early Web enthusiasts promoted the concept of the Daily Me — the online news source that you could customize to deliver only the information you’d expressed interest in. But the critics recoiled in horror: to them, the Daily Me looked like nothing more than a machine for constructing personal echo chambers. If we each assembled our own news diet instead of consuming what editors chose for us, they feared, we would never eat our informational spinach. We would close our minds to news that contradicted our preconceptions and become isolated in our prejudices.
Before 2000, there was little concrete evidence to suggest that any actual echo chambers could be found on the Web, outside of a handful of partisan discussion forums like FreeRepublic.com. That changed with the rise of political blogging, which gave explicit shape to separate cross-linked worlds on the left and right — visible in the roster of each side’s blogrolls, which displayed little overlap. Fans of Daily Kos or Atrios were unlikely to be fans of Power Line or Instapundit, and vice versa. But it was still hard to say that a devotion to either was actually cutting you off from your opponents: bloggers on the left or right still linked across the partisan divide, if only to argue or mock. And even if the Web had intensified the echo in some precincts, it had streamlined the satisfaction of heterodox curiosities, too. It was much easier for a liberal to follow an occasional link to a columnist at The National Review than to make the effort to purchase a copy of the magazine — and you didn’t have to feel bad about paying cash for something you disagreed with.
Scholars began running numbers on the blogosphere to try to measure whether the echo-chamber effect really existed. In 2005, one widely discussed study — a paper titled “Divided They Blog,” by Lada Adamic and Natalie Glance — concluded, unsurprisingly, that there were indeed two well-defined blogospheres, self-sorted ideologically and interacting far more heavily with their own side than across the partisan divide. Adamic and Glance charted the divide via colorful red-versus-blue network diagrams that looked like warring galactic clusters, with each side extending only a handful of tendrils across the void to parley. But which was more notable and new: the existence of the two camps, or the infrequently exercised opportunity for dialogue between them? Profound partisan divides existed long before the Web and blogs. How much did we communicate across them back then? Did bloggers’ efforts to reach out today constitute an increase or a decrease in dialogue between ideological opponents? “Divided They Blog” stopped before asking these questions.
Of course we gravitate toward writers we agree with, even as we understand we might learn more by grappling with those we disagree with; this is human nature. According to the echo-chamber argument, the advent of blogging enabled us to overindulge this preference, speeding us down an accelerating spiral into ideological isolation. But a 2007 study of “cross-ideological discussions among political bloggers,” by Eszter Hargittai, Jason Gallo, and Matthew Kane, found no evidence to support this picture. Hargittai and her colleagues, like Adamic and Glance, confirmed that bloggers flocked (and linked) along partisan lines — but also found that “bloggers across the political spectrum address each other’s writing substantively, both in agreement and disagreement.” Most notably, they discovered that the camps’ level of isolation from each other did not intensify over time, as the echo-chamber alarmists suggested it would. Bloggers were divided, and their cross-camp dialogue was limited, but their minds did not seem to grow more tightly closed.
As with so many arguments about the media, the echo-chamber advocates often seemed overly eager to scapegoat technology for problems that had deeper roots. Yes, American politics had grown bitterly polarized in the 2000s. But were the angry arguments on the Web the cause of those divisions? More likely, they simply mirrored profound disagreements among the American people about the impeachment of President Clinton, the contested outcome of the 2000 election, the Bush administration’s tactics in its war on terror, and the invasion of Iraq. What kind of media environment that accurately represented the political psyche of the American population would not bristle with rancor under the pressure of such events?
In 2004, when Howard Dean’s Web-driven campaign for the Democratic presidential nomination collapsed, critics of the Internet couldn’t contain their glee. This is what happens, they crowed, when you lock yourselves in an echo chamber with a bunch of hysterical bloggers! The Deaniacs had fed one another’s delusions in a feedback loop of unreality, and now they were paying the price. The picture wasn’t entirely inaccurate; on the other hand, without the Web, this obscure Vermont governor would never have had any credible shot at the early primary victories that ultimately eluded him. Meanwhile, four years later, Barack Obama’s outsider campaign followed a similar playbook, honed by experience. This time, energies first unleashed online helped to raise a record war chest, inspire masses of new voters, and propel a long-shot candidate — with an unusual name and dark skin, no less — into the White House. Maybe echo chambers could win elections after all! More likely, the whole concept of the echo chamber offered scant insight into the subtle and diverse dynamics of political culture online.
The final argument about the pernicious impact of the Web and blogging on the American polity was a broader take on the echo-chamber idea. Once upon a time, the reasoning went, we all shared a single national conversation. We read our newspapers in the morning, watched the three networks in the evening, and took our cues from them about the topic of the day. Now, alas, the Internet, having enabled each of us to assemble our Daily Me, is depriving us of any kind of “Daily We.”
Like all notions of a vanished golden age or a lost common culture, this vision was part fairy tale and part nostalgia. We are always going to pine for the media we loved when we were young. (You can bet that, decades from now, an entire generation will look back and yearn for the long-lost common culture of their youth that today’s social-networked Web provided.) For all its sentimentality, however, this notion has held a powerful grip on critics of the Web. They worry that the Web’s openness and the fragmentation of blogging set an impossible hurdle for any effort to unify the American story.
A representative expression of this argument can be found in an otherwise astute analysis of the media industry’s convulsions by Eric Alterman in a 2008 New Yorker piece. Alterman, a liberal journalist who began blogging in 2002, wrote that the culture of the Web and blogging was dragging newspapers into a partisan vortex: “The transformation of newspapers from enterprises devoted to objective reporting to a cluster of communities, each engaged in its own kind of ‘news’ — and each with its own set of ‘truths’ upon which to base debate and discussion — will mean the loss of a single national narrative and agreed-upon set of ‘facts’ by which to conduct our politics.”
This “loss of a single national narrative” sounds grievous indeed, until you realize that such unity, if it ever did exist, represented only a short interlude in U.S. history. Alterman admits that nineteenth-century America was “dominated by brazenly partisan newspapers.” The rise of broadcast journalism and the mid-twentieth-century consolidation of newspapers did concentrate a mass audience for news by forsaking partisanship for the neutral “view from nowhere.” But if we look at what is probably the high-water mark of such concentration and the putative heyday of the “single national narrative” — the 1960s — we do not find a halcyon era of comity; instead, the decade was marked by a level of partisan conflict and outright violence that makes the 2000s look like a civics class. If Walter Cronkite was in fact able to set the nation’s agenda, he couldn’t stop its political polarization. The tumultuous record of the twentieth century, in fact, shows no correlation at all between the concentration of media authority pushing a “single national narrative” and the flourishing of civil debate in the public sphere. The world has always been messier than it looks in sentimental hindsight.
@ @ @
For outsiders listening in as journalists marshaled all these arguments, it was hard not to sense some amount of special pleading. Journalists were hardly neutral parties to the controversies they were addressing; they could not take the “view from nowhere” when their own livelihoods and authority were at stake. To the extent that a “single national narrative” was disintegrating under the pressure of millions of disparate narratives teeming on the Web — and this was, plainly, an oversimplification — it was the journalists who’d lost influence, and everyone else who had gained.
At this point in the argument, whatever side you find yourself favoring, I invite you to pause, climb up a nearby hill, and take a longer gaze down on this rhetorical battlefield. If you have tried to follow the action over the last half-decade, you have had a couple of choices: you could just attend to the traditional media’s periodic check-ins on these unfolding controversies, reading the occasional op-ed piece or news-magazine summary. But you would barely scratch the surface of the issues. Or you could dive into the profusion of posts in which the debate has actually taken place. Pursue links into the thickets of new blogs dedicated to the topic, with names like Recovering Journalist, News After Newspapers, and Reflections of a Newsosaur, started by concerned journalists, laid-off journalists, aspiring journalists, and ex-journalists. Follow the broadsides by professors and industry analysts, distinguished editors and young-turk reporters, all scratching their heads trying to figure out how to salvage their vocation from the technological whirlwind.
If you care about the fate of journalism and its role in democracy and culture, this second choice turns out to be the only satisfying option. And when you realize that, you also realize that the debate is over: you have just resolved it. In this controversy, as in most others today, to ignore bloggers is to miss the entire event. Whatever the drawbacks and limitations of blogging, it serves, today, as our culture’s indispensable public square. Rather than one tidy “unifying narrative,” it provides a noisy arena, open to everyone, for the collective working out of old conflicts and new ideas. As the profession of journalism tries to rescue itself from the wreckage of print and rethink its digital future, this is where its most knowledgeable practitioners and most creative students are doing their hardest thinking.