Introduction: What’s New
On the morning of September 11, 2001, James Marino sat at his desk at 568 Broadway, looking out a tall window that revealed a panorama of the lower Manhattan skyline. He’d come to the office early to work on his side business — a website called Broadwaystars.com. The site collected tidbits of news and gossip about the New York theater scene and served them up blog-style, time-stamped, with new items at the top. At 8:49 a.m. he’d posted a passel of links: an AIDS benefit recap. Declining box office numbers from Variety. The impending release of a cast album for a show by Rent’s Jonathan Larson titled tick, tick . . . BOOM!
Marino clicked post and looked up from his monitor. He froze a moment, stared, then started typing again:
Something very terrible just happened at the World Trade Center. I think a plane crashed into the north-western tower. It is horrible and stunning to look at. — james
posted at 9/11/2001 08:56:32 AM
Like so many other New Yorkers that morning, Marino kept watching. He also kept typing:
Oh my god. I just saw the other building blow up. A second plane approaching from the opposite direction (south to north) crashed into the south-east tower.
posted at 9/11/2001 09:06:06 AM
My office is on the 10th floor of a Soho building with large southern facing windows, so I have a full view of downtown. From my vantage, I see a huge hole where the first plane entered the north west tower from north to south. The south east tower was hit from the south to north. It is being reported that a plane or planes were hijacked and flown into the towers with passengers in them.
posted at 9/11/2001 09:06:06 AM
Marino had worked for four years at Lehman Brothers in the World Trade Center complex; he’d been there in 1993 when a car bomb exploded in the basement garage.
one tower just collapsed (south east) i am weeping. . . . . i cant tell you how i feel
posted at 9/11/2001 10:02:52 AM
The second building just exploded and collaped. This is just beyond anything that I can even conceive. I feel so much anger and helplessness.
posted at 9/11/2001 10:33:07 AM
Eventually, Marino left his desk and made his way uptown to his brother’s West Side apartment, then finally home to Westchester. That evening he posted again:
I think I lost about 100 friends today. I can’t count them. I keep breaking down. I don’t think that I have ever been so sad and cried so much. I don’t know if I will ever be the same. But I am home now. Thank you to everyone for the notes.
posted at 9/11/2001 07:31:01 PM
Most New Yorkers have indelible memories of the minutes and hours after the planes hit that day. Those memories are recorded on countless Web pages, in emails and forum postings and uploaded photos. But much of that record made its way onto the Web after the fact, by hours or days or weeks. In the chaos on the ground, a lot of people were focused on getting to safety, or their eyes were glued to TV news, or their Internet connections were disrupted by the crushing network overload.
The Broadwaystars.com posts recorded the 9/11 events as they piled up, one fraught moment on top of the previous one. Though they broke from what Marino calls the site’s typical “gossipy, snarky” tone, they were not carefully composed or particularly eloquent. Instead, they offered a witness’s raw feed, unmediated by anchorperson or interviewer — a naked vantage on a cataclysm.
Marino wrote what he saw. And if you scroll down the page he wrote, you see that at 8:49 a.m. the world was one way, and at 8:56 a.m. it was another.
@ @ @
As the day’s disasters unfolded, most Americans got their news from TV, with its ceaseless loop of billowing smoke and cratering towers. TV broadcast the big picture, but it couldn’t tell you whether your friend who worked at the World Trade Center had made it out in time, or whether your sister or uncle had made it home safely. The telephone systems failed almost immediately. The websites of the big news operations quickly overloaded, too. But email still worked, and smaller websites managed to stay up. The Internet’s arteries clogged, but people found ways to communicate through its capillaries.
Marino’s 8:56 post was among the first to hit the Web that day; it might well have been the very first. (The first AP story went over the wire at 8:55 a.m., but most websites took a few minutes to get it up. At 8:58 a.m., Metafilter, a popular group blog, posted a link to a one-line news bulletin on the front page of the CNN website, but CNN did not yet have a full story posted.)
Later that morning, a gaming journalist and Web celebrity named Justin Hall — who’d been chronicling his life in an open Web diary since 1994 — posted:
at these times there’s two types of media valuable
friends from japan and london writing – are you okay? and the only way to reach people – small packet
and the TV
rebirth of the broadcast media. The only pipes big enough still to bring us the pictures and moving pictures of buildings collapsing.
In San Francisco, Evan Williams, who ran a service called Blogger that helped thousands of people — including Marino — run their weblogs, wanted to do something useful, so he got to work building a page that automatically collated postings about the attacks from Blogger users across the Web and listed them in chronological order. Marino’s post headed the list.
In Woodside, a rustic enclave of homes for Silicon Valley’s stock-option millionaires, a software developer named Dave Winer had started his day early, posting links and headlines to his blog, Scripting News. Winer didn’t have a TV set, but around 9:00 a.m. (6:00 a.m. on the West Coast) he received an email from a fellow blogger named Bill Seitz, alerting him to the news and suggesting he look at the Empire State Building webcam, which offered real-time snapshots from the venerable skyscraper’s observation deck. Seitz captured an image right after the collapse of the South Tower and posted it to his own blog at 10:17 a.m.; at 10:57 a.m., he posted a second image under the words “No more WTC.”
Winer’s father, Leon Winer, was a professor at Pace University, and it occurred to Winer that Pace was near the World Trade Center. When he couldn’t get through by phone, he posted a photo of his dad at the top of Scripting News and asked for help or info. Leon, it turned out, hadn’t made it farther than Grand Central that morning before the subway system shut down; then he’d had to walk miles across the 59th Street Bridge and through Queens before finally getting a bus home — all of which Winer reported to his readers.
That afternoon Winer got an email from New York Times reporter Amy Harmon, who was researching a story on how people used the Web during the disaster. She wanted to know about “the impulse to collect and share news and reaction and how the Net facilitates that.” At first Winer couldn’t get through to her by phone. Instead, he posted his answer to her question:
I see this as an opportunity to cover a big news story, it’s largely the same reason the NY Times home page is all about this. We want to figure out what happened, what it means, and where we go from here. The world changed today. It’s still very fresh.
@ @ @
In September 2001, conventional wisdom held that Web content was “dead.” Vast fortunes had been poured onto the Web during the late-1990s dotcom bubble; a lucky few had made a killing, but once the stock-market game of musical chairs ended, most media companies took heavy losses. In perhaps the single worst-timed deal in history, Time Warner had traded more than half its value for America Online at the very moment that the online stock balloon was pumped up with helium and about to pop.
Now the media barons were nursing their wounds, cursing the Internet, and concluding that it had all been a bad dream. While the Web wasn’t going to go away, they decided that it wasn’t going to change everything, either. The path was clear for them to return to a familiar world of mass publishing and broadcasting, “eyeball collection” and advertising sales. At the same time, the collapse of the Internet bubble had dragged lots of new ventures and ideas down with it. Williams’s company, Blogger, had seen steady growth in users, and yet, earlier that year, had been forced to lay off all its employees and persevere as a one-person operation. The small Web publishers that hadn’t fallen off a cliff were hanging on by their fingernails. That was how we found ourselves at Salon.com, the magazine site I’d helped start in 1995. Enthralled as I remained by the new medium, I couldn’t help thinking, Maybe it wasn’t so smart to give up that newspaper job after all.
As a business, publishing stuff on the Web had fallen on very hard times. Yet that wasn’t stopping people from publishing more stuff on the Web. The profit motive, apparently, wasn’t the only force at work here. And when the planes hit the towers, money was the last thing on most people’s minds. They first wanted to know what had happened to the people they knew and loved; later they wanted to understand what had happened and why.
According to the most thorough study of media consumption on and after 9/11, use of the Internet actually dropped for a while after the initial news spread. The crisis didn’t introduce the Internet to a vast new population or represent a “breakthrough moment” in which large numbers of people abandoned other media for online sources of information, the report concluded. Yet those who did turn to the Web sensed a breakthrough nonetheless: at that moment of crisis, many of us looked to the Web for a sense of connection and a dose of truth. The surrogate lamentations of the broadcast media’s talking heads sounded manufactured and inadequate; people felt the urgent necessity to express themselves and be heard as singular individuals. Those who posted felt the gravity of the moment and the certainty that stepping forward to record their thoughts had unquantifiable but unmistakable value.
“Only through the human stories of escape or loss have I really felt the disaster,” wrote Nick Denton, a journalist turned Internet entrepreneur, in the Guardian on September 20, 2001. “And some of the best eyewitness accounts and personal diaries of the aftermath have been published on weblogs. These stories, some laced with anecdotes of drunken binges and random flings, have a rude honesty that does not make its way through the mainstream media’s good-taste filter.”
David Weinberger, an author and Web consultant, wrote in his email newsletter: “When the Maine was sunk a hundred years ago, messages scattered over telegraph wires to feed the next edition of the newspaper. When the Arizona was sunk at Pearl Harbor, the radio announced the wreckage. When Kennedy was shot, television newscasters wept and we learned to sit on our couches while waiting for more bad news. Now, for the first time, the nation and the world could talk with itself, doing what humans do when the innocent suffer: cry, comfort, inform, and, most important, tell the story together.”
In his Guardian article, Denton wrote, “In weblogs, the Web has become a mature medium.” A column headline on CNET similarly declared that blogging had “come of age.” Such statements may be hard to believe today, years later, when blogs — frequently updated websites typically filled with links, news, or personal stories — are still widely reviled as a medium for immature ranting. In fact, within a year of writing those words, Denton himself went on to found Gawker Media, a network of rant-heavy gossip blogs powered by adolescent attitude. In retrospect, 9/11 hardly marked any sort of maturity for blogging. Instead, it marked the moment that the rest of the media woke up and noticed what the Web had birthed. Something strange and novel had landed on the doorstep: the latest monster baby from the Net. Newspapers and radio and cable news began to take note and tell people about it. That in turn sent more visitors to the bloggers’ sites, and inspired a whole new wave of bloggers to begin posting.
Many of these newcomers faced the post-9/11 world with militant anger and proudly dubbed themselves “warbloggers.” As they learned how to post and link, they felt they were exploring virgin territory, and it was, for them, as it has been for each successive new generation of blogging novices. But blogging had already been around for years. In a sense, it had been around since the birth of the Web itself.
@ @ @
From the instant people started publishing Web pages, they faced a problem unique to their new medium. How do you help readers understand what’s new? Newspapers and magazines are date-stamped bundles of information; each new edition supersedes the previous one. TV and radio offer their reports in real-time streams, a continuum of “now.” But Web pages are just files of data sitting on servers. They can be left untouched or changed at will at any time.
And that’s just what the print-world refugees who began to colonize the new medium proceeded to do in the Web’s infancy. When new material was ready for publication, it simply replaced the old. This, they quickly realized, meant they were throwing out yesterday’s news, with no archive or record of their first draft of history. So instead they started scattering the home pages of their sites with bright little “NEW” icons to direct visitors’ eyes to the latest updates. But “new” is a relative concept. At what point did “new” stop being new? And how could you separate what was new to someone who checked out your site once a day from what was new to the hourly visitor?
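The difficulty can be made concrete with a small sketch (the post titles, timestamps, and function name here are hypothetical, purely for illustration): once every item carries a timestamp, each reader can compute “what’s new” relative to their own last visit, something a static “NEW” icon pinned to the page can never do.

```python
from datetime import datetime

# Hypothetical posts, each stamped when it was published.
posts = [
    {"title": "AIDS benefit recap", "posted": datetime(2001, 9, 11, 8, 49)},
    {"title": "Box office numbers", "posted": datetime(2001, 9, 11, 8, 30)},
]

def whats_new(posts, last_visit):
    """Return only the posts published since this reader's last visit."""
    return [p for p in posts if p["posted"] > last_visit]

# A daily reader last visited yesterday; an hourly reader, minutes ago.
daily_reader = datetime(2001, 9, 10, 9, 0)
hourly_reader = datetime(2001, 9, 11, 8, 40)

print(len(whats_new(posts, daily_reader)))   # 2: both posts are new to them
print(len(whats_new(posts, hourly_reader)))  # 1: only the 8:49 item
```

The “NEW” icon, by contrast, encodes a single publisher-chosen cutoff for every visitor at once, which is exactly why it couldn’t serve both audiences.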
Telling people what’s new proved unexpectedly confounding in the early days of Web publishing. Yet a simple solution lay close to hand, embedded in the medium’s DNA by its creator.
Tim Berners-Lee — a British software engineer at CERN, the Geneva particle-physics lab — devised the World Wide Web and built the very first website at the address http://info.cern.ch in 1990. He dreamed that his invention, a method for linking pages, would knit together the primarily academic information then scattered across the larger Internet and foster intellectual collaboration. As he proselytized for his new creation, enthusiasts at other universities would crank up their own Web servers, begin publishing sites, and email Berners-Lee to tell him about the cool things they’d done. Each time the network gained a new node in this way, Berners-Lee would insert a listing for it — and a link to it — from a page on his info.cern site.
This “W3 Servers” page had a simple scheme: newcomers would be added to the top of the list. This proved convenient for the return visitor, who could immediately see the latest information without having to scroll down the page.
In choosing to organize the new links this way, Berners-Lee was borrowing from computer science the concept of the “stack.” Stacks are data structures in which each new addition is piled on the top, pushing down previous items; outputs get taken from the top, too. Stacks are therefore said to work on a “last in, first out” principle: the last thing that you add becomes the first thing you remove. Similarly, a stack of news and information is a page in which the last thing that the publisher added is the first thing that you see. Such a design was present at the Web’s creation, and would keep reasserting itself like a genetic trait, turning up at critical points in the Web’s development.
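The stack’s “last in, first out” behavior can be sketched in a few lines of Python (the class name and the listed items here are illustrative, not drawn from any real system):

```python
# A minimal sketch of the "news stack": each new item is pushed on top,
# and the page is read from the top down -- last in, first out.
class NewsStack:
    def __init__(self):
        self._items = []

    def post(self, item):
        # New entries are piled on top of the existing ones.
        self._items.append(item)

    def render(self):
        # Readers see the most recently added entry first.
        return list(reversed(self._items))

page = NewsStack()
page.post("W3 server at university A")
page.post("W3 server at university B")
page.post("W3 server at university C")

print(page.render())
# The last item posted is the first one a visitor sees:
# ['W3 server at university C', 'W3 server at university B',
#  'W3 server at university A']
```

Rendering the list in reverse of its insertion order is all it takes to give the return visitor the newest material without scrolling, which is precisely the convenience Berners-Lee’s “W3 Servers” page offered.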
Journalists instinctively resisted this approach. They were used to ordering information according to an editorial process, rather than handing the ordering over to a set of rules. “We’ll tell you what’s important,” they said, “not the clock.” Their preference was understandable. But the model of the news stack spread anyway. Its method of making sense of the new was native to the online world, not imported from print, and today we see it as a universal building block of the medium.
Only a handful of pioneering programmers ever made much use of Berners-Lee’s servers page. But the next proto-blog to come along reached a wider audience. In 1992, Marc Andreessen, a computer-science student at the National Center for Supercomputing Applications (NCSA) at the University of Illinois, discovered Berners-Lee’s invention. He fell in love with the Web, and decided that what it really needed was pictures. He and a handful of classmates quickly wrote Mosaic, a program that let the user browse Web pages containing both text and images. Berners-Lee’s text-only browser had been created primarily for Steve Jobs’s NeXT operating system; Andreessen and company offered Mosaic to everyday people who used Windows and Macintosh machines. They believed, Andreessen says today, that “everybody should be using the Web,” and they were impatient with Berners-Lee’s more academic approach. Their program proved an instant hit.
Once people downloaded Mosaic, they began hunting for diverting stuff to look at. Andreessen obliged. Beginning early in 1993 with Mosaic’s initial release, he personally assembled the NCSA What’s New page. “Any page anyone put up on any topic, we would highlight,” Andreessen recalls. Like Berners-Lee before him, Andreessen added new links at the top of the list, since that was the spot people would see first. When Mosaic turned into Netscape, the browser was tweaked to help users who had slow dial-up connections. The top of the page would display first, even before the rest had loaded, which meant that the new items would always be visible the fastest. Every Netscape browser also came with a built-in button that linked directly to What’s New. That made the page a welcoming gateway to the Web for its first generation of users — and a subliminal introduction to the simple utility of a reverse-chronological list.
As the Web’s popularity exploded, businesses started erecting “virtual storefronts” that attempted to mimic physical layouts, with gimmicky graphics for ticket counters and help desks and front porches. Publishers began constructing flashy front pages for their sites with elaborate “hotlinked” images. They were all caught up in the misconception that the key to helping people find their way across the Web was to give them pictures to click on — to offer users metaphors from the physical world.
But the reverse-chronological list never entirely vanished. As the 1990s advanced, it kept reasserting itself, becoming the basis for the form that today is familiar from a million blogs: a stream of link-laden posts with the latest on top. Over time, as blogging spread from a handful of Web designers and software developers to writers and political activists and then on to the general public, this form has grown flexible and capacious enough to encompass virtually anything that anyone might wish to express.
In this way, the rise of blogs has gone a long way toward making good on the promise of the Web’s first inventors: that their creation would welcome contributions from every corner of the globe and open a floodgate of human creativity. Berners-Lee’s first browser was a tool for reading and writing: “The initial WorldWideWeb program opened with an almost blank page, ready for the jottings of the user,” he wrote in his memoir. As the Web extended its reach first to offices and then to homes, across the United States and around the world, it became theoretically possible for millions of people to publish millions of thoughts for millions of other people to read. The Web implicitly invited people to say anything — and everything.
But would they? The primordial Web was largely a repository for research and cool science experiments. Berners-Lee’s and Andreessen’s lists both grew up inside academe and were heavy on technical reports. People first understood the Web as a treasury of static knowledge and a showcase for the occasional geeky stunt (like the Trojan Room coffeepot camera from the Cambridge University computer lab, or the “fishcam” that Netscape programmer Lou Montulli created in 1994). The Web browsers that followed Berners-Lee’s weren’t two-way; it was a lot harder to write the program code for a tool that could both read and write, and everyone was racing to release new browsers at the breakneck pace of “Netscape time.” The writing part could wait. That meant that the first Web experience for most newcomers was an act of media consumption: “browsing” or “surfing.” Basically, reading.
So it was never a sure bet that the Web would evolve beyond its scholarly roots into something more personal — a sort of conversational water cooler and confessional. The first publishers of Web pages outside of the academy were a motley, self-selected population of fringe types: software programmers, science-fiction fans, ‘zine publishers, and misfit journalists. They created a profusion of what the industry now calls “user-generated content”; observers of their efforts were often dismissive.
“By many measures,” Michael Hiltzik wrote in the Los Angeles Times in 1997, “the Web reached the vast-wasteland stage faster than any other communication medium in human history.” Michael Kinsley, the celebrated editor and columnist, famously told the Washington Post in 1995 that most of the Web was “crap.” He elaborated in 1997: “The first time you go on the Web you think, ‘Wow!’ and the third time you go on, you think, ‘There’s nothing to see here.’ My point was that if the Web was going to make it, it was going to have to pass higher standards.” Indeed, the initial flowering of forum postings and personal home pages was crude, sometimes cheesy. Those visitors from the mainstream press who parachuted in to see what all the fuss was about took a quick look and decided it was one big amateur hour. Once the pros arrived and showed the world how it was done, surely all that crap would shrivel and vanish. Their misunderstanding set the stage for an angry debate about quality and accuracy and literary standards and business models that is alive today — and that has always been a little beside the point.
In those days, media companies venturing forth on the Web would declare their faith in “interactivity,” by which they usually meant “cool buttons you can play with on the screen.” And people who were actually using the Internet would say no, sorry, “interactivity” means being able to send an email or post your thoughts to a discussion. It’s just a clumsy word for communication. That communication — each reader’s ability to be a writer as well — was not some bell or whistle. It was the whole point of the Web, the defining trait of the new medium — like motion in movies, or sound in radio, or narrow columns of text in newspapers.
Only when blogs took off did this insight begin to spread beyond the most idealistic quarters of the Net. At the start of the first decade of the new century, easier tools for creating blogs and free services for hosting them removed many of the technical and economic barriers, and masses of newcomers swarmed in: college kids and retirees, diarists and polemicists, political junkies and sports fans, music lovers and investors, preachers and professors, knitters and gun nuts, doctors and librarians. The one trait they all shared was the passion behind their postings. They were indeed amateurs, in the original sense of the term: they wrote out of love. That love led them toward new rewards and into new perils.
As I write this at the end of that decade, the Web world is transfixed by a new generation of social-networking software. Facebook, MySpace, and their cohorts have taken some of the energy behind blogging and enhanced it with new tools for sharing and communicating among friends. People are flocking to these services, and as they do, they face unfamiliar quandaries: How much of your life should you expose to the Web? What line should you draw between job and home, colleagues and friends? At what point does sharing edge into Too Much Information? Is there any healthy way to keep up with the snowballing volume of Web stuff?
The experiences of the early bloggers prefigured every one of the dilemmas that we are encountering today, as we move more and more of our lives online and as new technologies — broadband to the home, wireless, digital cameras, cell phones and other handheld devices, YouTube video — keep extending the Web’s reach. Blogging, the first form of social media to be widely adopted beyond the world of technology enthusiasts, gave us a template for all the other forms that would follow. Through its lens, people could see the Web for the first time as something they were collectively building. It gave a multitude of formerly private people a public voice. It handed them a blank page and said: Learn. Learn what happens when you take strangers into your confidence. Learn what happens when you betray a confidence. See how it works when you find a lover online. See what happens when you tell the world that you’ve broken up. Find a job with your blog. Get fired from your job because of something you wrote on your blog. Expose media misinformation. Spread some misinformation of your own.
Then take everything that you’ve learned and post it.