I’ve been conducting some research on the future of new media properties like Gawker, Demand Media, Federated Media, the Observer, AOL, and HuffPo. One thing I’ve noticed is how self-critical the industry is. Many, it seems, are aware of how difficult things are right now, but there is general apathy about improving the situation.
Here’s what I mean. Below is a screenshot from Techmeme, an aggregator of Tech news.
See all of those links to other sources for the OMGPOP deal listed in the “More” section? Every article linked to is basically the same piece of writing– same length, same quotes, and generally the same perspective. Marco Arment, a tech writer and the creator of Instapaper, sums it up well:
There’s a continuum between 100% original reporting and zero value being added to the source content, but I don’t think I’m being unnecessarily inflammatory by labeling the posts on the far end of the continuum as rewriting.
I could click any of the sources for the Techmeme article above and get an identical experience– a web of rewriting, paraphrasing, and regurgitation of stock copy. Recently, MG Siegler left Techcrunch and reflected upon the writing process popular there (and elsewhere online) in an awesome rant:
Most are stories written with little or no research done. They’re written as quickly as possible. The faster the better. Most are just rehashing information that spread by some other means. But that’s great, it means stories can be written without any burden beyond the writer having to read a little bit and type words fast. Many are written without the writer even having to think.
There will be 25 stories about Google TV or something else tomorrow which will all say basically the same thing. Maybe one or two of those stories will have actual insight or information. Maybe none will. If any do, it’s the exception, not the rule.
This seems to validate the worst fears of the publishing industry– the total commoditization of content. While traditional publishers hope to avoid the “quality doesn’t matter” debate, the new media companies have decided to screw any notion of quality. What’s shocking, to me, is the deprecation of brand. Old-world companies would stand by each article as the voice of the brand. Today, every tech site just seems to blend together. There seem to be few material differences among AllThingsD, Mashable, Gizmodo, and their ilk in terms of editorial quality. Success, it seems, comes from their ability to be more link-baity than one another– to be THE article people link to on whatever mundane topic. AllThingsD doesn’t have an “audience” in the traditional sense. They don’t have “readers.” They have people who happen to click AllThingsD links. And this seems to be increasingly true across the industry. Case in point: the meteoric rise of The Verge. In a world where tech news had strong editorial, I couldn’t imagine a site launched five months ago reaching 300K uniques.
As if the quality problem weren’t enough, Arianna Huffington had a recent rant about how story topics are decided:
Going viral has gone viral. Social media have become the obsession of the media. It’s all about social now: What are the latest social tools? How can a company increase its social reach? Are reporters devoting enough time to social? Less discussed — or not at all — is the value of the thing going viral. Doesn’t matter — as long as it’s social.
The rest of the post discusses how Huffpo and its counterparts have become obsessed with writing about NOW, chasing the latest “story.” For Arianna, the problem is that too much content is planned around what’s trending or will trend. What’s missing from her rant is the reason: page views. If you’re in the business of maximizing page views, there’s a simple recipe: talk about what’s talked about or try to be talked about.
Finally, there is a criticism to be made about publishing focused on quantity. The top-of-the-page post on Techcrunch as I write this is “Apple stock up 50% this year,” posted a mere 21 minutes ago at a length of 320 words. Long ago I unsubscribed from the Techcrunch RSS feed, since 99% of the 100+ articles they published daily were mostly trash– the tech-industry equivalent of celebrity gossip, driven by a desire to continuously publish at volume. MG Siegler breaks it down:
Because the emphasis is on speed, even if a writer does know a lot about a company/topic, that takes a backseat. Writing a bland story with a few facts in 5 minutes is valued much higher than writing a good story in an hour. And that’s valued much higher than writing a great story over the period of a few days or god-forbid, weeks.
Quantity makes a lot of sense if you’re chasing page views, because author-time and page views are certainly not correlated. The highest-quality articles are just as likely to perform poorly as a benign update on some Apple news.
So what does this formula look like when applied to a traditional publisher? The New York Observer, a once venerated newspaper, transformed from a dying, old-world publication into a prototypical web content portal. They hired Elizabeth Spiers, founding editor of Gawker, who trashed whatever integrity remained in the old Observer brand. Felix Salmon has a great article about how it went down:
The Observer is now, first and foremost, Observer.com. (It’s a hugely valuable domain name, which, by some freakish accident of history, wound up getting snaffled by a dilettantish New York weekly before it could be claimed by the venerable newspaper in England.) There’s a slew of verticals, running the gamut of New York interests — Wall Street, media, art, real estate — as well as a bold attempt to break into the tech blogosphere with BetaBeat. Page design is sophisticated and effective, with all sites linking generously to all other sites, with the emphasis on dynamic headlines rather than bland navbars.
The Observer’s inimitable voice is gone, replaced by a barrage of bloggish posts by a group of writers so young that many of them can’t even remember a time before Gawker. (Which was birthed, by Spiers, in 2003.) The old Observer was edited, on a story-by-story basis, in a way that the new online Observer isn’t — Spiers doesn’t have either the time or the money to have a layer of experienced journalists reworking her bloggers’ prose before it’s published.
Go check out observer.com. Does it remind you of the old newspaper? Does it matter? That’s right, you were distracted by that listicle about the top 25 startups in NYC on Betabeat, the “low-down on high-tech” (I thought that was Silicon Valley Insider?).
What’s crazy to me is that this formula is successful. In a world where traditional publishers can barely scrape by, feeding a page-view business model means catering to the lowest common denominator and publishing toddler-level content.
Counter-argument: who cares?! While Felix hates what the Observer has become, he admits that the loss of integrity costs it very little:
And so, in the proud tradition of good blogs everywhere, readers are left with a highly variable product. The great is rare; the dull quite common. But — and this is the genius of the online format — that doesn’t matter, not any more, and certainly not half as much as it used to. When you’re working online, more is more. If you have the cojones to throw up everything, more or less regardless of quality, you’ll be rewarded for it — even the bad posts get some traffic, and it’s impossible ex ante to know which posts are going to end up getting massive pageviews. The less you worry about quality control at the low end, the more opportunities you get to print stories which will be shared or searched for or just hit some kind of nerve.
I really wanted to bold this entire paragraph since it epitomizes the supposed “future” of monetized media. It is shockingly depressing. People want junk food, regardless of how bad it is for them. Newspapers bundled everything together, so it wasn’t exactly clear that only 2% of readers cared about the stuff with any notion of journalistic integrity. The other 98% were just reading the sex columns and human-interest stories. Now that everything is unbundled, we’ve confirmed what we already knew: people gravitate toward the sensational.
Evan Williams, in a recent interview on News.me, talks about what’s wrong with this:
The web is completely oriented around new-thing-on-top. Our brains are also wired to get a rush from novelty. But most “news” we read really doesn’t matter.
Right. Everyone reads TMZ and feels bad about it afterward. But here’s the next sentence from Evan:
…a much smaller percentage of the information I actually care about or would find useful was produced in the last few hours than my reading patterns reflect.
The golden nugget here is that we only eat junk food because it’s what’s available. In other words, the reason I’m reading an inane article about OMGPOP’s acquisition rather than War and Peace is availability, not preference.
My hypothesis is that these changes have a lot to do with how our web consumption patterns have evolved over the past few years. The days of spending a few hours catching up on the news are long gone. When’s the last time you visited a media site’s homepage and browsed the category (gasp!) sections? What was once the dominant media consumption behavior has fragmented into discovery from endless channels: a mix of sharing, algorithms (Google News, Techmeme, Buzzfeed), aggregators, and endless apps trying to feed content in clever ways (I can’t even begin to rationalize the impact of Tumblr, a weird mix of sharing, aggregation, and feeds).
In short, I think that changing media consumption habits are the source of change, rather than stupid, autonomous decisions on the part of new media sites. More on that later.