So today is supposedly the day of reckoning for thousands of digital journalists who work for Patch, the horribly named AOL entry in the corporate arms race to "monetize the hyperlocal space." Big layoffs (nobody knows yet exactly how big), plummeting morale, disintegrating trust and blamethrowing between local outfits and corporate towers.
It's the same story, over and over and over. TBD in Washington. Backfence in ... well, lots of places. There are other examples, but who really cares? If you're reading the coverage of this extremely predictable failure, you'd think that the culprit is "the model."
In the case of Patch, media analyst Ken Doctor of Outsell thinks part of the problem is that the network grew too large too quickly. The downsizing, he says, "is basically an acknowledgment of failure, but in part a noble failure," adding that in 2011-12 Patch hired more journalists than anyone else in the country.
As for hyperlocal generally, Doctor says, "The economic model still doesn't work after 15 years."
Well, allow me to retort! Doesn't work for whom?
Because here's the thing, Ken (and it's not just Doctor I'm speaking to -- he's simply saying what all the media executives say). You can make that claim all day, but it doesn't explain the fact that you can still go to Baristanet.com and read more than you ever wanted to know about Montclair, N.J. They've been doing hyperlocal journalism since 2004. Still going strong. And it's not alone.
So let me Mansplain this to all you corporate media experts out there. Hyperlocal doesn't work for you. And it doesn't work for you because you're not out in the field trying to satisfy people's desire for information about their communities. The Tim Armstrongs of the world don't give a flying fuck about your community. They're chasing the hyperlocal fairy through the forest because they know that local businesses need local media in which to advertise to local customers. It's a huge market -- if you could simply organize it, aggregate it, commoditize it, and drag it back to those plush old-media economies of scale. Who wouldn't want to control that market?
Well, I wouldn't.
After years of splashing around the edges of the hyperlocal pond, I finally dove in this year and built CHSSoccer.net, a hyperlocal web site devoted to soccer in and around Charleston, SC. And while it's not like I didn't think long and hard about the value of that market before committing to it, in the end I concluded that the way to understand its value was to take a risk and get to know it first. As a reporter, the way I get to know things is by covering them. Intensely. Personally. Maybe even obsessively.
So CHSSoccer.net isn't based on the "most efficient unit-cost model," but the Seth Godin Otaku model. To wit: Making average products for average people (think standard "news judgment," or what I've come to call "the Great Averaging Machine") is the 21st century recipe for failure. To succeed in this new world, we have to make things that are remarkable, and one way to do that is to design for people who have a profound passion for that thing -- whether it's ramen noodles in Tokyo or well-made camp tools or an awesome minor-league soccer team that gets only carefully averaged attention from mainstream local media.
Of course, Godin's ideas on this subject -- like Chris Anderson's Long Tail and Dave Winer's visionary but half-formed dream of a new kind of DIY media culture -- have all, in a sense, failed. Because while Godin's now-dated books and blogs and TED talks are brilliant, they haven't really been predictive. American companies continue to chug along making average products for average people. Investors have largely given up on the Long Tail and returned to the herd-safety at the head of the curve. And all the potential media outcomes I outlined in 2009 have largely succumbed to the lowest common-denominator, with metro newspapers "lurch(ing) along zombie-style as scaled-back copies of the original."
We had chaos and hope between 2004 and 2009. Everything since has been a bloodbath. Kinda reminds me of Tiananmen Square in June 1989, except instead of Chinese security forces we got paywalls, reality TV, DRM, auto-tuned pop ... and Patch.
So yes, Patch hired journalists. But Patch was always a sweatshop, designed from the top down to prove the validity of the corporate desire to view "content" as simply a cost factor. It was hometown news as a beta test for a corporate-scaled platform of clone sites. And you simply can't run an enterprise at that scale, with that reliance on low overhead, without uniform policies, uniform productivity, uniform... everything.
How the fuck is that local?
So let me tell you what local is (and frankly, if you disagree, I don't care -- start your own local site and prove me wrong). Local is relationships. Local is trust. Local is caring about the same things your readers care about, and helping local business people make money.
Local is about speaking in your own voice, all the time, even when that's hard. Local is about feeling supported... and intensely vulnerable. You're local when someone writes you an email that criticizes something about your coverage (I got one of these emails yesterday) and you answer that person honestly, in a way that's true to yourself and your values. You're local when people recognize you as a personal resource.
Local isn't about being universally popular. But if you're accountable and serious and reliable, and you deliver a product that people want and respect, you have a shot at becoming a viable business. Oh, and it also helps if you're not a dick.
Because in the end, if local isn't human-scaled, it isn't actually local.
I knew when I started CHSSoccer.net that I was going to face unexpected problems, and I was right. I knew I was going to make mistakes, and I did. Every day is an iteration. Every idea I had at the beginning is provisional.
And the only money I've made off it so far has been by accident. I got a sponsor in April -- long before I felt ready to "sell" the site. I'm still not convinced it's ready. Is the audience large enough? Can I deliver real value to a mom-and-pop sponsor? And so on. Yet in one trip to a local bar to cover the Battery's most recent away match, two business people approached me to ask about sponsoring the site somehow.
I don't expect to make money off CHSSoccer.net this year, and I know that building a product without even proposing a firm business model for it would never fly in most boardrooms. But this is the way people used to do work, dammit. You built a good thing, and then you sold it. If the effort and the reward were commensurate, you kept on doing it. You learned from your customers, adapted, and rose according to the value you provided.
I can do this because I don't have to sit at a conference table while a bunch of self-interested pricks play clever political fuck-games with my ideas. I can do this because I earn my living as a freelance writer/editor/consultant/geek/mobile-bicycle-repair-service, and work on the site in my spare time. I can do this because I'll be happy with the outcome if CHSSoccer.net becomes a nice part-time job that cuts regular little checks to a few talented local writers and photographers, while improving soccer in my community. I'm a fan.
So when "experts" tell you that hyperlocal doesn't work, please understand what they're really saying. Hyperlocal doesn't work as a massive platform to extract the maximum amount of cash for the minimum amount of overhead from every identifiable community in America. Local can't be commoditized. Local can't be used as lead shielding to hide corporate greed and bullshit from customers.
If some tech-centric entrepreneur out there really wants to make money in "the hyperlocal space," I've got a suggestion for them: Build a shared, scalable infrastructure that start-up sites like mine can use to make the business side of hyperlocal journalism as painless and easy as WordPress makes the publishing. Add value instead of extracting it.
And if you treat me like a human being, I'd be proud to be your first customer.
Editor's note: I sold this 4,300-word feature story to The Charleston City Paper in June 2012 after spending a week as a volunteer at the annual Allendale Paleoindian Expedition. I'd kinda forgotten about it with all the other things I've gotten into over the past year, but when a new reader discovered it recently and goosed me about it, I re-read it and thought, "Here's something I wrote that I don't hate." And since I've learned my lesson about news websites suddenly purging all evidence of my best work, I'm republishing it here on my own site now. With the proper photos for a change.
There's a sad postscript to this story, too: It's entirely possible that the May 2013 expedition might wind up being the last. Tom Pertierra, the driving force behind the logistics, funding and concept of the expedition for more than a decade, is recovering from a health crisis that occurred earlier this year. Without him, the other organizers had no choice but to cancel this year's dig. The property is rumored to be for sale. Chief investigator Al Goodyear is nearing retirement. And long before Tom took ill, controversial discoveries made in 2012 were already fracturing the professional cooperation and trust that made the Topper-Allendale dig successful for so long.
If there is ever additional investigation at Topper, its shape, focus and personnel could be radically different.
I fell in love with the Topper story more than a decade ago, back when I was the desk-bound city editor of The (Charleston) Post and Courier sending reporters and photographers down for articles. In 2004, writing about Goodyear's findings helped me win South Carolina Journalist of the Year. Being able to cover it intensely last year, as I'd always wanted to do, is probably the highlight of my freelance career.
But the best part of that long experience was really just the people I met and worked beside last May. I feel their loss -- even their sense of familial exile -- and I wish them the best. -- Dan Conover, May 6, 2013
By DAN CONOVER
MARTIN, S.C. – At a depth of about four feet, 13,000-year-old artifacts emerge from the floor of a hole known as HS-N207E66 in such dense profusion that they leave the volunteers little room to work.
Like others who've dug here since the 1980s, the crew assigned to HS-N207E66 has reached the Clovis layer at the Topper-Allendale archaeological site. Excavations there tell different versions of the same story: Near the end of the last ice age, America's first great artisans came to this hillside to quarry a prized stone tool-making material called chert. The artifacts suggest the intense period of Clovis-era activity begins a few centuries before 11,000 B.C. and fades away roughly 500 years later.
Topper remains one of the most productive Clovis sites in North America, yet Clovis artifacts aren't the reason this place became famous. Just down the hill, below the chert outcropping that attracted the ancients, lies a deeper hole. And in 1988, archaeologists found something impossible there.
At the time, just about every respected archaeologist in the world knew that the mammoth-hunting Clovis people were the first human beings to reach the New World. They knew that the Clovis walked here from Siberia across a land bridge. And they knew there couldn't have been humans here before the Clovis arrived because the Americas were cut off from Europe by oceans before the Ice Age.
But then Dr. Al Goodyear III dug deeper and found simple stone tools buried in sediment far beneath the bottom of Topper's Clovis layer, and this inconvenient evidence pushed him into the midst of an already boisterous scientific revolution. The pre-Clovis rebels eventually prevailed, but for a generation of archaeologists, the price of that victory was paid in friendships, reputations and careers.
So it's only now — after millennia of silence, decades of stereotypes, and years of nasty professional infighting — that we're beginning to re-evaluate these people, the Clovis people. If they weren't the Siberian brutes we thought they were, then who were they?
And perhaps more importantly, was Clovis even "a people," or just the first great American idea?
Made In The South
Clovis culture is the most ancient and famous of the specific "cultural toolkits" archaeologists have identified in the American pre-historical past, and once upon a time, scientists thought they understood the Clovis story pretty well. Yet the more data modern researchers collect, the more mysterious the Clovis people become.
Unlike the more regional prehistoric cultures that followed, Clovis culture literally left its artifacts all over the map. Numerous finds over the past 85 years show that these wildly successful hunter-gatherers ranged widely across North and Central America and probed down into South America, too. But despite leaving so much evidence across so much dangerous wilderness real estate, the Clovis era was surprisingly short. The dates fluctuate as modern techniques for measuring the age of artifacts improve, but good estimates now suggest that Clovis culture probably emerged roughly 13,300 years ago, give or take a century or more, and faded out around the time of a sudden climate change that began about 12,800 years ago.
The late 20th century fight over Clovis-first orthodoxy split American archaeology from the mid-1970s to the late 2000s. The Old Guard, with their careers based on Clovis-first scholarship, challenged every new find down to its tiniest details. The rebels countered with more finds and took beating after beating before finally winning over a majority of their peers around 2008.
Because of that victory, there's an argument to be made today that the first great American invention was the Clovis Point, an oversized, symmetrical, double-edged spear tip with a particularly deft and risky design feature. In light of recent developments, it looks increasingly likely that this distinctive weapon — and the culture it defines — originated here in the Southeastern United States. Which would make the Clovis Point not only the first great American invention, but the South's first cultural export, pre-dating jazz and Elvis by at least 13,200 years.
"The technology is most likely home-grown," says expedition operations manager Tom Pertierra of Greenville, Fla., "because there are lanceolate points in North America that are older than Clovis and they demonstrate similar thinning technology. And the farther west you go in dating Clovis, the younger they are, not older."
In other words, Clovis didn't start in Siberia and migrate to the Carolinas. It started somewhere in South Carolina, Georgia, Florida, or Alabama, and after covering the Deep South, it moved west and north.
Connect The Dots
The idea that Clovis originated in the Southeast is only now entering the public discussion mainly because of the limits of human science. Archaeologists most likely didn't recognize the evidence because they were rational human beings who logically assumed they were looking for another answer.
To understand their dilemma, consider that the practice of archaeology is a bit like asking someone who has never driven a car to draw a picture of a 1975 AMC Pacer based on nothing more than a pitted chrome door handle and a lug nut. Whenever the past challenges us to connect its dots, much of what we see tends to resemble much of what we already assume, and so it has gone with the people we call the Clovis.
Named after the town in New Mexico where their artifacts were first identified in the 1930s, the Clovis people rose to unassailed status as the First Americans in less than three decades. Along the way, the story of their pioneering ice-age migration became textbook dogma.
Since mainstream science in the mid-20th century assumed that reaching the New World by boat required nothing less than 15th-century sailing ships, migration across the ice-age land bridge seemed a logical assumption. With that idea in place, it was only reasonable that migration patterns in the Americas had to run from north to south and from west to east.
Other questionable assumptions followed. Because Clovis Points have been found embedded in butchered mammoth bones, writers extrapolated that the Clovis were macho super-hunters with an obsessive taste for big, dangerous animals. They must have been restless and aggressive, a male-dominated, bloodthirsty blitzkrieg that conquered a violent wilderness.
If they had a tragic flaw, it was excess awesomeness. The Clovis were so badass, the textbooks said, that they hounded the great North American megafauna to extinction in just a few gluttonous generations. Finally, with their conquest complete, the mighty Clovis retired, ceding the benefits of their hard-won glories to their less-ambitious descendants.
Most of the facts that shaped those ideas were derived by good science, but in retrospect, our mid-20th century theories about the Clovis people sound a bit too much like some middle-aged white guy at a VFW hall telling his life story to a bunch of ungrateful beatniks in 1966. When we were kids, we had to ambush a mastodon if we wanted breakfast, and sometimes it killed us. And we liked it.
But while that story aged poorly, it remained archaeology's default explanation for the peopling of the Americas until the late 2000s, when the increasing weight of three decades of evidence finally flipped Clovis-first orthodoxy on its collective head.
Today most scientists agree that there were people all over the Americas thousands of years before the Clovis appeared. Since that means at least some of those early arrivals probably got here by boat, our estimation of our Paleolithic ancestors is in the midst of an upgrade.
Yet with so much emphasis on who got here first, the Clovis themselves somehow faded into the background. It's only now, in the context of a more open-ended theory of our deep past, that the Clovis are re-emerging to renewed appreciation as one of the most fascinating, distinct, and innovative groups in American pre-history.
"You can tell by the skill levels back in the Clovis times, they were very skilled artisans in the points that they made," says Bill Lyles, a flint-knapper and retired pharmacist who has become one of the Topper site's top supervisors. "Clovis only lasted a few hundred years, and that skill level, as time passed, you begin to see it drop off. And it wasn't picked back up."
Instead of hyperactive Siberian nomads who over-hunted their way to cultural failure, what we call Clovis now looks more like a highly mobile group that managed to adapt to whatever the American landscape provided, or perhaps a successful idea that spread across existing trade networks. They almost certainly didn't exterminate the mammoths, and today it's considered far more likely that a 1,500-year cold snap called the Younger Dryas somehow led to the extinction of the great mammals and the demise of the Clovis era.
And finally, there's another theory — not mainstream, but promoted by one of the top researchers at the Smithsonian Institution — that contends that the emergence of Clovis in the Southeast offers evidence of a mind-boggling journey.
Even against a complex backdrop that looks more interesting than ever, Clovis culture stands out.
Past That Wasn't
As an academic field, pre-Columbian history marches across multiple culture-war battlefields these days. New evidence and old documents suggest that the Americas probably were far more populated in 1492 than history has properly recorded, with cities in Central and South America exceeding those of the Old World in terms of population, health, and grandeur. Even pre-Columbian North America — long viewed as an unspoiled Eden — looks more like a productively managed garden.
In this new rewrite of the American story, the rapid European settlement of the "New World" between the 15th and 18th centuries succeeded because it caused the most horrific series of plagues ever inflicted upon humanity. Serious scientists have proposed that waves of disease may have reduced the indigenous population by as much as 95 percent between 1492 and 1640.
But as controversial as the debates over those ideas have been, they're practically etiquette lessons compared to the hostility faced by the archaeologists who first presented evidence of older-than-Clovis habitation here. Archaeologists used to speak in terms of a "Clovis Horizon," the idea that once a professional reached the bottom of the Clovis layer, there was no reason to dig deeper. Those who crossed that boundary were viewed not only as fools, but as heretics.
Al Goodyear, an associate research professor at the S.C. Institute of Archaeology and Anthropology at the University of South Carolina and the Topper site's primary investigator, crossed the boundary reluctantly. His Clovis finds at Topper had positioned the stout, affable Goodyear in the comfortable mainstream of Clovis-first American archaeology.
But that began to change in 1988 when — just to make sure — Goodyear decided to dig a test trench below Topper's 13,500-year-old Clovis Horizon. He says he didn't expect to find anything, and today speaks of the decision almost wistfully. "There were people who told me, 'If you ever find something beneath your Clovis layer, cover it back up and never speak of it again,'" he says. Then he shrugs.
What Goodyear found in that trench enlisted him in the uncomfortably small ranks of credentialed American archaeologists investigating older-than-Clovis evidence. When read from the bottom up, the layers suggest that unknown bands of ancient people who relied on simple micro-blade tools had quarried chert at Topper for thousands of years before disappearing around 16,000 years ago. The site's artifact record falls silent after that, perhaps for as long as 3,000 years, before revealing a burst of activity during the Clovis era. The area has been inhabited more-or-less continuously ever since.
Despite such then-controversial finds, the site attracted little public attention until 2004, when a rare chance to carbon-date Topper's very earliest artifacts determined they were older than carbon could reliably measure, making them anywhere from 40,000 to 70,000 years old. Suddenly Goodyear had the oldest archaeological site in North America, and his face was all over the cable news channels. He'd never been more famous, or looked more professionally vulnerable.
Yet the drumbeat of new finds and old dates continued, and as the months passed, Goodyear's radiocarbon date — though still ridiculously old — started looking less impossible. Clovis-first orthodoxy began to lose its unquestioned dominance at academic conferences and among the editors of peer-reviewed journals, and when the major figures in the debate arrived in Austin, Texas, for a 2008 Paleoamerican workshop organized by Tom Pertierra, the status quo finally shifted.
"Were there fistfights? No," Pertierra says. "Was there spirited discussion? Yes. But you know what we came out of there with? About an 80-20 turn [in favor of the older-than-Clovis evidence]."
Less than five years later, the idea of a Clovis Horizon looks quaint. New evidence for older-than-Clovis occupations has cropped up from Virginia to Texas, in caves in Oregon, and in mud dredged off the ocean floor near the mouth of the Chesapeake Bay. New data generated by scholars in related fields ranging from genetics to linguistics point to not one but multiple waves of pre-Columbian settlement, with several starting points and much earlier origins.
But it wasn't until 2012 that an archaeologist crossed the last boundary of mainstream acceptance. Eight years ago, Goodyear confided that he would never advise a graduate student to risk his or her career on a doctoral thesis about a pre-Clovis topic. Today, Eastern New Mexico University Ph.D. candidate Doug Sain — an eight-year supervisor at Topper who met his wife at the site — is writing his dissertation on the site's older-than-Clovis evidence.
Doug and Kristina Sain are part of the first generation of young archaeologists to enter professional life in the wake of the pre-Clovis revolution, and the new crop is well represented in the evening chow line at base camp. Joe Gingerich is a visiting researcher from the Smithsonian Institution. Chris Moore has been doing advanced geoarchaeology all over the Carolinas. Multiple graduate students come and go.
But the most easily recognizable of the young academics around camp is 32-year-old Derek T. Anderson, an archaeologist at the Cobb Institute of Archaeology and a lecturer at Mississippi State. A Topper supervisor since 2008, Anderson has carved out a reputation around camp as the quietly competent, understated authority the crew seeks out whenever tricky questions arise.
"I think that we have a better idea of what Clovis technology represents than people 25 years ago did," Anderson says when asked about the new multidisciplinary insights in American prehistory. "But there is still a lot that we don't know, particularly in the Southeast."
Though Goodyear retains the title of primary investigator, he functions more as a high-level supervisor during the annual spring dig and has worked in Columbia during much of the 2012 expedition. Pertierra runs the expedition's daily operations now, while Anderson, Sain, and the other young academics handle routine scientific duties.
Goodyear's generation broke through the Clovis Horizon, but as he and his colleagues near retirement, it's up to the next generation to make sense of what's been found there.
Coming and Going
One of the long-standing complaints about Goodyear's older-than-Clovis chert artifacts is that many of them just don't look particularly convincing. Though he's recovered pre-Clovis objects that are undeniably tools, skeptics continue to challenge many of the potential artifacts collected at the site as "geofacts," objects shaped by natural processes. Goodyear believes that his simple "micro-blade" artifacts represent a local older-than-Clovis toolkit, though he acknowledges that they don't look like much.
But there was no mistaking the ancient bifacial projectile point discovered in Pennsylvania's Meadowcroft rock shelter in the mid-1970s, or the fact that it was buried deep beneath the Clovis Horizon. Archaeologists dubbed the new find the Miller Point.
The total number of confirmed Miller Points has yet to reach double digits, but the significance of "Miller-like points" is profound. The older tools at Topper don't resemble anything that followed them, but a Miller Point looks like someone working out a rough prototype for the Clovis Point. Suddenly everybody wants one, and there are rumors of archaeologists finding Miller-like points at digs across the South.
So when Pertierra welcomes the volunteer crew on the Monday morning that begins the Week 2 session, he lays it out. If anyone digs up a Miller Point in full archaeological context (or in scientific jargon, "in situ"), he'll make it worth their while.
Like how? someone asks.
"I don't know," Pertierra replies, grinning. "Something good. Maybe I'll buy you a car."
Maybe that's because Pertierra is the rare avocational archaeologist with the bankroll to honor that promise. He discovered a passion for archaeology in the sinkholes of Florida as a teenaged scuba diver, but pursued a career in business instead. When his entrepreneurial success provided him the freedom to return to archaeology 20 years later, Pertierra devoted himself to rethinking expedition logistics and improving the relationships between professionals and amateurs.
The result was a private company called SEPAS DLS: Southeastern Paleoamerican Survey Direct Logistical Support. When you set foot on the Topper site, the land belongs to the Clariant Corporation and any artifacts belong to the state. But every tool belongs to SEPAS, and SEPAS is Tom Pertierra. The only cost to the archaeologists for Pertierra's SEPAS support is cooperation with the volunteers, the people who not only provide the free labor at Topper, but who donate $488 a week to sleep in tents and work in claustrophobic holes.
The unpaid volunteers among the 60 or so people who pass through base camp during Week 2 fit a profile that roughly reflects Pertierra's personal journey: plenty of young interns, lots of active retirees, but only a few working-age adults. The retirees call it summer camp for grown-ups, and most not only pay to work multiple week-long sessions but return each spring. "It's just like coming back to a family reunion every year," says flint-knapper Bill Lyles.
Scientists believe the Clovis people probably came here too. Mobile hunter-gatherers would have returned regularly, not only to quarry stone for future tool making, but also to manufacture tools on-site during seasonal encampments. Digs here reveal lots of tool-making debris and mis-hit failures, but even these are valuable. Archaeologists who study concentrations of flakes and broken tools have determined not only where an individual toolmaker liked to sit, but also whether the person was left- or right-handed. That's what precisely documented artifacts preserved in context offer science: evidence in context tells a story, but an artifact alone is silent. Which is why people suspect Pertierra would gladly trade a car for a Miller Point found in situ.
Still, the big driver behind Miller-mania in 2012 is the underwater dredge team's discoveries on nearby Smith Lake Creek. Almost every evening the divers return to camp with spectacular Clovis artifacts and tantalizing Miller-like finds.
But the biggest moment for the two-week dredging expedition comes on its final day of operations, when S.C. Institute of Archaeology and Anthropology marine archaeologist Ashley Deming instinctively plucks a small, green-tinted object out of the dark stream of sediment pouring from the dredge hose.
Normally, the divers just push the floating screens of dredge mud to the riverbank, where Lyles runs the crews that hose off the muck and sort any artifacts. But Deming delivers this item to Lyles personally. When she slips it into the palm of his left hand, his jaw drops involuntarily and his right hand starts patting his chest, as if to make sure that his heart doesn't stop.
It's about as close to a genuine Miller Point as anything yet found — and more evidence that the direct precursors to Clovis once passed through Allendale County. But because it's been found in a jumbled stream bed, it can tell us no more.
Close, but no car.
Other evidence for Clovis origins comes from farther afield. Dennis Stanford's Solutrean Hypothesis argues that the techniques required to make flat, double-edged blades and projectile points originated with the Solutrean culture in Europe and crossed the North Atlantic during the last ice age. Tools from the period that have been found in Siberia demonstrate no knowledge of such techniques, and they closely resemble the older-than-Clovis micro-blades found at Topper.
In his 2012 book Across Atlantic Ice: The Origin of America's Clovis Culture, Stanford also cites subtle supporting evidence from a genetic marker known as Haplogroup X. If he's right, then a small percentage of the ancient genetic heritage of modern Native Americans is shared with the Basque people of Spain and France — considered the last linguistic and ethnic descendants of ice-age Europeans. None of this can be proven beyond a doubt with existing evidence, but geneticists now agree that Native American DNA comes from more than one source, although most consider those sources to be uniformly Asian.
Regardless of whether the Haplogroup X people reached the Americas directly from Europe or migrated across Asia and the Bering Strait, their genes make up only a fraction of modern Native Americans' DNA. But Stanford, who has collected stone tools from the Delaware area that match the distinctive laurel-leaf shape of the classic Solutrean blade, believes evidence of their ice-age journey is hiding in plain sight.
Add a fluted base to a double-edged Solutrean blade, and the result would look like a Clovis Point.
Respect The Flute
It's the flute that makes a Clovis Point a Clovis Point, and to understand why that feels so significant to some people, here's a little thought experiment.
You're a prehistoric hunter and you need a new spearhead to replace one you just broke. Your parents taught you how to make spearheads as a child, and because you've mastered the skill, knapping a new one takes only 30 to 45 minutes. The unfinished weapon in your hand is already functional and deadly and could be lashed to a wooden shaft as-is.
So here's the last step: To flute your new spearhead, you now have to smack a big flake off of each side, creating a slightly thinner flat surface where you can attach it to the wood. Mess up this final step — which modern knappers say is easy to do, if not downright likely — and there's nothing to do with your ruined spearhead but throw it in the bushes.
Question: So why would you do that?
Yes, fluting makes a spearhead lighter and easier to attach, but people have been tying un-fluted points to sticks for tens of thousands of years. A fluted point looks really cool, but from a risk-reward perspective, it's a final step that makes little practical sense. In other words, to modern sensibilities this Clovis insistence on fluting looks quite a bit like they were showing off.
You hear lots of variations on that theme among the volunteers. On a continent where interchangeable micro-blade practicality seems to have been the norm, a band of newcomers who show up wielding kick-ass spears tipped with enormous Clovis Points would have looked like rock stars.
Some even ponder the question in economic terms: What if the Clovis Point was the original American status symbol, a semi-durable good that conferred an abstract meaning as well as a tangible value? What if hunters from multiple mingling cultures prized them for their lethal beauty and balance, offering them as gifts during chance encounters or trading them within loose nomadic networks?
What if, as Anderson puts it, Clovis represents the first example of an idea "going viral?"
The scenario suggested by Anderson and others describes the Americas around the end of the last ice age in terms that would have sounded outrageously radical just a decade ago. The continent is sparsely populated, but enough people have lived here over a long enough period of time to create connections between groups. Rivers and shorelines, once considered obstacles by 20th-century archaeologists, serve as their highways. Instead of isolated savages, the new idea imagines these early Americans as modern, problem-solving people who developed social structures capable of transmitting information across the limits of tribes and bands.
And, sometime about 13,500 years ago, someone from the Deep South came up with the most beautiful and deadly hunting spearhead ever created. Not only does everybody want their own, but everybody wants to learn how to make more. As more humans arrive from Asia, the conversion accelerates. Within a few short centuries the Clovis way of making tools covers North America and extends into South America.
"It shows up essentially out of nowhere across the continent," Anderson says. "And I think it's just much more realistic to envision it spreading across existing social networks than to figure out how people moving away from each other across a relatively empty landscape would be able to keep in touch."
It's all based on good science, like the story that came before it. But listening to the new generation cautiously describe this emerging network-based hypothesis, it's hard not to remember that old "What you see is what you bring" adage about archaeology. The mid-20th century men who looked at the Clovis in a time of global conflict saw tough hunters who subdued a hostile world and survived. Today, in a world shaped by the internet, we look at the same artifacts and see social networks, ideas, and communication.
What will the next generation see when it peers into these masterfully shaped stones?
Whatever the answers — and we may never know them — Tom Pertierra seems happy to hear archaeologists finally asking better questions.
"It used to be, 'Were (the Clovis) the first?'" he says. "Now it's who were they, where did they come from, and where did they go? Now they're real people."
After his death, in May, much was written about Sendak’s legendary crossness, but it was really just impatience with artifice. “I refuse to lie to children,” he said. “I refuse to cater to the bullshit of innocence.” There was no roughness in his delivery.
THE BELIEVER: Do you miss the city, living out here (in Connecticut)?
MAURICE SENDAK: I really don’t like the city anymore. You get pushed and harassed and people grope you. It’s too tumultuous. It’s too crazy. I’m afraid of falling over in New York. People are all insane and talking on machines and twittering and twottering. All that. I’m here looking for peace and quiet. A yummy death.
BLVR: A yummy death?
MS: I’m just reading a book about Samuel Palmer and the ancients in England in the 1820s. You were so lucky to have William Blake. He’s lying in bed, he’s dying, and all the young men come—the famous engravers and painters—and he’s lying and dying, and suddenly he jumps up and begins to sing! “Angels, angels!” I don’t know what the song was. And he died a happy death. It can be done. [Lifts his eyebrows to two peaks] If you’re William Blake and totally crazy.
BLVR: You do some teaching out here?
MS: I have a fellowship that started last year, two men and two women living in a house, and I go over when they want me to critique, or whatever the hell. I just talk dirty. They’re nice people. Young. It’s probably not very original, but old artists like to have young artists around… to destroy. I’m joking. I really want to help them. But publishing is such an outrageously stupid profession. Or has become so.
BLVR: More so than it was?
MS: Well, nobody knows what they’re doing. I wonder if that’s always been true. I think being old is very fortunate right now. I want to get out of this as soon as possible. It’s terrible. And the great days in the 1950s and after the war, when publishing children’s books was youthful and fun… it really was. It’s not just looking back and pretending that it was good. It was good. And now it’s just stupid.
MS: Because of Rupert Murdoch. His name should be what everything is called now.
BLVR: But he publishes you!
MS: Yes! HarperCollins. He owns Harpers. I guess the rest of the world, too. He represents how bad things have become.
BLVR: Have you thought of leaving HarperCollins?
MS: Oh, sure. But I don’t know a better house. They’re all in trouble. They’re all terrible.
BLVR: What do you think of e-books?
MS: I hate them. It’s like making believe there’s another kind of sex. There isn’t another kind of sex. There isn’t another kind of book. A book is a book is a book. I know that’s terribly old-fashioned. I’m old, and when I’m gone they’ll probably try to make my books on all these things, but I’m going to fight it like hell. [Pauses] I can’t believe I’ve turned into a typical old man. I can’t believe it. I was young just minutes ago.
BLVR: Is the problem with e-books partly a problem of color?
MS: Yes. Picture books depend on color, largely. And they haven’t perfected the color in those machines. But it’s not that. It’s giving up a form that is so beautiful. A book is really like a lover. It arranges itself in your life in a way that is beautiful. Even as a kid, my sister, who was the eldest, brought books home for me, and I think I spent more time sniffing and touching them than reading. I just remember the joy of the book; the beauty of the binding. The smelling of the interior. Happy.
BLVR: Are you happy now?
MS: [Sighs] My friends are all dying. They have to die. I know that. I have to die. But two friends died last week. I was completely broken by it. One was a publisher in Zurich. I loved him and his wife. It’s the loneliness that’s very bad. They’re doing what is natural. If I was doing what was natural I would be gone, like they are. I just miss them, terribly.
BLVR: You’ve never liked being called a children’s illustrator.
MS: I never started out as a children's book artist. What is a children's-book artist? A moron! Some ugly fat pipsqueak of a person who can't be bothered to grow up. That's the way we're treated in the adult world of publishing. I remember a publishing party a thousand years ago and we were invited, people from the children's-book department, and someone said, “Oh, you stay up so late!” Stupid man. But that's the attitude in this country. I'm an illustrator. I have to accept my role. I will never kill myself like Vincent van Gogh. Nor will I paint beautiful water lilies like Monet. I can't do that. I'm in the idiot role of being a kiddie-book person. It sounds like I'm complaining, but it has no effect on me. I have a good life. I'm strangely content now. Does that come through? Something changed; maybe it was his death. I can't complain about anything. I'm a lucky buck.
Go read the whole thing.
Our world operates on an evolved relationship between humans and machines, but the tricky part of that relationship is that many of the machines we rely upon most intimately function so far beyond a layman's capabilities that they might as well be magic.
DIY hacktivists have a saying: You don't really "own" a device until you can fix it. But what if we expanded this principle to a global scale? Do we own our lives, our place in the economy, our roles in a rapidly changing culture? Are we confident participants, or anxious consumers? Do we run the things in our lives, or do the things in our lives run us?
There are all sorts of reasons why Americans have traditionally divided "mind work" and "hand work," but when it comes to how we educate our children, I think we're long past the time when this division was helpful. Every modern child should have a positive, engaged relationship with technology, even as we challenge them to think creatively and abstractly.
And I can't think of a better technology for beginning that real-world relationship between children and machines than the bicycle.
Most of the business I get through my Uptown Bike Repairs service comes from simple adjustments and maintenance tasks that I could teach just about anyone over the age of 10 to perform. But the crucial lesson isn't teaching students whether to turn the barrel adjuster on their rear derailleur to the left or right. It's teaching them how springs and cables and ratchets work to move a chain up and down a series of cogs to improve mechanical advantage while riding under different conditions.
The other name for that is "Newtonian physics."
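And that Newtonian arithmetic is small enough for a middle-schooler. Here's a rough sketch of the gearing math a student could work out on paper or in a few lines of code. The tooth counts below are made up for illustration, not taken from any particular bike:

```python
# A derailleur doesn't change how hard you can pedal; it changes the
# gear ratio -- how many times the rear wheel turns for each full turn
# of the cranks. Big rear cogs trade speed for climbing torque.

def gear_ratio(chainring_teeth: int, cog_teeth: int) -> float:
    """Wheel revolutions per crank revolution."""
    return chainring_teeth / cog_teeth

# Hypothetical setup: a single 42-tooth chainring up front,
# a cassette of 11- to 28-tooth cogs in back.
chainring = 42
for cog in (11, 14, 18, 24, 28):
    ratio = gear_ratio(chainring, cog)
    print(f"{chainring}x{cog}: {ratio:.2f} wheel turns per pedal stroke")
```

Shifting to the 28-tooth cog drops the ratio from almost 4 down to 1.5, which is exactly why the hill suddenly feels easier: same legs, more mechanical advantage.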
So yes, adding a hands-on "Bike Shop" class to the middle school curriculum would do double duty for academics, but here's the bonus: When kids repack the ball bearings in a demonstration bicycle's steering column, they learn principles that apply to all sorts of machines, AND they develop a new relationship with their own bike back home. No longer is it some mysterious, confusing, occasionally frustrating thing. Now it's a system.
Once you understand a system in principle, you relate to it creatively. You own it mentally. And once you own one machine mentally, the second machine comes more easily.
But why a bike? Because bikes are relevant. Children ride them. Teenagers customize them. Adults use them for work and pleasure and fitness. Most of the tools required for most bike maintenance can be found around most homes. And when you teach a child to fix her bike, you match the rewards of competence with the sense of freedom that a bike represents.
Maybe students in rural districts don't need this kind of education. I grew up country, and I think that's one reason why I've felt more naturally comfortable around tools than kids who grew up without access to them. But for city kids, many of whom will commute to work on a bike as adults, learning physics, shopcraft and a healthy relationship to technology is an academic trifecta.
It's not that complex a concept. But then again, neither is a derailleur.
Photos: Our neighbor Qua'vonta Washington, 12, of Charleston has been working on bikes for about two years. He knows his way around basic adjustments, flat fixes, etc.
Long-time readers of Xark (and yes, my web analytics indicate you're still out there) will have noticed a number of changes over the years, but I'd say none have been more profound than my relatively recent decision to remove the option to comment on posts here. It's an indication of how the world has changed, but also how Xark's position in the blogosphere has diminished, that this decision generated not a single response.
But here's why I did it.
I was an early pro-comment partisan in the news business, and given the 2005-era mood of American newsrooms that put me on a very small island. Since I had a web-related job at the paper, I spent a good amount of time studying the issues and trying to come up with solutions. Along the way I found great examples of smart community moderating policies and technologies, and purchased the CMS that offered the most-evolved comment management tools. I believed then, as I believe now, that the ability to comment and share across horizontal, informal networks is the killer app for the 21st century.
Which sounds nice.
Unfortunately, newspaper and other traditional-media websites, for all their hand-wringing concerns about libel and civility circa 2005, are typically the worst offenders when it comes to building quality comment cultures. We've taught users bad habits and turned comment sections into troll ghettos.
The thing we're slow to understand is just how rapidly the culture surrounding the Web is adapting to the tools we use for creating and connecting to content. Because in the end, the quality of your comments is really a reflection of your online community, not the snazziness of your comment control dashboard. I think Xark's experiment in creating a community that wasn't focused on one topic was great while it lasted, but the new model of that kind of general engagement is a well-cultivated list of friends and follows on Facebook, Twitter and Tumblr.
What really changed between 2005 and 2009 was that regular people left blogging for social media platforms that far better suited their purposes. Blogs, once known for short blurbs and links that fit the emerging TLDNR attention span of modern readers, became the place where actual writers went to compose longer thoughts.
In that context, Xark stopped being a community of readers and writers between 2007 and 2008. Instead of fighting that, I embraced it. One of the things that came out of writing about that openly was that it revealed what I already expected: The last truly engaged members of the original Xark communities were its trolls, and I blocked them. They howled elsewhere, and then were silent.
Once a community stops functioning, its comments turn to spam. And I mean that literally. Most of the open-comments moderation I did here in 2010-11 was deleting spambots. Turning on "moderator approved" comments just meant that I did the same thing, only less frequently. And the human comments often weren't much better.
Last year I wrote an essay here and did what I normally do in today's media ecosystem: I posted it to Facebook and Twitter, where I can follow much of the conversation about my ideas, though not as fully as I can via on-post comments. And a day or so later, someone left a comment for moderation that I considered nothing more than a hateful turd deposited atop work I wanted to share with others.
In 2005-09, I'd have not only left that comment on the site, but responded to it, because that was the ethos that meant something at the time. But in 2012 I stared at that isolated piece of ignorant hate mail from some anonymous jerk and hit the delete button. There's no conversation going on here. The conversation moved. And I'm not about to let stupid strangers smear shit all over work I value based on nothing more than an outdated principle.
Still, I hung on for a while, unwilling to apply my new understanding of Xark's place in the new world. Call it nostalgia.
But as my kids say, you have to get with the now. And now this is the place where I come to write. You're free to talk about it someplace else.
So if you like that, great. If you don't, so what? It's a big internet.
Earlier this week, during a Facebook discussion about assault rifles on my friend Bill Strasburg's wall, one of his pro-gun-but-civil friends replied to something I'd written with this nifty artifact of today's bizarre conservative political culture:
"Dan, When the gov't attempts to oppress you, what are you going to use to fight back? We've seen in history, multiple times, where gov'ts have murdered millions of people after they took their guns away."
The reason I like this statement so much is that it represents a core element of the post-rational fantasy world that is currently collapsing around the heads of so many (overwhelmingly white) Americans these days. Despite the fact that our Second Amendment grants us gun rights predicated on the value of a well-regulated militia, the final fallback position for today's pro-gun partisans is the argument that untrained civilians need assault rifles so that when our own government's black helicopters come for us, we'll be able to fight back. Which our Founding Fathers could have specified as their constitutional intention, yet mysteriously didn't.
It makes me wonder. How many white American men my age have based their adult gun politics on the experience of watching the original Red Dawn as teenagers in 1984 and thinking "That looks awesome! WOLVERINES!"
But here's the thing so many of these would-be insurgent guerrillas simply fail to grasp, no matter how you explain it. In a world where governments possess spy satellites, drones, wiretaps, sniffers, intelligence databases equipped with Non-Obvious Relationship Analysis agents, plus all manner of covert, overt and law-enforcement firepower, your little Bushmaster looks... cute.
Remember Saddam Hussein's Revolutionary Guard? They had some of the best tanks money could buy from the old Warsaw Pact. They spent the early stages of the first Gulf War hiding those tanks, because tanks just made better targets for our aircraft, and when they finally brought them out to fight, those tanks just made better targets for the guys in my old unit.
Sure, the Iraqis were far more effective as guerrilla fighters in the Second Gulf War, but not effective in the sense of "winning." Once our armed forces adopted a more evolved counter-insurgency playbook, those Iraqi "freedom fighters" (or "Islamic terrorists," depending on your politics) were eventually reduced to something approaching irrelevance.
Which brings us to today's collapsing conservative fantasy world.
My reply to Bill's friend referenced the changing landscape of war and resistance, but concluded with this: "We do have power as citizens, and we need to use it to fight against oppression. But it's not gun power."
If the time ever comes when American citizens must choose between surrendering their liberties and resisting their government, the last thing we'll want to do is pick up assault rifles and take to the streets. Picking up an assault rifle and taking to the streets just makes you a high-resolution target. Holing up in your home with an assault rifle just isolates you for easy removal. Trying to form a group of people with assault rifles just makes you visible to communications monitoring, and then here comes the drone strike.
Our power as citizens today is primarily based on our ability to communicate in the open and to organize -- loosely -- in full public view. It's the power of information and networked transmission that can bypass mass-media propaganda channels. It's the power of distributed networks, without identifiable nodes of control. Our own Department of Defense created the original Internet as an adaptable communications system that could survive a nuclear attack because it could find a way around whatever portions of the system the enemy destroyed. It works that way for regular people, too.
Corporate interests in the "online market" have pretty much paved over the Web idealism of 2005-07. But it's important that we remember that people power, regardless of whether one lives in a democracy or a dictatorship, is based on information, communication and clarity of purpose.
In other words, assault rifles don't deter government oppression. People do.
Here is a little diversion for the nail-biting that may occur this evening: a BINGOBAMA game. Each little square is filled with things a pundit might say. There are four different cards and some suggested rules. The color pages should print nicely on letter-sized paper. If you don't have Adobe Reader, get it here: http://get.adobe.com/reader/
The day after President Clinton's speech to the Democratic National Convention, the Associated Press produced a “fact-check” that could be best described as the perfect parody of the systemic flaws I described last week. Rather than offering a limited, measured evaluation of the factual claims in the speech, the AP treated its fact-check as an opportunity to conflate fact with conflicting opinions of Bill Clinton's legacy. As Talking Points Memo headlined its post on the article, “True, but also, Lewinsky!”
Such absurdities line up nicely with my critique of the fact-check status quo, and they're useful as evidence in this “What About The Facts?” conversation we're finally having. But the problem with my proposed solution to our fact-checking problem is that my information-standards system doesn't offer any short-term options for the failure we're experiencing right now. Even if we were deploying the first semantic content-management systems in newsrooms today, the emergent features I described would still take years to develop.
So what could we do, right now, with only the resources news organizations have on hand?
Back when I was working as a mechanic in a bike shop, I used to wonder whether a journeyman mechanic could make a living by offering a mobile repair service. Someone needs a tune-up, you schedule an appointment and show up with your toolkit, parts and stand. Low overhead, only one employee to pay.
Only it's not that simple. Dig into the business plan for something like that and it's easy to see how you could wind up starving. Quickly.
So it wasn't until this summer, while fixing a neighbor's bike, that the thought of trying a variation on the mobile repair idea -- except only part-time, and on a very limited geographical basis -- crossed my mind.
When I ran those numbers, they came up OK. Not as something you live on, but as something that fits pretty well within the freelance-writer-new-media-street-hustler life I've been living.
So after building a site and investing in some additional tools during the month of August, today I unveiled my CHS Uptown Bikes service. And a few minutes later, I booked my first service call.
I'm withholding judgment, obviously. I'll need to do quite a few jobs just to pay for my TS-2.2 truing stand, for instance. But we're off and rolling, and we'll see where this leads.
P.S.: If you love me, order one of our T-shirts. Pleeeeeeese...
My interest in fact-checking grew out of years of running political coverage in the Carolinas, in places where facts were often far less important than the nameless angers that animate our Southern culture. Doing that job is, in fact, a political act, though not a partisan one.
A political editor walks the line between trying to find the truth and trying to publish the truth, and must constantly compromise the latter in the service of the former. That's because when the balance of truth favors one party over the other, presenting it that way destroys the journalistic fiction that Jay Rosen dubbed "The View From Nowhere." Media companies like that view because, in theory at least, it allows them to sell their newspapers (or TV programs) to the entire electorate.
In other words, if the Republican Party produces 10 fact-mangling whoppers to every arcane Democratic stat-fudger, you've got a serious problem as a journalist. You simply can't present that ratio as-is without looking like a liberal hack.
So here's what we did -- what I did -- and what others have certainly done as well: I downplayed Republican dishonesty while judging Democratic failings with an unfairly harsh bias. I applied this to assignments, to the tone and presentation of stories, and to the various gimmicks we invented to try to evaluate claims. The results didn't reflect the true scale of the dishonesty gap, but they at least demonstrated that a gap existed. At least, they had the potential to demonstrate the gap, but only to very careful readers with a knack for drawing subtle inference. Because we could never come out and tell you what we all knew in the newsroom: Yes, "all politicians lie" (a cynical dodge if ever there was one), but the modern Republican Party is based on a set of counter-factual and faith-based beliefs, and has been for years. Not only has that foundation consistently put the party on the wrong side of fact-checkers, it has led us to where we stand today, with Mitt Romney running a campaign that has abandoned even the pretense of fact.
That dynamic is why, when I saw the rise of new media in the mid-2000s, one of my first thoughts was that journalists might be able to use technology to improve the way we cover and evaluate political issues. I studied it. I experimented with it. And eventually I came to an unhappy conclusion:
All attempts at systematic fact-checking of political statements will ultimately fail until the organization conducting them embraces an independent and verifiable claim to authority. Anything less is just rearranging deck chairs.
And lastly, as a special bonus:
The Five Stages of a Military Operation:
If you've read this blog more than once, you probably know that I've been talking about ways to integrate data collection and storage into journalism for years now. Along the way I've learned things about data and standards and business and finance and start-ups and non-profits.
But in the end, it really just boils down to "Do you have the money to do it?" Because while there are various ways you can get money for your idea, most of them (beyond my favorite, "Be born with it") come with a variety of "On the other hand" caveats, many of which have nothing to do with the idea itself.
That's why I've taken my idea for a networked semantic word-processor (the first step toward an integrated product we generically call a semantic content management system) to the Knight Foundation. My grant request is small by Knight standards: Just enough to fund development of an open-source version of the tool, screw up in a major way at least once, and run some user tests.
Once we've got a working version of the tool, it will either attract investors and collaborators or it won't. I'll be satisfied either way. If it's a valuable tool that people want to use, we can build a business that gets the concept off my desk and into the hands of a professional team. If it works but nobody wants it, then I can walk away knowing that I fulfilled my responsibility to the idea. And if it doesn't work, then the technical experts I've consulted will be really surprised.
So here's the drill: If you're interested in the various semantic journalism ideas that I've presented here over the years, or if you're just tired of me obsessing over something so geeky and want me to get back to more important things, like zombie movies and ugly dogs and Santa sex and video transcripts and snarky graphics, you need to visit my proposal on the Knight Foundation website.
See that grayed-out "Heart" icon? Click it (register through Tumblr if you have to, but just click on it). Have something supportive to say? Leave a comment at the bottom of the page. I don't think a small amount of buzz equates to a big advantage in a grant competition, but I suspect that the Knight people are like most of us, and pay a bit more attention to ideas that generate interest. If you wonder about this, stop on the sidewalk and stare up at a fixed point in the sky. Then count the number of passersby who stop and look into the sky with you.
Really. I'll wait.
The other day someone I follow pointed to a post on Digital Tonto called "The 7 Greatest Ideas in History." And I thought, well, that's kind of a gimme. There are a lot more bad ideas out there than good ones.
Here's my list...
In the summer of 1940, the British were the last of the Allies standing and the Nazis pretty much had England by the short hairs. In preparation for its cross-channel invasion of the British Isles, the German air force began ruthlessly attacking radar stations and British interceptor air fields on Aug. 12. By early September, the Royal Air Force had reached the breaking point.
But in late August the Germans accidentally bombed the London docks, which gave Churchill an excuse to launch a token bombing raid on Berlin two nights later. More little raids followed, and though the results were insignificant, the damage to Nazi pride wound up being catastrophic. On Sept. 7, with the RAF on the verge of collapse, Hitler ordered his bombers to stop attacking air fields and start attacking British cities as punishment. And as good as those Nazi bombers were at attacking air fields, they were hopelessly ill-equipped for attacking cities.
Given a respite, the British fighter squadrons recovered, and by the end of the month they'd kicked the Luftwaffe's ass. Hitler had to call off the invasion, and, with nobody else to conquer in Western Europe, wound up invading his ally, the Soviet Union, instead.
Nobody liked New Coke.
What people tend to forget about the Bush era was that the Republican Party engineered the disaster. After the Bob Dole debacle in 1996, G.O.P. kingmakers decided the nomination was too important to leave to the voters, and so the party set out to find a candidate before the 2000 primaries and get everyone important behind him. This ill-fated expedition led them to the popular governor of Texas.
After beating back an early challenge from John McCain, Bush cruised to the nomination, and though he couldn't win the election, there was a Republican majority on the U.S. Supreme Court.
Bush proved uniquely unqualified for anything resembling presidential leadership. Eight years later, with America's economy in shambles and our reputation in tatters, you have to wonder whether those smug GOP kingmakers wished for a do-over.
There are other Microsoft disasters, but Internet Explorer is the poster child of ineptitude. Why is it so bad? Nobody really knows.
But here's how bad it is: Almost every website in the world has a section in its code called the "Explorer Hack." It's a series of work-arounds designed to make standards-compliant, quality web code actually work with the buggy mess that is Internet Explorer.
Fortunately, the only people who still use IE are Microsoft employees and your grandmother.
In 1991 he published American Psycho, which was both a really bad novel and totally offensive. But when you're "the voice of your generation," you think you can get away with anything. The controversy sold a bunch of books, but it wiped out Ellis as an American literary figure. Thank goodness.
Sorry about doubling down on the RAF, but I've got a thing for historical aircraft, and this one is just deliciously stupid. The idea was to build a two-seat interceptor with no forward-firing guns, but four backward-firing machine guns in a turret behind the pilot, because... wait for it...
Well, actually, there just isn't a good explanation. In theory one could fly the Defiant under and kinda to the front of a German bomber and shoot sorta up and back at it, but that requires the cooperation of the enemy. So Defiant crews just sort of flew around, unable to shoot at bombers and forced to run away from approaching fighters. Must have been lonely, not to mention nerve-wracking.
And yet the British built more than a thousand of them. In the end, only the Canadians could be persuaded to take them, and that's only because Canadians suffer from congenital politeness.
To recap Sunday's New York Times story about metro newspaper companies that are now cutting back from a daily publishing schedule, some companies are doing it because it no longer adds up for them to put out a daily paper, and some people at other companies are saying that's a bad idea, because it breaks a daily reading habit and leads to cheap, lousy journalism.
Here's the important takeaway, and it's something I've been writing about for years and will keep touting until it happens: There is no bright future for journalism on the cheap. The answer to our current glut of junk info cannot be more junk info, and the reason that more companies should be moving toward a sustainable model of less frequent, higher-quality print publication paired with efficient digital-first journalism is because (for some of them) it's the only practical option available today that includes the words "quality" and "sustainable."
To recap the important points via links...
So when NYT reporter Christine Haughney quotes newspaper veterans who say things like this...
“They want them to produce more blog posts a day and not even worry about putting things together in a more thoughtful package,” Mr. (Brian) Thevenot (a St. Louis Post-Dispatch business editor who left The Times-Picayune in 2009) said. “The Times-Picayune has a sterling tradition of enterprising journalism. That’s why people are so mad. That tradition is being thrown under the bus.”
... I alternate between varying flavors of despair and annoyance.
Despair, because short-sighted policies like those described by Thevenot are the industry standard, not the exception, regardless of the technology. The digital tools at our disposal today could have been used to increase quality, build new relationships and increase credibility. Instead, media companies routinely view new technologies as nothing more than new ways to cut staff costs. That's because they're clear-cutting their last forest, not planting new trees for the future. Instead of growing a business that can support quality journalism, they're scaling back quality to match their expected earnings.
Annoyance, because it absolutely doesn't have to be this way.
So if you're a news executive (or some 22nd century digital archaeologist, searching for answers as to why and how journalism failed in America), and you stumble across this post, let me offer you a simple set of instructions for creating a sustainable system for quality journalism given the tools available today (while I work on trying to build new ones).
I never said it would be easy, because quality is never easy. But quality is the only way through what lies ahead.
This list of facts about Hampton Park in Charleston, SC, was compiled by neighborhood resident Kevin Eberle and posted on Facebook on May 31, 2012, by Marshall Walker. I'm reposting them here to make this valuable community knowledge easier to find via Web search later, once the original post inevitably disappears into the proprietary memory-hole that is Facebook... dc
On the subject of Hampton Park, did you know:
• That it was originally the “front yard” of a large plantation called “The Grove” which spread from Rutledge Ave. to the Ashley River and had a large, masonry house on the bluff overlooking the river with an avenue of trees in both directions, a large fish pond on one side, and an orange grove (hence, Grove St.)?
• That the park area was used as a staging area during the Siege of Charleston of 1780, the worst defeat of the colonists during the Revolutionary War?
• That the plantation was destroyed during the Revolutionary War, and the land was divided up for sale since the couple who lived there had had no children?
• That the South Carolina Jockey Club bought a large portion of the land and operated the Washington Race Course there for almost 80 years, during which time it was the most important social week of the year each February?
As a freshly minted (2010) soccer fan who has traveled for one U.S. Soccer tournament and hopes to make a second trip this fall, I've had the sense that I was riding a rising tide of American interest in the sport (1 million+ Americans watched Chelsea v. QPR; MLS attendance is up; new wave of soccer-specific stadia being built around the country, etc.).
But skeptics have a point when they question why this increase in interest is different than previous ones. It's not like American soccer fans haven't been predicting success for decades, even as the sport's fortunes rose and fell. So am I being rosy-eyed optimistic?
Well, here's a new reason why I think I might be on to something. The people at U.S. Soccer are smart enough to understand how to use Twitter to promote their brand, and that bodes well.
If you follow U.S. Soccer (and 334,262 of you do), then you might see a Tweet from the organization offering to add you to a "fan roster" if you plan to attend an upcoming match. If you reply, U.S. Soccer puts your Twitter account into a public list of fans attending that game.
Here's why that's brilliant.
Soccer fandom, more than any other sport I've followed, is about a shared communal experience. Soccer fans stand, they sing, they bang on drums, they dance in the aisles. That's true at club matches, but U.S. Soccer relies on fans who are willing to travel hundreds or thousands of miles to support our national teams. That means you show up in a city not-your-own looking for a community of fans to join, places to meet, special deals on beer, etc.
Supporters groups like The American Outlaws and Sam's Army help make those connections, but here's U.S. Soccer stepping directly into the matchmaking game. They're not selling something. They're improving their product by improving the fan experience.
That's how you grow a sport today. Not by hyperbolic, expensive carpet-bombing marketing campaigns, but by partnering with your fans in expanding the future of the thing you both love.
Soccer may or may not continue to rise in the U.S., but there's a new generation of fans falling in love with the game here, and if the institutions involved keep making savvy little moves like this one, they might just get somewhere.
Today's news about the Media General/Berkshire Hathaway deal reveals just how dire things have become for our newspaper princes. Unable to turn its newspapers around, Media General has traded its print properties and a big chunk of its future value for $142 million in cash and enough financing to refocus its business on its television stations.
With 63 papers (along with their websites) selling for an average price of $2.25 million, it's hard to believe that the valuation even comes close to covering the assessed value of the real estate involved. This is a company declaring "Take them! We don't know what to do with them!" (Here's what I like about the deal)
But what I want to talk about is actually this one graph from the NYT story linked above. This is Berkshire Hathaway boss Warren Buffett, talking to shareholders at their last meeting about the need for paywalls at their previously acquired newspaper properties:
“I don’t know of any business plan that has sustained itself that charges in one version and offers the same version free to people,” he said.
Buffett's idea is worth discussing not because it's necessarily wrong, but because it assumes the wrong answer.
I'm on the record in numerous places with my argument that newspaper companies were giving away content long before the web came along. Newspapers are in the advertising business, not the content business. If they could get you to look at their advertisements without having to pay to put content around them, that's exactly what they'd do. The problem with newspaper websites isn't that the content is available for free, it's that the advertising value of that content is so low that it doesn't make up for declining print profits.
Print subscription charges cover some portion of the cost of paper, printing and delivery, not content costs, but the paywall advocates have a point when it comes to the consumer's perspective. Subscribers think they're paying for content, even though all they're really doing is defraying overhead. So the strength of the paywall argument is a consumer-mentality argument, not an "our content has tangible value absent advertising" argument.
From that perspective, Buffett's statement has some limited validity. It argues that it's worth hobbling your web products to protect the value of your print product, an idea that's sometimes referred to as The Arkansas Model.
Here's the problem with that, and Charlestonians have been learning this lesson first-hand ever since The Post and Courier launched its paywall on May 1: Most newspaper content just isn't that unique or valuable. In addition to the websites maintained by its television news stations (who will never charge for online content), the city now benefits from a variety of local online news sources. There may be no competition for the title of "local metro daily newspaper," but there's now tremendous competition for news.
As an example, of the 18 promoted content links this hour at The Digitel (a local news and information aggregator), only one of them contained a link to a Post and Courier story, and that link was from a May 2nd update on a story that was just updated today ... with a link to a story by The State, South Carolina's top newspaper.
Having done work for The Digitel, I can promise you that this is a significant change. The Post and Courier has articles about some of the same 18 subjects, and in the past The Digitel likely would have chosen to link to those pieces. But nobody links to content that's behind a harsh paywall.
So the P&C is following Buffett's advice, but who are the winners in this competition? Well, basically, everyone who used to be in competition with The Post and Courier for web traffic, particularly The Charleston City Paper, The Charleston Regional Business Journal and a variety of niche sites.
But here's the part that's most interesting: The Charleston City Paper and the Charleston Regional Business Journal are weeklies. And if you think about their approach to online content in light of Buffett's statement, that fact alone may be more important than the differences in their business models.
Think about it. As a free alt-weekly, The City Paper gives away all its content, both online and in print, which conforms to Buffett's observation even while flipping its conclusions. Meanwhile, the CRBJ charges for print subscriptions and maintains a regularly updated, locally reported, free website. Again, the CRBJ conforms to Buffett's idea, because nobody conflates a live website and a weekly publication as being different versions of the exact same thing.
So why aren't these companies pushing for paid content like so many metro dailies?
Because they've worked out a more competitive, more sustainable business model.
The City Paper print edition is valuable. It makes enough revenue to support a small full-time staff and pay for the original content it publishes (much of it written by freelancers). And because you can't break news very often in a weekly in the middle of a competitive media market, the journalists there never really confuse their web product with their print product. They use the web to keep people current if they've got breaking info, and they use their weekly print edition for features and content that's interesting regardless of whether or not it's published in-cycle.
The CRBJ, which has a nice print-subscription business, used to use the web to compete for timeliness with the daily paper. Now that The Post and Courier has opted to charge for its business content, local business people will have to decide whether online access to the P&C's version of local business stories is $9.99 a month better than getting the same news for free from the CRBJ.
If you take Buffett's statement in a more informed context, it really argues either for giving away everything, or making a clear distinction between your web product and your print product.
The City Paper -- and other successful, sustainable alt-weeklies around the country -- are proof that you can survive with an all-free journalism model. The CRBJ demonstrates that there's a natural partnership between branded web coverage and a less-than-daily print product. And by staying viable online, both are positioning themselves to succeed with the next generation of news consumers. In the fight for future readers, paywalled metros have unilaterally disarmed.
Metro management teams could change their one-bundle-fits-all philosophy -- and their publication schedule -- and find success in similar ways. But as we've seen over the past few years, fundamentally changing what they do isn't typically much of an option for those guys.
My question is, how long will it take for the "Oracle of Omaha" to figure this out?
(Transparency: I worked for The Post and Courier from 1994-2008 and have done freelance work for The Digitel and the company that owns the Charleston Regional Business Journal. I write freelance articles for The Charleston City Paper. dc)
There's a nice little exchange going on between my friend Steve Buttry and Emily Olson, managing editor of the Register Citizen in Torrington, Conn., about a powerfully evocative question: How can we recapture the joy and excitement that was once part of the group experience of committing journalism?
It's an important question, because what outsiders don't really know about the old world of newspapering is that once upon a time, it was a lot of fun. People (myself included) routinely worked an extra 10 to 15 hours a week without filing for overtime simply because the job could be such a rush.
I'm not suggesting that unpaid overtime is a good thing. I'm saying that a rewarding workplace is a tangible company asset. Neither am I suggesting that everything that made newspapering fun is worth conserving. Many of our Mad Men-era attitudes were ethically irresponsible, intellectually lazy and infuriatingly arrogant. But newsroom morale isn't a touchy-feely issue. It's a quality and productivity issue.
Steve offers Olson six pieces of advice, then gives news executives four additional suggestions. They're all good, but I'd like to add six items to that list of executive advice, just to get it to an even 10.
1. Stand for something: The best people in the news business are people who came in with a sense of public service and preserved it. That didn't make them all crusaders, but they did their jobs with a sense of commitment to a higher calling, and the industry encouraged those high-minded ideals until its profit margins dropped.
Standing for something today -- to the extent that you demand quality and integrity of yourself and others -- will either burn you out or put you on the next layoff list.
In other words, if you want your people to work with inspiration, try inspiring them instead of disrespecting them. Attitude reflects leadership.
2. Fire all those assholes you promoted: Come on, you know you did it, and you did it on purpose. You promoted the yes-men, the bullies, the vengeful hacks. You gave authority to the sycophants who made you feel good about yourself despite your confusion. You replaced experienced leaders with the wisdom to recognize bad ideas and the integrity to oppose them with sociopaths who would aggressively repeat utter nonsense so long as the job came with a corner office and a small raise. You created a Reign of Terror and you promoted people who would do the dirty work so you could keep your hands clean. "The beatings will continue until morale improves" is a funny joke, but it's not a business plan.
3. Cut back on your publication schedule: You cut your news staff by a third while increasing its workload. Then you mandated "aspirational" weekly story quotas. Plus, once you gave up on trying to kill social media, you raced toward it like a binge drinker who spies an open bar at his nephew's wedding. You enforce senseless policies that encourage smart people to Tweet inanities. You think nattering about your paper on Facebook is somehow going to make money for someone not named Mark Zuckerberg. Meanwhile, you're forcing your overextended staff to churn out depressingly threadbare blogs, unread live-web updates, awkward podcasts, and astoundingly awful video.
Responding to the problem of the cheap-information glut by trying to produce more cheap information is, to say the least, counterproductive.
What you really ought to be doing is cutting back on the number of print products you publish each week. Put all your breaking news online and do a better job with your websites. Then raise the quality standard for the articles in your one-to-three-times-a-week print products. You don't produce enough interesting copy to fill seven metro papers a week. But if you had fewer editions, you could write stories that people couldn't find elsewhere, and you might accidentally make yourself relevant again.
As an added bonus, you could drop your Associated Press subscription. Think of the savings!
4. Stop doing stupid stuff: Here's an ugly little secret about your company. You think that a few old men (and a few old women who've joined the old-men club) in your outfit should enjoy a monopoly on all the relevant information and insight into business practices. On the flip side, you have this quasi-religious belief that nobody above the age of 26 understands digital media. So you run around shoving outdated business ideas down people's throats, while promoting callow cleverness with an astounding lack of review.
Treating everyone in between as if their only useful roles are silent-worker-drone or sadistic-plantation-overseer isn't exactly a recipe for fun-time creativity.
You know what really kills journalists' morale? Coming to work and finding out that you're about to make them do something obviously stupid. Again. Or reading a quote from their boss that makes them want to slam their heads against the wall just to make the pain go away.
5. Listen to your insightful peers: I'm awfully hard on the industry as a whole, not because I'm particularly mean or bitter, but because the industry as a whole has earned the nation's disdain. That's not to say there aren't enlightened, courageous, visionary executives out there. I've met a few. But they're outnumbered, outgunned in industry groups, and generally disrespected. Many have given up and cashed out, but a few are hanging on, hoping to make their communities better. They may not be right about everything, but they deserve to be heard, not mocked.
6. Go out of business: Seriously. You can't change, and you really have no interest in changing. You've had years to figure this out, you waited until you were broke to consider the problem, and yet you still don't invest in R&D. You think innovation is surfing the leading edge of the status quo, and let's be honest: All you really want to do is kill Craigslist, make people pay to read your increasingly crappy content online, and jack up your display-ad rates.
Since you can't do that, you're following the Harvest Model -- cutting costs while you milk every last penny out of your obsolete business before shutting it down. But that will take years, and the longer you take, the harder it will be for competent start-ups to fill the niche you occupy in your local media ecosystem. Shut down, liquidate your physical assets, retire to Florida. You'll enjoy the sunshine, and we'll enjoy the journalism that follows in your wake.
There's an excellent future out there for the American news media, and I'm optimistic that we'll get there, somehow. But the same systemic flaws that are killing the morale of journalists are also standing in the way of that future. I suspect that making newsrooms better places to work again will also make our companies more viable, but I can't prove that. There's just not enough data from actual experiences.
Back in 2009, while contracted to work on a doomed content-repository project, a flash of insight struck me: The problem with grand visions of the Semantic Web was that they all assumed a top-down structure. One wickedly clever set of rules to wrangle every fact. A global ontology.
It didn't make sense. Global ontologies are like Soviet Central Planning. Rules are meant to be broken. And top-down systems are crashing and burning everywhere you look.
Plus there was another un-fixable problem: Everyone with money to spend on these projects wanted machines to do the yeoman work. Because machines are cheap.
Think about that for just a moment. We're talking about organizing the sum total and nuance of human knowledge, but the entire world assumes somehow that this is a job for machines. That the best way to understand the complex, pattern-based output of human intelligence and language is to assign computers to decode it after the fact.
People think that makes sense because they think computers are magic, not machines. Meanwhile, in the real world, Text Mining Engines aspire to 75 percent accuracy. That's why our content-repository project failed. The client's product specifications couldn't be satisfied via the vendor's pathetic 75 percent accuracy rate.
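To see why a 75 percent accuracy rate sinks a project like that, run the numbers. If each extracted fact is right three times out of four, the odds of a record where every fact is correct collapse as the facts pile up (this sketch assumes the errors are independent, which is a simplification):

```python
# Illustrative arithmetic only: a text-mining engine that tags each
# fact with 75% accuracy produces very few fully correct records,
# because the per-fact error rate compounds.
accuracy = 0.75

for facts in (1, 3, 5, 10):
    p_all_correct = accuracy ** facts
    print(f"{facts:2d} facts -> {p_all_correct:.0%} chance of a clean record")
```

At ten facts per article, barely one record in twenty comes out clean, which is why a spec demanding reliable output can't be met by that vendor.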
So one night I asked myself: Could you reach the goal if you flipped the script on every core assumption? Not top-down, but bottom-up? Not machine intelligence, but human intelligence, assisted by machines? Not one "global graph" but many interconnected "directories of meaning" based on capturing machine-readable statements of fact during the production of human-readable articles?
And of course, the answer is yes, you can do all these things, and you can do them profitably, so long as you follow two simple rules: 1. Build tools that make it easy to publish directories of meaning; and 2. Give users the power to make their directories cooperate with other directories.
Once you do that, the need to create perfect top-down rules for knowledge disappears, because you'll have harnessed the power of emergent properties. If you build a good directory, others will want to use it.
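The two rules above can be sketched in a few lines. This is a minimal toy, not anything the original project actually built, and every name in it is hypothetical: a "directory of meaning" is just a store of machine-readable statements of fact, and "cooperation" is a merge operation that needs no global schema:

```python
# A minimal sketch of a "directory of meaning": machine-readable
# statements of fact (subject, predicate, object) captured during
# publishing, plus a merge so directories can cooperate bottom-up.
# All class and method names here are hypothetical illustrations.

class Directory:
    def __init__(self, name):
        self.name = name
        self.statements = set()   # (subject, predicate, object) triples

    def state(self, subject, predicate, obj):
        """Record one machine-readable statement of fact."""
        self.statements.add((subject, predicate, obj))

    def lookup(self, subject):
        """Everything this directory asserts about a subject."""
        return {(p, o) for (s, p, o) in self.statements if s == subject}

    def merge(self, other):
        """Rule 2: let directories cooperate -- no top-down ontology."""
        combined = Directory(f"{self.name}+{other.name}")
        combined.statements = self.statements | other.statements
        return combined

# Two independently published local directories...
charleston = Directory("charleston")
charleston.state("Hampton Park", "located_in", "Charleston, SC")

montclair = Directory("montclair")
montclair.state("Baristanet", "covers", "Montclair, NJ")

# ...cooperate without anyone imposing a global schema.
shared = charleston.merge(montclair)
```

The point of the sketch is that nothing top-down is required: each publisher records facts in its own directory, and value emerges from directories choosing to combine.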
What's so hard about that?
But people didn't get it. Most still don't, for lots of reasons -- including our very human inability to hear anything new without forcing it to fit into old assumptions.
They're about to start getting it, though, because now Google gets it.
The search giant has constructed a bottom-up directory of meaning. The company calls the product "the knowledge graph" and the service Semantic Search. Being Google, the company still sees the problem as a data-recovery challenge, but that doesn't matter. Once such directories exist, a semantic economy based on the value of machine-readable definitions is born. Once we begin feeding that market, information becomes a public commodity. And once we give people the power to define the meaning of their own words, and then to share those meanings in a mutually beneficial way, we'll have tapped into the same emergent property that generated Wikipedia.
It's not that complex. It doesn't require any exotic programs. But it does take vision, discipline and the tiny bit of courage required to buck the status quo.
The World being the terrifying, beautiful mess that it is, we make our way through its mazes less like wandering, wondering, free adults, and more like children hurrying through a bad neighborhood on their way to the store with a shopping list from mother: fearfully, avoiding eye contact, with tangled senses of purpose and possibility and dread. We seek happiness and abundance and approval, but mostly we're avoiding trouble.
Given enough successful errands to the store, eventually we learn the Grants and Limits of our Authorization.
This One is authorized Work and granted the presumed nobility of his daily sacrifice. So Working Man receives his ration of respect in the abstract aggregate, but it never comes addressed with his name on it. Politicians compose hymns to his sainted martyrdom, but Working Man is authorized neither an individual face nor a privately owned voice. He is prohibited the joys and risks of freedom but granted songs that celebrate such notions on FM radio.
This One is authorized Poetry and may Wander Lonely As A Cloud, but Poetry Girl may not exhibit hard edges nor stand solid in the lovely flowing stream of This-and-That citing gear ratios and irrefutable measurements. That's not why the system authorizes poets. Your poetry license authorizes you one, but never the other.
And This One... well, you get the point. This One is mentioned only because Essayists are authorized to process Controversial Thoughts, but required by Convention to list three examples of each.
And so on.
We don't "find ourselves." We receive our authorized roles and then trim away all the bits of Self that lie beyond the edges of the cookie cutter.
If we're lucky, we eventually acquire enough perspective to see Ourselves within the System, with all its rules and enforcers and limits and inevitable gravity.
And then, without any authorization whatsoever, we walk through the walls.
It's been a weird couple of weeks in American politics, with Republicans claiming that an insurance rule that applies to religiously affiliated institutions is tantamount to an assault on First Amendment religious liberty (despite a guiding Supreme Court precedent from 1993, written by Associate Justice Antonin Scalia, that argues otherwise), followed by a series of bizarre public statements by nominal GOP front-runner Rick Santorum. Clearly the conservative idea of a culture war has returned to contend for our national attention, and will likely continue to receive bandwidth so long as the economy keeps getting better.
Which brings us to another under-discussed issue in the 2012 Presidential election: U.S. Supreme Court appointments.
In 2012, the Roberts Court is made up of nine justices between the ages of 51 (liberal Elena Kagan) and 78 (liberal Ruth Bader Ginsburg). Four of the nine are appointees of Democratic presidents (Kagan and Sonia Sotomayor, Obama; Ginsburg and Stephen Breyer, Clinton) and five were appointed by Republicans (Chief Justice John Roberts and Samuel Alito, Bush; Clarence Thomas, George H.W. Bush; and Anthony Kennedy and Scalia, Reagan).
Despite the Roberts Court's reputation for partisan division, only about 20 percent of its decisions have come on 5-4 votes. But here's the thing about those votes: Its close decisions have tended to be about the kinds of political and philosophical issues that concern voters, rather than arcane rulings on the finer points of law. It's not that the court is sharply divided on everything -- it's just divided on the really important things. Like the Citizens United ruling, in which all five Republican-appointed justices voted together.
When it comes to those kinds of decisions, the Roberts Court looks depressingly predictable. In 14 of the last 16 5-4 votes, Democratic nominees (Kagan, Sotomayor, Ginsburg and Breyer) have been on one side, and four of the five Republican nominees (Roberts, Alito, Thomas and Scalia) have been on the other, leaving Reagan appointee Kennedy to cast the decisive vote. Not that Kennedy is an equal-opportunity swinger. He decides in favor of the Republican justices more than 60 percent of the time.
Which brings us to the question: How many appointments is the 2012 Presidential winner likely to receive, and more importantly, what effect are they likely to have on the court and the country?
Barring the usual boilerplate about health and personal conscience, it's safe to say that none of the Justices who will be younger than 70 in November 2016 are likely candidates to step down. Only one of the nine justices that the current justices replaced was younger than 70 (the enigmatic David Souter, who resigned in 2009 at age 69). The other predecessors of the current crop of justices were 75, 76, 79 (two of them), 80, 83, 85 and 90 when they either stepped down or died in office. That's an average retirement age of 79.
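The arithmetic behind that average is easy to check from the ages listed above (Souter at 69, plus the eight older departures):

```python
# Ages at which the current justices' predecessors left the bench,
# as listed in the paragraph above.
ages = [69, 75, 76, 79, 79, 80, 83, 85, 90]
average = sum(ages) / len(ages)
print(round(average, 1))  # prints 79.6
```

That works out to roughly 79 and a half, which is where the "average retirement age of 79" figure comes from.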
Of this group, only two-time cancer survivor Ginsburg should be considered likely to be replaced before 2017. Should President Obama win re-election and the right to nominate her replacement, her departure would have little effect on the court's composition -- though her replacement by a conservative justice would almost certainly produce an unbreakable five-member conservative bloc -- with Kennedy serving only as an unnecessary sixth vote.
However, if we take history as a guide, then there's probably at least a 50 percent chance that the Presidential winner in November will get a second appointment, probably from one of those other three seventy-something justices. And if that justice turns out to be either Scalia or Kennedy and that President is Obama, then we're looking at the first major ideological shift on the court since Clarence Thomas replaced Thurgood Marshall in 1991.
Weirdly enough, every justice on the Roberts Court except Thomas replaced an outgoing justice with whom he or she was generally aligned in ideological terms. Even Kennedy replaced a swing voter in Lewis Powell back in 1988.
So as you think ahead to November, remember that it's not just the number of appointments that matters, but how a candidate's appointments would affect the makeup of the court. Then consider these two scenarios:
If President Obama wins, the result will likely be a 5-4 liberal majority, or at worst, a continuation of the status quo.
If a Republican wins, the conservative bloc will grow from four votes and a conservative-leaning swing voter to an ideological 6-3 majority that could endure well into the 2030s.
Now imagine the caliber of nominee that a President Santorum might put forward.
See you in November.