
Let’s Put Our Heads Together

(On October 28, I presented the following talk at this year’s “Books in Browsers” conference, which was hosted by the Internet Archive and sponsored by O’Reilly Media, among others. A video of this talk is available on O’Reilly Media’s YouTube channel.)

Let’s put our heads together and start a new country up. Our father’s father’s father tried, erased the parts he didn’t like.

Some of you may recognize the opening lyrics of “Cuyahoga”, a song by R.E.M. that appeared on its 1986 album, “Lifes Rich Pageant”. Many of my blog posts are based in a lyric, and in this presentation, you’ll see a handful of musical references I found helpful in preparing for today’s talk.

A year ago, I stood here and claimed that we had entered an era of content abundance, one that is forcing publishers to confront weaknesses in how they create, manage and disseminate content. I cited four implications of abundance:

  • Our content must become open, accessible and interoperable;
  • We’ll need to focus more clearly on using context to promote discovery;
  • Trying to compete on the cost of content is a losing proposition. We need to develop opportunities that encourage broader use of our content; and
  • We distinguish ourselves when we can provide readers with tools that draw upon context to help them manage abundance.

Much of my thinking at that time centered on what publishers could do to succeed in a content-abundant universe. Since then, I’ve been kicking around what abundance means for our industry – not just publishers, but also authors, agents, distributors, wholesalers, retailers, libraries and others.

Increasingly, I’ve come to feel that we need to find a way to all hang together, or surely we will each hang separately. To accomplish that, we need four things:

  • We need a goal, a redefinition of what publishing is and why it matters. That is, we need to reposition publishing as the engine of the engagement economy;
  • We need rules, a set of principles that are based in fairness and recognize that we have to balance current requirements with some, perhaps many, future unknowns;
  • We need feedback, a shared way to model new approaches, test assumptions and make decisions based in fact; and
  • We need a hook, a reason to collaborate.

We’ll return to these ideas, which draw upon recent work by Jane McGonigal, in a few moments.

I’m hardly the first person to think or talk about the implications of content abundance. Michael Hart, the founder of Project Gutenberg who passed away in September, thought that portable petabyte storage capable of holding a billion ebooks would be readily accessible to a middle-class reader by 2021.

Technology advances bear him out. 2011 marks the 40th anniversary not just of Project Gutenberg but also of the first microprocessor, the Intel 4004. In the last four decades, the number of transistors we can squeeze onto a chip has grown from 2,300 to 3.1 billion, while clock speeds have increased 3,700-fold.

Much as abundance is the precursor to the development of context, capacity is the precursor to abundance. Moore’s law got us to where we are, and while growth in digital capacity may slow, it is not going to stop. This capacity is rewriting the rules of the publishing supply chain.

In an exchange that took place a few years ago, Hart predicted a reading-enabled future in which book prices plummet, literacy and education rates soar and old power structures crumble in the wake of scientific, industrial and humanitarian revolutions. That’s kind of cool if you’re part of the proletariat, but it might be a bit unnerving if you’re an oligarch (or aspiring to be one).

Now, some folks could rightly claim that a little revolution every now and then is a good thing, and I won’t argue with them. But I’ve been wondering if we might get a lot closer to the next Enlightenment without having to roll out a 21st-century guillotine. That is, I’ve been wondering if there is an opportunity in abundance.

I started out thinking that the answer might be elusive. Most of us would accept that the supply chain we’ve built to handle physical books is complicated. It’s constructed to promote efficiency and lower transaction costs as a share of total revenue. Hampered by the gravity of success, it isn’t built to adapt quickly or to promote investment in new markets.

Any supply chain is a system – a collection of processes, tools and participants. The extent to which a system can be described as “complicated” is a function of nodes and connections. The more participants and relationships you have, the more complicated the system. But even a complicated system is predictable: if you can identify and quantify the inputs, you can reliably forecast the results.

By way of comparison, a system is considered “complex” if you can identify and qualify inputs but the results are not necessarily predictable. Increasingly, the publishing value chain feels “complex”. We can no longer understand the whole system by simply looking at the sum of its parts.

Though we compete on context, metadata is largely assigned and managed by arm’s-length intermediaries. The current supply chain was not designed to provide publishers with an understanding of how, where, when and why consumers access and consume content.

It is this unaddressed complexity that has begun to erode supply-chain predictability. New technologies don’t just lower transaction costs; they eliminate some transactions entirely. Ultimately, eliminating transactions means eliminating one or more parts of the supply chain.

Managing complex ecosystems requires new approaches, ones that Martin Reeves and Mike Deimler, both of the Boston Consulting Group, described this way:

Increasingly, industry structure is better characterized as competing webs or ecosystems of codependent companies than as a handful of competitors producing similar goods and services and working on a stable, distant and transactional basis with their suppliers and customers. In such an environment, advantage will flow to those companies that can create effective strategies at the network or system level.

When I started working through these ideas, I wondered whether we had the tools in place to effectively negotiate our way to a new order. In a stable environment, most supply chain issues can be resolved as “two parties, one issue” negotiations. Think about discount rates, shipping terms, library lending, royalty rates and territorial rights.

Sometimes, these two-party, one-issue discussions play out in a series of distinct negotiations between supply-chain partners. While this makes individual negotiations more manageable, it also reduces the likelihood that options beneficial to anyone not at the table might be introduced.

Magazine publishers faced this situation early in 2009, when Anderson News and Source Interlink, two of the largest single-copy wholesalers, asked for new terms for handling newsstand copies. Publishers balked, Source Interlink backed down and Anderson News invoked its BATNA – best alternative to a negotiated agreement – and closed down its distribution business.

I feel for Anderson. Newsstand distribution is a tough business, and Anderson left it claiming it was losing money. For several reasons, magazine publishers act in counter-intuitive ways. They push too many copies into the supply chain, and the average magazine sells only a third of its draw. The cost of handling returns can be enormous.

But with Anderson’s overnight exit, magazine publishers lost 40% of their newsstand coverage, a situation that took months and millions of dollars in lost sales to sort out. We worry about the loss of retail space at Borders, but we haven’t reached the real cliff yet …

As we transition from print-only to a blended product and service supply chain, focusing on our immediate needs risks the loss of a significant supply chain cog – libraries, wholesalers and retailers included. In the current, complex system, we don’t fully understand the value added by each of these partners. Losing them can and does create a set of unintended consequences.

Abundance hasn’t quite gutted the old rules, but it has rendered them inadequate. As Peter Brantley pointed out last year, we’re trying to extend agreements made 40 or even 70 years ago. Yet we live in a time when new entrants, new content forms and new distribution options have created a maelstrom of variety.

Using serial negotiations between two parties makes it impossible to revamp our supply chains so we can respond to content abundance. What we need is a new approach: many parties, negotiating many issues simultaneously.

In the last 50 years or so, a lot of good work has been done to explore and refine the development of multiparty agreements. In relatively stable systems, well-specified procedures – auctions, sealed bids and limited markets – can be used to resolve disputes. But the success of “many parties, many issues” negotiations depends on access to data, transparency and an overarching sense of fairness.

One example, the so-called “law of the sea” negotiations dating back to the 1960s, sought to establish a system of payments for the right to mine extra-territorial sea beds. Early in the negotiations, participating nations agreed that the “common heritage of mankind … should not be prematurely exploited by those who happen to be ahead.” That’s a notion many publishers might find comforting.

As an international negotiation, “law of the sea” involved literally hundreds of participants and waded through dozens of significant issues: how mining might be financed, what royalty rates might be paid, what happens to royalty rates over time and how the proceeds, if any, might be allocated. The discussions needed to be “win-win”, exploiting differences among assumptions, tradeoffs, risk preferences and need for capital.

Ultimately, these complex negotiations benefited from the analytical underpinnings of a model developed entirely outside the negotiations themselves, in this case at MIT with support from the Department of Commerce. The model was introduced at a critical juncture and provided data that helped the negotiations focus on long term strategic and collective interests, not positions.

An interesting example, perhaps, but you may be thinking, “We lack a mechanism to make that happen.”

Or maybe we just haven’t made it yet.

It’s not that hard to leap from game theory to game design. Jane McGonigal’s recent book, Reality is Broken, examines opportunities to use the principles of game design to create a better and more immersive reality. I’d like to go back to those four things I said we needed to do to reinvent publishing:

First, we need a goal. “Survival” is really not adequate. It doesn’t motivate or sustain, and it presumes a zero-sum game (or worse). I’d like to put a not-so-radical idea on the table: abundance, digital formats, Amazon and Apple all challenge prevailing business models, but the super-threat is people not engaging in immersive reading and text-based study, the precursors to critical thinking.

We live and work in a world in which we have a narrow window to influence or convince people to do what we want them to do. We talk about the quality, value and importance of our work, and we view the act of publishing as validation. But the measure that matters starts with how what we do is received.

So I propose a far bigger, collective goal: Reposition publishing (which for me includes physical and digital forms of book, magazine and newspaper content) as the engine of the engagement economy. To make that happen, we need to increase the expectations we place on ourselves and on our readers, along the way “architecting the experience” of consumer interaction with our published works.

Second, we need rules. For a start, I like my four implications of content abundance. If we truly want to become the engine of the engagement economy, then being open, discoverable, agile and useful are defining characteristics.

But we also have to recognize that change will create winners and losers, even as we need continued support for the prevailing supply chain infrastructure. Figuring out how to migrate successfully will depend on inventing options that ultimately are based in agreements about what is fair. It will also require data that may challenge how we assign value to various roles in the new supply chain.

It’s also worth testing the extent to which various participants believe that growth in reading is possible. While I can position it as an imperative, the extent to which the industry grows or shrinks strongly influences the willingness of various participants to collaborate, combine or trade assets.

Authors, agents, publishers, wholesalers and retailers are all part of the same industry, but it’s not yet clear if our belief systems are compatible or conflicting. Does Andrew Wylie, the agent, really believe that digital royalties should be 50% and the cost of distribution should be zero? Do publishers really believe that today’s royalties need to remain the same in perpetuity, or can a different model be implemented? Are libraries a source of book sales or a net drain? Questions like these are the starting point.

Third, we need feedback. McGonigal notes that some games are designed to give you feedback first as a way to learn what to do and how to play. We need that now, just to manage a complex supply chain. The longer the lag between action and response, the more likely it is that we’ll inadvertently take actions that are not in our best interest.

This is where models can help. The breakthrough moments in the “law of the sea” negotiations started with engagement: participants could make their assumptions explicit, plug them into the model and see the impact on their return as well as the impact on everyone else. Iterate enough times and the win-win solutions, where they exist, can be identified.

To make this happen, the roles of industry associations and standards bodies will need to change substantially. At the least, now is the time to look at pooling funding and potentially establishing a meaningful, data-driven R&D effort. R&D isn’t just about technology: some basic research about how people find, assess and consume content might give publishers reasons to re-evaluate their arm’s-length engagement with libraries.

Finally, we need a hook. Participation is voluntary here. Companies, institutions and individuals decide whether or not to play. If we want to reposition publishing, we need to provide various forms of intrinsic motivation.

And we have to be willing to give lots of people a seat at the table. The fundamental structure of the existing supply chain is under attack. We need to figure out ways to reduce both transactions and transaction costs, or retail entities will continue to do it on their own.

The role of supply-chain intermediaries will need to evolve. For a period of time, possibly a long time, we’ll need support for the physical distribution of content. We may need fewer companies providing that support, and it is likely that we’ll have to adjust terms to better reflect the cost of doing business on a smaller scale. If we treat these negotiations as we have in the past, however, the likelihood of an Anderson News moment only grows over time.

That’s why the time is now to figure out how we get “Publishing, The Game” started.

McGonigal talks about building “superstructures” – highly collaborative networks built on top of existing groups and organizations. A superstructure brings together two or more communities that don’t already work together to help solve a big, complex problem – what she labels a “super-threat” – that no single existing organization can address on its own. The new entity harnesses the unique resources, skills and activities of its subgroups, but it is fundamentally new – an idea not tried before, an untested combination of people, skills and scales of work.

I called the prospect of people not engaging with our content the publishing manifestation of a super-threat. I’d argue (pretty strongly) that it represents a super-threat not just to publishing, but to the way we function as a country, an economy and as a part of a world order.

We have a responsibility to address this threat, not just so that we can make money, but because we’re the ones with the ability to solve it.

Other industries facing an uncertain future have banded together to form and fund superstructures. The Gas Research Institute, for example, was authorized in 1976, at a time when the natural gas industry was highly fragmented among producers, wholesalers and distributors. The latter often held a local monopoly.

By 1981, GRI was spending $68.5 million on research and a total of $80.5 million on oversight and R&D. That represented about 0.2% of the wellhead value of the gas sold that year, at the time a bit more than $38 billion.

GRI undertook research and development in four areas: supply options (near term, mid-term and long-term); efficient utilization; enhanced service; and basic research. Funding, drawn from a surcharge on sales as well as some government grants, accelerated to something north of $100 million in the mid-1980s.

If you look across all of publishing in the United States, it’s about a $40 billion business. Imagine what we could do if we could create and sustain an organization with $80 million a year in funding. It’s also likely that an industry-wide commitment to addressing engagement would garner the external funding that most parties have been understandably reluctant to spend on narrower causes.
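To make the parallel concrete, here is a rough back-of-the-envelope version of that math, using only the figures cited above; applying GRI’s roughly 0.2% ratio to publishing is illustrative, not a proposal for a specific surcharge:

\[
\frac{\$80.5\ \text{million (GRI oversight and R\&D, 1981)}}{\$38\ \text{billion (wellhead value of gas)}} \approx 0.2\%,
\qquad
0.2\% \times \$40\ \text{billion (U.S. publishing)} \approx \$80\ \text{million per year}
\]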

This is the opportunity in abundance: a fighting chance to remake our industry and ourselves in a way that reflects, to borrow the phrase, our better angels. To get there, we’ll need to cross some relatively uncharted waters.

McGonigal identifies a host of “fixes” for the reality that is broken. The publishing industry is already testing some of these solutions, albeit not in a structured way. I’d like to focus on three areas where I think we have started to re-frame our thinking.

First, more satisfying work. Earlier this month, at O’Reilly Media’s TOC conference in Frankfurt, Jason Epstein positioned on-demand technology (including the Espresso) as a vehicle that would help publishing foster many smaller imprints. Although it still bets big, Hachette’s Twelve limits its lists to allow adequate time to focus on the books it signs. These ideas potentially restore human scale to publishing.

In a less well-known way, think of Benetech, which works with publishers to make content accessible to the blind. As Kassia Krozser points out, Benetech makes content accessible in ways that benefit readers beyond their target audience.

In an e-mail exchange, Kassia invoked curb cuts, which were conceived to help people in wheelchairs but benefit anyone who pushes a baby stroller or a cart full of groceries. What is good for one segment can also be good for the greater population.

Second, a better hope of success. Reading is hard work. Innovations like Wordnik, Flipboard, Small Demons and Safari Online make it easier to find what you want, when and where you want it. They help you filter and decode a vast array of content on terms that make sense for you.

In a similar fashion, we’ve entered an era of “rights everywhere” – not as a barrier, but an opportunity. Solutions like those embodied in Valobox – in effect, progressive reading as a transaction – will become commonplace. It’s a different approach from the arm’s-length way we manage rights today, and it will require a rethinking of upstream (author and agent) relationships.

Complex systems demand a diversity of thought – variety, in effect, is the only way to manage a complex system. Considering a wide range of perspectives helps reduce the risk of failure. A game-like reinvention of publishing gives us a chance to crowdsource experiments.

A third area in which we’ve begun to think differently is stronger social connectivity. The experiments abound. Look around this room. You’ll see and have heard from representatives from ReadSocialAPI, Readmill, SocialBook, Goodreads, Readability, BookRiff, Cursor, and more. Bob Stein has been working for 20 years to push the boundaries of social connectivity in reading.

We all know the common question: who wants (or needs) another platform? I’d like to reposition all of these plays with a different question: How do we make sure our platforms are at the center of public and private conversations? How do we make sure that our content is visible, available and relevant?

I started my remarks talking about the way that music seeps into my thinking about words. In Annie Leibovitz’s photo essay, American Music, musician, poet and author Patti Smith captured it this way:

Our music grants us a coat of invulnerability, a spring in which we bathe with abandon, methods of response, moments of respite, and a riot of self-expression. It is the porch song. Plunging youth. It is thick-veined hands squeezing clusters of notes from an equally thick neck. It is the Les Paul. The tenor sax. It is a platter spinning in space, etched with the words “Tutti Frutti”.

Patti Smith writes beautifully about the power of music, evoking images that may make you wonder how we will ever compete against a tidal wave of creation and consumption and media alternatives. It’s worth noting, though, that she chose to do so … in a book.

Words inspire, motivate and change. They can help us shape a new reality, but first we have to see abundance as the opportunity to reshape our business.

Let’s put our heads together. Let’s start a new industry up.

(With thanks to Don Linn, Laura Dawson, Kirk Biglione, Sheila Bounford, Kassia Krozser, Anna von Veh, Ashley Gordon, Kat Meyer and Peter Brantley, all of whom helped in the development of this talk.)

About Brian O'Leary

Founder and principal of Magellan Media Consulting, Brian O’Leary helps enterprises with media and publishing components capitalize on the power of content. A veteran of more than 30 years in the publishing industry and a prolific content producer himself, Brian leverages the breadth and depth of his experience to deliver innovative content solutions.
