You, Over There In The Choir! We’re Preaching

May 13th, 2007 · 8 Comments
by Kassia Krozser

From what we can tell, attending the Book Industry Study Group’s recent “Making Information Pay 2007” conference was like reading BS every day. Except there was probably less snark. And less wine. Reading the wrap-ups is like reading our posts. Is that going to stop us from highlighting a few key points? Of course not. Repetition is good for the soul.

This is not a case of if you build it, they will come.

As you will recall from previous conversations, while the publishing industry was snoozing, a large portion of the world moved online. Not the entire world, but enough of the world that it would make sense for the publishing industry to follow suit. This has not been the easy, comfortable transition you would expect. In order to achieve the goal, industry professionals are being forced to abandon the “that’s how we’ve always done it” mentality.

Mike Shatzkin of the Idea Logical Co. said, “The day will come when you should have done it last week.” Hopefully, the crowd shouted back, “Yeah!”

We’ve been reading various and sundry accounts of the presentations that generated the most brain waves, and it seems the O’Reilly ideas were the hot ticket (have we mentioned the upcoming Tools of Change conference? You, all of you, really need to be there. Take our word for it.). For years now, really, years, we’ve been all excited about O’Reilly’s Safari program. This is clearly the wave of the future for a specific type of non-fiction work; the traditional publishing cycle is simply too slow and expensive for time-sensitive work. The trick is to make the information available in multiple formats: traditional book, electronic, and bite-sized. Sometimes all you really need is a chapter.

And, of course, when you think beyond the book, you can also think beyond the one-way communication. You can have images, audio, interactivity, 3-D, virtual reality, whatever it takes to augment the message. We have noted many times in the past that the readers are online and clamoring for content. The only thing keeping them from embracing online publishing endeavors is the fact that online publishing endeavors aren’t doing a thing for readers. Stop complaining about declining sales and start engaging the people who want your product.

Publishing representatives such as Chris Hart of Random House took the idea of Digital Asset Distributors (DADs…how, uh, old school patriarchal, no?) introduced by Mike Shatzkin (quoted above) and explained that his company is moving forward, starting to get the concept of community and content. As Publishers Lunch said, “Hart underscored the company’s broad efforts are ‘not a move against Google and Amazon’ and it’s ‘not about DRM, and not about e-books.’ It’s about ‘having books in the internet conversation and it’s about books versus every other media possible.’”

But in so many ways it is a move against the Internet giants. As the music industry is still painfully learning (it hurts to watch, it really does), if consumers don’t get the content in the way they want, they will work around the restrictions. iTunes has effectively proven that people will pay if access is easy and the price is reasonable. It amuses us that publishing professionals lament that “e-books are nowhere near achieving their potential” without noting that publishers have been deliberately obtuse about the idea of ebooks. It’s not that the demand doesn’t exist, it’s that the demand doesn’t exist in the way publishers decided it should.

Price. Ease-of-use. The right format for the right information. Publishers are just now coming around to this worldview. Now if only they could get on board with what matters to consumers: finding what they want. Random House’s “search inside” feature is great if you know what you’re searching for inside a specific book. Typing various search phrases into the little search box at the top of the page returns what can only be termed useless results. It is unclear how the search results relate to the search term.

Searching inside a book works only when the customer already knows what information is inside the book. Searching for a book requires so much more. Let us give a short example. We typed “to be or not to be” in the search box. The results were, well, confusing. After a little effort (why would RH’s author page default to “events” as the first search item on the page rather than just authors?), we determined that William Shakespeare is published by the house. We also, through careful research, discovered that Hamlet is a RH title.

This is why effective, well-considered, well-built search matters. It matters a lot. This is an obvious example with obvious results. The fact that the obvious didn’t happen shows that publishers have a long way to go when it comes to fulfilling their vision. It also tells us that dissing the search giants is not in the publishing industry’s best interest.

We do not tout Google because of our undying love for the brand — though it is currently a personal favorite (how can you not love Gmail?) — but because it is the best at what it does: information aggregation in a meaningful way. Let’s just say that the first Google results for “to be or not to be” point to Shakespeare. When we are searching for something, we want relevant results, simple as that. Most businesses are very, very bad at implementing search. Though the search engine is a relatively new invention, it is a fairly well-defined concept. Yet it is so often done poorly.

It is not wholly the fault of the technology. Even the best search engine can only work with what it is given. Bad content creation leads to bad search. This, more than anything, is why we have decided skepticism when it comes to announcements such as the one where “…the AAP and the BISG will soon work together to develop a common standard for searching for online book content that is not entirely dependent on using Google.”

First off, what does this mean? Are we to believe that the publishing industry has somehow managed to do something Microsoft, Yahoo!, and a host of other search professionals have failed at? Are we supposed to believe that the publishing industry will create better search technology? Seriously. Is there swamp land involved with this deal?

Second — and we know, we know, you’ve heard this before — where are the searchers? This is not a rhetorical question. When customers (again, as always, the ones who buy stuff) are looking for information online, where do they start? It’s not the Random House website. In fact, in our long years of using the Internet for fun and business, including observing average citizens going about their online business in an average way, we have yet to see someone turn to the publishing industry as the first line of research. Watch real people search for real stuff online. Not canned, test searches designed by someone who lives in an IT bubble. Real people. Your mother.

This is not a case of if you build it, they will come. If you’re not indexable by major search engines, you won’t be found. Simple as that.

Google is not the problem; the failure to work with Google, Microsoft, and all the other search companies is. The real issue is what publishing professionals like to call “control over content”. This is generally a corporate way of saying that the publishers will decide when and how to release information. It is also a way to say that publishers will decide how to withhold information. Maybe it’s because we’re feeling cynical, but are publishing houses the best choice to decide what level of information will be appropriate for consumers to seek and find the books (in whatever format they come) they want?

Digitize, monetize, and sell lots of books. But stop thinking that it’s 1996 and you have time to figure it all out. Terence McKenna once noted “…that time is speeding up, that history is moving faster and faster.” Like time and history, the publishing industry’s customers are moving faster and faster, too.

File Under: The Future of Publishing

8 responses so far

  • Don Linn // May 14, 2007 at 2:27 pm

    While BISG was doing its thing, there was a similar meeting held by The International Digital Publishing Forum (IDPF) at the McGraw Hill building that was preaching a similar message.

    Three big takeaways:
    1) Standards for digital content files are very near completion, meaning that many different kinds of input files can be easily converted into one standard type of digital package, which can then be output in as many different formats as there are devices. This will make the costs of conversion decrease dramatically, which should increase the amount of digital product available, which should, in turn, make it more attractive to consumers. You don’t have to wait to figure out which device is going to prevail; just digitize your content now.
    2) Don’t assume that a satisfying ‘book experience’ necessarily means ‘like a book’.
    3) The rate of change is accelerating and is about to accelerate further.

  • Kassia Krozser // May 14, 2007 at 9:18 pm

    I think I was in a hotel room that overlooked McGraw Hill once…

    Oh right. Topic at hand. I’m glad you mentioned the IDPF (say that three times fast). I don’t suppose that the concept of “text” figured into the standards? Still the most universal format around. Remember the olden days when XML was going to standardize data formatting and/or retrieval/republishing? I imagine this is still a viable concept but one discussed among geeks more than conference attendees. Standardized mark-up was going to lead the way.

    I am, as you can well imagine, leery of proprietary “standards”. That is to say, I am leery of standards that are unique to an industry and not necessarily adopted across technologies. I do not know if that is the case here, but as a veteran of format wars, I do, naturally, recall the multiple attempts by the music industry to impose standards on the general public.

    Unfortunately, said public had already adopted its choice and the music industry realized that it was going to be harder than expected to impose its will on the people. Darn democracies!

    So, yeah, I am completely in favor of a standard that works with the widest possible range of devices. From laptops to future inventions.

    Number two. Wow. Yes. Totally. Book is still relatively new to our species. Story? That’s the ticket.

    3? See Terence McKenna quote. Blew me away in 1992. Still makes sense now. Moreso than ever.

  • Kirk Biglione // May 14, 2007 at 11:41 pm

    Terence McKenna also said “Business as usual is no longer on the menu”.

    Unfortunately it looks like all too many in the industry are still ordering from old take-out menus.

  • Claire Evans // May 15, 2007 at 8:28 am

    Excuse me, but I am trying to play catch up here. Are you saying that XML is so 2002? If so, what’s 2007 (not to mention 2012), and, generally speaking, how would you digitize in a way that transcends format, medium, device?

  • Kassia Krozser // May 15, 2007 at 9:27 am

    Not saying that XML is 2002 (though, in a way, it is!), but that it represents a standard for marking up content for reuse across multiple platforms. It is a great concept. It’s also supposed to lead to universal content sharing (which Don mentioned in his comment). I grow very uncomfortable when industries try to create their own sets of “standards” without buy-in from the community at large.

    I used the example of the music industry. They spent years and millions of dollars trying to develop a “standard” for delivering music to the masses. What prevailed — because it worked universally — is the MP3 format. Proprietary leads to closed systems. You must have the right software, platform, version, device, etc. I have yet to hear the publishing industry say that they’re basing their own standards on open standards (like XML).

    Personally, the only format I trust for longevity is text, but even I see that plain text is impractical for deploying content across formats. You really do need more in order to fulfill the mission…
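    For what it’s worth, the single-source idea we keep circling can be sketched in a few lines. Everything below is invented purely for illustration (the `<chapter>`/`<para>` markup and the rendering functions are not any industry standard), but it shows the core move: mark the content up once, then write one small rendering function per output format.

```python
# Toy illustration of "mark it up once, output it many ways".
# The <chapter>/<para> schema here is invented for this example.
import xml.etree.ElementTree as ET

SOURCE = """
<chapter title="Hamlet, Act III">
  <para>To be, or not to be: that is the question.</para>
</chapter>
"""

def to_plain_text(xml_string: str) -> str:
    """Render the marked-up chapter as plain text."""
    root = ET.fromstring(xml_string)
    lines = [root.get("title"), ""]
    lines += [p.text for p in root.iter("para")]
    return "\n".join(lines)

def to_html(xml_string: str) -> str:
    """Render the same source as minimal HTML."""
    root = ET.fromstring(xml_string)
    body = "".join(f"<p>{p.text}</p>" for p in root.iter("para"))
    return f"<h1>{root.get('title')}</h1>{body}"

print(to_plain_text(SOURCE))
print(to_html(SOURCE))
```

    Nothing in the source presumes a device; each new format is just another small rendering function, which is exactly why the source markup, not the output, is where a standard matters.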

  • Allen Noren // May 15, 2007 at 11:20 am

    For those of you who are interested in attending the Tools of Change Conference (TOC), you can use the following 25% discount code that was extended to BISG attendees:

    25% discount code: toc07bsg

  • KR Blog » Blog Archive » Take Two // May 16, 2007 at 5:31 am

    […] you’ll find a bit of chortling, some snark, and a good amount of thinking beyond the book. Kassia Krozser reports on the Book Industry Study Group’s “Making Information Pay 2007” […]

  • Bob Martinengo // May 21, 2007 at 7:40 am

    A little history here. Markup languages go back a long way (the 60s!) and have more to do with information management than writing for readers. XML will help you find what you’re looking for but won’t make it a better read.

    Here is an early (1995) attempt to convince writers and editors that ‘structure’ was more than good plotting (note the cute title):

    README.1ST: SGML for Writers and Editors, ISBN 0134327179

    “With this book, writers and editors can learn what they need to know to prepare and structure documents using the Standard Generalized Markup Language (SGML), the new standard for electronic and database publishing. This book presents a non-technical overview of SGML in language that writers and editors can understand. It explains why SGML focuses on structure and essentially disregards formatting issues and shows how to define a document’s structure, writing with SGML structure in mind.”