Thursday, December 31, 2009

WiFi Plus Mobile = A Whole New Ballgame for Content and Communications

What can happen when you combine wireless broadband Web access with local wifi hotspots? A lot, when you come to think of it. Gizmos such as Novatel's "MiFi" unit, which lets someone with mobile broadband access set up their own local wireless "hot spot," have been out for several months, but their importance is growing as they connect to higher-powered broadband networks and add features such as onboard data storage on SD cards. More throughput and more storage mean more ability to use these units to coordinate the bushel of devices that technophiles now travel with, helping them to synchronize communications with the world and with one another through one convenient hub. But it also presages a major shift in how homes, businesses and the world connect with one another for content.

Today most people have their mobile connectivity running in parallel with their home or office connectivity, including parallel networks for voice, video and data that cost a handsome sum for most people using them. Yet with one of these mobile network hub devices, it's easy to see how all but the most demanding uses for voice, data and video can funnel through a mobile broadband connection that can stay on our desktop or follow us on the go. Our smart phones, our eBook readers, our netbooks, our desktops, our in-home phones and our home entertainment devices can all be brought together on one seamless wifi-based communications medium.

This is likely to accelerate the move in voice communications away from traditional point-to-point circuit networks and towards an era in which voice communications are a feature of integrated voice, data and video services. It also means that we're more likely to overcome some of the global connectivity issues that exist for mobile devices: regardless of whether CDMA or GSM networks underlie the mobile broadband connection, if you're near a hotspot of some origin, you should be able to get voice and data communications. Services such as Skype will certainly prosper in the process, but other services such as Google Voice, which help voice communications get routed to any number of devices, are also likely to prosper as voice communications become more identity-centered than phone number-centered.

The bigger picture, though, is of a world in which inexpensive broadband hub devices can be placed easily in small communities and used to power local communications with both the outside world and with people within the community. Today we're seeing these devices powering personal communications, but I think that the larger potential is for devices that can connect communities with one another first and foremost with a minimum of technology. If you are living in a community in which each person cannot afford a mobile phone, that community may be able to collectively afford one connection to the outside world, shared via a MiFi-like device that makes its connection available within a reasonable scope, say a kilometer or so. People in that community could then use their mobile devices to communicate with one another and with the world, with very little ongoing cost to any one person beyond the initial cost of their own device. Most importantly, you could set up these local communications networks with or without direct connectivity to the outside world. You could have your own local Web of sorts, perhaps even with services such as Google Wave being used on a federated basis to facilitate content collection, communications and collaboration.

In turn, these individual communities could cooperate with other local communities to build "bottom-up" communications networks, developing regional communications systems that may be centered around local languages and dialects, connecting to more commonly used languages found in the "outside world" through a handful of communications access points. Every kilometer or so you could poke a solar-powered hub device into a convenient spot to keep the influence of a particular network growing. All of this would be developed on global communications standards, of course, enabling new ways to connect to the world over time, but regional communications would thrive, with or without help from the "outside world."

While the more than 1.4 billion people already using the Web are certainly a significant marketplace, I do believe that much of the future power of Web-based communications will be found in the expansion of more "bottom-up" networks amongst the five-plus billion other people in communities that find themselves on a different economic and cultural playing field than the rest of the world. We talk sometimes about the "dark Web" of content unavailable to search engines on the Internet, but there's a far greater "dark Web" of knowledge and culture that's beyond the Web altogether. The "top-down" Web will penetrate this arena only so far, as it tends to be in the hands of people who have, in their own way, a great deal of autonomy, in part because of their economic isolation. But as the "bottom-up" Webs begin to meet the world of the Web as a whole, it will be exciting to see how both economic and cultural opportunities for people on both sides of this divide develop.

Fortunately there are devices coming along that should help to accelerate this convergence. The One Laptop Per Child organization is targeting the release of a $75 device called XO-3 that is a bone-simple tablet equipped with wireless communications. As technology tends to push towards such visionary price points sometimes more rapidly than the pioneers themselves expect, I think that it's safe to say that within a few years the convergence of such devices with localized broadband networking will enable communities around the world to join the Web age in ways that may surprise the rest of the world. So if you're looking for great new opportunities in content markets, I think that "going local" may take on a whole new range of meaning shortly. We'll keep you posted on these trends throughout 2010. Have a happy new year celebration and best wishes for a prosperous 2010!

Wednesday, December 23, 2009

Shore Communications Inc. Waves Goodbye to 2009

What a year it's been.
  • iPhones rocked, Google shocked and social media was no longer mocked as publishers and technology companies flocked to online content business models;
  • Bing had a fling and even Windows 7 would sing as Kindle took wing, but proprietary platforms are no longer king;
  • Those in the cloud were quite proud of profits that wowed enterprise and media markets and vowed that all content would thrive in its shroud;
  • Enterprise vendors clung to tight margins and hung on to hopes of new profits among rescaled businesses flung across a changing world;
  • Twitter got the Web a-flitter about real-time chitter-chat, making news publishers bitter about the new heavy hitter;
  • Murdoch howled about profits fouled by search engines that prowled for news, while AP scowled at content reuses that tempted its members to throw in the towel;
  • Smart phones got fast and netbooks now cast a shadow over the last bits of old-school computing;
  • Save the best for last! It's Wave, the rave of brave trend-setters, promising an enclave that will repave the road to the Web's future;
  • Feel like you need a suture or two? Don't worry. The couture of content will change soon enough. The future is bright - for those who are tough.
Everyone at Shore Communications wishes you a great holiday season and a fantastic 2010. Enjoy what is important, and let's build the future of content together next year! I hope that you enjoy the following year-end video.

Friday, December 11, 2009

Farewell, Editor & Publisher: How Many Canaries Can You Ignore?

In the process of selling off several of its core B2B entertainment industry titles, Nielsen Business Media also announced the imminent closing of Editor & Publisher, the century-plus-old trade publication that had chronicled the ins and outs of the news industry. At a time when magazine closings seem to be about as regular as train stops on a commuter line, E&P's demise is not exceptional in many ways. Any number of trade publications are struggling to survive in an era in which online media enables unlimited competition for the attention of their readership and for their advertisers' and subscribers' cash. But there is something particularly poignant about E&P's shuttering. After all, if an industry which insists that the quality of its content will be its distinguishing factor cannot support the high-quality journalism covering itself, then how can it expect others to do likewise for their own interests?

There are few people who can scream about canaries in coal mines and get away with it for long, and I am no exception to that rule. If you haven't figured out that most publishers are caught between highly skilled staffs oriented towards traditional publishing platforms and new platforms that can't deliver them decent salaries with room for both management's profits and platform reinvestment, then you must have been clipping your bond coupons on a tropical island. But that doesn't mean that publications like Editor & Publisher have to die. What it does mean, though, is that in some ways the publishing industry is returning to its roots of scrappy, independent publishing that may do better without the overhead of large, corporate parents.

This doesn't mean that news publications will always do best as independent outlets, but it does mean that publishers that are mean, lean and more focused on their markets than on hitting the train back to comfortable suburban homes are going to do just fine. The good news is that Web infrastructure is perfectly suited to such operations, most especially when publishers listen to their audiences and engage them effectively. An interesting and ironic example of this positioning is the recent rebirth of Conde Nast's former Web site by American City Business Journals as a portal oriented towards the owners of small and medium businesses. With a platform that is well designed to slice and dice content and functionality for any number of focused local and topic-oriented markets, ACBJ's no-nonsense approach to publishing is far more emblematic of what will succeed moving forward in profitable B2B and consumer media than the high-gloss world of major media companies.

The caveat to this approach, though, is that the scrappy publishers must push themselves to the extreme to take advantage of highly affordable publishing technologies to outpace major media companies in having audiences adopt their brands on the platforms that they prefer. This is to some degree why blog-oriented publishers such as TechCrunch and The Huffington Post have survived and thrived in online media. Having been handed the equivalent of a guerrilla fighter's AK-47 automatic rifle in today's affordable social media publishing technologies and deploying the tactics and strategies that they enable, lean and agile online-first publications and their technology partners have carved away a good portion of the meat of publishing's profits.

It's not as if the major media companies can out-tech these smaller rivals easily, either. The expense and useful life of proprietary content technology development is rarely beneficial to a publisher today. There are some exceptions to this rule on the very high end of content markets such as in financial securities trading and other specialized professional functions, but in general it's source-agnostic content technologies that have defined today's most successful publishing platforms. For general media markets, publishers have tried again and again to gain the upper hand through sponsoring source-specific content technologies that simply don't deliver all of the information and experiences that people expect now through source-agnostic technologies.

It's what you might call a prolonged mourning for the mass-production printing press era, the ability to define a marketplace through a technology that only traditional publishers could afford and master easily. Sorry, that train left the station a long time ago. By ceding their technological superiority to others, publishers sealed their fate years ago. If CompuServe had knocked the socks off of the Web in its ability to amaze and delight content audiences, it would still be around today. Consortium services like Hulu are trying to regain some of that high ground of technology, but as long as they fail to leverage all of the content that people find to be valuable based on the artificial divide of "it isn't real content," they will always fall short of audiences who know "real" when they see it.

In short, I do think that the closing of Editor & Publisher is a small but significant landmark in the history of publishing. It marks the point in the publishing industry's history when it admitted that it no longer really cared about its traditional strengths. Print publishing and the editorial disciplines that drove it are now officially legacies that will inform the future, but no longer define it. There will continue to be print products indefinitely, and highly customized print products are likely to be a growing marketplace for some time. But when an industry will no longer buy coverage of its own traditional operations, then it's time to admit that a chapter in that industry's history has been finished. I wish the very best of luck to the staff of Editor & Publisher; they have put out quality journalism in the face of enormous industry change. I hope that we will see E&P resurface in the near future as a web-first publication, perhaps with a focus on the future rather than on the past.

Thursday, December 10, 2009

What's the New Normal? Mark Logic Digital Publishing Summit Examines Cross-Platform Publishing Opportunities

While Mark Logic is far from the only game in town for cross-platform publishing technologies, its recent Digital Publishing Summit at the Plaza Hotel in New York City was a huge down payment on establishing itself as a thought leader that could merge the best of East Coast and West Coast thinking in enterprise and media content markets. As one would expect with a vendor-sponsored conference, the day was filled with "friendlies" who use and support Mark Logic and its XML-based databases, APIs and content delivery services. But if you had to pick friends, CEO Dave Kellogg and staff picked some friends who had excellent examples of how cross-platform and cross-source publishing is "the new normal" that is helping to drive value in the publishing industry. The trick, though, is that this new normal is filled with some ironies that the content industry is still struggling to absorb.

With a packed ballroom listening in (nothing like "free" as the price of admission for networking in this economy), Dave Kellogg opened with a lively video, followed by Outsell's David Worlock pointing out that user-oriented networked services, not pre-conceived publications, are the key to this "revolution" in publishing services. Yet at the same time his slides showed a pyramid of value-add content services, from simple published documents to "workbenches," that seemed quite standard in its pre-conceived product flow. Databases are indeed key components in today's publishing environment, but as exemplified by Mark Logic's technologies, the database is now whatever a user needs it to be in the moment. Both enterprise and media oriented publishers are discovering that publishing cultures centered around traditional databases, be they for traditional editorial content, business data or multimedia, are not agile enough to respond to the demands of their markets.

Richard Maggiotto, Founder, President & CEO of Zinio, highlighted similar ironies that print publishers face in confronting mobile markets. Zinio is moving beyond simple "page-flipping" technology for magazines on PCs and mobile devices to enable video-like animations of content, including ads, to draw magazine publishers into more appealing online presentations in their software. One demo that Richard flashed on the screen was for a $30,000 watch, paid for by a manufacturer that refused to produce Web ads. A beautiful ad, but the question becomes: how can you build a market based on a tiny sliver of people who are using iPhones but prefer magazine-like layouts of content? Building beautiful and engaging content is a plus for any audience, but no arbitrary container in today's online world is going to fence an audience in to your message for very long.

I had to take a phone call at this point, so I missed a good portion of a presentation by Chris Tse, Director of Information at BusinessWeek, who focused on their "BX" social media initiatives. Ironically, when I came back, Tse was explaining how social media content was harder to monetize than traditional editorial content, although he acknowledged that it would probably grow in its revenue impact over time. So even when you have good design, interactivity, repurposed content and social interaction, there's no guarantee that you'll have the systems in place to match revenue opportunities to your content - or have a sales force that knows how to sell it.

Kent Anderson, Executive Director for Product Development at The New England Journal of Medicine, a leading Sci-Tech journals publisher, showed off a popular "diagnose the disease" quiz that they had ported over from their Web site to the iPhone, and, through Mark Logic's infrastructure, easily retooled for Google's Android and other mobile platforms. The growth of the app's use on iPhone was quite extraordinary, paralleling the growth of overall iPhone use. But when Kent was quizzed about the impact on overall subscription revenues in the Q&A, he expressed some optimism for future, non-free applications in mobile markets but didn't offer any indication of how the app helps to boost core journal subscription revenues. Certainly highly functional mobile apps can help to build a publisher's brand value through higher engagement, but there needs to be a clear conversion strategy devised to ensure that the engagement actually converts that brand value into revenues efficiently. Repurposing content in and of itself doesn't ensure those conversions, though it can help to define a much larger addressable marketplace.

Shannon Holman, Director of Content Management for McGraw-Hill Higher Education and Lee Fife, VP of Publishing Solutions for Flatirons Solutions, put on an excellent demo of McGraw-Hill's Create online custom textbook creation application. Their development of Create was based on the assumption that they needed to empower their customers to design and customize their custom textbooks online, instead of relying on institutional sales forces. The Create application does an excellent job of fulfilling this mission, enabling its users to choose specific sections of books, insert personal course materials and papers and produce both PDFs and bound, custom-printed textbooks on demand with remarkable ease. This interactivity that allows clients to package content the way that they really need it packaged was probably the closest example of "the new normal" during the day's presentations. But even here, the very success of the Create application leaves McGraw-Hill's institutional salespeople scratching their heads somewhat. Better that in the long run, though, than becoming a captive of sales methods that may be out of date.

The final featured speaker of the day was Gordon Crovitz, former Publisher of The Wall Street Journal and a founder of Journalism Online, which is preparing to launch in 2010 an online content ecommerce service that will enable people to have one single sign-on for accessing premium content sources across the Web and mobile platforms. Crovitz outlined at a high level the range of use and pricing models that the Journalism Online platform will support, such as single-article micropayments, multi-article/time-based payments, bulk multi-publication subscriptions and print/online bundled subscriptions.
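As a toy illustration only (Journalism Online had not yet launched, so every name and rule below is my own hypothetical sketch, not its actual design), the four access models Crovitz listed could be represented as entitlements that a single sign-on ecommerce platform checks at read time:

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum, auto
from typing import Optional

class AccessModel(Enum):
    MICROPAYMENT = auto()          # single-article purchase
    TIME_PASS = auto()             # multi-article access for a fixed period
    BUNDLE_SUBSCRIPTION = auto()   # bulk multi-publication subscription
    PRINT_ONLINE_BUNDLE = auto()   # print subscription with online access included

@dataclass
class Entitlement:
    model: AccessModel
    expires: Optional[datetime] = None     # used by time-based passes
    publications: frozenset = frozenset()  # used by the bundle models

def can_read(ent: Entitlement, publication: str, now: datetime) -> bool:
    """Decide whether an entitlement grants access to an article
    in `publication` at time `now`."""
    if ent.model is AccessModel.MICROPAYMENT:
        # A micropayment covers exactly the article that was bought.
        return True
    if ent.model is AccessModel.TIME_PASS:
        return ent.expires is not None and now < ent.expires
    # Both bundle models: access depends on which publications are bundled.
    return publication in ent.publications
```

The appeal of the single sign-on approach is visible even in this sketch: one account can carry several such entitlements across many publishers, with the platform making the per-article access decision rather than each publisher building its own paywall.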

Interestingly, both the questions that came up from the audience afterwards and some discussion in the panel that followed Crovitz's talk indicated that there was still a fair amount of resistance from some people in publishing to this concept - and not necessarily for the reasons that you might think. Some people were concerned about Journalism Online being a publisher-centric model, solving publishers' own particular pricing problems but not necessarily solving problems for audiences. This is a reasonable point, one that highlights how publishers are to some degree still on a fishing expedition for successful online revenue models for premium online content that no technology alone can answer. Yet Crovitz emphasizes that premium's opportunities lie where people already believe in your content brand. In other words, premium plays well when you have a relationship with an audience that's already valued above the norm. You may, as Crovitz suggests, convert only a fraction of them, but if the relationship will support it, then charge for your content where the value suggests that it's worth it to them.

So what is "the new normal" in the era of repurposable content? To put it succinctly, it's having content that's always ready to attain its highest value in audience-defined moments. Be it through search engines, self-published and self-packaged content, real-time collaboration or easily repurposed and relicensed data and editorial content, the companies that can chase those moments most effectively win. Sometimes this means being able to aggregate content from any number of sources more rapidly and effectively than anyone else, based on your insights into audience demands. But often it means letting your content flow to where your audiences want to consume it and being ready to make money with it once it gets there. A multi-platform strategy for repurposed content is not simply slamming the same product into different packages.

Multi-platform publishing also requires the recognition that it's not about platforms at all - it's recognizing that your audience has to be the center of your publishing at all times - and that each platform and application may draw out a different audience persona from the same person. It's not enough to ask "What does your customer do ten minutes before and after they use your content?" It's also necessary to ask your audiences, "Who are you?" in each platform environment. Your hardcore diagnostician may be all business on a PC, but be out for kicks or socialization on their iPhone - or vice versa. These types of variations only enhance the need for good content multipurposing infrastructure, even though that infrastructure will not guarantee that you'll be offering the content that they want most.

Mark Logic's Digital Publishing Summit probably raised more questions for publishers than it answered, but that's probably not a bad thing in a market in which publishers have very few clear-cut options for succeeding in content markets. It also left outside the doors of the ballroom the uncomfortable fact that many platforms are in use today that enable people to aggregate content on their own with minimal assistance from traditional publishers. You can have the best aggregation and monetization strategy in the world, but if your audiences are creating and aggregating more content than you can, then it's going to be an uphill battle for most any publisher. But within those constraints, Mark Logic is showing the way to a "new normal" for publishers in which matching any content to any audience demand is creating a much more flexible, responsive and audience-centric publishing industry.

Tuesday, December 8, 2009

Endgame/New Game: Google Search Moves Focus on The Moment of Highest Content Value

In a typical game of chess, there are three distinct phases of play: the opening, in which a handful of chess pieces stake out strategic territory on the chessboard, the middle game, in which the positions of many pieces are used to jockey for control of the chessboard, and the endgame, in which the pieces are traded and moved rapidly into a reduced and final push for ultimate control of the board and the strategic goal of the game - capturing the king. It takes both logic and passion to excel at chess, but at the end of the day it's a well-executed plan that wins the day.

You might say that Google has been in the process of introducing its own endgame for online publishing, quietly moving dozens of initiatives into strategic positions which in and of themselves may seem inconsequential to the game as a whole - until its ultimate position begins to evolve rapidly. As in a chess endgame, Google's recent moves are swift, monumental in their impact and, potentially, decisive in determining the outcome of how content becomes valuable on the Web. Media critics like Ken Auletta have quipped that Google needs more "Kirks" and fewer "Spocks" to succeed, mistaking the crowded middle game of media posturing against Google for an ongoing battle, when in fact Google has been keeping its well-reasoned eye on the pieces that will be most important for the outcome of the game.

What's the king that needs to be captured in this endgame? The Moment. Media companies continue to churn out outdated moves such as media players serving up magazine-like renditions of their own content, thinking that quality that reflects the last game that they won is what will win the day. In the meantime, Google's intense concentration on processing power in cloud computing, Web-standardized applications and search dominance has revealed a strategy that is quickly eliminating viable moves for many B2B and consumer content and technology companies. After the September introduction of The Second Web via its Google Wave preview platform for real-time collaboration, Google has in recent days extended its dominance of The Moment via three new initiatives: expanded personalization of search results, real-time search results and voice, location and sight-activated mobile searches, including Google Goggles, a point-and-click camera-activated search feature.

Danny Sullivan at Search Engine Land has an excellent analysis of how Google's debut of personalized searching that doesn't require a Google login is introducing a "new normal" for its search environment, in which the content presented in search results will by default be different for different people based on their last 180 searches on Google. What is The Moment for these people? Where their interests have been most recently. Instead of waiting for editorial boards to decide what The Moment should be, Google is yet again trumping traditional editorial functions and allowing people's own behavior to have a seat at the editorial table automatically.
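To make that mechanic concrete, here is a minimal sketch of history-based re-ranking - my own toy model, not Google's actual algorithm: results whose topics show up often in a user's recent queries get a score boost, so two people issuing the same query can see differently ordered results.

```python
from collections import Counter

def personalize(results, history, boost=0.5):
    """Re-rank search results using the user's recent query history.

    `results` is a list of (title, topic, base_score) tuples;
    `history` is a list of topic labels from recent searches.
    Topics that appear frequently in the history get a score boost.
    """
    freq = Counter(history)
    total = max(len(history), 1)  # avoid division by zero for empty history

    def score(result):
        title, topic, base = result
        # Boost proportional to how often this topic appears in the history.
        return base + boost * (freq[topic] / total)

    return sorted(results, key=score, reverse=True)

results = [("Python tutorial", "programming", 0.90),
           ("Ball python care", "pets", 0.88)]
history = ["pets"] * 8 + ["programming"] * 2
# For a pet-focused searcher, the snake-care page outranks the language tutorial.
print(personalize(results, history)[0][0])  # prints "Ball python care"
```

With an empty history the boost is zero everywhere and the default ranking comes back unchanged, which mirrors the "new normal" Sullivan describes: the unpersonalized result page becomes just one special case.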

The introduction of content from real-time Web sources such as Twitter, Facebook and other status-oriented messaging services in Google search results extends The Moment into content sources that have split-second relevancy to online content seekers. Kipp Bodnar points out that this stream of tweets and postings means that B2B companies can no longer ignore real-time in favor of traditional SEO strategies if they're going to get people's attention. It's a broader scope than that, of course: nobody can afford to ignore real-time social media content generation now any more than a securities trader can ignore real-time stock tickers. All brands must enter the real-time conversation of The Moment to keep in touch with their markets and to define their markets.

Google's mobile search initiatives, introduced last week at the Computer History Museum, are perhaps the most profound in their potential impact, even if their ultimate powers are years away from being felt. Voice-activated and GPS-activated Web search is being perfected rapidly at Google and through other outlets, but the Google Goggles initiative, previewed in its development phases on MSNBC recently, brings a point-and-click element to The Moment that promises to give Google a real leg-up in mobile search markets. Using the camera in mobile phones, Goggles enables searches for information on things such as landmarks, stores, products and text simply by filling the camera's viewfinder with the item and clicking. Remember all of those fussy infra-red applications that were supposed to get us "beaming" business cards to one another? Now, just take a photo of someone's card and it will be uploaded into a contacts record. In just those few capabilities already targeted, whole content markets are about to develop as people capture content in The Moment.

And who will have all of the search data and metadata regarding all of these Moments? Yep. Yet again, Google is positioning itself to be the cloud-empowered master of what people are interested in right now, giving it the ability to bring people closer to their interests and passions simply by asking for them. And, yet again, by including as much content as possible in serving their customers, Google doesn't second-guess what people consider to be valuable in The Moment. If the stock and news tickers of the 20th century distributing content from central markets and publishers were the gold mines of Moments in that era, Google's absorption and distribution of content from anywhere to anywhere in The Moment has enabled it to enlarge its unique databases far more broadly and rapidly than any other publisher on earth. And, like a chess endgame, the speed with which other players are losing effective counter-moves against Google's strategic position in The Moment is only quickening.

No small wonder, then, that the U.S. Federal Trade Commission is scrutinizing Google's acquisition of AdMob, a leading mobile ad network. Markets thrive when there are still a good number of pieces on the board to keep competition high. But perhaps it's time for the FTC and companies in the content industry to look beyond this rapidly emptying game board and to consider what the next round of content industry chess is going to look like. If The Moment is the new center of the publishing industry, how does content become most valuable in this context? The answer to this question is, in part, to acknowledge that the companies who collect the most input about the world most rapidly become the most knowledgeable about what is happening in The Moment.

It's a phenomenon that I call "the Sensor Society," a world in which our corporate awareness and memory become valuable through common access in a way that reverses the "information is power" equation. Certainly having private information will continue to empower people and organizations in select circumstances, but for the average person or business having access to all information in the right context is becoming a more powerful resource for decision-making. To borrow a concept from my book Content Nation, some portion of the DNA of society is migrating into the Google-dominated cloud, with each of us feeding that part of our collective consciousness through our voices, our camera "eyes" and our fingers touching screens and keyboards. That may be a good thing for society as a whole, but it will be an enormous challenge for institutions who are not ready to accept that migration as a beneficial development.

What does this mean for publishers? It means good things for those that can manage to get their content into these personally defined Moments more effectively. But it also takes an acceptance that "the first draft of history" that many in the media business cherish as their mission is taking on a radically new form. Like the "playback" feature in Google Wave, everyone will have access to who did what where and when soon enough. The question is, who edited it the best? Google has staked its claim as the world's dominant editorial resource for displaying billions of histories a day, sweeping away front pages across the Web into a stream that assembles Moments that matter most to audiences.

We will spend time with content in any number of spaces thanks to this editorial resource, as we have on the Web for many years. But Google has accelerated the endgame radically in the past few months for those not tuned into The Moment. 2010 is going to be a year of momentous change in the content industry. Publishers that are tuned into The Moment will be in good shape to take on all of the inputs of The Sensor Society and to trigger astounding growth in cloud-based content markets. For those that aren't tuned in, well, you better get used to the idea that you're playing a two-dimensional game of chess against a 3-D chess master. Set up the chess pieces again, Spock. It's a whole new game.

Monday, November 30, 2009

Google Scholar Takes On Legal Content: Are Enterprise Suppliers Threatened?

When Google Scholar launched five years ago on the Web, its aggregation of freely available scientific literature and citations set off some sizable seismic activity in publishing circles. All of a sudden, content that had been aggregated only via expensive subscription database services was available for free and accessible as easily as any Web page. Five years later, Google Scholar has expanded to include most freely available academic research sources, as well as abstracts from subscription sources and public patent records, and is an increasingly popular resource for researchers and students. However, major aggregators of scientific publications remain successful, in large part because they continue to develop more sophisticated search and display applications and, well, because time has been on their side. Pressure from Open Access advocates, who push for free access to scientific research, and an increasing array of applications built using Google Scholar as a source have begun to open major cracks in the barriers to entry into scientific publishing markets, but the people in charge of enterprise purse strings did not use Google Scholar in their university days. So, in spite of budget cuts, the status quo remains largely intact for many scholarly publishers.

With this in mind, some reasonable skepticism is probably in order as Google announces the launch of a new Google Scholar service that makes full-text legal opinions and legal citations available for case documents from U.S. federal and state district, appellate and supreme courts. Public records are becoming more commonly available in general thanks to both Google and other publishers that see opportunities in generating value from public content, so this move should come as no major surprise to anyone. Yet this first major foray by Google into legal content is surprisingly strong - and may be the beneficiary of better timing than earlier Google Scholar product improvements. While legal publishers will rest soundly knowing that the search capabilities for legal documents in Google Scholar are limited to simple "white box" queries, they may not be so tranquil when they look at the results themselves. Documents are rich in links to the legal references they cite, a capability that has been for many years one of the key calling cards of legal databases.

Things get even more interesting when you look at the citations tab that is available for each located legal document. Google Scholar offers you brief, in-context snippets of how a case was cited in key documents, as well as comprehensive listings of citations in court documents and documents related contextually to the selected document. While that's far from the full capabilities that a LexisNexis or Thomson West offer to their professional clients, it's pretty much pointed at the core of their database offerings, nevertheless.
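To illustrate the raw material behind this kind of citation linking, here is a minimal, hypothetical sketch (not Google's actual implementation) of pulling U.S. reporter citations out of opinion text with a regular expression - the starting point from which cross-document links and citation counts like those in Google Scholar's citations tab can be assembled:

```python
import re
from collections import Counter

# Rough pattern for a few common U.S. reporter citation formats,
# e.g. "410 U.S. 113" or "347 F.2d 394". Real citators handle far
# more reporters, pin cites and edge cases than this sketch does.
CITATION_RE = re.compile(r"\b(\d{1,4})\s+(U\.S\.|F\.2d|F\.3d|S\.Ct\.)\s+(\d{1,5})\b")

def extract_citations(opinion_text):
    """Return the list of reporter citations found in an opinion."""
    return [" ".join(groups) for groups in CITATION_RE.findall(opinion_text)]

sample = ("As the Court held in Roe v. Wade, 410 U.S. 113 (1973), and "
          "reaffirmed citing 410 U.S. 113 at length...")
cites = extract_citations(sample)
counts = Counter(cites)  # citation frequency: the seed of a "cited by" ranking
```

Running this over a corpus of opinions and inverting the resulting lists is, in essence, how a citation graph - and the "cited by" listings professional services charge for - gets built.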

The Above the Law blog has a good summary of analysis and reactions from both legal experts and publishers, but I think that the most salient point comes from Social Media Law Student, which points out that this freely available information is likely to become a "go-to" content source for students who may not have ready access to subscription-based content sources. Looking at the offerings coming to market from LexisNexis, though, which I walked through recently as a part of my SIIA CODiE judging for Best Aggregation Service, it's not as if LexisNexis isn't aware of this "digital native" culture gap, as they try to index both public documents and freely available Web content to make it more accessible to law students and professionals.

The threat that Google Scholar's new legal content represents to established publishers, though, is the exposure of a huge body of public documents to application builders and content services. Much as Google Books' scanned out-of-print library holdings have created a resource for ebook platforms from the likes of Sony and Barnes and Noble, this new initiative from Google opens up more cost-effective competition for legal services publishers who may want to attack legal markets from new and innovative angles using Google Scholar as a resource. Some of the innovators may be startup companies in the mold of Collexis, which leveraged publicly available scientific content to showcase their innovative content discovery tools. Others may be business information competitors in adjacent markets, who may see a way to pick off some of the "low-hanging fruit" using core legal content maintained by Google.

None of these really adds up to a significant challenge to either LexisNexis or Thomson West in the short run, but they will tend to hold down the incumbents' margins as they lose some market share and leverage at the negotiating table at contract renewal time. What this does add up to, though, is a strong case for having professional-grade legal information services more integrated into a far wider array of business information sources to support enterprise decision-making on many levels. If digital natives will have increased access to well-integrated legal content, the high end of legal information markets will need more unique content and integration across a fuller range of business information sources to justify premium prices.

As I mentioned earlier on ContentBlogger, I do think that Reed Elsevier would be smart to consider selling LexisNexis at this time in anticipation of this likely consolidation - or, alternatively, expand its business information holdings to build a broader base of services for LexisNexis. I think that the former is more feasible than the latter given current market conditions, and would enable Reed Elsevier to cash in on the still-formidable value of LexisNexis before it begins to lose significant market growth potential. Thomson was able to spin off its print assets near the peak of their value before print publishing markets ran aground, a trick that Reed Elsevier was not as fortunate in managing in the sale of its Reed Business Information publishing assets. Google's new legal offerings are not a death knell for premium legal information services, but they are a canary in the coal mine for database services based on public legal records. We'll be watching this space carefully in the months ahead.

Friday, November 20, 2009

Making More Pies: The "Google Phone" and Chrome OS Aim to Expand Content Markets

It seems as if there's hardly a week that goes by lately without some major announcement from Google, Microsoft and other technology providers that has major repercussions for the content industry. In the past week, we've had not just a major announcement but a major rumor surfacing anew that has me thinking about how Google's strength as a marketing organization is in defining new markets that others are often unwilling to develop. In other words, where many publishers and technology companies focus on gaining slices of the same old market share pie, Google seems to be becoming the leader in defining whole new kinds of content markets to bake.

On the product announcement front, Google used the unveiling of its Chrome OS operating system as an open source platform to give a quick demo of its still-developing features (video). As I highlighted in ContentBlogger in July, Chrome OS, targeted for release next year, will be a computer operating system expressly for devices such as netbooks that use mostly Web-oriented content and applications. The result is a machine that can operate with minimal local data storage and that can boot up to a login prompt in seven seconds and get on the Web in just a few seconds more. So in less time than it takes the typical mobile phone to get ready you can access Web content and applications easily.

The Chrome OS interface is no real surprise to those already using Google's Chrome browser to look at the Web - it is, in essence, the same. There is a permanent "tab" open to allow one to start applications, which operate in tabs much the same as Web pages do currently in the Chrome browser, or you can have the applications pop up from the bottom of the display as "panels." Web links can activate apps as well, as in one demo sequence in which a music clip on MySpace played after a click on a link in a Google search results page. The demo also showed how data in the Chrome OS "cloud" from any tabbed window can be pulled into Google Docs for more sophisticated manipulation and how games and ebooks from Google Books can be viewed easily and stay as persistent content in a given tab or as full-screen applications.

People expecting the "wow" factor that Microsoft or Apple has tried to engineer into its most current operating systems are likely to be underwhelmed by Chrome OS, a reaction echoed in a recent poll that I conducted in Google Wave, in which only a plurality of respondents felt that Chrome OS would have a major impact on computing in two to three years. After all, who is going to get excited about an operating system that looks and acts just like today's browsers? I think, though, that this is where the pies come in. With only about a fifth of the world's population having access to the Web, Chrome OS as an open operating system is perfectly positioned to help the other five billion people who do not have Web access to build content in the cloud very cost-effectively. Most of these people will never see a PC in their lives and will find a Chrome OS device to be perfectly adequate. As for the 1.4 billion people who already have Web access, most of their computing time is spent on the Web anyway. That leaves Apple Macs and devices running Microsoft Windows 7 to go after the relatively affluent markets whose homes and enterprises are full of sophisticated gizmos - a significant market, to be sure, but one in which the need for content outside of the cloud will be a diminishing factor. All of a sudden Chrome OS has the ability to make the entire PC-based marketplace look like a niche market.

Underscoring this positioning of an expanded global cloud as an expanded marketplace pie is the recent repackaging of the "Google Phone" rumor by TechCrunch. If Michael Arrington's latest "confirmed, super-high confidence information" is to be believed, Google is going to start advertising a Google-branded mobile phone device in January that will be built by an OEM hardware partner to Google's own specifications. In the short run, one assumes that this will be an "apples-to-apples" competitor for Apple's iPhone, supporting applications and Voice over IP telephony in a way that is less compromised than Google Android implementations found on smart phones released so far. But with heavy investments in Google's Android operating system by handset manufacturers such as Samsung, HTC and Motorola and a still-fragmented U.S. mobile market to navigate, it's doubtful that such a "Google Phone" is going to make enormous headway in developed markets any time soon based on just these features.

Instead, the more likely play for Google's potential phone device is a new market altogether: ad-supported mobile VoIP telephony and Web access. In other words, in the middle of a global recession and with a huge number of people who have yet to touch either a mobile phone or the Web, what better price point for a mobile phone service could you have than "free?" The features of Google Voice already await people needing voicemail and phone call redirection, so people falling off telephone calling plans as the economy continues to tighten may find access to phone calls through ad-supported broadband and Web "hot spots" to be a "good enough" telephony and Web combination while they await funds to get more high-powered services from major telephone carriers. For those who could never afford or deal with mobile Web access, the Google Phone may offer a simple and affordable way into mobile communications that would be a stepping stone to a Chrome OS-powered netbook device.

All of this in the short term is likely to be fairly underwhelming stuff for people looking for the "what's in it for me for better results this quarter" solution to all of their content market problems. But in a sense that's the exact point. Google is one of the few companies in the content and technology industry that has been investing very patiently in long-term market development goals that will broaden its potential revenue base by huge magnitudes. Others have been innovators, to be sure, and profitable in their own right. But by plodding away at technologies and content services such as Chrome OS, Android, Google Apps, Google Wave and Google Voice, and by continuing to refine existing services such as its search engine, ad networks and YouTube videos, Google learns how to build a larger market in which it can satisfy at least 80 percent of that market's daily needs.

As Google expands into developing nations and "digital natives" markets more rapidly than many of its competitors, the slice of the "old" 20 percent that can be satisfied by more specialized technologies will continue to look smaller and less powerful as a content market play. With everything to gain and little to lose, Google's greatest barrier to competitive forces is the unwillingness of its competitors to risk everything to play on the same ground. The sophisticates who follow the content industry will continue to be underwhelmed by many Google products and services - until they recognize that in large part it is becoming the content industry as we will know it.

Monday, November 16, 2009

Darn, Why Did They Think of It First? News Media Companies Adapt to Online Value Points

I have to chuckle a bit at the recent Poynter Online email interview with Wikimedia Foundation's Jimmy Wales, in which he discusses an internal memo gleaned from Associated Press (PDF) by Nieman Journalism Lab. The AP memo, entitled "Protect, Point, Pay - An Associated Press Plan for Reclaiming News Content Online," covers a lot of ground already familiar to those following AP's efforts to put in premium packaging for news content. However, in addition to conjuring up long-standing concerns about Google and other major search engines as competitive forces, the memo also highlights AP's concern about the millions of topic-oriented pages in Wikipedia that are capturing traffic when people search for breaking news. At last a light bulb begins to go on in some minds: perhaps the issue is not so much search engines but that search engines are directing people towards the most popular destinations for specific topics. Hmm, perhaps this might have something to do with...the quality of the content that they find there?

The AP memo points out that Wikipedia articles are rich with links and structured content that drive people to other trusted information sources, a concept that the memo suggests could be adopted by the AP for its own content. As Wales points out wryly, though, "Creating authoritative canonical pages based on the latest from the AP sounds like a good idea they should have implemented years ago." In other words, after more than five years of Wikipedia building both its content and its brand as a "go-to" source for freshly updated topic-oriented content that dominates search engine results, it dawns on some folks in the news business that perhaps there's a business model in there somewhere. Layer in the growth of online portals that are aggregating links to top topics content more effectively, and one wonders just what people are going to be willing to pay for those carefully designed hNews objects that AP is hoping to use to "reclaim" the news business.

The answer to that wondering seems to come in part from a recent study on consumer attitudes towards premium news content by the Boston Consulting Group highlighted in The New York Times. The study indicates that fewer than half of consumers in the U.S. are willing to pay for news content online and that, among those who would be willing to pay, the preferred tariff weighs in at about $3 a month. This seems to line up with long-time assertions by Journalism Online's Gordon Crovitz, who claims that premium news sites can expect to be able to charge for about ten percent of their online content. I've noted oftentimes that a system for managing access to paid content is long overdue, but news organizations should take a hint from the payments being extracted from iPhone apps and recognize that online markets reward functionality and community input that meets personal needs more than they do deathless prose and a good network of inside contacts.

A topic-oriented Web site for news content sponsored by AP would be a good idea, but one wonders whether AP or any other news organization is up to the task of building both the content and the brand necessary to contend in search engine wars for their audience's attention. At the same time, AP's emphasis on "protective" content packaging as a means to establish fair licensing of AP content seems to miss the real revenue opportunity available to AP and other news organizations. When a publishing-enabled global audience is your most effective distribution mechanism, a strategy of "joint supplier negotiation" suggested by the AP memo is not likely to succeed.

What is needed for AP and other professional news organizations to succeed in online content licensing is a system that encourages the distribution of their content through the most efficient and popular channels available at any given moment. Instead of fighting your audience, empower and encourage your audiences to be distributors of your content - and help them to profit from it as well. Highly automated content licensing with a billing mechanism akin to mobile phone usage units - and that can help individuals to profit from AP content when it's appropriate - is the key to this concept, and should be the cornerstone of AP's premium content strategy.
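To make the idea of usage-unit billing concrete, here is a minimal sketch of such a scheme; the per-unit rate and the distributor revenue split below are purely illustrative assumptions, not anything AP has proposed:

```python
# Hypothetical metered content licensing: each redistribution of an
# article consumes "units" (akin to mobile phone usage units), billed
# at a flat rate, with a revenue share paid back to the individual or
# site that redistributed the content. All figures are illustrative.

RATE_PER_UNIT = 0.002     # dollars billed per unit of usage
DISTRIBUTOR_SHARE = 0.30  # fraction of billed revenue paid to the redistributor

def bill_usage(events):
    """events: list of (article_id, units_consumed) for one billing period.
    Returns (amount_due_to_publisher, amount_due_to_distributor)."""
    total = sum(units for _, units in events) * RATE_PER_UNIT
    distributor_cut = total * DISTRIBUTOR_SHARE
    return round(total - distributor_cut, 2), round(distributor_cut, 2)

# A distributor pushed two articles to its audience this period:
events = [("ap-001", 5000), ("ap-002", 2500)]
publisher_revenue, distributor_revenue = bill_usage(events)
```

The point of the design is that the same automated metering that bills the distributor also pays the distributor, turning the audience into a sales channel rather than an adversary.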

With such a scheme in place, AP's members can focus on beating the competition at their own game by becoming the most effective agnostic aggregators of news content in any given market. Yes, news organizations will continue to staff up with their own editorial resources, but the news operation of today - and tomorrow - needs to collect the best content from whatever source it comes more effectively than the competition. You can have some exclusive content, to be sure, but exclusivity alone cannot power success.

This can be seen clearly in how information providers in the financial industry are required to aggregate content from as many different sources as possible to help information-hungry decision makers. Over time you may develop unique assets, but the fundamental game is giving people what they want, where they want it, when they want it. If you yell at your markets for wanting to play a different game, don't be surprised by the blank stares that you get before they go to pay attention to people who listen more effectively.

I do hope for the sake of professional news producers that AP does come up with an effective content distribution strategy, and there are some hopeful outlines in the AP memo to that effect. But the largest thing that needs to change in the AP strategy is their attitude, which still treats the Web as an object of fear and scorn. More than 1.4 billion people around the world seem to feel otherwise about electronic content, people who both consume and contribute value to the news gathering and distribution process. It's time for the AP to recognize that their mission needs to embrace those 1.4 billion people more effectively if those people are to value AP's brand and content enough to consider seriously the prospect of regular payments for it.

Friday, November 13, 2009

Business Information Consolidation: D&B Pursues InfoGroup to Diversify Offerings

While business information remains a robust market segment in the content industry, it has not been without its challenges in recent years. Increasingly rapid changes in organizations and careers drive demand for ever-fresher information on companies, people and products, making services that help that information be found and used effectively critical to most business operations. What was once an industry of bulk data, mailing lists and a few integrated company reports is now a market that demands integration of business information into sales and marketing platforms, strategic dashboards and all-in-one online services.

It's no surprise, then, that Dun & Bradstreet is among the companies mentioned by Reuters putting in a bid for infoGroup, the Omaha, Nebraska-based business information service that produces mailing list services and OneSource, an integrated database of global business information sources targeted at major corporations. D&B finds itself in the awkward situation of having a "gold standard" reputation for its core company information listings but relatively few options for it to leverage that information for greater profits in its own operations. D&B's Hoovers online business information service is doing well in capturing users in small and medium organizations with a mixture of subscription and ad-supported services, but that leaves larger organizations and bulk data services to others - including its parent D&B.

While the infoGroup bidding process could go any number of ways, including a "no-sale" decision, my guess is that we're very likely to see D&B come out on top of this process. D&B and infoGroup have much to offer one another, in terms of both operational abilities and markets. For infoGroup the pluses it brings include a huge wealth of business and consumer contact data, its ruthless efficiencies in driving out costs from data acquisition and maintenance and a OneSource platform that brings together a very broad array of high-quality business information sources in both its own online services and in enterprise platforms such as CRM and business intelligence portals. For D&B, its company ratings, profiles, Hoover's online savvy and its highly respected brand and enterprise sales and support organization would combine to provide a parent that could build a far more complete portfolio of business information services. No merger is perfect or without pain, but this looks like one that will create some pretty strong market mojo.

And it will take some mojo to keep up with the changes in the business information market over the next few years. The emphasis in business information services is on integration, real-time freshness and usefulness, and on having all of the sources needed at your fingertips to make decisions about corporate strategy, sales and marketing. Companies like Acxiom and Experian are expanding their footprints in business information services rapidly, making an expansion of D&B's overall profile in business information services a priority if it is to leverage its brand effectively. And in the wings are expanding business information services from Dow Jones, and probable expansions by Thomson Reuters as well - with perhaps even an acquisition of LexisNexis assets from Reed Elsevier in play. Throw in younger business information brands such as Jigsaw, InsideView and ZoomInfo beginning to cater to not only online-aware companies but core corporate markets as well, and you can see that business information is not a sleepy content market sector by any stretch of the imagination.

This appears to be one of those situations where two companies with both the right needs and the right level of maturities in their operations and management come along at the right time. It took a few years for infoGroup to whip its properties into better shape, and it's taken a few years for D&B to integrate Hoover's operations effectively and to identify the greater opportunities for their products and services. Here's hoping that these two companies find that their fits are as complementary as they appear to be.

Wednesday, November 11, 2009

Cutting Losses: Smith Bows Out as Reed Elsevier CEO

In a move that shocked many B2B media insiders - including Incisive Media CEO Tim Weller - global information provider Reed Elsevier has announced the resignation of their CEO Ian Smith, to be replaced by Erik Engstrom, CEO of their Elsevier division. While early speculation from FT's Alphaville blog depicted the management shift as "a proper executive-level knifing," more considered comments from industry analysts and insiders in The Independent seem to indicate that Smith was falling on his own sword in recognition of some major challenges not easily resolved by someone with limited media experience. Three key factors were arguing strongly for changes at Reed Elsevier sooner rather than later: the selloff of Reed Business Information assets had stalled, pre-tax profits were down 52 percent in half-year results and investors lacked confidence in both projected earnings and Smith's aggressive recapitalization efforts. With Smith's mentor Jan Hommen having departed from Reed Elsevier's board in January to head the ING bank, a graceful exit was probably in order.

For all of the corporate drama that this move has generated, it's easy to forget that Smith's decision to float more stock to reduce debt and to fund Reed Elsevier for more aggressive organic growth was a very sound move, even if it is one that displeases investors in the short term. The real question is whether Engstrom will be up to the challenge of using that capital effectively in a struggling economy. Certainly Engstrom's Elsevier unit is the most effectively positioned business unit in the Reed Elsevier empire today, with deep and widely successful enterprise information products and a growing portfolio of academic and scientific publications. Yet as relatively strong as Elsevier may be, growth will be a major challenge for Reed Elsevier, even if the economy is laid aside as a contributing factor.

The key problem that Engstrom faces is that few of the tricks that have worked for Reed Elsevier in the past are likely to lead to growth in the future. B2B magazine publishers over-romanticized the likelihood of revenues from traditional channels in the face of massive changes in online information delivery and were therefore ill-prepared to adjust to cutbacks in events attendance and slimmer online ad revenues. At the same time growth by title acquisition, licensing and data integration was making for a relatively rosy top line for Elsevier and LexisNexis but failed to leave enough room in budgets after debt and development costs to fund new product development. Fairly aggressive staff and operations streamlining at LexisNexis have improved the outlook for their business information operations somewhat, but the overall forecast for both LexisNexis and Elsevier highlights modestly incremental product development.

On the surface the smart approach would seem to be to "Glocer-ize" operations at Reed Elsevier as rapidly as possible. Thomson Reuters CEO Tom Glocer moved rapidly in recent years to pare away redundancies and legacy products with limited upside and to focus operations on enhanced integration of enterprise content services across their holdings. Unfortunately there are far fewer synergies available between LexisNexis and Elsevier than those found in Thomson Reuters holdings, with the cultures of the two divisions still remaining miles apart, both literally and figuratively. With ever-broadening competition for the core content licensing services of LexisNexis, including more aggressive development of Dow Jones' enterprise information holdings, Reed Elsevier looks increasingly like a company with one fairly stable boat and three heavy anchors failing to find a bottom.

While speculation remains in the air about a possible move to merge Wolters Kluwer operations into Reed Elsevier, the more probable short-term solution would seem to lie in disposing of some or all of LexisNexis as promptly as possible while it can still command a worthy asking price. One possible solution would be to spin off LexisNexis operations to Thomson Reuters or Dow Jones to bolster their competitive positions in legal and business information. Thomson Reuters would be a better strategic fit overall for a spinoff, especially if Thomson Reuters could flip back some or all of its scientific holdings to Reed Elsevier, but regulatory concerns about merging LexisNexis into Thomson West would probably make a wholesale spinoff to Thomson Reuters doubtful. A more probable resolution to overcome regulatory hurdles might lie in offering LexisNexis legal assets to Dow Jones and its news licensing assets to Thomson Reuters, which has lacked archival depth since returning its interest in Factiva to Dow Jones.

Whatever the specific solution may be, Reed Elsevier needs cash to focus on building up its scientific and medical assets for growth as rapidly as possible. Cheap financing as a means to grow stables of titles is off the menu for a while, thankfully, so Smith's forecast for organic growth requires an acceptance that it will have to come by focusing far more aggressively on its Elsevier division. Elsevier is not without its own challenges - scientific publishing faces strong pushback from corporate and academic libraries that find it increasingly hard to afford the full range of journals that most publishers offer - but both scientific research and applied sciences are markets still crying out for productivity gains that would warrant increased product investments. By contrast, productivity in legal markets is moving away from many of LexisNexis' core database strengths, which would benefit from more integration with other platforms.

There's always the possibility that Engstrom may decide to go for short-term gains and shuffle the Reed Elsevier portfolio just enough to tweak out a year or two of decent earnings. Here's hoping that he finds the courage to make some very tough decisions as to what is likely to provide the best returns for Reed Elsevier investors in both the short run and the long run. Moving on a sale of LexisNexis, by far the most attractive disposable asset available from Reed Elsevier, will enable them to take advantage of its value while it still has some attractiveness in the enterprise information marketplace. Without further integration of their information with financial market information and successful media operations, LexisNexis is not likely to contribute significantly to Reed Elsevier growth for some time to come. We'll see how Engstrom decides to cut his losses, but here's hoping that his moves help to strengthen both Reed Elsevier and enterprise information markets overall.

Thursday, October 22, 2009

Going Pro(sumer): Wall Street Journal Pro Edition Targets Web-Aware Enterprises

I've been suggesting to my friends at Dow Jones for more than five years that they needed to consider how to use their Factiva content more aggressively on the Web as a source for virtual aggregation of news and business information. Well, five years isn't that long in enterprise content product cycles, I suppose, so when I tweeted the announcement by Dow Jones of its new Wall Street Journal Professional Edition yesterday morning, I was pleased to see that the WSJ had finally started to package licensed content from Dow Jones Factiva's news and business information database into an editorially-managed online edition. The WSJ Pro package will be strictly a premium offering, offered at first only to Dow Jones' enterprise customers starting in November, with wider availability expected next year.

In a loose sense you can think of WSJ Pro as a Huffington Post for business professionals, a mix of content developed by WSJ staff writers and six sections of sector-oriented business news and information culled by WSJ editors from Factiva's extensive database and Web search infrastructure. However, using the extensive search-based analysis tools that Factiva has amassed, WSJ Pro will also provide its subscribers with the ability to unearth trends from its content. With a year of archived Factiva licensed content available along with two years of WSJ archives, WSJ Pro subscribers will be getting access to both content and trend analysis from in-depth premium business information sources unavailable on the Web in many instances. Other must-have features such as custom alerts for email and mobile devices are also included in the subscription package, which will cost USD 49 a month.

Some are labeling the WSJ Pro package as a shot across the bow at Bloomberg and Thomson Reuters, a characterization not too far off the mark, given that for decades many financial services companies have been able to negotiate similar price points from major financial information services for people off their trading floors, who used them mostly for news retrieval and casual price quotes on securities. WSJ Pro is aimed largely at such people, who are already very Web-centric in their information retrieval habits and looking for something a little more professional-grade. The trading arena itself relies increasingly on machine-executed trades, with the remaining people on trading desks using very sophisticated analysis packages, so there are fewer people who can use the high-grade financial information products developed by companies like Bloomberg and Thomson Reuters. It makes sense, then, to focus on average professionals accessing better-than-the-Web information about business and finance who are willing to use an ad/subscription-supported prosumer product like WSJ Pro.

This move is also, of course, a way to counter some of the stagnation that Factiva faces in large-scale enterprise subscriptions. With central information budgets facing cutbacks in many of the enterprises targeted by Factiva and other major business information providers, a more media-oriented model for delivering business information offers Factiva a way to slide its content into a new sales profile, one that can weather central budget cutbacks by appealing to individuals who may be willing to carry a personal subscription from other budget sources - perhaps even from their own pockets. Pioneering Web business information providers such as Hoover's have established the viability of this type of media/subscription model for years, so there's no reason to think that it won't succeed for Dow Jones as well.

So as much as professionals who already use Bloomberg and Thomson Reuters services may be targets for WSJ Pro, clearly a broader range of enterprise business information users may find the package to be appealing. The "prosumer" segment of business information is likely to be one of the fastest growing segments for business information use in the years ahead, as central information budgets recover slowly from the effects of the economic downturn while more aggressive executives in need of support for decision-making decide to up their personal investments in business information to close their knowledge gaps.

You can quibble a bit about the pricing, perhaps - not high compared to WSJ print packages, but at a non-bulk price still a little high compared to some premium business information services - but no doubt WSJ has done its homework on this and is likely to meet its revenue goals with its "prosumer" WSJ Pro package. I have little doubt that this package will be a strong success - if only because both Bloomberg and Thomson Reuters are now scrambling to come up with business news assets that can help them to broaden their own offerings. When you get the incumbents moving quickly, you must be doing something right.

Tuesday, October 20, 2009

Getting eBooks Right - Finally. Meet Barnes & Noble's Nook

As exciting as Amazon's Kindle has been for many early adopters of content technologies, its screaming limitations and awkward business model have been threats all along to its long-term success. But as long as really viable alternatives were not available, few people seemed to focus on the potential for Amazon to be painted into an uncomfortable box. With the nearing launch of the Nook device from book retailer Barnes & Noble, that time of unchallenged supremacy for the Kindle seems to be drawing to a close.

As much as Kindle has been hailed as a breakthrough for eBooks, I do think that Nook will be a far greater breakthrough for the average book reader and for book publishers and retailers. The Kindle was a nifty piece of breakthrough technology, but it did little to improve the lot of publishers looking at dwindling margins and nothing to help book retailers who oftentimes can shoot cannons through their stores without hitting a customer. Nook is well thought out through and through from a technology standpoint, a customer standpoint and a retailing standpoint.

First, the gizmo itself, which will be available for sale in a few weeks. It uses eInk display technology for the book content, as does Kindle, and it can download books via wireless connections like its Amazon brethren. It has access to millions of books, a convenient online store, and tons of storage and battery life. But this is where the stories of these two devices begin to diverge. Where the Kindle is a completely proprietary platform, the Nook is based on Google's up-and-coming Android operating system for mobile devices, which ties it in immediately with dozens of other Android-enabled devices hitting the marketplace this fall and next year. Barnes and Noble sees clearly that proprietary devices are not going to be a viable barrier to entry when devices based on open source software and Web standards are setting the pace for electronic content access. Using Android enables the Nook to have a slick touch-sensitive color display in addition to the eInk text display that allows for book covers and other attractive graphics to be displayed. Instead of waiting for eInk to solve the color display problem, this is a simple and useful solution that opens up the Nook to other Web functionality and slicker feature navigation more effectively.

Behind the hardware and software is wireless connectivity both for wifi hot spots and for broadband wireless Web networks, a two-fer combination that bests Amazon's broadband-only access but also opens up interesting possibilities for retailers. Nook owners visiting Barnes & Noble stores will be able to read books via Nook in the stores for free. What a great way to attract people to their retail outlets - and, eventually, what a great way to transition to site-licensing free content access on a subscription basis via affiliates such as high-end coffee shops, university and community libraries and so on once print-on-demand services can be packaged by Barnes and Noble more effectively. Having the right physical context for content remains a winning strategy for content packaging, and Nook's marketing strategy promises to get the "where" of content right.

Nook also gets many of the "hows" of book content right. Purchasers of eBooks can use Nook to share a book with other people for up to fourteen days and will be able to mark them up with personal notes. Lending can be enabled across both the Nook itself and other portable devices enabled for ePub-formatted eBooks. This also opens up Nooks for library books using the ePub format, in addition to PDF-formatted eBooks that are popular on the Web - and not supported at this time by Kindles. The combination of these features finally offers readers the kind of usability for eBooks that they have been used to having as print readers in an electronic format. Instead of making the hardware and software artificial barriers to a full experience, Barnes and Noble has embraced the experience - and, in the process, has enabled the Nook to be a much more "must-have" place to consume and share content.

Finally, the Nook comes in at a comfy $259 price, twenty dollars less than the current price for the original-size Kindle while offering a display as large as the Kindle2 model. For a fully wireless-enabled device, this will give the Nook a strong advantage going into the holiday season in a lean year - and strong traffic in both their online outlets and retail stores. And while I can't vouch for the hands-on experience, the look of the unit promises to be at least as rewarding as the Kindle. Lacking a physical keyboard, one assumes that the Nook will make use of the Android software-managed touch keyboard capabilities, which, while not an ideal interface, cannot be worse than the amazingly awkward keyboard on Kindles.

So let's see. Great interface, great physical package, great rights management, standardized electronic format, use and share content the way book readers like to, good reasons to visit their retail outlets, go-anywhere networking, Android compatibility - yep, I'd say that Barnes and Noble has just leaped into the center of the new-hotness race for electronic content consumption. I think that it's safe to say that Barnes and Noble is poised to become a major player in electronic book retailing with a device and a marketing strategy that are likely to heat up the book services race to a raging boil. But don't count out Amazon yet - especially with their recent efforts to re-invent the business of local retail delivery. Local context is where the money is in content delivery, and both Amazon and Barnes and Noble will have a shot at new approaches to local markets in the years ahead. As for me, well, if a Nook shows up in my holiday stocking, I won't be thinking that it resembles a lump of coal.

Sunday, October 18, 2009

Analyze This: Bloomberg and Thomson Reuters Up News Analysis Commitments

These are not the rosiest of times for financial information services, with fewer people using their services in the face of large-scale financial industry cutbacks, but out of adversity sometimes comes opportunity. While there are fewer professionals generating and consuming market analysis and opinion at investment banks and major buy-side firms, the thirst for market insights is as strong as ever, both among professionals and consumers of investments. That thirst may not be enough to float the salaries of as many investment bank analysts as in previous times, but there's plenty of money for financial information companies to fill in the gaps.

It's no surprise, then, that at virtually the same time there were deals announced by both Thomson Reuters and Bloomberg, L.P. to acquire two leading publishers of market insight and analysis. For Bloomberg the target is BusinessWeek, McGraw-Hill's prestigious but financially challenged business media outlet, while Thomson Reuters is opting for BreakingViews, an online source of market insight and opinion that was growing very smartly until financial markets headed down last year. In both instances the timing of these deals certainly favors the buyers, who get to pick up assets at comfortable rates, but the ultimate outcomes of these deals may differ significantly.

For Bloomberg, the acquisition of BusinessWeek poses some major challenges but also unveils some major opportunities. BusinessWeek's print and online assets were redesigned recently to be targeted towards more online-oriented audiences, yet failed to attract major new audiences and advertisers. Taking the online know-how from the BusinessWeek team and its market analysts and combining it with a wealth of breaking news and opinion from Bloomberg may help Bloomberg to create a far more viable challenge to Dow Jones' Wall Street Journal, most especially in online markets. "Prosumer" investors who expect greater depth from business information sources to help them manage private portfolios are obvious targets, people who will benefit not only from BusinessWeek's editorial content but from its sophisticated approach to online content design and management. This may help Bloomberg to extend towards the consumer spectrum of financial information services in print and online more effectively, with an overall global profile more similar to Dow Jones' consumer media news assets.

For Thomson Reuters, the acquisition of BreakingViews is a little more of a match for its core strengths, but also a bit less of a stretch towards direct competition with the consumer side of WSJ. BreakingViews focuses more than BusinessWeek on breaking in-depth company analysis, more akin to WSJ's Marketwatch portal but also more oriented than Marketwatch towards financial professionals. With a somewhat more "pro" than "prosumer" focus, BreakingViews may lack the broad consumer appeal of a BusinessWeek, but it's also more likely to command premium rates from advertisers seeking high-level executives and high net worth investors. While this may pose more of a challenge than Bloomberg may face in building a broader global consumer brand for financial information, it's also probably a focus that will provide returns more quickly and efficiently.

With strong arms already into broadcast television and radio, Bloomberg has an opportunity to create a deeper brand that can compete in broader markets, but it may take a long time for those markets to recover to the point that the investment is worth it. This tends to argue towards BusinessWeek assets being refocused rapidly towards a prosumer profile more similar to what Thomson Reuters is seeking, but the shoe may not fit as gracefully. The media will buzz more for a while about the BusinessWeek acquisition, no doubt, given its penchant to feast on its own most prominent members whenever possible, but it seems as if Thomson Reuters may have opted for the better of these two deals from the perspective of building stronger information assets that can extend its strengths in both professional and consumer markets. Given the bargain basement price that Bloomberg has paid for BusinessWeek, at least it has very little to lose and plenty to gain.

Both Bloomberg and Thomson Reuters gain a wider array of assets to tailor to overlapping audiences in financial information markets, assets that can smooth out revenue streams. It's been a grim period for financial markets, but market analysis is a key ingredient that can help financial information companies to ride out the gloomy periods until trade-related revenues pick up steam again.

Wednesday, October 14, 2009

Who's on Top of Enterprise Search? The Companies that Deliver Niche-Relevant Content

A recent press release from Autonomy hailed an IDC report that gave them the leading market share for the search and discovery technology market. While congratulations are no doubt in order for Autonomy, which has thrived as other major competitors have struggled to gain momentum in general enterprise search markets, there's a wrinkle to this boast that should give one pause to wonder. Sue Feldman indicates in the report that Autonomy had a 14.4 percent share of the search and discovery market in 2008, which is certainly nothing to downplay but also not a crushing dominance of this market. In other words, even the world's dominant enterprise-oriented search technology provider is little more than a niche player.

This is in part because there really isn't "a" search technology marketplace in any strict sense of the term. That may sound strange at first, but it's certainly true that search as a content location tool can only measure its success against very specific needs. Each enterprise, each publisher and media outlet, each marketplace has specific needs for content that determine whether a particular technology has been well tuned to its needs. We can use tech terms such as precision and recall to define in general terms how effective a search technology may be in returning useful information, but if a technology can't deliver editorial value very specific to an enterprise, it's just a general tool that is rapidly and easily commoditized rather than a powerful content tool.

The importance of catering to very tailored content delivery needs was underscored in my mind by a recent chat with Craig Carpenter, Vice President of Marketing for Recommind, a company providing content categorization and discovery tools that are finding particular success in legal and corporate compliance markets. Recommind has focused its capabilities on supporting functions such as e-discovery processes that enable an organization to understand what documents relate to a particular legal matter in the early phases of assessing a case. Going through emails, word processing and other unstructured enterprise documents rapidly to determine which ones relate to key figures in a legal matter or compliance issue is a good stress test for any search technology. With recent U.S. government rules encouraging the use of electronic tools to accelerate content discovery, Recommind is one of a few companies that are well positioned both to accelerate compliance with those expectations and to reduce legal expenses associated with the discovery process.
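In highly simplified form, the early-case culling described above amounts to filtering a document set down to items tied to a matter's key figures and topics. The sketch below assumes hypothetical field names and logic - it is an illustration of the general idea, not Recommind's actual approach, which layers on concept analysis, de-duplication and relevance ranking:

```python
def cull(documents, custodians, matter_terms):
    """Keep only documents that mention a key custodian AND a matter-related term."""
    flagged = []
    for doc in documents:
        text = doc["text"].lower()
        has_custodian = any(name.lower() in text for name in custodians)
        has_term = any(term.lower() in text for term in matter_terms)
        if has_custodian and has_term:
            flagged.append(doc["id"])
    return flagged

# Hypothetical document set for a contract dispute:
docs = [
    {"id": "email-001", "text": "Smith approved the Q3 contract revision."},
    {"id": "memo-002", "text": "Lunch menu for the holiday party."},
    {"id": "email-003", "text": "Jones asked about contract terms."},
]
print(cull(docs, ["Smith", "Jones"], ["contract"]))  # ['email-001', 'email-003']
```

Even this crude filter shows why the problem stresses search technology: the value lies in quickly shrinking a huge, unstructured corpus to the handful of documents that matter to the case.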

Certainly companies like Autonomy may be competitive in such situations, but when companies such as Recommind are focused more deeply on the needs of specific market sectors, they become, in effect, like subscription enterprise information services, delivering highly relevant content rapidly and reliably. There are, in truth, fairly few ways to attack search from a technology standpoint, so the most profitable victories in enterprise search and discovery technologies tend to go to the companies that have technology that is highly tuned to the very specific needs of a given market or client. That doesn't necessarily make one technology better than another in attacking those problems, but oftentimes only better tuned and one step ahead of other technology providers. So the fact that a company like Recommind is down in the depths of tuning their technologies to legal discovery and corporate compliance can offer them better margins for solving more focused, high-value enterprise problems - often the same kinds of problems that many enterprise publishers are trying to solve.

I do think that companies like Recommind that have done the heavy lifting on difficult enterprise search problems in specific sectors or problem sets can turn out to be double threats in enterprise content markets. Not only do they get to solve higher-value problems that are easier to measure for ROI, they also get to redefine market opportunities into other adjacent markets that may be difficult for others to attack. For example, when you look at the technology issues behind legal discovery, corporate compliance and more general high-value enterprise problems such as records management and knowledge management, there's a lot of overlap with a whole different range of technology services providers. On the other side of the spectrum, being able to categorize and organize content for the legal sector very effectively also begins to nibble at the opportunities for subscription enterprise services such as Thomson West and LexisNexis, which are also focusing more on semantic content organization but not necessarily with the deep technology focus of niche players such as Recommind.

Of course, the opposite forces of two-sided competition from large rivals can push back at niche-oriented technology players, but in general today's markets seem to be favoring specific solutions that make specific pains go away quickly in enterprises, with more general solutions with bigger tickets and fuzzier ROI being strung out on longer sales cycles. I don't think that we'll be seeing many new players like Recommind entering enterprise markets any time soon, but I do think that those that were able to get launched and cash-positive in the past few years are going to be tough competitors in the two-pronged fight for content and technology dominance in the enterprise. Individually they may not take up anything like a 14 percent share of search and discovery markets, but when you look at their ability to respond to the best revenue opportunities within those markets, you can pretty much forget about the pie as a whole and start looking for the plums inside the pie that matter most.