Monday, December 15, 2008

Net Neutrality Spin: WSJ's Take on Google's Caching Plans Draws Fire

Talk about a bad hair day for WSJ tech journalists.

When The Wall Street Journal ran an article today on a Google plan to add "edge caching" servers at key internet service provider facilities, it mangled this fairly common practice for accelerating the delivery of content to Web audiences into a political imbroglio. To wit, the article's lede:

The celebrated openness of the Internet -- network providers are not supposed to give preferential treatment to any traffic -- is quietly losing powerful defenders.

Google Inc. has approached major cable and phone companies that carry Internet traffic with a proposal to create a fast lane for its own content, according to documents reviewed by The Wall Street Journal. Google has traditionally been one of the loudest advocates of equal network access for all content providers.

Google was quick to correct the WSJ's take, as noted on its public policy blog and in a subsequent AFP story. Its point:

Despite the hyperbolic tone and confused claims in Monday's Journal story, I want to be perfectly clear about one thing: Google remains strongly committed to the principle of net neutrality, and we will continue to work with policymakers in the years ahead to keep the Internet free and open.

Intellectual property guru and Net Neutrality proponent Lawrence Lessig noted that the WSJ article's rendering of his take on Google and the political ramifications of this move was a bit off-key as well:

The article is an indirect effort to gin up a drama about a drama about an alleged shift in Obama's policies about network neutrality. What's the evidence for the shift? That Google allegedly is negotiating for faster service on some network pipes. And that "prominent Internet scholars, some of whom have advised President-elect Barack Obama on technology issues, have softened their views on the subject."

Who are these "Internet scholars"? Me. ...I've not seen anything during the Obama campaign or from the transition to indicate it has shifted its view about network neutrality at all.

With more moving pieces than a Swiss watch in Washington right now, the political environment surrounding Net Neutrality and other Web access issues is bound to be subject to as much jockeying and bullying as possible as power brokers change during the transition. Today the U.S. Federal Communications Commission canceled a vote on making radio frequencies available that would provide free Internet access as a public utility, bowing to pressure from both industry advocates and politicians. There's a big push for open Web access, but plenty of pressure from all points of view is keeping things comfortably in neutral for now.

Net Neutrality and related issues such as public Web wireless frequencies seem to boil down to one basic concept: Don't make audiences pay for artificial scarcity. Carriers are still free to sell "bigger pipes" and better overall service levels, but artificial cartels based on reserving audience-facing Internet bandwidth for private use will only create more challenges for publishers in the long run. If you want to have proof that this is so, just take a look at the balkanized state of mobile service carriers that lassoed content providers for many years into deals for distribution on their private networks. What publishers now confront are scattered and overpriced deals for growing but underperforming mobile markets, even as the carriers now reach for ad revenue shares to sweeten their take.

Proprietary mobile breakthroughs such as the iPhone and Amazon's Kindle are great for publishers in many ways, but they represent a relatively small share of the potential marketplace for mobile content and ultimately just perpetuate the myth that artificial network scarcity can benefit the publishing industry as a whole. All these devices do is lock publishers in to proprietary networks that are bound to make it harder to reach their audiences cost-effectively.

The truth is that the fastest-evolving, most cost-effective technologies serve publishers best, making it imperative to foster an environment in which mobile and Web technology providers are not resting on proprietary laurels that hinder the development of Web and mobile markets for publishers. Without these breakthroughs, the audience reach that content producers need to make mobile networks a highly profitable distribution medium is not likely to materialize. Let's keep the future of publishing out of the hands of companies that still can't tell us whether to dial "1", an area code or nothing extra to make a phone call to the next town.
Net Neutrality will ensure that there is a cost-effective, rapidly evolving electronic distribution infrastructure that serves publishers best.

Tuesday, December 9, 2008

Times Widgets Beta: Embedding News Content in Context

Widget distribution networks are becoming a popular vehicle for major content distributors to get their content in context in weblogs, personal Web pages, portals and other content outlets. The New York Times joins the list of self-service widget distributors today with the beta launch of its Times Widgets feature. Using a simple point-and-click online form, anyone can select NYT headlines from any of the paper's more than 10,000 topical RSS feeds and get code to insert into their favorite publishing software, or enjoy a one-click insert into iGoogle, Blogger, Vox or Netvibes. The net result is a display of recent headlines from one or more feeds, each with its own tabbed display. The popular Gigya widget distribution service provides the plumbing for Times Widgets, and the Times promises that more platforms will be added as instant-add options soon enough.

It's great positioning for the NYT's RSS feed content, which is popular enough with RSS enthusiasts but not necessarily getting referral links out to the pages of news enthusiasts as quickly as news organizations would like. The problem is a familiar one: even with a very simple format like RSS, only a small percentage of people are willing to do the minor heavy lifting to put an RSS feed into a useful place. Feeds are great, but the technologies to get them into useful places easily have been lagging. Widgets make it easy to manage feeds as part of a published page, ensuring not just the exposure of content but the ability to do more things with a widget payload over time.
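
For readers curious about what a headline widget is doing under the hood, here is a minimal Python sketch of the basic mechanics: pull an RSS feed and emit the handful of recent headlines that a widget payload would display. The feed URL is purely illustrative rather than an official Times endpoint, and the Times Widgets service (via Gigya) of course handles all of this - plus styling, tabs and embed code - for you.

    import feedparser  # third-party RSS/Atom parsing library (pip install feedparser)

    # Illustrative feed URL - substitute any topical RSS feed you want to display.
    FEED_URL = "http://www.example.com/services/rss/technology.xml"

    def recent_headlines(feed_url, limit=5):
        """Return (title, link) pairs for the most recent entries in a feed."""
        feed = feedparser.parse(feed_url)
        return [(entry.title, entry.link) for entry in feed.entries[:limit]]

    if __name__ == "__main__":
        # A real widget renders these as tabbed, linked headlines; here we just print them.
        for title, link in recent_headlines(FEED_URL):
            print(title, "->", link)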

It will also make it easier for the NYT to gather some data as to which widget users are worth approaching as advertising partners: there's nothing to say that money-making content cannot be in those widget payloads, after all. Moves like the Times Widgets beta are examples of how publishers can use widget distribution technologies to open doors both to referral links and to advertising partners that can add value to their brands far more cost-effectively than traditional business development efforts. Not a bad deal for just a little bit of development effort. Kudos, folks - the building may have to go, but with efforts like this there are good reasons to hope that mainstream news content can find its most valuable contexts more efficiently than ever.

Monday, December 8, 2008

Newspaper Apocalypse: What's the Next Right Step?

Good news about the newspaper industry has been an oxymoron at best in a sinking global economy, and today is no exception. TheStreet.com confirms the buzz that The New York Times is taking out a USD 225 million loan against its new office building off of Times Square, while the Wall Street Journal notes that Sam Zell's Tribune Co. is sniffing out options for a Chapter 11 bankruptcy restructuring. Quite a change of pace from last year's triumphal posturing, when new media headquarters and highly unrealistic revenue goals for private acquisitions were going to lead to new glories. 'T'ain't working, apparently, as print ad revenues continue to crater except for feature article sections that vie with magazines for more targeted interest groups. As was noted in a study from earlier this year, 37 percent of Americans go online for their news, while only 27 percent pick up a newspaper on any given day. Newspapers in the U.S. are now officially a legacy product, though they still represent the majority of ad revenues for most news organizations. The only large markets where newspapers are growing significantly are in nations such as India, where the penetration of the Web still lags behind the thirst for news.

While some well-diversified media companies are prepared for the long run of news' transition into a more electronic future, 2009 is shaping up to be the year in which the newspaper industry begins to face either massive restructuring or widespread collapse. Yet there is hope for traditional providers of news - if they can put their best efforts behind the most profitable opportunities. Here are a few thoughts as to where traditionally print-oriented news organizations must be headed in 2009 to build a more profitable future:
  • Get better than bloggers and search engines at aggregating news. Mainstream journalists are oftentimes still equipped with the personal networks that enable them to deliver breaking news effectively, but nobody trusts any single news organization as their source for news. Instead, many online news users are turning to bloggers, search engines and messaging services such as Twitter to aggregate breaking news on the topics that matter most to them. In other words, while referral links are highly valuable for people who bother to engage full-length news stories, the sites that provide those links are the "go-to" stops for a rapidly growing number of news hounds. Getting breaking news to appear more automatically in these other venues - and having revenue-producing ads and partnership "hooks" in that remote content - is a key factor in making the most of these aggregators. It also points to a lingering question: why aren't more mainstream news organizations aggregating more links from other sources in their own core news coverage? I would agree that automated aggregation services like Sphere are of limited value in this regard, but the source-agnostic form of editorial content aggregation favored by bloggers and outlets such as the Huffington Post and Newser appears to be enabling far more engagement for online audiences than "not invented here" news organizations that still insist that their own teams must create most every drop of news that they monetize.

  • Love print as a service, not as your brand. In the nineteenth century newspapers grew up in buildings that housed their editorial staffs, printing presses and loading docks - self-contained factories very much in the model of that era's mass manufacturing. In the twentieth century printing presses in many markets moved away to remote locations, but most still produced newsprint products for only one source of editorial content and ads. In an era in which news can be aggregated effectively by anyone, that model is no longer a cost-effective approach to print production. Print will continue to thrive as a reading format for some time, but it's far less likely that printing presses are going to be running news and ads from only one source. It's far more likely that new types of newspapers are going to be with us very shortly, ones which license news from today's newspaper staffs and other news sources and share revenues and links to online materials via Data Matrix codes and other print-to-online linking technologies (see the sketch after this list). Individual news organizations are not likely to invest in these new kinds of source-agnostic aggregation technologies fast enough to make a difference to their bottom lines, so suffering news organizations would be smart to band together to make such technologies happen sooner rather than later. Alternatively, the time for a "Google Newspapers" printing plant in major markets that aggregates content from many sources agnostically may have come at long last.

  • Enable community-generated news more effectively. Small-market newspapers and cable television news outlets have become fairly aggressive in embracing their audiences as sources of news and entertainment. Yet major newspaper chains in many markets are still struggling to get their arms around what it means to empower everyday people as news producers. Social media provides some of the most engaging content online today, yet many publishers still shy away from empowering local news gatherers who do not conform to traditional models of journalism. But many sources of community-generated content - sports scores, traffic reports, eyewitness news - are highly engaging and can be monetized easily. In an era of real-time broadcast news alerts from anyone on services such as Twitter, newspapers need to rethink the best way to engage a community that already knows how to publish to one another.
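
As a small illustration of the print-to-online linking idea raised in the second item above, here is a Python sketch that generates a scannable code pointing a print reader at a story's online edition. It uses the common qrcode library as a stand-in for the Data Matrix symbology mentioned there, and the article URL and tracking parameter are hypothetical.

    import qrcode  # third-party library (pip install "qrcode[pil]")

    # Hypothetical article URL with a source-tracking parameter so the print
    # partner and the originating newsroom could share credit for the referral.
    article_url = "http://example-newspaper.com/2008/12/08/local-story?src=print-a1"

    # Build the code image and save it for placement next to the printed story.
    img = qrcode.make(article_url)
    img.save("story_link_code.png")
    print("Wrote story_link_code.png - ready to drop alongside the printed article.")
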
There's no doubt that many news organizations are hitting the right buttons in deciding how to make money from news in the future, but the pace at which those decisions are being made has left a gaping chasm between the cost of sustaining their greatest revenue generator - print publishing - and the cost of investing more heavily in online publishing methods that will carry them forward to long-term profitability. As much as online is the answer, though, I think that it's time for publishers to take a far more radical approach to print as soon as possible. Print will survive and thrive - the only question is, in whose hands? The time to release the medium from the brand is at hand, and it can come none too soon for most news organizations' bottom lines.

Friday, November 7, 2008

SIIA Panel on 19 November in NYC - Cloud Computing and Content: Where Are the Best Opportunities?

I am looking forward to moderating a panel for the SIIA on the 19th that will focus on cloud computing and its impact on publishing. I am particularly pleased that we have a balance of publishers and technology companies that will be able to address the issue from both a media perspective and an enterprise perspective, an aspect that should be of particular interest to SIIA members. Marc Frons, CTO of The New York Times, Larry Schwartz, the President of Newstex, Charles Matheson of EMC and Matt Turner of Mark Logic will provide a multi-dimensional view of how important cloud computing will be to shaping the competitive landscape of the content industry. Please register soon for this event.

Below are the preliminary questions that I've assembled for our panel; if you have additional or alternative questions that you'd like to have asked, please add them to the comments of this post. See you on the 19th in NYC - or online via the webcast!

1. How does your company use cloud computing to provide better services for your clients/audiences? How do your clients/audiences benefit from it? What really is the cloud from your perspective?

2. The key advantages of cloud computing revolve around scalability, economy, ease of deployment, and ease of content and services integration. Which of these are offering you and your clients the most “bang for the buck?”

3. Why should enterprise and media oriented publishers care about cloud computing? What real advantages can it provide to them in the marketplace?

4. When we say "cloud computing," there are three basic types of networks that can support content in the cloud: enterprise networks, public networks, and clouds that combine both. Looking at how enterprises are using cloud computing to access content, how open are they today to using cloud computing to combine their internal and external content resources?

5. A cloud is only as good as its ability to have access to everything that ought to be in it. Where are we doing well and where are we falling short today in making seamless access to content in cloud computing a reality? How is cloud computing affecting the way in which people think about content aggregation?

6. Cloud computing offers many companies the ability to scale up new content services inside and outside the enterprise very rapidly. If this is so, then how does a company allocate its proprietary technology resources most effectively to compete with potential competitors that can take advantage of the same scalability? Does cloud computing enable more publishers and enterprises to scale up more cost-effectively to be mid-sized and even large competitors more rapidly?

7. Thinking of everything that we’ve discussed today, what would be your recommendations for the best ways for enterprise and media publishers to approach cloud computing?

Wednesday, November 5, 2008

Election Night Winner: Data and Visualization

As our nation looks at the election results this evening - and now this early morning - there are many statements about who won and why. There are many answers to these questions - certainly from a social media perspective, as I noted on Content Nation, the collaborative efforts of citizens and the use of easily embedded content helped to change the landscape of American politics - but for the major television networks it was clearly data and visualization tools that carried Election Night.

Election Night is the Super Bowl of politics, so it's not surprising that many of the high-tech content tools that make that sporting event enjoyable were present on the major television networks - and then some. Many Americans are already familiar with CNN's John King and his mastery of the "Magic Wall," the two-handed touch screen that enables him to analyze election data at any number of levels with remarkable ease and clarity and to activate embedded graphics and videos. It's a toy that nobody else really has, a coup that gives CNN a technology advantage that is oftentimes hard to find in broadcast media. Not to be outdone, MSNBC countered with an electoral map that hovered in mid-air and resembled a video action game display. Clearly a somewhat different generation was on this network's mind, less focused on data and more on the landscape of content.

CNN slammed back with an interview with video and musical artist will.i.am, who had produced popular election videos circulated on YouTube and other outlets. will.i.am was blue-screened from two angles at his remote location in Chicago and made to appear as if he were standing holographically in the CNN studios as he was interviewed by correspondent Wolf Blitzer. John King came in with a hovering "Virtual Capitol" display that allowed him to analyze the impact of House and Senate races on the balance of Congressional power. Take that, Super Bowl field overlays!

In addition to these on-air twists of technology were the many online maps, charts and data tables that were updating throughout the night with remarkable reliability. While the Internet was a little wobbly at times through the night, for the most part every major political Web site was easily accessed and provided oodles of data to pore through on national and local elections. The embedding of many of these graphical tools in social media outlets emphasized how much major media outlets are moving towards content with data and user interaction features as a way to build their brands in the places where audiences appreciate their content the most.

The real question is, though, why more publishers aren't producing such content on a more regular basis to bolster their brands. Clearly data and data visualization tools are providing content that really engages audiences and provides major opportunities for sponsorship and co-branding. Some outlets took advantage of these opportunities on election night, but more publishers need to think more proactively about how to develop content that brings people not just text but data and visualization capabilities that tell a compelling story anywhere that people want it. Perhaps this election night has been very revolutionary from a political standpoint, but the real revolution in enabling highly engaging content through data and visualization tools for mass audiences has only just begun.

Friday, October 31, 2008

Book Deal Googled: Out-of-Print Books Come Out from the Snippet-Hole

There now, that wasn't so hard, was it...?

Well, of course it took a long time, but at the end of the day most of the several years between Google's introduction of its book scanning program for out-of-copyright and out-of-print books and the recently announced USD 125 million settlement with the book industry has been a matter of the book publishing industry deciding to name a reasonable price that would sync up with the realities of book publishing in an electronic marketplace. Since the book industry was barely interested in e-books and print-on-demand a few years ago, it's understandable that the magic number was not readily at hand back then. But now that e-books are beginning to take off via Kindle and mobile phones through Amazon and other outlets, and print-on-demand publishing is beginning to look more attractive as a business model, the book industry has some real revenue and traffic data - and a marketing plan that will benefit from Google and other partners pushing its out-of-print wares.

In many ways this enables the book industry to monetize fringe content far more effectively via Google partners such as Amazon, in essence validating Chris Anderson's "long tail" theory for content, a theory sometimes discounted by book industry executives resistant to Google's scanning efforts. The settlement is really just a bulk licensing fee that makes it easier to administer long-tail revenues, not too different from the industry royalties paid by radio stations. This sets people up to buy books in print and on e-reading devices like Amazon's Kindle based on Google Books "broadcasts," just as sales of premium downloads and CDs are fed by online and broadcast radio play. With finding an audience for one's content the greatest challenge for all publishers, Google Books has become a powerful browsing engine that maximizes the value of any title, new or old, for the audience that is just right for it.

With the new agreement Google becomes a premium destination as well: you will be able to browse full pages of scanned books covered by the agreement instead of snippets, and opt to pay for full online rights to a book via Google Books - or purchase titles for your private online "bookshelf." On the surface this may look like a bad thing for Amazon and its proprietary Kindle strategy, and certainly Amazon would love for its gizmo to get as much momentum as possible. But as successful as the Kindle has been with many core book enthusiasts, it hasn't escaped Amazon's attention in all likelihood that the mobile market is exploding and that they are going to lose market share for books in general if they cannot get their inventory onto as many mobile devices as possible.

Enter Google's new Android operating system, which will be able to power any number of mobile and handheld devices - including, perhaps, Kindles. As Amazon's portal specialty is shopping support and fulfillment, in the long run Amazon is better off partnering with Google and other platform providers to make its inventory relevant in as many venues as possible. Amazon may also turn up a winner with the Google out-of-print deal for print-on-demand support. Already a growing number of titles at Amazon are produced on a print-on-demand basis anyway, so Google can help to power that capability as well.

So all in all this deal is likely to turn into a content industry love-fest over the next few years, a peace treaty that finally enables book publishers to leverage the vast power of Google's book scanning initiative, avoiding expensive or less powerful alternatives and enabling book marketers to accelerate their increasingly aggressive exploitation of online channels. I did say several years ago that this would happen eventually, but for now let's all just be glad that there are better times ahead for book publishers who are learning how to exploit electronic content markets far more effectively.

Wednesday, October 22, 2008

Plugging In the Content Cloud: Oracle Deploys MuseGlobal to Connect External Content to Enterprises

The announcement of Oracle's deal with content connector specialists MuseGlobal, Inc. to deploy their EverConnect technology for Oracle's Secure Enterprise Search platform may seem like a passing note in enterprise search at first glance, but it's worth more than a casual look if you're considering the future of high-value content services in enterprises. Oracle Secure Enterprise Search already comes equipped with a library of content source connector modules that make it possible for enterprises to integrate a wide variety of enterprise content sources into their search interface. Oracle is using MuseConnect, a platform-specific version of MuseGlobal's EverConnect content connector technology, to extend its search reach to include specific types of external content targeted at specific industry verticals, including Web and subscription sources for finance, legal, medical education and research.

Oracle is not alone in trying to integrate internal and external sources of content to bolster its value proposition for enterprise clients, of course. Many enterprise publishers already have infrastructure designed to integrate enterprise, Web and subscription content sources on their own publishing platforms, while other enterprise search vendors such as Google are also deploying content connectors for a wide variety of content sources to build up the value of their enterprise search engines. Not surprisingly, MuseGlobal technology figures in more than a few of these vendors' efforts, with each of them doing their utmost to define a useful aggregation of content that will add value to the daily workflows of enterprise workers. Content connector technology acts as the "glue" that makes such aggregation possible, widening the range of content sources available through a seamless interface and ensuring reliable access.
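
To make the "glue" role concrete, here is a minimal Python sketch of the connector pattern: each content source, internal or external, sits behind a common search interface so that a federated query can treat them interchangeably. The class and method names are my own illustration, not MuseGlobal's or Oracle's actual APIs.

    from abc import ABC, abstractmethod

    class ContentConnector(ABC):
        """Common interface that every content source adapter implements."""

        @abstractmethod
        def search(self, query):
            """Return result records as dicts with 'title' and 'source' keys."""

    class IntranetConnector(ContentConnector):
        """Stand-in for an internal enterprise repository."""
        def __init__(self, documents):
            self.documents = documents

        def search(self, query):
            return [{"title": d, "source": "intranet"}
                    for d in self.documents if query.lower() in d.lower()]

    class SubscriptionConnector(ContentConnector):
        """Stand-in for an external subscription news or research source."""
        def __init__(self, articles):
            self.articles = articles

        def search(self, query):
            return [{"title": a, "source": "subscription"}
                    for a in self.articles if query.lower() in a.lower()]

    def federated_search(query, connectors):
        """Fan a query out to every connector and merge the results."""
        results = []
        for connector in connectors:
            results.extend(connector.search(query))
        return results

    if __name__ == "__main__":
        connectors = [
            IntranetConnector(["Q3 credit risk memo", "Legal hold policy"]),
            SubscriptionConnector(["Credit markets weekly", "Medical research digest"]),
        ]
        for hit in federated_search("credit", connectors):
            print(hit)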

Content connectors are enabling a wider array of platform providers to create useful applications based on "content clouds," aggregating content from as many sources as possible while making access to any specific source a technical detail that is generally not a concern of the person using the platform. If history is any predictor of the future, these content cloud applications that combine enterprise and external sources of content are going to be powerful tools in the hands of organizations trying to make sense of large amounts of information on a day-to-day or moment-by-moment basis. Just as investment banks in the 1990s drove their profitability to new heights based on networked content source connectors that fueled powerful financial software to drive desktop and automated trading decisions more effectively, so will content clouds built for enterprise platforms enable a wide variety of 21st century organizations to become aware of threats and opportunities in their marketplaces and develop more powerful decision support services based on the widest range of quality content sources available.

So while you may think of content connectors as search engine technology, it's safe to say that their ability to connect powerful applications to a wide variety of content sources puts them in the middle of the "content clouds" that are likely to drive publishing and content technology profitability in many enterprises for years to come. Technology companies like Oracle, IBM, EMC and Google want to make sure that they can drive up their enterprise value propositions based on those clouds, of course, even as enterprise publishers try to do the same from their well-established position of creating insight from content sources. Certainly technology such as MuseGlobal's MuseConnect content connectors focused on content sources for specific industry verticals can help them to do that. In the meantime, though, the biggest winners in this wrestling match to deliver enterprise value may be the companies that can deliver the content clouds that clients want most effectively. That certainly was the case with trading room systems vendors in investment banking, so I don't expect it to be too much different as content clouds begin to become the focus of a wider range of enterprise publishing efforts. Keep your eyes on the content cloud experts, folks - and may the most seamless and flexible clouds win.

Monday, October 20, 2008

Fading AP Contracts: Old-School Distribution Struggles to Find a Market Model

Editor & Publisher notes along with many others the announcement by the Tribune Company that it has given its two-year notice to discontinue receiving content from the Associated Press. The E&P article cites the recent rate hikes from AP as a key factor in its decision, but other accounts also highlight concerns raised by other newspapers subscribing to the service regarding AP's cutting back on local coverage and its efforts to create a more competitive position for its own content through non-newspaper outlets that compete directly or indirectly with member outlets. Whatever the exact reasons in these instances, the pullout echoes sentiments surfacing in some of Shore's private research that indicates a growing dissatisfaction with AP as a source of content.

Although some of the growing rebellion against AP services no doubt is fired by cost, content and competition from the membership-driven service, there is another key factor that is driving newspapers to reconsider AP as a source of content: the marketplace. In local newspapers and media outlets there is a dwindling interest in national news as a revenue driver, as 24x7 online and broadcast sources diminish the need of local residents to turn to their hometown papers for this view of the world. There is more money to be had by many of these papers by building up deeper and more engaged local content and by building special interest sections for holidays and other event-driven interests that will attract local advertisers more effectively. Put simply, with dwindling budgets to cover world and national events many papers are making the choice to rally their limited resources around locally focused content and advertisers.

The other key factor in the challenge to AP, though, is that there is an increasing reservoir of options for media outlets that want high-quality editorial to insert into their publications. Link exchanges, content swaps and other cooperative online publishing options enable the online editions of local papers to insert content from other newspapers and media outlets into their own sites and to host their own content elsewhere at partner sites. In other words, when revenue isn't all about what happens on your own Web site but also about driving more traffic to inventory from relationships with online publishing partners there are more options for local publishers to drive up both page inventory and audience engagement. AP delivers content inventory, but not the kind of inventory that's most likely to engage the audiences that value a local newspaper brand in a way that will drive the highest revenues.

While some newspapers seem to question the refocusing of AP's content on more analysis and opinion pieces as an additional point of concern, in general the real issue for most publishers confronting their rising AP charges is that as good as AP news can be it's not what will drive their profits moving forward in most instances. While AP has spent a great deal of legal and marketing effort to shore up the value of the AP brand through copyright protection and brand positioning, it has in many ways failed to identify how a cooperative news distribution service can help its members to generate more revenues cost-effectively. With their members cutting their own collaborative content deals left and right, oftentimes with providers of unique online sources of content, the power of the Web to make these deals work without AP's infrastructure is the chief challenge to AP's future.

All of this argues for a selloff of AP in the next couple of years to an owner that can take advantage of its extensive network of reporters and stringers to package its core assets more effectively to a broader base of clients beyond dwindling newspaper properties. News Corp would be the most likely taker, in part because of Rupert Murdoch's designs already in place to provide better global marketing for Dow Jones resources (already aligned closely with AP in financial markets), though others such as Google continue to be bandied about. The missed opportunity in this, of course, is the opportunity to redefine AP as a new kind of distribution channel for high-quality content based on a new generation of news producers and to enable it to include a cross-platform network of news enthusiasts who will add value to its brand based on their enthusiasm for commenting on news content. If everyone wants to do content swaps and link exchanges, for example, why isn't AP positioned as a channel designed to make that easier?

On this note I think that one of the great missed opportunities for AP has been its failure to adopt a strategy for embracing social media more effectively. While an acquisition of a player such as Newsvine would not have stanched the bleeding based on its core asset issues it would have at least started to position AP as a service that could center communities around key news assets. If audience engagement is the key to online publishing profits, catchy headlines and great ledes are not necessarily going to help your members as much as giving people a good reason to stay on a page - something that good social media can help to do very effectively.

AP's pricing will help to define for its members what AP needs to do to cover costs for existing editorial operations, but that's little more than an opening argument when AP members are looking for concluding remarks as to how AP will help them to drive revenues more effectively. It's probably best at this time for AP to seek a parent aggressively that will help them to maintain their core editorial assets while enabling them to invest in a broader array of content assets and services that will bolster their value over time. By all indications current AP members will not be the ones to sponsor that investment, so it's most definitely time to go find buyers to make those investments while there's still a good opportunity to do so.

Wednesday, October 1, 2008

Cloudy Forecast: Microsoft Ups the Ante for Publishers in Cloud Computing

I am going to be moderating a panel on the opportunities for publishing in cloud computing on November 19th - more to come on that - so needless to say my head is in the cloud (computing) to some degree already. But when Microsoft announces a major initiative to adapt its Windows operating system for cloud computing on Amazon's Web services platform, you know that the balance of power is shifting away from enterprise servers faster than you might think. This is great news for network services providers and potentially good news for Microsoft, whose desktop Windows operating system is becoming ever more ponderous and is being readied for a crash diet. The bottom line from a technology perspective is that we're returning to the days of complex technology living "out there" in the network, with user-facing technology oriented away from general computing and towards serving up content from network services.

The move towards cloud computing may seem rather "back to the future" in some ways for those of us who lived through the days of mainframe computing and (really) dumb terminals, but when did it really make sense for companies to have thousands of dollars of over-complex content and software on people's desks in the first place? The network is the natural place for most content services to live, making it far easier for peers to communicate and collaborate with one another as publishers and to benefit from sophisticated services with a minimum of in-your-face technology hassles. This is no surprise to publishers that are succeeding with the move to online digital publishing services, but it does pose an issue for content and technology companies that had been focused on enterprise sales.

In recent years much of the "value-add" component for sophisticated enterprise content services and the technologies that support them has revolved around tailored software and information services based on integration with enterprise I.T. platforms. The early enterprise entrants in cloud computing such as Salesforce.com's network-based services have strong participation from many enterprises, but the big push for margins has positioned many enterprise content providers towards strategic sales that involve I.T. teams in major companies. Cloud sales were an investment in the future, to be sure, but present revenues were oftentimes focused behind the firewalls of enterprise publishing clients.

Clearly the rapid acceleration of enterprise-oriented I.T. services towards network services available via highly scalable Web infrastructure is going to put more and more pressure on this line of marketing for high-end enterprise publishers. Web services, which enable publishers to integrate their content easily and rapidly with other content via standardized programming techniques, are flourishing in cloud computing environments, enabling user-defined "mix and match" content services integrated into a wide variety of platforms and productivity tools. This is good news for publishers who want to get their content up and running as quickly and as easily as possible in enterprise-oriented applications - but bad news for publishers who wanted to sell people on the idea that doing so was really expensive and hard.
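
As a small sketch of the "mix and match" idea, the snippet below uses only Python's standard library to pull two JSON feeds over HTTP and merge them into a single list that a downstream page or productivity tool could display. The endpoint URLs are placeholders, not real publisher services.

    import json
    from urllib.request import urlopen

    # Placeholder endpoints standing in for two different publishers' Web services.
    ENDPOINTS = [
        "http://api.example-publisher-a.com/headlines.json",
        "http://api.example-publisher-b.com/research-notes.json",
    ]

    def fetch_items(url):
        """Fetch a JSON array of content items from a Web service endpoint."""
        with urlopen(url, timeout=10) as response:
            return json.load(response)

    def combined_feed(endpoints):
        """Merge items from several services into one list, tagged by origin."""
        merged = []
        for url in endpoints:
            for item in fetch_items(url):
                item["origin"] = url
                merged.append(item)
        return merged

    if __name__ == "__main__":
        for item in combined_feed(ENDPOINTS):
            print(item)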

The good news for enterprise publishers is that cloud computing is likely to spawn a widening breed of tailored content applications that can be deployed more rapidly and efficiently. Long and risky product development cycles for advanced publishers are likely to give way to general frameworks for cloud-enabled content applications, with easily tailored core functions that can be changed to meet individual client needs more rapidly. In the process of doing so, many major aggregators may begin to look at what their real core strengths need to be, leaving some likely to look further and further afield for just the right content sources to aggregate as needed for specific client applications. Instead of focusing on database curation, it's more likely that tomorrow's major enterprise publishers will be focused on Web services curation, becoming experts in assembling just the right content from any number of databases and Web sources to meet their clients' needs.

While in many instances existing staff skill sets will be transferable to the cloud computing environment, I expect that more than a few of the major publishers are ill prepared for the cultural leaps required to survive and to thrive as content services experts in cloud computing. We're all familiar with the reorganizations that have been the focus at major enterprise publishers such as LexisNexis, aimed at blasting away very I.T.-centric product development cultures in favor of more client-centric cultures. What happens when the Web services-centric model of cloud computing impels these companies to accelerate that culture change for their core revenue lines that much more quickly? There are great opportunities for major publishers in the shift to network-oriented enterprise services, but I suspect that more than one five-year plan may be floating out of H.Q. office windows shortly as the depth of the impact of cloud computing services on the enterprise content industry becomes more clear to them.

Tuesday, September 9, 2008

Google Goes for Newspaper Archives: Elbowing in on Pay-Per-View and Subscription Database Models?

AP notes, along with others, the announcement that Google plans to extend its print archives scanning program to include the print archives of any newspaper that would like to participate. This new effort builds upon Google's existing scanning efforts to capture books and other materials in the archives of major libraries. Early participants in the newspaper scanning program include the Quebec Chronicle-Telegraph, the Pittsburgh Post-Gazette and the St. Petersburg Times in Florida. Regional newspapers are struggling to find sources of revenue for their print assets that will offset plummeting print ad income, so the prospect of exposing their archives for a share of Google's ad revenues and benefiting from referral links to their subscription signup pages is found money for assets that are otherwise sound asleep in most library collections.

Unlike previous arrangements for newspaper archives, which were based on access to subscription or pay-per-view databases or limited access to "snippets" of copyrighted content, the newspaper scanning program's direct parallels with the Google Books program mean that people will be able to benefit both from the literal image of a newspaper as it existed at the time and from text-based searching of those news sources. The differences in approach are clear and somewhat startling when you compare the scan-based approach to the alternatives. A Google News search for "Man Walks on Moon" in the 1969 archives, for example, yields dozens of pay-per-view articles on the topic, but eventually one can look at an ad-supported article from the Pittsburgh Post-Gazette that captures not only the words but also the flavor of graphics, editorial cartoons and other features that were of importance in the era of the early space program, with key search terms highlighted in the scanned text image.

For larger media organizations this approach may not be as appealing as waiting for the "big fish" of pay-per-view and subscription database revenues, but for regional and local newspapers it is likely a very attractive alternative to microfiche collections, which are expensive to create and generate relatively low-volume, one-time sales, versus the evergreen potential for revenues from online scanned archives. This alternative to microfiche and subscription databases also puts pressure on suppliers such as ProQuest and Cengage to justify the breadth of their archives as a key selling point. Ad revenue sharing will not be the answer for every publisher's need to monetize archives, but it appears that Google has found another way to add value to hard-to-find content sources, one that challenges publishers to think more creatively about how they intend to add value to the delivery of their archived content.

Monday, September 8, 2008

Life with Kindle: A Page-Turning Device That Can Satisfy. For Now.

Amongst other things that I was checking out during my book-writing sabbatical was Amazon's Kindle portable reading device, courtesy of the Westport Public Library's lending desk. I checked out the unit for a few days, which actually turned into about two weeks due to a bad cold that caught me unexpectedly, but it was long enough to appreciate the ins and outs of this increasingly popular device.

A Kindle starts up easily enough by sliding a slim switch on the back of the unit, though its placement next to a switch that activates the unit's wireless networking capabilities makes this a bit awkward to do by habit. You have to flip the Kindle unit over to make sure that you're hitting the right switch most of the time. There are a lot of little ergonomic issues like this in the Kindle, ideas that look good in the design phase but perhaps could have been better thought out for real life. The keyboard of a Kindle falls into that category also, being barely usable for hunting and pecking, with a slippery and ambiguous feel that makes it unthinkable to use for more than a few must-do tasks.

Overall, though, many of the key features are remarkably easy to use. The unit boots up quickly and its basic page turning functions are remarkably intuitive, with large broad keys on each side of the unit for turning forward and backwards. A Kindle will boot up to where you were last looking at content, so it's not always necessary to bookmark where you were last reading - same when you return to a specific book. There is a small scroll wheel at the bottom of a thin channel that parallels the main screen: scroll the wheel and a kind-of cursor will move up and down next to the screen and allow you to select from pop-up menus or to click on links. I thought that this would be a really inconvenient interface but you get used to it fairly easily. I can see how its steadiness will be useful in bumpy environments like subway trains. So for basic functions and navigation control you can give it a "weird but usable" rating for the most part.

The eInk display was somewhat disappointing in that the background was grayish rather than whitish, which made many illustrations almost impossible to make out clearly and made it a little more difficult to use in dim light. But in spite of this the display was remarkably readable for text - especially when the font size was bumped up a bit. Whew - for those of us who rely on reading glasses or progressive lenses, this is a blessing. There are plenty of great books that I'd love to pore through that have bitsy little print that wears my eyes out very quickly. With a Kindle you don't get print fatigue or the fatigue of looking at a backlit screen. With bumped-up font sizes there's not that much information on any given page, but the ease of turning to a new page of content makes up for that mild inconvenience easily. I found that I really enjoyed reading materials on the Kindle once I got settled in for a good sit-down.

The early Kindle models now available do provide Web access, but except for a handful of Web sites well adapted to the unit it's largely an exercise in fumbling through awkwardly formatted content - and also a feature that led to the unit freezing twice. A push of a bent paper clip into the unit's reset hole got it back to good order, but this is not a unit meant to replace mobile devices with more robust Web browsing capabilities. Still, for a quick sneak peek at the headlines, it beats going back to the PC sometimes. The wireless service was quite good at my home, so chances are it will perform reasonably well with its network connectivity turned on wherever broadband services perform well. However, leaving the wireless connection on does drain the batteries far more quickly than normal local-only reading would. In reading-only mode the Kindle batteries last for many days of typical use.

It's certainly a unit that I would consider as a convenience for future book purchases, especially given Amazon's pricing that enables one to purchase both a printed book and a Kindle-compatible copy for one purchase price, or to get a Kindle-only copy at an even steeper discount. But what of gift books - or, for that matter, the huge library of printed books already at my disposal? The huge gap in Kindle's market strategy is a lack of "hooks" to keep people attached to their existing libraries and to let them move on from books once their usefulness has run its course. There's no real concept of a "used" market for Kindle books, much less the ability to add significant value to them in a way that could be passed on to others.

More importantly there is little ability to use a Kindle book to activate online content. For example, if I am reading a passage and would like to research a specific person or historical event mentioned in the book, there are no "hooks" to online content that would make that easy - nor any way to store that research with my Kindle book copy for future reference. It's still a fairly unimaginative approach to book marketing. This may reflect the generally conservative approach to book packaging and marketing that still grips many publishing houses, but this conservatism now competes with a demographic curve that is racing against the clock.

Like the music industry, print publishers have locked their future into proprietary technologies to protect existing business models, but in the process of doing so they may have sold away their futures. With an explosion of different kinds of portable devices reaching the marketplace today and the promise of an even more complex array of devices fitting people's lifestyles in the future, why on earth would an entire industry select a proprietary platform to develop its future revenues? In a few years I believe that we will look at experiments such as the iPod and the Kindle much as people today look back on proprietary electronic content services such as CompuServe or the original AOL and ask themselves, what were we thinking?

The future of book publishing will rest on more open publishing platforms that enable book content to move far more effectively to the contexts and popular devices in which it's valued most, and that enable others to add value around a given book independent of its initial publisher. Book publishers are already more aware that their best strengths lie in talent management, providing services that leverage as many aspects of an author's value as rapidly and as effectively as possible through the lifecycle of a given work of authorship. But expect more nimble companies that see how to manage talented authors more effectively across a variety of publishing media to challenge traditional publishing houses over the next few years, especially those that are best able to leverage social media outlets to build and maintain loyal communities of readers and commenters. The Kindle is a nifty little device, but it's just a hint of where the future of book publishing could take us in the not too distant future.

Tuesday, July 22, 2008

Back in Book-Writing Mode...

Hi, sorry that the blog's been a bit slow, back in book-writing mode, will have more soon.

Friday, July 11, 2008

When Blogs Become Big Media: ContentNext Purchased by Guardian for $30 Mil

On a personal note, I couldn't be happier that after pioneering serious blog journalism on the business of media, Rafat Ali's ContentNext is being acquired by the Guardian Media Group for north of USD 30 million, according to Dow Jones' All Things Digital blog (and of course it's very appropriate that Dow Jones returns the favor to Rafat after his many scoops on them). Rafat's worked very hard for this moment and had the wisdom to assemble a great team to help him make it happen. The story seems to be that ContentNext will remain an independent product, with the Guardian helping to provide growth and outlets for its content and events for professionals.

That's the key point to remember about this acquisition: it's not really a "blog acquisition." Rafat Ali started with a relatively simple blog structure, but he focused early on broadening the mission of the publication as a source of serious trade journalism and on attracting a global clientele of serious media business people to his content and his events - making it far more like a traditional B2B trade publication with a strong events component than a mere blog. That's hardly a bad thing, but from an acquisitions standpoint this is really about a traditional publishing group broadening its portfolio with a B2B play that just happens to have started life using blogging technology. ContentNext puts together great events, has great industry reporting and was smart enough to do it all from day one using Web-based publishing and marketing methods. In fact there's no reason why a publication stable like ContentNext couldn't add a print component and be entirely successful - though that's not likely any time soon.

The real question isn't why ContentNext got a fairly healthy multiple for its operations but rather why more B2B publications don't look more aggressively at acquiring born-on-the-Web publications that can help them to trim down to similarly responsive and profitable proportions. The worst enemy that B2B trade journalism has is the legacy of born-in-print executives who are trying to find a place to employ their dated skill sets in the digital age - and dragging down the long-term profitability of B2B media in the process. With a core publication family like ContentNext under its belt, the Guardian Media Group has a publishing team that's successful in its own right but which can also provide a blueprint for managing B2B media successfully in other market sectors for years to come. Getting a team that already does it the right way may in many instances be a better option, giving existing internal Web operations a model to follow.

In the meantime, congratulations to Rafat and to all of the great people at ContentNext - enjoy every moment of it.

Friday, July 4, 2008

ShoreViews Video for 4 July 2008

Currently I am working on Chapter 6 of Content Nation, which focuses on the impact of social media on politics. It seems only appropriate to be doing this on our nation's Independence Day. Below I share with you a video that celebrates how content was such an important part of the story of that fabled day in 1776. For those of you celebrating today, have a great day!

Monday, June 30, 2008

The Payoff: LinkedIn Focuses on Monetization through Ads and Targeted Research

LinkedIn's growing success is both admired and feared by many in the content business, but the rap against them for quite some time has been, "Well, yeah, but where's the monetization?" In truth LinkedIn has been growing revenues steadily through traditional brand ads, partnerships and payments for premium services. But with two key moves LinkedIn is raising the bar on its prospects for revenues - and for a potential exit at a more appreciable price.

The first LinkedIn initiative is its new DirectAds service, which enables LinkedIn members with profiles to produce simple text ads on a self-service basis that can appear in other members' profile pages. Similar in overall concept to Facebook's SocialAds program - a link to the advertiser's profile appears in each ad to ensure that marketing is on a conversational basis with a known entity - DirectAds has the added benefit of being able to target executive peers in the LinkedIn network with a great deal of granularity, and charges healthy but affordable minimum rates to do so: a $25 minimum for a flight of ads, with impressions based on a variable formula. Filtering options include many of the criteria found in a typical member's profile, including the ability to limit ads to specific geographic regions.

The potential for DirectAds is very strong within LinkedIn itself, but it also has the potential to give B2B publishers some real concerns as it evolves. Though there is no announced plan to take DirectAds off-site into other publishing venues, certainly classifieds in B2B journals and Web sites could be easily targeted by LinkedIn with its extensive network of top-shelf executives and salespeople. More importantly, it's not too hard to imagine that a B2B publisher seeking revenues from companies trying to get a message through to very specific executives would jump at the chance to use DirectAds, with its very targeted profiling capabilities, to get rates far higher than classifieds. In very tightly knit B2B communities DirectAds would play very well in B2B publishing venues. Technologically, it would not be hard to implement at all - it would only take enabling a B2B publishing site with Google's OpenSocial API. With such a combination DirectAds would have a Google AdWords/AdSense-style revenue combo for on-site/off-site revenues that could be impressive indeed. If done properly - hopefully avoiding Facebook's pratfall with its Beacon program, which released private data in a user-unfriendly manner - this has the potential to be to B2B publishing what Google was to consumer publishing, turning advertising into relationship building with one click of the mouse. With its potential for ultra-precise targeting, it could put somewhat of a dent in marketing list services as well in time.

The other interesting new program at LinkedIn is the LinkedIn Research Network, which leverages some of the concepts employed in LinkedIn Answers to provide a tool that enables executives to conduct peer-to-peer industry research. As in LinkedIn Answers, members of LinkedIn can pose questions to peers in the LinkedIn network, using LinkedIn's extensive structured and unstructured member profile data to zero in on just the right people to target for questions. The Research Network provides its users with a workbench to monitor responses to questions and, in effect, to build a research panel who can be contacted for additional questions.

The revenue hook in the LinkedIn Research Network is its use of LinkedIn's private InMail network to contact members. Members may use InMail to contact up to 20 people at a time, presumably both to cut down on "spam" research requests and to make it easier to meter the pricing to a reasonable minimum block of requests. Of course, one can sign up for InMail at any number of premium levels, so the real hook is to promote InMail premium subscription revenues as much as possible. Given that the demo video emphasized that this product is targeted primarily at financial industry analysts trying to contact experts in companies and market sectors, perhaps their initial expectations for its use are limited. But clearly its ability to combine the art of research with the art of marketing will make this a popular option for many over time.

With both of these options LinkedIn is taking a relatively low-key approach to product development, moving relatively slowly to ensure that their most valuable asset - the trust and security that the LinkedIn system of opt-in relationships has protected through its development - will not be tainted or abused. Executives are a conservative bunch when it comes to dealing with their personal reputations, but LinkedIn has proved to more than 20 million professionals so far that it is by and large a very trustworthy environment. With that trust as a primary asset, it's likely that LinkedIn has set the stage for some solid revenue development that could upend a few B2B applecarts in the long run. For the time being, though, LinkedIn is just at the beginning of what promises to be a long battle for the rights to what professionals value most in carrying out their business - trusted relationships that can yield revenues.

Thursday, June 26, 2008

Cruising the Mall: Microsoft Scoops up Powerset for Universal Search

It seems like only a few weeks ago that I was blogging about semantic search startup Powerset's soft-launch beta. In fact, it WAS only six weeks ago that we were covering Powerset's soft launch of new semantic search technology. But in those six weeks Barney Pell's crew got in a ton of good PR and a few meetings that have already resulted in a USD 100 million exit into the hands of Microsoft, according to VentureBeat. It wasn't so many years ago that Barney was a part of the bumpy exit of WhizBang Labs and its Web mining technologies. This time around his team was well ahead of the burn rate and blessed with both a good idea and good timing. With tons of cash on hand from the war chest assembled for a Yahoo acquisition, Microsoft was ready to spend large (or, for it, small) at the deals mall to pump up its search business in pursuit of more advertising revenues.

Given Powerset's ability to parse natural language questions as well as to provide "factz" topic clusters that can draw in related content, the target for Microsoft has to be the revived Ask.com portal as much as Google's leading search engine. Already Microsoft's Live.com search engine provides rich search results that emulate Ask's more user-friendly approach to search-driven content aggregation, but Ask still manages more meaningful responses to natural language queries. Better front-end parsing and clustering of results from Powerset's technologies would certainly help Live to deliver more relevant and rich results that could help to build a larger audience, though how Powerset's technology will fare in absorbing Web content lacking the encyclopedic style of its trial Wikipedia content remains to be seen. On most test queries using natural language questions, Google is at least as relevant in its results as the other major search engines, so even with new semantic technology Microsoft has its work cut out for it.

A better match for Powerset might be found on the enterprise side of Microsoft's offerings, where its recently acquired FAST enterprise search technology may benefit from some extra semantic search and clustering mojo - and find somewhat more structured content sources against which to apply semantic algorithms. That's not to say that Powerset won't succeed with open Web content, but in general semantic search technologies are most easily tuned when they're digesting documents with relatively similar styles. It would seem that this would be easier to tune to an individual enterprise's needs overall than to a world of Web content that could be in any shape at any time.

A better question might be why Microsoft hasn't considered purchasing Answers.com if it is so interested in natural language queries. With millions of pre-formed questions already in its WikiAnswers database, many natural language questions map very neatly to its answer sets. In other words, sometimes the best answer to a full-sentence question is a person who understood the question in all of its semantic details and has already provided the answer. This is far from a goof-proof solution to semantic search, but it's an approach worth considering as a valuable supplement to semantic document parsing.

In any event the Powerset set now finds itself in the enviable position of having sold their ship before it ever slid down the launching ramp into the water. That's certainly more than a few publishing portals can say these days. Congratulations to Barney and all of the other rocket scientists at Powerset - it pays to have a technology that solves a problem that companies with deep pockets are eager to get their hands on.

Tuesday, June 24, 2008

NYSE Goes "Real-Time" to the Web: Accepting the New Buy-Side Reality

The New York Stock Exchange has been careful through the years to keep feeds of trading data released to public media outlets hobbled with a fifteen-minute delay - in part to protect its revenues from financial institutions that pay for real-time data, and in part to offer its member firms an information advantage that would give them an upper hand over retail investors. But with most of NYSE's competitors being far more lax about releasing real-time trade reports, and with the definition of "real-time" having changed thanks to powerful new low-latency trading systems for professional traders, NYSE has re-evaluated its position on real-time trade reports for the public. Today NYSE Euronext launched its "Realtime Stock Prices" product for media, allowing unlimited distribution of real-time quotes to the public without tracking individual use. The product requires a distributor to pay an undisclosed bulk fee for the rights to public data distribution.
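
For readers who haven't worked with market data feeds, here is a minimal sketch of what the fifteen-minute hobble means mechanically: a distributor sits between the exchange feed and the public site, and either holds each trade report back until it is fifteen minutes stale or, under a bulk-licensed real-time arrangement like the new product, passes it straight through. The Trade shape and publishToSite function are illustrative stand-ins I've invented for the example, not any exchange's or vendor's actual API.

```typescript
// Illustrative sketch: delayed vs. real-time publication of trade reports.
// "Trade" and "publishToSite" are hypothetical stand-ins, not a real feed API.
interface Trade {
  symbol: string;
  price: number;
  size: number;
  timestamp: number; // milliseconds since epoch, as stamped by the exchange
}

const FIFTEEN_MINUTES_MS = 15 * 60 * 1000;

function publishToSite(trade: Trade): void {
  console.log(`${trade.symbol} ${trade.size} @ ${trade.price}`);
}

// Old regime: hold every report until it is at least fifteen minutes stale.
function publishDelayed(trade: Trade, now: () => number = Date.now): void {
  const wait = Math.max(0, trade.timestamp + FIFTEEN_MINUTES_MS - now());
  setTimeout(() => publishToSite(trade), wait);
}

// New regime: a bulk-licensed distributor pushes the same report immediately,
// with no per-user tracking required.
function publishRealtime(trade: Trade): void {
  publishToSite(trade);
}
```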

With NYSE's share of securities trading slipping, and its reputation as a market friendly to small investors slipping along with it, real-time quotes for the public should have been the default position years ago, as we've argued oftentimes in ContentBlogger. Today's retail investors have more options than ever for making money in the markets, with NYSE's stumbling "blue chip" stocks being far from the most attractive alternatives for many. Forcing people to pay for real-time trade reports was only discouraging further participation in NYSE equities markets by retail investors - especially when other exchanges seeking market share were more than glad to use market data as a lure for new traders.

CNBC has long been a leader in public market data - I led the development and installation of their first delayed data system years ago for Quotron - so it's no surprise that they've opted to be on the leading edge of NYSE's release of this product. The other announced client - Google - was also expected, but the timing couldn't have been worse for Yahoo. Real-time quotes from NYSE have been available from Yahoo at a premium for many years, so at a time when Yahoo has been trying to look plump to acquirers it's not surprising that it didn't opt to give up its NYSE quote revenues ("back door" real-time quotes from private electronic markets on Yahoo aren't strongly representative of the full market). So by default the go-ahead went to Google, whose Google Finance portal has become a very strong content offering. If nothing else, the public knowledge that full NYSE real-time quotes are available at Google will provide some needed publicity for Google Finance at a time when Yahoo is slow to give up existing revenues.

I would hardly be alone in chastising NYSE for dragging its heels on releasing real-time quotes to the public, but it's sad that it has taken this long for NYSE to make this move. It is, unfortunately, a familiar refrain in the content industry: a major institution covets proprietary content revenues and squeezes them out for as long as possible while the markets move to find both acceptable substitutes and better ways of doing business. Publishing is in essence a very conservative business, so it's not surprising that NYSE would try to keep this formula going for so long. But in an era when the buyers of securities have - and demand - information at least as good as most selling institutions', failing to serve the buy side in financial markets effectively is to ignore the fundamental shift in the content industry that empowers people with independent access to content from around the world. Your content may seem safe as a proprietary asset, but if it's not driving your clients' profits in its most valuable user-defined contexts, it is far from a safe bet in today's content markets.

Nokia's Move to Open Mobile Software: Aimed at Google, But Impacting Publishers

The mobile phone world was a-twitter with word that Nokia has purchased mobile software maker Symbian and will make the core of its software an open source resource later this year via a new Symbian Foundation, with other open-source assets to follow. Engadget notes that the Symbian Foundation will include many of the mobile industry's biggest names and will include technology donated by both Nokia and many others, including Motorola, Sony Ericsson and NTT DoCoMo. Other members will include Texas Instruments, Vodafone, Samsung, LG, and, interestingly, AT&T, which has had great success of late with the proprietary Apple iPhone platform.

Clearly the impending launch of Google's open source Android mobile platform - delayed until the fall but looming nevertheless - has forced the hand of mobile equipment providers and network operators to consider the potential impact of having their highly proprietary approaches to mobile technologies "googled" away by the demand for more common software standards to power more content services development. By creating a common core of technologies based on a company with which it has had a long-standing relationship, Nokia gets to expand the value of its knowledge of the platform in a way that may transform its business model over time from manufacturing to enabling systems development. Given the demand for mobile services in developing nations, this will enable companies like Nokia to have a hand in those markets without having to bear the full cost of either hardware or software development, thanks to the Foundation's partner network.

But more importantly for the content industry, this puts at least as much pressure on Microsoft, Palm, Apple and Research in Motion to recognize that proprietary operating system solutions face ever more pressure to justify themselves. With high-speed wireless broadband network services opening up the Web to mobile devices, the ability to deliver platform-specific content services will become icing on the cake for those who want new status toys; but for the bread-and-butter corporate worker, mobile entrepreneur or family member, it may take more than a few proprietary services and a delightful interface to keep people locked into a proprietary platform. For content suppliers looking for new "choke points" via proprietary platforms, the short-term news from suppliers like Microsoft, RIM and Apple looks good, but the picture over the horizon is likely to look vastly different in less than a year. Be it via the Symbian Foundation or Android platforms, publishers need to stop looking again and again for new ways to activate old business models via mobile platforms and look far more aggressively at how they will survive and thrive in a world enabled with open and universal access to Web-enabled content sources.

Monday, June 23, 2008

In the Battle Between Thomson Reuters and Bloomberg, the Winner May Be the Cloud

A day that highlights world financial giant Citigroup's layoff of about ten percent of its workforce is a somewhat odd time to be running a profile of Thomson Reuters, but The New York Times has done just that. The article is entitled "The New Fight for Financial News," but of course the battle between Bloomberg and its perennial rivals, now combined into a single company, is fought on many levels well beyond the news front. Thomson Reuters CEO Tom Glocer likens Bloomberg in the article to Richard Branson's Virgin Atlantic airline shaking up the marketplace for transatlantic flights in the 1990s, an apt analogy on at least two levels. It's apt in the sense that Bloomberg forced its competition into many radical and painful changes to keep up with its growing market share - the new combined Thomson Reuters entity is just about toe-to-toe with Bloomberg for its piece of the financial information marketplace - but also apt in the sense that there's a new generation of competition that's putting both the financial information marketplace and the airlines on alert.

That new generation is not necessarily of the same type and heritage as either Thomson Reuters or Bloomberg. What impressed me most at the recent SIFMA conference and expo in New York was how the traditional financial information vendors are receding into the background as the technologists are coming to the fore. The exhibition floors were chockablock with networking technologies this year, both for low-latency automated trading services and for more general information and trade execution network services from vendors such as BT Radianz. Cloud computing was also on display at the SIFMA show from Salesforce.com, with a more aggressive and extensive display of its capabilities to support brokerage marketing operations. Also noteworthy was SDS Financial Technologies' moves to support more automated crossing networks for commodities and futures trading, helping to reduce execution costs and liquidity problems for a marketplace still tied to many face-to-face trading pits.

So while companies like Thomson Reuters and Bloomberg are going to continue to try to dominate the desktops of investment bankers and portfolio managers for the foreseeable future, a lot of the action in financial information is taking place well away from the desktop, in the bowels of the computer networks that support securities trading and sales. Not all of these stories are about the dominance of the Web as the cloud of choice - the financial marketplace has many specialized networks that support its sophisticated information-driven marketplaces - but the concept of cloud computing popularized by the Web, in which desktop technology is just an interface to sophisticated services from potentially any network providing information and execution services, is clearly taking hold. Certainly the robust trading floor technologies developed in the past few decades will continue to be a part of this mix, but today's cutbacks by Citigroup serve as a reminder that we may be nearing the end of the era of big investment bank trading floors as the driver for measuring the success of financial information services.

With more and more workflows in securities trading having become fully automated in recent years, it's not clear that the desktop-oriented services of companies like Thomson Reuters and Bloomberg are going to deliver high-growth information services in the long run. Instead, it's far more likely that network-oriented "cloud computing" services are going to subsume more and more of the profitable parts of securities transaction support, while information suppliers find an increasingly narrow range of clientele ready to spend handsomely on major desktop integration services. While hedge fund trading hit speed bumps in recent months, much as programmed trading caused hiccups in the 1987 crash, the ability of a small team of hedge fund managers to build dominant positions in a marketplace by mining information aggressively from alternative sources not provided by traditional vendors should be a wakeup call to Thomson Reuters and Bloomberg that anyone can extract useful content from any cloud very quickly and effectively.

Major financial information vendors have faced similar challenges in the past and have responded with valuable services to rebuild their position in the marketplace, but it's not clear to me that we're on another full-blown cycle towards that goal right now. I think that we're more likely to see cloud computing services gaining more and more power as they provide well-integrated information services to ever more concentrated and sophisticated trading operations. I don't think that this means that Lehman Brothers will be moving back to its old South William Street HQ any time soon (now a cozy inn), but I think that over the next ten years we will see the financial information industry looking more like it did in the 1950s than it did in the 1990s - with fewer and fewer direct product presences on trading floors and more and more integration into cloud computing services. There are opportunities there for Thomson Reuters and Bloomberg as well, of course, but perhaps not the types of opportunities that are driving their organizations today. In the meantime, congratulations to Tom for a great profile article.

Tuesday, June 17, 2008

AP Challenges Weblogs on Use of Content. Is the AP Brand at Stake?

I've tried to remain low-key about the Associated Press action against the Drudge Retort, a parody of the famous Drudge Report political Web site, but given the furor out there I think that a post on the topic is worthwhile. The AP has issued "takedown requests" claiming violations of the Digital Millennium Copyright Act (DMCA) and other laws in the unlicensed use of its content in seven of the Drudge Retort's blog posts. Not only is the Drudge Retort being challenged on its own use of AP's content, but also for readers' comments that quote paragraphs from AP content. The Drudge Retort's Rogers Cadenhead commented on the takedown letter on his own weblog and provided a summary of each of the takedown requests, citing the examples.

As with the lawsuit brought by AP against Moreover for its use of AP headlines and ledes to provide links to AP content, AP's concern seems to center on the use of headlines and ledes as copyrighted content. Unlike the AP/Moreover suit, though, this takedown letter focuses on only seven items rather than bulk use of AP headlines and ledes. And unlike the AP/Moreover suit, some of the headlines on the Drudge Retort site were not AP headlines but headlines rewritten by the site's staff. Also notable is that the excerpts from AP stories were quite small: in all of the items posted by the Drudge Retort itself they were either just a lede sentence or a lede plus a quote from someone at a public event.

The Drudge Retort appears to have complied with the takedown requests, and AP's Jim Kennedy promises guidelines for bloggers using AP content, but awareness of the dispute spread quickly through social bookmarking services and weblogs and has ignited a widespread reaction from major bloggers and mainstream commentators. TechCrunch's Michael Arrington offered one of the stronger statements, declaring that his prominent weblog would no longer reference AP content. Others were more inflamed in their rhetoric, including this gem from Matthew Ingram:
I don’t want to be accused of succumbing to Godwin’s Law, but I would argue that a dialogue with the AP has about as much chance of being “constructive” as Chamberlain’s discussions with Hitler over the fate of eastern Europe.
The New York Times' Saul Hansell tries to steer a calm course through the AP challenge in their Bits blog, but in an era when sub-millisecond delays in information transmission power most large-scale trading of financial securities, his citation of the century-old "hot news" doctrine is shaky at best. If someone is linking to a story that's already minutes, hours or days old on the Web - much less in investment banks - how "hot" can that news be? And since one must still go to the licensed source to get the story in full, the licensed source is going to benefit financially from greater public awareness that it has the story available.

The clear benefit of inbound links and short, fair use-style citations can be seen in the impact that social bookmarking has had on AP licensors. Looking at traffic data from Compete.com, news Web sites that are major licensors of AP content do not appear to have been harmed by the growth of social bookmarking sites such as Digg, which provide similar small snippets of content and headlines from AP and other sources. In fact, one could argue from such a trend that much of the growth at news sites in recent years has been due to the attention that weblogs and social bookmarking sites have paid to their content. Social media is the news world's best friend at this point, providing an editorial capability that curates high-value content from professional media organizations that would otherwise be ignored.

But the real point seems to be whether AP can gain financially from this exercise. Facing a dwindling number of mainstream media companies able to purchase its content, AP is struggling to come up with a way to build a broader base of revenues in an environment in which its audience has become a far greater source of content curation than its traditional client base. Whatever the validity of AP's legal citations - they seem to me to be quite weak, awaiting only a decent opposing lawyer to have them swept away - AP is alienating the very marketplace that is driving growth for its existing licensors at a time when that marketplace needs AP content less than ever before. It is all too unfortunately like the RIAA-led lawsuits against consumers of online music, which have done little to change the fate of music publishers that have lacked a coherent marketing strategy to deal with the power of online music consumers to drive both tastes and sales.

As valuable as AP content may be, for most news stories that people will link to and comment upon online there are readily available substitutes from other wire services. AP's position as a service bureau complicates its ability to counter the power of proprietary wire services such as Reuters and Agence France-Presse, but clearly the underlying problem is that there are only so many widely tracked newsworthy events to cover that can yield real "hot news" that others lack. In the meantime weblogs and other emerging publishing outlets are creating new sources of news and newsworthy opinions that AP could syndicate into its distribution network far more aggressively.

From a marketing perspective the real issue for AP, as for the music business, seems to be far less about protecting an existing product line and far more about rethinking both the product line and the marketing rationale for the core product. Instead of resorting to lawsuits and takedown letters as a primary strategy to enforce the value of AP content on the Web - tactics that could create both legal confusion and a potential dilution of the value of the AP brand in the eyes of consumers - AP needs a "win-win" strategy that looks upon the drivers of economic value in online publishing more realistically, and that begins to incorporate new sources of content and more valuable services worth distributing to its worldwide subscribers.

A more refreshing approach to the opportunities available from social media is definitely in order. Simple example: instead of thinking about charging people for using AP headlines, why not PAY people for the click-throughs that they bring to subscriber content and charge higher rates to subscribers for the service? Hmm, maybe those bloggers are pretty good folks after all.
In the meantime, perhaps that nice linear relationship between social media growth and sites using AP content may not be looking so linear for a while.

Thursday, June 12, 2008

Yahoo Opts for Google Ad/IM Deal, MSFT Deal is Off. Can We All Go Home Now?

In what promises (for now) to be the end of the Silicon Valley soap opera known as the future of Yahoo, AP reports that Yahoo has sealed a deal with Google covering both the use of Google's ad network and the interoperability of their instant messaging networks, shortly after announcing the suspension of its attempt to revive acquisition talks with Microsoft. Yahoo shares tumbled immediately afterwards, leaving the long money on Yahoo holding a devalued stock but also a deal that is likely to be one of the best ways forward for ensuring a reasonable future for the company.

As noted two months ago in ContentBlogger, a deal with Google seemed to be the best route for Yahoo all along, promising lots of new Yahoo page inventory for Google's more robust ad network and complementary media and technology profiles that were never as much at loggerheads as people made them out to be some years ago. As for Icahn et al., while some may have been looking out for shareholders wanting short-term money out of what they had assumed was a cooked goose, they never really seemed to have the goose's best interests in mind - or, for that matter, the best interests of Microsoft shareholders. Had Yahoo been carved up, it's hard to believe that the whole would have been greater than the sum of the parts.

As much as people tried to paint this as a Yahoo desperation deal, clearly it was more a desperation deal by Microsoft to buy some time to build a broader position in online markets for its faltering ad network, with virtually no apparent upside for Yahoo properties. There was a lot of Ballmer bluster, but underneath it all Microsoft was rolling the dice heavily on a very risky deal that had little solid strategy behind it beyond a temporary ad revenue boost from peeling away Yahoo ad accounts.

By contrast, the deal consummated by Yahoo with Google is expected to pump significant new ad revenues into Yahoo from Google's superior ad network - a win-win any way you look at it. The deal is non-exclusive, so Yahoo can choose a plan "B" any time that it wants. In the meantime the other huge win-win is the promised interoperability of instant messaging networks. Google already has interoperability with AOL's still-popular messaging network, so the stage is set for the next major deal to whisper about - a Twitter acquisition that would provide a unified front end to the world of instant messaging.

With a generation of Web users coming of age focused on IM, Facebook and other platforms, email systems creaking with offensive and virus-laden spam have become a legacy messaging technology that will die a slow and largely unprofitable death, in much the same way that the telegraph lingered well past its prime. We use email because we have to - not because we want to. Focusing on accelerating the growth and usefulness of IM systems while leaving their email services to take their own paths is a smart move for both Google and Yahoo. A merger of Yahoo mail accounts into either Google's or Microsoft's mail networks would have been a long, painful and largely unprofitable endeavor.

I felt all along that an independent Yahoo would be better for the content industry as a whole, so I am glad that at least tonight we can go to sleep knowing that there will be a wider variety of good platforms through which to publish content than if the Yahoo deal with Microsoft had gone through. Jerry Yang's team still has a lot of challenges ahead of it, but with an improving stable of user-friendly destination content properties and a progressive approach to supporting brand advertisers, Yahoo promises to have a strong place alongside other major online portals for some time to come. At least I hope so - I really don't relish a deal war as ugly as this one any time soon.

Monday, June 9, 2008

3G iPhone Launches with Developer's Kit: Let the Competition Begin

The world tripped over itself to ooh and aah at the latest version of Apple's iPhone, a somewhat sleeker model with 3G wireless Internet access and a software development toolkit that enables applications to be built for the iPhone that take advantage of all of its "new hotness" interface features. Prominent among the new applications at launch is Microsoft Exchange support, a shot across the bow to enterprise users equipped with Blackberries and feeling that, well, they're just not as hip as the next sales and busdev guy. Toss in promised interfaces to home appliances and Microsoft's home strategy takes a bit of a hit as well.

Also prominent is the new USD 200 domestic price tag, presumably subsidized by AT&T in much the same manner as other mobile phones to promote mass sales and mass usage of AT&T services. Now people wanting to keep up with the tech-leader Joneses down the street can pile on and join the fun. Put these factors all together and you have a highly competitive platform (albeit one that still lacks a keyboard) that makes consumer and enterprise content accessible in mobile markets as never before. That's the good rah-rah news, in any event.

The not-so-good news is that the exclusive deal with AT&T puts pressure on other mobile carriers to come up with their own deals that can compete with AT&T at a price point that's much closer to attainable luxury for most folks. Supporting a plethora of platforms has hindered the ability of applications developers to create software that scales to markets and has dragged down the rollout of full Web access on 3G networks, hobbling the ability of U.S. carriers to prepare for this inevitable moment of challenge by Apple and AT&T. Instead of focusing intently on content, most mobile carriers have focused too much on the technology of the platform, instead of viewing mobile devices as just another blank screen that can be painted with content from any application.

However, these aggressive moves by Apple and AT&T may be more a preparation for emerging competition. Microsoft or Google or both will benefit as other mobile carriers and device makers try to create more cost-effective alternatives to the iPhone now that the USD 200 price barrier has been breached. Microsoft is the more likely beneficiary in the short term, but with profitability becoming an issue - especially with the cost of 3G Web services pushing margins down - Google's Android cross-platform operating system is likely to emerge as the platform that allows more profit at lower price points for both mobile device manufacturers and carrier networks. As noted in TheStreet.com recently, a preview version of an iPhone-like phone equipped with Android offered touch-screen operation, 3G Web access, software development interfaces for applications and many other features that are likely to come close to iPhone functionality without the content and software licensing baggage that comes along with Apple.

There's no doubt that the iPhone will continue to be the Lexus of Web-enabled phones for a while, but there's also no doubt that the world has been waiting for the Toyota version to show up for a while. Especially in burgeoning markets like China and India, where Apple's licensing strategy is likely to be less appealing, Android-equipped phones that enable integrated Web access and language-independent hardware are more likely to be the global winners in mobile communications. So while the hoopla around the iPhone 3G launch looks hot for today, remember that in the fall we're likely to be talking about a different perspective on its future.