Wednesday, May 28, 2008

Google Health, Google Apps API Opens Up: Building More Content in the Cloud

The launch of the beta for Google Health caught a bit of media ripple last week, but with the never-ending machinations between Microsoft and Yahoo I suspect that it got lost in the shuffle for some. That's probably just as well, given the hyping of last year's big health launches, some of which have gone on to greater glory and others of which are back at the drawing board. Revolution Health, for example, has been an enormously successful media launch, closing in rapidly on well-established leader WebMD's visitor count in little more than a year, while Microsoft's heavy-handed HealthVault did a great job of collecting and touting major health care partners but an equally good job of scaring away people who were unsure how improving corporate productivity served their personal best interests.

Google Health plies a middle ground of sorts between these two major efforts, focusing on relating expert content and online media to a person's medical history. Like other services, Google Health enables the import of health information from a select list of hospitals and medical testing companies, can find information related to known symptoms and can search for doctors in a given specialty in a particular location. As you can see in the expandable screen grab to the right, it's a typically low-key approach from Google. It doesn't present itself terribly differently from any other Google application, explains the user benefits simply right up front and encourages one to explore its capabilities gently and incrementally, well within the user's control.

In some ways Google has benefited from the relatively slow start to online medical records gathering by Microsoft, even if it's been a little snookered by Revolution Health's aggressive grab for media attention. An MIT Technology Review article makes it clear that Google is working with its limited list of partners to understand what it will take to make people feel comfortable entering and maintaining their health care information online. The terms and conditions shown in the scrolling window make it clear to users that their information is theirs to control, so perhaps there's reason to hope. Starting from the premise that there's much to learn about what makes people comfortable with this particular kind of online personal data is probably a good approach, allowing Google to add features and content gradually.

In the meantime Google has also opened up its Google Apps APIs to developers, enabling anyone to use the highly scalable Google infrastructure to develop online applications that stand on their own or integrate with Google capabilities. With more than 150,000 developers already queued up to use the Google APIs, we may be witnessing the beginning of the Google cloud subsuming large portions of the online application development space. Combined with enhanced Android functionality for its mobile platform and the introduction of Google Gears, a desktop (and, presumably, mobile) client that enables one to store data from the Web locally, it's clear that there's less and less space for Microsoft to lay claim to the personal content that's at the heart of its claim to personal computing. If the Web can serve as the primary repository for all of our content, with some items spun off to our local devices as needed, then Microsoft will find itself positioned increasingly as a facilitator of appliance interfaces - a positioning underscored by Microsoft's announcement of a finger-friendly Windows 7 due to ship in...2010.

So on both the Google Health front and the Google Apps API front Google is continuing to position itself for prowess within the content cloud, building up relationships that will quietly unfold on a myriad of devices through a myriad of applications, all developed on and stored in Google's powerful server and operating system infrastructure. It's not a media strategy by many people's estimates, much less an enterprise content strategy, but as these clouds gather steam over the next few years, prepare to be amazed yet again at Google's ability to stay focused on long-term objectives for delivering value through publishing.

SIIA NetGain 2008: The Intersection of Content and Software Seeks Greater Value

As a company that has as its tagline "Where Content, Technology and People Meet," Shore can only applaud the SIIA's decision to combine sessions for software and content professionals at its annual West Coast conference into one event this year, now dubbed NetGain. Seeing companies like and Deloitte Consulting in one set of rooms and companies like LexisNexis and Wolters Kluwer in another at this conference always seemed like a huge lost opportunity, all the more so as Software as a Service begins to transform the face of enterprise I.T. services and content providers move more towards workflow applications and content integration technologies to build their market value. At the same time services like Google have long demonstrated that a technology that provides highly valuable context for content can be a publishing platform unto itself. So in many ways the software industry and the content industry are chasing the same high-value market opportunities and need to recognize that they have to speak in the same forums together, for enterprise markets as much as for media markets.

I did not live-blog this conference, in part to participate in the SIIA's experiment in using Twitter to cover the event (see LiveTwitter's events page and look for NetGain updates). Larry Schwartz, President of Newstex, LLC, provided a consolidated collection of people's Twitter messages here for those wanting a more blow-by-blow account of the proceedings. I also posted earlier a piece reviewing presentations by and Google that underscored the importance of "cloud computing" in delivering enterprise content services.

On one level NetGain was such a perfectly natural blend of conference attendees from the SIIA Content and Software divisions that one wonders why this wasn't done earlier. This was underscored by the similar product themes brought out in the conference sessions. When software providers talked about "Software as a Service," what they really seemed to be saying is that software companies are no longer succeeding simply by licensing their software as intellectual property and need to adopt licensing models more akin to those used for many years by enterprise subscription database services. When content providers talked about the importance of "workflow applications," what they seemed to be saying was that they cannot survive just on licensing intellectual property that gets commoditized unless it's put to work through really useful software services. Either way, both software publishers and content publishers are increasingly chasing the same value proposition in the enterprise.

And for that matter, how different is "cloud computing" from the decades-old content services provided in the financial services industry by securities exchanges and companies like Thomson Reuters and Bloomberg? Certainly the Web has accelerated the development of client-server content services beyond the scale of any earlier enterprise services, but at the end of the day software and content services have been merging industries for a long time. Alacra, which won a CODiE award at NetGain for its ability to integrate content into enterprise workflows, has been working diligently for more than a decade on its powerful AlacraBook content integration services. Eventually trends catch up with long-established realities, I suppose.

The big difference today seems to be the influence of the one key ingredient that was somewhat under-represented at NetGain: social media services. Clay Shirky delivered his usual great speech about how social media services are revolutionizing publishing and ecommerce and there was a very good panel discussion led by Dave McClure, but the increasing preponderance of social media publishing services both outside and inside major enterprises just didn't seem to register with most of the NetGain attendees. We're moving rapidly towards a predominant publishing environment in which the audience is seeking out and defining the value that it needs from content far more rapidly than traditional I.T. and publishing services can define it.

This raises the question: what is the platform for today's and tomorrow's publishers? Certainly and Google, along with other presenters, made a compelling case for the applications programming interface, or API, being the platform of choice for the foreseeable future. Being able to plug content and functionality into one or more platforms via APIs enables people with both content and technology services to put their capabilities very rapidly into the contexts that audiences value most. Certainly the flourishing success of Facebook's APIs has helped to fuel its growth, even as Google's OpenSocial API promises to bring content into social media contexts more universally. If a platform does not allow content and functionality to grow through the efforts of third parties, then it's going to be hard to fuel growth efficiently.
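The plug-in dynamic described above, in which third parties grow a platform's content and functionality through a public API, can be sketched in a few lines. This is a deliberately toy illustration: the `Platform` class and its `register` and `render` methods are invented names for this sketch, not any vendor's actual API.

```python
class Platform:
    """A minimal sketch of an API-driven platform: third parties add
    content and functionality without the platform itself changing."""

    def __init__(self):
        self._extensions = {}

    def register(self, name, handler):
        # Third-party developers plug capabilities in via a public API.
        self._extensions[name] = handler

    def render(self, name, context):
        # The platform places third-party content into the user's context.
        handler = self._extensions.get(name)
        return handler(context) if handler else None


platform = Platform()
# A hypothetical third-party content module registers itself:
platform.register("weather", lambda ctx: f"Weather for {ctx['city']}: sunny")
print(platform.render("weather", {"city": "San Francisco"}))
# → Weather for San Francisco: sunny
```

The point of the sketch is that the platform owner never wrote the "weather" capability; growth comes from whoever shows up at the API.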

But the real platform of today and tomorrow is the community built around a platform. Bloomberg and Reuters proved this out years ago as their messaging and conversational dealing services enabled securities market traders to communicate with one another more efficiently and to contribute valuable content that resulted in the execution of securities trades. While much of the financial industry's technology and content services have shifted towards more automated functionality, the heart of what provides the firms using these services with a market advantage is the ability of people to collaborate in marketplaces through publishing. Today a new generation of business information services is emerging, highlighted at NetGain by Hoover's and ECNext, both of which are focusing on how to lock in content value through their audiences providing valuable content in the context of their platforms. A publishing community is a community that can become the heart of any platform's value. Looking at how itself is moving towards integrating social media functionality, this concept is hardly a secret.

There were also a lot of interesting exhibits by CODiE candidates at NetGain, which allowed people to get more "hands on" with their products before voting - sometimes literally. I especially enjoyed J.J. Keller's Safe.Sim truck driving simulator, which although it did not win in its CODiE category was both a very powerful training and evaluation tool and a "sleeper" software hit. With a little repackaging and some consumer marketing know-how it could be huge. Truckers and truck fans around the nation and no doubt worldwide would jump at the opportunity to have a multi-player online version of this, complete with their own customizations. As for me, well, I guess I have a few things to learn about backing a semi into a loading dock.

In the paid exhibitors area I was especially impressed by a couple of offerings. Mzinga is an OEM social media community development service for both enterprise and consumer markets, enabling the collection and sharing of valuable content that builds value inside and outside the firewall. Well worth a look if you're considering stepping into social media more deeply. Vitrium Systems enables PDFs to be turned into intelligent content payloads that track audience behavior without requiring plug-ins or downloads and can also provide DRM for PDF content. For those still emphasizing print-formatted content this is an interesting play, especially for those interested in getting more play out of eBook content.

On the SIIA Previews agenda two later presentations stood out clearly. Watch Zuora, a company that promises to enable subscription models for practically everything, including content and technology to be sure, but also just about any business model for any fungible product or service. Model-wise I think that they're on to something big and I plan to highlight them in future writings. It's a spinoff of ideas from using telecommunications technology. Keep an eye on this one; it may take a while to take off, but I think that it has the potential to hockey-stick.

Another strong Previews offering was SlideRocket, which combines powerful presentation tools, graphics development and community content to create a new way to develop and share presentations that can capture metrics on how people view them. I think of it being to tools like PowerPoint, Photoshop and Flash what and Facebook were to enterprise software and online publishing - services that defined their own categories as a new kind of publishing and in the process redefined several market segments at once by focusing on owning user content. I can't wait to get my hands on the beta.

So it was a great event, though I would hope that next year we get to see more participation both by more West Coast local firms and more major East Coast and overseas publishers. I would say that the only real disappointment that I had from the event was the rather quiet audience, which seemed in many instances to be of the opinion that while things were changing rapidly in the publishing and software industries, the changes that many tout as revolutionary are not going to sweep away long-standing business models any time soon. There's more than a small grain of truth in that outlook, of course. Yet looking at the news industry, now reeling from the effects of having largely ignored the need to transform itself radically in the face of a decade of online publishing, I think that it's safe to say that NetGain represented in many ways the admission that later is sooner than many may think.

 
Tuesday, May 20, 2008 @ NetGain: The Cloud that Ate the Enterprise Marketplace

At the SIIA NetGain conference in San Francisco, George Hu, EVP of Products and Marketing, gave both a great summary of the company's product philosophy and a demonstration of its integration with Google Apps. Nothing terribly new in all of this, but what struck me more strongly than ever was how both its philosophy and its product development parallel and integrate with Google's. George mentioned conversations with Google CEO Eric Schmidt which indicate that the two companies are aligned on far more than just the product level. It would be foolish of me to speculate on a potential acquisition of SFDC based on George's comments, but the more that SFDC develops as a market presence, the more it seems to be repeating the Google business model for enterprise content services (also known as Software as a Service, or SaaS).

First and foremost, SFDC built a highly scalable architecture that allows for multi-tenant hosting, a very geekish way of saying that they have a server farm that provides common management of SFDC software for thousands of companies' protected data sets. This is not so different from Google's commitment to creating a highly scalable common search service for its online audience instead of trying to use online search services as a way to sell software and hardware (does anyone remember AltaVista?). Making your services highly scalable as one of your primary proprietary advantages gives SFDC enormous power to become a de facto content services platform, much in the same way that Google's ability to crawl effectively gave it a key market advantage.
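For readers wondering what multi-tenant hosting looks like in practice, here is a minimal sketch assuming nothing about SFDC's actual implementation: thousands of customers share one schema and one codebase, and isolation comes from scoping every query by a tenant identifier. The table and tenant names below are invented for illustration.

```python
import sqlite3

# One shared database and one shared application serve every customer.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (tenant_id TEXT, name TEXT)")
conn.execute("INSERT INTO accounts VALUES ('acme', 'Big Deal Inc')")
conn.execute("INSERT INTO accounts VALUES ('globex', 'Secret Prospect')")

def accounts_for(tenant_id):
    # Isolation comes from scoping every query to the caller's tenant;
    # one upgrade to the shared code serves all customers at once.
    rows = conn.execute(
        "SELECT name FROM accounts WHERE tenant_id = ?", (tenant_id,))
    return [name for (name,) in rows]

print(accounts_for("acme"))  # each tenant sees only its own data
# → ['Big Deal Inc']
```

The economic point is in the comments: because every customer runs on the same managed instance, scaling and upgrading happen once for everyone, which is the proprietary advantage described above.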

Instead of having to sell copies of this capability, SFDC, like Google, focuses on content services. Yes, we call them applications in many instances, but the net focus of these applications is to enable people to consume or publish content. Enterprise publishers talk about enabling workflows as a premium content service: there's no real difference between what SFDC is doing and what publishers are attempting, other than the desire of publishers to promote their own proprietary content. Add in SFDC's integration with Google Apps, including Gmail for email services, and you have an "80 percent solution" for enterprise workflows similar in scope and impact to Google's 80 percent solution for search. Yes, we still have many high-quality search engines for enterprises, just as there will continue to be many other high-value I.T. products in enterprises, but as a percentage of I.T. expenditures they are certain to dwindle as content services move into the Internet "cloud."

The similarity of's marketing model was underscored by a presentation at NetGain from Google's Matthew Glotzbach, Product Management Director for Google Enterprise. Matthew highlighted in a simple graph how in enterprises the mediation of I.T. departments and other business functions in the purchasing of content and technology services from vendors is different from the consumer model, in which people can access and select services from any number of vendors without intermediation - creating more effective competition and, ultimately, coopetition between vendors. Security, data privacy are always touted as barriers to a transition to more consumer-like access to enterprise content but increasingly with the theft of laptop computers in airports, offices and just about anywhere it's not clear that a mobile-enabled workforce is going to be served well by anything but highly scaled cloud infrastructure.

Long story short, we are well on the way to the Google-ization of both enterprise content companies and enterprise I.T. companies by in combination with Google, with Google Apps acting as the "glue" between the two parallel clouds. While there will always be other clouds out there for specialized purposes - you won't see low-latency securities trading networks on SFDC any time soon, for example - I think that what we're seeing is the content/applications cloud enabled by emerging as the de facto environment for delivering content and technology services for much of today's corporate environment.
In the process of becoming that de facto platform, the ability of small and medium-sized businesses to scale themselves rapidly and effectively will change the competitive landscape in business quite rapidly on a global basis. About the only real difference between Google's dominance and SFDC's probable dominance is that one did it on ad revenues and the other on subscription revenues. I.T. vendors and content vendors looking at the SaaS space need to move far more rapidly to build effective cloud-based products and services - and to recognize that a winning strategy sometimes means owning the cloud and sometimes means playing in other people's clouds. I hope that's not too cloudy for you; if it is, give a holler.

Thursday, May 15, 2008

The New Portfolio: CBS Quietly Amasses Must-See Content for the Web

The CBS Corporation Web site features photos of their television news staff, most of whom are well into their sixties or beyond in some instances, interspersed with shots of the young stars of some of their entertainment shows. Both motifs show some of the demographic challenges that CBS faces in developing audiences. Having largely missed the shift into cable television news and entertainment and faced with a rapidly aging audience for its news products, CBS has been leapfrogging its marketing strategy into online content development. The latest of these moves is a fairly "big fish" - CNET Networks, reportedly purchased for USD 1.8 billion, a 44 percent premium above its share price at the time.

This number may be a little eye-popping for some in the media industry, but this is no mistaken enthusiasm. CNET is one of the oldest commercial Web sites offering news on the technology industry and consumer goods, with a solid top-200 audience and some of the best journalism and analysis on the Web. CNET has always stood for best practices in publishing and site design on the Web, with a solid team of largely Bay Area journalists, analysts and bloggers, a great library of videos on tech and gadget topics, product reviews, well-tracked blogs, strong comments and a great channel strategy. There's not too much not to like. The factor that impresses me most, though, is that as a bazillion blogs have sprouted up to talk about topics in CNET's domain, it's held on to its audience very nicely through a diverse array of content assets. While its U.S.-centric focus limits some of its appeal for growth in other markets, it's likely to be a good revenue generator for years to come.

More to the point, it begins to round out a portfolio of solid and up-and-coming destination content holdings that CBS has assembled. As CNET's own news blog notes, the deal will make CBS one of the 10 most popular Internet companies in the United States, with a combined 54 million monthly unique visitors and about 200 million users worldwide. While much of the media business focuses on familiar moguls and the battles of print titles to condense into some sort of stable business, CBS has quietly become a superstar of destination content the old-fashioned way - by building up a portfolio of superstar publishing properties. It makes one wonder what investment bankers were thinking as they continued to spin out questionable multiples on continually sinking print-based news properties.

There are still profitable print titles left to play with, but CBS is aggressively reaching for and developing the online brands that will help it bridge into the next generation of content consumers - while milking what it can out of old media channels. Kudos to CBS for a well-timed and solid purchase and for focusing on the properties that will help its shareholders have a brighter future as the next generation bypasses cable television for new forms of news and entertainment.

Wednesday, May 14, 2008

Perils of Pearlstine: Norm Pearlstine Takes on Repositioning Bloomberg Content

In years past one could visit the head office of Bloomberg, L.P. and peer into the newsroom right off of the main lobby. Mike Bloomberg's office was right off of that news floor, with a glass partition that segregated him about as much as a head of an investment bank trading floor is separated from his or her operations. This was a natural for someone whose career took off in the trading rooms of Merrill Lynch driven by traders responding to real-time news events, but it also underscored the importance to Bloomberg of making authoritative market-moving news a key component of its success.

Times change, and now Bloomberg has announced the appointment of Time Inc. and Wall Street Journal veteran Norm Pearlstine as their first Chief Content Officer, a move that one presumes will enable Bloomberg to leverage its news and data assets more effectively in rapidly changing professional and consumer business news markets. Certainly this will help Bloomberg to move its revenue base more heavily away from professional markets, where its ubiquitous content displays are encountering fewer seats in an increasingly automated and specialized securities trading industry.

As I've noted for several years, the financial information industry, like many enterprise content sectors, is moving away from a "bell curve" market model, in which lots of money is made off of many people equipped with subscription content delivery. In its place is a "U"-shaped market model, in which lots of money is made off of highly automated content services on one end and highly analytical services for a small cadre of decision-makers on the other, with your typical "seat" revenues realized more profitably through a media model in which delivery has been commoditized as a benefit. Bloomberg has been relatively slow to respond to these changes, sticking to its highly profitable professional products and only recently beginning to up investment in its media brand audiences.

That's a challenging formula for growth given the continuing evolution of both Thomson Reuters and Dow Jones in supporting media markets more aggressively. Bloomberg's online operations have grown audience significantly in the past year, almost doubling the portal's visitor count, but it still trails Reuters and Dow Jones significantly in global markets. Thomson Reuters reported 18 percent quarter-on-quarter growth in its media sales in its first combined reports, an indication of how its global presence in online news markets has helped to fuel profits. So while Bloomberg's online, television and licensed content is strong, there is room for growth, especially in overseas markets.

But undoubtedly the increasingly sophisticated presence of Dow Jones has to loom large on Bloomberg's radar as much as the newly combined forces of Thomson Reuters. News Corp has managed this acquisition very wisely so far, retaining an online subscription base that both Thomson Reuters and Bloomberg lack while beefing up its Enterprise Media Group with its Generate acquisition. As these kinds of products that create professional value out of media sources begin to be adapted to Dow Jones' online media offerings, Bloomberg will be challenged to devise both more powerful media offerings and a subscription community willing to pay for them. This will be at least as tricky as building a global content brand out of its existing news operations. The real challenge for Bloomberg is to respond both to new opportunities for media revenues and to new challenges to high-end content analytics and real-time sales intelligence services in its core markets from newly strengthened players such as Dow Jones.

Pearlstine brings a deep and impressive legacy in the content industry to Bloomberg, but more importantly he brings an outlook on the media business which recognizes that the days of a handful of news monopolies dominating news gathering and dissemination are drawing to a close. To succeed with an electronic news brand one must not only excel at traditional journalism but also excel at making news valuable in whatever context an audience finds it valuable. While it's not clear that Pearlstine's insider view of the media industry will lead Bloomberg to new successes in adapting to this more contextual view of the content marketplace, he is likely to help open doors for Bloomberg to build out a more competitive brand both for online markets and for print markets seeking out new sources of editorial content.

Tuesday, May 13, 2008

Powerset: Transforming Publications Through Semantic Search Technology

There are rocket scientists, then there are rocket scientists - and then there's Barney Pell, long-time Silicon Valley startup maven and currently the Founder and Chief Technology Officer at Powerset. Barney is one of those rare people who has been a rocket scientist on both the NASA side of the term and the software industry side, an outlook that has helped him to assemble many teams through the years that have developed advanced search and language processing technologies. Powerset recently unveiled its first effort at a new technology that provides rich content from semantic searches, an interesting look at how one can completely reshape the face of a content product via enhanced search technologies.

Using Wikipedia as its primary target content, Powerset's technology analyzes search phrases to come up with search results that match natural language phrases as well as keywords. This being a very early-stage debut of the technology, some search targets work better than others, and overall I'd have to say that it seems to do best with people and things as opposed to concepts. For example, if you type in "Who is Bill Gates?" you get a screen similar to the top of the above screen grab, which includes a top deck of biographical information from the Freebase reference database followed by Powerset's sets of semantic analysis, called "Factz," that focus on what the Wikipedia article says about this prominent figure. One of these sets, for example, tells us that Gates gave testimony, a speech, an address, a demo, a presentation and a deposition. You can click on any of these terms to get more details from the underlying article.

Below the initial bio and Factz information is a set of search results for the initial query, including the best-match article on Microsoft founder Bill Gates. This is in essence the straight Wikipedia article with links mapped over to Powerset's version of this content, along with a handy visual presentation of the article's outline on the right or another listing of key Factz organized within the article outline. I like some of the inferences that it's come up with in the Wikipedia definition of Content that I contributed a while back: "information provides value; experiences provide value; content provides value." True enough.

I like how Powerset prefixes organic search results with federated content, taking a best stab at results on very focused topics that enable people to obtain knowledge more quickly and effectively. The automatically generated Factz, though, suffer from the same problem that most semantic tools experience when they examine a very small data set: spotty inferences. For example, in the Factz about Bill Gates Powerset inferred that he founded Cher, an inference drawn from the fact that biographer Howard Johns was known for revealing the addresses of these and other celebrities. Hmm. Don't think that I'd put that info down on my "final Jeopardy" slate. I am also not so crazy about the organic search results, which tend to err on the side of word proximity. Again, with a relatively narrow data set such as Wikipedia it's not always easy to tune content analysis well to the capabilities of semantic text analysis in search engines.

The big picture for this early-days release of Powerset is that it is a great demonstration of how one particular source of content can be transformed through search and content federation technologies into an altogether different kind of publication. Oftentimes I talk these days about search technologies being similar to datafeed technologies, but in this instance it's important to recognize that search technologies are also end-publishing technologies in and of themselves that can aggregate, filter and organize content in altogether new ways that enhance the value of one or more core publications. Using free content from Wikipedia and Freebase the Powerset technology does a good job of demonstrating this concept simply, albeit with some early growing pains. Publishers wanting to stay in the forefront of content markets are turning in droves to content federation technologies as a solution to add value to existing product sets, so expect to hear more from technologies such as Powerset that help publishers to add value rapidly.

Sunday, May 11, 2008

A Return to Normalcy? Adjusting to Traditional Content Economics Again

There's the usual spate of moans and groans about print profits coming out of the Argyle Executive Forum on Leadership in Media, according to Red Herring, which featured insights from many key figures in today's news media markets. This negative outlook is underscored by News Corp's withdrawal from bidding for New York area newspaper Newsday, based on the deal being "uneconomical" and setting the stage for a potential takeover of the paper by Cablevision. While revenues from online content continue to climb at news outlets, classified revenues are still highly vulnerable to online competitors, making it hard to translate growing online audiences into profits that resemble print's.

There's not much new in all of this, to be sure, but I was interested in the following comment from the Argyle conference:
Norman Pearlstine of the Carlyle Group told attendees that newspapers enjoyed a brief period of monopoly that attracted investors and convinced many families to take their businesses public. However, he said, for most of its history, the newspaper business did not enjoy the double-digit margins that characterized the 1980s and 1990s. “At the end of the 19th century there were 29 newspapers in Chicago,” he said.
In other words, at the end of the day perhaps the consolidation in the print industry of the past fifty years or so, first in response to rising fixed labor costs and television competition and then from the Web, created an illusion that highly capitalized media operations would yield superior results in an industry that has historically favored diversity in lower-margin operations. By creating larger swaths of exclusivity for fewer brands in major markets, newspapers and other print outlets were able to attract advertisers for several decades and provide reach at the same level as television markets. But in doing so they never really addressed the lack of technologies that could deliver higher margins except through higher production volume. This created an illusion of technology scarcity that helped to drive both margins and the expectations of people creating print content. As long as there was a steady stream of companies to acquire to build up the illusion of scarcity, this worked rather well. But we seem to have come to the end of the run of worthwhile mass market print acquisitions. Big will probably get bigger yet if government regulations allow it, but to far less avail.

By contrast, the Red Herring article highlights how Playboy Magazine was one of the very first to invest heavily in Web technologies and to learn how to make them both profitable and attractive to advertisers and audiences, including heavy investments in online video and multiplatform delivery. The result: a highly profitable and attractive operation that offers some unique appeal to online audiences based on both content and branding. Instead of focusing on acquisitions in a sea of abundant competitors to create more artificial scarcity, Playboy worked to create something more appealing: quality that would be hard to replace.

Another important contrast comes from a recent MediaPost article, in which Ken Doctor points out that local newspapers are still doing fairly well, in part because many local advertisers as well as audiences have yet to be able to leverage a confusing array of online options effectively. This creates a real scarcity of audiences focused on local online content that are easy for advertisers to attract with some scale. Online alternatives are catching up fairly quickly in terms of content quality, but until GPS-enabled advertising services grow more sophisticated, local markets will continue to offer a ray of hope for print.

The bottom line is that it's far from clear that major media outlets as we know them really need to exist as they have for the past fifty or so years much longer. If the historical state of content is a wide variety of focused outlets with relatively low revenues, low volumes and low margins, then maybe what online publishing is beginning to usher in is simply the return of publishing to its more normal state. The difference with current markets is that electronic content aggregation makes it relatively easy for a wide variety of publications to leverage common technology. For example, individual weblogs such as ours use a tiny fraction of the power found in Google's infrastructure. So by focusing their capital mostly on pure infrastructure, Google has created true scarcity of highly scalable publishing capabilities that can service both localized and broad audiences very effectively.

Notably, even companies like Google go out and buy market share through acquisitions - online video outlet Blinkx is rumored to be on their short list of short-term possible acquisitions - but these tend to be acquisitions that bring in both unique technologies and unique audiences. Where major media companies look mostly at reducing costs through online and print publication consolidation, the Googles of the world stay focused on creating more unique product value through acquisitions. With such an insistence on sticking with old metrics for performance it's not clear that established media companies can commit their capital effectively to gain a market advantage as long as they continue to focus on creating more artificial scarcity for dated products and dated delivery technologies. In the meantime private equity abounds to fund technology platforms that will take away the best opportunities for a wide variety of producers who are content with higher margins on lower volumes and advertisers pleased with more focused audiences.

In other words, it's very unclear where the news industry goes from this point if it doesn't want to invest far more heavily in new electronic product development for more focused audiences. With a sour economy making it all the harder to raise more capital for investment, expect media titans to continue to wrestle with their place in a content market traditionally dominated by smaller, more agile and more innovative players.

Monday, May 5, 2008

Adhere Solutions and MuseGlobal Enhance Google's Enterprise Content Aggregation

The announcement of Adhere Solutions' partnership with MuseGlobal to launch the "All Access Connector," a federated content integration solution for the Google Search Appliance, is one of those situations where an event is both obvious and profound in its potential impact on the marketplace. As enterprises today face an explosion of internal and external content sources that they need to integrate to create insightful content services, a huge gap has arisen between what most content platforms can do to unify that information and what enterprises really need. This is particularly true in enterprise search, where many search services fail to provide access to all of the sources that a person typically needs.

Federated search solutions have been one route to address this problem, querying interfaces to multiple searchable sources and assembling the results "on the fly" to yield a combined search result. Instead of trying to shoehorn all of the needed information into a single database or search index, federated search enables content to live wherever it has to and to come together when needed via multiple queries into integrated search results. Some do this better than others, and some have been at it for longer than others. MuseGlobal falls into both camps pretty handily, having provided federated content solutions for more than a decade, which has allowed them to hammer out an infrastructure that can pull thousands of different types of content sources together via federated queries.
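The fan-out-and-merge pattern at the heart of federated search can be sketched simply. This is a generic illustration of the technique, not MuseGlobal's implementation; the three per-source search functions and their scored results are invented placeholders for real connectors to databases, feeds and search engines.

```python
import concurrent.futures

# Hypothetical per-source search functions standing in for real connectors.
# Each returns (source, title, relevance_score) tuples.
def search_catalog(q):  return [("catalog", "Catalog hit for " + q, 0.9)]
def search_news(q):     return [("news", "News hit for " + q, 0.7)]
def search_archive(q):  return [("archive", "Archive hit for " + q, 0.8)]

SOURCES = [search_catalog, search_news, search_archive]

def federated_search(query):
    """Fan the query out to every source in parallel, then merge the
    results on the fly into one relevance-ranked list."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        result_lists = list(pool.map(lambda src: src(query), SOURCES))
    merged = [hit for hits in result_lists for hit in hits]
    return sorted(merged, key=lambda hit: hit[2], reverse=True)

results = federated_search("annual report")
```

Querying the sources concurrently matters in practice: a federated result can only be as fast as its slowest source, so the sources are polled in parallel rather than one after another.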

All well and good, but the question is, how do you make this sing in the eyes of enterprise users? MuseGlobal's support of Adhere Solutions, a company that includes Googlephile Steven Arnold's son Erik Arnold as a Director, points towards a very powerful possible answer to that question: the Google Search Appliance. While the GSA is a popular search tool in many major enterprises, it's not been deemed the "go-to" search interface when it comes to getting all the right content from the right places all in one place in many instances. Federated content capabilities from MuseGlobal united with the GSA seem to fill that gap very handily. Capable of searching any number of search engines, internal and subscription databases and feeds as well as harvesting content via its own site crawlers, the MuseGlobal platform turns the GSA into a clearinghouse for all of the content sources that an enterprise user might want - all delivered on the highly popular Google interface that provides access to Web content as well.

Combine this with both Google's programming interfaces for applications development and MuseGlobal's own extensive library of content integration tools and all of a sudden the GSA looks like a much beefier competitor for expanded use within the enterprise. And since the MuseGlobal library of source connectors includes many interfaces to subscription content services as well, it's a platform that can put subscription database providers on a new footing with their users. All of a sudden the GSA looks less like a user-friendly also-ran and a lot more like a growing hub for enterprise and online content resources.
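The "library of source connectors" idea is essentially the adapter pattern: each connector translates a source's native result format into one common record shape so that downstream search and display code can treat every source alike. A minimal sketch, with invented field names standing in for whatever a real database row or feed entry would carry:

```python
# Hypothetical raw results as two different backends might return them.
db_row    = {"TITLE": "Q1 Sales", "URI": "db://sales/q1"}
feed_item = {"headline": "Q1 Sales Up", "link": "http://example.com/q1"}

def db_connector(row):
    """Translate a database row into the common record shape."""
    return {"title": row["TITLE"], "url": row["URI"], "origin": "database"}

def feed_connector(item):
    """Translate a feed entry into the same common record shape."""
    return {"title": item["headline"], "url": item["link"], "origin": "feed"}

# Downstream code sees one uniform stream of records, whatever the source.
records = [db_connector(db_row), feed_connector(feed_item)]
```

The value of a large connector library follows directly: adding a new source means writing one small translation function, not reworking the search platform itself.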

We hear lots of talk about workflow as the key solution that's going to enable value-add enterprise content services to build new revenues, but pulling together a comprehensive set of sources that their customers' users really need to do the job is oftentimes a slow and laborious process for many subscription database providers. At the same time, enterprise portal providers are stymied by users who refuse to use their solutions to any great degree because they're used to getting the answers they want from the search engines they rely upon as their real "go-to" workflow solutions. The All Access Connector solution offered by Adhere Solutions and MuseGlobal offers both camps a lot to think about as they ponder how best to ensure that they are delivering the content that their users want in the applications that drive their productivity the most. The era of The New Aggregation's ability to deliver more content value from more content sources more rapidly than ever is upon us in full, indeed.

Happy Trails: Yahoo-Microsoft Deal Sunsets as Jerry Yang Contemplates a New Dawn

Yahoo CEO Jerry Yang adds to his signature in his weblog posts the moniker "Chief Yahoo," a label that seems to be more of an epithet in the mouths of some shareholders and dealmakers disappointed by Yahoo's recent and apparently final rejection of a potential Microsoft takeover. With Yahoo stock plummeting on the first market day after the deal fell through, sore shareholders' claims of needing some Prozac seem to be at least tied with the claimed "high fives" amongst some Yahoo executives when news of the deal failure came through. Even Yang himself on Yahoo's corporate weblog claimed that "No one is celebrating about the outcome of these past three months… and no one should." It was a tough battle with bad blood generated both inside and outside of Yahoo in the process.

But there's no doubt in my mind that Yang made the right decision for Yahoo shareholders as well as for the company itself. While there were some important synergies that would have come out of a Microsoft deal, in general it would have been the acquisition of a company that is starting to move far more aggressively into new concepts for realizing the value of intellectual property by a company driven by old concepts of intellectual property value. CNET News notes that Yang is betting heavily that Yahoo's more open approach to content integration, using its own APIs as well as emerging APIs such as OpenSocial, will increase significantly the exposure of Yahoo content to audiences in increasingly valuable contexts. Combine that with a completed deal to use Google's ad networks and to integrate AOL's user base and you have the makings of a company that will shine in building highly engaged audiences using content from many sources. Think of Yahoo as an enormous warehouse of content, commerce and community that can be rejiggered into countless social media applications. Sounds like the man has a plan to me.

In the meantime Microsoft is left licking its wounds from what was perhaps its last great opportunity to leverage its way into more secure online revenues in the face of stagnating income from its traditional product lines and modest growth from its online ventures. The Yahoo acquisition would have brought some synergies but at the end of the day it was largely a cash flow fix and an attempt to buy an audience for Microsoft's online tools that may or may not have succeeded, given Microsoft's history of coming in very strong and proprietary with such efforts. By the time Microsoft would have focused on Yahoo's existing efforts to open up its content and to focus on contextualization rather than IP ownership as the key to revenues, it's not likely that those efforts would have survived Microsoft's more traditional outlook on IP value generation.

In this parting of the ways Yang will face angry shareholders and some shell-shocked employees for some period of time and softened share prices as the new(er) Yahoo takes shape. It's unclear that he will survive this unsettled environment in his current position but hopefully his vision for a Yahoo more in tune with today's most valuable opportunities for content will continue to move on. In the meantime Microsoft needs to consider both new cash cows and new stars on its matrix of properties to help it make a transition to a future that is moving away steadily from proprietary software on proprietary platforms as the most certain long-term bet for steady and growing revenues.

Thursday, May 1, 2008

OnCopyright 2008: The Future of Copyright is Here

The first OnCopyright event from Copyright Clearance Center, held in New York this week, was a forum established to probe the value of copyright in an era of electronic distribution and how to profit from it. Panels ranged across technology, legal and publishing issues. You'll find below links to our events blog entries as I complete them, and below that a summary analysis posted after the conference.
In sum it was an excellent event with really meaty panels, which, though a bit rambling at times, managed to delve deep into very important topics relating to copyright and intellectual property law. At the end of the day I had a chance to speak with Suzanne Vega for a few minutes to ask her about how her cooperation with remixers helped her to extend her brand. She said that it was very helpful, a point that did not come out in the panel and a point that is at the core of what copyright law seems to be missing in general: copyright is not building brand value for original works creators as effectively as it used to.

Ultimately it's not the distribution of copies that's at issue as much as the fact that we have a copyright system that still focuses on the right to distribution of a copy as the primary key for determining when and how the value of content is realized. With essentially free distribution to and from billions of points worldwide this concept no longer scales well as a relatively simple tool to manage content commerce given the traditional method for establishing licensing through contracts negotiated through legal departments.

This problem was underscored in a conversation at the conference with SLA CEO Janice LaChance. Janice defended her panel from the Buying and Selling eContent conference in which prominent corporate librarians bemoaned publishers doing little to address many key issues regarding their business models, especially how they related to copyright. Put simply, the publishing industry has enormous vested interests in managing copyright through traditional legal and business channels, preferring the intricacies of case-by-case dealmaking to the risk of distributing content to the wrong people under the wrong terms.

This emphasis on legal departments as key elements of publishers' fundamental revenue models and opportunistic lawsuits that argue for copyright enforcement on increasingly arbitrary grounds has created an utterly balkanized landscape of kludgey deals and half-considered rulings in dozens of courts that in essence has dismantled much of the value of the once common and simple concept of copyright.

In the meantime the online economy has prospered, not by corrupting copyright but by creating value out of content in legitimate derivative works and in new sources of original authorship which in sum dwarfs the output of traditional publishing outlets. Services such as those from the conference's sponsor Copyright Clearance Center are facilitating the ability of people to apply copyright effectively online in a far more automated fashion for specific items of content. Providing value in context is the true value of publishing, a concept that is conflicting more and more with the mass manufacturing model that drives the production of much of today's copyrighted content. Much of the value of online content for a given audience where infinite supply reigns is fleeting, highly contextual and oriented more towards executing business deals or building relationships.

The fundamental concept of copyright - that creating a temporary monopoly for a publisher based on the premise that control of distribution will sustain publishers - is becoming far more limited in its effectiveness to deliver value. The question is not whether someone should have a right to license their content for use under copyright but rather how they should license it. This is why I have suggested for several years that publishers focus on the concept of context rights rather than copyright. In other words, once content has been distributed, it finds its value most easily. The fleeting moments and contexts in which it becomes valuable are difficult to predict in advance in an online environment and the relationships that will result in those moments harder yet to predict.

What the copyright industry needs to adapt to is a different view of what technology will help rights holders to make the most of content that benefits most from unfettered distribution. I believe that this will lead towards a new style of licensing that is more fully automated and which uses a variety of predefined models to compensate content creators for their works. The rewards may be smaller overall in many instances in terms of money exchanged, but they offer more exploitable brand value over time as people discover not only the value of a particular work but the value of a relationship with the creator of the work.
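The idea of predefined licensing models that grant rights without case-by-case negotiation can be sketched as a simple lookup-and-price step. This is purely illustrative; the use types, fees and terms below are invented, not any rights organization's actual schedule.

```python
# Hypothetical predefined license models: each use type maps to fixed terms,
# so a license can be granted automatically rather than negotiated per deal.
LICENSE_MODELS = {
    "excerpt": {"fee": 0.00,  "attribution": True},
    "remix":   {"fee": 2.50,  "attribution": True},
    "repost":  {"fee": 5.00,  "attribution": True},
    "reprint": {"fee": 25.00, "attribution": False},
}

def license_use(use_type, copies=1):
    """Select the predefined model for a use and compute the creator's fee."""
    terms = LICENSE_MODELS[use_type]
    return {"use": use_type,
            "fee": round(terms["fee"] * copies, 2),
            "attribution": terms["attribution"]}

grant = license_use("remix", copies=3)
```

Note how the low or zero fees on some uses fit the argument above: the money exchanged per transaction may be small, but the attribution requirement is what builds the creator's brand value over time.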

I applaud Copyright Clearance Center loudly for the courage that they exhibited in assembling this event. It brought together many important players with very intelligent thoughts about copyright and the challenges that it faces. An institution such as CCC needs to embrace the future of licensing content boldly at this juncture to ensure both its own future and the future of compensation mechanisms that can encourage and reward the creation of value through publishing.