Tuesday, May 25, 2010

SIIA NetGain 2010: If You Could Ask Google One Question, What Would It Be?

I am going to be part of the SIIA NetGain bus caravan today that is traveling down Highway 101 to visit some of the major content technology companies in Silicon Valley. As many of you know, I tend to ask a lot of questions when I am at events, and I expect that my visit to these companies will be filled with great opportunities for questions. With Google being the leading company on this little journey, it seemed like a good idea to ask people online what question they'd like to ask. I started with a question on LinkedIn Answers, which drew quite a few responses. I have shared the best of these responses in an embedded wave below. Feel free to log in to Wave to vote for these or to add your own (signup is free), or, alternatively, add yours in the comments of this blog post.

Friday, May 21, 2010

Google I/O 2010: Stepping Stones to The Second Web

There was plenty of whooping and clapping at the Google I/O 2010 developers conference in San Francisco this week as Google rolled out new and promised features for their expanding stable of content technologies. Much of the excitement, though, had been pretty much pre-empted by strong educated guesses as to what would be unveiled. An abbreviated list of the key announcements includes:
  • The launch of Google Wave as a Google Labs "experiment" (read: beta) for anyone with a Google Account login, along with the announcement of enterprise software partners for Wave
  • The announcement of the Chrome Web Store, a Web site that will enable the download of free and premium browser-based software that can run in any modern browser (but that integrates most nicely into Google's own Chrome browser with drag-and-drop app icons)
  • Enterprise- and media-ready cloud computing resources, including App Engine for Business, Google Storage for Developers, BigQuery for massive-scale data analysis and the Prediction API for data pattern prediction processing
  • Support for the VP8 video codec as an open Web standard that will be applied to all of YouTube's videos and supported by Adobe
  • A preview of Google's TV offerings with partners Sony, Logitech, Intel and Dish Network, which include an Android-based set-top box that enables sophisticated combined searching of broadcast and Web videos
  • The unveiling of the forthcoming Version 2.2 of the Android operating system (Froyo), with accelerated Web app performance and features, "tethering" of Web connectivity via WiFi and more automated syncing of data and apps from the Web
  • Support for accelerated Web application performance
  • Voice-to-audio phrase translation
  • Free browser-based open source fonts
  • Improved integration of display ads into mobile devices and AJAX-based Web services
Well, that's a pretty well-packed couple of days of keynotes and workshops, but what does it all mean? With the exception of Google's TV moves, most of these announcements are incremental improvements to what is amounting to a full-scale defense by Google of the Web as the main vehicle for all enterprise and media content. With an explosion of mobile and in-home electronics, communications channels and content packaging schemes, a Web based on universal search and standardized content and applications has been in danger of being upstaged by any number of proprietary technology plays. 

Google is focusing on knocking down the barriers between platforms by aggressively supporting browser-oriented Web standards such as HTML 5 and by enabling inter-device connectivity wherever possible. For example, one of the niftier "gee whiz" demos showed off how you can use an Android-based phone to search by voice for television shows that can appear via Google TV on your home TV screen, where you can choose from high-definition television shows running from the Web, your DVR or a premium television service. A couch potato's dream, to be sure, made possible cost-effectively via Android open source code and Web standards.

Another somewhat surprising demo was by Sports Illustrated, which showed off an interactive online magazine built completely in HTML 5, complete with embedded videos that you can drag and drop with a finger touch. The SI exec was positively beaming about open Web standards as the future of publishing - quite a different tune than the one they were singing at the Apple demos a few weeks ago. Browser-based games were also featured prominently, as examples of the functionality that Google's optimization of browser-based computing - via cloud resources and improved programming performance within its own browsers - will facilitate.

None of this will change publishing altogether today, tomorrow or even several months from now, but in sum it means that Google is edging us towards the era of The Second Web, in which text, video, software applications and voice communications flow seamlessly to and from any imaginable device in endlessly intertwined and tailored streams. Artificial barriers to content distribution and combination are coming crashing down as never before, but they are giving way to a new framework for content monetization in The Second Web that will be scaled to help all kinds of publishers to prosper - if they are willing to subject their content to the all-seeing eye of Google infrastructure. Do it privately in your own Google enterprise cloud if you must, but at some point your content is coming out of that cloud to communicate or collaborate with the world. When it does, Google will be waiting for you.

If you want to render slick Web content with high-performance software and video embedded into it, Google says, sure, why not. If you want to focus on massively scaled enterprise applications that require tight security and service level agreements, Google says, sure, why not. If you want to use your own hardware or hosting, Google says, sure, why not. Because Google is interested primarily in one thing and one thing alone: generating as much Web-based content as possible that can be searched, analyzed and packaged somewhere by someone - preferably by Google, but not always. 

Publishing is a numbers game, but Google does the math differently than most other publishers, betting that the company that can deal most effectively with the largest array of content sources - regardless of who creates them and where and how they sell them - can win. Think of it as the Wal-Mart model turned inside-out. You don't draw the world to the big-box store, you make the world your box and pull out what's most interesting to specific people at specific times. This was underscored particularly in Google's approach to its forthcoming Chrome Web Store, which will feature apps built on both Google infrastructure and other browser-compatible apps, and in its approach to video search, which apparently will begin to incorporate video from across the Web in addition to YouTube content.

There were a lot of shoes dropped at this year's Google I/O event, but there are many more yet to fall. Google Editions' launch this summer will make for some exciting news, as will the launch of Android and Chrome OS-based tablet devices and the debut of expanded Google Voice and videoconferencing tools. These will be further steps towards The Second Web, but for now they will be advancing on the stepping stones of the major infrastructure pieces that Google has put in place to get people beating a path towards it. By mid-2011, between Google and others accelerating the development of Web-based content and applications, we'll wonder what publishers and communications companies were thinking for the past fifteen years. The Web was always going to win. Google just gave it a good, hard kick.

Wednesday, May 19, 2010

Wave at One: Where We Are Today

This week's Google I/O developers' conference marks the first anniversary of the public debut of Wave. What was a very intriguing demo at that event has bloomed into a powerful new platform which, despite wildly overoptimistic expectations when it opened up to a preview audience last September, continues to develop towards being a really usable collaboration and publishing tool. To give you a sense of what Wave is like today, I have embedded a wave below from my ContentWave blog on Wave that covers the key changes since last year and captures some of the Wave community's ongoing discussions. Note that embedding waves in a Web page is now about as simple as embedding a video in a blog. That should give you a hint as to how far Wave has come towards getting integrated into the Web. If you have a Wave login, feel free to sign in and join it. If you have never logged in to Wave, as of today anyone with a Google Account (Gmail or Apps) can log in to Wave here. If you're having a hard time viewing the wave for whatever reason, here's a link to a PDF version of the main text - generated by a Wave extension tool. Trying to eat my own dog food, after all. If you don't see the beginning of the wave below that displays the ContentWave logo, just place your cursor over the wave and scroll up until you see it.
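For the curious, here's roughly what that embedding involves under the hood. This is a minimal sketch based on my recollection of the Wave Embed API's WavePanel interface; the wave ID and the target div are placeholders, so treat it as illustrative rather than copy-and-paste ready.

```typescript
// Minimal sketch of embedding a wave with the Wave Embed API.
// Assumes the page has already loaded Google's embed script, which
// defines a global WavePanel constructor:
//   <script src="http://wave-api.appspot.com/public/embed.js"></script>
declare const WavePanel: any; // provided by embed.js at runtime

function embedWave(): void {
  const panel = new WavePanel('http://wave.google.com/wave/');
  // Placeholder wave ID, not a real wave.
  panel.loadWave('googlewave.com!w+examplewaveid');
  // Render into an empty <div id="waveframe"> on the page.
  panel.init(document.getElementById('waveframe'));
}
```

A few lines of script and an empty div: that's the whole integration, which is what I mean by "about as simple as embedding a video."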

Monday, May 17, 2010

OneSource iSell: Attacking Today's Real Prospecting Challenges

Infogroup's OneSource unit is a premium aggregator of business information that has long focused on integrating its services into the workflow tools used by sales and marketing professionals. With licensed content from dozens of sources edited and organized by OneSource staff and technologies, its services offer integrated views of financials, executive bios, news and "talking points" for people trying to find and talk intelligently to potential customers and business partners in global markets.

Though OneSource offers valuable information, its services have tended to focus on high-level views of people and organizations - not necessarily the best match for sales executives who need to do strategic selling that touches many middle management-level prospects in targeted organizations. At the same time, emerging services such as InsideView and Dow Jones' 2008 acquisition of Generate have enabled competitors to build analyses of news and information from news and social media sources that alert sales professionals to events that may trigger specific types of sales activities. Between these emerging services and the mounting popularity of social media as sales and marketing tools, the pressure was on OneSource to take its game to a new level.

With the introduction of its new iSell service, OneSource is definitely raising its game to deliver a new level of relevance and personalization for sales and marketing professionals. Billed as a personalized prospecting tool that integrates real-time sales intelligence and triggers, iSell is an entirely new interface for OneSource's collection of compiled, edited, harvested and social media content, one that addresses many of the key concerns that drive sales and marketing professionals. Chief among them is the need to get to the right people at the right time with the right things to say, in a way that keeps people selling rather than researching. iSell addresses this need with a vastly streamlined approach to traversing OneSource content, with the help of key new sources that let its users zero in on the right sales leads more easily.

Personal contacts content from sources such as Jigsaw that focuses on mid-level management is a key addition that helps iSell to become more relevant to more sales professionals. Just adding a database, though, does not make a service instantly more usable. The key to iSell is a slick new interface that really does help its users to focus on prospects far more easily than earlier OneSource services. In a demo of iSell last week, I was impressed by the ease with which a new user can get productive with the service. Fire it up for the first time and you answer a few questions that help the service to understand how you approach sales and the types of people and organizations that you need to reach. Once you've answered, iSell is configured to feed you the information that you need to be productive.

The central pane of iSell is a Facebook-like listing of contacts information, with pop-up biography information and a sidebar of potentially related prospects of a similar level and focus appearing on the right. When I say "Facebook-like," that's not just a generalization; the interface is intentionally and carefully designed, with icons, look and feel and a light presentation of visible information that replicate the best social media services' knack for keeping people from being overwhelmed with too much information and helping them to filter it intuitively. Navigation on the left and on the top of the page makes it easy to drill down into related data sets, including iSell's own "sales trigger" tool that parses news sources to relate breaking events to key activities that are likely to invoke sales activities. Drill down into company information and you don't get the usual automatic listing of top executives but instead the types of contacts that you're focused on in your sales processes.

Social media from Twitter and other sources is keyed to searches for company names, a nice plus but also an indication that iSell is still working on sorting some social media sources in a way that fits into the flow of its other content. The top tabs in the interface enable easy access to views of OneSource content such as company financials, its SWOT company profiles and "talking points" that help sales executives to build selling stories easily. There is a wealth of information behind the iSell interface, and it's easy to get at, which makes it simple to find key prospects in iSell and to import them into your favorite CRM platform for building a sales pipeline.

Business information suppliers in general are a lot more sophisticated these days about building usable solutions for sales professionals, so getting the iSell interface "right" was a key priority for OneSource. Clearly they have succeeded in doing just that. It's an attractive, highly usable service that makes the very best of OneSource's extensive information resources for a high-value, focused offering. While much of the information behind iSell comes from the core of long-standing OneSource services, the new features and additional sources put that core information in a much more usable context. The sales trigger features are still not tapping some of the key content sources that can alert sales professionals to potential sales actions, but the framework is there for an expanded capability over time.

This is all great stuff, but it still comes at a price. Since iSell rides on top of OneSource's collection of curated databases, it should be no surprise that it comes with premium pricing and packaging. OneSource feels that its pricing makes iSell competitive with other premium business information services, hoping that it will be an appealing alternative for new clients as well as existing OneSource clients. Those audiences will no doubt be very pleased with the results, but hopefully the very useful and usable tools in iSell will in time be positioned to be more accessible to broader sales and marketing audiences. In the meantime, iSell makes the most of OneSource-curated content to give prospect hunters a great tool for timely and highly focused sales and marketing insights.

Out from the Shadows: Low-Latency Trading Services Gain Mainstream Attention

In the 1987 stock market crash, the world as a whole became aware of electronic information services that had enabled financial institutions to take advantage of small gaps in the timing of market activity between trading in securities futures and options markets and trading in stock markets. "Program trading" was tagged as one of the key villains in the market's rise and fall, resulting in self-imposed limitations at many securities markets on the actions that were allowed to trigger such electronic trading. Investor confidence returned, and the securities markets enjoyed a great, long run of successes. Yet both government regulators and the mainstream media did little to keep an eye on how content technologies were continuing to change the face of financial markets.

In 2010, of course, we are living with the consequences of this lack of awareness of, and response to, the continued progress of financial information services. It's somewhat strange and ironic, then, that while trade magazines have focused for many years on low-latency trading - the technologies that allow near-zero lag time in obtaining financial market information and executing securities trades instantaneously based on that information - mainstream outlets such as The New York Times are just beginning to catch up to the realities of how investment markets have worked in the low-latency era.

Low-latency trading technologies and services are in many ways just an accelerated version of the technologies that began evolving in financial markets during the 1980s. If financial markets have always had something of a casino-like ambiance, then low-latency trading allowed the "whales" of finance to turn the computer-based "back rooms" where these systems work into huge profit-drivers, enabling these investors to move in and out of market positions fast enough to keep ahead of retail investors armed with slower and fewer information sources. Being able to trade in advance of generally available knowledge has always been at the heart of the profits for securities dealers, and information services have been only too glad to help design and deploy systems on their behalf that can give financial institutions an inside edge.

Having an edge in today's Web-based world, though, is tougher to obtain than ever. Most dialed-in small investors are on top of the latest trends impacting markets through online financial information services that deliver news almost as quickly as the news obtained by financial institutions. Many, using services such as Twitter, may in fact be on an equal footing with breaking insights that may trigger shifts in investments. It's no surprise, then, that financial institutions are dialed into social media status and discussion services almost as much as traditional information sources. With so much information transparency, it's hard for traders to profit from securities transactions without enormous volume or without taking on huge risk.

With the instability in financial markets of the past few years, regulators seem to be asking anew what the role of financial securities markets should be in advancing their nations' economies. Put simply, the question that needs to be answered is how best to encourage individuals and institutions to take on the risk of investing in capital markets in a way that helps economies to grow and investors to buy and sell their securities in liquid markets. Unfortunately, the answer to this question may not please either the financial institutions that have ruled financial markets for several decades or the information companies that have serviced them.

The answer to this question is, ultimately, to limit severely the ability of financial institutions to bet against those holding the securities that represent the primary risks in their markets by engaging in limitless trading of financial derivatives products. The derivatives beast unleashed by program trading in the 1980s was never put back in its cage; it was simply given the appearance of civility while it was on the loose. Yes, derivatives serve an important purpose in enabling investors and their trading partners to smooth out their investment risks, but when they displace capital investments as a primary focus and purpose then both those investors and the information services that support them are on shaky ground.

This is not to say that low-latency trading should die along with restricted derivatives trading. If shifts in the marketplace are likely to create new streams of demand for buying and selling a specific stock or bond, then traders should be allowed to stay ahead of that flow in an efficient manner. This can only help to support market liquidity in the long run. But what's been missing for the past decade in capital markets is a focus on the value of fundamental objective analysis of the underlying risks of capital investments.

It's no secret that traditional ratings agencies have fallen short in supporting this goal, even as investment banks have continued to outsource the analysis of many securities to third parties that usually lack detailed insights into the institutions that they are analyzing. At the same time there has been little incentive given to information services companies to bring more transparent fundamental securities analysis to global markets. Most developing nations fight issues of corruption and conflicts of interest in the regulation of their securities markets that are even worse than those found in North American, European and developed Asia/Oceania markets.

Yet if a fraction of the profits that flow out of today's low-latency trading were invested in developing more efficient global securities analysis services, how much better capital markets would be in the long run. I for one would love to have much more confidence in how to invest in small startups and going concerns in the U.S. and overseas. The ability to do this effectively has been lost as information services companies followed the lead of major financial institutions chasing profits in supporting trading in risky derivatives that drew capital away from primary investments.

With the relatively low profits obtainable from everyday securities trades, it's doubtful that today's investment banks are going to bend over backward to enable this to happen, though. Instead, it's far more likely that we'll see a regime of globally-sponsored regulations and funding sources that will begin to lay the groundwork for a new world of financial information services, one that will provide developing businesses and developing nations more efficient access to capital markets. The global propagation of XBRL as a data formatting standard for financial information reporting is likely to make the development of such services more cost-efficient than ever.

Will the likes of Thomson Reuters, Bloomberg and others be the ones to profit first and foremost from such services? Perhaps. But I do think that these emerging trends are just as likely to spawn a new generation of Web-inspired financial institutions and financial information services companies that are willing to package financial information more the way that investors seeking long-term growth in their holdings want to see it. We know where the era of "get rich quick" has led us. It's time to get used to the idea that global "get rich slowly" investing is likely to gain more prominence for financial information services in the years ahead.

Wednesday, May 5, 2010

Beyond Free: Chris Anderson and His Hopes for the "Third Great Platform"

When ContentBlogger last encountered Wired Magazine's Editor in Chief Chris Anderson, he was pitching his then-new book "Free," which chronicled how the economics of Web connectivity were driving both publishers and the global economy as a whole towards new models for delivering products and services. Anderson has made quite a bundle from selling and talking about the book, a delicious bit of irony that underscores how perceptions are the key to arguing for premium price points. It shouldn't come as a surprise, then, that Anderson was talking up the place of premium content at the recent MarkLogic User Conference in San Francisco. In his talk, Chris focused on how electronic editorial tools are being used to create content for Wired Magazine on Apple's iPad. The new electronic "app" edition of Wired will be available via a "freemium" model, through which he expects that some portion of users will be convinced that the full-fare version is worth a subscription or "newsstand" purchase.

Anderson referred to the iPad and its presumed touchpad competitors as the "third great platform," one which will offer the publishing industry an opportunity to "fix" the mistakes it made along the way with Web-based content. In the new Wired iPad edition, you'll find yourself in an environment that looks in many ways like a print magazine, but also in an environment that will allow for interactivity and content such as video. The idea, Chris underscored, is to drive up the amount of time that people spend with their content. Testing with simple iPhone apps used for Conde Nast's GQ magazine indicates that typical readers are using these apps for 50 minutes at a time, so he may be on to something. Towards the end of the talk in the Q&A session Anderson noted, "If we're seeing 40 minutes of use in a year from now, then we'll have been successful."

For some portion of his tech-oriented audience, Chris may very well be right. People like immersive online experiences, as shown in the success of Web and app-based game-playing services, so making an online magazine more like a touch-sensitive content arcade might appeal to some people. Anderson pointed out that before the Web became popular CD-ROM-based multimedia products were fairly successful in accomplishing this type of game-like engagement. After all, electronic games are typically a form of story-telling also. If Anderson can manage to convert a journalism school-bred editorial staff into that kind of story-telling machine, then there may be a success story for magazines in his strategy.

Realistically, though, it's a strategy that relies on a lot of long shots, the longest being that native Web-based content cannot duplicate these kinds of experiences in a way that would undercut the pricing that magazine publishers seek via app-based packaging. What traditional publishers are banking on is that they can create immersive experiences that can be more engaging than those that the Web itself produces. Yes, a reader may spend only four minutes on your Web site, but they create their own immersion by following links as they please and creating content as they please. In essence, the Web enables people to create their own editorial narratives based on curated links, be they curated by professional editors or via social media outlets. From this perspective you might say that the Web as a whole is the biggest game on earth. That's tough for any single publisher to take on.

As Chris pointed out in his talk, radio as a commercial medium was born as a way to engage people long enough to justify sponsors with huge advertising budgets reaching mass audiences. It worked because the medium itself was essentially free to that audience. But commercial radio is, of course, a broadcast medium, a one-way channel that edged out competition from amateurs on nearby frequencies. This was, in a sense, one of the first attempts at "walled garden" marketing for electronic content. There is, after all, no technical reason that amateur communications couldn't operate on the same radio frequencies as commercial communications. The "immersive container" approach to online content that Anderson is advocating faces this very challenge via the Web. The publishing industry is in essence trying to engineer a "commercial band of frequencies" for mobile content. To do this now would be the equivalent of trying to put one-way radio receivers in everyone's home after people had been using highly convenient and advanced two-way radios for more than fifteen years. It's doable, but it's a reach.

I think that these efforts at contained immersive experiences for news-oriented content are largely wasted. As I mentioned to Anderson in a comment after his presentation, prose is not a game. Thinking back to the 1990s era of CD-ROM multimedia that he referenced, there were only a handful of these experiences that were really immersive; most were just a repackaging of pre-existing text and graphics with a bit of glitz around it. CD-ROMs died not because of the Web but because they simply couldn't continue to entertain us with small sets of static content. If organizations like Wired can tell immersive stories using the full breadth of content available to people on the Web, then they may have a chance to keep us in the theatre. If not, then, well, Chris knows as well as anyone else the fate of the "hits" economy. Superstar mass media journalists are unlikely to survive this trend, game-like packaging or not. They're simply no match for dancing cats and highly niched content.

In a sense this brings journalism back to its roots in Rome, when wealthy Romans had their servants write up accounts of what was happening down in the Forum that they could read at their leisure in their hilltop villas. It's a model that works for a handful of people with notable wealth, but it's not likely a model that will fill the pages of online magazines with ads for mass market goods. The rest of us are already having very engaging direct conversations online with many of the very marketing organizations that Anderson hopes to attract into his online magazines. Will his new electronic Wired be able to sustain engagement with readers that goes beyond today's Web enough to attract mass marketers that are already learning how to have their own massive Web conversations? For some, perhaps. Perhaps there will even be enough to build a nice magazine business model for mobile platforms. But my guess is that it will not happen at the price points or for the size of audiences that publishers are hoping for. Good luck, Chris; let's compare notes a year from now.

Tuesday, May 4, 2010

MarkLogic User Conference 2010 - Content as the Platform

Give an analyst a free conference pass and travel expenses to go to a rocking conference with hundreds of key people attending from content and technology companies and it's worth at least a thought. Make it MarkLogic that's doing the hosting, and add a slot to moderate a great panel on mobile markets, and it's pretty much a lock in my book. MarkLogic does an excellent job not only of attracting good speakers but of forming a really cogent program of topics that rivals many trade associations' for its quality and thoughtfulness. Mind you, the MarkLogic User Conference 2010 was sponsored, but by a company that's right in the thick of the platforms that are creating value in today's content industry.

And what is creating value? Being able to transform, combine and deliver content from as many sources as possible that are relevant to an audience on as many platforms as possible. In some ways cross-platform is the platform of choice these days, with a proliferation of personal, professional and mobile outlets for content that are driving content use wherever and whenever it becomes useful in the moment. MarkLogic is far from the only company in this mix, but it has certainly come a long way in the past several years to position itself as one of the leading platforms for cross-platform content aggregation and deployment.

I was commenting on the individual speakers and panels during the event on Google Buzz and tweeting links to those posts using the #mluc10 tag, and have listed the Buzz links here along with an event wrapup.
To sum up the conference, it was a great event with lots of good presentations and discussions that often brought the ballroom of the Intercontinental Hotel in San Francisco up to full capacity. Since much of the event was multi-tracked I had to miss a lot of the presentations, but what I was able to see painted a very interesting picture of how content development is moving ever closer to the user interface via technologies such as MarkLogic's. While "cross-platform as the platform" was the theme I chose for the panel that I moderated on mobile markets, in a broader sense technologies like MarkLogic that help to compress the time and complexity required to express structured content as user experiences are turning content itself into the platform.

It's not just that an XML server can aggregate content from many different types of sources in a common format and structure. On the front end of that server, both the MarkLogic-extended XQuery language and more common standards for content expression such as XSLT (soon to be supported directly by MarkLogic) collapse the complexity and layers required to take content from a raw, normalized form into usable content on the Web and in applications. In a sense, we're at the point where Web browsers and app containers on mobile platforms are sophisticated enough that we no longer need oodles of complex infrastructure out on the Web to serve up relatively dumb content. Content can be expressed to browsers with dynamic elements of both structured content and programmed presentation so tightly bound with data query tools that the need for many intermediate layers of programming and functionality begins to fade away. In effect, the query itself becomes the expression of the content - a Content Management System on the fly, if you will.
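To make that idea concrete, here is a toy sketch - emphatically not MarkLogic's API, and XQuery itself would be terser - of what it means for a query to return presentation-ready content directly, with no CMS layer in between. The document store and field names are invented for illustration.

```typescript
// A toy illustration of "the query is the content": one function takes a
// search term and returns presentation-ready HTML straight from stored,
// normalized documents, the way an XQuery module on an XML server can
// emit XHTML directly from a query.
interface Doc {
  title: string;
  body: string;
  tags: string[];
}

// Stand-in for content aggregated and normalized from many sources.
const store: Doc[] = [
  { title: 'Cross-Platform Publishing', body: 'Content flows to any device.', tags: ['mobile'] },
  { title: 'XML Servers', body: 'Query and presentation, tightly bound.', tags: ['xquery'] },
];

function renderQuery(term: string): string {
  const hits = store.filter(
    (d) => d.title.toLowerCase().includes(term) || d.tags.includes(term)
  );
  // Query result and markup are a single expression: a CMS on the fly.
  return hits
    .map((d) => `<article><h2>${d.title}</h2><p>${d.body}</p></article>`)
    .join('\n');
}

console.log(renderQuery('xquery'));
```

The point is not the ten lines of code; it's that nothing sits between the query and the rendered page that would need its own servers, schemas and staff.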

An interesting example of this surfaced in one of the presentations, which focused on MarkLogic's technology strategy for managing dynamic failover. In essence, rather than trying to manage failover synchronization issues via an operating system's file system, MarkLogic decided to bypass the file system's data writing features and write to the computer's disk partitions directly. To put it more simply, MarkLogic largely eliminates the need to rely on one of an operating system's most essential features for reading and writing data from storage in order to keep backup systems in sync (this is an old technique, but applied in an interesting way). So from a very low, "close to the hardware" level to a very high, "close to the interface" level, MarkLogic technologies are eliminating whole layers of cost, complexity and investment needed to get sophisticated content pulled from many sources and published effectively. You can add those layers back in as needed, of course, but they're far less necessary now.
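For flavor, here's a minimal sketch of the general technique only - MarkLogic's actual implementation wasn't shown at this level of detail - writing fixed-size pages to a raw partition at explicit offsets, so a primary and a backup applying the same page stream stay byte-for-byte identical with no file system in between. The device path is a placeholder, and running something like this for real requires root privileges and a partition you can safely overwrite.

```typescript
// General sketch of raw-partition page I/O, bypassing the file system.
import { openSync, writeSync, readSync, closeSync } from 'fs';

const PAGE_SIZE = 4096; // one fixed-size page

// '/dev/sdb1' is a placeholder block device, not a recommendation.
const fd = openSync('/dev/sdb1', 'r+');

// Write a page at an explicit offset. No file-system allocation or
// metadata is involved, so replicas applying the identical
// (pageNo, page) stream remain in sync deterministically.
function writePage(pageNo: number, page: Buffer): void {
  writeSync(fd, page, 0, PAGE_SIZE, pageNo * PAGE_SIZE);
}

function readPage(pageNo: number): Buffer {
  const page = Buffer.alloc(PAGE_SIZE);
  readSync(fd, page, 0, PAGE_SIZE, pageNo * PAGE_SIZE);
  return page;
}

writePage(42, Buffer.alloc(PAGE_SIZE, 0xab)); // write a test pattern
console.log(readPage(42)[0].toString(16));    // read it back: 'ab'
closeSync(fd);
```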

This allows people who are content experts and user interface experts to get to the point of being able to apply their insights far more rapidly and effectively. There were numerous examples of this at the MarkLogic conference across a wide variety of publications and applications, including analysis of national security threats, scientific journal publishing and custom book publishing. Instead of getting caught up in technology issues that are far removed from the display containers such as browsers in which content is expressed, most of the effort can now be spent far closer to those platforms and their users. As the breadth and sophistication of content display tools have increased dramatically in recent years, this is a critical factor for spending product development dollars wisely.

You really cannot afford to spend too heavily on infrastructure far removed from users when your return on investment in publishing comes from managing the user-level complexities required to build large audiences through today's explosion of services, devices and display formats. You have to be very close to the content, in a form that's readily transformed for users. Content is the platform indeed, leaving most everything else behind that content as a commoditized cloud supporting it. Specific performance requirements such as high levels of security or high-speed delivery may draw specialized technologies into this mix, but it will be increasingly rare that we'll see standing IT departments dedicated to layers of technologies that are easily eliminated via the "content as the platform" model.

When I first met MarkLogic CEO Dave Kellogg many years ago, I was one of the first analysts for whom he was turning slides in his pitches to sell the world on the idea of easily queried XML servers. Much has changed for MarkLogic since those early days, but the most important change is that publishers of all sizes in many markets are now "getting" the MarkLogic vision and adopting it to accelerate their content services development. Between technologies like MarkLogic and concepts such as cloud computing and software as a service, content technologies are pushing us towards an era in which more efforts by publishers can be spent on highly satisfying electronic content experiences rather than on IT for IT's sake. 

User-Generated Content as the Platform: Thoughts about Salesforce.com's Acquisition of Jigsaw

On one level the recent $142 million acquisition of the Jigsaw online business information service by software-as-a-service vendor Salesforce.com is a no-brainer. Salesforce.com's AppExchange has featured Jigsaw's Prospector app as a highly rated option for people wanting to integrate fresh contact and company information into their Salesforce marketing data. While other business information providers such as Hoover's, OneSource and InsideView have also been aggressive with providing Salesforce content integration, Jigsaw stood out as an up-and-comer with a rapidly deepening database of content that is now refreshed regularly both from its online community of professionals and from enterprises using its feed-oriented data cleansing services.

Given that Jigsaw's business model emphasizes content sharing as a means of generating quality data, being able to facilitate that sharing via Salesforce's rapidly expanding base of sales and marketing-oriented customers gives Salesforce some very interesting advantages. Professionals using Salesforce.com for contact management are adding new contacts regularly. Being able both to harvest and to supply those updates as part of a database drawn from Jigsaw in a more integrated service can enable Salesforce.com to become a business data powerhouse in its own right, underscoring its "solutions, not software" sales theme quite handily. Certainly Salesforce.com intends to remain "friendly" with major business information suppliers, as underscored by its decision to have Jigsaw stand as a separately held company for now. By the same token, the success that business information suppliers have had with Salesforce integrations continues to be one of their best entry points into user-driven revenues that are tightly bound with a workflow-oriented Web platform that's driving productivity in enterprises of all sizes.

But as with its budding social media efforts, Salesforce.com is perhaps acknowledging in its Jigsaw acquisition that to some degree content itself is the platform that will drive its future growth. SaaS is seeing widening acceptance in many enterprises, but enterprise-oriented business information suppliers have in a sense been in the SaaS business for many years, providing tools that are targeted increasingly at integrating their premium data sources into enterprise workflows. In fact, if you look at major enterprise content technology plays, many that rely on end-user applications, either on or off the Web "cloud," are stalling in their growth for lack of unique and regularly refreshed content from relevant sources as an integral part of their services. Enterprise-oriented content technologies are in many ways very mature at this point, turning the advantage to those that can leverage them to deliver new content and new types of content services more rapidly than their competitors. Thinking of the MarkLogic user conference that I am attending this week, its XML database and query technologies enable clients to develop content clouds very rapidly from any number of content sources. While the MarkLogic "glue" is very important in its own right, it is its ability to present well-formatted content for any number of display applications that is the essence of its platform value.

The "glue" from a Jigsaw perspective is its users, both those using the service directly and those feeding in updates via enterprise-supplied feeds. I find myself speaking increasingly with clients and other professionals about owning user profiles as a key ingredient in generating content value. While networking with other services' user profile logins can accelerate the use of an information service, being able to activate people using your platform to be suppliers as well as consumers of high-quality content creates both content and a networking effect that can generate unique value that's expressed both in data quality and customer retention. Salesforce.com sees this in a rudimentary way with its new Chatter online collaboration service, but it's a model that's been proven in the financial industry for decades based on contributions of private deals and securities quotes on vendor networks with private logins.

So who will get to the users first to "own" the user-generated content platform, the technology platform providers or the information companies? It's a footrace, to be sure, since both are focusing intently on many of the same customer requirements. At this point, though, you have to give somewhat of an edge to companies like Salesforce.com and Jigsaw that have a more agnostic view of what constitutes quality content for enterprises. Traditional publishers are often far less of the "quality is as quality does" mindset, with legacy operations oriented towards content packaging that may not move easily towards locking down content assets contributed by their audiences, either directly via services like Jigsaw or indirectly via the usage data that those audiences provide. But this will vary quite a bit from one marketplace to another. With a "horizontal" focus on business roles found in many different types of companies, Salesforce.com can "own" the content platform more easily than it could in vertical markets that have dominant sector-specific content technology providers.

Still, the general trend seems to be towards a new middle ground, with enterprise content and technology providers offering whatever it takes to get people to use their platforms as the primary place where they share important information about their areas of expertise. You need not call it "social media" or "user-generated content" if those phrases make you squeamish; you can think of it simply as a new way to "own" a unique relationship with your clients that can lead to subscription contract renewal lock-ins that will be very hard to dislodge. Or, in the instance of Jigsaw, it can even lead to new incremental business models that can turn usage into self-service sales far more pervasively and rapidly. Whatever the label or the method, you have to take your hat off to Salesforce.com for locking in a key growing content resource. In the battle to own highly valuable content generated by business professionals around the world, score one for the platform.

Right Tree, Wrong Bark? AP's Cross-Platform Initiatives Offer Mild Hopes

Lately it's a rare day when there's good news for the newspaper industry, but paidContent.org notes that the recent board meeting of the Associated Press held out an intriguing glimmer of hope for beleaguered news providers. It would seem that the AP intends to offer what amounts to centralized business development for its member newspapers and, it would appear, even for non-members seeking a fairer shake from an increasingly fragmented marketplace for news.

While AP members continue to distribute and consume news via AP's subscription feeds, increasingly they focus on the placement of their own content and links on Web sites and mobile information services. This is somewhat good news for these news organizations, but not as good as they might hope. With small business development teams and labyrinthine legal hoops to jump through, the typical news organization is not very effective at extracting revenues from Web and mobile platforms.

Enter the AP, which hopes to offer a more industry-wide approach to negotiating with mobile and Web platform providers. Already tracking where and how AP content from members is used around the Web, AP is potentially in a good position to help news publishers use their collective clout to do unto Web sites large and small what the AP has done fairly successfully with major targets like Google. In other words, if AP's distribution technologies are no longer a key market advantage for its members, its sales force and its ability to manage various business models cost-effectively across a wide range of news consumers might be a good way for it to find a place in news' future.

In a general sense it appears as if the AP is locking in on the right target - delivering a scalable business model for news monetization that's more in sync with today's content technologies. It's an approach that's particularly important as mobile platforms begin to proliferate, especially since so many mobile technology and services providers are eager to take a healthy cut of revenues for the right to their "exclusive" platforms. Mobile markets are rife with artificial scarcity, and overdue for a "content cop" like AP to come in and lay down the law for how news content can and should be licensed in a Web-aware era. The main question, though, is how well the AP will be able to define unique value propositions to mobile and Web news outlets.

This leads towards the other key leg of the cross-platform initiative that the AP is planning. While intentionally vague at this point, AP Chairman Dean Singleton noted that "We need one voice, not only to work on business relationships but also new products that we might do together and also application frameworks so we decided AP should speak for the industry and work for the industry." In other words, AP seems to be moving towards the concept of "journalism as a platform," a combination of technologies and business development services that can enable traditional news outlets to compete more cost-effectively by building more powerful licensing networks and audiences in a Web-scaled world of content distribution.

If the technologies developed by the AP are targeted mostly towards effective cross-platform licensing and user profile management, I think that this might be the right bark coming from the right dog under the right tree. There are far too many platforms emerging for any one news organization to approach effectively on its own, so collective bargaining with the proper leadership could prove to be very advantageous. But in staying "hands off" with initiatives such as Press+ while at the same time mentioning "new products," I am concerned that AP may try to overreach its targeted goals with cumbersome centralized technology development efforts not in tune with today's news world.

You can go after all the platform providers that you want for content licensing, but it won't change the fact that billions of people around the world can set up a blog or a Facebook page for free in no time flat and become influential news publishers in their own right, AP Stylebook be damned. Yes, journalism quality matters, but the quality of technologies specific to news production is very hard to define as a market advantage today. Short of a massive rethinking of how the news industry develops and deploys news-generating technologies, I don't hold out much hope at this point for the "products" focus in AP's vision.

But a year is a long time on the Web - though it may not prove to be long enough for many struggling news providers. Here's hoping that AP can nail down both the right business development techniques and the right technology tools quickly enough to help ensure a lasting presence of quality journalism in electronic markets.