Wednesday, July 29, 2009

The New Yahoo: Go-It-Alone Rhetoric Takes a Back Seat to Web Realities

If I had a dollar for every opportunity over the past few years to blog about the ins and outs of Yahoo's present and future, I could take you out for a pretty good dinner. The soap-operatic saga of how the leading but beleaguered Web portal lost many opportunities for greater industry dominance is well-chronicled, but the deal now being completed for Yahoo to use Microsoft's new Bing search engine in exchange for Microsoft using Yahoo's ad network appears to set the stage for a new assessment of Yahoo's place in the online content industry, one that rises above the usual cult of obsession with Silicon Valley personalities. More importantly, this deal is not the only step that Yahoo is taking to strengthen its position as an online destination that solves problems for people with engaging content.

On at least one level the deal appears to be a no-brainer. Yahoo's search capabilities are quite good for consumer search, but they lack Microsoft's investments in the engineering mojo of its Powerset-enhanced Bing search engine to accelerate the maturing of search results into rich, contextual content. Yahoo has good ad technology and brand marketing, but needs both more inventory and more overall market share to get a more serious share of advertisers' budgets. Each organization will be able to take capital out of competing for their common but smaller pieces of the online search and ad pies and concentrate more on drawing market share away from Google and other sites using Google services. In doing so they will be able to build online and mobile revenues more effectively through their combined audiences.

This is all good, and probably much-needed competition for Google to strengthen the online breed. It also puts Yahoo's efforts to re-engineer its future as a direct competitor to Google comfortably in the past: Yahoo's greatest growth came during its earlier technology partnership with Google, which allowed Yahoo to concentrate on user experiences and content partnerships more effectively. Different partners, now, but similar opportunities await. So in spite of the "Yahoo has thrown in the towel" rhetoric floating around - or worse - there's reason to believe that this alliance is a good step towards Yahoo using its more limited assets to do what most successful Web companies do anyway: use alliances to do what you do best and to leave the rest to others. Bing will kill the Yahoo brand no more than Google's search and ad alliance killed the AOL brand; there's plenty of room for Yahoo to be a strong aggregator and services provider through and around Bing's capabilities. It may also, of course, be a way for Microsoft to absorb the benefits of Yahoo one step at a time while avoiding regulatory issues that an acquisition might raise, but given the iffy online future for both companies individually, a trial marriage through this deal that strengthens the assets of both companies is probably a more realistic step at this time than risking capital on a merger.

Yahoo is also not relying simply on Microsoft to reposition its strengths in the Web marketplace. In today's world of virtual aggregation, Yahoo's recent home page redesign beta, which includes links to major online Web sites such as Facebook and eBay, is an indication that Yahoo has finally accepted that its strength as a brand can't grow exclusively on traditional content licensing deals. If Yahoo is to be the "starting point" of using the Web, as suggested by Jerry Yang, Yahoo's co-founder and former chief executive, then it has to do as the Web itself does and become more adept at using links as a form of powerful brand endorsement. A media cynic may look at this and say, "Well, it's nothing more than a big Huffington Post with some extra ecommerce features," but if it does what people want it to do and they come back for more, then, well, who's going to laugh last? A successful product is first and foremost about meeting the needs of your markets cost-effectively, after all.

There are still many hurdles for Yahoo to overcome before it can be labeled a truly "hot property" again, but the new Microsoft alliance and the home page redesign are both key indicators that Yahoo is focusing increasingly on the things that will keep people coming back for more. The days of walled gardens filled with licensed content built one deal at a time are waning, but that leaves many hopeful days ahead for those who help people make the most of their online experience in whatever garden suits them best. Hopefully Yahoo will remain a key player in those efforts through their latest moves.

Monday, July 27, 2009

Smelling a Better Business Model, AP Deploys Its Own Open Content Tracking Microformat

AP has taken quite a bit of heat of late from industry pundits because of its highly visible copyright enforcement efforts, but it has also been looking at ways in which it can leverage emerging technologies to build a better business model. It's no secret to AP that there's more money to be made in sweet solutions than in sour legal tactics, and also no secret that its traditional business model of licensing feeds of content to a handful of select distribution partners is a cumbersome way to develop new revenue streams in the Web era. But what is somewhat surprising is that AP has bypassed a number of technology companies courting them for their services to come up with their own solution to these issues.

In cooperation with the Media Standards Trust, AP is leveraging W3C-defined standards for coding XHTML Web pages with data microformats to launch a news registry service that will track the usage of AP content across the Web. The hNews microformat proposed by AP is an open standard that provides metadata such as source organization, dateline, principles behind its creation and, of course, rights definitions. The hNews format in and of itself is not an enforcement mechanism, but rather simply a series of data definitions that enable software to take actions based on that data. For example, the openDemocracy Web site is experimenting with hNews microformats, but one can easily cut and paste content into a blog or Web page without restriction.
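To make the idea concrete, here is a rough sketch in Python of how software might read hNews-style metadata out of a marked-up page. The sample markup and the specific class names used here ("source-org", "dateline", "item-license") are illustrative assumptions based on the properties described above, not AP's published schema, and a real consumer would use a full microformat parser rather than this minimal extractor.

```python
# Minimal sketch: extracting hNews-style metadata from marked-up XHTML.
# Class names and sample markup are illustrative assumptions.
from html.parser import HTMLParser

SAMPLE = """
<div class="hnews hentry">
  <h1 class="entry-title">Local Election Results</h1>
  <span class="source-org">Example Wire Service</span>
  <span class="dateline">SPRINGFIELD</span>
  <a class="item-license" href="http://example.com/license">Usage terms</a>
</div>
"""

WANTED = {"source-org", "dateline", "item-license"}

class HNewsExtractor(HTMLParser):
    """Collects the text content of elements carrying hNews-style classes."""
    def __init__(self):
        super().__init__()
        self.metadata = {}
        self._current = None  # the metadata class currently being captured

    def handle_starttag(self, tag, attrs):
        classes = set(dict(attrs).get("class", "").split())
        hit = classes & WANTED
        if hit:
            self._current = hit.pop()

    def handle_data(self, data):
        if self._current and data.strip():
            self.metadata[self._current] = data.strip()
            self._current = None

parser = HNewsExtractor()
parser.feed(SAMPLE)
print(parser.metadata)
# -> {'source-org': 'Example Wire Service', 'dateline': 'SPRINGFIELD',
#     'item-license': 'Usage terms'}
```

The point of the sketch is the one made above: the metadata itself enforces nothing, but any software that can see it can act on it - reporting usage, displaying attribution, or checking expressed rights.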

What hNews enables in theory, though, is software that can reference hNews metadata to send information about how content is being used in relation to the rights expressed in that metadata to other points on the Web. AP claims in various postings and articles that it is leveraging hNews to drive a "beacon" program that will report back to AP how its content is being used. There are no readily available details on this beacon program, only vague statements describing how it would be used. Presumably it would operate somewhat like the Tracer technology from Tynt, which embeds a small piece of JavaScript in a Web page that affects how content is copied and enables usage reporting back to a central service.

Although AP's registry is not like a digital rights management scheme that "locks up" content in an encoded digital wrapper that prevents viewing by unauthorized people, AP seems to be going out of its way to make statements claiming that it is a protective technology for publishers. A widely circulated graphic from AP states that their registry provides a "protective format" which puts content in a box-like "container" that will enable content usage based on rights expressed through hNews. Without more express details about AP's beacon technology it's hard to draw any real conclusions about these claims, but clearly the concept is to enable viewing while enforcing policies on content reuse through software that is activated via approved distributors of AP content.

Reactions to AP's initiative have been muted in many instances and downright hostile in others, including a sharply worded post by Jeff Jarvis which claims that AP needs to be replaced by a better way to manage news distribution (which he hopes to help mastermind, of course). Ironically, Jarvis's scheme to compensate link referrers from ad revenues obtained by the owners of the original content is not so different in its ultimate goals from what AP is trying to accomplish: rewarding recognized third parties who are helping to increase the value of original content. While Jarvis is right that there are few inherent market advantages today in AP's core business model that would prevent entrepreneurs from usurping its role in helping news organizations get more value from their content production, it will take more than "reverse syndication" - compensating people who provide links to content - to provide meaningful revenues to today's news organizations.

It is this financial gap between the "link economy" and traditional news feed licensing that AP is hoping to target with its initiative. AP hopes that people who have an opportunity to use content from organizations that use their registry will pay for a license to use that content on a commercial basis. In other words, links provide context for original copies of content, but AP wants to encourage licensed content to appear in as many contexts as possible where it can make money. A link to an original article in a general news Web site on making Christmas ornaments, for example, is not going to have the same value to advertisers as that same article in a microsite or special section focused on preparing for that holiday. The value of links on the Web is indisputable, but facilitating revenues from repurposed content in a more automated and exact fashion is at least as important for many original content producers. In an era in which reproducing and distributing content is largely trivial, being able to create valuable contexts for content is the market differentiator that drives content value.

While I have strong reservations about how effective AP's "beacon" technology will prove to be, it's only fair to acknowledge that AP is at least trying to grapple seriously with how to build a more effective infrastructure for licensing its content in an era in which content distribution is highly commoditized. AP also opens the door to enabling any number of publishers to do so as well - which could lead to an expansion of AP's role as a source of more efficient content licensing services. From this perspective, the AP registry initiative may enable AP to license its content more rapidly and efficiently to an ever-widening range of distribution partners that they will need as outlets for content from AP member news organizations hard-pressed to keep their operations afloat through their stand-alone Web sites. It's unfortunate that it is taking mainstream publishers so long to get these concepts underway (we've been talking about them for years), but at least AP is starting to pick up the scent of a viable new business model.

The Future of Scholarly Publishing: Elsevier Experiments with Online Journal Formats

Scholarly publishing faces as broad an array of challenges as any other segment of the traditionally print-oriented publishing industry, with push-back from traditional sources of purchasing such as academic and corporate libraries, a dip in ad revenues, and a rising array of potential online substitutes for sharing scholarly information. No surprise, then, that Elsevier would be floating two new prototypes for presenting its scholarly journal content in online formats. Presented in two possible combinations of enhanced content and navigation, the prototype articles provide features such as tabbed sections for different types of content, interactive illustrations, multimedia clips and user comments. ReadWriteWeb notes that in some ways the prototype online journal articles are more a collection of current best practices for organizing online content in a more sophisticated way, perhaps reminiscent of how "social media press releases" have been used to promote more efficient reuse of PR materials, and suggests that many of the features are moot since most scientific readers are going to read an entire article anyway.

There's some truth in RWW's observations, but the full measure of the Elsevier prototypes is greater than the scope of their media-oriented comments. In the race to come up with more effective scientific, technical and medical research, products and services, organizations using scholarly content from publishers such as Elsevier are seeking to find ways to integrate that content into people's workflows in ways that will accelerate their ability to obtain breakthrough insights. Segmentation of journal content into forms that are more easily repurposed for any number of software applications and online services is therefore an essential step. Services such as Knovel Library have been doing this as a post-production service for several years for scientific reference publishers, creating easily referenced charts, interactive graphs and other services that accelerate productivity in the SciTech workplace.

So as much as Elsevier's prototypes are important as presentations of journal content intended for accessing different aspects of a specific journal article, the prototypes also indicate more movement by Elsevier to provide "pre-shredded" content that can be easily repurposed for reading and for insights that look for patterns across many articles. With that in mind, some of the obvious potential shortfalls of the experimental formats are somewhat forgivable. For example, while comments appear in one of the prototypes (the features were probably split into two groups to make contrasting options easier to absorb), the use of these formats to act as an anchor for ongoing broader discussion of a particular research topic appears to be fairly limited. That's probably fair game for add-on applications, developed either by Elsevier, their clients or third party suppliers.

I find the prototype formats to be useful and appealing, though I do agree with RWW that these represent in large part current online best practices. They are necessary changes, in all likelihood, for any scientific publisher to undertake these days. However, as many mainstream media organizations have discovered in their push to integrate content into more sophisticated rich online presentations, necessary changes do not always translate into changes sufficient to guarantee stable or improved revenues. These new formats are a strong indication that scientific publishers are grappling with the right issues as to how to improve their content for their audiences, but in and of themselves they may not change the debate on content value that they have with many of their current enterprise buyers in a fundamental way. What is more likely to happen is that they will enable additional value-add applications and services that will set the stage for enhanced value to their clients - and enhanced publishing revenues. Here's hoping that the experiment continues and moves in even more positive directions.

Monday, July 20, 2009

From Terrorist Detection to Market Insight: Understanding Our New "New Rules of Engagement" Subscription Study

I am at a customer site today as part of our team that is delivering the results of a project based on our new narrative research techniques, which we're using as the basis of our new subscription study, "New Rules of Engagement: Re-Tooling Information Sales and Marketing for the New Economy," sponsored by the Software and Information Industry Association and the Special Libraries Association. Narrative research has evolved out of efforts to understand the often weak and ambiguous signals from global terrorist networks. Needless to say, you can't really do market research on terrorists, but we saw that this technique is an excellent way for our clients to analyze customers rapidly in an innovative way that fits with many of their most critical research needs.

As with terrorist networks, many publishers and technology companies are dealing with rapidly shifting client behaviors, with lots of asymmetrical behavior that's difficult to analyze using traditional research methods. In traditional research, one formulates a hypothesis to test using quantitative or qualitative research techniques. In qualitative studies, for example, someone interviews subjects and then filters down the results into a cohesive picture. In quantitative research, a questionnaire asks specific questions that require people to choose among specific possible responses. These are both good techniques if you want to filter out a lot of possible answers that may not be your focus. But as good as that can be, many of the opportunities and threats that our clients face lie beyond this type of pre-determined focus.

An analogy as to why this is important was used in our client presentation today. We asked the people in the room to look at a short video of six people passing basketballs to one another, three wearing white shirts and three wearing black shirts, and to count the number of times that the people with white shirts passed the ball to one another. There was some disagreement on how many times the white-shirted people passed the ball, but surprisingly several people missed another key input - a person in a black gorilla suit walked in and out of the scene during the passing. In other words, our ability to filter and to concentrate on a specific goal not only may fail to give us exact answers but may also lead us to ignore interesting phenomena that could be important, or to fixate on ones that are actually just a distraction.

Narrative research addresses this key gap in human perception by enabling people to tell and to code unbiased stories about how they use or make decisions relating to products and services, with software then relating their responses to key themes. When patterns emerge from this process, research sponsors can refer back to the original, unbiased stories and find new ways to interpret them, individually or in aggregate, instead of being "locked in" to the specific biases or ideas that framed the research. When you get enough stories to draw statistically significant conclusions, the result is an extremely powerful database that can answer different questions again and again over time on a very cost-effective basis. If you add more stories to that database over time, the results can be even more powerful, as you can begin to track changes in perceptions that you would not have been able to detect if you had had to form a specific idea ahead of time for testing via traditional research.
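As a toy illustration of the aggregation step described above, the sketch below codes a handful of invented stories against themes, tallies the themes to surface a pattern, and then drills back into the original stories behind that pattern. The stories and theme labels are fabricated for illustration and are not drawn from the actual study.

```python
# Toy illustration of the narrative-research workflow: stories are coded
# against themes, theme frequencies are aggregated, and the original
# unfiltered stories remain available for re-interpretation.
from collections import Counter

stories = [
    {"text": "We dropped our feed vendor when renewal costs doubled.",
     "themes": ["price sensitivity", "vendor switching"]},
    {"text": "Our analysts now start research in free Web sources.",
     "themes": ["free alternatives"]},
    {"text": "Procurement pushed us to consolidate subscriptions.",
     "themes": ["price sensitivity"]},
]

# Aggregate theme frequencies across all stories.
theme_counts = Counter(t for s in stories for t in s["themes"])
print(theme_counts.most_common(1))  # -> [('price sensitivity', 2)]

# When a pattern emerges, drill back into the unfiltered source stories
# rather than relying on a pre-framed hypothesis.
matching = [s["text"] for s in stories if "price sensitivity" in s["themes"]]
```

The key property is that the raw stories survive the aggregation, so the same database can be re-queried later with questions nobody thought to ask when the stories were collected.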

The net result for "New Rules" subscribers will be a rich, reusable resource of hundreds of stories from executives and implementers in enterprises telling how they use and make decisions on obtaining information services that they use to perform their jobs. In today's volatile economy, being able to hear unbiased stories from these complex and shifting decision makers and to analyze them quickly and effectively can be a critical factor in responding to the many changes in organizations that are compelling new and accelerated approaches to buying and implementing enterprise information services. Combined with the on-site workshops that we will be conducting for the core research subscribers, I expect that "New Rules" will be a core element of many companies' strategy planning efforts this year. I encourage you to investigate our prospectus and to see if you're ready to take advantage of this ground-breaking approach to market research that can power the marketing of your information products and services.

Sunday, July 19, 2009

Life with Daylife: On-Demand Feature Content Development Grows Up

There have been any number of content aggregation services surfacing in recent years that have helped publishers to expand the richness of their Web sites. APIs, feeds and other tools are helping publishers to power up new online presences that offer new opportunities for ad revenues and audience building. But as the marketplace for news begins to revolve increasingly around passing topics, it becomes harder to use such tools to respond rapidly enough to revenue-generating opportunities. The "bogie" most often mentioned for this new model is The Huffington Post, which has made an art of whipping up special sections of content and links from a wide variety of sources, focused on headline-grabbing topics and wrapped with its own layer of editorial content from bloggers. How do publishers respond to this model with their own instant feature sites?

Enter Daylife, a content aggregation service which is evolving past APIs and feeds to deliver through its Daylife Select service what you might call a HuffPost-in-a-box: a service that can enable publishers of all kinds to rapidly develop new and improved online content focused around specific topics. Using its own blend of semantic analysis and content serving technologies, Daylife can serve up text, photos and other multimedia content from a wide variety of sources or from a publisher's own content to create complete pages of topic-specific content very rapidly - complete with built-in ad inventory. What I find impressive about Daylife Select is that it is really a complete publication in its own right with great usability and appeal, as seen on its own site - not just your typical autopiloted content technology. Content served up automatically can be managed by a non-technical staff to deliver a true editorial presence and can be supplemented by original content such as a publisher's own blogs through Daylife technology. Instead of waiting days or weeks to get APIs and other tools set up, Daylife Select can provide a tailored, branded and highly navigable topic-focused presence for many major themes within minutes.

Most importantly, although many major publishers are using Daylife technology to whip up valuable focused content, major consumer companies such as Kellogg's and Purina are also using Daylife to deliver focused content for their own clients. The idea of companies developing their own content to attract people in their marketing scope is nothing new, of course, but the ease with which this can be done through a service such as Daylife begins to show how important it is for publishers to be able to support marketers rapidly and effectively, with content aggregated in whatever form their clients need and with whatever overarching branding serves those clients best.

To paraphrase Forrest Gump, "content is as content does"; that is, the content brands that are willing to work actively through tools such as Daylife to aggregate whatever content works best for their audiences and their marketing partners win the publishing game. A simple concept, but one with which many publishers continue to struggle as they try to adapt traditional editorial methods to today's content aggregation tools, which enable many editorial functions to fall into place automatically. Yes, a service like Daylife cannot replace all of the editorial value of a traditional newsroom and more robust editorial content development platforms, but when it can provide most of the robust functionality that people expect from an online publication today, along with access to deep and high-quality content, it's time for publishers to think more actively about how they can use tools such as Daylife to enable their content to succeed in any number of topic-specific "instant portals" and other efficiently managed content presences.

In other words, why complain about HuffPost when you can succeed with their model any number of times over in any number of content categories? It may not bring back the salad days of high-flying publishers, but this type of rapid and effective content aggregation may help publishers to deploy focused publications with content from a wide variety of sources far more cost-effectively - and in doing so make the best use of their native editorial resources far more efficiently. I think that we're going to see more services like Daylife coming to light over the next few years, a trend that offers great promise for publishers if they can master it well.

Monday, July 13, 2009

Welcome Stuart Weinstein to the Shore Team

You may have noticed that the Web site for Shore is looking a little fresher these days, thanks in part to some overdue updating of our widget technologies and thanks in larger part to the expansion of our team, which is dedicated to offering you the industry-leading insights that will help you to accelerate your content and technology marketing efforts. I am proud of the many experienced professionals who are now on board as part of the Shore network.

One key addition that you may have noticed is Stuart Weinstein, who, as our Business Development specialist, is helping Shore to focus on expanding our range of services available to you. First task on deck for Stuart: the marketing of our New Rules of Engagement subscription research service. Stuart Weinstein is a veteran of enterprise content sales with a long track record of success in the information industry.

In addition to helping Shore with its own business development efforts, Stuart is available to help your business engineer more robust sales and customer retention programs. Perhaps that will mean fresh thinking about cross selling, or a push to develop a VAR strategy to reach new clients. Perhaps your current sales personnel need to be realigned. Stuart can analyze your current sales infrastructure and then help you to design a new or enhanced model that will help you turn more prospects into customers, and then keep them as customers for the long haul.

Along with the rest of our team, Stuart believes in the comprehensive and holistic approach to marketing and selling information services that other Shore team members use to help our clients to accelerate their success. Stuart's "bona fides" include more than 17 years experience in sales, consistently exceeding quotas and developing a reputation as an expert in database marketing. He has built and managed sales teams, with full P&L responsibility, for some of the biggest names in the content industry. And he has designed powerful customer acquisition, cross-selling and retention strategies.

Before embarking on his consulting life, for example, Stuart held a senior sales position with Round1 Private Capital Marketplace, Inc., in New York City, where he sold market data to venture capital firms with substantial assets under management. He also served as an integral member of a management team that established product development priorities, features, functions and pricing strategy.

An earlier position was with Datamonitor, Inc., also in New York, where Stuart was the senior manager in the financial services sector. He built the sales team, managed territories and developed commission and bonus plans.

Prior to that post, Stuart spent a number of years at Thomson (now Thomson Reuters) units Gale Group and Intelligence Data. Among other assignments, he was responsible for managing major accounts. In one year, he exceeded quota by more than 200%.

In his off hours, Stuart is an avid and competitive sailor in his hometown of Fairfield, Connecticut. Indeed, on many evenings during the warm weather months, Stuart’s is the last of the boats on Long Island Sound to return to harbor as darkness falls.

Does your company need a new perspective on its sales efforts, and then a realistic, workable game plan to make it happen? Give Stuart a call. Welcome aboard!

Viralheat Redux: Real-Time Social Media Monitoring for Everyone

About a year ago I had the opportunity to look at Viralheat, a media monitoring service that at the time was focused on real-time analysis of trends in online video services such as Hulu. Viralheat was good stuff and ahead of its time in many ways, though positioned as a high-end service aimed at a fairly narrow audience. It was interesting, then, to see recently the evolution of Viralheat into a more broadly based real-time trend monitoring service that covers a wide array of social media outlets and regularly updated Web sites and that can be yours to use for as little as $10 a month.

Viralheat allows you to choose key words and phrases and to track key statistics on how frequently they are popping up as fresh mentions in today's real-time publishing environments. You can get summary stats for cross-site mentions or drill down into trends found in specific online services. Viralheat's graphs are highly reminiscent of those found in Google Analytics - a possible exit strategy in the making? - and the interface as a whole has matured into a very well-designed tool that groups information into a very easy-to-digest summary of key mentions of terms. I especially like the three-tab summaries that form the body of Viralheat's content, which aggregate mentions in separate tabs for messages, websites and videos. This really helps you to get a sense of these three very distinct types of influence and to be able to use Viralheat as a high-power aggregation service that can trump many other online aggregation tools for ease of monitoring.

While many of the summary statistics are basically just tallies and percentages, one key tool in Viralheat is a color-coded summary of positive, neutral and negative sentiment discovered for a chosen term. In a screen grab provided by Viralheat this statistic revealed that although the new Bing search engine from Microsoft had strong mentions in social media, videos and Web sites over a recent week, more than 86 percent of these mentions were rated with neutral sentiment - in other words, most people weren't waxing enthusiastic or critical about the new service but were instead just spreading the word about it. This type of insight can help to separate perceived buzz from mere volume quite rapidly.
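The arithmetic behind that kind of sentiment roll-up is simple, which is part of why it now scales to a $10-a-month service. Here is a minimal sketch; the sample data below is invented to mirror the Bing example above and has nothing to do with Viralheat's actual implementation or API.

```python
# Minimal sketch of a sentiment roll-up: tally mentions by sentiment
# label and report each label's percentage share of total mentions.
# Sample data is invented for illustration.
from collections import Counter

mentions = (["neutral"] * 86) + (["positive"] * 9) + (["negative"] * 5)

counts = Counter(mentions)
total = sum(counts.values())
shares = {label: round(100 * n / total, 1) for label, n in counts.items()}
print(shares)  # -> {'neutral': 86.0, 'positive': 9.0, 'negative': 5.0}
```

The hard part of such a service is not this tally but the upstream classification of each mention as positive, neutral or negative; the roll-up itself is what turns raw volume into the buzz-versus-volume distinction described above.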

Taken in sum with the other statistics, Viralheat is offering a strong basic workbench of media analytics that almost anyone can afford to use to understand how their products, services and brands are resonating moment-by-moment through the countless online media outlets that are the front lines of true market influence. It was only a year or so ago that such services were used mostly by major ad agencies, corporations and PR firms to track trends in online media services. Now, thanks to highly scalable cloud computing services, good and essential monitoring can be used by organizations of any size to understand trends over an even wider range of services than those covered by traditional monitoring services.

Most importantly, by covering the waterfront of the most popular message-oriented online social media services, Viralheat can tap into trends in the highly distributed world of social media publishing, in which many trends take form and influence opinions well before they are packaged in traditional media outlets. If you've been thinking that you need to be able to monitor social media more effectively for your organization but you don't know where to begin, your list of excuses has become much shorter.

Wednesday, July 8, 2009

Google Chrome OS: The Post-PC, Post-Phone Era Begins in Earnest

I have been using Google Chrome as my Web browser for several months, now, after giving up on Microsoft Internet Explorer years ago and then suffering from Firefox's add-ons and crashes bogging down what little memory was left on my PC. The Chrome browser fires off separate processes for each window or tab that you open, making it easier to keep it humming along as a stable Web appliance. If one window or tab fouls up, you get a polite "aw, snap" message from Chrome and the rest of your browsing stays intact. That's the kind of simplicity and reliability that's sadly lacking from most other Web browsing software that tries to address too many technology agendas.

Google is now expanding its Chrome brand to include an operating system, announced today on its official blog. ChromeOS will be an open-source operating system built on a Linux kernel, to be released on netbooks and other devices sometime in the next year or so. The goal of ChromeOS is fairly straightforward: turn netbooks into the "instant-on" Web appliances that phones, PCs and even Apple's Mac computers were never designed to be. Originally conceived in the era of minicomputers and early microprocessors, PCs and Macs were always modeled on machines that were never meant to be consumer appliances. My PC today, overburdened with software that I rarely use, takes at least as long to start up with its 1.7 gigahertz processor as my original 66 megahertz home PC did more than sixteen years ago. That was fine when I used my PC for most of my work; today most of my work takes place on the Web.

Google's assets, by contrast, are almost exclusively Web-based - as are the content assets of most individuals and an increasing number of institutions. Just as the Chrome browser strips out most non-essential functions to get people into Web standards-based functionality as cleanly as possible, so will ChromeOS support appliances that have Web access as their primary goal. The Google blog makes clear that desktop functionality in ChromeOS will be kept to a minimum with this in mind: just cut to the browser, thank you very much, you know where I'm going. ChromeOS may overlap somewhat with Google's Android operating system, targeted at mobile phones, but since the uptake of Android in the netbook world has only begun - and since mobile voice communications are migrating increasingly into the Web itself - any conflict between choosing ChromeOS and Android in the netbook market is likely to be minimal. More likely, Android will be to ChromeOS as Windows Mobile is to PC-based versions of Windows, except that ChromeOS will not target enterprise-strength desktops and servers. Why bother, when Google specializes in platform-neutral access to all of the content on those platforms?

With that in mind, some of the hysteria in today's split-second media reactions to this announcement is a little hyperbolic. I doubt that there will be a "nuclear winter for Microsoft" as a result of the ChromeOS announcement. Enterprises will continue to need heavy-duty information appliances to address a wide variety of publishing needs, while at-home gamers and entertainment buffs will continue to want the maximum hardware and software available to maximize their experiences. It's unlikely that ChromeOS will beat any significant paths into these markets any time soon, though its promise of virus-free operation may inspire some crossovers. Instead, Google will more likely use ChromeOS-based appliances to expand the global footprint of people able to access the Web cost-effectively and reliably in as many ways as possible. In other words, the five billion or so people who have yet to access the Web can help Google to redefine the pie from which it draws market share for its content and technology services, just as it redefined the advertising pie with its AdWords contextual search ads and the aggregation pie with its many content services.

With most content already maintained in the cloud of Web storage and services, Google ChromeOS is a reminder that after all these years the fundamental story about what is changing human communications remains the Web itself. The appliances that make Web access possible will become more efficient via ChromeOS, but it's the content and communications they access that will continue to drive the changes in the world prompted by more universal electronic publishing and content consumption. With emerging tools such as Google's Wave messaging environment beginning to redefine how people communicate collaboratively via voice, text and images, it's likely that ChromeOS will be a middle-of-the-road technology strategy that will, in the long run, create an environment in which PCs and mobile phones as we have known them are pushed to the sidelines to serve increasingly legacy-bound markets, while ChromeOS defines the new "just-right" level of technology for most on-the-go and in-lap-at-home content use. Others such as Microsoft and Apple are starting to aim for that "just-right" Web niche as well, of course, so the pie will have more than one slice out of it. So yes, let's pay attention to ChromeOS, recognize its significance to the long-term future of content platforms - and then let's get back to being as serious about the Web as possible.

Friday, July 3, 2009

Independence Day and Social Media - ShoreViews Video

As people in the U.S. get ready for the holiday weekend, I hope that you have a chance to enjoy friends and family and to celebrate the role that content has played in making our world a better place. Below is a video capturing my reflections on the role that social media played in events in our nation more than two hundred years ago that still ring true today. Have a great holiday!