I’ve noted before that I sort of wear two hats at The 451 Group, covering both content management and collaborative technologies. They’re certainly related, and perhaps more so every day, but traditionally they have been rather separate. In any event, I have the benefit of looking at most things through (at least) two sets of lenses, and I’m not so deep in the weeds of one market that I miss implications for the other.
For example, Infovark has an interesting post about emergent systems, and as I was looking specifically at their table, it struck me how much their definition of “explicit” (rules-based, top-down, centralized, push) describes what most WCM vendors are trying to do today with targeted content delivery. But “emergent” technologies are the opposite (outcome-based, bottom-up, decentralized, pull).
In short, it’s the difference between content targeting and user-generated content. There seems to me to be a real gap between those vendors doing the former and those supporting the latter.
We’ve been spending a lot of time with vendors in the customer community realm of late, while still keeping a close eye on the WCM marketplace. The relationship between these two seems quite obvious to me, though we’re only starting to see bits of it in the market. There is some partnering going on, and at least one OEM deal that I know of, though I’ve been asked not to publicize it yet. I suspect we’ll see more of this in the coming months.
I spoke with two web content management (WCM) vendors in the past week that are investing heavily in online marketing. In the WCM realm, this mostly means user-friendly tools that marketers can use to not only create content for a web site but also to test it and target it to site visitors.
A question I’ve been asking lately is, how automated can or should this targeting be? Interwoven, FatWire, Tridion and others have tools that let marketers segment visitors and then build rules around how content should be targeted to those segments. This is a fairly manual process: segments and rules have to be created and managed by hand. That’s workable for sites with a few broad segments and relatively shallow content and product catalogs, but it wouldn’t be manageable with multiple, detailed customer segments and a deep product catalog.
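To make concrete what I mean by manual, here’s a rough sketch (in Python, with made-up segment names and visitor attributes; this isn’t any vendor’s actual tooling) of what rules-based targeting amounts to under the hood:

```python
# Hypothetical, hand-maintained segment rules. Every segment and every rule
# below has to be written and kept up to date by a marketer.
SEGMENT_RULES = [
    # (segment name, predicate on the visitor, content slot to serve)
    ("returning_big_spender", lambda v: v["visits"] > 5 and v["lifetime_spend"] > 500, "loyalty_offer"),
    ("first_time_visitor",    lambda v: v["visits"] == 1,                              "welcome_banner"),
]

DEFAULT_CONTENT = "generic_promo"

def pick_content(visitor: dict) -> str:
    """Return the content slot for the first segment rule the visitor matches."""
    for _segment, matches, content in SEGMENT_RULES:
        if matches(visitor):
            return content
    return DEFAULT_CONTENT

print(pick_content({"visits": 7, "lifetime_spend": 900.0}))  # -> loyalty_offer
```

The trouble is that the rule list grows with every new segment and every new slice of the catalog, which is exactly the scaling wall described above.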
This is one of the reasons Amazon, with arguably the deepest product catalog around, has long applied what we used to call “collaborative filtering” on its site — you know, the “readers who bought X also bought Y.” This approach has its own drawbacks to be sure (on Amazon, I get a strange list of recommendations based on the books I purchase for myself, for my kids or as gifts) but it wouldn’t be feasible for someone at Amazon to manually create cross-sell rules for every item Amazon sells.
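For the curious, the classic item-to-item flavor of collaborative filtering can be sketched in a few lines. This is just co-occurrence counting on toy data, not Amazon’s actual algorithm:

```python
from collections import defaultdict
from itertools import combinations

# Toy purchase histories: each inner list is one customer's basket.
baskets = [
    ["tractor_book", "farm_atlas"],
    ["tractor_book", "farm_atlas", "kids_puzzle"],
    ["kids_puzzle", "picture_book"],
]

# Count how often each pair of items appears in the same basket.
co_counts = defaultdict(lambda: defaultdict(int))
for basket in baskets:
    for a, b in combinations(set(basket), 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def also_bought(item: str, top_n: int = 3) -> list:
    """'Customers who bought X also bought...' ranked by co-purchase count."""
    ranked = sorted(co_counts[item].items(), key=lambda kv: kv[1], reverse=True)
    return [other for other, _count in ranked[:top_n]]

print(also_bought("tractor_book"))  # -> ['farm_atlas', 'kids_puzzle']
```

Notice that the counts know nothing about who is actually behind the account, which is why my kids’ books and my gift purchases end up polluting my recommendations.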
A crew of start-ups like Baynote, Aggregate Knowledge and Loomia offer updated approaches to collaborative filtering that use more inputs (like time on page, search terms, clicks, scroll rate, etc.) than early collaborative filtering tools did. These vendors take different approaches (behavioral vs. contextual, for example), but they’re similar in that they make recommendations automatically.
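A rough illustration of the “more inputs” idea, with entirely made-up signals and weights (no claim that this is how Baynote, Aggregate Knowledge or Loomia actually score things):

```python
# Hypothetical weights per behavioral signal; real products tune or learn these per site.
SIGNAL_WEIGHTS = {
    "click": 1.0,          # visitor clicked through to the page
    "dwell_seconds": 0.02, # time spent on the page
    "search_match": 2.0,   # page matched the visitor's on-site search terms
    "scroll_depth": 0.5,   # fraction of the page scrolled (0.0 to 1.0)
}

def interest_score(signals: dict) -> float:
    """Fold one visitor's behavioral signals for a page into a single score."""
    return sum(SIGNAL_WEIGHTS[name] * value for name, value in signals.items())

# Score a few candidate pages for one visitor and recommend the strongest.
candidates = {
    "/widgets/deluxe": {"click": 1, "dwell_seconds": 95, "search_match": 1, "scroll_depth": 0.8},
    "/widgets/basic":  {"click": 1, "dwell_seconds": 12, "search_match": 0, "scroll_depth": 0.2},
}
best = max(candidates, key=lambda page: interest_score(candidates[page]))
print(best)  # -> /widgets/deluxe
```

The point is simply that the recommendation falls out of observed behavior rather than out of a rule a marketer wrote, which is what makes it feel like a black box.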
Some WCM vendors note that customers are leery of a “black box” making recommendations with live content on their sites. That isn’t surprising, really. They also note that most customers are only beginning to segment their visitors or to get their feet wet with content testing (like multivariate testing of layouts or content success rates) and aren’t ready for automated recommendations yet. Still, Vignette just signed an OEM agreement with Baynote, so there must be some interest.
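For readers who haven’t run into it, multivariate testing is conceptually simple: serve different combinations of page elements and compare conversion rates. A toy sketch (real tools layer statistics, persistence and reporting on top; the headlines and images here are invented):

```python
from collections import defaultdict
from itertools import product

# A toy 2x2 multivariate test: two headlines crossed with two hero images.
headlines = ["Save 20% today", "Free shipping on everything"]
heroes = ["lifestyle.jpg", "product_closeup.jpg"]
cells = list(product(headlines, heroes))

impressions = defaultdict(int)
conversions = defaultdict(int)

def serve(visitor_id: str) -> tuple:
    """Stick each visitor to one cell (by hashing the id) and count the impression."""
    cell = cells[hash(visitor_id) % len(cells)]
    impressions[cell] += 1
    return cell

def record_conversion(cell: tuple) -> None:
    conversions[cell] += 1

def report() -> None:
    """Compare conversion rates per cell; real tools add significance testing."""
    for cell in cells:
        rate = conversions[cell] / impressions[cell] if impressions[cell] else 0.0
        print(cell, f"{rate:.1%}")
```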
So which is the right approach? Ultimately, both rules-based and automated targeting are likely to have roles to play. As emerging online marketing suites that include WCM, web analytics, testing and targeting tools come together, they’ll let marketers choose the right approach for different types of content and/or different customer segments. But we aren’t there yet.