January 27th, 2012 — Archiving, Collaboration, Content management, Data management, eDiscovery, Search, Text analysis
Every New Year affords us the opportunity to dust down our collective crystal balls and predict what we think will be the key trends and technologies dominating our respective coverage areas over the coming 12 months. We at 451 Research have just published our 2012 Preview report; at almost 100 pages it's a monster, but it offers some great insights across twelve technology subsectors, spanning from managed hosting and the future of cloud to the emergence of software-defined networking and solid-state storage, and everything in between. The report is available to both 451 Research clients and non-clients (in return for a few details); access the landing page
here. There’s a press release of highlights
here. Also, mark your diaries for a webinar discussing report highlights on Thursday Feb 9 at noon ET, which will be open for clients and non-clients to attend. Registration details to follow soon…
Here is a selection of key takeaways from the first part of the Information Management preview, which focuses on information governance, e-discovery, search, collaboration and file sharing. (Matt Aslett will shortly be posting highlights of part 2, which focuses more on data management and analytics.)
- One of the most obvious common themes that will continue to influence technology spending decisions in the coming year is the impact of continued explosive data and information growth. This continues to shape new legal frameworks and technology stacks around information governance and e-discovery, as well as to drive a new breed of applications growing up around what we term the ‘Total Data’ landscape.
- Data volumes and distributed data drive the need for more automation, and auto-classification capabilities will continue to mature in the e-discovery, information governance and data protection veins; indeed, we expect to see more intersection between these areas, as we noted in a recent post (see the sketch after this list).
- The maturing of the cloud model – especially as it relates to file sharing and collaboration, but also from a more structured database perspective – will drive new opportunities and challenges for IT professionals in the coming year. Looks like 2012 may be the year of ‘Dropbox for the enterprise.’
- One of the big issues that rose to the fore in 2011, and is bound to get more attention as the New Year proceeds, is the dearth of IT and business skills in some of these areas, without which the industry at large will struggle to harness and truly exploit the attendant opportunities.
- The changes in information management in recent years have encouraged (or forced) collaboration between IT departments, as well as between IT and other functions. Although this highlights that many of the issues here are as much about people and processes as they are about technology, the organizations able to leap ahead in 2012 will be those that can most effectively manage the interaction of all three.
- We also see more movement of underlying information management infrastructures into the applications arena. This is true of search-based applications, as well as in the Web-experience management vein, which moves beyond pure Web content management. And while Microsoft SharePoint continues to gain adoption as a base layer of content-management infrastructure, there is also growth in the ISV community that extends SharePoint into different areas at the application level.
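To make the auto-classification point a little more concrete, here is a deliberately minimal sketch of the kind of rule-driven document tagging involved. All rules, labels and retention periods below are hypothetical illustrations; commercial products layer statistical, linguistic and machine-learning analysis on top of anything this crude.

```python
import re

# Hypothetical policy rules: pattern -> (classification label, retention in years)
RULES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), ("confidential-pii", 7)),      # SSN-like number
    (re.compile(r"\battorney[- ]client\b", re.I), ("legal-privileged", 10)),
    (re.compile(r"\bpurchase order\b", re.I), ("finance-record", 7)),
]
DEFAULT = ("general", 2)

def classify(text: str) -> tuple[str, int]:
    """Return (classification, retention_years) for a document body,
    taking the first rule that matches."""
    for pattern, result in RULES:
        if pattern.search(text):
            return result
    return DEFAULT

print(classify("Invoice attached to purchase order #4471"))  # ('finance-record', 7)
```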
There is a lot more in the report about proposed changes in the e-discovery arena, advances in the cloud, enterprise search, and the impact of mobile devices and bring-your-own-device policies on information management.
January 20th, 2012 — Archiving, eDiscovery, M&A
We commented recently on Symantec's acquisition of cloud archiving specialist LiveOffice. The announcement also afforded Big Yellow an opportunity to unveil what it calls "Intelligent Information Governance," an over-arching theme that provides the context for some of the product-level integrations it has been working on. For example, it just announced improved integration between its Clearwell eDiscovery suite and its on-premise archiving software, Enterprise Vault (stay tuned for more on this following LegalTech later this month).
There's clearly an opportunity to go deeper than product-level 'integration,' however. In a blog post, Symantec VP Brian Dye raised an issue that we have been seeing for a while, especially among some of our larger end-user clients: the fundamental contention that all of us – from individuals to corporations to governments – face around information governance, namely striking the right balance between control of information and freedom of information.
Software has emerged to help us manage this contention, most typically through data loss prevention (DLP) tools – to control what data does and doesn’t leave the organization — and eDiscovery and records management tools, to control what data is retained, and for how long. Brian noted that there is an opportunity to do much more here by linking the two sides of what is in many ways the same coin, for example by sharing the classification schemes used to define and manage critical and confidential information.
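As a thought experiment, the shared-scheme idea can be reduced to a few lines: a single taxonomy that both the DLP engine and the records-management/eDiscovery side consume, so that 'confidential' means the same thing to both teams. Everything below (labels, fields, functions) is a hypothetical sketch, not any vendor's actual design.

```python
# One taxonomy, two consumers: the DLP side reads the egress rule,
# the records-management side reads the retention rule.
SHARED_TAXONOMY = {
    "customer-pii":    {"block_egress": True,  "retention_years": 7},
    "board-minutes":   {"block_egress": True,  "retention_years": 10},
    "marketing-asset": {"block_egress": False, "retention_years": 2},
}

def dlp_allows_egress(label: str) -> bool:
    """DLP side: may a document with this label leave the organization?"""
    return not SHARED_TAXONOMY[label]["block_egress"]

def retention_period(label: str) -> int:
    """Records-management side: how long must this label be retained?"""
    return SHARED_TAXONOMY[label]["retention_years"]

print(dlp_allows_egress("marketing-asset"), retention_period("customer-pii"))  # True 7
```

Change the taxonomy once and both policies move together, which is precisely the appeal of converging the two sides.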
Sharing classification in this way is an idea that we have discussed at length internally, with some of our larger end-user clients, and with a good few security and IM vendors. Notably, many vendors responded by telling us that, though it is a good idea in principle, in reality organizations are too siloed to get value from such capabilities; DLP is owned and operated by the security team, while eDiscovery is managed by legal, records management and technology teams. While some of the end-users we have discussed this with are certainly siloed to a point, they are also working to address the issue by developing a more collaborative approach, establishing cross-functional teams, and so on.
A cynic would point out that some vendor self-interest might be at play here too; why sell one integrated product to a company when you can sell it essentially the same technology twice? But of course, we're not the remotest bit cynical(!). There is also the reality that at most large vendors, product portfolios have been put together at least in part through acquisitions. Security and e-discovery products may be sold separately because they are, in fact, separate products with little to no integration in terms of technology or sales organizations. And vendors may not yet be motivated to do the hard integration work (technical and organizational) if they are not seeing consistent enough demand from consolidated buying teams at large organizations.
Wendy Nather, Research Director of our security practice, notes that such integration is desirable:
– Users don’t WANT to have meta-thoughts about their data; they just want to get their work done, which is why it’s hard to implement a user-driven classification process for DLP or for governance. The alternative is a top-down implementation, and that would work even better with only one ‘top’ — that is, the security and legal teams working from the same integrated page.
However, Wendy also notes that such an approach is itself not without complexity:
– Confidential data can be highly contextual in nature (for example, when data samples get small enough to identify individuals, triggering HIPAA or FERPA); you need advanced analytics on top of your DLP to trigger a re-classification when this happens. Why, you might even call this Data Event Management (DEM).
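As a toy illustration of Wendy's point, assuming a k-anonymity-style policy (the threshold and field names here are hypothetical): when any combination of quasi-identifiers describes too few people, the data has become identifying and should be escalated for re-classification.

```python
from collections import Counter

K_THRESHOLD = 5  # hypothetical policy: groups smaller than 5 may identify individuals

def needs_reclassification(records, quasi_identifiers):
    """True if any combination of quasi-identifiers (e.g. zip + age band)
    appears fewer than K_THRESHOLD times in the data set."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return any(count < K_THRESHOLD for count in groups.values())

sample = [{"zip": "02139", "age_band": "30-39"}] * 3   # only 3 matching records
print(needs_reclassification(sample, ["zip", "age_band"]))  # True -> escalate
```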
It's notable that Symantec is now starting to talk up the notion of a unified, or converged, approach to data classification. Of course, it is one of the better-positioned vendors to take advantage here, given its acquisitions in both DLP (Vontu in 2007) and eDiscovery (Clearwell in 2011), while LiveOffice adds some intriguing options for doing some of this in the cloud (especially if merged with the hosted security offerings Symantec gained with MessageLabs).
Nonetheless, we look forward to hearing more from Symantec — and others — about progress here through 2012. Indeed, if you are attending LegalTech in New York in a couple of weeks, then our eDiscovery analyst David Horrigan would love to hear your thoughts. Additionally, senior security analyst Steve Coplan will be taking a longer look at the convergence of data management and security in his upcoming report on “The Identities of Data.”
In other words, this is a topic that we’re expending a fair amount of energy on ourselves; watch this space!
January 18th, 2012 — Archiving, eDiscovery, M&A
As if to underscore our belief that the cloud is set to play a bigger role in all things Information Management-related in 2012, Symantec announced this week that it had acquired cloud archiving specialist LiveOffice for $115m, its first acquisition in eight months (451 Research clients can read the full deal-analysis report here).
Though the deal was not a huge surprise – some of LiveOffice's executive team (including its CEO and COO) hail from Symantec, which has for the last year been reselling LiveOffice, rebranded as Enterprise Vault.cloud – it is a significant endorsement of the cloud archiving market; a sub-sector that we have been following closely for a couple of years (we published a detailed, long-form report on the market in late 2010), but one that has yet to really come to life.
Symantec, which of course dominates the on-premise email archiving market, notes that about half of all archive deployments now go to the cloud. In this respect, cloud archiving is a market that it simply has to participate in more directly. Accordingly, LiveOffice provides Symantec with a better means of serving the smaller organizations that tend to opt for the cloud model, which requires far fewer skills and resources to set up and manage than on-prem models. Of course, it also means Symantec doesn’t have to be religious about which model it promotes; whether on-prem, cloud or a hybrid of the two, it now caters to all requirements.
Symantec also made an interesting comment that LiveOffice is at the right point in its own development where the application of Symantec’s huge scale can help in growing the business, rather than be a hindrance. This is a refreshingly honest acknowledgement that it hasn’t always got the balance right in the past; buy a company that is too small, and the weight of a giant like Symantec risks starving it of oxygen altogether, rather than fanning the flames that made it successful in the first place.
The question now is whether this move may help spark broader growth of the cloud archiving market. LiveOffice was one of the first cloud providers to archive other data types beyond email, and can now store and index a wide variety of data, including from social media, file servers, SharePoint and even SaaS applications; as more data, workloads and applications move to the cloud, so cloud-based archiving will become more relevant. One big factor in the cloud players’ favor is that email is increasingly going the hosted route, especially for SMEs; if you run corporate email as a service, then you aren’t going to deploy an email archive on-premise.
All in all, we think this is a good move by Symantec, and one that could drive interest in the other cloud-archiving pure plays out there.
January 9th, 2012 — Uncategorized
It’s my pleasure to announce that we recently recruited Martin Schneider as a Research Manager (press release here), based in our San Francisco office. Martin will be focused primarily on the innovation and disruption taking place at ‘the top of the stack’ in the application software space; a part of the industry that is undergoing huge change through the combined impact of cloud, software-as-a-service and social media.
This is actually Martin's second stint with the company, which is part of the reason why we are so excited to have him back in the building. Martin first joined 451 in 2004, where as analyst and then senior analyst he spearheaded our coverage of the CRM software market. Since being tempted away to the vendor side in 2007, Martin has held senior marketing roles at two software startups: first CRM specialist SugarCRM, and then cloud storage and data management firm Basho.
This experience at the sharp end of the startup world, allied with Martin's extensive industry knowledge, contacts, prolific work rate and unbridled enthusiasm, means that we now have a top-class, commercially minded analyst to spearhead coverage and help lead the broader debate in a critical part of the industry.
Equally important is that Martin will be a key link within the 451 Research chain; after all, applications are what enterprise IT really cares about, and what the rest of the stack is optimized for. Therefore, Martin’s role will involve extensive collaboration with multiple 451 Research practices, analysts and research directors, especially around infrastructure and cloud computing, as well as information management.
In other good news, we also announced that Matt Aslett has been promoted to the role of Research Manager. Frequent readers of this blog will already be very familiar with Matt, and this is a well-deserved promotion. Matt is recognized by the industry as being at the forefront of his field. His published analysis is thoroughly informed, insightful and prolific; our clients love him; and he has become a popular public speaker on all things data-related, especially around his concept of "total data."
With these two appointments, 451 Research is starting 2012 with a bang. Welcome Martin, and congratulations Matt.
One other announcement is that Nick Patience recently left the company to pursue an opportunity in the vendor world. As a co-founder, Nick has played an instrumental role in the company’s growth and development over the last 12 years. We’re sorry to see him go, but wish him the best of luck in his new role on the dark side!
October 29th, 2010 — Archiving, Data management, eDiscovery, M&A, Search, Storage
The cloud archiving market will generate around $193m in revenues in 2010, growing at a CAGR of 36% to reach $664m by 2014.
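For those who like to check the arithmetic, the forecast compounds as follows (figures from the report; the 36% is itself a rounded number):

```python
# Sanity-check the forecast: $193m in 2010 compounding for four years to 2014.
base_2010 = 193.0   # revenues, $m
cagr = 0.36         # 36%, as rounded in the report
projection_2014 = base_2010 * (1 + cagr) ** 4
print(round(projection_2014))  # 660, within rounding of the reported $664m
```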
That forecast is a key finding from a new 451 report published this week, which offers an in-depth analysis of the growing opportunity around how the cloud is being used to meet enterprise data-retention requirements.
As well as sizing the market, the 50-page report – Cloud Archiving: A New Model for Enterprise Data Retention – details market evolution, adoption drivers and benefits, plus potential drawbacks and risks.
These issues are examined in more detail via five case studies offering the real-world experiences of organizations that have embraced the cloud for archiving purposes. The report also offers a comprehensive overview of the key players, with detailed profiles of cloud archive service providers, discussion of the related enabling technologies that will act as a catalyst for adoption, and a look at expected future market developments.
Profiled suppliers include:
- Autonomy
- Dell
- Global Relay
- Google
- i365
- Iron Mountain
- LiveOffice
- Microsoft
- Mimecast
- Nirvanix
- Proofpoint
- SMARSH
- Sonian
- Zetta
Why a dedicated report on archiving in the cloud, you may ask? It’s a fair question, and one that we encountered internally, since archiving aging data is hardly the most dynamic-sounding application for the cloud.
However, we believe cloud archiving is an important market for a couple of reasons. First, archiving is a relatively low-risk way of leveraging cloud economics for data storage and retention, and is less affected by the performance/latency limitations that have stymied enterprise adoption of other cloud-storage applications, such as online backup. For this reason, the market is already big enough in revenue terms to sustain a good number of suppliers; a broad spectrum that spans from Internet/IT giants to tiny, VC-backed startups. It is also set to experience continued healthy growth in the coming years as adoption extends from niche, highly regulated markets (such as financial services) to more mainstream organizations. This will pull additional suppliers – including some large players – into the market through a combination of organic development and acquisition.
Second, archiving is establishing itself as a crucial ‘gateway’ application for the cloud that could encourage organizations to embrace the cloud for other IT processes. Though it is still clearly early days, innovative suppliers are looking at ways in which data stored in an archive can be leveraged in other valuable ways.
All of these issues, and more, are examined in much more detail in the report, which is available to CloudScape subscribers here and Information Management subscribers here. An executive summary and table of contents (PDF) can be found here.
Finally, the report should act as an excellent primer for those interested in knowing more about how the cloud can be leveraged to help support ediscovery processes; this will be covered in much more detail in another report to be published soon by Katey Wood.
September 2nd, 2010 — Storage
I had the opportunity to meet up with David Scott, CEO of 3PAR, the current belle of the ball in storage as the bidding war between HP and Dell continues to intensify (read our analysis of the deals for free by clicking here). Though discussion of any details concerning the acquisition process was strictly off limits, Scott provided some interesting color on why he believes the battle for 3PAR is taking his company’s valuation to unprecedented levels.
Actually, our conversation was a continuation of a discussion that we began over dinner at 3PAR’s analyst event in California a few weeks ago. During that discussion I asked Scott why 3PAR hadn’t yet been acquired; his response pretty much described the events that are now playing out. Scott believed there was in effect a Mexican stand-off taking place; multiple vendors would potentially be very interested in making a bid for 3PAR, but a fear of being outbid – and losing out – was holding them back. Thus, for the time being it was generally in all potential suitors’ best interests for 3PAR to remain independent.
Why Dell decided to break ranks and shoot first is not entirely certain at this point – though HP losing its CEO may have been a trigger – and was certainly not on the menu for discussion with Scott. But the CEO was more forthcoming on the reasons for this fear of being outbid, which are rooted in 3PAR's scarcity; ie, the belief that there is no viable alternative acquisition target to 3PAR. The bidding war that has played out since Dell made its first offer would appear to support this. But why? Scarcity seems like a crazy assumption to make in an industry that is constantly spitting out new startups.
Scott's reasoning for this scarcity has both demand-side and supply-side dimensions, both of which have taken a couple of turns of the IT cycle to come to fruition. On the supply side: cast your mind back a decade, and the IT world was alive with the prospect that 'xSPs' (especially storage service providers and application service providers) would play a transformative role in delivering IT as a service; Cloud 1.0, if you like. What these xSPs required was a way of building their services on a scalable, secure and shared technology infrastructure. Unfortunately for SSPs such as Storage Networks, the infrastructure components to build such a stack were not available, and the entire model collapsed under the weight of having to build dedicated systems for each customer.
But the promise of the xSP model was also the catalyst for innovation at all levels of the IT stack. There was nothing inherently wrong with the model of IT-as-a-service – it was, and remains, highly attractive. What was needed was a new underlying architecture that could provide the required scale and flexibility as cost effectively as possible, such as blade servers, virtualization software and ‘utility’ storage.
Thus, as interest in the xSP model began to build, VC money started to flow into storage startups developing 'carrier-grade' platforms; in particular Cereva Networks, YottaYotta, Zambeel and 3PAR. Only one of those companies managed to make a go of it; the rest succumbed to the same burst bubble that did for the xSPs. Cereva (which had raised almost $140m in VC funding) collapsed in 2002, Zambeel (which raised around $66m) closed its doors in 2003, while the assets of YottaYotta (which took in around $100m) were eventually acquired by EMC.
Scott says that, as the only remaining player in this new generation of high-end storage platforms, 3PAR was in a unique position. Perhaps even more crucially, these failures meant VCs were now loath to invest in high-end storage startups; even if the next "3PAR killer" had come along, it would have struggled for funding. Instead, VCs turned their attention to startups targeting the mid-range storage market – LeftHand Networks, EqualLogic, Compellent, Pillar – which was growing much more quickly than the now-slowing high-end space.
Scott admits 3PAR came under pressure to target the mid-range space more aggressively (and it did release smaller versions of its InServ arrays), but the company’s core efforts remained on the high-end, with a continuing focus on direct-, rather than channel-based, sales. Scott and his team remained as convinced as ever that ‘utility’ computing was real, and would eventually pay dividends via 3PAR’s scalable storage platform.
In particular, it found traction with the next generation of service providers – such as managed hosting providers and telcos – that, as subscribers to the cloud model attest, will collectively host the vast majority of enterprise IT workloads in the future. Indeed – and this is where the demand-side argument comes in – the post-recession reality for organizations of all types and sizes, from financial services giants to local government offices, is that they are looking for more cost-effective ways of running their IT processes.
These service providers differentiate themselves on quality of service and cost, and the only way of achieving both – according to Scott – is through best-of-breed IT infrastructure. Scott and co have made much of the fact that seven of the ten largest service providers by revenue are 3PAR customers, and we're sure this point is not lost on HP, Dell or any other would-be acquirer.
Of course, with hindsight it's easy to make the facts fit a story, but we'd note that 3PAR's own strategy and messaging have scarcely changed since day one. 3PAR has always targeted 'utility' computing, and has stuck with the term as the rest of the industry dispensed with what to them was just the latest buzzword (for proof, see the first research report (451 clients only) we wrote on 3PAR, back in 2002). Indeed, for 3PAR and Scott, delivering IT as a utility is an integral part of the proposition; it gets to the core of why the company believes it is different, and why (at least) two giants of the industry are prepared to pay well over the odds to own it.
May 21st, 2010 — Storage
We recently attended EMC's annual user conflab – EMC World – in Boston. The 451 Group was there in force, with Kathleen Reidy and Katey Wood representing our Information Management agenda, and Henry Baltazar and myself on the storage side. Yes, it's taken me longer than I thought to put some thoughts together – which I am attributing to the fact that I have been involved in the Uptime Institute's Symposium in New York this week; an excuse that I am sticking to!
For our take on some of the specific product announcements that EMC made at the show, I would refer you to the reports we have already published (on V-Plex, Atmos, SourceOne, mid-range storage, and Backup and Recovery Systems). But aside from these, I was struck by a few broader themes at EMC that I think are worth commenting on further.
First, the unavoidable – and even overwhelming – high-level message at EMC World revolved around the 'journey to the private cloud' – in other words, how EMC is claiming to help customers move from where they are now to a future where their IT is more efficient, flexible and responsive to the business. Whether or not you believe the 'private cloud' message is the right one – and I talked with as many 'believers' as I did 'skeptics' – there's no doubt that EMC has the proverbial ball and is running with it. I can't think of many other single-vendor conferences so fully committed to the cloud, and given EMC's established position in the enterprise datacenter and a portfolio that ranges across virtualization, security and information management, you can understand why it has cloud religion.
But there undoubtedly is risk associated with such a committed position; I don’t believe ‘cloud’ will necessarily go the way of ‘ILM,’ for example, but EMC needs to start delivering tangible evidence that it really is helping customers achieve results in new and innovative ways.
Another issue EMC has to be careful about is its characterization of 'virtualization' versus 'verticalization.' This is designed to position EMC's 'virtualization' approach as a more flexible and dynamic way of deploying a range of IT services and apps across best-of-breed 'pools' of infrastructure, in contrast to the vertical stacks being espoused by Oracle in particular.
Though I believe that a fascinating – even ideological – battle is shaping up here, it's not quite so clear-cut as EMC would have you believe. What is a Vblock if not a vertically integrated and highly optimized storage, server, network and virtualization stack? And doesn't the new Vblock announcement with SAP offer an alternative that is in many ways comparable with the Oracle 'stack' (especially if you throw in Sybase as well)? I get the difference between an Oracle-only stack and a more partner-driven alternative, but I think the characterization of virtualization as 'good' and verticalization as 'bad' is overly simplistic; the reality is much more nuanced, and EMC itself is embracing elements of both.
Speaking of journeys, it’s also clear to me that EMC is on a journey of its own, both in terms of the products it offers (and the way it is building them), and in terms of how it positions itself. EMC has always been a technology company that lets its products do the talking; but in an era where larger enterprises are looking to do business with fewer strategic partners, this isn’t always enough. Hence, the ‘journey to the private cloud’ is designed to help EMC position itself as a more strategic partner for its customers, while efforts such as the VCE (VMware, Cisco and EMC) coalition bring in the other infrastructure elements that EMC itself doesn’t offer. At the conference itself, much of the messaging was focused on how EMC can help deliver value to customers, and not just on the products themselves.
This approach is a rather radical change for EMC. Though it remains at its core a conservative organization, I think this more 'holistic' approach is evidence that two recent senior management additions are starting to make their presence felt.
The first hire was that of COO Pat Gelsinger, an ex-Intel exec who has been brought in to assemble a plan to execute on the private cloud strategy. As well as a very strong technical pedigree, Gelsinger's strength is the combination of an ability to conceive and articulate the big picture and an understanding of the tactical steps required to realize it, including product development, customer satisfaction and M&A. It seems to me that Gelsinger is already immensely respected within EMC, and is already regarded by some as CEO-in-waiting; a transition that would be a shoo-in should this strategy pay off.
The other key addition is that of ex-Veritas and Symantec CMO Jeremy Burton as EMC’s first chief marketing officer. To me, this appointment underscores EMC’s need to market itself both more aggressively, as well as differently, in order to maintain and grow its position in the market. Though Burton has only been in the job for a few weeks, we got a sense at EMC World of how he may reshape EMC’s public image; a more light-hearted approach to keynotes (some of which worked better than others, but you have to start somewhere!) bore Burton’s hallmarks, for example.
But if Burton came to EMC for a challenge, I think he has one; EMC's reputation and brand in the large datacenter are solid, but it has work to do to build its image in the lower reaches of the market, an area that CEO Joe Tucci has highlighted as a major growth opportunity.
Although this is as much a product challenge as anything else, EMC must also carefully consider how it brands itself to this audience. Will an existing EMC brand – CLARiiON, Iomega or even Mozy – appeal to a smaller storage buyer, or does it come up with something entirely new? Given its disparate product set here, could an acquisition of an established smaller-end player provide it with an instant presence?
Then there's the issue of direct marketing; today, EMC spends a fraction of what its rivals spend on advertising in the trade and business press. Given Burton's background at Oracle and Symantec, plus the growing imperative for IT companies to appeal to the C-suite to reinforce their strategic relevance, could EMC soon be featuring on the back page of the Economist?
March 19th, 2010 — Storage
I’m going to be presenting the introductory session at a BrightTalk virtual conference on March 25 on the role and impact of the virtual server revolution on the storage infrastructure. Although it’s been evident for some time that the emergence of server virtualization has had — and continues to have — a meaningful impact on the storage world, the sheer pace of change here makes this a worthwhile topic to revisit. As the first presenter of the event — the conference runs all day — it’s my job to set the scene; as well as introducing the topic within the context of the challenges that IT and storage managers face, I’ll outline a few issues that will hopefully serve as discussion points throughout the day.
Deciding which issues to focus on is actually a lot harder than it sounds – I only have 45 minutes – because, when you start digging into it, the impact of virtualization on storage is profound on just about every level: performance, capacity (and more importantly, capacity utilization), data protection and reliability, and management.
I’ll aim to touch on as many of these points as time allows, as well as provide some thoughts on the questions that IT and storage managers should be asking when considering how to improve their storage infrastructure to get the most out of an increasingly virtualized datacenter.
The idea is to make this a thought-provoking and interactive session. Register for the live presentation here: http://www.brighttalk.com/webcast/6907. After registering you will receive a confirmation email as well as a 24-hour reminder email. As a live attendee you will be able to interact with me by posing questions which I will be able to answer on air. If you are unable to watch live, the presentation will remain available via the link above for on-demand participation.
April 1st, 2008 — Archiving, Content management, Storage
When Nick first unveiled this blog last month he rightly noted ‘storage’ as one of the many categories that falls into a capacious bucket we term ‘information management.’ With this in mind he reminded me that it would be appropriate for the 451 Group’s storage research team to contribute to the debate, so here it is!
For the uninitiated, storage can appear to be either a bit of a black hole, or just a lot of spinning rust, so I’m not going to start with a storage 101 (although if you have a 451 password you can peruse our recent research here). Suffice to say that storage is just one element of the information management infrastructure, but its role is certainly evolving.
Storage systems and associated software traditionally have provided applications and users with the data they need, when they need it, along with the required levels of protection. Clearly, storage has had to become smarter (not to mention cheaper) to deal with issues like data growth; technologies such as data deduplication help firms grapple with the “too much” part of information management. But up until now the lines of demarcation between “storage” (and data management) and “information” management have been fairly clear. Even though larger “portfolio” vendors such as EMC and IBM have feet in both camps, the reality is that such products and services are organized, managed and sold separately.
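As an aside, the content-addressed principle behind the deduplication technologies mentioned above is simple enough to sketch in a few lines. This is a toy illustration with fixed-size chunks and an in-memory index; production systems add variable-length chunking, compression and far more scalable indexing.

```python
import hashlib

class DedupStore:
    def __init__(self, chunk_size: int = 4096):
        self.chunk_size = chunk_size
        self.chunks: dict[str, bytes] = {}   # fingerprint -> chunk data

    def write(self, data: bytes) -> list[str]:
        """Store data, returning the list of chunk fingerprints (the 'recipe')."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            fp = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(fp, chunk)   # identical chunks stored only once
            recipe.append(fp)
        return recipe

    def read(self, recipe: list[str]) -> bytes:
        return b"".join(self.chunks[fp] for fp in recipe)

store = DedupStore()
recipe = store.write(b"A" * 8192)          # two identical 4KB chunks...
print(len(store.chunks))                   # ...occupy just 1 unique slot
assert store.read(recipe) == b"A" * 8192   # and the data reads back intact
```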
Clear demarcation or not, there's no doubt these worlds are coming together. The issues we as analysts are grappling with relate to where and why this is taking place, how it manifests itself, the role of technology, and the impact of all this on vendor, investor and end-user strategies. At the very least there is demand for technologies that help organizations bridge the gap – and the juxtaposition – between the fairly closeted, back-end storage "silo" and the more, shall we say, liberated front-end interface where information meets its consumers.
Here, a number of competing forces are challenging, even forcing, organizations to become smarter about understanding what "information" they have in their storage infrastructure: data retention vs data disposition, regulated vs unregulated data, and public vs private data being just three. Armed with such intelligence, firms can, in theory, make better decisions about how (and for how long) data is stored, protected, retained and made available to support changing business requirements.
"Hang on a minute," I hear you cry. "Isn't this what Information Lifecycle Management (ILM) was supposed to be about?" Well, yes, I'm afraid it was. And one thing that covering the storage industry for almost a decade has taught me is that it moves at a glacial pace. In the case of ILM, the glacier has probably lapped it by now. The hows and whys of ILM's failure to capture the imagination of the industry are probably best left for another day, but I believe that at least one aim of ILM – helping organizations better understand their data so it can better support the business – still makes perfect sense.
What we are now seeing is the emergence of some real business drivers that are compelling a variety of stakeholders – from CIOs to General Counsel — to take an active interest in better understanding their data. This, in turn, is driving industry consolidation as larger vendors in particular move to fill out their product portfolios; the latest example of this is the news of HP’s acquisition of Australia-based records management specialist Tower Software. Over the next few weeks I’ll be exploring in more detail three areas where we think this storage-information gap is being bridged; in eDiscovery, archiving and security. Stay tuned for our deeper thoughts and perspectives in this fast-moving space.