Wednesday, November 11, 2009

IBM Redbooks rock

Many of my former IBM colleagues/friends like to bring up the fact that I'm rarely nice to IBM on this blog. They're right, but it's because I hold IBM up to high standards. When they don't deliver, I make it a point to voice my opinion. For example, I have no idea what the brains trust in the IBM Tivoli Security group over in Austin have been doing for the past few years, but their products have stagnated. And the new ones they bring out aren't particularly exciting. It's disappointing to say this, but IBM are losing ground (falling behind if you listen to Forrester) in the Identity & Access Management stakes to Oracle and CA. Wake up, product management team!

One of the BEST things out of IBM, however, is their Redbooks. These are essentially step-by-step guides for implementing IBM technology, with lots of background information thrown in and fictitious "real-world" scenarios. Not many people realise IBM publishes these books, but they're great if you're doing anything with IBM software or hardware (and in many cases, even when you're not).

I'm bringing this up because I noticed that a few days ago they released the latest Identity Management Design Guide, which covers Tivoli Identity Manager 5.1 (the one I co-authored 2 iterations back was for version 4.5.1). It's a little thicker (i.e. it has more pages) than the version of the book I was involved with, but in looking through this latest version one thing stood out: I still recognise most of the content, especially the parts I wrote. It's nice to see the content being leveraged in subsequent versions. What that means is that the materials produced with each iteration are solid and practical, which is true for pretty much all the Redbooks that get released.

In short, Redbooks are a great resource for:
1) People who want to learn about an IBM product or a subject area.
2) People who want to implement the relevant IBM product.
3) Competitors who want to take a detailed peek at an IBM product ;-)

I'm sorry IBMers, but I couldn't resist taking a stab at IBM Tivoli Security. The Redbooks, however, remain one of the best things out of IBM. They are infinitely better than those marketing data sheets we've all wasted time reading.

Saturday, November 07, 2009

CA DLP headed in the right direction

When CA acquired Orchestria, I said it was a good move. I even wrote a follow-up post about why Identity & Access Management (IAM) and Data Security/Data Leakage Prevention (DLP) fit so well together. 2 weeks ago, CA sent out a fairly lengthy press release with a list of products they've updated. The 2 products that caught my eye were GRC Manager 2.5 and DLP 12.0. This post covers the DLP product.

I spoke with Gijo Mathew, Vice President of Security Management at CA, about the DLP announcement to get a better understanding of CA's longer-term strategy and to clear up a few things in their press release that confused me. Here are the "new features" for DLP 12.0, which I've lifted from the release:
  • Enhanced Discovery – Provides the ability to scan data locally on endpoints and to scan directly into structured ODBC databases to identify sensitive data.
  • Extended Endpoint Control – Leverages existing data protection policies to control end-user activity such as moving data to writable CDs or DVDs, and taking a screen print of sensitive content.
  • Seamless Archive Integration – Integrates with CA Message Manager, a product in CA’s Information Governance Suite, to help deliver end-to-end message surveillance, reporting, and archiving.
The first thing I should point out is that the ability to scan structured databases is a BIG plus. Many DLP vendors out there do quite a lot with either unstructured data (e.g. files on disk, data in memory) or structured data (e.g. databases), but they don't usually handle both. Orchestria fell into the "unstructured data" bucket. Now under the CA banner, they can finally scan and classify data sitting in databases. Note however, that the ability to scan/identify/classify data and the ability to enforce controls over access to that data are completely separate things. To properly enforce controls over structured data, a product would need to hook into the low-level database security mechanisms. As a result, enforcing access controls on databases based on the content being accessed is difficult, and very few vendors can actually do this at the moment (CA included).
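To make the scanning side of this concrete: CA's actual implementation isn't public, but the core idea of scanning structured data is to pull rows out of the database and run content classifiers over each field. Here's a minimal sketch in Python, using the standard library's SQLite in place of a real ODBC source and a credit-card pattern (with a Luhn checksum to cut down on false positives) as the lone classifier; the table and column names are purely illustrative:

```python
import re
import sqlite3

# Rough pattern for 13-16 digit card-like numbers, allowing spaces/dashes.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(number: str) -> bool:
    """Luhn checksum - the standard sanity check for card-like numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    digits.reverse()
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan_table(conn, table, columns):
    """Return (table, column, rowid) locations holding card-like values."""
    hits = []
    cur = conn.execute(f"SELECT rowid, {', '.join(columns)} FROM {table}")
    for row in cur:
        rowid, *values = row
        for col, value in zip(columns, values):
            for match in CARD_RE.findall(str(value)):
                if luhn_ok(match):
                    hits.append((table, col, rowid))
    return hits
```

A real product would of course do far more (many classifier types, sampling large tables, incremental scans), but the scan/classify step boils down to this shape - and note it only *finds* the sensitive data; it enforces nothing.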

While we're talking about scanning, CA also improved the way they scan for unstructured data. In previous versions, the scanning had to be performed from a central server. This is not ideal in many cases thanks to all the things that get in the way like firewall rules, security restrictions on machines, desktops not necessarily being available when required for scanning (either by being off the network or turned off) and so on. A more robust scanning strategy should support the ability to have the endpoints scan local data when required. It takes the load off the central server and allows for a more complete view of the environment from a data management standpoint. The new version of CA DLP added this capability. The negative however, is the performance hit taken by the endpoint while the scanning is being done (this is not a CA specific drawback - any endpoint scanner is going to impact performance).
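The endpoint-local approach described above is simple in outline: the central server pushes policies down, and each endpoint walks its own disk and reports findings back. A hypothetical sketch (the policy patterns and reporting shape are my own illustration, not CA's):

```python
import os
import re

# Hypothetical policies a central server might push to each endpoint.
POLICIES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "aws_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
}

def scan_endpoint(root):
    """Scan local files under root; return findings to report centrally."""
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    text = fh.read()
            except OSError:
                continue  # unreadable file; skip rather than abort the sweep
            for label, pattern in POLICIES.items():
                if pattern.search(text):
                    findings.append((path, label))
    return sorted(findings)
```

The trade-off mentioned above shows up right here: the `os.walk` and the reads burn the endpoint's own CPU and I/O, which is exactly the performance hit any local scanner imposes.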

The second point about the additional features around endpoint control (specifically the mention of moving data to CDs and DVDs and controlling screen print events) really confused me. The examples given are supported by just about every single endpoint DLP vendor out there, so for a moment I wondered whether Orchestria had been missing these capabilities all along. Thankfully, this was not the case. Gijo mentioned that they merely enhanced the capabilities around the CA DLP endpoint component and that these were some examples they picked out. The point CA were trying to make was that they still do the core DLP things expected of any DLP product worth implementing. Apparently after the previous release of DLP, many assumed they were no longer focusing on the core DLP capabilities and were instead going down the "identity aware DLP" road. This is definitely not the case according to Gijo.

While the points mentioned in the press release are interesting in that they show CA are serious about core DLP capabilities, what impressed me most was the longer term vision CA has for the product. In fact, it is this longer term vision that had some accusing CA of neglecting their core DLP capabilities in the previous release.

CA are fortunate in that the natural evolution of products in the DLP space fits nicely with their need to integrate DLP with their portfolio of products. It makes product management decisions slightly easier for them, instead of having to spend a lot of time trying to balance the need for additional features against being able to sell a cohesive suite of solutions (which is commonly the problem with acquisitions). In other words, adding integration points provides CA DLP with additional capabilities that make sense for most of the other products involved as well. For example:
  • The ability to add context to access control is a very powerful thing. Context is very much about information, with data at its core (although it's not everything, because data alone does not tell us what a user is actually doing). What I'm referring to is commonly labelled as content aware access management. A common use case here typically involves integration of access control decisions by a web access management component (Siteminder in CA's case) with data aware mechanisms provided by a data security solution (CA DLP in CA's case). The web access management product can either make decisions based on static tags on the information/resource being accessed or dynamic analysis made in real time by the data aware component (e.g. this data looks like a bunch of credit card numbers so we should not be giving the user access).
  • The analysis of data usage patterns across different environments allows for additional smarts when trying to manage risk, especially in cases where patterns are outside the norm of a user's peers. The trick here is being able to turn the data gathered into information to feed back to a GRC (Governance, Risk & Compliance) solution or SIEM (Security Information and Event Management) dashboard. Otherwise, you could just point any old reporting engine at the data and achieve the same result (which is far from what one would call proper integration).
  • Access and data governance are typically silos in organisations today. If you're able to tie the two together, the management overhead is reduced significantly. That's why it's a big deal if an organisation is able to get a single view of both from a management standpoint. This is not to say it cannot be done today. The key point I'm making here is that it's just really hard to do. If a vendor makes it that much easier to achieve, it saves time and money.
  • Improving the lifecycle activities around enterprise information and content management by using the data discovery and classification capabilities to provide additional context to the relevant processes.
I'll leave it as an exercise for the reader to figure out which CA product/s to slot into each example. The point is, they have something in their product stack to integrate with DLP in each example. What these illustrate, however, is the direction CA are headed in with regards to their DLP strategy (even though some of it is a little high level).
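The first example (content aware access management) is worth sketching, because the mechanics are easy to miss in the abstract: the access management layer asks a content classifier about the resource before deciding. This is a hypothetical illustration of the pattern, not CA's actual Siteminder/DLP integration; the roles and policy table are invented:

```python
import re

# Rough pattern for 13-16 digit card-like numbers, allowing spaces/dashes.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def classify(content):
    """Dynamic classification: tag content by what it appears to contain."""
    tags = set()
    if CARD_RE.search(content):
        tags.add("payment-data")
    return tags

# Hypothetical policy: which roles may see which content classifications.
POLICY = {
    "payment-data": {"finance"},
}

def authorize(role, content):
    """Allow access unless the content carries a tag the role isn't cleared for."""
    for tag in classify(content):
        if role not in POLICY.get(tag, set()):
            return False
    return True
```

The key design point is that the decision is made at request time against the content itself, rather than against a static label someone attached to the resource - which is exactly the "this data looks like a bunch of credit card numbers" case from the example above.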

Gijo was honest in acknowledging they don't have a lot of the things they want out of the box just yet. At this stage, many of the things I've mentioned (in terms of product strategy) will require a good amount of services work. I'm not going to criticise them for this as they only acquired Orchestria earlier this year and it's unrealistic to expect all the required integration to be built out so quickly, especially with a whole suite of products like CA's. What I do like a lot is where they're going.

CA's strategy is good. They're on a journey and their DLP product is the jewel in their security suite from a competitive standpoint (against the other big IAM vendors). They also stack up well against their competitors in the data security space; in this case the advantage comes in the form of their IAM suite (and to a certain extent, their ever improving GRC prowess), which other data security vendors do not have. Those familiar with the security space might notice I haven't made any mention of the fact that RSA also have both IAM and DLP capabilities. Don't forget, however, that it's a bit of a stretch to call RSA's IAM capabilities a suite (e.g. they don't do provisioning). They also have no real GRC capabilities to speak of (their GRC page is a bit of a joke).

As long as CA don't neglect the core data security capabilities in DLP along the way, they're going to do just fine.