Saturday, October 29, 2011

Is Logical/Physical Convergence the Next Big Thing in IAM?

Traditional identity and access management describes the processes, systems and policies used to identify individuals to information systems, control what privileges they hold, and monitor what they do with those privileges. By now, this is all well understood.

But why should IAM be restricted only to systems? For most large organizations, the need to control physical access to secure facilities and locations is every bit as relevant to a holistic security strategy as controlling access to applications and data. Corporations spend millions of dollars on physical security systems such as CCTV, electronic barriers and identification cards, designed to prevent unauthorized personnel from accessing restricted areas. Typically, these systems are centrally managed by a Corporate Security department, so that individuals are only granted the physical access required for their jobs, and any attempted security breaches can be immediately detected.

You see where I'm going with this, right? Conceptually, physical security represents the most basic IAM use case. One could almost think of buildings as applications and areas within those buildings as fine-grained entitlements; the paradigm for physical security is identical to that of I.T. security. Once you make that connection, there is no reason why physical security could not be assigned using RBAC. Indeed, most commercial physical security systems already provide for role-based access.
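To make the analogy concrete, here is a minimal sketch of a single RBAC policy check covering both logical entitlements and physical areas. All the role, user and resource names are invented for illustration; they are not drawn from any particular product.

```python
# Hypothetical sketch: buildings/areas treated as resources in the same
# RBAC model already used for applications and entitlements.

ROLE_GRANTS = {
    "claims_adjuster": {
        ("app", "claims_portal", "submit"),   # logical entitlement
        ("area", "hq_building", "floor_3"),   # physical access
    },
    "datacenter_tech": {
        ("app", "ticketing", "update"),
        ("area", "datacenter", "server_room_a"),
    },
}

USER_ROLES = {
    "alice": {"claims_adjuster"},
    "bob": {"datacenter_tech"},
}

def has_access(user, resource):
    """One policy check serves both badge readers and application logins."""
    return any(resource in ROLE_GRANTS[role] for role in USER_ROLES.get(user, ()))

print(has_access("alice", ("area", "hq_building", "floor_3")))      # True
print(has_access("alice", ("area", "datacenter", "server_room_a"))) # False
```

The point of the sketch is that once a badge reader and a login page consult the same role model, offboarding or a role change revokes both kinds of access in one operation.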

Yet, for some inexplicable reason, physical and logical security are still generally viewed as two separate disciplines, which makes no sense to me. I suspect this is because in most organizations, I.T. Security and Corporate Security are distinct organizations, and there is often very little cross-pollination between the two. IAM vendors, for their part, clearly haven't felt any pressing need to address this gap, and still focus almost exclusively on managing access to I.T. systems.

Some of the advantages to logical/physical convergence seem glaringly obvious:
  • Reduced TCO achieved by centralized management of all access policies, instead of having to maintain separate systems and processes for logical and physical access.
  • Improved compliance with regulatory mandates and federal standards such as GLBA, SOX, HSPD-12 and FIPS 201.
  • Streamlined offboarding, which is particularly important when dealing with a sensitive termination scenario.
  • Smartcards can serve a dual purpose of granting physical access to corporate facilities and providing a second factor of authentication to sensitive systems and data.
Of course, a converged IAM offering would need to be sold in a different manner from traditional IAM solutions. In many cases, IAM products are pitched at the Director/VP level of an I.T. organization, and sometimes at an even lower level than that, particularly when the solution is being acquired to address an urgent tactical need. But only a CIO, CEO, CTO or CCO usually has the appropriate authority over both I.T. and Corporate Security to appreciate the value of a converged offering, and has the political muscle to mandate its implementation. This is yet another reason, as I've argued before, why IAM needs to be positioned more as a corporate governance asset than a back-office I.T. function.

From a technical standpoint, supporting logical/physical convergence should not require a significant re-engineering effort for most IAM vendors. So I find myself asking, why hasn't this happened so far, and who will be the first to address the gap?

Friday, October 28, 2011

Is SPML Really Dead?

Over the past year or two, there has been an intense debate in the IAM community about the future viability of SPML. This began with a blog post by Burton Group's Mark Diodati in February 2010, in which he offered a bearish perspective on the poorly adopted provisioning standard.

I agree with many of the points Mark raised. Yes, SPML is too complex in its current incarnation. In trying to offer a broad range of functionality and account for every conceivable provisioning scenario, it often ends up accomplishing quite the opposite. Accordingly, SPML implementations tend to be extremely product specific, which defeats the entire purpose of a common standard in the first place. He is also correct that SPML implementations often cause performance issues due to the burden they place on connectors, although I would argue that this is less attributable to the standard itself than to how SPML has been implemented by provisioning vendors. And I completely agree with his assertion that enhancing the standard with a common inetOrgPerson-like user schema such as "SPMLPerson" would not only promote greater convergence with SAML, but would lead to increased adoption in the enterprise.

With that said, SPML works extremely well when implementations avoid connector-heavy and product-specific customizations (I'll come back to that shortly). In my opinion, the lack of SPML adoption isn't just because the standard itself is deficient, but because nobody has quite figured out the best way to implement SPML interfaces.

The most common use case I've encountered for SPML is when a customer wants to integrate their own proprietary UI with a commercial provisioning system. For instance, one of my customers has a sophisticated, AJAX-enabled access request portal built in .NET. It provides all the workflow functionality that one would expect to find in a commercial provisioning system, but was designed simply to manage requests and approvals rather than to automate provisioning. The customer had acquired Sun Identity Manager for provisioning automation, but didn't want to replace their elaborate, business-friendly .NET interface with SIM's antiquated UI (and who can blame them?). In this particular instance, SPML was the obvious solution. The .NET application was modified to simply fire off SPML requests to SIM on the backend, thus automating provisioning actions with no visible impact to end users.

In scenarios such as this, where a homegrown access request portal needs to integrate with a provisioning system, SPML works like a charm. But let's face it, this scenario doesn't require SPML specifically. In the use case I just described, Sun Identity Manager could have exposed any old SOAP or REST interface, and the integration would have been exactly the same, provided that the interface exposed the requisite methods.
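For illustration, here is roughly the kind of message such a portal would fire at the backend: a minimal SPML 2.0-style addRequest. The targetID and attribute names below are hypothetical; in practice, as discussed, real payloads are heavily product-specific.

```python
import xml.etree.ElementTree as ET

# A minimal SPML 2.0-style addRequest, roughly what a homegrown portal
# might POST to a provisioning engine's endpoint. The targetID and the
# attribute names are placeholders, not taken from any real deployment.
SPML_NS = "urn:oasis:names:tc:SPML:2:0"

ADD_REQUEST = f"""<addRequest xmlns="{SPML_NS}" requestID="req-001" targetID="sun-idm">
  <data>
    <attr name="accountId"><value>jdoe</value></attr>
    <attr name="firstname"><value>John</value></attr>
    <attr name="lastname"><value>Doe</value></attr>
  </data>
</addRequest>"""

root = ET.fromstring(ADD_REQUEST)   # sanity-check well-formedness
print(root.get("requestID"))        # req-001
```

Even in this trimmed-down form, you can see the XML overhead that a plain REST interface with a JSON body would avoid.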

The true value of a standard such as SPML is realized when providing interoperability with enterprise or cloud applications, and this is where SPML has unequivocally failed to deliver.

Which brings us back to the problem with SPML implementations. Because the standard is so broad and, unlike SAML, fails to define even a minimal schema for user profiles (much less offer a reference implementation), identity vendors have tended to implement variations of SPML that are product specific and are tightly coupled with the connector layer and underlying data stores. The result is that most SPML implementations are ugly and XML-heavy, which in turn can lead to performance issues when performing large numbers of provisioning transactions.

I have yet to see a SaaS application that supports inbound SPML messages in a way that could be called "standard". Likewise, I have yet to see a user provisioning solution that supports SPML in a non-proprietary fashion. As a result, provisioning solutions for cloud applications still depend heavily on individual connectors that all implement different product APIs, which is to say that SaaS applications are currently treated no differently from any other kind of application.

So the question remains, is SPML really dead? Well, in its current incarnation, it's hard to argue that it was ever really alive to begin with. SPML 2.0 is now five years old, and the OASIS Provisioning Services Technical Committee hasn't convened since 2008. But the critical need for interoperable provisioning is now greater than ever, particularly considering the explosive adoption of cloud applications in recent years. The fact that most identity management RFPs include a requirement for SPML confirms that standards-based integration remains a critical priority for organizations. Even as IAM maturation drives us away from traditional user-centric IAM towards more business-centric models, identity management projects still spend far too much time focusing on provisioning, and specifically on the nuances of proprietary connector architectures; this is another argument for a widely adopted provisioning standard that abstracts the complexity of underlying systems and data stores.

If SPML is to survive, then it will have to be reincarnated with no regard for backwards compatibility. For that to happen, a major player in the identity space (I'm talking about the likes of Oracle, CA or IBM here) needs to step up, take leadership and create some momentum for a new standard.

In the meantime, there are several proposals out there to compensate for the lack of momentum surrounding SPML and the critical needs that it was intended to address. Jim McGovern, never a shrinking violet, has a wish list for what he would expect to see from SPML 3.0, if it ever happens. In 2009, the visionary Nishant Kaushik proposed using OAuth to support federated provisioning needs. The Simple Cloud Identity Management (SCIM) initiative (which enjoys participation from cloud providers such as Google and Salesforce, as well as identity vendors such as UnboundID and SailPoint) appears to have embraced this concept and taken it to another level. For my money, SCIM is the most promising candidate for a new provisioning standard. Seemingly inspired by the LDAP standard, SCIM also leverages SAML, OAuth 2.0 bearer tokens, OpenID and JWT, while favoring a REST API over less efficient SOAP.
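To show why the REST-plus-JSON approach is so much lighter than SPML's envelopes, here is a sketch of a SCIM 1.0-era user payload being prepared for a provisioning call. The endpoint URL and bearer token are placeholders of my own, not real SCIM server details.

```python
import json

# Sketch of a SCIM 1.0-style user record. A provisioning client would
# POST this JSON to something like https://idp.example.com/v1/Users,
# authenticating with an OAuth 2.0 bearer token. Endpoint and token
# below are hypothetical placeholders.
payload = {
    "schemas": ["urn:scim:schemas:core:1.0"],
    "userName": "jdoe",
    "name": {"givenName": "John", "familyName": "Doe"},
    "emails": [{"value": "jdoe@example.com", "primary": True}],
}
headers = {
    "Authorization": "Bearer <access-token>",
    "Content-Type": "application/json",
}
body = json.dumps(payload)
# An HTTP client would now POST `body` with `headers` to the /Users endpoint.
```

The schema URN plays the role that a common "SPMLPerson" schema would have played for SPML: every compliant endpoint agrees in advance on what a user record looks like.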

In summary, I tend to agree with Mark Diodati that SPML in its current form is obsolete, but the need for an interoperability standard is now greater than ever. Two lingering questions remain. Will SCIM be the standard we've been waiting for, and what is the value proposition that will compel vendors to embrace it?

Friday, October 7, 2011

A Brief Recap on OpenWorld 2011

Well, my intentions were good, at least. I had intended to blog throughout the week from Oracle OpenWorld, but ended up being so busy (this is the first year that we have had a booth at OOW) that I never had time to post anything. The event itself was even bigger than last year, with more than 45,000 attendees and 4,500 Oracle partners.

Although I didn't get to attend any sessions due to other commitments, I'm planning to download and watch the sessions of interest later. In the meantime, here are a few things that we learned this week:
  • Oracle seems committed to nurturing and expanding a flourishing partner network, and appears to have no interest in providing services.
  • Larry Ellison's keynote received, let's just say, a lukewarm reception. The spat between Larry and Salesforce CEO Marc Benioff cast something of a dark shadow over proceedings, especially when the former referred to Salesforce as a "roach motel". Of course, all of this pointless bickering was put into stark perspective when it peaked on the same day that Steve Jobs passed away.
  • The Public Cloud is core to Oracle's vision of providing subscription-based access to business applications, including the Fusion middleware stack.
  • Oracle's first partner preview of Solaris 11 proves that my favorite *nix O/S is still alive and kicking, nearly two years after the Sun acquisition (yay!). Solaris 11 is fully virtualized and contains an impressive range of performance and functionality enhancements.
  • The Oracle Big Data Appliance (BDA) is a BFD!
  • From an IAM perspective, not much to report, although admittedly I haven't watched the IAM sessions yet. There doesn't appear to have been much movement in this space since last year. Frankly, given Oracle's size and ambition, IAM appears to be losing some relevance for them. I hope I'm wrong about that.
There is no ambiguity about Larry's vision for Oracle. It can be summed up in three simple phrases: Big Data, Cloud and Parallel Processing.

Conference weeks always leave me feeling drained and needing a few days to recover. So I'm spending the weekend in Venice Beach, where I will NOT be thinking about Exadata or anything else to do with technology for the next couple of days.

Have a great weekend everyone!

Monday, October 3, 2011

New Qubera website is now online!!!

It took several months, but we finally got there just in time for OpenWorld. Check out Qubera's new website at

Day One at OpenWorld

Not much to report so far from OpenWorld. Unfortunately, I missed Larry Ellison's keynote last night as I was exhausted after fourteen hours of traveling, helping to set up the Qubera booth and trekking up a hill with my luggage to get to our rented house in the Castro (long story). From what I understand, the keynote was somewhat underwhelming in any case. But I woke up this morning refreshed and raring to go. Planning to attend three sessions today, on Oracle Identity Administration, Trends in Identity Management and Identity Administration Management in the Cloud.

Will blog more later in the day.

Sunday, October 2, 2011

Visit us at Oracle OpenWorld

In a few hours, I'm heading out to San Francisco for the Oracle OpenWorld 2011 conference, arguably one of the biggest events in the I.T. calendar. Qubera Solutions will be at booth 2542 in Moscone South. All of us from the Qubera leadership team will be in town. I'm planning to be at the booth from 3-5PM on Monday, and 2-4PM on Wednesday. So if you're attending OpenWorld, please feel free to drop by and say hello. I'll be blogging from the event throughout the week.

Saturday, October 1, 2011

My Top 5 Likes/Dislikes about being an IAM Professional

Recently, a fellow I.T. industry friend of mine was asking me why I chose IAM as a specialty. It got me thinking about how I got into this field, and why I decided to make a career out of it. I'd be lying if I said it was a conscious decision, although at this point in my life, it's difficult to imagine another I.T. discipline that I would find quite as rewarding, or as challenging.

The "getting in" part is easy to explain. IAM found me back in the late 90s when I was working as an enterprise architect for Travelers Insurance. Having worked there for less than a month, I was given an assignment to create a custom Web SSO and directory framework for all of the legacy host applications that were being web-enabled at the time. Back then, there were no serious vendor offerings for Web SSO (at least none that weren't extortionately priced). Even if there were any vendor offerings worthy of consideration, we had a "build, don't buy" I.T. culture in those days. Besides, there were only about a dozen applications and a few thousand external users that would be using it, and nobody was using the term identity management yet.

Immediately, I was hooked. Within two years, what started out as a tactical framework evolved into an enterprise SSO, federated identity, user provisioning, role management and directory virtualization suite (all of which was developed in-house). A dozen applications turned into several hundred, and a few thousand users turned into a couple of million. Before I knew it, I was no longer an individual contributor but was leading a large and extremely talented group dedicated solely to identity and access management. I never anticipated my career panning out the way it did, but I'm pleased it happened, mainly because of all the wonderful experiences I've had working in IAM over the years. Not to mention that I'm still extremely proud of what we accomplished there (it wasn't until several years later that we finally began replacing our homegrown suite with vendor solutions).

Anyway, I've decided to put together a short list of the things I love and hate about IAM, and would be interested in hearing from fellow IAM professionals about their own lists...

The Likes:
  1. Working with customers. Although I was always enthusiastic about IAM, I only discovered how passionate I was about it when I moved over to the consulting side of the industry and began working with customers to solve their business problems. Having spent so many years as a customer myself, I can empathize with the internal challenges each of them faces, having been there and done that. While at Qubera, I've had the privilege of working with some wonderful and highly talented professionals at customers all over the world. That's an experience I wouldn't swap for anything.
  2. Technological diversity. I can't think of another I.T. discipline that provides one with exposure to such a vast range of technologies and platforms. An IAM project can touch every element of an organization's technology infrastructure, so in order to be successful, you need to demonstrate extraordinary technical breadth. In any given week, for example, I might be working on projects that involve integrating an IAM system with everything from vanilla LDAP directories to healthcare systems to mainframe applications to homegrown web portals to custom client-server apps to mobile devices.
  3. Solving business problems. One of the pillars of our philosophy at Qubera is that IAM is a business enabler rather than just a technology. This is one of the first things we try to impress upon our customers. The sense of accomplishment one gains from solving a complex business problem for a customer and delivering quantifiable business value is an addictive feeling. IAM is one of the few I.T. disciplines where you have the opportunity to have a demonstrably positive impact on an organization's business culture.
  4. Constant change. Although IAM technologies and best practices have matured dramatically over the past few years, IAM itself is still very much in its infancy (or at least in its adolescence, which makes me wonder when it will begin asking for a car). Just staying abreast of all the constantly evolving IAM standards, tools and technologies out there can be a full-time job in itself. As somebody who thrives on change, I think that the day IAM stops evolving will be the day that I get tired of it. Fortunately, there is no prospect of that happening for the foreseeable future; in fact, I believe today is the best time ever to get involved in IAM.
  5. No two projects are the same. Every customer likes to think that their IAM challenges are unique. At a high level, I've never found this to be the case, as the general issues faced by most organizations tend to be extremely common. However, at a more granular level, the characteristics of every IAM project truly are unique. Sometimes this is because of the technical landscape or some unusual business processes, and sometimes it's purely because of the personalities involved in the project. Whenever I engage with a new customer, I know for certain that I am going to be faced with new challenges and have new experiences. That keeps things exciting and ensures I stay sharp.
The Dislikes:
  1. Politics. IAM projects are notorious for being politically contentious, and at some point in every IAM project, politics will rear its ugly head. This is sometimes because of data stewardship or process ownership issues, sometimes because of turf wars, sometimes because project participants fear change, and sometimes because of internal disagreements over strategy and direction. No matter how careful you are to avoid it, political contention on an IAM project is as inevitable as the sunrise.
  2. IAM is unglamorous. If you're looking to make a big name for yourself in I.T., then IAM is probably the wrong field for you. It is extremely unfashionable, and because ROIs often tend to be squishy, it isn't a top budget priority for most organizations. Generally, the only time anybody will care who you are is if you fail, and then everybody will know your name. To some extent, your proficiency in IAM can be measured by your level of anonymity. Fortunately for me, I never wanted to be famous, otherwise I would have chosen to become a movie star instead (/snark).
  3. Delivering bad news. Just like doctors have to learn how to deliver bad news to patients, IAM professionals are constantly having to deliver bad news to executives. Usually, this is because a customer's security exposures turn out to be far more significant and costly than they had believed. Sometimes, it's because their project objectives and timelines are horribly unrealistic. And occasionally, it's because they lack the internal staffing to support an enterprise IAM solution, never mind implement it to begin with. Worst of all is when a customer engages you for a clean-up project after another consulting company has botched an IAM implementation, and you have to break the news that it is beyond salvage; you need to rip out everything your predecessor did and start again.
  4. Nobody takes IAM seriously until they suffer a serious breach. This one speaks for itself. We've all been there.
  5. All the misunderstandings surrounding what IAM "is" and "isn't". No, it isn't a "product" that simply needs to be installed. No, it isn't just about provisioning users, standing up an LDAP directory or synchronizing passwords. No, it isn't an operational back-office function. And.... well, I'm sure you get my meaning.
If you work in IAM, what are your top likes and gripes?

Decomposing Identity Management Approval Workflows

One of the hallmarks of a poorly conceived identity management solution is a large number of workflows, particularly approval workflows. In one extreme case with which I'm familiar, a large financial services company has a Sun Identity Manager implementation that contains hundreds of individual approval workflows—one workflow for every entitlement, application or role that a user can request. They have a team of IDM engineers whose function is to build new workflows every time a new application needs to be integrated with the identity system. In most cases, this task involves cutting and pasting the XML from other similar workflows and then hardwiring them with application-specific metadata that is provided by a business analyst. Needless to say, this company eventually reached a point where the identity management system began to experience crippling performance issues.

When designing self-service request/approval workflows, it is important not to think in terms of individual applications but in terms of assets and generic approval patterns. An asset can be anything that might be requested: an application, entitlement, role, or even an out-of-band asset such as a laptop or cellphone. Each asset, in turn, is logically mapped to an approval pattern. Even in the largest, most complex organizations, the number of approval patterns tends to be extremely small; in most cases no more than five to ten. In fact, one tends to find that most approval patterns are variations of the following:

  • Manager-Only Approval
  • Asynchronous Multi-Level Approval (all approvers must complete in order)
  • Synchronous Partial Approval ('x' of 'y' approvers must complete)
  • Synchronous Full Approval (all approvers must complete in any order)
When one decomposes the major functionality of any approval workflow into its constituent parts, it becomes much easier to visualize:
Figure 1: Decomposition of an IDM Approval Workflow
Let's examine the three major processes illustrated above.
Request Process: This handles an asset request from an end user or an approved designate. Optionally, this process may include generation of a request form, which collects additional, asset-specific information from the requester that is required to support the request (for example, a business case for requesting the asset). 
Approval Process: Once the request has been submitted, an approval workflow is invoked. An asset-to-pattern mapping determines the particular approval pattern to invoke. Optionally, approvers may be required to complete an approval form, in which they will enter additional information that is necessary to provision the asset. 
Execution Process: After all requisite approvals have been completed, the asset needs to be provisioned. This can be done automatically using a provisioning connector, or by generating a ticket to a human provisioner.
All three processes require metadata about the asset. Depending on the capabilities of the identity management system, metadata can be stored in the system itself, or even externalized in an RDBMS or LDAP directory. Among other elements, metadata may include the fields to render on asset-specific request/approval forms, email templates, escalation settings, delinquency timeouts, instructions for the execution process on what to do when all approvals have been completed, and of course the type of approval workflow pattern to invoke.
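The asset-to-pattern mapping and metadata-driven execution described above can be sketched as follows. All asset names, patterns and metadata fields are invented for illustration; any real implementation depends on your product's workflow engine.

```python
# Hypothetical metadata store: each asset maps to a generic approval
# pattern plus the details the execution process needs. In practice this
# might live in the IDM system itself, an RDBMS or an LDAP directory.
ASSET_METADATA = {
    "vpn_access":  {"pattern": "manager_only", "approvers": ["manager"],
                    "auto_provision": True},
    "corp_laptop": {"pattern": "sync_full",
                    "approvers": ["manager", "asset_desk"],
                    "auto_provision": False},
}

def run_pattern(pattern, approvers):
    """Stub: a real engine would dispatch to one of the handful of
    generic approval patterns (manager-only, multi-level, x-of-y, ...)."""
    return [True for _ in approvers]  # pretend everyone approved

def handle_request(user, asset):
    """One generic workflow serves every asset; metadata drives behavior."""
    meta = ASSET_METADATA[asset]
    approvals = run_pattern(meta["pattern"], meta["approvers"])
    if not all(approvals):
        return "denied"
    if meta["auto_provision"]:
        return f"connector: provision {asset} for {user}"
    return f"ticket: provision {asset} for {user} manually"

print(handle_request("alice", "vpn_access"))
# connector: provision vpn_access for alice
```

Note how turning on automated provisioning for an asset is just a metadata change (flipping `auto_provision`), with no new workflow to build.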

This approach allows for the design of highly generic, dynamic and reusable workflow templates, which in turn facilitates the rapid integration of new assets for self-service access requests. The loosely-coupled nature of a pattern-based workflow architecture is also consistent with the governance-based implementation strategy I've described in previous posts, which is based on decoupling automated provisioning from account discovery and asset definition. Using metadata to drive execution of a provisioning action should in most cases allow administrators to "flick a switch" simply by updating the metadata itself once they are ready to turn on automated provisioning for a particular asset.

Of course, the obvious question is what does it take from a technical perspective to build an asset integration framework like this. Obviously, the specific implementation depends very much on your unique business requirements and the capabilities of the identity management system you are using, which is why I've attempted to keep this overview product-agnostic. With that said, this approach is certainly compatible with solutions such as OIM, SIM/Oracle Waveset, Novell IdM, SailPoint IdentityIQ and Quest One Identity Manager.

All of which also raises another advantage of pattern-based workflows. If at any point in the future, you plan to migrate to another identity solution, it is much easier to migrate a tiny handful of workflows than it is to migrate several thousand.

Do I really need the Amazon Kindle Fire?

Okay, I admit it. I'm as much of a geek as the next person, so when I see something cool and new like the Amazon Kindle Fire tablet that was unveiled earlier this week, my first instinct is to say "I want one!" There's no rationale behind it, no logic, just the primal instincts of a geek being drawn to a new bright, shiny object.

But now that the initial excitement has worn off, I have to say that I'm underwhelmed by the specs. No MicroSD slot, a paltry 8GB of storage, no camera, no mic, no Bluetooth or HDMI support, and no 3G (at least not in the first release). An iPad killer this is not, although I was pleasantly surprised that it will include a micro-USB 2.0 port. Sure, the price point is compelling, and you can't really expect too much for $199, but that still raises the question: what is Amazon's target audience?

Perhaps the Fire is the tablet counterpart to the Google Chromebook. The hardware specs don't need to be impressive, because it is assumed that users will store all of their content in the cloud, reducing the need for local storage. In other words, the Fire is being aimed at a different audience than the iPad, which is designed to be as much of a productivity tool as a media client.

The range of apps available through the Amazon App Store is paltry in comparison to Apple, and Apple is likely to retain a huge advantage in this area for the foreseeable future. Interestingly, the Fire will not be able to access Android Market, although I suspect some enterprising hacker will figure out a jailbreak for that restriction within days of the Fire's release. As with Apple, the Amazon App Store is a walled garden, in which all apps undergo an approval process.

But I suspect that Apple isn't really Amazon's target here. From a technical standpoint, the Fire doesn't appear to have been conceived as an iPad killer, and there are now so many devices competing in the high-end tablet space where the iPad reigns supreme that it doesn't make sense to add another one to the mix. Besides, Amazon clearly isn't concerned about making a profit on sales of Kindle Fire tablets. The device itself is a loss-leader for them, as the retail price is actually lower than the manufacturing cost, according to this analysis. The real long-term strategy behind the Kindle Fire is to draw new customers into Amazon's media ecosystem and make them dependent upon it. Anybody who purchases a Fire will receive a free one-month membership to Amazon Prime, which provides unlimited two-day shipping and, more importantly, access to a growing library of streaming movies and shows. As Business Week points out:
Amazon Prime may be the most ingenious and effective customer loyalty program in all of e-commerce, if not retail in general. It converts casual shoppers...into Amazon addicts. Analysts describe Prime as one of the main factors driving Amazon's stock price—up 296 percent in the last two years—and the main reason Amazon's sales grew 30 percent during the recession while other retailers flailed. At the same time, Prime has proven exceedingly difficult for rivals to copy: It allows Amazon to exploit its wide selection, low prices, network of third-party merchants, and finely tuned distribution system, while also keying off that faintly irrational human need to maximize the benefits of a club you have already paid to join.
If anybody should be nervous about the Kindle Fire, it should be Netflix and B&N, not Apple. The iPad dominates the higher end of the tablet market, and will likely continue to do so for the foreseeable future. The lower end of the market, however, is still up for grabs, and that is where the Kindle Fire is likely to dominate. Backed by Amazon's leviathan marketing machine, considerable resources and a vast media ecosystem, the Fire is likely to become yet another huge success for Amazon.

But none of this answers the question of whether or not I need one. I already own a first version iPad, which I use mainly for taking notes in meetings, watching movies on planes and as an RDP client for my home network. With the addition of the awesome Clamcase keyboard, my iPad doubles as an extremely effective netbook when I need it to. So when the iPad 2 was released earlier this year, I decided to give it a miss, as it didn't offer any compelling new features that justified an upgrade. Instead, I figured I would wait another year to see what the iPad 3 had to offer, and that is still the plan.

With that said, there is still something... I don't know... compelling about the Kindle Fire. I'm a big fan of the 7" form factor for reading e-books, watching movies and browsing web content, and a MicroUSB port that supports an external hard drive is a killer feature for a movie buff like me. As a consultant, I spend a lot of time in hotels and on airplanes, where WiFi speeds are often insufficient for streaming media from the cloud, so the ability to carry around my media on a flash drive would be huge (the failure of the iPad 2 to provide USB support was one of the major factors in my decision not to buy one). Even so, this one feature alone is not sufficient justification for replacing my iPad.

If I didn't already have an iPad, the Kindle Fire might be a no-brainer for me, but I certainly don't need two tablets. It would be like having both a notebook and a Chromebook, which I just can't see the need for. Still, the Kindle Fire will certainly be a huge success without people like me buying one. If you don't already have a tablet, you can't really go wrong with a $199 device that provides you with access to pretty much all the media content you would ever need.

UPDATE: My colleague Chris over at Technologese adds some great technical insights into Amazon's strategy, and as usual, is right on the money.