torsdag 20 februari 2014

Two-factor authentication for mobile with OATH

Nearly all major vendors of smartphone and tablet platforms, such as Apple, Google, Microsoft and BlackBerry, support software tokens as an inexpensive yet secure way to perform two-factor authentication. The majority of these vendors leverage OATH HOTP for this service, and the fact that ForgeRock OpenAM supports OATH HOTP is a good reason to drill into this a little deeper.

Two-factor (or multi-factor) authentication is nothing new and is commonly found in applications where a basic authentication process of providing just a password, a PIN code, or the swipe of a card isn’t enough. The purpose of multi-factor authentication is to lower the probability that a user presenting credentials is not who he or she claims to be. The rationale goes: the more factors, the higher the probability that the user really is the claimed identity.

In the case of two-factor authentication, the two factors are typically something the user knows (such as the classic password, or private identity details like a mother’s maiden name or a first pet’s name) and something the user has (such as an access card or a mobile phone). A third factor might be something the user is, such as biometric data from the face or a fingerprint, or voice characteristics.

Now, OATH (short for the Initiative for Open Authentication) is a collaborative effort by various members of the IT industry to provide a reference architecture for universal strong authentication across all users and devices, over all networks. OATH is open and royalty-free for anyone to implement and use. HOTP is short for HMAC-Based One-Time Password. The idea behind OATH is to reduce complexity and lower the cost of ownership by allowing customers to replace proprietary security systems, which are often complex and expensive to maintain.

ForgeRock OpenAM can be configured to support OATH-based HOTP for two-factor authentication. For the technically savvy who like to try things out, there is an easy-to-follow guide to configuring OpenAM in this way. On a side note, OpenAM also supports TOTP, or time-based one-time passwords. Both are open standards: HOTP is described in RFC 4226 and TOTP in RFC 6238.
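To make HOTP concrete, here is a minimal sketch of the RFC 4226 algorithm using only the Python standard library; the secret and counter below come straight from the RFC’s own test vectors, so the output is verifiable.

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Compute an RFC 4226 HOTP value from a shared secret and counter."""
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# First test vector from RFC 4226, Appendix D:
print(hotp(b"12345678901234567890", 0))  # 755224
```

The token and the server each keep a copy of the counter; both sides move it forward on every successful authentication, which is why HOTP is called an event-based one-time password.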

There are a number of use-cases where this added security comes in handy.

  • Self-service password resets
    One of the most common problems users encounter is forgetting their password and being unable to log in to work or to purchase goods and services. The traditional approach is to use challenge/response questions - a set of predefined or user-defined questions whose answers unlock a password reset. In light of what people tend to share on social media sites, these questions typically introduce a weakness in the security perimeter. In fact, the U.S. Federal Financial Institutions Examination Council (FFIEC) and the U.S. Federal Deposit Insurance Corporation (FDIC) strongly caution financial organizations against authentication methods that rely on personal information. With OATH tokens, an easy second factor can be introduced, thereby mitigating some of the associated risks.

  • Accessing cloud applications via single sign-on
    In the new and modern world without boundaries, moving freely between on-premise applications and off-premise cloud applications might cause your company’s security experts sleepless nights. Introducing stepped-up authentication with an OATH HOTP when accessing cloud apps could be one precaution that helps the security experts sleep better.

  • Step-up or risk-based authentication
    Assume that, from within an application, you are accessing sensitive information, or that you are logging in with an unknown pattern (for example, from a different network or a new device). Providing a second authentication factor in such cases again mitigates some of the risks, and OATH is a cost-effective solution to the problem.



To conclude, we have discussed what OATH is and some of its typical use cases. OpenAM provides the necessary capabilities to implement a cost-effective two-factor (or multi-factor) authentication process that increases user and consumer acceptance of stronger authentication. OATH allows your organization to stay compliant with the guidelines set out by the FFIEC, the FDIC and others, while reducing the risk and implications of identity theft. It does so by offering multiple factors of authentication, allowing users and consumers to more convincingly prove that they are who they claim to be.





tisdag 18 februari 2014

OAuth 2.0 Explained, and What's the Business Value?


With the latest release of ForgeRock OpenAM in November 2013, there is a lot of talk about OAuth 2.0 and how OpenAM can act both as an OAuth 2.0 authorization server and as an OAuth 2.0 client for installations where the web resources are protected by OpenAM. As for this article, we’ll focus on OpenAM as an OAuth 2.0 authorization server, and the business value of this implementation.


Assume you rock up at a fancy restaurant in your $200k sports car and hand the key to the 18-year-old valet tasked with driving it off to a parking lot a block away. Somehow you wish you could restrict the performance of your car: keep it below a certain speed, disallow access to the trunk and glove compartment, block the built-in communication package from making outgoing calls, and limit how far it can travel.


The solution to this is of course a valet key, designed specifically for this type of situation. The same situation exists on the modern web. As a user, you might want to share certain resources and capabilities from one service with another - but not the full shebang - nor do you want to hand the credentials for your social media sites to third parties.

This is where OAuth 2.0 comes in. OAuth 2.0 is an open standard for access delegation and authorization without the need to explicitly share password information. OAuth 2.0 allows providers of services to share a “valet key” to a third party, that allows limited access to services, possibly with an attached time constraint. In other words, OAuth 2.0 provides the mechanism for clients to access resources on behalf of the user or owner of those resources. The end user can select or authorize third-party access to resources, without ever sharing any secret credentials such as the username and password.


In simpler terms, OAuth 2.0 empowers people to perform delegated authorization on their own.


The above concept is, of course, nothing new and is seeing more and more application in the modern web, where businesses wish to expose their APIs and share information with third party applications in a secure manner.


In the world of OAuth 2.0 there are four roles that are defined in the standard:


  • Resource Owner
    An entity capable of granting access to a protected resource. When the resource owner is a person, he or she is referred to as an end user.
  • Resource Server
    The server that hosts the protected resource, capable of accepting and responding to protected resource requests using access tokens. The resource can be anything, for example, photos, in the case of flickr.com.
  • Client
    An application that makes protected resource requests on behalf of the resource owner and with his authorization. The term "client" does not imply any particular implementation characteristics (for example, whether the application executes on a server, a desktop, or other device).
  • Authorization Server
    The server that issues access tokens to the client, after successfully authenticating the resource owner and obtaining authorization.
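To see how the four roles interact, here is a sketch of the two client-side steps of the authorization code grant defined in RFC 6749, built with Python’s standard library. The endpoint URLs and credentials are placeholders for illustration, not real OpenAM paths; an actual client would send these over HTTPS and parse the JSON token response.

```python
from urllib.parse import urlencode

# Placeholder endpoints; an authorization server such as OpenAM publishes
# its real authorization and token URLs in its documentation.
AUTHZ_ENDPOINT = "https://as.example.com/oauth2/authorize"
TOKEN_ENDPOINT = "https://as.example.com/oauth2/access_token"

def authorization_url(client_id: str, redirect_uri: str,
                      scope: str, state: str) -> str:
    """Step 1: the client sends the resource owner's browser here to consent."""
    return AUTHZ_ENDPOINT + "?" + urlencode({
        "response_type": "code",   # asks for an authorization code
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,            # which "valet key" privileges are requested
        "state": state,            # anti-CSRF value, echoed back to the client
    })

def token_request_body(code: str, client_id: str, client_secret: str,
                       redirect_uri: str) -> str:
    """Step 2: the client POSTs this form body to the token endpoint
    and receives the access token (the 'valet key') in response."""
    return urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": redirect_uri,
        "client_id": client_id,
        "client_secret": client_secret,
    })
```

The resource owner only ever types a password at the authorization server; the client sees the short-lived code and the resulting token, never the credentials.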


In addition to these four roles, two different types of tokens are defined by the standard:


Access Token: Access tokens are the credentials a client presents to access protected resources. An access token is a string that represents an authorization issued to the client. Tokens represent specific scopes and durations of access, granted by the resource owner and enforced by the resource server and authorization server. The access token provides an abstraction layer, replacing different authorization constructs such as traditional credentials (username/password) with a single token that is understood by the resource server.


Refresh Token: Although not mandated by the specification, access tokens ideally have an expiration time that can range from a few minutes to several hours. Refresh tokens are credentials used to obtain access tokens. They are issued to the client by the authorization server and are used to obtain a new access token when the current one becomes invalid or expires.
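Refreshing is just another grant type at the token endpoint. The sketch below shows the form body a client would POST, per RFC 6749, section 6; the parameter names are from the spec, while the values are placeholders.

```python
from urllib.parse import urlencode

def refresh_request_body(refresh_token: str, client_id: str,
                         client_secret: str) -> str:
    """Form body a client POSTs to the token endpoint to trade a refresh
    token for a fresh access token (RFC 6749, section 6)."""
    return urlencode({
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": client_id,
        "client_secret": client_secret,
    })
```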


OpenAM exposes a RESTful API that allows administrators to read, list, and delete OAuth 2.0 tokens. OAuth 2.0 clients can also manage their own tokens.


The Business Value of OAuth 2.0



Now that we have a rough idea of the purpose of OAuth 2.0, let’s look at how it fits the business puzzle and why it can be important to you and your organization.


More and more frequently, organizations are exposing their own APIs to attract third-party developers who can take advantage of the organization’s services to build apps and widgets, with the ultimate motive of driving top-line revenue. Companies like Amazon, Netflix, and even Sears do this to attract intermediaries who can connect buyers with them as sellers and create convenient ways to transact.


For this strategy to work, however, a certain level of trust must be established between the parties involved. Identity is an important piece of this puzzle and it is imperative that organizations secure the exposure of services via APIs. OAuth 2.0 is a component of any Identity Relationship Management (IRM) platform, and serves to address these security issues and provide a convenient way to deal with authorization.


From a developer perspective (as described by Ryan Boyd at Google), before the dawn of OAuth 2.0 inside Google, developers spent 80% of their time dealing with authorization in new applications. Being able to handle authorization more easily, and in a standardized way, cuts costs and brings applications to market more quickly.
 



onsdag 12 februari 2014

Bridging Enterprise and Cloud in the shadow of Mr. Snowden

As the software industry and the way we procure enterprise software shift from on-premise with proprietary licensing to off-premise with a subscription, there is an emerging need to control the hybrid environment being created. For small startups with fewer than a hundred people on the payroll, it’s natural to leverage cloud services for file storage, customer relationship management, email and calendar, and so on. But if you look at older and/or bigger companies, there is often a mix of on-premise and off-premise deployments. Despite what the industry hype says, not everybody is in the Cloud, but many are taking tentative steps to explore the model.


Now, the problem that emerges from a hybrid environment is of course compliance, and the regulatory concerns associated with shuffling identity data to SaaS providers - as well as the “shuffling” itself. It needs to be secure, in sync with any authoritative sources, and the appropriate logs need to be kept.
Whenever an enterprise outsources sensitive business information, such as personal information about employees, contractors and partners, to a SaaS vendor in the Cloud, there is a risk of data privacy issues and access-control complexities. Not to mention the risk of foreign governments accessing the information for reasons other than national security - especially in light of the recent interview with Mr. Snowden, the NSA whistleblower, in which he stated, on the question of whether the NSA spies on Siemens, Mercedes or other successful German companies, that such information would be collected in the name of national interest.




For companies and organizations embracing cloud services, it is critical not only to enforce and monitor controls, but also to make sure that users are who they claim to be and that the appropriate entitlements within the SaaS application are managed. Looking at this from the other side of the fence, SaaS vendors want customers to be on-boarded quickly and securely, and to offer some of the capabilities the above concerns call for. This is where the need arises for identity bridges: IDM systems that can manage both traditional enterprise applications and new SaaS applications.


Often in the hybrid environment there is a local, on-premise authoritative source for employee data, managed by, for example, the Human Resources department. Alternatively, there might be a directory of some sort, such as an LDAP directory or Microsoft’s Active Directory. In these cases it’s convenient if cloud services are seen simply as additions to the current infrastructure that require no extra management, such as juggling exported CSV files from an HR system that need to be uploaded or, worse, having to manually on-board and off-board users to a cloud service.


However, Jonathan Lehr predicts that Identity and Entitlements will move beyond the active directory paradigm as we know it today. Especially with tools such as Workday already storing employee data in the Cloud.


ForgeRock OpenIDM provides an extensive integration layer that spans both on-premise enterprise applications and cloud-based SaaS solutions, such as Salesforce and Google Apps. This allows an enterprise to control the provisioning flow and ensure that entitlements are set according to applicable policies. Identities can be kept in sync using OpenIDM’s discovery engine, reconciling and synchronizing down to the attribute level, while recording the provisioning activities for monitoring, regulatory and compliance purposes.
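As a rough illustration, a mapping in OpenIDM’s sync.json configuration might look like the sketch below. The connector path, attribute names and policies are illustrative, not a drop-in configuration: the idea is that a source attribute flows to a target attribute, and situations detected during reconciliation (an account absent on the target, a source record that has disappeared) trigger policy-driven actions such as on-boarding or off-boarding.

```json
{
  "mappings": [
    {
      "name": "systemHrAccounts_managedUser",
      "source": "system/hr/account",
      "target": "managed/user",
      "properties": [
        { "source": "mail", "target": "mail" },
        { "source": "firstName", "target": "givenName" }
      ],
      "policies": [
        { "situation": "ABSENT", "action": "CREATE" },
        { "situation": "SOURCE_MISSING", "action": "DELETE" }
      ]
    }
  ]
}
```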


Now, of course, managing boundaryless identities is one thing, but there are still fears, especially among Europeans, that data stored in the Cloud could be vulnerable to foreign surveillance. Sensitive data flows quickly between servers, possibly in multiple countries, making it very complex to regulate and protect - but at least, with the appropriate tools, proper identity management can be done.