Tuesday, May 22, 2018

Revisiting the emailNotification workflow in IDM 6.0

Workflow continues to be a topic that intrigues people I meet in the field deploying ForgeRock IDM.

This blog post will illustrate how to configure IDM 6.0 to enable the embedded workflow engine, discuss some of the tools that are available, and build and deploy a simple workflow that sends an email notification.

First off, the workflow module is no longer enabled by default and requires you to configure IDM 6.0 to enable it. By default, IDM 6.0 is deployed with an embedded ForgeRock Directory Server, and since the workflow engine can't persist workflow and business process state in DS, it needs a separate RDBMS. For this purpose, the in-memory H2 database is still embedded and is used automatically.

IDM 6.0 does support using an external DS as its repository, even in production, but if you are considering using the workflow module in production, you should also ensure you have a supported RDBMS up and running to persist the workflows.

The Activiti implementation in IDM 6.0 is unfortunately based on an older version of Activiti, and the documentation ForgeRock points to on Alfresco/Activiti's website references the Activiti Designer for Eclipse, which is no longer available. Community members have filed a request to get the binaries back, but so far only the source code is available, and should you require this plugin, you need to build it from scratch.

The source code for the Activiti Designer is available on GitHub at https://github.com/Activiti/Activiti-Designer.

Despite the Activiti engine not being on a par with the latest and greatest from Alfresco/Activiti, it is still one of the most important and widely deployed components of ForgeRock IDM.

This blog post deals with configuring IDM 6.0 to enable the workflow engine and with building a simple email notification workflow that connects to an SMTP server and sends an email. A launch form will be used, where the user invoking the business process can provide parameters to the workflow.

To follow this little exercise, I'm assuming some prerequisites.

1.) ForgeRock IDM 6.0 is installed, up and running. 
The latest version is available from http://backstage.forgerock.com

2.) You have a fake SMTP server up and running on localhost. 
You just need a fake service that will accept emails. I recommend getting FakeSMTP from http://nilhcem.com/FakeSMTP/download.html; install and start it.

3.) A BPMN 2.0 editor is installed and can be leveraged. 

Should you require a BPMN 2.0 editor, I recommend the Yaoqiang BPMN Editor, which has some nice features and is easy to work with. It can be downloaded for free from https://sourceforge.net/projects/bpmn/files/latest/download?source=typ_redirect

The steps we will go through are:

1.) Enable and configure IDM 6.0 to enable the workflow engine.
2.) Create an emailNotification workflow.
3.) Deploy and test the workflow in IDM 6.0.


Enable and configure IDM 6.0 to enable the workflow engine. 
With IDM 6.0 up and running, log in as the openidm-admin user and navigate via the Admin UI to System Preferences, where you can enable Workflow.




This creates two JSON configuration files that you can study.

1.  $OPENIDM/conf/workflow.json (the location from which IDM picks up deployed workflows, in .bar or .xml format)


{
    "useDataSource" : "default",
    "workflowDirectory" : "&{idm.instance.dir}/workflow"
}

2.  $OPENIDM/conf/datasource.jdbc-default.json (DB store for Workflow)


{
    "driverClass" : "org.h2.Driver",
    "jdbcUrl" : "jdbc:h2:file:&{idm.install.dir}/db/activiti/database;MVCC=FALSE;DB_CLOSE_DELAY=0",
    "databaseName" : "activiti",
    "username" : "sa",
    "password" : {
        "$crypto" : {
            "type" : "x-simple-encryption",
            "value" : {
                "cipher" : "AES/CBC/PKCS5Padding",
                "salt" : "XXXXXXXXXXXXXXXXX==",
                "data" : "XXXXXXXXXXXXXXXXX==",
                "iv" : "XXXXXXXXXXXXXXXXX==",
                "key" : "openidm-sym-default",
                "mac" : "XXXXXXXXXXXXXXXXX"
            }
        }
    },
    "connectionTimeout" : 30000,
    "connectionPool" : {
        "type" : "hikari",
        "minimumIdle" : 1,
        "maximumPoolSize" : 5
    }
}
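
For production you would point this datasource at a supported RDBMS instead of the embedded H2. As a rough, hypothetical sketch (host, schema name and credentials are placeholders, and the password would normally be encrypted by IDM on startup), a MySQL-backed datasource.jdbc-default.json could look like this:

{
    "driverClass" : "com.mysql.jdbc.Driver",
    "jdbcUrl" : "jdbc:mysql://localhost:3306/activiti?allowMultiQueries=true&characterEncoding=utf8",
    "databaseName" : "activiti",
    "username" : "openidm",
    "password" : "changeit",
    "connectionTimeout" : 30000,
    "connectionPool" : {
        "type" : "hikari",
        "minimumIdle" : 1,
        "maximumPoolSize" : 5
    }
}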


Now create the $OPENIDM/workflow directory that is referred to in $OPENIDM/conf/workflow.json.

The workflow engine is now enabled, and the prerequisites are in place for you to deploy workflows. 


Create an emailNotification workflow
Now let's create our simple emailNotification workflow. The purpose of this simple process is to provide a launch form that allows the invoking user to supply some parameters. In this simplistic illustration we will let the user provide a toEmail parameter. The rest of the data will be static and hard-coded. Should you want to, you can experiment at your own pace with this sample process and expand the launch form and the email task with more dynamic behavior. 




To create this workflow you need a Start Event, a Service Task and an End Event. If you are using the Yaoqiang BPMN Editor you can just drag these activities out and connect the transitions from Start to Service Task to End.

Rename the Service Task to Email Notification.



Now you need to implement a launch form. We do this on the Start Event (the form properties live in the start event's extension elements). Switch to "Source" mode and insert the necessary XML to provide input via a form field. Our simple form will just provide a text field that takes input for the variable toEmail.

<startEvent id="startevent1" isInterrupting="true" name="Start" parallelMultiple="false">
      <extensionElements>
        <activiti:formProperty id="toEmail" name="To Email:" variable="toEmail" writable="true"/>
      </extensionElements>
      <outgoing>flow1</outgoing>
      <outputSet/>
    </startEvent>

Next up, we want to set up the Service Task to send emails and define the required parameters with some static values. 

<serviceTask activiti:type="mail" completionQuantity="1" id="mailtask1" implementation="##WebService" isForCompensation="false" name="Email Notification" startQuantity="1">
      <extensionElements>
        <activiti:field expression="${toEmail}" name="to"/>
        <activiti:field expression="no-reply@forgerock.com" name="from"/>
        <activiti:field name="text">
          <activiti:string><![CDATA[Here is a simple Email Notification from ForgeRock IDM.]]></activiti:string>
        </activiti:field>
        <activiti:field name="subject">
          <activiti:string><![CDATA[Simple Email Notification]]></activiti:string>
        </activiti:field>
      </extensionElements>
      <incoming>flow1</incoming>
      <outgoing>flow2</outgoing>
    </serviceTask>
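
As a hypothetical example of the more dynamic behavior mentioned earlier, you could add a second form property to the Start Event and reference it from the mail task by using an expression instead of a static string (the firstName variable is an assumption for illustration):

<!-- Extra launch-form field inside the startEvent's extensionElements (hypothetical) -->
<activiti:formProperty id="firstName" name="First Name:" variable="firstName" writable="true"/>

<!-- In the serviceTask, build the mail body from the variable -->
<activiti:field expression="Hello ${firstName}, here is a simple Email Notification from ForgeRock IDM." name="text"/>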

Now we have fully implemented the necessary components for this exercise and we are ready to deploy and test the workflow. 

Save the file somewhere temporarily and then manually copy the file to $OPENIDM/workflow/.
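
A quick way to catch a malformed definition before IDM tries to deploy it is to parse the XML first. Here is a small, optional Python sketch; the sample string stands in for your saved file, whose name (emailNotification.bpmn20.xml) is an assumption:

```python
# Optional sanity check before copying a BPMN file into $OPENIDM/workflow/:
# verify that the XML is well-formed and contains at least one BPMN process.
import xml.etree.ElementTree as ET

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

def bpmn_process_ids(xml_text):
    """Return the ids of all <process> elements, raising on malformed XML."""
    root = ET.fromstring(xml_text)  # raises ParseError if malformed
    procs = root.findall(f"{{{BPMN_NS}}}process")
    if not procs:
        raise ValueError("no <process> element found")
    return [p.get("id") for p in procs]

# Minimal stand-in for the real file; in practice you would read the
# saved workflow file from disk instead.
sample = (
    f'<definitions xmlns="{BPMN_NS}">'
    '<process id="EmailNotification" name="emailNotification"/>'
    "</definitions>"
)
print(bpmn_process_ids(sample))  # ['EmailNotification']
```

If the parse raises, fix the file in the editor before deploying it.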

Make sure the FakeSMTP is up and running. 

Log in to the Self-Service interface of IDM 6.0 and you will discover that the emailNotification workflow is available at the bottom of the dashboard. Expand the details and take it for a spin!
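
If you prefer the REST interface over the Self-Service UI, the process can also be started with a POST to IDM's workflow endpoint. The sketch below only builds the request; the localhost URL and the default admin credentials are assumptions about a stock local install:

```python
# Sketch only: start the EmailNotification process over IDM's REST API.
# The base URL and default admin credentials are assumptions.
import json
import urllib.request

def build_start_request(to_email,
                        base_url="http://localhost:8080/openidm",
                        user="openidm-admin", password="openidm-admin"):
    """Build (but do not send) the request that starts the process."""
    body = json.dumps({"_key": "EmailNotification",
                       "toEmail": to_email}).encode()
    req = urllib.request.Request(
        base_url + "/workflow/processinstance?_action=create",
        data=body, method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("X-OpenIDM-Username", user)
    req.add_header("X-OpenIDM-Password", password)
    return req

req = build_start_request("jane.doe@example.com")
# urllib.request.urlopen(req) would submit it to a running IDM instance.
```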



If everything is correctly assembled, you should receive an email at the address you specified in the launch form when invoking the process. 




-=[ THE END ]=-






Appendix - The actual BPMN definition. 

<?xml version="1.0" encoding="UTF-8"?>
<definitions
 xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
 xmlns:activiti="http://activiti.org/bpmn"
 xmlns:bpmndi="http://www.omg.org/spec/BPMN/20100524/DI"
 xmlns:omgdc="http://www.omg.org/spec/DD/20100524/DC"
 xmlns:omgdi="http://www.omg.org/spec/DD/20100524/DI"
 typeLanguage="http://www.w3.org/2001/XMLSchema"
 expressionLanguage="http://www.w3.org/1999/XPath"
 targetNamespace="http://www.activiti.org/test">
 <process id="EmailNotification" name="emailNotification">
   <documentation>Simple Email Notification Task</documentation>
  
   <startEvent id="startevent1" name="Start">
   
   <extensionElements>
    <activiti:formProperty name="To Email:" id="toEmail" variable="toEmail" writable="true" />
  </extensionElements>
   
   </startEvent>
   <sequenceFlow id="flow1" name="" sourceRef="startevent1"
     targetRef="mailtask1"></sequenceFlow>
   <endEvent id="endevent1" name="End"></endEvent>
   <sequenceFlow id="flow2" name="" sourceRef="mailtask1"
     targetRef="endevent1"></sequenceFlow>
   <serviceTask id="mailtask1" name="Email Notification"
     activiti:type="mail">
     <extensionElements>
       <activiti:field name="to" expression="${toEmail}"></activiti:field>
       <activiti:field name="from" expression="no-reply@forgerock.com"></activiti:field>
        <activiti:field name="text">
          <activiti:string><![CDATA[Here is a simple Email Notification from ForgeRock IDM.]]></activiti:string>
        </activiti:field>
        <activiti:field name="subject">
          <activiti:string><![CDATA[Simple Email Notification]]></activiti:string>
        </activiti:field>
      </extensionElements>
   </serviceTask>
 </process>
 <bpmndi:BPMNDiagram id="BPMNDiagram_EmailNotification">
   <bpmndi:BPMNPlane bpmnElement="EmailNotification"
     id="BPMNPlane_EmailNotification">
     <bpmndi:BPMNShape bpmnElement="startevent1" id="BPMNShape_startevent1">
       <omgdc:Bounds height="35" width="35" x="170" y="250"></omgdc:Bounds>
     </bpmndi:BPMNShape>
     <bpmndi:BPMNShape bpmnElement="endevent1" id="BPMNShape_endevent1">
       <omgdc:Bounds height="35" width="35" x="410" y="250"></omgdc:Bounds>
     </bpmndi:BPMNShape>
     <bpmndi:BPMNShape bpmnElement="mailtask1" id="BPMNShape_mailtask1">
       <omgdc:Bounds height="55" width="105" x="250" y="240"></omgdc:Bounds>
     </bpmndi:BPMNShape>
     <bpmndi:BPMNEdge bpmnElement="flow1" id="BPMNEdge_flow1">
       <omgdi:waypoint x="205" y="267"></omgdi:waypoint>
       <omgdi:waypoint x="250" y="267"></omgdi:waypoint>
     </bpmndi:BPMNEdge>
     <bpmndi:BPMNEdge bpmnElement="flow2" id="BPMNEdge_flow2">
       <omgdi:waypoint x="355" y="267"></omgdi:waypoint>
       <omgdi:waypoint x="410" y="267"></omgdi:waypoint>
     </bpmndi:BPMNEdge>
   </bpmndi:BPMNPlane>
 </bpmndi:BPMNDiagram>

</definitions>

Wednesday, May 2, 2018

Allowing GDPR to change the world into a better place

ForgeRock is in the business of providing the necessary enablers to safeguard consumers' personal identity information, while allowing companies to customize and tailor the user experience and to establish and maintain trusted digital relationships, with technologies such as progressive profiling during self-service registration and throughout each login.

In light of all the recent scandals, it is an absolute must for companies to implement state-of-the-art software technology to ensure that the data entrusted to them is safe and secure, and that the intent behind collecting it is clearly declared.

How many of you read the user agreements that big companies force you to accept for the services you use? This could be companies like Spotify, Apple or Facebook.

Often we are talking about 35-40 pages of lawyer-talk that is anything but simple to read and would require most people, young and old, to use a magnifying glass, since the font size is so small that it is impossible to make any sense of it. Many of these agreements sign away liability and the rights to the photos of your children, the latest barbecue with friends, or collected metrics on how fast you ran your 5k over the weekend to the companies behind these services.

In many of these agreements you give away the right to file lawsuits against these companies, or to engage in any form of class action lawsuit, should there be a breach of trust or misuse of the data you have entrusted them with.

Cambridge Analytica managed to access not only the ones who signed up for the "This is your digital life" application but also their friends connected to them on Facebook. They managed to harvest the personal information of somewhere in the range of 50 million identities, according to the first reports back in March, and used it to target political campaigns and ads in the United States and Indonesia. Who knows where else these types of activities have been going on with data collected from unknowing users?

As I was reading last night, the number of users caught up in the Cambridge Analytica scandal that kicked off the latest round of scrutiny into the world of social networking now stands at a staggering 87 million. As for Facebook, it shed 22 percent of its stock price and lost over $100 billion in market cap since the Cambridge Analytica revelations, dragging other prominent technology stocks down along with it.

The other day I got an email from TaskRabbit, which apparently had a major data breach in which personal identity information was leaked. I must have signed up a long time ago out of curiosity and totally forgot about it, yet I trusted them with pockets of my personal digital identity information, which is now in the hands of third parties.

Is private information safe with companies such as Facebook? What have we actually agreed to share and with whom and for what purpose?

Your personal identity information is your asset, and companies profit from it. Everywhere you go, you leave an exhaust of personal information. Your smartphone tracks every step you take and every place you visit; combine that with wearables such as Fitbit and smart watches, where vital body metrics get captured and sent off.

I would argue that no consumer knows what their data is used for or with whom it is shared, and trust in these companies is dropping every day.

Currently there is a stand-off happening, but fortunately things are about to change, at least for European consumers, with the enforcement of GDPR (the General Data Protection Regulation) on May 25th, 2018. Hopefully this will resolve the stand-off and give control back to consumers.

GDPR is a great reminder to businesses that people lend them their information and that organizations have a responsibility to look after it. It is not just about confidentiality; it is also about integrity, accuracy and availability, all of which can be summarized as what should be, and in my ears is, good business practice. You want to do business with companies that can be trusted, not the sketchy ones that sell your data, share it with third parties or hide what they collect about you.

Companies have purposefully harvested information from their own staff and their customers and included, in small print and lawyer-talk, the right to reuse that information for purposes not originally intended. This is of course NOT ok!

With all these scandals and breaches of trust occurring back to back on a more or less daily basis, awareness is building among consumers. People are in fact concerned about how their data is being used; they want their data to be secure, the ability to review what is known about them, and the option to opt out, as shown by a recent survey published in The Economist.


  • 9 out of 10 are concerned about identity theft and fraud!
  • 89% are worried that their data is not kept securely by the providers.
  • 3 out of 4 are concerned that small invasions of privacy lead to a loss of civil rights. 
  • 89% are worried their data is being shared with third parties that they have not consented to share with. 


Consumers demand disclosure from companies on how and why their data is being collected and what it is being used for.

Now, even if companies must adopt new processes and implement better controls, which can be costly, it is a great opportunity: an opportunity to establish healthy, sound and trusted long-term digital relationships with their customers.


  • Clear, transparent and accessible information on how you process personal data will lead to public confidence in your organization. (People will tend to trust you more and also be more forgiving.)  
  • Data volume reduction - as GDPR is enforced, there will be a drive to reduce data volumes, which will turn into vastly reduced costs and fewer operational inefficiencies associated with keeping masses of redundant and obsolete emails and files on corporate servers, in the cloud or on premises. 
  • Data quality - we all know data ages very quickly; even records that are months old can be completely out of date, and storing and sifting through this mass of unstructured data wastes resource time and storage space. GDPR ensures information is only kept as long as it is valid and for the purpose it was gathered. 
  • Security - with data breaches hitting the headlines daily, GDPR will ensure you adopt better policies for the data under management, benefiting both your reputation and your end users' data.

    and finally....
  • Trust - as companies adopt better data policies to comply with GDPR, the overall trust level between companies dealing with each other's information will rise. 



So despite what we typically think about regulatory requirements, I personally believe that GDPR will bring something good to the table: both for consumers, who will enjoy better protection knowing that it is a legal requirement to manage their personal identity data safely, securely and with integrity, and for corporations that want to act, and be seen, as serious companies treating their customers with the utmost respect. View GDPR as an opportunity to change the world into a better place!

Monday, September 11, 2017

Delegated Administration and ForgeRock Identity Management

What is Delegated Administration?
Through delegation of administration, a directory or application services infrastructure such as ForgeRock Identity Management can be designed to span multiple and/or non-structured organizations that have unique management requirements. In particular, delegating the administration of resources can help organizations meet specific requirements for structural and operational independence.


Why do people need Delegated Administration?
People generally look to delegate administration (in part or in full) for three reasons:
  1. Organizational Structure: Understanding and control over resources are bound to a structure defined by an organization. They need the ability to participate in shared resources, while still maintaining independence in the decision-making process.
  2. Legal Requirements: Configuring and maintaining the ability to operate in a manner consistent with regulatory (or other legal) requirements that may restrict access or activity (government, defense or financial institutions, for example).
  3. Operational Requirements: Today's applications and services leverage structured constraints based on metadata (attributes) for configuration, availability or security. This is not uncommon in hosting or outward-facing scenarios.


Foundational components providing Delegated Administration
  1. Organizational Units: Provides a way to scope and group objects
    1. Managed Objects (including roles, users, orgs, devices)
    2. Policies (password policies)
    3. Connectors
    4. Workflows (associate workflows to organizations)
  2. Authorization Layer: Granular entitlements governing what can be done through the RESTful layer, on which objects and in which organizations
  3. Ability to group entitlements into assignable Administrative Roles


What do people do in OpenIDM today?

Today, there is no formal model for delegated administration; however, users can define their own model and methods to accomplish basic delegation tasks. This is often unsupported (custom code) and not repeatable - making any “extended” or “custom endpoint” solution untenable in the long run.


Ref: Blog post from Simon Moffat


A better solution

With the addition of Hub City Media's IDA module, delegated administration capabilities are added to ForgeRock Identity Management and previous versions of OpenIDM (4.x) in a seamless and intuitive way. It allows administrators to deploy in such a way that administrative tasks can be fully delegated based upon a set of predefined conditions and attribute values.


Installation of Hub City Media’s IDA is simple and straightforward. The IDM administrator runs an installation script on a single or clustered IDM server and the installer handles deploying the required endpoints and user interface.


Once installed, the IDA administrative role can be assigned to a user responsible for configuring the system. Configuration involves creating Delegated Administration Policies that control who the delegated administrators are, who they can administer, and what specific operations are allowed.

Delegated Administration Policies

The power of IDA is in the policy framework. Clients can implement multiple Delegated Administration Policies to satisfy various requirements from the same system. Delegated Administration Policies contain:
  1. A Source Rule: a boolean expression based on user and relationship attributes. Any user that matches the Source Rule is an administrator in this policy.
  2. A Target Rule: any user that matches the Target Rule can be administered by any user who matches the Source Rule.
  3. A Permissions Schema: defines which operations (create, delete, enable, disable) the administrator can perform on the target users and which identity data they can touch, giving very fine-grained control over each field.


These policies are configured from the user interface and there is no coding required.


User Interface and APIs

Once the policy is defined, authenticated users can access the IDA user interface and administer users based on those policies. The delegated administrator can only modify users they are allowed to, given the active policy, and they can only edit attributes defined in the policy.


Like other components of the ForgeRock Platform, HCM has exposed the functionality of IDA as a set of REST endpoints. If customers want to use their own UI, they can do so while still getting the benefit of IDA's policy enforcement.


Conclusion


ForgeRock Identity Management is, due to its flexibility, the ideal component for solving complex identity management problems, whether external-facing or internal-facing. With the addition of Hub City Media's IDA component on top of ForgeRock Identity Management, organizations can easily structure and administer large numbers of identities and better adapt to internal organizational structures and legal requirements. Reach out to ForgeRock or Hub City Media to discuss how ForgeRock Identity Management and HCM IDA can be part of your identity management infrastructure, or to request a demonstration.

Monday, August 28, 2017

DevOps for ForgeRock Identity Management - The Configuration Management Story

Enterprise software packages quickly become a struggle to install, configure and maintain. Complex products often require multiple components in different tiers, and initial configurations often have to be deployed on several instances of the same server. For a production environment, you are also looking at high availability and hardening the products during installation, a job that quickly becomes tedious, if not impossible, to do by hand. Once an enterprise software product is deployed and installed, it is common practice to keep it static for months or even years without changing its configuration, because of the complex process of testing and putting a new release into production.


The care and feeding of multiple servers requires automation. The term for this is configuration management, and it is a core aspect of DevOps methodologies.

As of ForgeRock Identity Management 5.0, pre-made containerized images are available along with a DevOps guide (For those customers and partners with access to backstage) to aid in deploying using DevOps strategies. The samples and guide utilize components such as Kubernetes, Docker and Amazon EC2 Container Service. This article, however, offers some alternative thoughts and software suggestions that might complement or offer an alternate strategy.

CFEngine was one of the first configuration management systems deployed in anything approaching widespread use, and it was followed later by Puppet and Chef. A bit over two years ago, SaltStack's "Salt" entered the market and took a radically different approach to the problem of "configure all of my servers to do X."


Many of these configuration management solutions sit on top of SSH to perform remote execution, SSH being the de facto standard for secure and encrypted network traffic, but with the drawback of being rather computationally expensive. Other solutions use proprietary protocols with deployed agents, or leverage HTTPS.

The goal is to get ForgeRock Identity Management to install in an automated fashion with the required components configured and ready.


Some of the reasons why you would want an automated installation: it is great for quickly setting up environments for development, testing, and trying out different aspects and versions of a product; for troubleshooting configurations; and for collaborative builds with support for cloud deployments. It gives you a consistent approach that is quick to set up and quick to tear down.


Efforts put into setting up the automated installation will provide a lot of leverage down the road, reaching the goal of a fully hands-off deployment and installation.


Tools and frameworks



Chef

Chef is a company & configuration management tool written in Ruby and Erlang. It uses a pure-Ruby, domain-specific language (DSL) for writing system configuration "recipes". Chef is used to streamline the task of configuring and maintaining a company's servers, and can integrate with cloud-based platforms such as Rackspace, Internap, Amazon EC2, Google Cloud Platform, OpenStack, SoftLayer, and Microsoft Azure to automatically provision and configure new machines. Chef contains solutions for both small and large scale systems, with features and pricing for the respective ranges.


Chef uses a master-agent setup, and in addition to a master server, a Chef installation also requires a workstation to control the master. The agents can be installed from the workstation using the ‘knife’ tool that uses SSH for deployment to ease installation. Chef configs are packaged into JSON files called ‘recipes’, and the software can run in client-server (called Chef-server) or standalone mode (called ‘Chef-solo’).




Puppet

Puppet is an open-source configuration management utility. It runs on many Unix-like systems as well as on Microsoft Windows, and includes its own declarative language to describe system configuration.



Ansible/Vagrant

Ansible is a free software platform for configuring and managing computers. It combines multi-node software deployment, ad hoc task execution, and configuration management. It manages nodes over SSH or PowerShell and requires Python (2.4 or later) to be installed on them. Modules work over JSON and standard output and can be written in any programming language. The system uses YAML to express reusable descriptions of systems.


Vagrant is computer software that creates and configures virtual development environments. It can be seen as a higher-level wrapper around virtualization software such as VirtualBox, VMware, KVM and Linux Containers (LXC), and around configuration management software such as Ansible, Chef, Salt and Puppet.
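
To make the Ansible option concrete, here is a rough, hypothetical playbook sketch for standing up IDM with an external project directory. The host group, archive location and paths are all assumptions, and a real playbook would also handle the repository, JVM and service management:

- hosts: idm_servers
  become: true
  tasks:
    - name: Unpack the IDM binaries under /opt
      unarchive:
        src: /tmp/IDM-6.0.0.zip      # downloaded from Backstage beforehand
        dest: /opt
        remote_src: true

    - name: Deploy version-controlled project configuration
      copy:
        src: files/openidm-conf/
        dest: /etc/openidm/

    - name: Start IDM pointing at the external configuration directory
      command: /opt/openidm/startup.sh -p /etc/openidm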




CFEngine

CFEngine is an open-source configuration management system, written by Mark Burgess. Its primary function is to provide automated configuration and maintenance of large-scale computer systems, including the unified management of servers, desktops, consumer and industrial devices, embedded networked devices, mobile smartphones, and tablet computers. CFEngine is written in C and claims to be the fastest and leanest solution on the market for Configuration Management.


Salt

Salt leverages the ZeroMQ message bus, a lightweight library that serves as a concurrency framework. It establishes persistent TCP connections between the Salt master and the various clients, over which communication takes place. Messages are serialized using msgpack (a more lightweight serialization protocol than JSON or Protocol Buffers), resulting in significant speed and bandwidth gains over traditional transport layers and in the ability to fit far more data quickly through a given pipe. This translates into the non-technical statement: "Salt establishes a persistent data pipe between servers in your environment that's extremely fast and low-bandwidth."


Salt also has a Vagrant add-on, allowing you to spin up virtual machines similar to the Ansible/Vagrant combination discussed above.


Software Version Control and Management

ForgeRock Identity Management supports being started pointing to a particular folder for its configuration files. This means, for example, that the software can be installed under /opt/openidm while configuration files are stored under /etc/openidm. ForgeRock Identity Management can then be started with the argument -p <directory of conf files>. This separates the default files from the customer-specific ones, provides easier upgrades and gives a better overview.


To keep track of configuration files during development, test and QA, as well as of the finished production artifacts, a version control system is highly recommended. Software version control is the practice of deploying consistent software versions. This practice improves the chances for validation and testing and limits the number of software defects and interoperability issues. The version control system should be integrated with the configuration management system to pull the appropriate branch. This pattern makes deployment easier, faster, less complex and easier to support.


Configuration Upgrade Procedures

The upgrade procedure for ForgeRock Identity Management helps ensure that the process of lifting ForgeRock Identity Management from one version to a newer one occurs smoothly and with minimal downtime. A configuration management system, such as those listed and discussed above, assists in testing out the procedure. Since ForgeRock Identity Management acts as a hub component in the infrastructure, testing out integrations is also important during these upgrades and therefore requires in-depth planning.


Once the ForgeRock Identity Management upgrade procedure is defined and validated, it should be referenced in all change documentation appropriate to the particular upgrade. The ForgeRock Identity Management Release Notes always contain valuable clues and important information about a particular release.


Some discovered challenges



  • EULA acceptance (you need to accept the license during installation).
  • Downloading production binaries from Backstage requires a login.
  • OpenIDM servers are unable to start if the OpenIDM JDBC repository is unavailable. Therefore, it is imperative that the JDBC repository is up and running before you attempt to start OpenIDM in a DevOps deployment.
  • Clustered OpenIDM servers are not removed from the cluster node list when they are brought down or when they fail. In elastic deployments, with servers frequently added to and removed from clusters, the cluster node list can grow to be quite large.


Conclusions



When you have the infrastructure to proactively manage change using a configuration management system, the fast pace of development no longer leaves you behind. Being able to quickly set up and deploy a new ForgeRock Identity Management system allows you to scale in a known and repeatable manner, and to test out new features, capabilities and upgrades. Leveraging a DevOps philosophy and a configuration management system, although initially heavy to set up, is an opportunity to get ahead.


Although there is an abundance of tools available to support your configuration management needs, there are sample cookbooks and recipes available on the forgerock.org website to get ForgeRock Identity Management deployed and configured. Picking one of these as a starting point will save time and effort.

