Finding suspect links and requirements to reconcile in CLM 6.0 configuration-management-enabled projects

The newly released CLM 6.0 brings some great configuration management capabilities across the lifecycle. As described in Enabling configuration management in CLM 6.0 applications, there are some considerations to weigh before enabling configuration management. One of these is:

Do you rely on suspect links for requirements in RM or for requirement reconciliation in QM?

The reason is that in this release, automatic detection of suspect links (artifacts that need to be evaluated and possibly modified because a linked artifact has changed) does not work in configuration-management-enabled DOORS Next Generation (DNG) projects. Further, the requirements reconciliation process in Rational Quality Manager (RQM), which finds changed or deleted requirements impacting associated test cases, is also not supported in configuration-management-enabled RQM projects. A new mechanism for determining link suspicion is intended for a future release; in the interim, teams must use a workaround. I'll explore some options in this blog.

Suspect Links Workaround

To determine which requirements have changed and which impacted test cases may need updating, we'll need to look at a test coverage view of requirements filtered by a modified date. It's likely that you'll want to know which requirements have changed since the last release baseline.

Let’s assume we are working in the AMR Mobile US global configuration in DNG.  Open the Configuration Context Banner Button (CCBB) and navigate to the global configuration.

[Screenshot: Configuration Context Banner Button showing the AMR Mobile US global configuration]

Show the baselines for this global configuration.

[Screenshot: baselines listed for the global configuration]

Observe that the most recent baseline is AMR Mobile 3.0 US GA. Drill down into it and see that the baseline was created on May 27, 2015, 12:35:23 AM.

[Screenshot: AMR Mobile 3.0 US GA baseline details showing the creation date]

Note that at this time, baselines for each of the contributors to the global configuration must be created before the baseline for the global configuration can be committed/finalized. This means there will likely be some time disparity between the global configuration baseline and the baselines for the contributing local configurations (they could be earlier or later than the global configuration's creation date). For this reason, it is more accurate to use the local configuration baseline's creation time for the filtering. While looking at the global configuration baseline, select the row with the desired local configuration and observe its creation time on the right.

[Screenshot: creation time of the selected local configuration baseline]
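If you'd rather capture that creation time programmatically than read it off the UI, something like the following sketch could fetch it over the server's RDF/REST interface. This is a minimal sketch, not a supported recipe: the server URL and baseline URI are hypothetical, authentication is elided, and since CLM 6.0 predates the finalized OSLC Configuration Management specification, the exact RDF vocabulary should be verified against your server.

```python
# Minimal sketch: read a local configuration baseline's creation date.
# BASELINE_URI is hypothetical; copy the real one from the baseline's page
# or the global configuration's contribution list. Authentication (form
# challenge or basic auth, depending on deployment) is omitted.
import requests
from xml.etree import ElementTree as ET

BASELINE_URI = "https://clm.example.com/rm/cm/baseline/_hypotheticalId"

session = requests.Session()
# ... authenticate the session here per your deployment ...

resp = session.get(BASELINE_URI, headers={"Accept": "application/rdf+xml"})
resp.raise_for_status()

# The creation timestamp is assumed to surface as dcterms:created in the RDF.
root = ET.fromstring(resp.content)
created = root.find(".//{http://purl.org/dc/terms/}created")
print("Baseline created:", created.text if created is not None else "unknown")
```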

While viewing requirements of interest, likely module by module (e.g., AMR System Requirements), open a Test Coverage view and add a Filter by Attribute for Modified On, set to after the baseline date.

[Screenshot: Test Coverage view filtered by Modified On after the baseline date]

Here we see that four requirements have been altered: two have no linked test case but may need one if they are new, and two have linked test cases that may need to be updated.
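If this check is something you'll repeat every iteration, the same filter can be expressed against the DNG OSLC RM query capability rather than rebuilt by hand in the UI each time. The sketch below is illustrative only: the query base and stream URIs are hypothetical (discover the real ones from the service provider catalog), authentication is again omitted, and it assumes the Configuration-Context header is honored to scope the query to your stream.

```python
# Sketch: list requirements modified after the baseline date via OSLC.
# QUERY_BASE and STREAM_URI are hypothetical placeholders.
import requests

QUERY_BASE = "https://clm.example.com/rm/views?oslc.query=true"
STREAM_URI = "https://clm.example.com/rm/cm/stream/_hypotheticalStream"
BASELINE_DATE = "2015-05-27T00:35:23Z"  # local configuration baseline time

params = {
    "oslc.prefix": "dcterms=<http://purl.org/dc/terms/>,"
                   "xsd=<http://www.w3.org/2001/XMLSchema#>",
    "oslc.where": f'dcterms:modified>"{BASELINE_DATE}"^^xsd:dateTime',
    "oslc.select": "dcterms:identifier,dcterms:title,dcterms:modified",
}
headers = {
    "Accept": "application/rdf+xml",
    "OSLC-Core-Version": "2.0",
    "Configuration-Context": STREAM_URI,  # evaluate the query in this stream
}

resp = requests.get(QUERY_BASE, params=params, headers=headers)
resp.raise_for_status()
print(resp.text)  # RDF/XML listing of the changed requirements
```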

Now you could look at each requirement one by one to understand whether it is new or modified. In this example, opening requirement 1830 shows that it was created after the baseline date.

[Screenshot: requirement 1830, created after the baseline date]

You could also add the Created On attribute to the columns displayed in the view.

[Screenshot: the view with the Created On column added]

This requirement doesn't have a related test case, so you would now evaluate whether one should be created.
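The same new-versus-modified distinction the Created On column gives you can be made in a script by also selecting dcterms:created in the query above and comparing it to the baseline date. A small self-contained sketch of that classification logic, with made-up example timestamps:

```python
# Sketch: classify a changed requirement as new or pre-existing relative to
# the baseline by comparing its creation timestamp to the baseline timestamp.
from datetime import datetime

BASELINE = "2015-05-27T00:35:23Z"

def classify(created: str, baseline: str = BASELINE) -> str:
    """Return how a changed requirement relates to the baseline."""
    to_dt = lambda s: datetime.fromisoformat(s.replace("Z", "+00:00"))
    return ("new since baseline (may need a test case)"
            if to_dt(created) > to_dt(baseline)
            else "modified existing (linked test cases may need updates)")

print(classify("2015-06-02T10:14:00Z"))  # created after the baseline -> new
print(classify("2015-03-11T08:00:00Z"))  # created before -> modified existing
```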

Looking at requirement 517, you observe that it was created before the baseline and modified since. There is a related test case, but you need to understand what the change was in order to evaluate whether it necessitates a change in the test case.

Open the requirement history and get a sense of the changes.

[Screenshot: history of requirement 517 showing the changes]

Should the changes be such that a reevaluation of the test case is warranted, follow the Validated By links to navigate to the test case(s) and check if they need updating.

To track the potential impact of those substantive requirement changes, you could tag the requirements and create a traceability view that shows only those.

[Screenshot: traceability view filtered to the tagged suspect requirements]

Alternatively, create a Tracked By link from the suspect requirement to a new RTC work item task that would be assigned to the test team to evaluate whether any linked test cases should be updated.

[Screenshot: Tracked By link from the suspect requirement to an RTC task]
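Creating those tasks can also be scripted through the OSLC Change Management interface that RTC exposes. Again, this is only a sketch: the creation factory URL and requirement URI are hypothetical (the real factory is advertised in RTC's service provider documents), authentication is elided, and your process configuration may require additional attributes such as Filed Against.

```python
# Sketch: open an RTC task asking the test team to evaluate linked test
# cases for a suspect requirement. FACTORY_URL and SUSPECT_REQ are
# hypothetical placeholders; authentication is omitted.
import requests

FACTORY_URL = ("https://clm.example.com/ccm/oslc/contexts/"
               "_hypotheticalProject/workitems/task")
SUSPECT_REQ = "https://clm.example.com/rm/resources/_hypotheticalRequirement"

body = f"""<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dcterms="http://purl.org/dc/terms/">
  <rdf:Description>
    <dcterms:title>Evaluate test cases for changed requirement</dcterms:title>
    <dcterms:description>Requirement modified since the release baseline;
      review its Validated By test cases: {SUSPECT_REQ}</dcterms:description>
  </rdf:Description>
</rdf:RDF>"""

resp = requests.post(
    FACTORY_URL,
    data=body.encode("utf-8"),
    headers={"Content-Type": "application/rdf+xml",
             "OSLC-Core-Version": "2.0"},
)
print(resp.status_code, resp.headers.get("Location"))  # expect 201 + new URL
```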

Rather than going through each requirement individually, an alternative is to use the Compare Configuration capability to compare the current local DNG configuration/stream to the DNG baseline included in the AMR Mobile 3.0 US GA baseline.

[Screenshots: launching Compare Configuration, selecting the baseline to compare against, and the compare results]

The compare results will show additions, changes and deletions to project properties, folders and artifacts. With this information, the analyst can make a reasonable determination of how substantive each change was, then return to the Test Coverage view(s) and tag the appropriate requirements as suspect and/or create RTC work items to analyze the linked test cases. Note that previously we were looking at requirements changes module by module (if modules were being used), whereas Compare Configuration looks at all changes to all artifacts across the stream, consolidating every change without a per-module perspective.
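A rough scripted approximation of that compare is to run the same artifact query twice, once in the stream context and once in the baseline context, and diff the results; deletions then fall out as identifiers present in the baseline but not in the stream. As before, the URLs below are hypothetical placeholders, authentication is omitted, and the parsing is only as robust as your server's response shape.

```python
# Sketch: approximate Compare Configuration by querying one project in two
# configuration contexts and diffing the requirement identifier sets.
# QUERY_BASE, STREAM_URI and BASELINE_URI are hypothetical placeholders.
import requests
from xml.etree import ElementTree as ET

QUERY_BASE = "https://clm.example.com/rm/views?oslc.query=true"
STREAM_URI = "https://clm.example.com/rm/cm/stream/_hypotheticalStream"
BASELINE_URI = "https://clm.example.com/rm/cm/baseline/_hypotheticalBaseline"
DCTERMS = "{http://purl.org/dc/terms/}"

def identifiers(config_uri: str) -> set:
    """Requirement identifiers visible in the given configuration context."""
    resp = requests.get(
        QUERY_BASE,
        params={"oslc.prefix": "dcterms=<http://purl.org/dc/terms/>",
                "oslc.select": "dcterms:identifier"},
        headers={"Accept": "application/rdf+xml",
                 "OSLC-Core-Version": "2.0",
                 "Configuration-Context": config_uri},
    )
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    return {el.text for el in root.iter(DCTERMS + "identifier") if el.text}

stream_ids = identifiers(STREAM_URI)
baseline_ids = identifiers(BASELINE_URI)
print("Added since baseline:  ", sorted(stream_ids - baseline_ids))
print("Deleted since baseline:", sorted(baseline_ids - stream_ids))
```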

Now, if you were paying attention to my scenario, you'll notice that the last screenshot above, showing the results of the compare, doesn't line up with the Test Coverage view shown earlier: the system requirements that were shown to have changed since the baseline do not appear in the compare results. No, this isn't a bug in the software. I was using a shared internal sandbox whose configuration was changed between the time I started writing this blog and the time I tried to capture the compare results. Rather than trying to recreate the scenario, I left things as they were, as I think the concept still comes across (though the perfectionist in me really has a problem with it all not lining up!).

Requirements Reconciliation Workaround

Requirements reconciliation is a capability in RQM that looks at the requirements linked to a test case, or the requirement collection linked to a test plan, and determines whether there are new, updated or deleted requirements that necessitate changes in the test case(s). In CLM 6.0, requirements reconciliation is not supported in projects that have configuration management enabled.

While you can query for test cases changed since a certain baseline date, this doesn't really help determine whether there are test cases to be updated due to requirements changes. It's not possible from RQM to query test cases and filter based on attributes of the linked requirements.

Thus, the reconciliation process would need to be driven from DNG, with the tester using the same technique the analyst used for the suspect links workaround. That is, the tester would look at a test coverage view of requirements in DNG, filtered to show requirements updated since the baseline date, and evaluate whether a test case addition, deletion or update is warranted. This process is further helped if the analysts used tagging as previously described, so that the tester wouldn't need to sift through all the requirements to find only those with substantive changes. Use of impact analysis tasks in RTC would help as well.

These test coverage views will only identify requirements added or changed since the baseline; they will not list removed requirements. So for a comprehensive view of the requirements changes that need to be reconciled with test cases, the RM stream needs to be compared against the baseline to see any requirements that have been deleted (in the compare sketch above, these are the identifiers present in the baseline but absent from the stream).

Conclusion

While it is unfortunate that this initial release of the global configuration management capability doesn't include support for suspect links and requirements reconciliation, there are some manual workarounds available that, while not ideal, can help mitigate the gap until a replacement is available. For some customers, such a manual process may be untenable due to the volume of changes between baselines. Rather than performing the analysis after the fact, it may be more appropriate to flag potential impacts proactively from the beginning of, and throughout, the release. As requirements change, assess whether test case updates are needed by tagging the requirement and/or creating an impact analysis task in RTC. These can be reviewed and refined at different milestones throughout the release. Again, not ideal, but it does distribute the analysis burden.