Code Guard app

The Code Guard app reports on used standard vs. custom code, used vs. unused custom code, used and unused code by Application Area, and used custom code quality.

A user with LiveCompare Editor permissions must prepare this app by running the Create Object Links Cache workflow for the ‘Analysis System’ RFC Destination, and making sure that performance history data is available for the ‘Performance History System’ RFC Destination. See the Prerequisites section below for details.

DevOps categories

Development, Operations.

Parallel impact analysis

You can run the Code Guard app in parallel with other impact analysis apps and workflows.

Prerequisites

Before running the Code Guard app for the first time, you will need to run the Create Object Links Cache workflow from the Prerequisites package to create an object links cache database for the system to be analyzed. First, select the system’s RFC Destination in the LiveCompare hierarchy and select the OLC tab to verify whether an object links cache database has already been created. The Create Object Links Cache workflow may be run incrementally to update the object links cache database with any recent object dependency changes.

You will also need to make sure that performance history data is available on the RFC Destination selected for the ‘Performance History’ system. Select the RFC Destination in the LiveCompare hierarchy and click the PHD tab. Select the source for performance history data, set a collection schedule, click Save, and then click Update Cache. See the Retrieve performance history data help topic for details.

Run the app

To run the Code Guard app, select the app from the Apps screen and create an app variant. Set the Analysis System field to the RFC Destination for the system to be analyzed, and the Performance History field to the RFC Destination for the system from which performance history data has been obtained. Edit the AAQ Select List parameter if required to specify the quality scenarios, categories and rules to be used in the analysis.

Click Run. When the variant has completed, its results may be accessed from the App Cockpit screen.

App results

The Code Guard app generates a Dashboard report which includes the following charts:

  • The Used Standard vs Used Custom pie chart shows the number of used standard and custom objects in the performance history data retrieved from the Performance History system.

  • The Used vs Unused Custom Code pie chart shows the number of objects with used custom code, and the number of objects with unused custom code, according to the performance history data retrieved from the Performance History system.

  • The Used by Application Area pie chart shows the number of used objects grouped by Application Area. It shows the top three Application Areas and groups the others into the ‘Other’ category.

  • The Unused by Application Area pie chart shows the number of unused objects grouped by Application Area. It shows the top three Application Areas and groups the others into the ‘Other’ category.

  • The ABAP Failures column chart summarizes the rules triggered in the ‘ABAP failures’ category.

  • The Object Quality Scorecard column chart lists each of the ABAP quality categories. The chart displays a High, Medium and Low score for each category, based on the following criteria:

Category            Score     Description                              Chart color
Complexity          Low       0 to 10 executable pathways              Light blue
Complexity          Medium    11 to 15 executable pathways             Medium blue
Complexity          High      16 or more executable pathways           Dark blue
Fan-In              Low       Fan-In value <= 5                        Light blue
Fan-In              Medium    Fan-In value > 5                         Medium blue
Fan-Out             Low       Fan-Out value <= 5                       Light blue
Fan-Out             Medium    Fan-Out value > 5                        Medium blue
Other Categories    Low       Passed all rules checked                 Light blue
Other Categories    Medium    Passed 80% to 99% of rules checked       Medium blue
Other Categories    High      Passed less than 80% of rules checked    Dark blue
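The criteria above are simple threshold rules. The following Python sketch (not part of LiveCompare; the function names are illustrative only) shows one way to reproduce the scoring:

  def score_complexity(pathways):
      """Score a Complexity measurement from its number of executable pathways."""
      if pathways <= 10:
          return "Low"
      if pathways <= 15:
          return "Medium"
      return "High"

  def score_fan(value):
      """Score a Fan-In or Fan-Out value; the table defines no High band for these."""
      return "Low" if value <= 5 else "Medium"

  def score_other(rules_passed, rules_checked):
      """Score any other quality category from the percentage of rules passed."""
      pct = 100 * rules_passed / rules_checked
      if pct == 100:
          return "Low"
      if pct >= 80:
          return "Medium"
      return "High"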

The Dashboard report also includes links to the following reports:

Code Guard Details

This Excel report includes the following spreadsheets:

Dashboard

This spreadsheet includes the Dashboard charts described above.

Used

This spreadsheet provides details for the objects that were used on the Performance History System according to the available performance history data.

Unused

This spreadsheet provides details for the custom objects that were unused on the Performance History System according to the available performance history data.
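Together, the Used and Unused spreadsheets supply the inputs for the used vs. unused charts. As a minimal sketch, assuming the report has been saved as code_guard_details.xlsx (a hypothetical file name), the row counts can be tallied with pandas:

  import pandas as pd

  # Each row describes one object; the sheet names match the report.
  used = pd.read_excel("code_guard_details.xlsx", sheet_name="Used")
  unused = pd.read_excel("code_guard_details.xlsx", sheet_name="Unused")

  total = len(used) + len(unused)
  print(f"Used: {len(used)}, Unused: {len(unused)} ({100 * len(unused) / total:.1f}% unused)")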

Quality Scorecard

This spreadsheet provides a percentage quality score for each custom object in each of the analyzed categories. The STATUS column is set to OK if the object exists on the Analysis System, to not found if the object couldn't be found on the specified SAP system, to error in ABAP if the object’s ABAP code contains one or more syntax errors, or to no code if the object’s ABAP code contains comments only. If an object is not active, its STATUS value is set to no ABAP.
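As a quick way to see how many objects fall into each STATUS value, the spreadsheet can be tallied with pandas (a minimal sketch; the file name is hypothetical):

  import pandas as pd

  scorecard = pd.read_excel("code_guard_details.xlsx", sheet_name="Quality Scorecard")

  # Count objects per STATUS value: OK, not found, error in ABAP, no code, no ABAP.
  print(scorecard["STATUS"].value_counts())

  # Quality scores are only meaningful for objects whose STATUS is OK.
  ok_objects = scorecard[scorecard["STATUS"] == "OK"]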

Rule Break Summary

This spreadsheet provides a summary of the quality rules that were triggered for each custom object.

Rule Break Detail

This spreadsheet provides details for each used custom object that triggered one or more quality rules. The results are filtered so that only rules in the Error class are shown. In this spreadsheet:

  • The LINE_NUMBER column stores the ABAP source code line number on which the violation was detected. For Web Dynpro (WDYN) objects, methods are stored in include files, and the line number refers to the position of the source code line within its method. For other object types, if the INCLUDE column for the violation contains a value, the line number refers to a line in the INCLUDE file.
  • The SEQ column stores the ABAP source code line number on which the violation was detected. For Web Dynpro (WDYN) objects, this column refers to a line number in the object’s INCLUDE file. This column may be used when sorting the Rule Break Detail dataset to display the violations for each object ordered by line number, as in the sketch below.
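A minimal pandas sketch of that sort order follows; OBJECT_NAME stands in for whatever object identifier column the spreadsheet actually uses, and the file name is hypothetical:

  import pandas as pd

  detail = pd.read_excel("code_guard_details.xlsx", sheet_name="Rule Break Detail")

  # Group the violations by object, then order each object's rows by SEQ.
  ordered = detail.sort_values(["OBJECT_NAME", "SEQ"])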

Help

This spreadsheet provides help for each of the spreadsheet reports.

Analysis Input Data

This Excel report contains a copy of the input parameters used to produce the app’s Dashboard report. The value of each input parameter is stored in a separate worksheet, which is named after the parameter whose value it contains.
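Because each worksheet is named after the parameter it contains, the whole report can be read back into a dictionary keyed by parameter name. A minimal sketch, assuming the report has been saved as analysis_input_data.xlsx (a hypothetical file name):

  import pandas as pd

  # sheet_name=None loads every worksheet into a dict keyed by worksheet name.
  params = pd.read_excel("analysis_input_data.xlsx", sheet_name=None)

  for name, values in params.items():
      print(name, values.to_dict("records"))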

Related topics

Smart impact analysis runtime performance

Standard apps