C/C++#

Lattix provides a number of options for parsing source code. You can use the Clang parser, which is incorporated into Lattix, or you can use the output of Understand for C++ or Klocwork.

  • Clang: The Clang parser is incorporated into Lattix, so you don’t need any additional tools. It is a high quality parser that is a core part of Apple’s development environment and supports C/C++/Objective-C. For input, you can specify the source code directories along with compiler options, a Visual Studio solution or project file, or a build specification file.

  • Axivion: Axivion offers a sophisticated range of tools for automated static code analysis in the form of the Axivion Bauhaus Suite. It supports your software system developers in ensuring high quality and long-term ease of maintenance of the code they create, also offering checking for MISRA C, MISRA C++, AUTOSAR C++14, CERT C, and CERT C++.

  • Klocwork: Klocwork does static analysis; it has an excellent parser and understands the build process. If you already have Klocwork installed, or are considering it for its excellent static analysis for bug detection, it is a great choice to use with Lattix. Together, Lattix and Klocwork are an excellent solution for bug detection and architecture analysis.

  • Understand for C++: Understand for C++ has a very forgiving parser and will do an excellent analysis even in cases where the program does not compile. Understand for C++ also provides an option for strict analysis that uses Clang. You must also specify any macros that are specified through the command line of the C++ compiler.

  • Parasoft: Parasoft offers C/C++test tools.

Additional Capabilities#

There are some useful scripts available that can be applied once a project has been created for additional visibility:

  • Combine Source and Header files: This script will go through your project and create a new subsystem for source and header files with the same name. This new subsystem is located in the same place where the subsystem for the source file is located. This capability is useful when you want to think of source files and header files as a single abstraction. For instance, you may have a source file called test.c and an associated header file called test.h. The script will group these files together into a single subsystem called test.

  • Generate Class View: This script will create a new project (using the LDI module) that provides a pure class-based view of the project. All atoms and dependencies which are not class based are eliminated, leaving only classes and their dependencies. The directory structure is preserved, which allows you to extract the architecture from a pure object-oriented perspective.

C++ (with Clang)#

This module for C/C++ is based on Clang (http://clang.llvm.org/). Clang is part of the LLVM framework and is designed to support the C family of languages including C, C++ and Objective C. Clang is part of Apple’s development platform for iPhone, Macs and other devices. Clang has proven itself for its complete support for the C language family.

Getting Started#

C/C++ analysis is sensitive to the compile and link options. The Clang module provides several different ways to make this process easier.

Creating a Project#

In order to create a project, the Clang module needs to know the compile options. The Clang module also needs to know the files that are linked together. Ultimately, the Clang module needs a specification of the build. You can explicitly create the buildspec file or use the tools provided to generate this file.

The Clang module can take input in three different ways:

Project Configurer#

The Project Configurer provides a file directory browser user interface. The user selects the files that constitute the project. The interface also allows the user to specify compiler options. The user is expected to specify any include files that are not part of the project. The Project Configurer provides a quick way for users to get an understanding of the architecture. It is not suitable for complex projects where there are many different compiler options.

Visual Studio Files#

Visual Studio solution and project files contain the compile and link information needed for processing the source code. Lattix can read these files to create a project.

Build Specification File#

The user provides the build specification in a file. This is an XML file that contains the required compile and link information. This is the preferred way of dealing with complex projects that do not use Visual Studio. Lattix also provides tools for generating the build specification file from the output of make and cmake.

Project Configurer#

The clang module generates a project from C, C++ and/or Objective C source code. You can configure the clang module to parse source files or directories using a set of options or use a build specification file to describe parsing options.

To parse a directory:

  • Select New Project from the File menu

  • Set C/C++ (Clang) on the Module Type drop down

  • Select Project Configurer in the Input Sources section

  • Add Directory adds a directory tree of source files to the project

image0

  • Add File adds a single file to the project

  • Add Unit adds another Unit to your project (more on this below)

Unit#

Units allow you to indicate the “link” units for your project. This can be important if the source code in the project builds multiple DLLs or shared libraries. A Unit is a scope that allows Architect to resolve references to multiply defined symbols by choosing the definition in the same Unit.

Options#

You can set compiler options on a Unit (the above picture shows one Unit called buildspec), a directory or a file. These options are switches passed to the clang compiler.

  • -I adds to the include path. Example: -IC:\MyProject\include

  • -D defines a macro. Examples: -DWIN32, -DOS=WIN32

Compiler options can either be merged with the options from the enclosing Unit or Directory, or can be a standalone set of options that does not include options from the outer scope.
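For example, suppose a Unit defines a macro and a directory inside it adds an include path using the merge behavior (the Unit and directory names below are made up for illustration):

```
Unit "app" options:              -DWIN32
Directory "src" (merge):         -IC:\MyProject\include
Effective options for src files: -DWIN32 -IC:\MyProject\include
```

With a standalone (replace) set, the directory’s options would be used on their own, without -DWIN32.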

  • Press OK to dismiss the Project Configurer

  • Press Create Project and the clang module will begin parsing your source code

Avoiding Common Pitfalls#

Most issues using the Project Configurer relate to providing incomplete compile information.

To see the diagnostics, run the Clang report: Reports -> Clang Reports -> Diagnostics by File.

Missing Header Files#

If header files are missing, the files may not parse correctly or may not parse at all. This is often because the system include files are not found. Please make sure that you have the header files from your development environment installed. Use Options in the Project Configurer to specify additional include directories.

Undefined Macros#

While this is generally not a big problem, missing macro definitions can make the code unparsable. Specify macros through Options in the Project Configurer.

Multiply Defined Symbols#

You can get multiply defined symbols if you include two files which are linked into different binaries and define the same symbols. In this case, Lattix will mark the multiply defined symbol as an external. To avoid this, create your project with a Unit for each binary.
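The same advice can be expressed in a build specification file (described below under Build Specification): one <unit> per binary, so that each unit’s references resolve against definitions in the same unit. The unit names and paths in this sketch are hypothetical:

```xml
<?xml version="1.0" encoding="ISO-8859-1"?>
<buildspec>
  <!-- One unit per binary: symbols defined in both libraries resolve
       to the definition inside the same unit -->
  <unit name="libfoo" options="merge">
    <option value="-DWIN32"/>
    <directory name="C:\MyProject\libfoo" recursive="true"/>
  </unit>
  <unit name="libbar" options="merge">
    <option value="-DWIN32"/>
    <directory name="C:\MyProject\libbar" recursive="true"/>
  </unit>
</buildspec>
```

In the Project Configurer, the equivalent is to press Add Unit once for each binary and place the corresponding directories under each Unit.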

Visual Studio#

The Lattix Clang module can create projects by reading .sln or .vcxproj files from Microsoft Visual Studio 2008-2012.

  • Select New from the File menu

  • Select C/C++ (Clang) Module Type

  • Set the input source to Visual Studio

  • Press Add and navigate to your .sln file.

  • Choose the build configuration to use for your project

  • Press OK and create the project

image1

The resulting Lattix project will have a Unit (i.e. datasource) for each studio project contained in the .sln file.

Build Specification#

Build spec files are XML files with the extension .ldclang. The XML file describes units, source files and compiler options.

Here are the tags supported in the build spec file:

  • <buildspec dir="dirname"> - outermost tag in the build spec file

    • The dir attribute allows you to specify the directory to use for relative path resolution. dirname is an absolute path or a path relative to the directory containing the build spec file. All relative file references in the buildspec are resolved relative to the path indicated by dirname. If dirname is not specified, relative paths are relative to the directory containing the build spec file.

  • <unit> - tag that describes a Unit. <unit> groups source files, much as a linker links compiled source files together. <unit> provides scoping that is used to resolve dependencies on multiply defined symbols. <unit> has one attribute, name, which typically corresponds to the name of a makefile.

  • <directory> - tag that describes a directory of source files to be added to the project. The name attribute indicates the path to the directory. The recursive attribute (defaulting to true) indicates that sources from subdirectories are also added. The options attribute can be either merge or replace, defaulting to merge.

  • <exclude> - tag that is a child of <directory> whose pattern attribute indicates a pattern of files or directories to exclude from the build.

  • <file> - describes a source file. It has one attribute, name, that indicates the path of the source file to be compiled. The path can be absolute, relative to the name specified in the enclosing <unit> or <directory>, or relative to the path of the .ldclang file. <file> is a child element of <unit> and <directory>.

  • <option> - indicates a switch passed to the compiler. It has one attribute, value, which indicates the switch. <option> can be used as a child of <buildspec>, <unit>, <directory>, or <file>.

Example#

<?xml version="1.0" encoding="ISO-8859-1"?>
<buildspec>
  <unit name="httpd-2.2.21" options="merge">
    <option value="-DWIN32"/>
    <directory name="C:\Lattix\Apache\httpd-2.2.21" recursive="true" options="merge"/>
  </unit>
</buildspec>
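A more elaborate sketch illustrating the dir attribute, <exclude>, <file> and per-file options. All paths and names here are hypothetical, and the glob-style exclude pattern shown is an assumption about the pattern syntax:

```xml
<?xml version="1.0" encoding="ISO-8859-1"?>
<buildspec dir="C:\MyProject">
  <unit name="server" options="merge">
    <option value="-DWIN32"/>
    <!-- Relative names resolve against the dir attribute above -->
    <directory name="src" recursive="true" options="merge">
      <exclude pattern="*generated*"/>
    </directory>
    <!-- A single file compiled with an extra macro of its own -->
    <file name="extras\plugin.c">
      <option value="-DPLUGIN"/>
    </file>
  </unit>
</buildspec>
```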

Options#

The Configuration Options allow you to control some behaviors of the clang module as well as pass arguments to the compiler.

  • Filter dependencies from header to source files - Dependencies from header files to source files, such as macro dependencies or even dependencies from inline functions, can create spurious circular dependencies. With this option off, the DSM will show the header file using the source file that included it.

  • Include Standard Clang Headers - Enabling this option adds the header files provided by the Clang compiler to the include path. These headers are found in the source/clang subdirectory of your Lattix installation.

  • Automatically detect include directories in source tree - Tells the compiler to augment the compiler’s include path by scanning for directories in the source tree that contain header files.

  • Compute Visible declaration information - Disabled by default, this option saves the declaration information in the model which is then used by reports and checkers to detect duplicated declarations.

  • Convert symbolic links to physical file names - Enabled by default, this option converts files with symbolic links into physical paths.

  • Include files matching these names - Only the files that match the name will be included. It allows you to build a subset of a unit or a project.

  • Include files only from these datasources - List the datasources that will be processed. Other datasources will be ignored. It allows you to build a subset of the entire project.

  • Skip atoms matching these patterns - You can specify a pattern using a Perl-style regular expression to prevent certain files from being processed.

  • Include Directories - list of directories added to the clang compiler’s include path.

  • Compiler Options - raw options passed to the clang parser, e.g. -DDEBUG defines the macro DEBUG.

  • Remove these prefixes from atom names - Use this to remove a directory prefix. This allows projects compiled on different computers to be comparable. You can specify a "*" to designate an entire directory. For example: /home/name/build//project*.

  • Header File Extensions - list of file extensions considered to be header files.

  • Source File Extensions - list of file extensions considered to be source files.

  • Create dependencies to declarations if there’s only one - In the case where your source code does not contain implementations for all of the functions and data it uses (i.e. the implementation / definition is in a library) and there’s exactly 1 declaration for the item (i.e. in a shared header file), the clang module creates dependencies to that declaration. Typically dependencies target the definition.

  • Create atoms for declarations - Disabled by default, this property creates DSM elements for symbol declarations. By default, the Clang module only creates DSM entries for actual implementations and definitions.

  • Directory to use for temporary files - Directory which will be used to store temporary files generated as part of the analysis.

  • Directory to use to resolve relative files in buildspec - Similar to the "dir" attribute on the <buildspec> node in the buildspec. When Lattix detects a relative file in the buildspec, it attempts to resolve it:

    1. Relative to the “dir” attribute, if there is one

    2. Relative to this project setting

    3. Relative to the directory containing the buildspec.

  • Skip #define processing

  • Log File - name of file into which the clang module logs diagnostic information. This information can be very large and turning on this option can greatly reduce compiler performance.

  • Detailed Dependency Properties - Enabling this option adds additional properties to dependencies, primarily used for debugging purposes. This option is disabled, by default.

Atom and Dependency Kinds#

Atom Kinds#

The clang module creates atoms of these kinds:

  • Header File

  • Source File

  • Macro

  • Enum

  • Class - class, struct, template or union

  • Field - data member of a class

  • Method - class function or method

  • Global Variable

  • Static Variable - variable scoped to a file

  • Global Function

  • Static Function - function scoped to a file

  • Template Class

  • Template Function

Dependency Kinds#

The clang module creates these types of dependencies between atoms:

  • Override - a method overrides a method in a base class

  • Invoke - a method calls another

  • Inherit - a class has target class as a base class

  • Type Reference - general reference to a type

  • Macro - use of a macro

  • Address of Global - reference to the address of a global variable

  • Multiply Defined - symbol is multiply defined

  • Include - file include

  • Read Field - read access a data member of a class

  • Read Global - read access a global variable

  • Modify Field - modify a data member of a class

  • Modify Global - modify a global variable

  • Compiled With - Source files have a “Compiled With” dependency on the header files included in source’s compilation. Dependencies of this kind are only generated when the “Compute Visible Declaration Dependency Property” option is enabled. This dependency is filtered out by default.

Clang Reports#

There are a number of reports for the Clang module. These reports include diagnostics (including errors) that were generated when the source code was compiled, as well as a number of reports related to include file analysis. Please note that for some of these reports to work properly, the Compute Declaration Information property must be enabled at project creation. This property is needed because declarations are otherwise bypassed when dependencies are generated. As a result, the Externs report will generate an error if this property is not set. Furthermore, in the absence of this property, reports related to header files will not take declarations into account.

  1. Diagnostics by File This report produces two sub-reports:

    • Files with Errors

    • Files with Warnings

  2. Header Analysis This report produces 4 sub reports:

    • Headers not Included

    • Headers without dependencies

    • Headers included only for declarations

    • Headers included only to include other headers

  3. Externs Analysis This report produces the following two sub reports:

    • Non-local Declarations in Source Files

    • Dependencies using non-local Declarations

  4. Dependencies on Indirect Includes This report produces a 5 column report with the following fields:

    • Header File

    • Target/Provider atom in the header file (such as class, field, global data reference etc.)

    • Source Location of the dependency in the source file for the target/provider atom

    • The type of dependency

    • The path(s) for the indirect dependency

  5. Undefined Macros Produces a list of macros that weren’t defined and where those macros were referenced.

  6. Missing Include Files Produces a list of include files that were missing, the name of the file that included them and a suggestion of the file that could be considered for inclusion (in case there was an error in the include path).

  7. Cloned Files Lists files that are duplicates. Sometimes, files are copied by build systems. Occasionally, files are duplicated by developers. This report gives you a list of all the files which are identical.

C++ (with Axivion)#

The Lattix Axivion for C++ module reads the .rfg files from the Axivion Bauhaus tool. Axivion supports C/C++/C# code and produces IR files. Axivion provides a tool that converts the IR files to an RFG file, which Lattix uses to create the Lattix model.

Getting Started#

Before processing your first Axivion file, you will need to configure Lattix. Select the View->Preferences menu.
image2

In the Preferences Dialog, select Modules/C/C++ (Axivion) Defaults. Then fill in the "Axivion Home" and "Python 2.7 EXE" fields.

image3

Create Project#

To create a project:

  • Select the Embedded or All Profiles.

  • Select the C/C++ (Axivion) project type

  • Navigate to the .rfg file and add it to the project.

image4

Atom and Dependency Kinds#

Atom Kinds#

This module creates atoms of these kinds:

  • Class

  • Member

  • Variable

  • Header

  • Method

  • Routine Template

  • Routine

  • Method Template

  • Source File

  • Class Template

  • Constant

  • Type

Dependency Kinds#

This module creates dependencies between atoms of the following kinds:

  • Invokes

    • Dispatching Call

    • Static Call

    • Invokes.Virtual

  • Type

    • Extend

    • Inherit

    • Instantiate

    • Implementation of

    • Override

    • Type Reference

  • Member

    • Member Address

    • Member Set

    • Member Use

  • Variable

    • Variable Address

    • Variable Deref

    • Variable Set

    • Variable Use

  • Include

Command Line Support#

ldcupdate is a command line utility that can be used to update Lattix Projects. This functionality can be used to automatically update the Lattix Project as part of a build procedure. This section provides an example of using ldcupdate with the Axivion module.

Example#

  • ldcupdate sample.ldz -module:axivion file.rfg

Update the project sample.ldz, created from an Axivion .rfg file.

C++ (with Klocwork)#

The Klocwork module reads data generated by Klocwork’s kwbuildproject utility. Klocwork Version 10 and higher stores its data in the kwtables directory. Earlier versions of Klocwork stored the data in a MySql database. Lattix can read in data from both sources.

Model Generation from kwtables Directory (for Klocwork Version 10+)#

Introduction#

As of Klocwork version 10, Klocwork no longer stores its entity / relational data in MySql databases. The current release of Lattix Architect supports extracting Klocwork entity / relational data from the kwtables directory generated by the kwbuildproject Klocwork utility.

Run the Klocwork command:

kwinject --output specfile.out <build_command>

which creates the specfile.out file. For instance, if you build with the make command, run:

kwinject --output specfile.out make

Then run:

kwbuildproject.exe --url URL-to-klocwork-project --tables-directory output-kwtables-dir specfile.out

which creates the output-kwtables-dir directory. More information can be found on the Klocwork site.

Creating a model#

image5

  • Select New Project from the File menu

  • Set Module Type to “C/C++ Klocwork”

  • Select your kwtables directory in the Input Sources

  • Press “Add Directory”

  • Press “Create Project”

Model Generation from MySql (for Klocwork Version 9 and earlier)#

Configuring Lattix to Work with Klocwork#

In order to work with Klocwork data, you will need to obtain and install a MySql JDBC driver.

  • Download the JDBC driver from http://dev.mysql.com/downloads/connector/j/5.1.htm. The driver can also be found in Server/class/mysql-connector.jar in your Klocwork server installation.

  • Install the JAR file (called mysql-connector-java-5.1.10-bin.jar as of this writing) containing the JDBC driver into the lib/plugins directory of your Lattix installation (typically c:\Program Files\Lattix9.0\lib\plugins)

Creating a Lattix Project with the Klocwork Module#

Creating a Lattix Project with the Klocwork module is a fairly straightforward process:

  • Start Lattix and select New Project from the File menu

  • Set the module type of project to C++ (Klocwork)

  • Press the Add button

Lattix displays a dialog box that looks something like this:

image6

  • Enter information about the server containing the Klocwork database. The username for a Klocwork database is usually kw - case sensitive. Typically there is no password.

  • Once the host, port, username and password are filled out, you can expand the Project drop down list box to display the list of Klocwork projects available in the database. Select the desired project.

  • Please note that Klocwork has changed the default port from 3313 to 3306. If you are unable to connect to the Klocwork MySql database from one port, please try using the other port to connect.

  • Once the project is selected, you can select the desired build. Selecting <Newest> or not selecting a build will use the latest build as input.

  • Press OK to add the connection to the project.

  • Press Create Project to create the project.

Atom and Dependency Kinds#

Atom Kinds#

This module creates atoms of these kinds:

  • Header File

  • Source File

  • Macro

  • Abstract Class

  • Class - class, struct

  • Class Field

  • Class Method

  • Constant

  • Global Function

  • Global Variable

  • Template Class

  • Template Class Method

  • Template Function

Dependency Kinds#

This module creates dependencies between atoms of the following kinds:

  • Data

    • Read Global - Read a global variable.

    • Read Field - Read a class member field.

    • References - Data reference to a non-scalar value (e.g. a struct).

    • Modify Global - Write to a global variable

    • Modify Field - Write to a class field.

  • Class

    • Reference - Reference to a class

    • Inherit - Reference from a derived class to a base class

    • Weak Type - This dependency kind is a type reference that does not require the full definition. For example, struct MyStruct *mystructp generates a Weak Type reference that mystructp uses struct MyStruct. Similarly, class MyClass; MyClass *myclassp; produces a Weak Type reference that myclassp uses MyClass. The Weak Type dependency kind is filtered by default.

  • Method

    • Override - Reference from a method in a derived class to the method it overrides in a base class

    • Invoke - Call a function

    • Invoke Declaration - Reference to the forward declaration used to compile a function call.

    • Call Function Pointer - Reference to a variable containing a function pointer used for an indirect function call

  • Declaration

  • Macro

    • Use Macro

    • Configuration Macro

Klocwork Module Project Options#

There are several project options that control the behavior of the Klocwork module. The options can be specified at project creation time by selecting the Options tab in the Create New Project dialog or by selecting Project Properties from the Project menu and clicking on Options under C++ (Klocwork). Here are the options for the Klocwork plug-in:

  • Member Level Processing - This tells the module whether it should create atoms for the contents of the files in your Klocwork project. By default, the Klocwork module aggregates the dependencies of file contents to the file.

  • Dependencies for Members - This tells the module whether dependencies should be aggregated to the file or should be attached to the contents (members) of the file.

  • Show New Members Atoms - When creating atoms for members, the default behavior is to aggregate the created member atoms to their containing file in the DSM. Selecting Show New Members Atoms causes Lattix to expand the members and make them visible in the DSM.

  • Enable Line Number Processing - In order to reduce memory requirements, the Klocwork plug-in does not preserve line number information for atom definitions and dependencies. Enabling this option preserves line number information. This line number information can be quite useful when used in conjunction with the View Source functionality.

  • Skip symbols matching these patterns (regexp) - This option contains a list of patterns that the Klocwork module uses to exclude atoms with particular names from the DSM. Dependencies on these external atoms appear as externals in the Usage panel. The default set of skipped symbols is configured to remove dependency information for Windows header files.

Command Line Support#

ldcupdate and ldcreport are command line utilities that can be used to update Lattix projects and to generate reports from them. This functionality can be used to automatically update the Lattix project as part of a build procedure. This section provides some examples of using these tools with the Klocwork module.

Examples#

  • ldcupdate unittests.ldz -module:klc klc:/work/CppConstructs/kwtables

    Update project, unittests.ldz, created from the kwtables directory for a Klocwork 10.* project.

  • ldcupdate unittests.ldz -module:klc klocworkdatabase:host=localhost,port=3313,user=kw,project=unittest

    Create a new Lattix project, unittests.ldz, from the latest build of the unittest project in the Klocwork 9.* repository. The command line assumes that the MySQL server for the repository is running on the local machine, at the typical port, with the user name kw.

  • ldcupdate unittests.ldz -report:html

    Update the project and generate an html report in unittests.html.

  • ldcreport unittests.ldz -report:html -cycles -violations

    Creates a cycles and violations report for unittests.ldz in html format.

C++ (with Understand)#

The Lattix Understand for C++ module reads the .udb files from Scientific Toolworks’ Understand for C++. Understand for C++ is an interactive development environment (IDE) designed to help maintain and understand large amounts of legacy or newly created C and C++ source code. It works well even if the source code does not compile or if the build environment is not set up.

You can obtain an evaluation version of Understand for C++ from https://licensing.scitools.com/download

Version Compatibility#

Understand API and licensing changed significantly in build b921.

Lattix Architect version 10.9+#

Lattix Architect 10.9 supports Understand 5.0 by default. To run Architect 10.9+ with older versions of Understand, you have to switch the DLL in the install directory.

Understand build 932+

Architect is set up to work with these builds by default. Note that you must set the environment variable STI_HOME to the directory where you installed Understand, typically c:\Program Files\Scitools. To avoid doing this every time, we suggest that you set this variable for the entire system so that it is set prior to running Lattix Architect.
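For example (the paths shown are the typical defaults and may need adjusting; this is a sketch, not the only way to set the variable):

```shell
# Windows (cmd): persist STI_HOME for future sessions
setx STI_HOME "C:\Program Files\SciTools"

# Linux/macOS (bash): add to your shell profile instead,
# pointing at wherever Understand is installed
export STI_HOME="$HOME/scitools"
```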

Understand builds 919-

To use Architect with builds up to and including build 919, follow these steps prior to starting Architect:

  • Rename the jniUnderstand.dll to jniUnderstand_old.dll. These files are located in the bin/win32 or bin/win64 directories of your Lattix installation.

  • Rename the file jniUnderstand_b919-.dll to jniUnderstand.dll.

  • Start Lattix Architect after renaming the files.

Lattix Architect version 10.8.1#

Architect version 10.8.1 now comes with 2 DLLs to support the old and the new scheme. By default, Architect 10.8.1 supports builds prior to build 921.

To use Architect with builds later than 921, follow these steps prior to starting Architect:

  • Rename the jniUnderstand.dll to jniUnderstand_old.dll

  • Rename the file jniUnderstand_b921+.dll to jniUnderstand.dll.

  • Start Lattix Architect after renaming the files.

If you are using Understand build 932 or later, the Understand library may be unable to find the QT Package. Here is how to fix it:

  • The STI_HOME environment variable should be set to the directory where you installed Understand, typically c:\Program Files\Scitools on Windows. Setting STI_HOME should allow Understand to find the QT Package library.

    Note: We suggest that you set this variable for the entire system so that it is set prior to running Lattix Architect. Otherwise, you will have to set the variable every time in a command window prior to running Architect from that window.

  • If Understand is still unable to find QT Package, set QT_QPA_PLATFORM_PLUGIN_PATH environment variable.

    On Windows 64bit:
    set QT_QPA_PLATFORM_PLUGIN_PATH=%STI_HOME%\bin\pc-win64\Plugins\platforms

    On Linux 64bit

    export QT_QPA_PLATFORM_PLUGIN_PATH=$STI_HOME/bin/linux64/Plugins/platforms

Notes on version compatibility#

  • The instructions above are for Windows. For Linux and Mac versions, you will have to change the instructions slightly to correspond to the platform.

  • The Understand ‘udb’ format does change. This can lead to a version mismatch between the udb file and the Understand API library. For instance, an older Understand API library may not work with a newer version of ‘udb’. Lattix uses the API library contained within the Understand installation. This means that the ‘udb’ file must be written by the version of Understand that is installed on the system that Lattix runs on. If that is not the case, you must first load and then save the ‘udb’ file using the version of Understand on your desktop. This is the most common issue that leads to questions about compatibility.

  • Understand makes new builds available regularly, typically every week. Lattix also makes new builds available regularly, typically once a month. It’s not possible for us to test every build of Lattix with every build of Understand (and there is certainly no way to test against future builds). When we say we have tested a certain version of Lattix against a certain version of Understand, we mean that we have tested at least one build of Lattix against one build of Understand for each of those versions.

Configure Architect to Work With Understand for C++#

The Understand module uses a library from Scientific Toolworks to read Understand database files (*.udb). Installing Understand 2.0 installs the library and alters the system PATH environment variable to include the Understand installation.

Windows: The Understand library is called udb_api.dll and is typically found in the C:\Program Files\SciTools\bin\pc-win32 directory. If, after installing Understand, Architect reports that the API library is missing, try logging out and logging back into your Windows session.

  1. Make sure the environment variable STI_HOME points to the installation directory for the Understand application. The Lattix scripts use this variable to find the udb_api.dll

  2. Please ensure that the PATH environment variable contains the directory C:\Program Files\SciTools\bin\pc-win64 or C:\Program Files\SciTools\bin\pc-win32.

  3. Note that 64-bit Lattix works with 64-bit Understand, and 32-bit Lattix works with 32-bit Understand. You cannot use the 64-bit version of Lattix with the 32-bit version of Understand, or vice versa.

Linux: The Understand library is called libudb_api.so and is found in the bin/pc-elf directory of your Understand installation. You can specify the location of your Understand installation by setting the STI_HOME environment variable to point to the top of your Understand installation. Alternatively, the shell scripts that start up Architect (e.g. lattixarchitect.sh) will attempt to locate your Understand installation. The scripts cache the results of this search, so the search is conducted only once.

Mac OSX: Lattix Architect looks for the Understand installation in /Applications/scitools. If you installed Understand in a non-standard place, simply create a symbolic link from /Applications/scitools to your installation of Understand:

ln -s /your/understand/path /Applications/scitools

Please note that older versions of Understand used a different file organization. As a result, with older versions of Understand (builds prior to Build 858), you may need to edit your Lattix installation configuration. Edit the file “/Applications/Lattix Architect 10.5.app/Contents/Info.plist”. Replace all occurrences of the strings “/Applications/Understand.app/Contents/MacOS/C” and “/Applications/Understand.app/Contents/MacOS” with the string “/Applications/scitools/bin/macosx”.

Create a Project from a UDB File#

Once you have generated your UDB file, start LDM.

  • Select New Project from the File menu in LDM.

  • Select C/C++ (Understand) from the Module Type drop-down.

  • Navigate to and select the UDB file in the file tree.

  • Click on Add File (or double click) to add the UDB file to the project.

  • Click on Create Project.

image7

New Project

The Understand module can read .udc files (from Understand 1.4) and .udb files (from Understand 2.0). The UDC Module also supports .ldi files, which allow you to augment the Understand data for elements in your LDM.

Atom Types#

The DSM is composed of atoms and dependencies. Here are the kinds of atoms generated by the UDC module:

  • Header File is typically a .H file. Header files usually contain macros, class definitions and function prototypes.

  • Source File is typically a .C, .CPP or .CXX file. Source files contain executable code.

  • Class is a C++ class, struct, union, enum or typedef.

  • Interface is a C++ class with only pure virtual methods.

  • Data is either a global variable, a class member or an enum value.

  • Method is either a global function or a member function.

  • Macro is a lexical macro (i.e. #define).

Dependency Kinds#

Dependency kinds can be used to filter the dependencies visible in the DSM. The UDC module supports these dependency kinds:

  • Include dependencies are generated when a source file uses #include to add an include file to the compilation stream.

  • Data is a reference to a variable or member. It has three sub-kinds:

    • Global is a reference to a global variable.

    • Member is a reference to a class or struct member.

    • Enum is a reference to an enum member.

  • Class is a reference to a class, struct, union, enum or typedef. It has these sub-kinds:

    • Reference is the catch-all sub-kind.

    • Inherit indicates that the source is a sub-class of the target/provider.

    • Friend indicates that the source declares the target/provider as a friend.

    • Weak Type indicates a reference that does not require full knowledge of the target/provider class. For example:
      class Foo;
         ...
      void myFunc(Foo *x) {
         ...
      }
      

      The “Foo *” in the function signature of myFunc generates a Weak Type dependency: myFunc uses Foo.

  • Method is a dependency on a method. It has these sub-kinds:

    • Override is a dependency between a method and the base class method it overrides.

    • Invoke is a dependency that indicates one method calls another.

  • Declaration is a dependency between a declaration and an implementation / instantiation. Typically, this is some sort of forward declaration. In the example above, the “class Foo;” line would generate a declaration dependency between the file containing the “class Foo;” and the file that contains the definition of class Foo. The UDC module also generates Declaration dependencies between function forward declarations and the implementations.

  • Macro is a dependency on a lexical macro (#define). It has these sub-kinds:

    • Use Macro is the catch-all for macro references.

    • Configuration Macro is a reference to a macro that is used to control the configuration of a header file. For example:

    myfile.cpp:
    
    #define WINDOWS_LEAN_AND_MEAN
    #include <windows.h>
       ...
    windows.h:
    #ifndef WINDOWS_LEAN_AND_MEAN
        : do something
    #endif
    

    Normally, this construct would show a dependency of windows.h on myfile.cpp. Such a dependency actually detracts from architectural analysis, so Configuration Macro dependency kinds are filtered by default.

  • in Macro Definition is a reference from one macro to another:

#define N1 10
#define N2 (2 * N1)

Here, N2 uses N1 with the “in Macro Definition” dependency kind.

Options#

There are several configuration options for the Understand module. You can configure these options in the Project Properties dialog box accessible by selecting Project Properties from the Project menu.

Enable Line Number Processing adds source file line number information for dependencies and member objects. LDM uses this information to navigate to the definitions of DSM elements and source of a dependency in the source code using Understand.

Exclude non-include header file dependencies removes all dependencies from header files that are not caused by a #include preprocessor command.

Fold Unnamed Types into Typedefs: Constructs such as these create unnamed types:

typedef struct {
    int x;
} MYSTRUCT;

The Understand for C++ module contains an option for “folding” the unnamed structure into the typedef that follows it. This option is enabled by default. References to the structure member x will refer to MYSTRUCT.x in the LDM.

Filter duplicate global elements removes global function and data entities whose names are not unique. This can sometimes be helpful in removing inaccurate dependencies caused by defining functions or global variables of the same name in multiple compilation contexts and then creating a single UDB file that spans both contexts.

Visibility Filter uses lexical visibility to filter erroneous references sometimes generated by Understand when the Understand project spans multiple compilation units. A target (also known as a provider) is lexically visible from a source (also known as a consumer) if the source and target are in the same file or the file containing the source #include’s a file that contains a definition for the target.

Use visibility to resolve references to multiply defined symbols - When an Understand project spans multiple compilation units, it sometimes generates erroneous references to multiply defined symbols. Using lexical visibility can sometimes remap these erroneous references.

Ignore include references to non-unique header files - When an Understand project spans multiple compilation units, there can be multiple occurrences of a header file with the same name. Sometimes Understand generates an erroneous reference to the wrong one of these files.

Scan namespace for duplicates - With this option enabled, the UDC module tracks duplicate symbol names and filters or corrects references generated to them.

Skip symbols matching these patterns filters out symbols whose names match a user specified list of regular expressions. The match is case sensitive.

Skip these files filters out files by base file name. For example, to remove all stdafx.h files from your model, add stdafx.h to the list.

Enable File Filtering must be enabled to filter the files specified in Skip these files.

Project Options and Large Understand Projects

You can use the project options to reduce the amount of memory / processing time that LDM requires to process a large Understand project. The default configuration of the options is designed to improve the fidelity of the LDM, particularly with respect to Understand projects that span multiple compilation units.

If you are running out of memory building an LDM from a UDB file, you might try:

  • disabling Visibility Filter and Use visibility…

  • disabling Scan namespace for duplicates

Mapping Dependencies to Code#

You can navigate to the dependency in Understand directly from the DSM or CAD.

After you’ve loaded your model into Architect, you will partition it to determine the architectural layering of your system. Consider Apache:

image8

This image shows a proposed architectural layering. The highlighted “12” is a set of dependencies that violate the layering. The usage pane looks like this:

image9

This image shows the 12 dependencies that violate the layering in the DSM. If you select http_core.c, the Uses pane shows that something in protocol.c is using a global variable defined in http_core.c:

image10

Now the question is, “what’s in protocol.c that uses a global in http_core.c?” To investigate, we’ll expand protocol.c and http_core.c to member level.

  • Right mouse on http_core.c and select Expand Members

  • Right mouse on protocol.c and select Expand Members

The member information is displayed and we can see that ap_read_request, defined in protocol.c, uses ap_http_input_filter_handle, defined in http_core.c.

Understand Integration

We can research this further using Understand.

  • Load up the UDB file in Understand

  • Switch back to Architect

  • Right mouse on the dependency (ap_http_input_filter_handle in the example) and select View Source from the menu.

If this is the first time that you have used the view source command, Architect will ask you to select the editor:

image11

Architect attempts to locate Understand. On Windows, if you’ve installed Understand in the default place, Architect will build a command line that facilitates integration with Understand.

  • Select understand-cpp and press OK

Understand will display the line in protocol.c that references ap_http_input_filter_handle.

  • Right mouse on ap_http_input_filter_handle in the Understand editor window and select View Information from the context menu.

Understand’s information panel displays information about ap_http_input_filter_handle:

image12

This shows that ap_http_input_filter_handle is defined in http_core.c and has a Use reference in protocol.c.

Large Model Strategy

In the event that a model is too big to expand its members, you can still use the View Source command to point Understand at the code where a reference occurs. If we had used View Source on http_etag.c, we could see the point in the code that caused that reference, as long as the Enable Line Number Processing option was enabled.

Integrating with LDC#

LDC is the Lattix command line program which can be run as part of the build process for continuous monitoring of the architecture. LDC takes the UDB file generated by Understand as input. Understand also provides a command line tool, called und, for creating and updating the udb file. You can instrument your build so that you first run und and then run LDC. und is documented in the Understand User Guide and Reference Manual. Help can also be obtained by running und with the -help switch. LDC is documented in the User Manual.

Here is an example sequence of commands for updating a project:

set path="C:\Program Files\Lattix5.0.6\bin";%PATH%
set path="C:\Program Files\SciTools\bin\pc-win32";%PATH%
und -db httpd-2210.udb analyze -all
ldcupdate httpd-2210.ldz -deltaTags
ldcreport  httpd-2210.ldz -impact -source:UpdateDelta -closure -report:html

The und line updates the Understand dependency database httpd-2210.udb by re-parsing the source files that were used to create the project.

The ldcupdate line updates the Lattix project by re-parsing the dependency information in httpd-2210.udb. It also marks everything that is new or changed with the tag “UpdateDelta”.

The ldcreport then generates an impact report on all the elements that are tagged “UpdateDelta”.

Example - Apache Server#

This example will walk you through the process of creating a dependency model for two different versions of the apache server starting from the source code. This consists of three steps:

  1. Download the source code.

  2. Generate the UDB database using Understand

  3. Create a dependency model using Lattix Architect

Note that you can skip the first two steps if you are already familiar with the process of creating a UDB file. If that is the case, then you can simply download the UDB file that we have already created.

Download the source code for Apache Server 2.0.55#

Download source code for Apache Server 2.0.55 from the following url:

http://archive.apache.org/dist/httpd/

Unzip and extract the files. The files will be extracted into the directory httpd-2.0.55, which will be a sub-directory of the directory where you choose to extract the files. This is the source for Apache Server version 2.0.55 for Windows. It is 230K lines of code distributed in 340 source files and 165 header files. It does not contain the standard include files which are part of the compiler.

Generate the UDB file from Understand#

Follow these steps to generate the UDB file for Apache Server 2.0.55:

  1. Run Understand

  2. Select File->New Project and specify the name httpd2055 in the New Project Dialog and click on Open.

  3. A Project Configuration Dialog will open. Specify the httpd-2.0.55 directory in the Directory field. Now click on Add.

  4. Click on Save to start up the parsing. Since the source code does not include standard header files, you will see the Missing Include File Dialog. Select “stop warning about missing include files” and click on Okay.

Load the UDB file into Lattix#

Follow these steps to load the UDB file for Apache Server 2.0.55 into Lattix:

  1. Start Lattix Architect and bring up the New Project Dialog

  2. Set the Module Type to Understand for C++.

  3. In Input Sources, select the UDB file, httpd2055.udb, that you just created using Understand for C++ and click on Add.

  4. Now click on Create Project and a progress bar will appear indicating that the file is being loaded in. The file should load within 10-20 seconds.

Using Lattix Architect#

This section presents a cursory outline of some of the analysis capabilities. For a more detailed treatment, we suggest that you read the Lattix white papers and the tutorial.

image13
Initial DSM

Now partition the DSM and you will get a sense of the structure. You will also notice a few dependencies that appear to violate the architectural intent.

image14
Partitioned DSM

The re-ordered DSM moves srclib to the bottom. srclib contains the portable runtime and other utilities and provides services to the rest of the system.

You can click on an individual cell and see the dependencies associated with that cell in the usage pane. Notice that even though the Apache server is modularized to allow new modules to be added to it, the server is actually coupled to the httpd module.

Exploring Member Level#

The initial DSM has files as the leaf nodes. You can also explore a file down to its methods and data members. Simply select a partition, right click and choose “Mark Member Level Dependencies”. If you want to see member level dependencies for the entire project, simply select $root and expand to member level.

image15
Files are Expanded down to their Methods and Data members

Other Functionality#

Lattix provides a variety of useful functionality to manage the architecture. Some of these include:

Filtering Dependencies: Lattix Architect classifies each dependency based on its dependency kind. You can select View->Filter Dependencies to bring up the Filter Dependencies Dialog. You can then specify which kinds of dependencies you want to display. For instance, if you want to look at method call dependencies, you can do that easily. You can also filter by source/consumer and target/provider of dependencies. Thus, if you want to see the dependencies that include files have on other include files, you can do that easily as well.

Conceptual Architecture: You can create a conceptual architecture that displays the decomposition in a box-in-a-box diagram. The layout of the boxes is used to represent the layering and independent components.

Reporting: A variety of reports can be generated using Tools->Reports. Many output formats are supported including xml, html, xls, text and csv.

Conformance Checking: You can specify architectural rules and then verify them at build time using a command line utility (LDC). You can also use LDC to automate reporting and publishing of the results to an intra-net web site.

View Source: You can view source for any of the subsystems directly from Lattix Architect. On Windows, Lattix will invoke the notepad program to display the source. However, you can configure Lattix Architect to invoke Understand for C++ directly from the View Source menu.

C++ (with Parasoft)#

The Lattix Parasoft for C++ module takes the .ldi.xml files output from Parasoft C/C++test. Parasoft C/C++test has an available feature to output Lattix LDI files that can be read into Lattix for architectural processing. Parasoft produces a “deps” directory with all of the individual .ldi.xml files, which can then be loaded into Lattix. If you are using the Lattix Architect GUI, you can input the “deps” directory directly into Lattix.

Parasoft C/C++test Standard#

This command will generate the *.ldi.xml files from Parasoft:

cpptestcli -compiler gcc_9-64 -config "builtin://Export Code Dependency Data" -input cpptestscan.bdf -module .

Parasoft C/C++test Professional#

cpptestcli -data path_to_workspace -config "builtin://Export Code Dependency Data"  -localsettings local-settings  -appconsole stdout -bdf cpptestscan.bdf
Note: path_to_workspace should point to an existing directory outside of the current directory. It is the Eclipse workspace that will be used by Parasoft.

In Professional, to control where the dependency data is located, you can add a line to local-settings. This specifies a link to another file named dependency.properties:

cpptest.advanced.settings.file=dependency.properties

And the dependency.properties property file should specify the output directory

cpptest.analyzer.dependency.report.location=output_directory

To create a Lattix project from the *.ldi.xml files, try this on the command line:

For Standard (the data created by Parasoft should be in the “reports” directory)

ldcupdate project_parasoft.ldz -module:parasoft deps:reports

For Professional

ldcupdate project_parasoft.ldz -module:parasoft deps:output_directory

The output_directory is the same directory specified in the dependency.properties file above.