
50.003 - Specification-based Integration Testing and System Testing

Learning Outcomes

  1. Identify the difference between decomposition-based integration testing and call graph-based integration testing.
  2. Conduct integration tests using Jest.
  3. Derive system test cases based on the use case documentation.
  4. Develop testing strategies based on different software development life cycles.

Levels of testing

Recall

Design abstraction level       Testing level
Requirement Specifications     System testing
Preliminary Design             Integration testing
Detailed Design                Unit testing
  • Unit testing - test the individual smallest units of the system.
  • Integration testing - test related units/subcomponents of the system (related according to the system design).
  • System testing - test the system as a whole, often according to the use cases.

In this unit, we study integration testing and system testing.

Integration Test

There are two main approaches to performing integration tests:

  • Decomposition-based testing
  • Call graph-based testing

Decomposition-based testing

In Decomposition-based integration testing, we follow the modular structure of the system design.

graph
  Program-->Function1
  Program-->Function2
  Program-->Function3
  Function1-->SubFunction1
  Function1-->SubFunction2
  Function2-->SubFunction3
  Function2-->SubFunction4
  Function3-->SubFunction5
We perform the integration test by following the structure. There are two possible directions:

  1. Top-down integration testing
  2. Bottom-up integration testing

Top-Down Integration testing

In top-down decomposition-based integration testing, we mock all the sub-components below the main program, and test the main program.

graph
  Program-->Function1*
  Program-->Function2*
  Program-->Function3*
  Function1*-->SubFunction1*
  Function1*-->SubFunction2*
  Function2*-->SubFunction3*
  Function2*-->SubFunction4*
  Function3*-->SubFunction5*
We put an asterisk to denote that a function is mocked. After the main program has been tested against the mocked functions, we start replacing the mocked code with the actual code, starting from the first left child and proceeding to the bottom-right child in breadth-first search order.

graph
  Program-->Function1
  Program-->Function2*
  Program-->Function3*
  Function1-->SubFunction1*
  Function1-->SubFunction2*
  Function2*-->SubFunction3*
  Function2*-->SubFunction4*
  Function3*-->SubFunction5*

The rationale of the top-down integration test is that when we encounter an error, the error must have been caused by the integration of the newly unmocked code.
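As a plain-Node sketch of the first step (all names are hypothetical, not the real system components), everything below the main program is replaced by stubs, so a failing assertion points at Program's own integration logic:

```javascript
// Top-down sketch: Program receives its children as parameters, so the
// mocked (asterisked) versions can stand in for the real ones.
function program(function1, function2, function3) {
    // Program's own integration logic: combine the children's results.
    return [function1(), function2(), function3()].join(' | ');
}

// Stubs for the yet-to-be-integrated children: canned values, no real logic.
const function1Stub = () => 'f1-stub';
const function2Stub = () => 'f2-stub';
const function3Stub = () => 'f3-stub';

// Test Program against the stubs only.
console.assert(
    program(function1Stub, function2Stub, function3Stub) === 'f1-stub | f2-stub | f3-stub',
    'program failed against stubbed children'
);
```

As each real child is completed, it replaces its stub in the call above, so any newly failing assertion is attributable to that newly integrated child.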

Bottom-up Integration testing

Bottom-up decomposition-based integration testing starts from the bottom-left-most (or right-most) leaf function. We test the leaf functions by reusing the unit test code (which we now call the driver code).

graph
  Program*-->Function1*
  Program*-->Function2*
  Program*-->Function3*
  Function1*-->SubFunction1
  Function1*-->SubFunction2
  Function2*-->SubFunction3
  Function2*-->SubFunction4
  Function3*-->SubFunction5
Note that in the above diagram, the components marked with an asterisk are yet to be integrated in the integration test. We then move up the structure by integrating the parents of the leaf functions (again reusing the unit test code). We repeat the process until we reach the top.

graph
  Program*-->Function1
  Program*-->Function2
  Program*-->Function3
  Function1-->SubFunction1
  Function1-->SubFunction2
  Function2-->SubFunction3
  Function2-->SubFunction4
  Function3-->SubFunction5

By doing so, we need not mock any code, as we can reuse (or modify) the unit test code as drivers.
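Continuing the plain-Node sketch (the names are again hypothetical), the real leaves are exercised first by a driver, and the real parent is then integrated on top of them:

```javascript
// Real leaf functions -- the bottom of the decomposition tree.
function subFunction1() { return 2; }
function subFunction2() { return 3; }

// Driver: a repurposed unit test exercising the leaves directly.
console.assert(subFunction1() === 2, 'subFunction1 failed');
console.assert(subFunction2() === 3, 'subFunction2 failed');

// Next level up: the real parent, integrated with the already-tested leaves.
function function1() { return subFunction1() + subFunction2(); }
console.assert(function1() === 5, 'function1 integration failed');
```

A failure in the last assertion implicates the parent's integration logic, since the leaves have already passed their own driver checks.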

Limitation

The limitation of decomposition-based testing is that the structure is defined by the lexical structure of the source code (the definition structure), which often does not reflect the execution and function-call relations.

For example, recall that in our Echo App (the RESTful API version), we have two major components. Following the code structure, we have:

graph
  app-->EchoRouter
  app-->MessageModel
  EchoRouter-->get.all
  EchoRouter-->post.submit
  MessageModel-->all
  MessageModel-->insertOne
  MessageModel-->insertMany

However, we find that app hardly ever calls MessageModel directly. In most situations, the call sequence is app-->EchoRouter-->MessageModel.
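A toy illustration of the mismatch (with hypothetical functions, not the Echo App code): lexically, all three functions sit side by side in one module, but at run time main() only reaches helper() through middle():

```javascript
// Definition structure: helper, middle and main are siblings in the module.
function helper() { return 'data'; }
function middle() { return helper().toUpperCase(); }
function main()   { return middle(); }  // main never calls helper directly

// The run-time structure is the chain main -> middle -> helper, which a
// decomposition-based plan (treating the three as siblings) does not capture.
console.assert(main() === 'DATA', 'main/middle/helper chain failed');
```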

Call-graph based integration

To address this issue with decomposition-based integration, we define the integration structure by following the call graph. For instance, here is the call graph of our Echo App:

graph
  app-->EchoRouter
  EchoRouter-->get.all
  EchoRouter-->post.submit
  get.all-->MessageModel.all
  post.submit-->MessageModel.all
  post.submit-->MessageModel.insertMany
  MessageModel.insertMany-->MessageModel.insertOne
Note that insertOne() is never called directly by the routes; it is reached only through insertMany().

Now we can apply similar top-down or bottom-up integration test strategies.

Example

We reuse the my_mysql_app developed in the earlier units. We start by installing jest and supertest in the project:

npm i jest supertest

and modify package.json to change

  "scripts": {
    "start": "node ./bin/www"
  },
to
  "scripts": {
    "start": "node ./bin/www",
    "test": "jest --detectOpenHandles"
  },

The test entry is the new addition (JSON does not allow comments, so it cannot be annotated in place). The --detectOpenHandles flag, which also appears in the test output later, asks Jest to report handles, such as open database connections, that keep the process from exiting.

Then we create a sub-folder __test__ under the project root folder, with a models sub-folder inside it to hold the test files.

Now we should have a project folder structure as the following

.
├── __test__
│   └── models
├── app.js
├── bin
│   └── www
├── models
│   ├── db.js
│   └── message.js
├── package.json
├── public
│   ├── images
│   ├── javascripts
│   └── stylesheets
│       └── style.css
├── routes
│   ├── index.js
│   ├── echo.js
│   └── users.js
└── views
    ├── error.ejs
    └── index.ejs
Unit Testing Model message.all

First we define a unit test on models/message.js's function all().

In the __test__/models folder we add a test file message.test.js with the following content:

const db = require('../../models/db.js');
const message = require('../../models/message.js');

async function setup() {
    try {
        await db.pool.query(`
            DELETE FROM message;`
        );
        await db.pool.query(`
            INSERT INTO message (msg, time) 
            VALUES ('msg a', '2009-01-01:00:00:00'),
                   ('msg b', '2009-01-02:00:00:00')
        `);
    } catch (error) {
        console.error("setup failed. " + error);
        throw error;
    }
}

async function teardown() {
    try {
        await db.pool.query(`
            DELETE FROM message;`
        );
        await db.cleanup();
    } catch (error) {
        console.error("teardown failed. " + error);
        throw error;
    }
}

describe("models.message.all() tests", () => {
    beforeAll(async () => {
        await setup();
    });
    test ("testing message.all()", async () => {
        const expected = [ new message.Message('msg a', new Date('2009-01-01:00:00:00')), 
                           new message.Message('msg b', new Date('2009-01-02:00:00:00'))];
        // Await the promise so the assertion runs before the test completes;
        // an unawaited .then() would let the test pass even if the assertion fails.
        const result = await message.all();
        expect(result.sort()).toEqual(expected.sort());
    });
    afterAll(async () => {
        await teardown();
    });
})

The setup and teardown functions define the set-up and tear-down routines of this test suite. Note that in an actual project, you might consider backing up and restoring the actual table data in the setup and teardown functions.

In the test suite, we define only one test.

When we run

npm run test message.test.js

we see

> my-mysql-app@0.0.0 test
> jest --detectOpenHandles message.test.js

  console.log
    2

      at Object.log (models/message.js:36:17)

 PASS  __test__/models/message.test.js
  models.message.all() tests
     testing message.all() (3 ms)

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   0 total
Time:        0.496 s, estimated 1 s
Ran all test suites matching /message.test.js/i.
Integration Test with Echo Router and Model message.all

Next we perform a bottom-up integration test by integrating the path from get.all to message.all().

graph
  app-->EchoRouter
  EchoRouter-->get.all
  get.all-->MessageModel.all

In the __test__/models folder we define a new test file echo.test.js with the following content:

const db = require('../../models/db.js');
const message = require('../../models/message.js');
const request = require('supertest')
const app = require('../../app');

async function setup() {
    try {
        // TODO backup the existing data to a temp table?
        await db.pool.query(`
            DELETE FROM message;`
        );
        await db.pool.query(`
            INSERT INTO message (msg, time) 
            VALUES ('msg a', '2009-01-01:00:00:00'),
                   ('msg b', '2009-01-02:00:00:00')
        `);
    } catch (error) {
        console.error("setup failed. " + error);
        throw error;
    }
}

async function teardown() {
    // TODO restore the table from the backup;
    try {
        await db.pool.query(`
            DELETE FROM message;`
        );
        await db.cleanup();
    } catch (error) {
        console.error("teardown failed. " + error);
        throw error;
    }
}

describe("routes.echo endpoint integration tests", () => {
    beforeAll(async () => {
        await setup();
    });
    test ("testing /echo/all", async () => {
        const res = await request(app).get('/echo/all');
        const expected = [ new message.Message('msg a', new Date('2009-01-01:00:00:00')), 
                           new message.Message('msg b', new Date('2009-01-02:00:00:00'))]
        expect(res.statusCode).toEqual(200);
        const json = JSON.parse(res.text);
        const received = [];
        for (let i in json) {
            received.push(new message.Message(json[i].msg, new Date(json[i].time)))
        }
        expect(received.sort()).toEqual(expected.sort());
    });
    afterAll(async () => {
        await teardown();
    });

})

The setup and teardown routines are similar to those of the unit test for message.all(). The only difference is that in the test case, we initiate the call from the app level, which triggers the router handler for the URL path /echo/all. We then extract the text returned by the handler and parse it back into a JSON object. Finally, we compare the received results (constructed from the JSON) with the expected results.

When we run

npm run test echo.test.js

we see

> my-mysql-app@0.0.0 test
> jest --detectOpenHandles echo.test.js

  console.log
    2

      at Object.log (models/message.js:36:17)

GET /echo/all 200 28.616 ms - 101
 PASS  __test__/models/echo.test.js
  routes.echo endpoint integration tests
     testing /echo/all (66 ms)

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   0 total
Time:        0.678 s, estimated 1 s
Ran all test suites matching /echo.test.js/i.

Cyclic call-graph

In case the call graph contains cycles, e.g. due to mutual recursion, we have to test each strongly connected component as a unit.
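For example, two mutually recursive functions form a strongly connected component: neither can be integrated without the other, so the pair must be tested together (a minimal sketch with hypothetical functions):

```javascript
// isEven and isOdd call each other (mutual recursion); in the call graph
// they form a cycle, so they can only be integration-tested as one unit.
function isEven(n) { return n === 0 ? true  : isOdd(n - 1); }
function isOdd(n)  { return n === 0 ? false : isEven(n - 1); }

console.assert(isEven(10) === true,  'isEven failed');
console.assert(isOdd(7)   === true,  'isOdd failed');
console.assert(isEven(3)  === false, 'isEven failed on odd input');
```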

Pairwise testing

Besides the top-down and bottom-up strategies, an alternative is to perform pairwise testing. The idea is to test each edge of the call graph. Similar to the bottom-up strategy, we could convert the unit tests for individual units into test drivers for every pair, saving some mock-up effort. One advantage of pairwise testing is a higher degree of fault isolation.
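As a minimal sketch (using stand-ins for the MessageModel functions, not the real implementation), testing the single edge insertMany --> insertOne means driving insertMany directly and checking the calls it makes to insertOne, with the rest of the graph left out:

```javascript
// Stand-in for insertOne that also records its calls, so the test can
// verify the caller-callee interaction along this one edge.
const inserted = [];
function insertOne(msg) { inserted.push(msg); return 1; }

// Caller under test for this edge.
function insertMany(msgs) {
    let count = 0;
    for (const m of msgs) { count += insertOne(m); }
    return count;
}

// Driver for the edge insertMany -> insertOne; a failure here isolates
// the fault to this pair of functions.
console.assert(insertMany(['msg a', 'msg b']) === 2, 'insertMany count wrong');
console.assert(inserted.length === 2 && inserted[0] === 'msg a', 'insertOne calls wrong');
```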

System Testing

System testing is often less formal compared to unit testing and integration testing. System test cases can be derived from:

  • The use case documents and use case diagrams
  • The sequence diagrams
  • The state machine diagrams

For instance, given the following use case document:

Use case ID UC 1
Use case name Create New Message
Objective The user creates a new message which will be stored in the Echo app DB
Pre-conditions nil
Primary Actor User
Secondary Actor nil
Normal Flow 1. User navigates to https://localhost:3000/echo/
2. User enters a new text message and submits
3. Echo App receives the message and inserts it into the database
4. Echo App returns the list of all messages in the database and display in the UI
Alternative Flow
Post-conditions nil

We can define a system test case as follows

Test case ID TC 1
Test case name Create New Message
Objective The user creates a new message which will be stored in the Echo app DB
Pre-conditions nil
Event Sequence
Input User navigates to https://localhost:3000/echo/
Output The new message form is displayed
Input User enters a new text message and submits
Output The list of messages in the system is returned and rendered, which includes the newly submitted message.
Post-conditions nil

Life-cycle based testing

In this section, we discuss how to incorporate the testing activities into the software development life-cycle.

Waterfall testing

In the Waterfall software life-cycle, we can easily incorporate the testing activities as the last few phases.

graph
  A("Requirements Specification") --> B
  B("Analysis") -->  C
  C("Design") --> D
  D("Coding") --> E
  E("Unit Testing") --> F
  F("Integration Test") --> G("System Test")

As highlighted in the earlier lesson, Waterfall testing, as part of the waterfall development life-cycle, suffers from the long feedback interval issue.

Iterative Life Cycle testing

In the Iterative Software Development Life Cycle, we break the system into parts/levels of components and stage their development across different iterations.

graph
  Z("Previous Iteration") --> A
  A("Specification") --> B
  B("Analysis") -->  C
  C("Design") --> D
  D("Coding") --> E
  E("Unit Testing") --> F
  F("Integration Testing") --> G
  G("Regression Testing") --> H
  H("Progression Testing") --> I("Next Iteration")

In terms of testing, we follow a structure similar to waterfall testing within each iteration, except that towards the end, we conduct regression testing and progression testing instead of a system test.

  • Regression testing - re-test the test cases defined and passed in the previous iterations.
  • Progression testing - pre-test the test cases defined for the upcoming iterations; some of them are expected to fail.

Agile Testing

Recall that in Agile development, the development plans are engineered to emphasize:

  • Customer-driven
  • Bottom–up development
  • Flexibility with respect to changing requirements
  • Early delivery of fully functional components

Agile development is often divided into sprints. In each sprint, the development team liaises with the project users to identify the deliverables for that particular sprint. On the testing side, testing must be aligned with the user story development for each sprint.

graph
  A("Customer Expectation")-->B
  B("Iteration Plan")-->C
  C("User Story")-->D
  D("Design")-->E
  E("Coding")-->F
  F("Unit Testing")-->G
  G("Integration Testing")-->H
  H("Regression Testing")-->C

Further Reading

  1. https://lambtsa.medium.com/rest-api-with-express-router-jest-and-supertest-10832a23016f
  2. https://medium.com/geekculture/testing-express-js-with-jest-8c6855945f03