NTISthis.com

Evidence Guide: ICAT5079B - Perform integration test

Student: __________________________________________________

Signature: _________________________________________________

Tips for gathering evidence to demonstrate your skills

The important thing to remember when gathering evidence is that the more evidence the better. The more evidence you gather to demonstrate your skills, the more confident an assessor can be that you have learned the skills not just at one point in time, but are continuing to apply and develop them (as opposed to just learning for the test!). Furthermore, a single piece of evidence will not usually demonstrate all the required criteria for a unit of competency, whereas multiple overlapping pieces of evidence usually will.


ICAT5079B - Perform integration test

What evidence can you provide to prove your understanding of each of the following criteria?

Prepare for test

  1. Prepare the test environment
  2. Prepare the test scripts (online test) or test run (batch test) for running
  3. Review expected results against test and acceptance criteria
  4. Confirm pre-existing modules and compile modification logs
  5. Perform static tests of each point of integration and verify correctness of arguments, positional parameters and return values in each integration suite (see the sketch after this list)
  6. Review results of earlier component testing and ensure critical issues are identified and taken into account
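
Criterion 5 above calls for static tests of each point of integration. One hedged way of doing this in Python is sketched below; the module, function and parameter names are invented for illustration and are not part of the unit. Without executing the integrated behaviour, the check confirms that a called component's arguments, positional parameters and return value still match what the calling component expects:

    import inspect

    # Stand-in for the entry point of a pre-existing module; in practice you
    # would import the real component under test rather than defining it here.
    def calculate_total(order_id: str, region: str, *, discount: float = 0.0) -> float:
        ...

    def test_calculate_total_signature():
        """Static check of one integration point: arguments, positional
        parameters and return value agree with what the caller expects."""
        sig = inspect.signature(calculate_total)
        params = list(sig.parameters)
        assert params[:2] == ["order_id", "region"]       # positional parameters, in order
        assert sig.parameters["discount"].default == 0.0  # keyword default unchanged
        assert sig.return_annotation is float             # agreed return value type

A check of this kind can sit alongside the compiled modification logs (criterion 4), so that signature drift in a pre-existing module is caught before any dynamic integration run.
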
Prepare the test environment

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Prepare the test scripts (online test) or test run (batch test) for running

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Review expected results against test and acceptance criteria

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Confirm pre-existing modules and compile modification logs

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Perform static tests of each point of integration and verify correctness of arguments, positional parameters and return values in each integration suite

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Review results of earlier component testing and ensure critical issues are identified and taken into account

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Conduct test

  1. Select appropriate test tools
  2. Run test scripts and document the results against the software life cycle model
  3. Ensure that memory leakage, global name space pollution and static variables are specifically addressed for each integration unit in line with test and acceptance criteria (see the sketch after this list)
  4. Follow and adopt integration standards where appropriate in line with quality benchmarks
  5. Compare test results to requirements on completion of each integration component
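
Criterion 3 in this list singles out memory leakage, global name space pollution and static variables. A minimal sketch of how those checks might be run in Python follows; the integration call, the loop count and the byte budgets are assumptions made for illustration, not figures taken from this unit:

    import tracemalloc

    # Stand-in for the real integration call; a small pure function keeps the
    # sketch self-contained and runnable.
    def integrate_once(payload):
        return {key: value.strip() for key, value in payload.items()}

    def test_integration_unit_leak_checks():
        """Repeat the integration call and confirm that memory use and the
        global name space stay within the agreed limits."""
        globals_before = set(globals())

        tracemalloc.start()
        for _ in range(1_000):
            integrate_once({"name": "  Ada  ", "role": " tester "})
        current, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()

        assert set(globals()) == globals_before  # no global name space pollution
        assert current < 100_000                 # bytes still held afterwards (leak budget is an assumption)
        assert peak < 2_000_000                  # transient use stays under the assumed ceiling

The same pattern extends to module-level (static) variables: snapshot the module's state before the run and assert that it is unchanged, or properly reset, afterwards.
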
Select appropriate test tools

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Run test scripts and document the results against the software life cycle model

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Ensure that memory leakage, global name space pollution and static variables are specifically addressed for each integration unit in line with test and acceptance criteria

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Follow and adopt integration standards where appropriate in line with quality benchmarks

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Compare test results to requirements on completion of each integration component

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Analyse and classify results

  1. Summarise and classify test results and highlight areas of concern (see the sketch after this list)
  2. Compare the test results against the requirements and design specification and prepare report
  3. Notify operations of completion of the testing
  4. Ensure attendees' details/comments are logged and signatures gained
  5. Schedule a feedback meeting to discuss report and possible next actions with stakeholders if necessary
  6. Ensure test reporting complies with documentation and reporting standards
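
For criterion 1, a hedged sketch of summarising and classifying results is shown below; the record fields, case identifiers and severity labels are invented for illustration only:

    from collections import Counter

    # Hypothetical result records as they might be exported from a test run log.
    results = [
        {"case": "IT-01", "status": "pass"},
        {"case": "IT-02", "status": "fail", "severity": "critical"},
        {"case": "IT-03", "status": "fail", "severity": "minor"},
        {"case": "IT-04", "status": "pass"},
    ]

    summary = Counter(record["status"] for record in results)
    concerns = [r for r in results if r["status"] == "fail" and r.get("severity") == "critical"]

    print(f"Summary: {summary['pass']} passed, {summary['fail']} failed")
    for record in concerns:
        print(f"Area of concern: {record['case']} ({record['severity']})")

A summary in this shape can feed directly into the report that compares results against the requirements and design specification (criterion 2).
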
Summarise and classify test results and highlight areas of concern

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Compare the test results against the requirements and design specification and prepare report

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Notify operations of completion of the testing

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Ensure attendees' details/comments are logged and signatures gained

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Schedule a feedback meeting to discuss report and possible next actions with stakeholders if necessary

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Ensure test reporting complies with documentation and reporting standards

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Assessed

Teacher: ___________________________________ Date: _________

Signature: ________________________________________________

Comments:

 

 

 

 

 

 

 

 

Instructions to Assessors

Evidence Guide

The evidence guide provides advice on assessment and must be read in conjunction with the performance criteria, required skills and knowledge, range statement and the Assessment Guidelines for the Training Package.

Overview of assessment

Critical aspects for assessment and evidence required to demonstrate competency in this unit

Evidence of the following is essential:

Assessment must confirm sufficient knowledge of the integration requirements for the units of the particular system.

Assessment must confirm the ability to determine whether the units of the system operate according to requirements specifications.

To demonstrate competency in this unit the person will require access to:

Acceptance criteria

Test plan

Integration standards

Requirements and design documents used in the analysis of the test

System/application suitable for testing

The person will need to ensure that:

Components have been compiled, linked, and loaded together

Components have successfully passed the integration tests at the interface level between each component

Context of and specific resources for assessment

It should be noted that it is the competency in testing the components that is being assessed, not the quality of the code itself.

Integration testing involves formal testing of the combined parts of an application to determine if they function together correctly and is usually performed after unit and functional testing. This type of testing is especially relevant to client/server and distributed systems.
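
As one deliberately simplified illustration of that definition, the sketch below passes the output of one component across the interface into another and compares the combined result with an expected value; the components, data and figures are invented for illustration and are not taken from this unit:

    # Component A: turns a raw record into a structured order.
    def parse_order(raw: str) -> dict:
        item, qty = raw.split(",")
        return {"item": item.strip(), "qty": int(qty)}

    # Component B: consumes Component A's output at the integration point.
    def price_order(order: dict, unit_price: float) -> float:
        return order["qty"] * unit_price

    # Interface-level integration test: both components exercised together and
    # the combined result compared with the expected value from the requirements.
    def test_parse_then_price():
        order = parse_order("widget, 3")
        assert price_order(order, unit_price=2.50) == 7.50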

Breadth, depth and complexity covering planning and initiation of alternative approaches to skills or knowledge applications across a broad range of technical and/or management requirements, as well as evaluation and coordination, would be characteristic of this unit.

Assessment must ensure:

The demonstration of competency may also require self-directed application of knowledge and skills, with substantial depth in some areas where judgement is required in planning and selecting appropriate equipment, services and techniques for self and others.

Applications involve participation in development of strategic initiatives as well as personal responsibility and autonomy in performing complex technical operations or organising others. It may include participation in teams including teams concerned with planning and evaluation functions. Group or team coordination may also be involved.

Method of assessment

The purpose of this unit is to define the standard of performance to be achieved in the workplace. In undertaking training and assessment activities related to this unit, consideration should be given to the implementation of appropriate diversity and accessibility practices in order to accommodate people who may have special needs. Additional guidance on these and related matters is provided in ICA05 Section 1.

Competency in this unit should be assessed using summative assessment to ensure consistency of performance in a range of contexts. This unit can be assessed either in the workplace or in a simulated environment. However, simulated activities must closely reflect the workplace to enable full demonstration of competency.

Assessment will usually include observation of real or simulated work processes and procedures and/or performance in a project context as well as questioning on underpinning knowledge and skills. The questioning of team members, supervisors, subordinates, peers and clients where appropriate may provide valuable input to the assessment process. The interdependence of units for assessment purposes may vary with the particular project or scenario.

Guidance information for assessment

Holistic assessment with other units relevant to the industry sector, workplace and job role is recommended, for example:

ICAA5050B Develop detailed component specification from project specification

An individual demonstrating this competency would be able to:

Demonstrate understanding of a broad knowledge base incorporating theoretical concepts, with substantial depth in some areas

Analyse and plan approaches to technical problems or management requirements

Transfer and apply theoretical concepts and/or technical or creative skills to a range of situations

Evaluate information, using it to forecast for planning or research purposes

Take responsibility for own outputs in relation to broad quantity and quality parameters

Take some responsibility for the achievement of group outcomes

Maintain knowledge of industry products and services

Required Skills and Knowledge

Required skills

Problem solving skills for a defined range of unpredictable problems involving participation in the development of strategic initiatives (e.g. when static tests of each point of integration are performed and correctness of arguments, positional parameters and return values in each integration suite are verified)

Plain English literacy and communication skills in relation to analysis, evaluation and presentation of information (e.g. when attendees' details/comments are logged and signatures are gained)

Data analysis skills in relation to analysis, evaluation and presentation of information (e.g. when static tests of each point of integration are performed and correctness of arguments, positional parameters and return values in each integration suite are verified, and when each test script is run and results are documented, and when memory leakage, global name space pollution, static variables are specifically addressed for each integration unit)

Research skills for identifying, analysing and evaluating broad features of system testing and best practice in system testing; high-order problem solving skills (e.g. when results of earlier unit testing are reviewed and critical issues to take into account are identified)

Programming skills in programming language/s relevant to project (e.g. when static tests of each point of integration are performed and correctness of arguments, positional parameters and return values in each integration suite are verified, and when each test script is run and results are documented)

Required knowledge

Broad knowledge of at least two programming languages, with detailed knowledge of programming languages required by system

Detailed knowledge of system/application being tested

Broad knowledge of testing techniques, with detailed knowledge of features and processes in some areas

Broad knowledge of automated test tools, with detailed knowledge of features and processes in some areas

Detailed knowledge of underlying test data

Detailed knowledge of input/output requirements

Range Statement

The range statement relates to the unit of competency as a whole. It allows for different work environments and situations that may affect performance. Bold italicised wording, if used in the performance criteria, is detailed below. Essential operating conditions that may be present with training and assessment (depending on the work situation, needs of the candidate, accessibility of the item, and local industry and regional contexts) may also be included.

Test environment may include:

data

program libraries

network/communications and other equipment

operating system

other support software

Software life cycle may include:

AS/NZS ISO/IEC 12207:1997 Information technology - Software life cycle processes

AS/NZS 15271:1999 Guide for AS/NZS ISO/IEC 12207 Information technology - software life cycle processes

Test and acceptance processes may include:

AS 4006-1992 Software test documentation

AS/NZS 14143.1:1999 Information technology - software measurement - functional size measurement - definition of concepts

AS/NZS 15026:1999 Information technology - system and software integrity levels

AS 4006-1992 Software test documentation, IEEE Standard for software unit testing

International and Australian Standards are updated and changed on a regular basis. It is therefore important to check the Standards Australia website on a regular basis for new standards: http://www.standards.com.au

Quality benchmarks

Several organisations have developed standards for software review, mainly the US Department of Defense (DoD), IEEE, the Software Engineering Institute (SEI) and ISO.

Relevant quality standards include:

AS 4043-1992 Software configuration management

AS 4042-1992 Software configuration management plans

AS 3925.1-1994 Software quality assurance - plans

AS/NZS 4258:1994 Software user documentation process

AS/NZS ISO/IEC 12207:1997 Information technology - software life cycle processes

AS/NZS 14102:1998 Information technology - guideline for evaluation and selection of CASE tools

International and Australian Standards are updated and changed on a regular basis. It is therefore important to check the Standards Australia website on a regular basis for new standards: http://www.standards.com.au

Test and acceptance criteria

Dependent on the type of test (e.g. functional, efficiency, cohesion)

Documentation and reporting

Documentation for version control may follow ISO/IEC/AS standards. Audit trails, naming standards, version control, project management templates and report writing styles will vary according to organisational approach. Information gathering processes may have associated templates.

Test tools may include:

Code/unit/class testing: AssertMate, BoundsChecker, C-Cover, CodeReview, CodeWizard, DeepCover, FailSafe, Hindsight, Insure++, JCAST, Logiscope, JavaPureCheck

Stress load testing: automated test facilities, e-Load, E-TEST Suite, e-MONITO, Astra SiteManager, Astra SiteTest, AutoTester Web, LoadRunner, JavaLoad

Applications testing: DataShark, Cyrano Suite, Datatect, preVue-C/S