NTISthis.com

Evidence Guide: ICAT5081B - Perform systems test

Student: __________________________________________________

Signature: _________________________________________________

Tips for gathering evidence to demonstrate your skills

The important thing to remember when gathering evidence is that more evidence is better - that is, the more evidence you gather to demonstrate your skills, the more confident an assessor can be that you have not just learned the skills at one point in time, but are continuing to apply and develop them (as opposed to just learning for the test!). Furthermore, one piece of evidence that you collect will not usually demonstrate all the required criteria for a unit of competency, whereas multiple overlapping pieces of evidence will usually do the trick!


 

ICAT5081B - Perform systems test

What evidence can you provide to prove your understanding of each of the following criteria?

Prepare for test

  1. Prepare the test environment
  2. Determine software life cycle
  3. Define test plan and appropriate test tools
  4. Recognise and separate the system into run-able modules mirroring live scenarios
  5. Gather and prepare logs and result sheets
  6. Notify operations of scheduled test to ensure preparedness and understanding of implications for operations
  7. Prepare test scripts (online test) or test run (batch test) for running
  8. Review expected results against acceptance criteria (walkthrough) and system requirements documentation
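The result sheets gathered during preparation can be as simple as one structured record per test case, with the expected result and acceptance criteria filled in before the run. The sketch below (Python, illustrative only - the field names and scenarios are assumptions, not mandated by the unit) shows one possible shape:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ResultSheetEntry:
    # Illustrative fields only; real sheets follow organisational templates.
    test_id: str
    module: str              # the run-able module being exercised
    scenario: str            # live scenario being mirrored (e.g. "end of day")
    expected_result: str     # taken from the requirements documentation
    acceptance_criteria: str
    actual_result: str = ""              # completed during the test run
    passed: Optional[bool] = None        # None until the test has been run

# Sheets would typically be prepared for every module before testing begins.
sheet = [
    ResultSheetEntry("T-001", "billing", "end of day batch",
                     "All invoices generated", "100% of due accounts invoiced"),
    ResultSheetEntry("T-002", "enquiry", "interactive query",
                     "Response within 2 seconds", "95th percentile under 2s"),
]
```

Preparing the expected-result and acceptance-criteria columns first supports the walkthrough review in step 8, since reviewers can check the sheet against the requirements documentation before any test is run.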
Prepare the test environment

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Determine software life cycle

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Define test plan and appropriate test tools

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Recognise and separate the system into run-able modules mirroring live scenarios

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Gather and prepare logs and result sheets

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Notify operations of scheduled test to ensure preparedness and understanding of implications for operations

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Prepare test scripts (online test) or test run (batch test) for running

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Review expected results against acceptance criteria (walkthrough) and system requirements documentation

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Conduct test

  1. Run test scripts and document results in line with test and acceptance processes
  2. Perform required quality benchmarks or comparisons in readiness for acceptance testing
  3. Adopt organisation/industry standards, where appropriate
  4. Compare actual results to expected results on completion of each system unit, and complete result sheets
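Step 4 above - comparing actual to expected results on completion of each system unit and completing the result sheets - can be sketched as follows. This is a minimal illustration; real comparisons are often fuzzier (tolerances, partial matches) and the dictionary keys here are assumptions:

```python
def complete_result_sheet(entries):
    """Mark each entry passed/failed by comparing actual to expected results.

    `entries` is a list of dicts with 'expected' and 'actual' keys; this
    illustrative version uses exact string equality.
    """
    for entry in entries:
        entry["passed"] = entry["actual"] == entry["expected"]
    return entries

sheet = complete_result_sheet([
    {"unit": "billing", "expected": "42 invoices", "actual": "42 invoices"},
    {"unit": "enquiry", "expected": "< 2s response", "actual": "3.1s response"},
])

# Units whose results did not match feed into the analysis phase.
failures = [e["unit"] for e in sheet if not e["passed"]]
```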
Run test scripts and document results in line with test and acceptance processes

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Perform required quality benchmarks or comparisons in readiness for acceptance testing

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Adopt organisation/industry standards, where appropriate

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Compare actual results to expected results on completion of each system unit, and complete result sheets

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Analyse and classify results

  1. Summarise and classify results, highlighting critical or urgent areas of concern and prepare report
  2. Compare results against requirements
  3. Notify operations of test completion
  4. Log attendees' details/comments and gain required signatures
  5. Schedule feedback meeting to discuss report and possible next actions with stakeholders if necessary
  6. Ensure test reporting compliance with documentation and reporting standards
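Steps 1 and 2 above involve grouping test results and flagging critical or urgent items for the report. A minimal sketch (the severity scheme 'critical'/'urgent'/'minor' is an assumption for illustration; organisations define their own classifications):

```python
from collections import Counter

def summarise(results):
    """Group test results by severity and flag critical/urgent items.

    `results` is a list of (test_id, severity) pairs. Returns a count per
    severity and the test IDs needing to be highlighted in the report.
    """
    counts = Counter(severity for _, severity in results)
    highlights = [tid for tid, severity in results
                  if severity in ("critical", "urgent")]
    return counts, highlights

counts, highlights = summarise([
    ("T-001", "minor"),
    ("T-002", "critical"),
    ("T-003", "urgent"),
])
```

The highlighted IDs and severity counts would then be carried into the test report and the feedback meeting with stakeholders.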
Summarise and classify results, highlighting critical or urgent areas of concern and prepare report

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Compare results against requirements

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Notify operations of test completion

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Log attendees' details/comments and gain required signatures

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Schedule feedback meeting to discuss report and possible next actions with stakeholders if necessary

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Ensure test reporting compliance with documentation and reporting standards

Completed
Date:

Teacher:
Evidence:

 

 

 

 

 

 

 

Assessed

Teacher: ___________________________________ Date: _________

Signature: ________________________________________________

Comments:

 

 

 

 

 

 

 

 

Instructions to Assessors

Evidence Guide

The evidence guide provides advice on assessment and must be read in conjunction with the performance criteria, required skills and knowledge, range statement and the Assessment Guidelines for the Training Package.

Overview of assessment

Critical aspects for assessment and evidence required to demonstrate competency in this unit

Evidence of the following is essential:

Assessment must confirm sufficient knowledge of components and run-able modules that make up a total system.

To demonstrate competency in this unit the person will require access to:

System test plan

Requirements and design documents

Test plan

Human resources assigned and in place

Test hardware and environments in place and free for system test use

System/application suitable for testing

Context of and specific resources for assessment

All testing should be carried out on the same platform as the completed system. Scheduled testing should be on the production platform. The production environment is required as part of test preparation.

The systems test is a fully functional exercising of the system to be implemented. As such, all resources necessary to execute the entire system will be required.

The purpose of system testing is to identify defects that will only surface when a complete system is assembled. That is, defects that cannot be attributed to individual components or the interaction between two components. System testing includes testing of performance, security, configuration sensitivity, start-up and recovery from failure modes and takes place prior to delivery.

Characteristic of this unit are breadth, depth and complexity covering the planning and initiation of alternative approaches to skills or knowledge applications across a broad range of technical and/or management requirements, together with evaluation and coordination.

Assessment must ensure:

The demonstration of competency may also require self-directed application of knowledge and skills, with substantial depth in some areas where judgement is required in planning and selecting appropriate equipment, services and techniques for self and others.

Applications involve participation in development of strategic initiatives as well as personal responsibility and autonomy in performing complex technical operations or organising others. It may include participation in teams including teams concerned with planning and evaluation functions. Group or team coordination may also be involved.

Method of assessment

The purpose of this unit is to define the standard of performance to be achieved in the workplace. In undertaking training and assessment activities related to this unit, consideration should be given to the implementation of appropriate diversity and accessibility practices in order to accommodate people who may have special needs. Additional guidance on these and related matters is provided in ICA05 Section 1.

Competency in this unit should be assessed using summative assessment to ensure consistency of performance in a range of contexts. This unit can be assessed either in the workplace or in a simulated environment. However, simulated activities must closely reflect the workplace to enable full demonstration of competency.

Assessment will usually include observation of real or simulated work processes and procedures and/or performance in a project context as well as questioning on underpinning knowledge and skills. The questioning of team members, supervisors, subordinates, peers and clients where appropriate may provide valuable input to the assessment process. The interdependence of units for assessment purposes may vary with the particular project or scenario.

Guidance information for assessment

Assessment must confirm the ability to test the operation and consistency of the total system according to the system requirements.

The person will have clearly identified the results of the systems tests. The system test should clearly confirm that:

Functionality, delivered by the development team, is as specified by the business in the business design specification document and the requirements documentation

Software is of high quality; the software will replace/support the intended business functions and achieves the standards required by the organisation for the development of new systems

Software delivered interfaces correctly with existing systems

If the system test does not confirm the above, then the person will need to document how the system has not met the test criteria.
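Documenting how the system has not met the test criteria amounts to listing the confirmation points above that the results did not support. A minimal sketch (the criterion names are illustrative shorthand for the three points above, not official wording):

```python
def unmet_criteria(criteria, confirmed):
    """Return the criteria the system test failed to confirm.

    `criteria` is the full list of confirmation points; `confirmed` is the
    subset the test results supported.
    """
    confirmed_set = set(confirmed)
    return [c for c in criteria if c not in confirmed_set]

report = unmet_criteria(
    ["functionality as specified", "quality standards met",
     "interfaces with existing systems"],
    ["functionality as specified"],
)
```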

Holistic assessment with other units relevant to the industry sector, workplace and job role is recommended, for example:

ICAA5056B Prepare disaster recovery and contingency plans

ICAT5084B Perform stress and load testing on integrated platform

An individual demonstrating this competency would be able to:

Demonstrate understanding of a broad knowledge base incorporating theoretical concepts, with substantial depth in some areas

Analyse and plan approaches to technical problems or management requirements

Transfer and apply theoretical concepts and/or technical or creative skills to a range of situations

Evaluate information, using it to forecast for planning or research purposes

Take responsibility for own outputs in relation to broad quantity and quality parameters

Take some responsibility for the achievement of group outcomes

Maintain knowledge of industry products and services

Required Skills and Knowledge

Required skills

Problem solving skills for a defined range of unpredictable problems involving participation in the development of strategic initiatives (e.g. when the ability to recognise and separate the system into run-able modules mirroring live scenarios is demonstrated, as with end-of-day and interactive query scenarios of various loads)

Plain English literacy and communication skills in relation to analysis, evaluation and presentation of information (e.g. when attendees' details/comments are logged and signatures are gained)

Analysis/Programming skills in relation to testing the operation and consistency of the total system (e.g. when test scripts (online test) or test run (batch test) are prepared for running)

Questioning and active listening skills (e.g. when attendees' details/comments are logged and signatures are gained)

Required knowledge

Broad general knowledge of system requirements, with detailed knowledge of particular system requirements and features

Broad knowledge of automated test tools, with detailed knowledge of features and processes in some areas

Organisational rules for preparing test

Detailed knowledge of underlying test data

Detailed knowledge of input/output requirements

Range Statement

The range statement relates to the unit of competency as a whole. It allows for different work environments and situations that may affect performance. Bold italicised wording, if used in the performance criteria, is detailed below. Essential operating conditions that may be present with training and assessment (depending on the work situation, needs of the candidate, accessibility of the item, and local industry and regional contexts) may also be included.

Test environment may include:

data

program libraries

network/communications and other equipment

operating system

other support software

Test tools may include:

Code/unit/class testing: AssertMate, BoundsChecker, C-Cover, CodeReview, CodeWizard, DeepCover, FailSafe, Hindsight, Insure++, JCAST, Logiscope, JavaPureCheck

Stress load testing: automated test facilities, e-Load, E-TEST Suite, e-MONITO, Astra SiteManager, Astra SiteTest, AutoTester Web, LoadRunner, JavaLoad

Applications testing: DataShark, Cyrano Suite, Datatect, preVue-C/S