Unit of Competency Mapping – Information for Teachers/Assessors – Information for Learners

ICAPRG529A Mapping and Delivery Guide
Apply testing techniques for software development

Version 1.0
Issue Date: April 2024


Qualification -
Unit of Competency ICAPRG529A - Apply testing techniques for software development
Description This unit describes the performance outcomes, skills and knowledge required to develop test strategies and implement tests to assure the reliability and quality of an application.
Employability Skills This unit contains employability skills.
Learning Outcomes and Application This unit is relevant to those responsible for test plan preparation, execution and maintenance; reporting of tests; and defect management in an application. Typical positions that undertake this role include quality assurance analysts, test analysts, testers, system testers, software testers, test leads and developers.
Duration and Setting X weeks, nominally xx hours, delivered in a classroom/online/blended learning setting.
Prerequisites/co-requisites Not applicable.
Competency Field
Development and validation strategy and guide for assessors and learners
Student Learning Resources: Handouts, Activities, Slides (PPT)
Assessments: Assessment 1, Assessment 2, Assessment 3, Assessment 4
Elements of Competency and Performance Criteria
Element: Plan and design test
  • Analyse and review software development specifications
  • Determine test context, scope, standard and methodology
  • Determine test types and tools
  • Determine test input data requirements
  • Design test plan and test cases using various test design techniques
       
Element: Prepare test environment
  • Analyse and review documents to prepare test environment
  • Determine test environment requirements
  • Build and set up test environment
       
Element: Implement and execute test
  • Build input data for testing
  • Create test suite or script from test cases
  • Execute test cases
  • Create test record to store test result
       
Element: Manage defect and testing process
  • Evaluate and report test results
  • Track defect and verify fixes
  • Maintain and archive testware
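The implement-and-execute element above maps directly onto how automated test frameworks operate: test cases are grouped into a suite, the suite is executed, and a record of the results is created. The following is a minimal sketch using Python's standard unittest module; the function under test and all names are illustrative, not part of the unit:

```python
import unittest

# Hypothetical function under test -- stands in for "the application".
def discount(price: float, percent: float) -> float:
    """Apply a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Test cases derived from the (hypothetical) specification.
class DiscountTests(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(discount(100.0, 25), 75.0)

    def test_zero_discount(self):
        self.assertEqual(discount(80.0, 0), 80.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            discount(50.0, 150)

# Build a test suite from the cases, execute it, and record the result.
suite = unittest.TestLoader().loadTestsFromTestCase(DiscountTests)
result = unittest.TestResult()
suite.run(result)
print(f"ran={result.testsRun} failures={len(result.failures)} errors={len(result.errors)}")
```

The `TestResult` object here is the "test record" in miniature; in practice a test-management tool would persist and report these outcomes.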
       


Evidence Required

List the assessment methods to be used and the context and resources required for assessment. Copy and paste the relevant sections from the evidence guide below and then rewrite these in plain English.

The evidence guide provides advice on assessment and must be read in conjunction with the performance criteria, required skills and knowledge, range statement and the Assessment Guidelines for the Training Package.

Overview of assessment

Critical aspects for assessment and evidence required to demonstrate competency in this unit

Evidence of the ability to:

develop a test-plan document and test cases to verify the completeness, reliability and performance of an application according to requirement specifications

analyse and prepare the test environment, and execute test cases using automated test tools

document and manage test results by performing the application debugging process and re-testing the application.

Context of and specific resources for assessment

Assessment must ensure access to:

test environment that closely resembles production environment

business, functional, system and user requirements

system or application suitable for testing

appropriate learning and assessment support when required

modified equipment for people with special needs.

Method of assessment

A range of assessment methods should be used to assess practical skills and knowledge. The following examples are appropriate for this unit:

review of a test-plan document that follows a recognised standard, such as AS/NZS 15026:1999

evaluation of the candidate’s ability to:

select and use features of an automated testing tool to perform a particular type of test (e.g. stress testing)

analyse and document test results

debug application.

Guidance information for assessment

Holistic assessment with other units relevant to the industry sector, workplace and job role is recommended, where appropriate.

Assessment processes and techniques must be culturally appropriate, and suitable to the communication skill level, language, literacy and numeracy capacity of the candidate and the work being performed.

Indigenous people and other people from a non-English speaking background may need additional support.

In cases where practical assessment is used, it should be combined with targeted questioning to assess required knowledge.


Submission Requirements

List each assessment task's title, type (e.g. project, observation/demonstration, essay, assignment, checklist) and due date here

Assessment task 1: [title]      Due date:

(add new lines for each of the assessment tasks)


Assessment Tasks

Copy and paste from the following data to produce each assessment task. Write these in plain English and spell out how, when and where the task is to be carried out, under what conditions, and what resources are needed. Include guidelines about how well the candidate has to perform a task for it to be judged satisfactory.

Required skills

analytical skills to review and evaluate technical and business requirements

communication skills to:

liaise with programmers on fault debugging matters

liaise with project managers or leaders on report and result matters

seek requirements and information from business and technical experts

literacy skills to:

develop reports and documentation related to test results

read and interpret software specifications developed by business and technical experts

problem-solving skills to apply basic debugging techniques in the context of software or application development

research skills to:

locate and interrogate complex and varied sources of information

source information from available sources

technical skills to:

operate software applications and navigate the internet

develop a small scale application

execute an application.

Required knowledge

characteristics of programming languages

detailed knowledge of input and output requirements

software development life cycle (SDLC) methodologies

system layers, such as data network, hardware, operating system, database management systems, web servers, application servers and client deployment

processes and techniques related to small-size application development.

The range statement relates to the unit of competency as a whole. It allows for different work environments and situations that may affect performance. Bold italicised wording, if used in the performance criteria, is detailed below. Essential operating conditions that may be present with training and assessment (depending on the work situation, needs of the candidate, accessibility of the item, and local industry and regional contexts) may also be included.

Software development specifications may include:

budget requirements

customer requirements

functional design

internal design specifications

schedule requirements

user stories.

Standard may include:

AS/NZS 15026:1999

IEEE 829.

Methodology may include:

agile

extreme

rapid application development (RAD)

spiral

traditional plan-driven development (TPDD)

waterfall.

Test types may include:

accessibility testing

load testing

performance testing

smoke testing

stress testing

usability testing

volume testing.
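Several of these test types can be demonstrated even with simple scripts. The sketch below shows a basic performance check (assert an operation stays within a time budget) and a volume check (a much larger input still completes correctly); the 0.5-second budget, the function and the data sizes are illustrative assumptions, not requirements of the unit:

```python
import time

# Hypothetical operation under test.
def build_index(n: int) -> dict:
    return {i: str(i) for i in range(n)}

# Performance test: the operation must complete within a time budget.
start = time.perf_counter()
build_index(100_000)
elapsed = time.perf_counter() - start
assert elapsed < 0.5, f"performance budget exceeded: {elapsed:.3f}s"

# Volume test: a much larger input should still complete and return
# the expected number of entries.
index = build_index(1_000_000)
assert len(index) == 1_000_000
print("performance and volume checks passed")
```

Dedicated load- and stress-testing tools automate the same idea at scale, driving many concurrent requests rather than a single call.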

Tools may include:

automated test

configuration management

defect management

dynamic analysis

modelling

monitoring

requirement management

review

static analysis

test-data preparation

test-design

test-execution

test-management.

Test design techniques may include:

black-box

experience-based

specification-based

structure-based

white-box.
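As an illustration of one specification-based (black-box) technique, boundary value analysis derives test cases from the edges of a specified input range. A minimal sketch, assuming a hypothetical rule that ages 18 to 65 inclusive are eligible:

```python
# Function under test; the 18-65 eligibility rule is a hypothetical spec.
def is_eligible(age: int) -> bool:
    return 18 <= age <= 65

# Boundary value analysis: test just below, on, and just above each
# boundary of the specified range.
boundary_cases = {17: False, 18: True, 19: True, 64: True, 65: True, 66: False}
for age, expected in boundary_cases.items():
    actual = is_eligible(age)
    assert actual == expected, f"age={age}: expected {expected}, got {actual}"
print("all boundary cases passed")
```

A structure-based (white-box) technique would instead derive cases from the code itself, e.g. to cover each branch of the comparison.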

Documents may include:

configuration guides

installation guides

reference documents

user manuals.

Test environment requirements may include:

communications

configuration

hardware

software

versions.

Set up test environment may include:

obtain and install software releases

set up logging and archiving processes

set up or obtain test input data

set up test tracking processes.
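The set-up steps above can be scripted. The sketch below creates an isolated working directory, configures logging, seeds test input data and records the environment configuration for tracking; all file names and values are illustrative assumptions:

```python
import json
import logging
import tempfile
from pathlib import Path

# Set up a logging process for the test run
# (step: "set up logging and archiving processes").
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("test-env")

# Create an isolated working directory and seed it with test input data
# (step: "set up or obtain test input data").
env_dir = Path(tempfile.mkdtemp(prefix="testenv_"))
input_file = env_dir / "input_data.json"
input_file.write_text(json.dumps([{"id": 1, "name": "sample"}]))
log.info("test environment ready at %s", env_dir)

# Record the environment configuration so the run is reproducible
# (step: "set up test tracking processes").
config_file = env_dir / "config.json"
config_file.write_text(json.dumps({"version": "0.1", "data": input_file.name}))
```

In practice, configuration-management tools perform the same role for hardware, software versions and communications settings across a shared environment.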

Testware may include:

automation tools

defect repositories

script

test cases

test plan

test report

test result

testing framework.
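One item of testware listed above, the defect repository, stores structured defect records whose status is tracked from discovery through fix verification (element: manage defect and testing process). A minimal sketch, with illustrative field names and workflow states:

```python
from dataclasses import dataclass, field
from datetime import date

# Minimal defect record; field names and states are illustrative,
# not mandated by the unit.
@dataclass
class Defect:
    defect_id: str
    summary: str
    severity: str          # e.g. "critical", "major", "minor"
    status: str = "open"   # open -> fixed -> verified -> closed
    history: list = field(default_factory=list)

    def transition(self, new_status: str) -> None:
        # Keep an audit trail of every status change.
        self.history.append((self.status, new_status, date.today().isoformat()))
        self.status = new_status

d = Defect("DEF-001", "Discount of 0% is rejected", "major")
d.transition("fixed")      # developer fixes the defect
d.transition("verified")   # tester re-tests and verifies the fix
print(d.status, len(d.history))
```

Real defect-management tools add assignment, reporting and linkage to the test cases that exposed each defect, but the record-and-track cycle is the same.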

Copy and paste from the following performance criteria to create an observation checklist for each task. When you have finished writing your assessment tool, every one of these must have been addressed, preferably several times in a variety of contexts. To ensure this occurs, download the assessment matrix for the unit, enter each assessment task as a column header, and place check marks against each performance criterion that the task addresses.

Observation Checklist

Tasks to be observed according to workplace/college/TAFE policy and procedures, relevant legislation and Codes of Practice (record Yes / No and Comments/feedback for each)
Analyse and review software development specifications 
Determine test context, scope, standard and methodology 
Determine test types and tools 
Determine test input data requirements 
Design test plan and test cases using various test design techniques 
Analyse and review documents to prepare test environment 
Determine test environment requirements 
Build and set up test environment 
Build input data for testing 
Create test suite or script from test cases 
Execute test cases 
Create test record to store test result 
Evaluate and report test results 
Track defect and verify fixes 
Maintain and archive testware 

Forms

Assessment Cover Sheet

ICAPRG529A - Apply testing techniques for software development
Assessment task 1: [title]

Student name:

Student ID:

I declare that the assessment tasks submitted for this unit are my own work.

Student signature:

Result: Competent / Not yet competent

Feedback to student

 

 

 

 

 

 

 

 

Assessor name:

Signature:

Date:


Assessment Record Sheet

ICAPRG529A - Apply testing techniques for software development

Student name:

Student ID:

Assessment task 1: [title] Result: Competent / Not yet competent

(add lines for each task)

Feedback to student:

 

 

 

 

 

 

 

 

Overall assessment result: Competent / Not yet competent

Assessor name:

Signature:

Date:

Student signature:

Date: