Assessor Resource

ICAGAM511A
Manage testing of games and interactive media

Assessment tool

Version 1.0
Issue Date: April 2024


This unit applies to IT personnel who take responsibility for managing the testing of games and interactive media.

The management of product testing directly impacts the quality and timeliness of a product's delivery to market. Good test management can deliver a quality product. Badly designed products take longer to test but can still be of quality when partnered with good test management. Well-designed software and good test management go hand in hand to produce quality, timely software products.

This unit describes the performance outcomes, skills and knowledge required to manage the testing of games and interactive media to enable timely product release.

You may want to include more information here about the target group and the purpose of the assessments (eg formative, summative, recognition)

Prerequisites

Not applicable.


Employability Skills

This unit contains employability skills.




Evidence Required

List the assessment methods to be used and the context and resources required for assessment. Copy and paste the relevant sections from the evidence guide below and then re-write these in plain English.

The evidence guide provides advice on assessment and must be read in conjunction with the performance criteria, required skills and knowledge, range statement and the Assessment Guidelines for the Training Package.

Overview of assessment

Critical aspects for assessment and evidence required to demonstrate competency in this unit

Evidence of the ability to:

define quality requirements and an associated test plan for software

install and configure testing support software

manage testing process

identify bugs or issues accurately and concisely

finalise software testing process to enable product release.

Context of and specific resources for assessment

Assessment must ensure access to:

system undergoing development with associated specifications

client requirements (verbal or written) for system quality requirements

testing support software

test environment

appropriate learning and assessment support when required

modified equipment for people with special needs.

Method of assessment

A range of assessment methods should be used to assess practical skills and knowledge. The following examples are appropriate for this unit:

direct observation of candidate identifying or finding bugs and issues

review of reports prepared by candidate showing plans and management of testing.

Guidance information for assessment

Holistic assessment with other units relevant to the industry sector, workplace and job role is recommended, where appropriate.

Assessment processes and techniques must be culturally appropriate, and suitable to the communication skill level, language, literacy and numeracy capacity of the candidate and the work being performed.

Indigenous people and other people from a non-English speaking background may need additional support.

In cases where practical assessment is used it should be combined with targeted questioning to assess required knowledge.


Submission Requirements

List each assessment task's title, type (eg project, observation/demonstration, essay, assignment, checklist) and due date here

Assessment task 1: [title]      Due date:

(add new lines for each of the assessment tasks)


Assessment Tasks

Copy and paste from the following data to produce each assessment task. Write these in plain English and spell out how, when and where the task is to be carried out, under what conditions, and what resources are needed. Include guidelines about how well the candidate has to perform a task for it to be judged satisfactory.

Required skills

analytical skills to define test plan details and identify bugs or issues

communication skills to:

determine and define quality requirements statement

manage and coordinate testing processes and finalisation of testing

literacy skills to produce technical documentation and reports

planning skills to define test plan details

technical skills to:

manage test environment

develop test plan details

select, install and configure test plan support software.

Required knowledge

client requirements for platforms, hardware and software

client system requirements, both functional and non-functional

procedures for bug or issue management and identification

test reporting requirements

testing techniques, methods, test types, system dissection.

The range statement relates to the unit of competency as a whole. It allows for different work environments and situations that may affect performance. Bold italicised wording, if used in the performance criteria, is detailed below. Essential operating conditions that may be present with training and assessment (depending on the work situation, needs of the candidate, accessibility of the item, and local industry and regional contexts) may also be included.

Types of requirements may include:

functional, such as audio and visual

installation, such as new, upgrades, database and software

non-functional, such as performance, load, stress and cultural

platform or environment.

Outstanding bugs may include:

defined priorities

with or without defined priorities.

Limited release may include:

alpha and beta release options

environment-specific release

internal or in-house release options

external release options

platform-specific release.

Test cycles may include:

delivery of testable product to client

delivery of testable product to test team

in-development tests, such as source-code updates or retrievals from source control, initial implementation complete time, pre-implementation time.

Types of testing may include:

acceptance

data functionality

functionality

graphical user interface (GUI)

load

performance

regression

smoke

stress

unit (an illustrative sketch of a unit test and a smoke test follows this list).
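As an illustration only, the following is a minimal sketch of how a unit test and a smoke test might be expressed as automated coded tests. The add_score function and the scoring rule it implements are assumptions invented for this example, not part of the unit of competency.

```python
import unittest


def add_score(current_score, points):
    """Hypothetical game function: add points to a score, never dropping below zero."""
    return max(0, current_score + points)


class AddScoreUnitTests(unittest.TestCase):
    """Unit test: exercises one small piece of game logic in isolation."""

    def test_points_accumulate(self):
        self.assertEqual(add_score(10, 15), 25)

    def test_score_never_goes_negative(self):
        self.assertEqual(add_score(10, -25), 0)


class SmokeTest(unittest.TestCase):
    """Smoke test: a quick check that the build is stable enough for deeper testing."""

    def test_scoring_module_is_usable(self):
        self.assertIsInstance(add_score(0, 1), int)


if __name__ == "__main__":
    unittest.main()
```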

Testing methods may include:

automated, such as coded testing or using an automation tool

manual, such as exploratory or testing of specific test cases

static and dynamic analysis.

Testing technique may include:

boundary value analysis (a worked sketch follows this list)

equivalence

way of designing test cases.
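As a worked illustration of boundary value analysis and equivalence partitioning as ways of designing test cases, the sketch below assumes a hypothetical rule that a player level is valid only between 1 and 100 inclusive; the validate_level function and the specific limits are assumptions made for the example.

```python
def validate_level(level):
    """Hypothetical rule: a player level is valid only between 1 and 100 inclusive."""
    return 1 <= level <= 100


# Equivalence partitioning: one representative value per class of inputs
# expected to behave the same way (below range, in range, above range).
equivalence_cases = [(-5, False), (50, True), (250, False)]

# Boundary value analysis: values at and either side of each boundary,
# where off-by-one defects are most likely to hide.
boundary_cases = [(0, False), (1, True), (2, True), (99, True), (100, True), (101, False)]

for level, expected in equivalence_cases + boundary_cases:
    actual = validate_level(level)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{status}: validate_level({level}) -> {actual}, expected {expected}")
```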

Test-support software may include:

bug tracking software (typical bug description fields are sketched after this list)

test automation software, such as:

WinRunner

LoadRunner

test case management software, such as Test Director.
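As a hedged illustration of the kind of bug description fields that bug tracking software is typically configured to capture, the sketch below uses an example field set; the field names and values are assumptions for illustration, not the schema of any particular tool.

```python
from dataclasses import dataclass, field


@dataclass
class BugReport:
    """Illustrative bug record; the field names are an example set, not a tool-specific schema."""
    summary: str              # one-line description of the defect
    steps_to_reproduce: list  # exact steps, so the bug does not "bounce" back as unreproducible
    expected_result: str      # behaviour required by the quality requirements statement
    actual_result: str        # behaviour observed during the test cycle
    build_version: str        # testable product build in which the bug was found
    severity: str             # impact on the product, eg critical, major, minor
    priority: str             # order in which development should address it
    status: str = "new"       # lifecycle state, eg new, assigned, fixed, verified, closed
    attachments: list = field(default_factory=list)  # logs, screenshots, saved games


example = BugReport(
    summary="Score resets to zero after loading a saved game",
    steps_to_reproduce=["Start new game", "Score 500 points", "Save", "Load the save"],
    expected_result="Score of 500 is restored",
    actual_result="Score shows 0",
    build_version="0.9.3-beta",
    severity="major",
    priority="high",
)
print(example.summary, "-", example.status)
```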

Implementation details and reporting details may include:

communication and interaction levels between departments for cycles, such as whether development may or may not be approachable during particular processes or cycles

details of all processes and procedures

required documentation repositories, templates, structures and locations.

Completeness techniques may include:

interdepartmental reviews

intradepartmental reviews

system dissection.

Copy and paste from the following performance criteria to create an observation checklist for each task. When you have finished writing your assessment tool every one of these must have been addressed, preferably several times in a variety of contexts. To ensure this occurs download the assessment matrix for the unit; enter each assessment task as a column header and place check marks against each performance criterion that task addresses.

Observation Checklist

Tasks to be observed according to workplace/college/TAFE policy and procedures, relevant legislation and Codes of Practice (Yes / No / Comments/feedback)
Review picture of product in the marketplace to determine high-level requirements provided by the product for expected clients, considering all types of requirements 
Define a releasable product in terms of outstanding bugs, to enable a limited release and a complete product release 
Summarise findings into a product release enabling quality requirements statement 
Confirm client and development agreement on product release enabling quality requirements statement 
Determine expected test cycles, during the software development life cycle, considering what development methodology is in use and what the quality requirements are for product release 
Determine what types of testing will be performed during the test cycles to enable efficiency of processes and confirmation of quality requirements statement 
Determine what testing methods will be used to implement testing types defined for defined test cycles 
Determine testing technique to be used to determine test cases and analyse results 
Perform test cycle until a combination is found that provides an acceptable balance of cost, quality and risk that upper management and development can agree to 
Select test-support software to enable efficiency in testing and testing management 
Define implementation details for agreed testing and team responsible for testing management 
Define reporting details for testing throughout product life cycle to enable ongoing management of testing process 
Confirm test plan completeness using available completeness techniques 
Confirm test plan with development and management 
Install and configure bug tracking process and define bug description fields to maximise efficiency and minimise possibilities of bouncing bugs 
Install and configure test case management software 
Install and configure test cycle management and reporting software 
Install and configure automated test tools 
Manage and report on development of test cases 
Manage and report on test cycle status 
Manage and report on outstanding bug status 
Manage and report on status of product testing related to product release enabling quality requirements statement 
Update test plan and schedule to deal with changing development conditions, and ensure management are informed 
Manage bugs to ensure efficient bug handling and resolution 
Manage test environment, including setup, receipt of test builds and clean-up 
Produce testing results for management review prior to release 
Manage test product freeze for final release and final test run 
Confirm product release enabling quality requirements have been met 

Forms

Assessment Cover Sheet

ICAGAM511A - Manage testing of games and interactive media
Assessment task 1: [title]

Student name:

Student ID:

I declare that the assessment tasks submitted for this unit are my own work.

Student signature:

Result: Competent Not yet competent

Feedback to student

 

 

 

 

 

 

 

 

Assessor name:

Signature:

Date:


Assessment Record Sheet

ICAGAM511A - Manage testing of games and interactive media

Student name:

Student ID:

Assessment task 1: [title] Result: Competent Not yet competent

(add lines for each task)

Feedback to student:

 

 

 

 

 

 

 

 

Overall assessment result: Competent Not yet competent

Assessor name:

Signature:

Date:

Student signature:

Date: