Copy and paste from the following data to produce each assessment task. Write each task in plain English, spelling out how, when and where it is to be carried out, under what conditions, and what resources are needed. Include guidelines on how well the candidate must perform the task for it to be judged satisfactory.
Required skills
analytical skills to review and evaluate technical and business requirements
communication skills to:
liaise with programmers on fault debugging matters
liaise with project managers or leaders on report and result matters
seek requirements and information from business and technical experts
literacy skills to:
develop reports and documentation related to test results
read and interpret software specifications developed by business and technical experts
problem-solving skills to apply basic debugging techniques in the context of software or application development
research skills to:
locate and interrogate complex and varied sources of information
source information from available sources
technical skills to:
operate software applications and navigate the internet
develop a small-scale application
execute an application.
Required knowledge
characteristics of programming languages
detailed knowledge of input and output requirements
software development life cycle (SDLC) methodologies
system layers, such as data network, hardware, operating system, database management systems, web servers, application servers and client deployment
processes and techniques related to small-scale application development.
The range statement relates to the unit of competency as a whole. It allows for different work environments and situations that may affect performance. Bold italicised wording, if used in the performance criteria, is detailed below. Essential operating conditions that may be present with training and assessment (depending on the work situation, needs of the candidate, accessibility of the item, and local industry and regional contexts) may also be included.
Software development specifications may include:
budget requirements
customer requirements
functional design
internal design specifications
schedule requirements
user stories.
Standard may include:
AS/NZS 15026:1999
IEEE 829.
Methodology may include:
agile
extreme
rapid application development (RAD)
spiral
traditional plan-driven development (TPDD)
waterfall.
Test types may include:
accessibility testing
load testing
performance testing
smoke testing
stress testing
usability testing
volume testing.
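To help assessors picture what candidate evidence for a test type might look like, here is a minimal sketch of a smoke test in Python. The application and its `add` function are hypothetical, invented for illustration: a smoke test is a quick pass/fail check that the core of the application works at all, typically run before deeper test types such as load or stress testing.

```python
# Smoke-test sketch for a hypothetical small application.
# A smoke test only confirms the basics respond; it is not exhaustive.

def add(a, b):
    """Core function of the hypothetical application under test."""
    return a + b

def smoke_test():
    """Return True if the most basic operations work at all."""
    try:
        assert add(2, 3) == 5
        assert add(-1, 1) == 0
        return True
    except AssertionError:
        return False

if __name__ == "__main__":
    # A smoke test reports a single pass/fail outcome.
    print("PASS" if smoke_test() else "FAIL")
```

A candidate might be asked to run such a check first and record the outcome in the test report before attempting performance or volume testing.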
Tools may include:
automated test
configuration management
defect management
dynamic analysis
modelling
monitoring
requirement management
review
static analysis
test-data preparation
test-design
test-execution
test-management.
Test design techniques may include:
black-box
experience-based
specification-based
structure-based
white-box.
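The contrast between these techniques can be shown on a small example. The following Python sketch uses a hypothetical grade-classification function: the black-box (specification-based) tests are derived only from the stated spec, using boundary values, while the white-box (structure-based) tests are derived from the code's branches.

```python
# Sketch contrasting black-box and white-box test design techniques
# on a hypothetical grade-classification function.

def classify(score):
    """Hypothetical spec: 'fail' for 0-49, 'pass' for 50-100."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# Black-box (specification-based): cases chosen from the spec alone,
# here the boundary values around the pass mark and the range limits.
for score, expected in [(0, "fail"), (49, "fail"), (50, "pass"), (100, "pass")]:
    assert classify(score) == expected

# White-box (structure-based): cases chosen by reading the code,
# here exercising the out-of-range branch for full branch coverage.
for bad in (-1, 101):
    try:
        classify(bad)
        raise AssertionError("expected ValueError for out-of-range score")
    except ValueError:
        pass  # the guard branch behaved as the code requires
```

An observation checklist could ask the candidate to label each test case with the technique that produced it, which makes the distinction directly assessable.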
Documents may include:
configuration guides
installation guides
reference documents
user manuals.
Test environment requirements may include:
communications
configuration
hardware
software versions.
Set up test environment may include:
obtain and install software releases
set up logging and archiving processes
set up or obtain test input data
set up test tracking processes.
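Two of the set-up activities above, logging and test input data, can be scripted. The Python sketch below is one possible shape for such a script; the directory names, file names and seed data are all hypothetical, chosen only to illustrate the steps.

```python
# Sketch of a scripted test-environment setup: create a logging area
# and seed test input data. All paths and data here are hypothetical.

import json
import logging
import tempfile
from pathlib import Path

def set_up_test_environment(base: Path) -> Path:
    """Create log and data directories under base and seed input data."""
    log_dir = base / "logs"
    data_dir = base / "data"
    log_dir.mkdir(parents=True, exist_ok=True)
    data_dir.mkdir(parents=True, exist_ok=True)

    # Set up logging (log files collected under log_dir for archiving).
    logging.basicConfig(filename=log_dir / "test_run.log",
                        level=logging.INFO, force=True)
    logging.info("test environment initialised")

    # Set up test input data (hypothetical records).
    (data_dir / "input.json").write_text(
        json.dumps([{"id": 1, "score": 49}, {"id": 2, "score": 50}]))
    return data_dir

if __name__ == "__main__":
    base = Path(tempfile.mkdtemp())
    data_dir = set_up_test_environment(base)
    print((data_dir / "input.json").exists())
```

A candidate could be observed running a script like this and then verifying that the log file and input data exist before test execution begins.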
Testware may include:
automation tools
defect repositories
scripts
test cases
test plan
test report
test result
testing framework.
Copy and paste from the following performance criteria to create an observation checklist for each task. When you have finished writing your assessment tool, every one of these must have been addressed, preferably several times in a variety of contexts. To ensure this occurs, download the assessment matrix for the unit, enter each assessment task as a column header, and place a check mark against each performance criterion that the task addresses.
Observation Checklist