Unit of Competency Mapping – Information for Teachers/Assessors – Information for Learners

AHCARB603 Mapping and Delivery Guide
Interpret diagnostic test results

Version 1.0
Issue Date: March 2024


Qualification -
Unit of Competency AHCARB603 - Interpret diagnostic test results
Description
Employability Skills
Learning Outcomes and Application

This unit of competency describes the skills and knowledge required to interpret diagnostic test results and to evaluate and critique the testing methods and results.

This unit applies to individuals with broad theoretical and technical knowledge of a specific area or a broad field of work and learning, and the cognitive, technical and communication skills to demonstrate autonomy, judgement and defined responsibility in undertaking complex work within broad parameters to provide specialist advice and functions.

The role involves the self-directed application of specialised knowledge of tree anatomy, physiology, pathology, tree dynamics and the edaphic environment, with substantial depth in areas such as diagnostic tool application methods and the analysis of diagnostic test results.

No occupational licensing, legislative or certification requirements are known to apply to this unit at the time of publication.
Duration and Setting X weeks, nominally xx hours, delivered in a classroom/online/blended learning setting.

It is an industry requirement that achievement of competency in this unit includes the analysis and critique of a minimum of five different diagnostic tool test results.

Competency must be demonstrated consistently over time, in a suitable range of contexts, and with a productivity-based outcome. No single assessment event or report is sufficient to demonstrate competency in this unit.

Assessment may be conducted in a simulated or real work environment; however, determination of competency requires the application of work practices under workplace conditions.

The mandatory equipment and materials used to gather evidence for assessment include:

equipment:

computer

word processing software

statistical software

internet connection

materials:

critique of test results

portfolio of sample reports of diagnostic test results
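The statistical software listed above need only support basic descriptive statistics; a spreadsheet or scripting environment is sufficient. As a purely illustrative sketch (the readings and benchmark value below are hypothetical, not drawn from any instrument specification), a candidate might screen drill-resistance readings against a documented benchmark before critiquing a report:

```python
# Illustrative only: all values are hypothetical. Real thresholds must come
# from the tool's documented output ranges and peer-reviewed benchmarks.
from statistics import mean, stdev

# Hypothetical drill-resistance readings (relative amplitude) from a sample report
readings = [42.0, 44.5, 41.8, 12.3, 43.9, 43.1]
benchmark_low = 35.0  # assumed lower bound for sound wood; for illustration only

# Flag readings that fall below the benchmark for follow-up in the critique
flagged = [r for r in readings if r < benchmark_low]
print(f"mean={mean(readings):.1f}, sd={stdev(readings):.1f}")
print(f"readings below benchmark: {flagged}")
```

A flagged reading does not by itself establish decay; it identifies a data point the critique must account for against the original interpretation.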

Assessors must satisfy the current Standards for Registered Training Organisations (RTOs) when assessing arboriculture units of competency.

Assessment must be conducted only by persons who have:

arboriculture vocational competencies at least to the level being assessed

current arboriculture industry skills directly relevant to the unit of competency being assessed

Prerequisites/co-requisites
Competency Field
Development and validation strategy and guide for assessors and learners

Student Learning Resources: Handouts, Activities, Slides, PPT

Assessments: Assessment 1, Assessment 2, Assessment 3, Assessment 4
Elements of Competency and Performance Criteria
Element: Compile diagnostic tool knowledge requirements
  • Compile a portfolio of sample reports of diagnostic test results
  • Compile peer reviewed papers on the use of each diagnostic tool
  • Review the diagnostic tool specifications and user manuals
  • Research output ranges for each diagnostic tool
  • Identify key thresholds and benchmarks
  • Determine suitability of tool selection for purpose of test
       
Element: Analyse the data and testing methods
  • Identify and evaluate testing methods used
  • Assess relevance, benefits and limitations of methodology used
  • Determine assumptions used
  • Determine currency of the equipment, software or system used
  • Access raw data and testing evidence where available
  • Verify data relevance and accuracy
  • Review references cited in the report
  • Record non-conforming practices and treatments
  • Identify instances of incorrect application of diagnostic tools
  • Highlight unsupported statements and factual errors
  • Detail significant omissions, errors and ambiguities
  • Detail inconsistencies and errors of logic
  • Identify variances to specifications
  • Detail incorrect use of arboricultural terminology
       
Element: Evaluate and critique test results
  • Assess the suitability of the testing process as fit for purpose
  • Analyse the test results
  • Compare evaluation with original interpretation
  • Consider and account for anomalies
  • Determine the validity of outcomes of original report
  • Develop substantiated positions to inform critical analysis of test results
  • Determine further testing required to verify or falsify results
  • Document feedback on original results
       
Element: Prepare critique and document the report
  • Compile the analysis
  • Review the completeness and accuracy of the analysis
  • Record analysis outcomes and rationale
  • Document the analytical processes
  • Provide alternative analysis and conclusion
  • Document critique in a report
  • Present report in agreed format and within agreed timelines
       


Evidence Required

List the assessment methods to be used, and the context and resources required for assessment. Copy and paste the relevant sections from the evidence guide below, then rewrite them in plain English.

Element

Performance criteria

Elements describe the essential outcomes.

Performance criteria describe the performance needed to demonstrate achievement of the element.

1. Compile diagnostic tool knowledge requirements

1.1 Compile a portfolio of sample reports of diagnostic test results

1.2 Compile peer reviewed papers on the use of each diagnostic tool

1.3 Review the diagnostic tool specifications and user manuals

1.4 Research output ranges for each diagnostic tool

1.5 Identify key thresholds and benchmarks

1.6 Determine suitability of tool selection for purpose of test

2. Analyse the data and testing methods

2.1 Identify and evaluate testing methods used

2.2 Assess relevance, benefits and limitations of methodology used

2.3 Determine assumptions used

2.4 Determine currency of the equipment, software or system used

2.5 Access raw data and testing evidence where available

2.6 Verify data relevance and accuracy

2.7 Review references cited in the report

2.8 Record non-conforming practices and treatments

2.9 Identify instances of incorrect application of diagnostic tools

2.10 Highlight unsupported statements and factual errors

2.11 Detail significant omissions, errors and ambiguities

2.12 Detail inconsistencies and errors of logic

2.13 Identify variances to specifications

2.14 Detail incorrect use of arboricultural terminology

3. Evaluate and critique test results

3.1 Assess the suitability of the testing process as fit for purpose

3.2 Analyse the test results

3.3 Compare evaluation with original interpretation

3.4 Consider and account for anomalies

3.5 Determine the validity of outcomes of original report

3.6 Develop substantiated positions to inform critical analysis of test results

3.7 Determine further testing required to verify or falsify results

3.8 Document feedback on original results

4. Prepare critique and document the report

4.1 Compile the analysis

4.2 Review the completeness and accuracy of the analysis

4.3 Record analysis outcomes and rationale

4.4 Document the analytical processes

4.5 Provide alternative analysis and conclusion

4.6 Document critique in a report

4.7 Present report in agreed format and within agreed timelines

It is an industry requirement that achievement of competency in this unit includes the analysis and critique of a minimum of five different diagnostic tool test results. The candidate must be assessed on their ability to integrate and apply the performance requirements of this unit in a workplace setting. Performance must be demonstrated consistently over time and in a suitable range of contexts.

The candidate must provide evidence for and demonstrate:

compiling a portfolio of sample reports of diagnostic test results including: dynamic and static loading, drill resistance measurement device, sap flow measurements, electronic impedance, chlorophyll fluorescence, increment core, sonic tomography, radar imaging system, bulk density, laboratory, soil test and pH tests

compiling peer reviewed papers on the use of each diagnostic tool

reviewing the diagnostic tool specifications and user manuals

researching output ranges for each diagnostic tool

identifying key thresholds and benchmarks

determining suitability of tool selection for purpose of test

identifying and evaluating testing methods used

assessing relevance, benefits and limitations of methodology used

determining assumptions used

determining currency of the equipment, software or system used

accessing raw data and testing evidence where available

verifying data relevance and accuracy

reviewing references cited in the report

recording non-conforming practices and treatments

identifying instances of incorrect application of diagnostic tools

highlighting unsupported statements and factual errors

detailing significant omissions, errors and ambiguities

detailing inconsistencies and errors of logic

identifying variances to specifications

detailing incorrect use of arboricultural terminology

assessing the suitability of the testing process as fit for purpose

analysing the test results

comparing evaluation with original interpretation

considering and accounting for anomalies

determining the validity of outcomes of original report

developing substantiated positions to inform critical analysis of test results

determining further testing required to verify or falsify results

documenting feedback on original results

compiling the analysis

reviewing the completeness and accuracy of the analysis

recording analysis outcomes and rationale

documenting the analytical processes

providing alternative analysis and conclusion

documenting critique in a report

presenting report in agreed format and within agreed timelines

use of industry standard terminology to describe diagnostic tools and tests.
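Several of the evidence requirements above, notably verifying data accuracy and accounting for anomalies, lend themselves to simple statistical screening. The sketch below is hypothetical: the sonic tomography velocities and the two-standard-deviation cut-off are assumptions chosen for illustration, not values from any real report or instrument:

```python
# Hypothetical sketch: screening reported sonic tomography velocities for
# anomalies before critiquing the original interpretation. All values and
# the 2-sigma cut-off are assumptions for illustration only.
from statistics import mean, stdev

velocities = [1810, 1795, 1822, 1788, 960, 1801, 1815]  # m/s, assumed data

mu, sigma = mean(velocities), stdev(velocities)
# Any reading more than two standard deviations from the mean is flagged
# as an anomaly the critique must consider and account for.
anomalies = [v for v in velocities if abs(v - mu) > 2 * sigma]
print(f"mean={mu:.0f} m/s, anomalies to account for: {anomalies}")
```

An anomaly flagged this way might reflect genuine decay, sensor placement error, or a data-entry mistake; the critique must weigh each possibility rather than discard the value.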

The candidate must demonstrate knowledge of:

portfolio of sample reports of diagnostic test results

peer reviewed papers on the use of each diagnostic tool

diagnostic tool specifications

user manuals

output ranges

key thresholds

benchmarks

suitability of tool selection

purpose of test

testing methods

methodology

raw data

testing evidence

cited references

data relevance

data accuracy

non-conforming practices

non-conforming treatments

unsupported statements

factual errors

significant omissions, errors and ambiguities

variances to specifications

incorrect use of arboricultural terminology

suitability of the testing process

fit for purpose

analysis of results

anomalies

validity of outcomes

verification and falsification of results

alternative analysis

critiquing.


Submission Requirements

List each assessment task's title, type (e.g. project, observation/demonstration, essay, assignment, checklist) and due date here.

Assessment task 1: [title]      Due date:

(add new lines for each of the assessment tasks)


Assessment Tasks

Copy and paste from the following data to produce each assessment task. Write these in plain English and spell out how, when and where the task is to be carried out, under what conditions, and what resources are needed. Include guidelines about how well the candidate must perform each task for it to be judged satisfactory.

Element

Performance criteria

Elements describe the essential outcomes.

Performance criteria describe the performance needed to demonstrate achievement of the element.

1. Compile diagnostic tool knowledge requirements

1.1 Compile a portfolio of sample reports of diagnostic test results

1.2 Compile peer reviewed papers on the use of each diagnostic tool

1.3 Review the diagnostic tool specifications and user manuals

1.4 Research output ranges for each diagnostic tool

1.5 Identify key thresholds and benchmarks

1.6 Determine suitability of tool selection for purpose of test

2. Analyse the data and testing methods

2.1 Identify and evaluate testing methods used

2.2 Assess relevance, benefits and limitations of methodology used

2.3 Determine assumptions used

2.4 Determine currency of the equipment, software or system used

2.5 Access raw data and testing evidence where available

2.6 Verify data relevance and accuracy

2.7 Review references cited in the report

2.8 Record non-conforming practices and treatments

2.9 Identify instances of incorrect application of diagnostic tools

2.10 Highlight unsupported statements and factual errors

2.11 Detail significant omissions, errors and ambiguities

2.12 Detail inconsistencies and errors of logic

2.13 Identify variances to specifications

2.14 Detail incorrect use of arboricultural terminology

3. Evaluate and critique test results

3.1 Assess the suitability of the testing process as fit for purpose

3.2 Analyse the test results

3.3 Compare evaluation with original interpretation

3.4 Consider and account for anomalies

3.5 Determine the validity of outcomes of original report

3.6 Develop substantiated positions to inform critical analysis of test results

3.7 Determine further testing required to verify or falsify results

3.8 Document feedback on original results

4. Prepare critique and document the report

4.1 Compile the analysis

4.2 Review the completeness and accuracy of the analysis

4.3 Record analysis outcomes and rationale

4.4 Document the analytical processes

4.5 Provide alternative analysis and conclusion

4.6 Document critique in a report

4.7 Present report in agreed format and within agreed timelines

Copy and paste from the following performance criteria to create an observation checklist for each task. When you have finished writing your assessment tool, every one of these must have been addressed, preferably several times in a variety of contexts. To ensure this occurs, download the assessment matrix for the unit, enter each assessment task as a column header, and place a check mark against each performance criterion that the task addresses.

Observation Checklist

Tasks to be observed according to workplace/college/TAFE policy and procedures, relevant legislation and Codes of Practice | Yes | No | Comments/feedback
Compile a portfolio of sample reports of diagnostic test results 
Compile peer reviewed papers on the use of each diagnostic tool 
Review the diagnostic tool specifications and user manuals 
Research output ranges for each diagnostic tool 
Identify key thresholds and benchmarks 
Determine suitability of tool selection for purpose of test 
Identify and evaluate testing methods used 
Assess relevance, benefits and limitations of methodology used 
Determine assumptions used 
Determine currency of the equipment, software or system used 
Access raw data and testing evidence where available 
Verify data relevance and accuracy 
Review references cited in the report 
Record non-conforming practices and treatments 
Identify instances of incorrect application of diagnostic tools 
Highlight unsupported statements and factual errors 
Detail significant omissions, errors and ambiguities 
Detail inconsistencies and errors of logic 
Identify variances to specifications 
Detail incorrect use of arboricultural terminology 
Assess the suitability of the testing process as fit for purpose 
Analyse the test results 
Compare evaluation with original interpretation 
Consider and account for anomalies 
Determine the validity of outcomes of original report 
Develop substantiated positions to inform critical analysis of test results 
Determine further testing required to verify or falsify results 
Document feedback on original results 
Compile the analysis 
Review the completeness and accuracy of the analysis 
Record analysis outcomes and rationale 
Document the analytical processes 
Provide alternative analysis and conclusion 
Document critique in a report 
Present report in agreed format and within agreed timelines 

Forms

Assessment Cover Sheet

AHCARB603 - Interpret diagnostic test results
Assessment task 1: [title]

Student name:

Student ID:

I declare that the assessment tasks submitted for this unit are my own work.

Student signature:

Result: Competent / Not yet competent

Feedback to student

 

 

 

 

 

 

 

 

Assessor name:

Signature:

Date:


Assessment Record Sheet

AHCARB603 - Interpret diagnostic test results

Student name:

Student ID:

Assessment task 1: [title] Result: Competent / Not yet competent

(add lines for each task)

Feedback to student:

 

 

 

 

 

 

 

 

Overall assessment result: Competent / Not yet competent

Assessor name:

Signature:

Date:

Student signature:

Date: