Google Links

Follow the links below to find material targeted to the unit's elements, performance criteria, and required skills and knowledge.

Elements and Performance Criteria

  1. Compile diagnostic tool knowledge requirements
  2. Analyse the data and testing methods
  3. Evaluate and critique test results
  4. Prepare critique and document the report

Range Statement


Performance Evidence

It is an industry requirement that competency in this unit be demonstrated through the analysis and critique of a minimum of five different diagnostic tool test results. The candidate must be assessed on their ability to integrate and apply the performance requirements of this unit in a workplace setting. Performance must be demonstrated consistently over time and in a suitable range of contexts.

The candidate must provide evidence for and demonstrate:

compiling a portfolio of sample reports of diagnostic test results including: dynamic and static loading, drill resistance measurement device, sap flow measurements, electronic impedance, chlorophyll fluorescence, increment core, sonic tomography, radar imaging system, bulk density, laboratory soil tests and pH tests

compiling peer-reviewed papers on the use of each diagnostic tool

reviewing the diagnostic tool specifications and user manuals

researching output ranges for each diagnostic tool

identifying key thresholds and benchmarks

determining suitability of tool selection for purpose of test

identifying and evaluating testing methods used

assessing relevance, benefits and limitations of methodology used

determining assumptions used

determining currency of the equipment, software or system used

accessing raw data and testing evidence where available

verifying data relevance and accuracy

reviewing references cited in the report

recording non-conforming practices and treatments

identifying instances of incorrect application of diagnostic tools

highlighting unsupported statements and factual errors

detailing significant omissions, errors and ambiguities

detailing inconsistencies and errors of logic

identifying variances to specifications

detailing incorrect use of arboricultural terminology

assessing the suitability of the testing process as fit for purpose

analysing the results

comparing evaluation with original interpretation

considering and accounting for anomalies

determining the validity of outcomes of original report

developing substantiated positions to inform critical analysis of test results

determining further testing required to verify or falsify results

documenting feedback on original results

compiling the analysis

reviewing the completeness and accuracy of the analysis

recording analysis outcomes and rationale

documenting the analytical processes

providing alternative analysis and conclusion

documenting critique in a report

presenting report in agreed format and within agreed timelines

using industry-standard terminology to describe diagnostic tools and tests.


Knowledge Evidence

The candidate must demonstrate knowledge of:

portfolio of sample reports of diagnostic test results

peer-reviewed papers on the use of each diagnostic tool

diagnostic tool specifications

user manuals

output ranges

key thresholds

benchmarks

suitability of tool selection

purpose of test

testing methods

methodology

raw data

testing evidence

cited references

data relevance

data accuracy

non-conforming practices

non-conforming treatments

unsupported statements

factual errors

significant omissions, errors and ambiguities

variances to specifications

incorrect use of arboricultural terminology

suitability of the testing process

fit for purpose

analysis of results

anomalies

validity of outcomes

verification and falsification of results

alternative analysis

critiquing.