Application
This unit of competency describes the skills and knowledge required to interpret diagnostic test results and evaluate and critique the testing methods and results.
This unit applies to individuals with broad theoretical and technical knowledge of a specific area or a broad field of work and learning. These individuals use cognitive, technical and communication skills to demonstrate autonomy, judgement and defined responsibility in undertaking complex work within broad parameters and to provide specialist advice and functions.
The role involves the self-directed application of specialised knowledge in tree anatomy, physiology, pathology, tree dynamics and the edaphic environment with substantial depth in areas such as diagnostic tool application methods and analysis of diagnostic test results.
No occupational licensing, legislative or certification requirements are known to apply to this unit at the time of publication.
Elements and Performance Criteria
Elements describe the essential outcomes. Performance criteria describe the performance needed to demonstrate achievement of the element.

1. Compile diagnostic tool knowledge requirements
1.1 Compile a portfolio of sample reports of diagnostic test results
1.2 Compile peer reviewed papers on the use of each diagnostic tool
1.3 Review the diagnostic tool specifications and user manuals
1.4 Research output ranges for each diagnostic tool
1.5 Identify key thresholds and benchmarks
1.6 Determine suitability of tool selection for purpose of test

2. Analyse the data and testing methods
2.1 Identify and evaluate testing methods used
2.2 Assess relevance, benefits and limitations of methodology used
2.3 Determine assumptions used
2.4 Determine currency of the equipment, software or system used
2.5 Access raw data and testing evidence where available
2.6 Verify data relevance and accuracy
2.7 Review references cited in the report
2.8 Record non-conforming practices and treatments
2.9 Identify instances of incorrect application of diagnostic tools
2.10 Highlight unsupported statements and factual errors
2.11 Detail significant omissions, errors and ambiguities
2.12 Detail inconsistencies and errors of logic
2.13 Identify variances to specifications
2.14 Detail incorrect use of arboricultural terminology

3. Evaluate and critique test results
3.1 Assess the suitability of the testing process as fit for purpose
3.2 Analyse the test results
3.3 Compare evaluation with original interpretation
3.4 Consider and account for anomalies
3.5 Determine the validity of outcomes of original report
3.6 Develop substantiated positions to inform critical analysis of test results
3.7 Determine further testing required to verify or falsify results
3.8 Document feedback on original results

4. Prepare the critique report
4.1 Compile the analysis
4.2 Review the completeness and accuracy of the analysis
4.3 Record analysis outcomes and rationale
4.4 Document the analytical processes
4.5 Provide alternative analysis and conclusion
4.6 Document critique in a report
4.7 Present report in agreed format and within agreed timelines
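The threshold and output-range checks described in elements 1 to 3 can be sketched in software. This is a hypothetical illustration only: the tool names, output ranges and thresholds below are invented placeholders, not values from this unit; real figures must come from each tool's specifications and the peer reviewed literature the candidate compiles.

```python
# Hypothetical sketch: screening diagnostic readings against a tool's
# documented output range and an alert threshold. TOOL_SPECS values are
# illustrative assumptions, not figures from any real tool specification.

# Documented output range (min, max) and an example alert threshold per tool.
TOOL_SPECS = {
    "drill_resistance": {"range": (0.0, 100.0), "threshold": 20.0},
    "sonic_tomography": {"range": (0.0, 100.0), "threshold": 30.0},
}

def check_reading(tool, value):
    """Classify a reading as 'out_of_range', 'below_threshold' or 'ok'."""
    spec = TOOL_SPECS[tool]
    lo, hi = spec["range"]
    if not (lo <= value <= hi):
        return "out_of_range"     # possible mis-calibration or data error
    if value < spec["threshold"]:
        return "below_threshold"  # may indicate decay; warrants review
    return "ok"

# Screen a batch of readings from sample reports and collect the flags.
readings = [("drill_resistance", 15.0), ("sonic_tomography", 120.0),
            ("drill_resistance", 55.0)]
flags = [(tool, value, check_reading(tool, value)) for tool, value in readings]
```

A reading outside the documented output range would be recorded as a possible incorrect application of the tool (performance criterion 2.9), while an in-range reading below a benchmark would be carried into the critique as evidence for analysis.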
Evidence of Performance
It is an industry requirement that competency in this unit be demonstrated through the analysis and critique of a minimum of five different diagnostic tool test results. The candidate must be assessed on their ability to integrate and apply the performance requirements of this unit in a workplace setting. Performance must be demonstrated consistently over time and in a suitable range of contexts.
The candidate must provide evidence for and demonstrate:
compiling a portfolio of sample reports of diagnostic test results including: dynamic and static loading, drill resistance measurement device, sap flow measurements, electronic impedance, chlorophyll fluorescence, increment core, sonic tomography, radar imaging system, bulk density, laboratory, soil test and pH tests
compiling peer reviewed papers on the use of each diagnostic tool
reviewing the diagnostic tool specifications and user manuals
researching output ranges for each diagnostic tool
identifying key thresholds and benchmarks
determining suitability of tool selection for purpose of test
identifying and evaluating testing methods used
assessing relevance, benefits and limitations of methodology used
determining assumptions used
determining currency of the equipment, software or system used
accessing raw data and testing evidence where available
verifying data relevance and accuracy
reviewing references cited in the report
recording non-conforming practices and treatments
identifying instances of incorrect application of diagnostic tools
highlighting unsupported statements and factual errors
detailing significant omissions, errors and ambiguities
detailing inconsistencies and errors of logic
identifying variances to specifications
detailing incorrect use of arboricultural terminology
assessing the suitability of the testing process as fit for purpose
analysing the test results
comparing evaluation with original interpretation
considering and accounting for anomalies
determining the validity of outcomes of original report
developing substantiated positions to inform critical analysis of test results
determining further testing required to verify or falsify results
documenting feedback on original results
compiling the analysis
reviewing the completeness and accuracy of the analysis
recording analysis outcomes and rationale
documenting the analytical processes
providing alternative analysis and conclusion
documenting critique in a report
presenting report in agreed format and within agreed timelines
use of industry standard terminology to describe diagnostic tools and tests.
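The data verification evidence above (performance criterion 2.6) can also be sketched: recomputing a summary value quoted in a report from the raw data supplied with it, using the statistical software listed under assessment conditions. This is a minimal illustration under assumed field names and an assumed tolerance; neither is specified by this unit.

```python
# Hypothetical sketch: verifying that a summary value quoted in a diagnostic
# report matches the raw data supplied with it. The tolerance and the example
# figures are illustrative assumptions, not values from the unit.
import statistics

def verify_reported_mean(raw_values, reported_mean, tolerance=0.05):
    """Recompute the mean from raw data and flag a variance beyond tolerance."""
    actual = statistics.mean(raw_values)
    return {
        "actual_mean": actual,
        "reported_mean": reported_mean,
        "consistent": abs(actual - reported_mean) <= tolerance,
    }

# Example: raw drill-resistance amplitudes versus the figure quoted in a report.
raw = [42.1, 39.8, 41.5, 40.6]
result = verify_reported_mean(raw, reported_mean=45.0)
# A discrepancy ("consistent" is False) would be documented in the critique
# as a factual error or significant variance.
```

The same pattern extends to any recomputable figure in a report; a flagged discrepancy becomes documented evidence of an unsupported statement or factual error rather than an informal impression.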
Evidence of Knowledge
The candidate must demonstrate knowledge of:
portfolio of sample reports of diagnostic test results
peer reviewed papers on the use of each diagnostic tool
diagnostic tool specifications
user manuals
output ranges
key thresholds
benchmarks
suitability of tool selection
purpose of test
testing methods
methodology
raw data
testing evidence
cited references
data relevance
data accuracy
non-conforming practices
non-conforming treatments
unsupported statements
factual errors
significant omissions, errors and ambiguities
variances to specifications
incorrect use of arboricultural terminology
suitability of the testing process
fit for purpose
analysis of results
anomalies
validity of outcomes
verification and falsification of results
alternative analysis
critiquing.
Assessment Conditions
It is an industry requirement that competency in this unit requires the analysis and critique of a minimum of five different diagnostic tool test results.
Competency must be demonstrated consistently over time and in a suitable range of contexts, and assessment must have a productivity-based outcome. No single assessment event or report is sufficient to achieve competency in this unit.
Assessment may be conducted in a simulated or real work environment; however, determination of competency requires the application of work practices under work conditions.
The mandatory equipment and materials used to gather evidence for assessment include:
equipment:
computer
word processing software
statistical software
internet connection
materials:
critique of test results
portfolio of sample reports of diagnostic test results
Assessors must satisfy current standards for RTOs in the assessment of arboriculture units of competency.
Assessment must be conducted only by persons who have:
arboriculture vocational competencies at least to the level being assessed
current arboriculture industry skills directly relevant to the unit of competency being assessed.
Foundation Skills
Foundation Skills essential to performance are explicit in the performance criteria of this unit of competency.
Range Statement
Sectors
Arboriculture (ARB)