Evaluation of professional development must be built into the design process to measure quality and effectiveness, not just the number of students and the funds allocated.
A report into recruitment and training in the Australian Public Service emphasised the lack of evaluation in public sector training and education. One gap identified in the report was evaluation of the long-term effects of programs, as most evaluations relied on more immediate participant responses.
This concern is echoed by Kelly, who recommends four levels of evaluation:
1. participant reaction, through evaluation forms;
2. learning, through feedback to identify what participants thought they had learnt;
3. behaviours and attitudes, through longer-term feedback on what participants did once back in their own workplaces; and
4. impact on the organisation, through strategic-level assessment of the longer-term benefits of educational programs.
Numerous other evaluation models are available. Qualitative evaluation models suggest asking reflective questions that consider the wider context of the particular program. Such models gather information by asking what is working and what is not, rather than only what happened.
For professional development, evaluation could ask questions like:
· Is the professional development program continuing to be relevant to the organisation and to the profession?
· Are employees or association members changing, and do they have different expectations?
· What is happening in the profession or in other environments that is not being covered in the program?
Conclusion
If competency standards are seen as inflexible and unresponsive to different work environments, constantly changing practices and new challenges, they will be rejected as irrelevant for professional development. They will also be rejected by tertiary and vocational institutions. We must guard against narrow, mechanistic competency standards becoming the only measure of learning, as these will not help the archives and records profession adapt to new challenges and remain viable into the future.
Archival institutions working with competency standards for professional development must ensure that their objectives are clear and that any frameworks they develop do not stifle enthusiasm for learning or offer chequered, unconnected pathways. Such institutions want expertise, but they also need thinkers who will use foresight to explore plausible futures or directions. If competency standards are to be useful to the profession in the long term, they have to support critical thinking about concepts and procedures and accept that these cannot be measured through demonstrated performance or quantitative indicators alone.
82 Australian Public Service Commission (APSC) & Australian National Audit Office (ANAO), 'Building Capability: A Framework for Managing Learning and Development in the Australian Public Service', APSC website: http://www.apsc.gov.au/publications03/capability.htm (accessed 11 June 2004), p. 24; Cox, p. 143.
83 OECD, Lifelong Learning Highlights, p. 10.
84 Senate Finance and Public Administration References Committee, 'Recruitment and Training in the Australian Public Service', 2003, Senate website (Australia): http://www.aph.gov.au/senate/committee/fapa_ctte/aps_recruit_training/index.htm (accessed 24 May 2004), p. 145.
85 D Kelly, 'Planning and running events', in P Kahn & P Baume (eds), A Guide to Staff and Educational Development, Kogan Page, London, 2003, p. 49.
86 APSC & ANAO; J Owen & P Rogers, Program Evaluation: Forms and Approaches, Allen & Unwin, St Leonards, Sydney, 1999; D Baume, 'Monitoring and evaluating staff and educational development', in P Kahn & P Baume (eds), A Guide to Staff and Educational Development, Kogan Page, London, 2003, p. 93.