To paraphrase the architect and artilleryman Vitruvius, you can't manage a defence force that you can't measure.
I've spent some time asking people how good the Australian Defence Force is, and I'm not convinced anyone can tell me. Or more accurately, people can tell me how good or bad they think the ADF is, but seldom can they offer proof either way. True, it's very hard to measure the performance of a defence force, but we do a particularly bad job of it. Not understanding the ADF's baseline level of performance makes writing the 2013 Defence White Paper that much harder.
The paucity of defence data in Australia, relative to our democratic peers, undermines the creation of an effective Defence performance management system. Despite the information technology revolution, the level of detail in defence budgets and annual reports has decreased significantly in the past five years, as our friends at ASPI have identified repeatedly.
Even parliament struggles to get information on the performance of the ADF. The Joint Standing Committee on Foreign Affairs, Defence and Trade is mandated to regularly review the Defence Annual Report. During last year's review, Defence failed to answer the Committee's 30 questions on notice within its three-month deadline, finally providing answers five months after the committee's hearings. The committee report concluded that, with the exception of the Chief of the Defence Force and the Defence Secretary, 'defence officials seemed poorly briefed and ill-prepared for the hearings'.
Often it seems that Defence fails to provide performance information to parliament and the public not because it doesn't want to, but because it can't. In the past few weeks, a Lowy Institute intern has been trying to obtain details on the numbers of medals issued in the past decade by Defence Honours and Awards. It's one of the most accurate ways to determine how many people the ADF has deployed overseas. After two weeks, Defence concluded it didn't know the answer. The Defence organisation, with sole responsibility for processing Defence medals, can't readily determine how many it has handed out.
The Rizzo report into the sudden collapse of Navy's amphibious fleet last year identified that performance measurement and reporting in Navy was seriously flawed, with simplistic traffic light systems being used to measure complex systems and capabilities (and often, reporting itself was overly optimistic).
The Defence Annual Report uses the 'four tick' system to measure Defence performance across a range of outcomes. The amphibious fleet received two out of a possible four ticks in the 2009-10 report, and the same again in the 2010-11 report. In between, it had collapsed entirely. But by the Defence performance measurement system, two ticks means the outcome was 'substantially achieved. Targets were mostly met and any issues are being managed'. Because we have limited external scrutiny of Defence data, terms like 'mostly' and simplistic indicators have sufficed in place of proper performance measurement.
Also, sharing information is a new cultural concept in Defence (I should acknowledge here the excellent work of the Defence Freedom of Information team who, though very professional and helpful, are a nascent part of the organisation). Ask for information on something as simple and factual as the number of medals awarded, and you'll be met with caginess. Imagine how difficult it is to get data on more complex issues.
But I think those inside Defence have just as much difficulty acquiring organisational information as those outside, and for the same reasons. How else to explain the constant difficulty Defence has in responding to requests for information?
So what to do about it? The UK Ministry of Defence has an excellent model that Australia might do well to adopt. Its Defence Analytical Services and Advice agency, established in 1993, is a hybrid of economists, statisticians and defence policy staff tasked with generating statistics and analysis on defence performance and activity. It provides publicly available defence data and cuts across a range of internal defence silos to pull together information useful for both internal decision-making and external scrutiny. A similar agency in Australia could help provide better performance data, both to baseline ADF capability and to track the effectiveness of reform efforts.
Without the data, it's difficult to know how Defence is faring. But my hunch is that our Defence Force may not be as good as the country (or our allies, for that matter) thinks it is.
Photo courtesy of the Defence Department.