
In this presentation, Dr Sharron O'Neill brings together the contrasting worlds of corporate accounting and work health and safety. She argues that accurate, timely and useful corporate reporting on work health and safety should be provided to boards and the stock exchange.

Who is this presentation for?

This presentation will be of interest to anyone concerned with corporate social responsibility and good business governance.

About the presenter

Dr Sharron O'Neill is a research fellow at the International Governance and Performance Research Centre, Macquarie University.

Sharron’s research focuses on corporate governance and accountability, especially corporate social and non-financial performance with a specialist interest in work health and safety risk and performance measurement.

Sharron's collaborative research projects bring together the safety and accounting professions with industry, government and academia. She is a member of CPA Australia, the Safety Institute of Australia and the National Safety Council of Australia. Before academia, Sharron had an established career as a financial accountant.


Demonstrating leadership and corporate social responsibility in Annual Reports

Dr Sharron O'Neill 

IGAP Research Centre, Macquarie University

Dr Sharron O'Neill: 

Thank you for inviting me here to speak today about demonstrating leadership and corporate social responsibility in annual reports. A quick overview of the presentation today: first, I'll speak a little bit about the history of annual report disclosures on work health and safety, and then give you some insights we've found in our research at Macquarie University into current practices of reporting on work health and safety, some examples of what's been done well, some examples of things that haven't been done so well, and that gives us some insights into how we can actually improve work health and safety reporting in annual reports.

So why the annual report? The annual report is an excellent vehicle for reporting on work health and safety. It's the main medium that we use to communicate with our external stakeholders and it's increasingly being seen as a vehicle for investors to find out about the way organisations manage risk, not just the financial information, but the non-financial information such as work health and safety. In the research that we have conducted, stakeholders see the annual report as a particularly credible source of information for reporting on work health and safety, more so than sustainability reports and more so than websites and other forms of corporate media.

There's actually quite a long history of reporting work health and safety information in annual reports. For example, I spent some time at a library in Canberra looking at the 1820s annual reports of the Australian Agricultural and Pastoral Company, where they talked about people who had been off work due to sickness, injury and illness, and this is 1823 to 1825. Now even back then, although there was some reporting, it was very sporadic, and there was the occasional mention of gases in mines and those sorts of things, but it was really about trying to highlight where companies had lost money due to illness and injury or due to working conditions.

In the 1970s, '80s and '90s, we started to see a real move towards corporate social responsibility, and with that, there was an explosion in the number of companies reporting on health and safety in annual reports. There was increasing attention, but it's really become mainstream in the last 10 to 15 years, and it's not only that companies are now reporting on health and safety; the public has come to expect it.

So, the issue is the quality. The quality of that information in general is quite poor. In terms of the early research, when we first started looking at annual report information on work health and safety, and this is particularly in the accounting discipline, in the 1970s and '80s there was a lot of focus on "Is the information there?" So, "Do companies report on health and safety, yes or no?" We started counting pages and sentences and numbers of words, because that gave an indication of how much focus or importance the companies placed on those particular disclosures and on that information.

Then attention started to turn towards the quality of information: not just whether it was there, but what sort of information was being provided. So organisations' reports were being assessed and analysed in terms of whether the information was qualitative or quantitative, whether it was full of motherhood statements and claims like "We're best practice," or whether they were actually providing information. So they were looking at "Is there positive news as well as negative news?", "Is there a balance?", "Is there quantitative information as well as qualitative?"

More recently that investigation of quality and quantity has become more targeted and focused. The examination is really looking now at the quality of content: is that information relevant to the users it's intended for? Is it reliable? Is it verified? Motherhood statements like "We look after the work health and safety of our workers" are no longer enough. We now have to go beyond that and we're looking for evidence.

One of the interesting issues around that is the issue of verifiability - independent assurance. I'll talk in a few moments about some of the studies we've been doing, but one of the interesting things that we noticed in some of the reports is we have seen reports where they say, "We're going to link executive bonuses to work health and safety performance and we're doing that because that information is able to be measured objectively, and it's easily verified." 

Yet in other reports we looked at, they had independent assurance of all of the social and environmental information, but health and safety was explicitly excluded from the scope of those audit reports, and so that's been a really interesting issue to look at in terms of why that information is being excluded. In the current research we're doing at the moment, we have two projects looking at work health and safety evaluation and reporting, sponsored by Safe Work Australia. Both are being conducted with the Safety Institute of Australia, and we also have the Institute of Chartered Accountants and CPA Australia involved in those projects. This work is informed by prior research, which has looked at two aspects: one, what do stakeholders want to see in reports, what's important to the different stakeholder groups; and secondly, what are companies actually providing? The examples I'll present today come from the two studies you can see there. One is a study of mining and energy firm disclosures over a period of 10 years in annual and sustainability reports. The other is a study of work health and safety disclosures from a whole range of different industries, large companies in the ASX Top 50.

What we found looking at both of those content analysis studies is that in general there's increasing evidence of reporting on work health and safety in annual reports, although there's a wide variation in the quality of that data, and there is a substantial gap between what the stakeholders are expecting to see and what companies are actually reporting. And so these are the issues that we'd really like to talk about today.

In terms of demonstrating governance, upfront, this is the most important, the first thing we should be looking at: does the company value work health and safety, and how does it express that? When we looked at annual reports, most of the firms stated a commitment to work health and safety. They had a statement in there that talked about their vision. For example, I've got an example there from an annual report, number one: "Nothing is more important to your Directors than ensuring the safety and security of our employees." Now we saw a lot of statements like that. They became increasingly prevalent. However, while they are important and they need to be there, they're also very easy to write. So, they're important but they are not sufficient. They need to be backed up by evidence, and you can see an example we have there from another report which talks about how the committee reports to the board and about the structures, how that governance is enacted: the auditors perform reviews, there are written reports, and it starts to give you some evidence for how governance is actually practised within the organisation. So, adding that level of evidence is important.

Similarly, with reports on strategy, some of the organisations were very vague. "We continue to reinforce our steadfast belief that we must never take the health and safety of our people for granted and the pursuit of zero harm remains our overriding goal." Aside from the arguments about zero harm (we're not going to get into whether it's a goal, a vision or an objective), the idea that we don't want to hurt people is obviously very important. We need to see that, we need to know the company believes it. However, again, some evidence to justify that, to explain it, to show how they go about meeting that objective, is really important.

And so some of the examples we had there, and you can see, talking about what's explained elsewhere in the annual report, how they actually demonstrate what they're doing about meeting that goal, continuous improvement projects, talking about some of their risks, talking about some of the management strategies - these are all very important things for investors and other stakeholders to see. So the key there is providing evidence. 

Now as we moved from the early '90s to the mid '90s, we started to move beyond the motherhood statements and back them up with a little bit of qualitative evidence. More recently we've seen companies start to try and look at leading indicators, and that's a really important way of providing evidence as to the effectiveness of the processes, projects and initiatives that have been put in place to manage health and safety.

In terms of leading and lagging indicators, we see a lot of reports that talk about injury rates, but there are very few that focus on the front end, the leading and lagging indicators of the critical processes used to manage health and safety. So for example, consultation. A leading indicator of consultation might be how many staff we have consulted about health and safety issues. A lagging indicator, or an effectiveness indicator, would say something about how many of the suggestions offered by those staff have actually been adopted.
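As a toy illustration of pairing a leading indicator with an effectiveness indicator, here is a minimal sketch in Python. The function names and all figures are invented for this example; they are not from the studies discussed.

```python
# Hypothetical sketch: a leading indicator (how widely staff were consulted)
# paired with an effectiveness indicator (how many suggestions were adopted).
# All names and figures are invented for illustration.

def consultation_rate(staff_consulted: int, total_staff: int) -> float:
    """Leading indicator: share of staff consulted on health and safety issues."""
    return staff_consulted / total_staff

def adoption_rate(suggestions_adopted: int, suggestions_offered: int) -> float:
    """Effectiveness indicator: share of staff suggestions actually adopted."""
    if suggestions_offered == 0:
        return 0.0
    return suggestions_adopted / suggestions_offered

# Example: 180 of 200 staff consulted, 12 of 40 suggestions adopted.
print(f"consulted: {consultation_rate(180, 200):.0%}")  # 90%
print(f"adopted:   {adoption_rate(12, 40):.0%}")        # 30%
```

The point of the pairing is that a high consultation rate alone says nothing about whether the consultation changed anything; the adoption rate supplies that missing effectiveness signal.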

In the annual reports we tend to see a focus on auditing, so the number of audits conducted. We see a lot of evidence around training conducted. We see very little in terms of the effectiveness of audits or training and there's certainly a lot of room for improvement there, and we look forward to seeing that happen in the future. 

Unfortunately, where a lot of the reports do seem to be stuck, and this has continued over nearly 20 years now, we've seen an increasing number of firms reporting primarily on injury and illness rates. Certainly, not saying that's not important. We need to see that information. However, there's a lot of focus on fatality and lost time injury rates, and unfortunately that's not terribly helpful. We've seen a lot of new companies coming into the realm of reporting on work health and safety look around and say "What's everyone else reporting on?" - lost time injury, so they join that sort of flock of lost time injury advocates which has been a bit of a shame because it's not a useful measure for reporting in annual reports, and I'll just talk a little bit about why. 

Lost time injury rates don't measure safety and they certainly don't measure health and safety. Safety is about ‘the absence of risk of injury and illness’, so the injury rates will tell you if there has been an injury but they won't tell you anything about the risk that’s been there, particularly if injury rates are low. As well as that, lost time injury rates, they're only a subset of injuries and illnesses. They fail to capture most incidents, they don't capture illnesses very well at all and they fail to capture all the work-related injuries, so they're not a valid measure of frequency and they don't capture all of those injuries for a whole range of reasons, whether it's underreporting or whether it's that the injuries are captured in different categories such as medical treatment injury. They're also too aggregated to inform about the consequences of injury. 

A lot of people will say and we see it in annual reports all the time, "We measure lost time injury rates," they say, "because it tells us about our serious injuries," but that's not necessarily true because we could have a permanent hearing loss for example, which is a serious injury, and yet, if that person hasn't had to take an entire day off work due to that hearing loss, it won't be captured as a lost time injury. It will be captured as medical treatment which is perceived by many managers to be a more minor classification. Similarly with musculoskeletal injuries, sometimes you can have injuries where people will be affected for months, years, even whole-of-life, due to say, a back injury or a shoulder injury, and yet they may not be captured as a lost time injury. So they're viewed as a more minor category of injury. 

So in terms of those measures, lost time injuries, if they're not capturing illnesses, they're not capturing severity or frequency very well. We then thought we'd have a look and see, "Does that actually work on a larger scale?" "Can we show - demonstrate that that actually is the case, that they don't capture severity very well?" So in one study that we've completed fairly recently we took all of the data from New South Wales workers compensation over a period of 10 years and we graphed the injuries, we graphed fatalities and we graphed lost time injuries, and you can see there both of those are trending down. And so that demonstrates that New South Wales has been very successful in reducing fatalities and lost time injuries over that period, and that's what we want to see.

We then took exactly the same injuries and reclassified them as either permanently disabling or temporarily disabling, and what we found was that the temporarily disabling injuries went down quite steeply, but the permanent disabilities almost doubled in that period, and that data was completely hidden if we were just looking at lost time injury rates.
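The reclassification exercise can be sketched in a few lines: the same set of injury records, summarised once by lost time and once by permanence. The records below are invented for illustration (they are not the NSW workers compensation data), but they show how a falling lost time count can coexist with rising permanent disabilities.

```python
# Hypothetical sketch: one set of injury records, two summaries.
# Counting lost time injuries shows a decline, while a
# permanent/temporary split reveals permanent disabilities rising.
from collections import Counter

records = [
    # (year, caused_lost_time, permanently_disabling) - invented data
    (2001, True, False), (2001, True, False), (2001, True, True),
    (2005, True, False), (2005, True, True),  (2005, False, True),
    (2010, True, True),  (2010, False, True), (2010, False, True),
]

lti_by_year = Counter(year for year, lost_time, _ in records if lost_time)
permanent_by_year = Counter(year for year, _, permanent in records if permanent)

print("lost time injuries:    ", dict(sorted(lti_by_year.items())))
print("permanent disabilities:", dict(sorted(permanent_by_year.items())))
```

Running this, the lost time count falls from 3 to 1 across the three years while permanent disabilities climb from 1 to 3: the trend that matters most is invisible in the first summary.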

And so this is the problem that we have in industry. Most managers get lost time injury and fatality data, and this is also the information that is most commonly produced in annual reports to investors and other stakeholders. So as a result, the users of those reports aren't able to see the information that could really tell them about the consequences of work health and safety outcomes. 

If we look at the traditional way we would view categories of injury and illness, Heinrich's triangle, Bird's triangle, those kinds of traditional triangle models tell us that we have first aid treatment, which is the most minor, then medical treatment, restricted duties and then lost time injuries, with fatalities at the top. What we see from that is that the lost time injury rate aggregates a whole range of different categories of injury and illness, and a range of different severities. Placing the triangle that way up directs the focus to the bottom of the triangle, the most frequent injuries and illnesses.

And so, in a sense, we're trying to capture the low-hanging fruit. We're trying to prevent the short-term absences and the moderate absences. This is a very production or work focused perspective, because those categories really reflect what the business does: does it offer first aid? Does it offer medical treatment? Does it give the person time off? And it fails to focus on the outcomes, the consequences, both socially, the consequences for the employees, and to a large extent, the consequences financially for the organisation. Because if we look at the cost, the costs of fatalities and permanent disabilities are significantly higher than the costs of short-term absences, and you're talking about differences of over 80,000 for the disabilities down to less than 1,000 for the minor injuries.

So, we've been looking at this alternate model and this is a model developed by Jeff Macdonald many years ago. It's been used around Australia in a number of organisations and it's really looking at reframing the way we view injury and illness. It turns the triangle upside down and it puts the focus on the injuries that are the highest consequence. Irrespective of how much time the person has had off work, it's about time to recovery, not time back to work. So class one are those most debilitating injuries and illnesses that cause a permanent disability or a fatality, or that have a long-term impact on the individual. Class two are the temporary impairments and class three are those that really don't have any sort of life altering difference, more of an inconvenience really. 

So, if we turn the triangle that way and we start looking at severity by saying, "Okay, we've got these total class one plus class two," essentially if you add them together it's going to be something very close to a TRIF – a total recordable injury rate because you've captured all of those injuries that are really everything over first aid, and we then can separate out class one, so we can identify which of those injuries that are going to have the most significant impact. It takes a human perspective. 

When we were looking at annual reporting studies, in the early '90s and mid '90s, we did see a number of organisations that were reporting class one. They didn't call it that, but they were reporting on not only fatalities, but permanently disabling injuries and also those that caused long-term impairment. However, we then saw this move towards lost time injuries and a lot of that permanent disability and serious injury reporting has disappeared. We saw some other issues in the reporting which are worth mentioning I think, in terms of injury and illness data quality, and that was around reliability and comparability issues. For example, different firms reported on different measures - some would report lost time injuries, some would report medical treatment, some would report total recordables, some would report an all injury rate - so you really couldn't compare where they stood, make any sort of meaningful comparisons between the organisations. But it actually went further than that. 

The measures were very unstable and by that I mean, an organisation would produce a report that said, "Our lost time injury rates are 0.7 and that's favourable compared to last year which was 0.9." Then the next year they'd say, "Our lost time injury rate is 0.6 which is better than last year which was 0.7," and then the next year you wouldn't see anything, and suddenly they'd produce something different. "This year we're talking about all injury," and after a few years they come back to reporting on lost time injury and show a trend graph and you realise that the year that they actually stopped reporting on it was a poor result. It had gone up. And so there was all this sort of manipulation in the way that the data was presented that they'd change from one to the other one when it didn't present them in a favourable light. So that was a bit of an issue.

In terms of understanding and interpreting the measures, that was also a bit of an issue, because organisations would use the same name for two different types of measures. So, they would say, "Our LTIFR is this," someone else, "LTIFR is that," but when you look at the definitions they used, they're measuring different things. They're measuring it differently. Some would say, "We have to have our insurance company accept it before it counts as a lost time injury rate." Some would say, "It includes illnesses," others would say it wouldn't. Some would say, "It could be a period of time off work any time after the injury," others would say, "It has to be the very next day." So there's a whole lot of different definitions being applied to the same indicator.

And then we had the curious case of the same definition with different names which happened as well. Lost time incident versus lost time injury - are they the same thing? Is an incident also capturing illnesses or not? Sometimes injuries captured illnesses, sometimes they didn't, so it all got very messy when you're trying to understand what was actually being reported, and many organisations didn't even provide a definition which made it even more difficult to tell what they were measuring. So the use of a glossary in an annual report is absolutely critical to say what it is that you're actually trying to communicate to your readers.

Here's a couple of examples. This is one study that we had done. We looked at annual reports and these are all the different definitions just for lost time injury rates that were produced in those reports, and the interesting thing is some of the companies changed the definition they used. You'll see the one that has the most - 25 different companies reported lost time injury rates in those selected periods, but, if you add the numbers of reporters across, they won't add up to 25 which means that you actually had people using different definitions from the same company over the different years, which just goes to show how tricky it all gets.

Some of the other issues that we found - and this is just things to ensure that the quality of your data is being produced to a high standard - checking that the numbers add up. Sometimes we had different results reported in an annual report and a sustainability report for the same year. So they would issue the two reports at the same time and one would say, "Our lost time injury rate is…" – you know – "...0.4," and the other one would say, "It's 0.5," and it's for the same period. Now there may have been a reason for that. It may be that they're looking at slightly different time periods, but that sort of thing needs an explanation if you're going to do that. We even had examples where they had two different rates in the same report and one example with two different rates on the same page, and I think this is where we can really see examples of companies that cut-and-paste exactly the same information from year-to-year and just try and do an 'edit replace' on the numbers and it's a very good advertisement for actually thinking through your disclosures and not just doing that cut-and-paste because that really does show out with these sorts of things. 

One of the other problems that we saw - and again, it could be a genuine problem, it's not necessarily a manipulation, but it doesn't instil confidence in a reader - is when the reports would say, "This year our results are 0.7 compared to last year which was 0.9," and you would go and look at last year's and last year's was 0.6. There was one company in particular that we looked at that had that kind of thing happen every single year. The only year they didn't do that was the one and only year that the data was independently verified, and in that year it actually agreed with what was said the year before, which does two things really - it tells us about the value of independent verification and it also tells us that we need to be a little bit critical where the data that's coming out doesn't really make sense.

The other thing we noticed was that sometimes the narratives around the injury and illness data were a little bit interesting. An increase in injury rates in one report was described as "results plateaued". Another report said an increase in lost time injury rates was "a slight improvement", which was a concern. So, I think if health and safety information is going in the reports, we do need to make sure that the health and safety people get a good look at it before it goes out, because those sorts of issues could be picked up by people with a little bit more knowledge about health and safety itself.

In terms of completeness, consistency is also really important. Fatalities, for example: companies were reporting fatalities when there were none and then saying, "We had none this year compared to one last year," but last year there wasn't one reported. Contractors: this is, I think, an important issue that's come out with the new Work Health and Safety Act and the idea of a PCBU. In the previous reports, particularly around the late '90s and the early 2000s, companies would report employee and contractor injury data separately when the contractors' performance was worse than theirs, and they would report them together when the employees had poorer injury performance than the contractors. I think the fact that we now have the PCBU concept is a way of demonstrating the need to view contractors as part of the organisation and take that level of responsibility a bit higher. So I think that's a really useful thing, and I've already mentioned the permanent and temporary disability data sometimes being there and then not.

A little bit of an example about graphs. Here are two graphs from two consecutive annual reports. It's interesting to look at '03 and '04 in the two: the scales aren't aligned, and you wouldn't think '04 is actually the same performance in the two graphs because of the way they've been presented. Consistency in the way the data is presented would be good. Little numbers on top to tell us exactly what you mean would be even better, and it wasn't until about 2008, I think, that we got information about the actual numbers behind those injury rates.

When we asked stakeholders what sort of financial information they wanted to see on work health and safety, that produced a really interesting result. Stakeholders were very clear that they wanted to see information about the financial costs of work health and safety failure. They wanted to know about fines and penalties. They wanted to know about the legal costs. Some wanted to know about workers compensation, not quite so much, but they were very clear that they didn't want - most of them didn't want reporting on costs to do with prevention and there was a genuine fear I think, coming through in the research that if companies started reporting how much money they were spending on prevention two things would happen – one, there'd be a focus on, "Are we spending too much?" and secondly, that there would be a whole exercise of trying to tease out health and safety costs from other cost categories which would be really not very useful in terms of adding value to the organisation. And, so I think the focus there for costs is really around the failure costs and that's important for a number of reasons.

We saw with BP Deepwater Horizon the response from investors the year after the oil well explosion: the investors at the annual general meeting were saying, "Why aren't we getting the information we need on work health and safety?", "Why didn't we know the state this organisation was in, in terms of its health and safety performance?", and I think that's really important to keep in mind. In terms of lost productivity, the lost time injury rate, if anything, is a productivity measure. It's certainly not a safety measure, and it's not really a good injury measure. More than anything it's a productivity measure, but it's only a partial productivity measure, because it tells you how many injuries were reported that actually cost lost productivity. It doesn't tell you how much was lost, so if you are going to report it as a productivity measure, it needs to be matched with a measure of lost work days. It's like producing financial statements: you have a balance sheet and an income statement telling you about two different perspectives on your financial performance. The same with lost time injury rates: you would need to know the number, the frequency, and also the severity.
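The frequency-plus-severity pairing can be sketched as a small calculation. This is a hypothetical illustration: normalising LTIFR per million hours worked is a common convention in Australian reporting, but the figures below are invented.

```python
# Hypothetical sketch: a frequency measure (LTIFR) paired with a severity
# measure (lost days per lost time injury), echoing the balance sheet /
# income statement analogy. Figures are invented for illustration.

def ltifr(lost_time_injuries: int, hours_worked: float) -> float:
    """Lost time injury frequency rate per 1,000,000 hours worked
    (a common Australian reporting convention)."""
    return lost_time_injuries * 1_000_000 / hours_worked

def severity(total_days_lost: int, lost_time_injuries: int) -> float:
    """Average work days lost per lost time injury."""
    return total_days_lost / lost_time_injuries

# Example: 8 lost time injuries over 2 million hours, 96 days lost in total.
print(f"LTIFR:    {ltifr(8, 2_000_000):.1f} per million hours")  # 4.0
print(f"severity: {severity(96, 8):.1f} days per LTI")           # 12.0
```

Reporting only the first number hides whether those 8 injuries each cost an afternoon or a year; the two together give the fuller picture the speaker is asking for.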

Better still, OSHA in the United States has a measure called the DART rate, which stands for Days Away, Restricted or Transferred, and that gives a more complete, holistic measure of lost time. It avoids some of the issues we see with manipulation, such as things being treated as a restricted day when it should be a lost day, or those sorts of temptations.
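A minimal sketch of the DART calculation, using OSHA's conventional normalisation to 200,000 hours (roughly 100 full-time workers for a year). The example figures are invented.

```python
# Sketch of OSHA's DART rate (Days Away, Restricted or Transferred),
# conventionally normalised to 200,000 hours worked. Example figures
# are invented for illustration.

def dart_rate(dart_cases: int, hours_worked: float) -> float:
    """DART cases per 200,000 hours worked (the OSHA convention)."""
    return dart_cases * 200_000 / hours_worked

# Example: 6 cases involving days away, restriction or transfer
# across 400,000 employee hours.
print(f"DART rate: {dart_rate(6, 400_000):.1f}")  # 3.0
```

Because a restricted or transferred day counts the same as a day away, there is no incentive to reclassify a lost day as a restricted one, which is the manipulation the speaker mentions.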

In terms of comparability over time, as we looked at the annual reports' trends, one of the key conclusions, I think, is that there's a real desire among organisations to be comparable, to be able to benchmark, to provide information that's consistent, and I think that's genuine and a useful thing. However, what we've seen is that the push for comparability and consistency has really resulted in a reduction of standards down to the lost time injury rate, and that's where the comparability has ended up, and I think that's a shame. So, instead of reporting a range of things specific to different organisations, we now see a real focus on lost time injury and fatality measures. There is an increasing focus, just in the last few years, on a total recordable measure instead of lost time injuries, and I think that's a good move. It is a more aggregated measure, so it does need severity data to accompany it, and it's really important that the two come together. Without that, what we end up with is measures that hide the human and social impact and don't tell us anything about the effectiveness of work health and safety systems.

I think that last point is the real opportunity for the future. It's really about trying to develop ways of reporting that are able to communicate the work health and safety systems, the effectiveness of those systems, and the actions being taken. There are very few firms doing that well at the moment. There are some, but very few.

So, demonstrating leadership, coming back to our topic, demonstrating leadership and corporate social responsibility: first of all, I think it's important to recognise who the users of the report are. We need to recognise that investors are important users. Employees too: we often have students and people in our workshops who have told us over and again that when they're looking for new jobs, before they move, they look at the annual report of the organisation they're looking to work for, and they look at things like the work health and safety reporting to see what the company is like as a potential employer.

Customers and suppliers, particularly in terms of contracting: it's important for them to see work health and safety information. Companies also need to consider their own needs and priorities, and consider the needs and priorities of those users of the reports. In terms of the investors, in the last three years in particular we've seen a significant change in the way investors are viewing health and safety information in annual reports. They're very keen to see that information. They understand that companies that manage health and safety well are more likely to manage risk well, business risk, and I think that's an important point to recognise.

In terms of demonstrating leadership through an annual report, it's not just about leading safety, it's about leading safely. So it's about showing that you can integrate work health and safety into the business operations, into decision making so that the aim of the activity, the leadership aim, is to really have safe, healthy and productive work. Corporate social responsibility - being open and honest about the impact and this is where the severity reporting comes in. Companies that are reluctant to talk about the severity of their injuries are really hiding the impact of their health and safety systems, and particularly the impact on the human side, the employees, the families and also the broader social impact through the externalities that they cause. 

So just in summary, work health and safety information in annual reports: if we're to demonstrate leadership and corporate social responsibility, this requires a number of things. It requires an articulation of the mission, what it is the company is really trying to achieve. The companies, the organisations, also need to recognise what their critical risks are and be prepared to say, "These are the critical risks that we face in our organisation." We know what sort of risks different industries face. It's not like it's a surprise to the users of the report. The fact that companies often don't want to talk about them is something that will then raise more questions than answers.

Once they talk about the risks, it's really important that they outline how they're managing them. It doesn't have to go into a lot of detail, but companies can say, "This is our approach to managing risk," giving the reader some assurance that they've thought about it, that they have a plan and a strategy; then acknowledging the consequences, the injuries and illnesses, not only the frequency but also the severity and, to some extent, the cost, the failure costs where relevant; and finally, where there has been a serious injury or illness, providing some information about what happened, what the cause was, what they found out about why that injury or illness happened, and what's been done to prevent it happening again. Again, it doesn't have to be long, but it needs to be there.

Thank you very much.

Audience applause.

[End of Transcript]


Last modified on Thursday 27 April 2017