Workplace Safety in 2022: Is Manual Audit an efficient way of meeting your HSE targets?

5 mins read

Manual HSE data processing began in 1840, when a Royal Commission published its report; recording safety data by hand on a single sheet of paper was an advancement like none other for its time. How relevant it remains in 2022 is the story for today.
Call it the way the world works or simply the cost of human existence: occupational hazards and potentially fatal risks are a reality that 23% of the global workforce (those employed by heavy industry) wake up to every single day. This is despite some of the best regulations, safety procedures, and strictest laws in place.
Workplace safety can never be an afterthought, as organizational efficiency directly depends on the well-being of employees. If not for the sake of $1.25 trillion (the global annual cost of safety incidents), then for the sheer joy of avoiding distress, it is imperative that we take this time to introspect.

Decision-making – a complex nexus of data and variables

Human intelligence is most effectively utilized when tasked to exploit advantages and avoid risks by using experience, knowledge, and insight to anticipate dangers and opportunities. In doing so, it processes information from the past including statistical data, recommendations, observations, or tenets from earlier learnings.

Organizations have matured to this context across most functional verticals. In today’s era, theory without data is a capital mistake. Organizations across sectors depend heavily upon data for decision-making, response to change, and establishment of strategic goals.

The most critical piece of this process, intuitively enough, is the efficient treatment and management of relevant data.

What do we mean by Manual Data Processing?

The terms “automatic” and “manual” are relative to each other and subject to change with advancements in technology. A self-driving car is more “automatic” than a car with automatic transmission, which in turn is more “automatic” than one with manual transmission.

Today, we shall take the liberty of referring to all modes of HSE data processing that do not involve Artificial Intelligence (advanced analytics) as “manual”. This includes pen-and-paper (checkbox-based) incident recording as well as recording carried out using spreadsheets, database apps, and ERPs that require manual logging of data.

“Nearly half (48%) of manufacturing companies use spreadsheets or other manual data entry documents.”    

This work is usually performed by HSE officers trained to look for relevant insights from their administrative zones and transcribe them into the target digital medium. The process must then be repeated over a fatiguing cycle of data collection, data recording, and status updates.

Why manual data entry and processing can be inefficient and have dangerous consequences for HSE

From faulty collection to faulty collation and analysis, manual data processing has been the undoing of seemingly impeccable HSE strategies. An analysis of the data consistency of checkbox-based hazard reports, published in the Journal of Petroleum Technology, revealed that:

“Approximately 20% of employees had checked the ‘other’ category when reporting something they didn’t comprehend, adversely affecting the identification process.”    

The average human makes about 5 mistakes every hour. Over an eight-hour day, 3,000 or so people working in a high-risk industry therefore make more than 100,000 errors every single day. The same holds for data processing carried out manually: while the error rate depends a lot on the person handling the data, the system itself comes to rely heavily on individual capabilities.
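The arithmetic behind that figure can be sketched as a quick back-of-the-envelope calculation. The rates (5 errors per hour, 3,000 workers) are the article's own; the eight-hour shift is an assumption for illustration:

```python
# Back-of-the-envelope estimate of daily human errors in a high-risk workforce.
# 5 errors/hour and a 3,000-person workforce are the article's figures;
# the eight-hour shift length is an assumption.
ERRORS_PER_HOUR = 5
HOURS_PER_SHIFT = 8
WORKFORCE = 3_000

errors_per_person_per_day = ERRORS_PER_HOUR * HOURS_PER_SHIFT
total_daily_errors = errors_per_person_per_day * WORKFORCE

print(f"Estimated daily errors: {total_daily_errors:,}")  # → Estimated daily errors: 120,000
```

Even with generous rounding, the total comfortably clears the 100,000-per-day mark quoted above.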

In addition to the fatally expensive catastrophes that may result, there are major hidden costs of error correction. Manual data collection and processing errors hit the bottom line: incorrect data has been shown to cause organizations to overshoot their HSE budgets by over 30%.

Research firm Gartner has found that the average cost of poor data quality amounts to anywhere between $9.7 million and $14.2 million annually. At the macro level, bad data is estimated to cost the US more than $3 trillion per year. 

What starts with 1, ends with 100

The fiscal cost of manual data processing is defined by the 1-10-100 data entry rule.  

“The rule states that verification of data accuracy at the point of entry costs $1, cleaning up of errors costs $10 in batch form, and uncorrected errors cost the organization $100 or more.”  
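As a rough illustration of how the 1-10-100 rule compounds, here is how the cost of one batch of errors scales depending on when they are caught. The unit costs come from the rule itself; the 500-record error count is a hypothetical figure chosen for illustration:

```python
# Illustrative cost comparison under the 1-10-100 rule.
# Unit costs ($1 / $10 / $100 per record) come from the rule;
# the 500-error batch is a hypothetical example.
COST_AT_ENTRY = 1       # verify each record at the point of entry
COST_IN_BATCH = 10      # clean up each error later, in batch
COST_UNCORRECTED = 100  # let each error propagate uncorrected

errors = 500

for label, unit_cost in [("verified at entry", COST_AT_ENTRY),
                         ("cleaned in batch", COST_IN_BATCH),
                         ("left uncorrected", COST_UNCORRECTED)]:
    print(f"{errors} records {label}: ${errors * unit_cost:,}")
```

The same 500 errors cost $500 to prevent, $5,000 to clean up later, and $50,000 or more if left uncorrected.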

Manual data processing can work if you have the right partners and the right tools, but let’s face it: it is not the most productive use of your time, especially when you know how fundamentally and critically important accurate, timely data is to your HSE strategy when the moment arrives.

Cometh the Hour …

Even the best of what you have is of little value if it is not available when you need it most. The same is true for HSE data.

Relying on manual data collection is labor-intensive, inefficient, and can lead to a lack of hard data-based, actionable insights for developing HSE strategies. But what is the solution?

The solution lies in using the data “smartly”, in a system reliable enough to eliminate the “luck” factor from HSE for good. Such a system should take over the repetitive tasks, freeing highly valued human intelligence to identify opportunities and prevent hazards.

Join us next time as we discuss how AI-powered automatic processing is the way forward for industry at a time when existing resources are under significant pressure, and why it is the hardest, but best, time to make a change.
