I was an early believer in data warehouse (DWH) and business intelligence (BI) technology, having taken a class with Ralph Kimball, the creator of dimensional modelling, in Brussels back in 1996. I immediately saw the potential and, in 1997, convinced the management of the company where I was working to invest in building the first-ever data warehouse solution for my client, a large car manufacturer.

My first experience with data analytics

However, I remember almost being thrown out of the EMEA Director’s boardroom when presenting the first results. The company had lost millions of dollars paying out warranty claims for cars that did not exist, because the VINs (vehicle identification numbers) had not been registered correctly. Directors in the boardroom argued this was impossible, yet they were unaware of the facts the data had revealed. This data analytics exercise had exposed discrepancies and ultimately showed that some fraudulent car dealers had found a way to circumvent the usual checks in order to claim back extra money.

Fortunately, the Board reconsidered the case and requested that the system be developed further and put to use. As a result, many subsequent reports proved that the business process could be improved. More importantly, for the first time I witnessed business decisions being made on the basis of analytical reports. This opened my eyes to the power of data analytics.

Early realisation of the potential of analytics

So the data had revealed facts and correlations that seemed impossible, and as a result the company directors went on to change many operational processes. In this case, data analytics not only saved the car manufacturer a lot of money, but also enabled the company to discover technical issues much faster than ever before (days instead of months). The engineering team also wanted access to the data, so a second-generation data warehouse was created, giving engineers first-hand access to technical claims data and to information about parts returned from dealers. They could then improve and accelerate the returned-parts handling process, which resulted in fewer repairs of faulty parts.

Data analytics has improved greatly since the late 1990s, mostly on the reporting side – the data extraction, transformation, and loading cycle known as ETL improved as well, but I will focus on reporting for now. Many new reports could be created, and the car manufacturer launched a new business department that used the data warehouse and reporting environment to stay informed and make key business decisions.

When I moved into the financial and banking industry shortly after 2000, I saw a similar uptake: client profiling based on data warehouse systems that converted operational data into analytical reports and up-selling opportunities. The data was combined with CRM systems, and customer segmentation was born.

So, why is data analytics failing to reach its full analytical power?

Twenty years later, however, I wonder why we are at a standstill on this matter. Some people may not agree with this sentiment, as the technology has certainly matured since those early days. But many companies today are still using only the basic features of data analytics, despite the fact that we now have unlimited data flows from different sources, customer feedback via social media, big data enabling unparalleled new levels of real-time decision making, and Hadoop technology that allows massive amounts of data to be processed. Yet companies seem stuck at the level of operational improvements. How come? Is it a lack of technology awareness, adoption, or investment? Is it because processes have not matured enough to allow data to drive decisions? Or is it more of a cultural issue – are people insufficiently driven to use data and make suggestions to their management?

I believe it is the latter: mainly a cultural issue. Too few companies have changed their day-to-day processes to let data drive change. Very few people are actually busy analysing data; far too many are still using data in a purely operational way (for example, to report status and trends, such as growth or decline in product sales).

Governments are missing out on huge opportunities

In my opinion, governments in particular are missing out on huge analytical opportunities. Since 2010, when I started to work on government accounts, I have seen big investments in data warehousing techniques and many reporting environments come to life. This is good progress, but these systems are merely being used for operational reporting. They collect data from schools, companies, and citizen interactions, and are used to produce reports such as the number of school grants, the number of unemployed, education levels, crime rates, and so on. A massive amount of statistical data is out there, published by the government as raw data or – at best – as proof that the government is doing its job of, for example, creating employment and ensuring welfare.

Since 2008, governments have come to realise that they are sitting on a pot of gold, and have started to make this data available as “open government data”, which other entities can use to create applications. Although this data is valued and used, the sad fact is that far too few analytical-style applications are coming out of the Open Data movement. Far too many simply report a status, provide practical information (such as opening hours), or present statistical measurements (such as traffic, weather, and economic conditions).

Open Data offers great opportunities, so in my next blog I will be exploring how companies and governments can change the game using Open Data.