In the first of a two-part series about the future IT organisation, DXC Technology’s Sukhi Gill says IT must liberate and democratise data to support business at pace.

Chief information officers (CIOs) besieged by the pace of digital change will be pleased to hear that the IT organisation has a future — but only if it puts data at the centre. IT departments are custodians of legacy systems, trying to reduce expenses while maintaining uptime and levels of service — and being compliant at all times. This slow-but-steady approach is at odds with business IT that’s rolled out fast and often purchased on a credit card.

When fissures first appeared under the pressure of doing end-to-end digital business, Gartner coined the “bimodal”, or two-speed, approach to reconcile the dichotomy. But an agile approach, deployed by businesses to deliver new capabilities and attract new customers, is fundamentally at odds with the custodian stance, argues Sukhi Gill, DXC Technology’s vice president and chief technology officer for the United Kingdom, Ireland, Israel, the Middle East and Africa. “There’s a clash in the middle. The two must run together coherently in order to provide seamless, frictionless customer experience.”

Integrating islands of IT that operate at different speeds is laborious and frequently introduces errors. Plus, businesses that don’t fundamentally overhaul their heritage IT miss an important trick. “Above all, enterprises need to liberate and democratise data that’s buried in legacy systems,” says Gill. “It needs to be capitalised on by the new, API-fronted microservices that are being developed by the business.”
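The pattern Gill describes — a thin, API-fronted service surfacing data locked in legacy formats — can be sketched minimally. The example below is hypothetical: the fixed-width record layout, field names and helper functions are invented for illustration and are not from the article.

```python
# Hypothetical sketch: expose a legacy fixed-width record through a
# clean, API-ready structure. The field layout (10-char id, 30-char
# name, 8-digit balance in pence) is invented for illustration.
from dataclasses import dataclass, asdict

# A mainframe-style fixed-width record, built explicitly for clarity.
LEGACY_RECORD = "0000123456" + "John Smith".ljust(30) + "00004250"

@dataclass
class Customer:
    customer_id: str
    name: str
    balance_pence: int

def from_legacy(record: str) -> Customer:
    """Parse one fixed-width legacy record into a typed object."""
    return Customer(
        customer_id=record[0:10].lstrip("0"),
        name=record[10:40].strip(),
        balance_pence=int(record[40:48]),
    )

def to_api_payload(record: str) -> dict:
    """The JSON-shaped body a microservice endpoint would return."""
    return asdict(from_legacy(record))
```

The microservice itself would simply serialise `to_api_payload(...)` behind an HTTP route; the parsing facade is the part that liberates the buried data.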

Making legacy IT fit-for-purpose and supportive of business IT calls for an air traffic control-type system that looks at all systems in the round as business cases develop. “We find IT really has to modernise how it delivers existing services to keep in step with the pace of business,” explains Gill. Crucially, the modernisation of IT moves it into DevOps territory.

DevOps requires IT custodians to update how they measure success and to deploy forward-looking key performance indicators, such as how quickly they can release code. Mean time to release has an impact on business as well as IT and is a good lever to bring the two IT domains together. Working toward continuous delivery pipelines, automated testing and release into production (without affecting live operations), and rollbacks that don’t damage the business — all of these elements ensure close cooperation and integration.
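The mean-time-to-release KPI mentioned above needs nothing more than commit and release timestamps. A minimal sketch — the timestamps and data shape are illustrative assumptions, not from the article:

```python
# Minimal sketch of the "mean time to release" KPI: the average gap
# between a change being committed and reaching production.
from datetime import datetime
from statistics import mean

def mean_time_to_release(releases: list[tuple[str, str]]) -> float:
    """Average hours from commit to production release.

    Each tuple is (commit_time, release_time) in ISO 8601 format.
    """
    gaps = [
        (datetime.fromisoformat(done) - datetime.fromisoformat(start)).total_seconds() / 3600
        for start, done in releases
    ]
    return mean(gaps)

# Illustrative history: three changes released in one sprint.
history = [
    ("2024-05-01T09:00", "2024-05-01T17:00"),  # 8 hours
    ("2024-05-02T10:00", "2024-05-03T10:00"),  # 24 hours
    ("2024-05-03T09:00", "2024-05-03T13:00"),  # 4 hours
]
```

Tracking this number per team, sprint over sprint, is one concrete way the two IT domains can share a single, business-visible measure.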

The task may look daunting, but Gill thinks incumbents have a strong incentive: “Most enterprises have a lot of [buried] data they can monetise, and that gives them an advantage.” But it’s hard work, he acknowledges, and IT chiefs need to understand data’s value in terms of its timeliness — is it valuable for 5 seconds or 5 minutes after it is generated? This will dictate when, where and for how long it is stored.

A company that manufactures racing cars knows its data is most valuable milliseconds after it is generated, not minutes or hours later. At a later point in the data life cycle, it may be useful to marketers — but not to their car-driving customers. Tech players have such data evaluation down to an art form, whereas incumbents are used to reporting after the fact, rather than predicting in real time what is going to happen.
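Gill’s timeliness question — valuable for 5 seconds or 5 minutes? — can be encoded directly as a routing rule for where data lands. The sketch below is hypothetical; the tier names and thresholds are invented for illustration.

```python
# Hypothetical sketch: route a record to a storage tier based on how
# long it stays valuable. Tiers and cut-offs are invented examples.

def storage_tier(useful_for_seconds: float) -> str:
    """Pick a storage tier from a record's useful lifetime."""
    if useful_for_seconds <= 5:
        return "in-memory stream"   # act on it now or never (e.g. telemetry)
    if useful_for_seconds <= 300:
        return "hot cache"          # minutes-scale dashboards and alerts
    return "warehouse"              # after-the-fact analysis, e.g. marketing
```

A race-car telemetry feed would land in the stream tier, while the same data, hours later, would sit in the warehouse for the marketing use Gill describes.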

Putting data at the centre of the organisation demands a restructuring to make the link with business seamless, and this in turn calls for a reconfiguration of IT roles and skills. Sukhi Gill will discuss how best to do this in Part 2 of this series.