With cybersecurity breaches and the mass migration to cloud platforms, IT departments might be more important than ever. But depending on company size and industry, could the IT department also be outsourced? Recent Controllers Council studies find that corporate finance often oversees IT and corporate technology. Check out the following perspective.
As reported in The Wall Street Journal on September 18, 2022, by Joe Peppard.
Last November, I wrote an article with the headline, “It’s Time to Get Rid of the IT Department.” It generated considerable praise—and considerable criticism.
I won’t reiterate my entire thesis here, but my basic argument was this: In today’s digital-first world, organizations need to approach information technology differently. They need IT but not an IT department.
With organizations so dependent on tech, it might seem foolhardy not to have an IT department. But my research suggests that IT departments prevent organizations from achieving their digital-transformation ambitions. Instead of figuring out how to get the most value out of IT, companies manage IT. A better way—and one that a few pioneering companies have taken—is to embed IT knowledge and decision rights into every business unit. That makes it much more likely that a company’s technology capability will evolve along with the business objectives of these units, and the organization.
To some readers it was a brilliant call to arms; to others it was, as one reader wrote, “laughable.”
Many of the positive comments came from readers’ own experiences. “Our IT department has no clue what our company does,” one wrote. Said another: “IT departments are often the biggest impediment to getting problems solved. They are intransigent, shortsighted, and incredibly territorial.” One reader said of his previous company that the “common nickname for IT was the sales prevention department.”
The detractors were equally blunt. So rather than basking in the glow of those positive comments that supported my thesis, let’s look at some of those from the critics—and my response to their arguments.
IT does matter
A number of commentators dismissed the piece completely, saying I was arguing that companies should do away with IT. “I’ve been hearing about the death of IT since the early 2000s,” one reader wrote.
So let me be clear: What I am questioning is whether organizations need a separate organizational unit dedicated to the management of their technology, not whether IT matters. My basic premise is that IT very much matters! But merely purchasing and managing technology—which is the focus of many IT departments—doesn’t automatically confer any expected value. That approach treats IT as a supplier to the various departments and designates it as a cost center, which means the metrics by which the IT department is measured are often irrelevant to the success of the business units it is supposed to support.
What would you do without us?
Most detractors provided a laundry list of reasons as to why the IT department isn’t—and can’t—go away. The general tenor of these comments was that without an IT department, how would an organization handle a major security incident such as a virus outbreak, a distributed denial-of-service attack or even a ransomware attack? Others spoke of additional areas such as patching software, business continuity, data backup, disaster recovery, network design, voice systems, telecoms, business intelligence, database management, and assistance if you forget your password or need a new PC or desk-side support.
One reader posed the dilemma, “Just wait to see how the CEO reacts when that ‘specialist geek of the finance department’ can’t figure out how to bring up a database system to do payroll after a crash (i.e. the network, operating system, application, etc.). Then you will know what the important role of a dedicated IT department is.”
I completely agree that tech knowledge and know-how are crucial, and I never argued otherwise. I only questioned why an organization needs to house all this know-how in an organizational unit dedicated to this purpose. What matters is how an organization chooses to configure this knowledge.
Of course, with cloud computing, much of the heavy tech lifting that has traditionally been done by an in-house IT department can now be carried out by the likes of Amazon Web Services, Oracle and Microsoft. This means that much of this specialized tech know-how no longer needs to be kept in-house. Patching software or backing up data, for example, will no longer be done by an in-house tech team.
Without owning or operating any technology, one bank has built its digital infrastructure with an everything-as-a-service strategy. This required orchestrating an ecosystem of external service providers to deliver a secure, scalable, flexible, and resilient set of IT services at a fraction of the normal cost of ownership for in-house IT.
The centralization-decentralization pendulum swing
Many commentators rightly pointed to the fact that over the decades the pendulum has swung between centralization and decentralization of IT. As one reader wrote: “Like the old joke we’d tell about one of our competitors. Clients would hire them to centralize everything, then call them back five years later to decentralize it as the former didn’t work. Then rinse and repeat.”
The comments were alluding to the fact that if information-technology spending is seen as getting out of control or the tech landscape is becoming too complex, centralization is proposed as the solution. Alternatively, when companies’ primary goals are local responsiveness, closeness to the customer and speed of delivery, decentralization is often the solution. Readers dismissed my argument as just being the inevitable swing back away from centralization.
There’s no question that centralization and decentralization each have advantages and disadvantages. One commentator raised the conundrum: “My conclusion is that centralized IT doesn’t deliver business value at the pace the business requires, but it’s much more effective at creating a secure environment.”
Drawing on experience, another noted: “I’ve seen decentralized IT end up with terribly suboptimal solutions that were completely disconnected from the organizational needs, very insecure, very out of date, and generally had disappointing performance.”
The way forward, many argued, is a middle ground—a hybrid structure that offers the best of both. I agree. But while this is great in theory, nobody described what this would look like in practice. I believe that’s because it is incredibly difficult to achieve. With my research, I am seeking to develop archetypes that are appropriate for different organizations.
Guardrails will never work
Several comments picked up on my suggestion that the organization set up guardrails—everybody must use the same security protocols and software-programming language, for instance, and managers are then free within those frameworks when making decisions and taking action.
Many critics argued that it wouldn’t work. “The article mentions guardrails, such as everyone abiding by architectural standards, but from what I’ve seen in my 30 years’ experience, is that when you move to decentralization or a more agile mind-set, management quickly seizes the opportunity to ‘go faster’ and disregards the guardrails quickly. The right hand doesn’t know what the left hand is doing, and you end up with IT…in the middle trying to sort it out, everything goes wrong, the CIO decides to outsource to India, things get worse, they try to bring in-house, and repeat.”
The argument the critics made was that any guardrails that are established will be ignored. But I think that the willingness of employees to abide by guardrails is being underestimated. I do think that it will require some education, mainly because many employees just don’t understand why guardrails are so important. Can employees really be blamed if they don’t know what they don’t know?
Still, employees are already becoming more sensitive to data protection and cybersecurity. And they can quickly be brought up to speed on the importance of other areas such as data standards and architecture.
There is precedent that leads me to be optimistic. If we look at a resource like money, all organizations have guardrails regarding how it can be spent. In general, these are followed by employees. There will always be rogue employees, but the majority will be responsible.
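As a purely illustrative aside, and not something described in the article, one way guardrails like these can be made concrete is to codify them as automated checks that business units run themselves. The sketch below assumes a hypothetical approved-language list and a hypothetical minimum security-protocol version; both are invented placeholders.

```python
# Purely illustrative sketch: a hypothetical guardrail codified as an automated
# check. The approved languages and minimum TLS version below are invented
# placeholders, not standards mentioned in the article.

APPROVED_LANGUAGES = {"python", "typescript"}  # hypothetical company standard
MINIMUM_TLS_VERSION = 1.2                      # hypothetical security-protocol floor

def check_guardrails(language, tls_version):
    """Return a list of guardrail violations for a proposed project."""
    violations = []
    if language.lower() not in APPROVED_LANGUAGES:
        violations.append(f"language '{language}' is not on the approved list")
    if tls_version < MINIMUM_TLS_VERSION:
        violations.append(f"TLS {tls_version} is below the required minimum {MINIMUM_TLS_VERSION}")
    return violations

if __name__ == "__main__":
    # A business unit checks its own proposal; no central IT sign-off is needed.
    for problem in check_guardrails(language="Ruby", tls_version=1.0):
        print("Guardrail violation:", problem)
```

The point of such a sketch is simply that guardrails can be enforced in the background, leaving managers free to make decisions within them, rather than depending on a central department to review every choice.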
IT is not an office printer
One of the biggest criticisms was that I wasn’t paying enough respect to the people in IT and what they do. As one reader put it: “You seek to treat IT like the office printer, and you show zero consideration at all to the engineers you say should be separated from their peers.”
Similarly, those in IT roles talked about the loss of collaborating with their fellow tech experts. “Having other developers with a different point of view or skills in an area where I am weak is extremely helpful,” one wrote.
I do get that all engineers like to be together, just as all accountants or marketing staff do. There are advantages to doing this in terms of maintaining in-depth discipline knowledge, as well as for career development. Still, while it works for many areas of a business, it doesn’t work for IT. As digital becomes an integral part of the fabric of the organization, that tech knowledge needs to be dispersed. This creates challenges for coordination, but those challenges shouldn’t be the tail wagging the dog.
Then there is the argument that I am treating IT like the office printer. Frankly, I think it’s the opposite.
Consider this comment from a reader: “You wouldn’t want airplanes and bridges built and operated by inexperienced people, why would you want your company’s infrastructure?” To me, this gets at the heart of the problem. I would argue that too many people interpret IT in precisely this way—as concerned with computers and other devices and technical equipment, such as servers, laptops, and routers. For these people, information technology is a physical artifact that is programmed to perform a particular function or set of functions. That is, IT is a noun, “a thing,” and the IT department is the organizational unit responsible for the management of all these IT things.
But this is a very narrow view of IT. If anything, this is the view that sees IT as a printer.
By contrast, I see IT as a verb. I envision IT not as a thing to be managed, but as an active participant in whatever business unit it is embedded in, creating value and aligning with the needs of that business.
I don’t pretend that any of this is easy. I know that it will require education, a new mind-set and possibly resources. Some sacred cows will have to be sacrificed. And some of the arguments made by critics are challenges that companies will have to address. But in the end, I do feel it’s inevitable: To become a truly digital organization, companies will have to rethink the role of technology. And one of the requirements will be to rethink the IT department.
What are your thoughts on getting rid of the IT department?
Additional IT and Cybersecurity Resources
A Controller’s Guide to Cybersecurity
From Knowing Why to Knowing How: A Controller’s Look at Cybersecurity
Your ERP Is Essential to Your Digital Transformation Strategy