Once upon a time, there were application programmers. Every self-respecting IT organization had a sensible organization chart containing systems programming, applications programming, maintenance programming and other miscellaneous divisions, including education, facilities support and documentation.

But today you wake up, just like Rip Van Winkle after a twenty-year snooze, to find that things have changed dramatically: there is no more application development organization!

Now does this mean that there are no more applications? Not at all! There are more applications than ever before. It's just that those applications are not being developed and maintained by the IT organization. Instead we find that:

  • Applications are being developed and maintained inside the end-user organization. The end user has his/her own PC, hardware, software and data and has long been estranged from the IT department.
  • Applications that were developed in the past are left in a heap called the "legacy systems environment." It is a fate worse than death to be banished to work here. If there were ever a dead-end career, it is in the maintenance of legacy systems.
  • Applications are not custom developed here, but in India. It is much more economical to ship problems overseas than to solve them here in the U.S.
  • Applications are developed by outside consulting firms.
  • Applications are being taken over by ERP software, which has permeated the applications landscape. The real development and maintenance is taking place in Walldorf, Germany, and Pleasanton, California. If you want to see a real developer, you certainly can't find one in the halls of the IT organization.

Applications today are being developed and maintained a long way from Main Street, USA. The IT organization has lost its will and skills to conduct development. Today's IT staff has become an organization of caretakers and evaluators of technology. Development skills have long departed.

Why have development skills left the organization? There are a host of complex and interrelated reasons for the departure of development. Some of the reasons are:

  • Economic: It is much less expensive over the long haul to farm out development tasks than it is to staff and feed an in-house development and maintenance organization.
  • Technical skills: Real development talent is difficult to find. There was never enough talent to go around, and, as time passes, there is ever less of it to fill the needs of the organization.
  • Simplification: Technology has been simplified to the point that end users can do their own work, without the intervention of a middleman.
  • Architectures: Architectures have been created in which all that is needed is for data to meet the end user. The need for the programmer has been obviated.
  • Economy of Scale: There are tremendous economies of scale in allowing an outside developer to build systems for many customers, rather than have everyone develop their own unique applications.

These are some of the reasons why in-house development in the IT department is a myth in today's world. But there is another, more obscure and seemingly rational reason for the demise of the in-house development staff: the practice of adding more staff and development workers when a project starts to have problems.

To understand this phenomenon, consider the following. On the surface it seems perfectly normal and natural to add staff and resources to a project that is in trouble. A project is due to finish in October. The manager tells the head of IT that the deadline will not be met; therefore, management adds staff to the project in order to make the deadline. Isn't that the way other parts of the organization are run?

Also consider that managers are generally paid and otherwise rewarded based on the number of people they manage. If you manage a staff of five, you are a project leader; if you manage a staff of five hundred, you are a director, and directors make a lot more money than project leaders.

Now consider two projects, A and B. Project A is run by manager A and project B is run by manager B. Both projects begin with five people on board. Project B is well run and finishes all deliverables on time. The end user is happy.

In the meantime, project A is running behind the time line. Five people are added to project A. Time passes and project A encounters further difficulties. Five more people are added to project A. More time passes, and project A continues to flounder. By this time, project B is already completed. Manager B and all the staff from project B are added to project A – working for the manager of project A.

A year later, project A finally staggers across the finish line. Manager A now has fifty people working for him. He receives a raise commensurate with the management of a staff of fifty people. Project manager B gets a merit raise of five percent.

Something is fundamentally wrong with this picture. The organization is rewarding incompetence and penalizing competence. Carried to an extreme, the people who make the biggest messes receive the biggest rewards, while the people who are the most competent endure the biggest penalties.

This scenario may seem far-fetched; however, it is anything but. This actually happened. Ever bigger development efforts were staffed with ever more incompetence. Finally, things ground to a dead stop.

Unfortunately, the need for new systems did not stop. That is when IT management started to see the end user go directly to the OLAP vendor or the ERP vendor. Soon, IT was on the outside looking in. The budget was stripped away from IT, and the era of competent external vendors was born.

There is another related issue, and that is the law of large projects:

If you spend enough money on a project, the project must be deemed a success, because to say anything else would be political suicide, regardless of the reality of the development effort.

IT existed with its head in a bag. This worked for a while, but at some point the end user became fed up with the charade. Enter the ERP vendor or the consulting firm, and exit IT.

When reality rears its ugly head (and despite the best efforts of IT management, reality eventually wins), the IT department is shown to be incapable of major new development.
