Our progress as an ever-advancing civilization is being held back by the way we approach education in information technology. We have created a false dichotomy: on one side are those who come out of the education system understanding technology but not the way the real world works; on the other are those who learn some aspect of the business world but have no idea how technology applies to their domain. It seems the more capable the software developer, the less grounded they are in the real world, and the same is probably true of those who are strong in some vertical business function.
Over time, this one-sidedness is mitigated by experience and exposure, but that is not the same as having a fundamental understanding of what goes on over the fence. It is like having two separate brain hemispheres – one focused on how things can be built, the other on what needs to happen. The left brain (the software developers) knows how to mix ingredients and build something, but it takes the right brain to see how things need to be used.
The trouble is, without some means of conveying their expertise, a lot is lost in translation. Non-techs are unaware of what is possible, or have no idea whether something is technically risky or feasible. Technologists know lots of cool tech but have no idea how some gem can be applied to the real world.
Technology is so pervasive, so fundamental to the way we now live, that we need to rethink our education strategy or miss out on generations of possibilities. If you think we are doing just fine, ask why it has taken us 30+ years to apply social media principles to our computing, with publisher-subscriber models only now beginning to permeate our IT systems in natural, human-facing ways. These new modes of operation are natural; what we have been doing previously is not. Hence our historic fear of IT, our frustration with information overload, our expensive overruns and our ridiculously high rates of project failure.
I was talking to some software department heads at one of Australia’s leading universities recently, and I asked them when they thought we should begin teaching HTML and CSS to our students. Their response: grade two – that’s seven-year-olds. With this kind of fundamental understanding of the building blocks of web pages, these students will be much better prepared to build an understanding of what is possible.
On the same topic, why are we not teaching secondary students the fundamentals of object-oriented programming? I was rather shocked to learn that in the State of Victoria, Australia, there are only 14 secondary teachers who are qualified computer scientists.
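To give a sense of what “fundamentals of object-oriented programming” means in practice, here is a minimal sketch in Python – the class names are invented purely for illustration – showing the core ideas a secondary student would meet: classes, encapsulated state, inheritance and polymorphism.

```python
# A minimal illustration of object-oriented fundamentals:
# classes, encapsulation, inheritance and polymorphism.

class Publication:
    """A base class: shared state and behaviour live in one place."""
    def __init__(self, title):
        self.title = title  # encapsulated state

    def describe(self):
        return f"Publication: {self.title}"

class WebPage(Publication):
    """A subclass inherits from Publication and overrides behaviour."""
    def __init__(self, title, url):
        super().__init__(title)
        self.url = url

    def describe(self):  # polymorphism: same call, different result
        return f"Web page '{self.title}' at {self.url}"

# Callers treat both objects the same way; each answers in its own way.
for item in [Publication("Report"), WebPage("Blog", "https://example.com")]:
    print(item.describe())
```

A handful of concepts like these, taught early, give students the vocabulary to reason about how software is structured long before they specialise.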
Society will benefit greatly when the two hemispheres are able to communicate more effectively. Current workarounds like product managers and business analysts are necessary glue, but how much more effective will we be if there is a more fundamental understanding of what is going on in the other half of the brain? Imagine constructing buildings where the builder and architect have only a vague understanding of what the building’s purpose might be, or a prospective customer has no sense of the cost of adding a room after the walls have gone up.
I believe we need to start teaching the fundamentals of IT as part of our primary and secondary education, and carry that through to all the university vertical domains so that computer technology is an intrinsic part of the education of every discipline. Likewise, we need to introduce Applied Computer Science subjects into the CompSci and InfoSys courses on offer, so that graduates learn things like the application of Big Data, publisher-subscriber models, marketing automation, the cost of downtime and basic risk, and are able to apply them to real-world problems.
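To make the publisher-subscriber idea concrete, here is a minimal sketch in Python – the `MessageBus` class and its method names are invented for the example, not any particular library’s API. The essential property is decoupling: publishers emit messages on named topics without knowing who, if anyone, is listening.

```python
# A minimal publisher-subscriber (pub-sub) sketch: publishers and
# subscribers never reference each other, only a shared topic name.

from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        """Register interest in a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()
received = []
bus.subscribe("orders", received.append)         # a subscriber is just a callable
bus.publish("orders", {"id": 1, "total": 9.95})  # publisher knows only the topic
print(received)  # → [{'id': 1, 'total': 9.95}]
```

This is the same shape – at toy scale – as the human-facing feeds and notification systems mentioned above: following a topic is subscribing, posting to it is publishing.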
We need to cultivate a society where both sides can make meaningful contributions to the other’s discipline by seeing through the other’s perspective. Only then will we begin to recognise our full potential.