I work in tech, software development. I’d say that of the developers I have worked with, some have clear IT backgrounds, e.g. Computer Science degrees, others are entirely self-taught. As far as I can tell, as long as they have enough experience, it doesn’t seem to make a lot of difference. Controversial?
Being self-taught myself, I’m not actually sure what is taught in schools and universities but, from my limited experience of working in a college and from knowing what my kids learn, it feels like it’s always well behind the curve, at least 2 or 3 years behind what’s being used in the industry. This is true both in terms of tooling and approach.
My kids have recently been learning to do various tasks in Excel. They’ll spend a week on something that I feel could be picked up in a couple of hours. For me, you’d get a lot more out of Excel by investing that time in learning the mathematical functions, and in learning to use the import/export functionality. But that side of things is never taught.
Lots of courses include modules on the history of computing – the Difference Engine and the like. Whilst interesting, does it really help? In fact, I’m not sure that any time spent on how things used to be done is really very useful. Keep looking forwards, not backwards.
I’m not sure what employers look for now, or will look for in the future, but I feel there could be more value in other subjects, like maths, than in IT. IT is too broad, too “Jack of all trades, master of none”, and without any depth it lacks real value. It’s easy to teach yourself whatever you need to learn in IT – there are so many good quality resources. I’d be inclined to study other things and then demonstrate your IT skills through your own projects – websites, apps, APIs, etc. This will not only let you develop your other skills but also give you experience of being self-sufficient in tech. It will also prove you have the motivation and discipline to teach yourself, which I think is essential.
Just my view. Please feel free to argue the case for the quals…