(Image: ATB1776, CC BY-SA 4.0, via Wikimedia Commons)
AI is spread around like so much fertiliser across every media channel, and yet nobody even asks what it is.
I'm not going to tell you either, and if you're wondering why, well, here goes.
Whatever the term meant when IBM began work on Watson, or during other laudable projects at other firms, is irrelevant to this discussion. When you see the word AI, it is simply meant to conjure up something incredibly smart, beyond your comprehension, performed by these majestic things called computers. And that's no accident. Just about every sort of system on the market today either claims it is AI or claims it contains AI. That's what's really going on.
The biggest and best computer on the planet understands only two things: yes or no. That's where the term binary, as applied to computing, comes from.
When you ask Siri to find you the best train, or whatever she does for you, the system makes a series of yes-or-no decisions leading to the answer you hopefully want. All the questions and answers along the way were put there by someone like me, a software engineer writing the logic, working against a big database of data. The two elements are equally vital to the answer.
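The chain of yes/no decisions described above can be sketched in a few lines. Everything here is invented for illustration (the train records, the questions, the `best_train` function); the point is only that the "smart" answer is just binary decisions an engineer wrote, consulting data someone else entered.

```python
# Hypothetical database of trains (the "big database of data").
TRAINS = [
    {"departs": "08:10", "direct": True,  "seats_free": True},
    {"departs": "08:40", "direct": False, "seats_free": True},
    {"departs": "09:05", "direct": True,  "seats_free": False},
]

def best_train(want_direct: bool, need_seat: bool):
    """Walk the data, answering yes/no questions the engineer wrote."""
    for train in TRAINS:
        if want_direct and not train["direct"]:
            continue  # No: not direct, keep looking.
        if need_seat and not train["seats_free"]:
            continue  # No: no free seat, keep looking.
        return train  # Yes to every question: this is "the best train".
    return None  # The logic (or the data) had no answer for you.

print(best_train(want_direct=True, need_seat=True))
# → {'departs': '08:10', 'direct': True, 'seats_free': True}
```

Notice that a bad answer can come from either side: a wrong question in the logic, or a wrong value in the data.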
If the humans wrote a bad program, the answer will most likely be a bad one, and let me tell you, they all begin bad; some of them eventually mature through extensive testing followed by years of daily use.
If the humans put in bad data, the answer will be bad. And let's not kid ourselves here: even good data has a lot of errors; only very bad data has too many. On top of that, it may simply be the wrong data, or incomplete, and the human who enters it can't see the problem.
There are no better methods than the ones we currently have for writing and testing the logic, or for collecting, testing and adding the data. It's a very imperfect world, and it won't be improving any time soon, because nobody even tries.
But here's the bombshell: since I started in the industry 30 years ago, the skills of the average industry operative have gradually declined. The majority of projects today are run by Scrum Masters or Product Owners with a two-day training course behind them. On top of Agile, Continuous Integration and Continuous Deployment have taken over the software world. These methods were designed for small, low-risk products that could be incremented one very small feature at a time, and they were intended to be used by intelligent professionals who could keep an eye on the big picture while rolling out a new piece of a feature roughly twice a month. Of course, a year later most of the month is spent on bugs in previous releases, and the projects eventually die in a pit of "technical debt".
Nowadays just about everything is subject to this regime, where architecture, strategy, planning and testing are cut short in the name of early delivery, and systems are shoved out into the public domain in a haphazard manner, as often as not sold to someone who is tied in to a monthly subscription. Worse still, their data and their whole world are most likely tied, actually or apparently, into a cloud storage they can't get away from without severe pain.
That conversation could take a turn and lead to very complex stuff, so I won’t pursue it further right now.
What I can’t impress on you enough is this: never in the history of software has so much bad and even dangerous code been spewed out into the world.
If you are reading the stuff blogged on platforms like LinkedIn and the technical news sites across the web, and allowing yourself to believe that we have reached a maturity whereby, in the next ten years, most people's jobs will be taken by machines, please do yourself a favour and just stop reading that garbage. It won't stop them trying, by the way, but there's no danger.
No, your job is not in danger of going to an AI bot.
If you're worried about your job, learn software engineering; it's not hard and it pays well. If you'd like to retain your current job, fret not.
At the risk of being dragged into definitions, let me just say that the stuff that reduces human effort and human risk, and occasionally can replace a proportion of the people employed in a profession, is not AI. It is better described as either "productivity software" or "expert systems".
Productivity software is what has occupied a large proportion of my time over three decades. CRM fits that bill because it captures useful information about your customer relationships and makes it available just when you need it. It saves sales, marketing and customer service teams heaps of time and allows them to focus on more productive efforts. That’s the key. It doesn’t replace people, it makes them more effective. Nobody doesn’t want more sales, right?
Workforce planning was another: it reduced effort by 25% at a national public service, but nobody was laid off. I could go on.
Expert systems are less prevalent, but Covid has put some into the limelight. I believe I could build an expert system, with the aid of a panel of doctors, that would do their job more efficiently with fewer errors and only call on them when there is doubt. Some such systems already exist. That way doctors could be in a clinic or hospital delivering solutions.
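At its core, an expert system of the kind described above is a set of if-then rules elicited from specialists, plus a fallback that hands the case to a human when no rule fires. The sketch below is purely illustrative; the rules, symptoms and conclusions are invented, not medical advice:

```python
# Invented (condition, conclusion) rules, standing in for what a real
# panel of doctors would supply after careful knowledge elicitation.
RULES = [
    (lambda s: s["fever"] and s["cough"] and s["loss_of_smell"],
     "advise a Covid test"),
    (lambda s: s["fever"] and not s["cough"],
     "advise rest and fluids"),
]

def triage(symptoms: dict) -> str:
    """Fire the first matching rule; refer to a human if none match."""
    for condition, conclusion in RULES:
        if condition(symptoms):
            return conclusion
    # The system only calls on the doctor when it is in doubt.
    return "refer to a doctor"

print(triage({"fever": True, "cough": True, "loss_of_smell": True}))
# → advise a Covid test
```

The design point is the fallback: the software handles the routine cases and escalates the doubtful ones, which is exactly why it reduces effort without replacing the profession.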
Lawyers, same story; we already see the embryo appearing on the internet. Many other professions could benefit, and none of it would result in jobs lost.
What we can't do with computers, and won't have the time even to try any time soon, is drive a nail, paint a door, replace the washer in a tap, service a car (even an electric one), fry hamburgers, serve meals in a restaurant, and so on.
Not only is all that way beyond our capabilities; the captains of industry, who, let's face it, are not that bright, god bless them, are still smart enough to know that their customers, even if not direct, are also their employees. Let me put it another way: bots don't buy hamburgers or drink beer, and industry won't be replacing living, breathing, shopping workers any time soon, so relax.
If you see a machine rambling down the road spewing poetry, you’ve probably had one too many, go to bed and all will be well tomorrow.