Why AI is the most significant thing to happen in technology in over 40 years

Dr. Joe Logan
3 min read · Mar 12, 2018

Terms like artificial intelligence, deep learning and neural networks are frequently spoken about as buzzwords, and lumped together with other emerging technologies such as virtual reality and the internet of things. However, I feel that AI is much more than that.

43 years ago, Bill Gates and Paul Allen released the first version of Microsoft BASIC for the Altair. Yes, there were breakthroughs decades prior to this, but in my opinion, Microsoft BASIC was the first true step in aligning human beings with technology. For those unaware, BASIC is a programming language that takes human-readable instructions and translates them into something the machine can execute. Of course, before BASIC there were other languages that could (to some degree) interpret human-readable instructions and compile them to machine code, such as Fortran, LISP and COBOL. However, Microsoft BASIC was the first step in bringing programming out of the laboratory and into the hands of real users, and that is what made it so significant. Here is the classic snippet of BASIC code:

10 PRINT "HELLO WORLD"
20 GOTO 10

So simple, right?

BASIC, together with its predecessors and numerous followers, represents the programming paradigm: in order to enable a machine to perform a certain task, a human must tell the machine what to do through specific instruction. Unfortunately, such instruction is not a simple endeavour. Explaining something to a computer in JavaScript is markedly different from doing the same in Python, Swift or Go. This tends to create siloed knowledge bases and a wholly difficult landscape for a non-technical individual to navigate. Moreover, it keeps a person with a significant amount of domain knowledge (for example a lawyer, doctor or architect) from exploring technological solutions to various problems. I would bet all of my cryptocurrency that the world would be a very different place if instructing a computer were as simple as instructing another person. The little sketch below makes the point concrete.
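To make the contrast vivid, here is a minimal sketch of the traditional paradigm (in Python, purely as a stand-in for any of the languages above): a toy spam filter in which every rule is written out by hand, and the machine knows only what we explicitly tell it.

# Traditional programming: a human encodes every rule explicitly.
def is_spam(subject):
    """Classify an email subject using hand-written rules."""
    suspicious_words = {"winner", "free", "urgent", "prize"}
    # The machine knows only what we have explicitly told it to check.
    return any(word in suspicious_words for word in subject.lower().split())

print(is_spam("URGENT: claim your FREE prize"))   # True ("free" and "prize" match)
print(is_spam("Minutes from Tuesday's meeting"))  # False

Every edge case the rules miss is a bug only a programmer can fix, which is precisely the bottleneck.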

43 years have now passed since Microsoft BASIC was released to home users, and we are entering an era in which creating deep neural networks is just as accessible to regular people in their bedrooms as BASIC once was. What exactly does this mean? Deep learning is data-driven, so the machine learns from the data rather than from specific instruction. This is huge. We are looking at a possible shift away from the need to meticulously give instructions to a machine line by line, towards curating information, handing it to the machine and letting it figure the rest out. Think computer vision, self-driving cars and language translation.
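As a hedged illustration of that shift (my example, not a canonical one; scikit-learn is assumed to be installed, and the tiny dataset is invented), here is the same toy spam problem solved the data-driven way: we supply labelled examples instead of rules, and the model infers for itself which words matter.

# Data-driven paradigm: curate labelled examples and let the model
# infer the rules itself, rather than spelling them out by hand.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

subjects = [
    "URGENT: claim your FREE prize",   # spam
    "You are a winner, act now",       # spam
    "Minutes from Tuesday's meeting",  # not spam
    "Lunch on Thursday?",              # not spam
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

# The pipeline turns text into word counts, then fits a classifier.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(subjects, labels)

print(model.predict(["Free prize inside"]))  # most likely [1]

No rules were written; adding better coverage means adding better data, which is exactly the kind of work a domain expert can do.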

This effectively opens up computer science to people with domain knowledge and ideas, without necessarily requiring technical skill. Theoretically, the dentist with thousands of images of periodontitis can create a system to analyse photos of teeth from patients and figure out how urgently they need an appointment. The architect with tons of data on soil conditions and building materials can mitigate risk in new projects and save millions.
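To show how little code the dentist scenario might take today, here is a hedged sketch using TensorFlow and transfer learning. The "teeth/" directory of labelled photos (with subfolders "routine" and "urgent") is hypothetical, and MobileNetV2 is just one reasonable choice of pretrained backbone, not anything the source prescribes.

# A sketch of the dentist scenario: fine-tune a pretrained vision
# model on labelled tooth photos. The directory layout is hypothetical:
#   teeth/routine/*.jpg, teeth/urgent/*.jpg
import tensorflow as tf

train_ds = tf.keras.utils.image_dataset_from_directory(
    "teeth/", image_size=(224, 224), batch_size=32)

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # reuse features already learned on ImageNet

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1. / 127.5, offset=-1),  # MobileNetV2 expects inputs in [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # urgent vs routine
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)

The pretrained network supplies the general visual knowledge; the dentist supplies only the labelled photos.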

As it stands, creating the actual architecture for a system to learn from data still requires technical skill. However, I believe that this will not be the case for much longer. Google is working on a system called AutoML, which figures out the architecture from the data itself. Take computer vision as an example: AutoML sifted through the millions of images in the ImageNet repository and produced a best-fit architecture for the problem, called NASNet. NASNet performed better than any human-designed neural network architecture that preceded it. This is a huge step.
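AutoML itself is not something you can download, but the underlying idea, searching over architectures rather than hand-designing one, already exists in open-source form. Here is a minimal sketch using the KerasTuner library (my choice, not Google's system) with MNIST standing in for a real dataset: the tuner, not a human, decides how many layers the network gets and how wide each one is.

# Architecture search in miniature: define a search space and let
# the tuner explore it, instead of hand-designing the network.
import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    model = tf.keras.Sequential([tf.keras.layers.Flatten(input_shape=(28, 28))])
    # The tuner chooses the depth and width, not a human.
    for i in range(hp.Int("layers", 1, 3)):
        model.add(tf.keras.layers.Dense(
            hp.Int(f"units_{i}", 32, 256, step=32), activation="relu"))
    model.add(tf.keras.layers.Dense(10, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

(x, y), _ = tf.keras.datasets.mnist.load_data()
tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=5)
tuner.search(x / 255.0, y, validation_split=0.2, epochs=2)
print(tuner.get_best_hyperparameters()[0].values)

This is hobbyist-scale search over a few dozen configurations, whereas NASNet searched a vastly larger space, but the principle is the same: the human curates data and a search space, and the machine designs the network.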

The summary here is that grouping deep learning into the family of emerging tech such as VR and AR really does not do it justice. The potential that deep learning has to democratise human-computer interaction is immense. Just as BASIC brought the ability to instruct a machine into the home, hardware breakthroughs and deep learning accessibility will diminish the reliance on technical skill to create groundbreaking solutions.


Dr. Joe Logan

Developer and AI enthusiast from Sydney. Founder of Alixir. Check me out @ https://jlgn.io