
I recently wrote a story about how Canonical’s Multipass can provide a WSL-like isolated development environment on macOS and Linux. I am a rather big fan of simple deployment and CI/CD, so I wanted to see if I could integrate Google Cloud Run deployment through Visual Studio Code on Multipass. The good news is that it is actually quite painless.

If you haven’t read my primer on getting set up with Multipass yet, head over here and check it out. Also, this guide assumes you have already created a working Google Cloud Run deployment; if not, have a look at this guide by Google. …
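
For context, a Cloud Run deployment from the Multipass VM’s terminal generally boils down to two gcloud commands. This is only a sketch; the project, service and region names below are placeholders, not values from the original post:

# Build the container image with Cloud Build and push it to the registry
gcloud builds submit --tag gcr.io/my-project/my-service

# Deploy the image to Cloud Run (fully managed)
gcloud run deploy my-service \
    --image gcr.io/my-project/my-service \
    --platform managed \
    --region us-central1 \
    --allow-unauthenticated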



I have been developing software for many years now, and have always defaulted to Apple hardware, largely because of its UNIX-style OS and the terminal applications / CLI. Over the years, I have experimented with getting a productive development setup on a Windows machine, but it has usually been frustrating, and I have almost immediately gone back to the Mac.

Recently, Microsoft have started to focus more heavily on software development, leading to the introduction of class-leading tools such as Visual Studio Code (which is now my default code editor). More recently still, they have been working with Canonical on the Windows Subsystem for Linux (WSL), which brings a true Linux kernel into the Windows operating system. …


Global state in React has always been a bit hit and miss. At the start, the convention was to pass props down the component hierarchy and use callbacks as props to get data back up the chain. As you can imagine, this quickly got messy, and it led to the development of third-party global state management tools such as Redux and MobX. It stayed this way for a number of years, until developers started to tire of the boilerplate needed for such state-management tools and began looking for something native and easier to work with. …


Running Docker containers on Azure doesn’t need to be complicated, and you don’t even need Docker installed locally! All you need is the Azure CLI toolkit, which you can get easily through a package manager. For example, on macOS you can run:

brew install azure-cli
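
On Debian or Ubuntu (including inside WSL or a Multipass VM), Microsoft publishes an install script as a one-liner; this comes from their documentation rather than the original post:

curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash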

Once you have the CLI installed, authenticate with the credentials you use to access Azure. The following command brings up a web browser so that you can log in:

az login

Once you have authenticated the CLI with Azure, head over to the Azure web portal, and look in ‘All Services’ and favourite both ‘Container Registries’ and ‘Container Instances’. Then head on into Container Registries. Click the plus icon to create a new registry, and fill out the necessary details. Make sure you enable the ‘admin user’ or you will run into authentication issues later. …
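
If you prefer to stay in the terminal, the same registry can be created (with the admin user enabled) through the CLI. A minimal sketch, with a placeholder resource group and registry name:

# Create a Basic-tier registry; --admin-enabled avoids the authentication issues mentioned above
az acr create --resource-group my-rg --name myregistry --sku Basic --admin-enabled true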



I am a self-confessed Apple fanboy, and own both an iMac Pro and a 2018 MacBook Pro. I also own a couple of custom-built Linux machines, solely (and reluctantly) for deep learning and artificial intelligence development. This is essentially because the graphics platform of choice on the Mac is AMD, whereas the deep learning community relies heavily on Nvidia’s CUDA library. This is a huge shame, especially for owners of the iMac Pro, where a pretty damn quick Vega 56 or Vega 64 sits unused.

Sure, there are solutions out there that enable OpenCL / AMD to work with TensorFlow, but getting it all set up, let alone working, is simply a pain in the ass. …


We use Google Cloud Platform to deploy ML models to the cloud as containers and serve them using Kubernetes Engine. For developers, the process of pushing code changes to GitHub / BitBucket and then having to run through docker build and gcloud processes to deploy the code is far from ideal. Thankfully, with Build Triggers, after the first deployment you can automate the process of rebuilding the image in the cloud whenever changes are pushed to GitHub or BitBucket.
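
For reference, the manual loop that a Build Trigger replaces looks roughly like this; the image and project names are placeholders, not our actual setup:

# One-off: let the local Docker client push to Google Container Registry
gcloud auth configure-docker

# Rebuild and push by hand after every change — the steps a trigger automates
docker build -t gcr.io/my-project/my-model:latest .
docker push gcr.io/my-project/my-model:latest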


The process is reasonably straightforward. Head into the Google Cloud Console and to Container Engine. Then select Build Triggers and Add Trigger. Choose the flavour of remote repository hub that you want to synchronise; for example, we chose BitBucket here. You will need to run through some authorisation steps to get the account linked. Select the repository that you want to link, and Google Cloud will begin mirroring it. Once this process finishes, you will be presented with some additional config. …


This post assumes that you already have a working Docker image that you can build and run on your local machine, and obviously a Google Cloud account with all of the necessary APIs activated and the Google Cloud SDK installed on your machine. If all of that is in order, you are good to go!
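
As a quick sanity check of those prerequisites, the image should build and run locally along these lines (the image name and port are placeholders):

# Build the image from the Dockerfile in the current directory
docker build -t my-app .

# Run it locally, mapping the container port to the host
docker run -p 8080:8080 my-app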


The first thing I do is ensure that I have an up-to-date GitHub repo of my code, which I usually keep in the following format:

.
├── app
│   ├── <APP CODE>
│   └── main.py
└── Dockerfile

I set up a new repository in GitHub, and then make sure the repository is up to date using git add . and git commit -m "Some Message" before doing a final git push origin…


Anyone who has tried (and failed) to deploy their Keras models will understand the frustration of getting your trained models out there on the cloud. It is all well and good having all of the code sitting there in a Jupyter Notebook, but eventually you will want to deploy and run these models at scale. Hopefully this guide will help!

The Problem

There really isn’t much documentation out there on how to serve a Keras model publicly, and the work involved in setting up tools such as flask, uwsgi and nginx adds further complexity, especially if you come from the Go or Node world. DigitalOcean provide a reasonable guide for setting up this stack on their virtual machines, but unfortunately it breaks when you try to predict using a Keras model due to thread-safety issues, and it takes quite a lot of research and tweaking to get it running on this stack. …
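
One common mitigation (my own sketch here, not necessarily the fix from the full post) is to run uwsgi with a single process and a single thread, so the Keras/TensorFlow session is only ever touched from one thread; app.py and its app callable are placeholders:

# Serve the flask app single-threaded to sidestep Keras’ thread-safety issues
uwsgi --http :8080 --wsgi-file app.py --callable app --processes 1 --threads 1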


Most of the tutorials out there on setting up deep learning tools such as TensorFlow and Keras seem to be focused on Ubuntu. This is great and all, but what if you prefer a different distribution? I personally am a big Arch Linux fan and, even more so, a Manjaro fan. So here is an overview of how I set up the latest Nvidia driver, CUDA, CUDNN, Python, TensorFlow (GPU version) and Keras on a fresh install of Manjaro Linux.

Preamble: I installed from scratch using Manjaro Architect, and opted for the Budgie DE. I auto-installed the Nvidia driver during the installation, but you can always perform the mhwd installation of the Nvidia drivers by issuing the following command and following the…
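
The preview cuts off before the command itself; for what it’s worth, Manjaro’s standard mhwd invocation for auto-installing the proprietary driver for the graphics card is:

# Auto-detect and install the nonfree (proprietary) driver for the PCI graphics device
sudo mhwd -a pci nonfree 0300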


Terms like artificial intelligence, deep learning and neural networks are frequently spoken about as buzzwords, and lumped together with other emerging technologies such as virtual reality and the internet of things. However, I feel that AI is much more than that.

43 years ago, Bill Gates and Paul Allen released the first version of Microsoft BASIC for the Altair. Yes, there were breakthroughs decades prior to this, but in my opinion, Microsoft BASIC was the first true step in aligning human beings with technology. For those unaware, BASIC is a programming language which takes human-readable instructions and interprets them as machine code. Of course, before BASIC there were other languages which could (to some degree) interpret human-readable instructions and compile them to machine code, such as Fortran, LISP and COBOL. However, Microsoft BASIC was the first step in bringing programming out of the laboratory and into the hands of real users. For this reason, I believe Microsoft BASIC was as significant as it was. …

About

Dr. Joe Logan

Developer and AI enthusiast from Sydney. Founder of Alixir. Check me out @ https://jlgn.io
