When machine learning pipelines are well-formed Python packages, transfer learning is much easier!

This post is my second stab at convincing people that ML pipelines should be Python packages. A previous post argued (among other things) that Python packages make it easier to develop and understand an ML pipeline. Here I want to make the case that Python packages make it easier to develop and understand future ML pipelines. That is, Python packages dramatically simplify transfer learning because they’re composable. This may seem obvious if you’ve used something like Keras Applications, but are you actually writing Python packages when you build machine learning models…?

The test case for this argument is an ASCII letter classifier that starts from pretrained MNIST feature weights. In this admittedly contrived example, starting from feature weights matters because I have far fewer labeled examples of ASCII letters (100 per class, to be precise) than of MNIST digits.
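
To make that concrete, here’s a rough sketch of the kind of reuse I have in mind, in Keras. The weights file and the layer indexing are illustrative assumptions, not the actual pipeline:

from tensorflow import keras

# Hypothetical: a model trained on MNIST, saved with its feature weights
base = keras.models.load_model('mnist_features.h5')

# Reuse everything up to the classification head as a feature extractor
features = keras.Model(base.input, base.layers[-2].output)
features.trainable = False  # keep the pretrained features fixed

# Swap in a new head with one class per letter (assuming a single case)
letters = keras.Sequential([
    features,
    keras.layers.Dense(26, activation='softmax'),
])
letters.compile(optimizer='adam', loss='sparse_categorical_crossentropy')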


Say you have a dictionary that you want to both copy and update. In JavaScript, this is a common pattern that gets its own syntax, called the object spread operator:

const oldObject = { hello: 'world', foo: 'bar' }
const newObject = { ...oldObject, foo: 'baz' }

After running this snippet, newObject will be an updated copy of oldObject: { hello: 'world', foo: 'baz' }. Turns out, you can do this in Python too, as of 3.5 (PEP 448):

old_dict = {'hello': 'world', 'foo': 'bar'}
new_dict = {**old_dict, 'foo': 'baz'}
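
The update doesn’t touch the original dictionary, which is the whole point of making a copy:

>>> new_dict
{'hello': 'world', 'foo': 'baz'}
>>> old_dict
{'hello': 'world', 'foo': 'bar'}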

When I look at a new open source deep learning project, I start with several questions.

  • What’s the structure of the model?
  • How’s the model trained?
  • How’s the training data formatted? How’s it preprocessed?
  • How can I do inference with a trained model?

For a machine learning pipeline to be useful to me in a real-world scenario, I have to know the answers to all these questions.

But knowing the answer to these questions isn’t enough by itself. I also have to:

  • Modify the bits that are different about my problem.
  • Run the entire pipeline end-to-end.
  • Re-run the parts of the pipeline that are affected when the data, model or hyperparameters change.
  • Keep track of the data, model and hyperparameter settings that lead to the best performance.
  • Deploy the optimized models in a production environment.

Adding a dimension to a tensor can be important when you’re building deep learning models. In numpy, you can do this by indexing with None at the position where you want the new axis.

>>> import numpy as np
>>> x1 = np.zeros((10, 10))
>>> x2 = x1[None, :, :]
>>> print(x2.shape)
(1, 10, 10)
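
If None reads as cryptic, np.newaxis is an alias for it that exists for exactly this purpose, and np.expand_dims does the same thing as a function call:

>>> x1[np.newaxis, :, :].shape
(1, 10, 10)
>>> np.expand_dims(x1, axis=0).shape
(1, 10, 10)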

The latest version of Anaconda comes with Python 3.7. But sometimes you need an earlier release. For example, as of today (2019-02-28), TensorFlow does not yet support Python 3.7. The preferred way to use a previous version is to create a separate conda environment for each project.

To create a fresh conda environment called tensorflow with Python 3.6 and its own pip, run the following:

conda create --name tensorflow python=3.6 pip

From there you can activate the tensorflow environment and then pip or conda install whatever you need. For example:

conda activate tensorflow
conda install tensorflow
pip install ipython matplotlib

Then to return to the base environment, just run conda deactivate.
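
To double-check which interpreter an environment uses, activate it and ask Python directly:

conda activate tensorflow
python --version

For the environment created above, this should report a 3.6.x release.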


SNS is AWS’s pub-sub service. It’s useful for sending and receiving alerts about events you care about. It can also send SMS messages. If you’ve set up your AWS command-line tool, you can do this in 3 lines of Python.

import boto3

# Assumes AWS credentials are already configured (e.g. via `aws configure`)
sns = boto3.client('sns')
sns.publish(
    PhoneNumber='+15558675309',  # recipient in E.164 format
    Message='hello world'
)

In this post we’ll classify an image with PyTorch. If you prefer to skip the prose, you can check out the Jupyter notebook.

PyTorch is one of the newer members of the deep learning framework family. Two interesting features of PyTorch are its pythonic, numpy-like tensor manipulation and its dynamic computational graphs, which handle recurrent neural networks more naturally than static computational graphs do. A good description of the difference between dynamic and static graphs can be found here.

The most basic thing to do with a deep learning framework is to classify an image with a pre-trained model. This works out of the box with PyTorch.
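
Here’s a minimal sketch of what that looks like with torchvision; the image path is a placeholder and the 18-layer ResNet is an arbitrary choice:

import torch
from torchvision import models, transforms
from PIL import Image

# Load a network pre-trained on ImageNet and switch to inference mode
model = models.resnet18(pretrained=True)
model.eval()

# Standard ImageNet preprocessing: resize, crop, convert, normalize
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open('dog.jpg')  # placeholder path
batch = preprocess(image).unsqueeze(0)  # add a batch dimension

with torch.no_grad():
    scores = model(batch)
print(scores.argmax(dim=1))  # index of the predicted ImageNet class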


The IPython shell is a fast way to evaluate small bits of code. It also functions as a mighty fine calculator. You can install it with pip, pip install ipython, and launch it with the ipython command. Besides the normal Python REPL stuff, here are a few cool things about the IPython shell.

%paste

Let’s say you’re reading a blog post on how to do something amazing in Python and you want to follow along in the shell. If you copy a code snippet to your clipboard, you can type %paste in the shell and IPython will execute the whole snippet, indentation and all. Go ahead and try it:

one_string = "this "
two_string = "is "
red_string = "pretty "
blue_string = "sweet"
print(one_string + two_string + red_string + blue_string)

Copy those lines to your clipboard, then run %paste in the shell:

>>> %paste
this is pretty sweet

The Amazon Web Services (AWS) command line tool is a full-featured alternative to using the AWS console to perform actions in your account. Getting started is dead simple.

I assume you have an AWS account and access to your Access Key ID and Secret Access Key. If that’s not true, you can read our getting started guide.

  1. Install the tool.

    pip install awscli
  2. Add your credentials.

    aws configure

    Enter the AWS Access Key ID and AWS Secret Access Key when prompted. You’ll also need to set the Default region name. I use us-east-1, which is in Northern Virginia; it’s a good choice because all of the AWS services are available there. It may be better to pick a different region if you live more than about 2,000 miles from Virginia.
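
To sanity-check the configuration, you can ask the CLI who you are:

aws sts get-caller-identity

If the credentials were saved correctly, this prints your account ID and user ARN.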


Once you’ve installed Docker, there are a few basic features to know. In this post you’ll learn about running containers. If you haven’t gotten started with Docker yet, check out this quick start guide.

The basics

You can run Docker containers with a command that takes the following form:

docker run [options] <image name> [command]
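
For example, this starts an interactive shell in a throwaway Ubuntu container (the image tag here is just an illustration):

docker run --rm -it ubuntu:18.04 bash

The --rm flag removes the container when it exits, -it attaches an interactive terminal, ubuntu:18.04 is the image name, and bash is the command.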

Docker is a useful tool for creating lightweight, virtual-machine-like environments called containers. Containers are instances of Docker images, which are defined in a simple language, usually in a file named Dockerfile; it’s common practice to version control these files. When you run a container on your computer you get access to an entirely separate Linux environment. Better yet, you can run the same container on your laptop that runs in a battle-tested production environment, giving you the opportunity to develop and test in a realistic setting. This takes one major source of uncertainty out of the process of running your code on another machine.


The problem

Tuples are immutable sequences in Python. One common way to create a tuple is to put comma-separated values between parentheses.

some_tuple = (1, 2, 3)

There are a few circumstances where you might prefer an immutable sequence over a mutable one, like a list. One reason they can be useful is for holding the arguments to a function. Let’s say you want to write a threaded function that prints its argument every second for 10 seconds. You might do something like the following:

import time
import threading

def print_forever(some_value):
    for _ in range(10):
        print(some_value)
        time.sleep(1)

args_tuple = (1)
thread = threading.Thread(target=print_forever, args=args_tuple)

But this example is wrong in a subtle way and throws an exception.

>>> thread.start()

Exception in thread Thread-65:
Traceback (most recent call last):
  File "/Users/jbencook/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/Users/jbencook/anaconda3/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
TypeError: print_forever() argument after * must be an iterable, not int
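
The error message gives the game away: (1) is just the integer 1 wrapped in parentheses. It’s the trailing comma, not the parentheses, that makes a one-element tuple:

args_tuple = (1,)  # the comma makes this a tuple
thread = threading.Thread(target=print_forever, args=args_tuple)
thread.start()

With the comma in place, the thread prints 1 once a second, ten times.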