Docopt, setup.py and ConfigParser save time during development and I’ve been using them continuously since I used them for the first time. These tools are part of my toolbox now. See the article below and decide whether you’d like to add them to your own.

I like Python scripting because it allows me to create small automation routines quickly. Most of these programs have no more than a few hundred lines of code, and their uses span from backup management to data processing and integrating the APIs I rely on. Some of these programs I use on a daily basis; others were used only once.

In the process of building these tools, I’ve developed some conventions and habits that speed up development and solve common classes of tasks. Hence this article.

I encourage you to write down notes about your own conventions and tricks as well; they may save someone else’s time, even if only a few minutes, or a few dozen.

It helps us grow. To give you an example: since reading Kamil Supera’s article Three friends of the better code style – Python, I’ve been applying its advice in my small projects too. The best way to learn is from your peers.

And now, let’s look at the three tools I have in mind for you.

Docopt

If there’s one thing I’d like you to leave with, it’s Docopt. I fell in love with it at first sight. It’s just that good. I use it wherever I need to create a console utility – despite argparse being part of the standard library.

Why? Because for most programs you’ll ever write, Docopt suffices, and in a pretty elegant manner: all you need to do is write down a declarative description of what your program does. Don’t worry that you need to learn some new API; you already know this one:

At the top of the file, we’ve got the documentation in the form of a hand-written help message. Below it, a one-line function call returns a dictionary with all the options and arguments passed to the program. As an added bonus, you’ll see the same help message under the -h or --help option (provided you’ve included it in your help message). It couldn’t get any easier than that.

This approach has a lot of benefits. For starters, it rewards me for writing a help message that I can come back to a year later to get an instant refresher on what the program does. It’s also a lot easier, shorter and more intuitive than its argparse equivalent. That’s what I call efficient.

So are there any drawbacks? Docopt is not great for commands like git that have their own subcommands. It’s possible to implement such a command, but at some level of complexity it stops being manageable. In such situations it’s better to use argparse.

pip install -e . and python3 -m venv env

Another trick I deploy is to always use the repository structure described below. It simplifies adding new tools, and if I need to split a specific subset of tools off into a new repository, that’s quick as well.

To provide an example, the first commit in my last repository looked like this:

.gitignore
setup.py
src/randomtools/__init__.py
src/randomtools/copiesfromcsv.py
  1. .gitignore is generated from gitignore.io
  2. setup.py is shown below.
  3. src/randomtools/copiesfromcsv.py is a specific tool.

setup.py

For the repository to be installable as a package, we need to provide a setup.py script. For my repositories it looks like this:
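A sketch matching the file layout above — the package name randomtools and the script copiesfromcsv come from the earlier listing, while the version and description are placeholders:

```python
from setuptools import setup

setup(
    name='randomtools',
    version='0.1.0',
    description='Assorted personal console utilities.',
    # Packages live under src/, so map the package root there.
    package_dir={'': 'src'},
    packages=['randomtools'],
    entry_points={
        'console_scripts': [
            # path.to.module:function_to_run
            'copiesfromcsv = randomtools.copiesfromcsv:main',
        ],
    },
)
```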

In the packages list, we need to add all submodules (e.g. if we have a subdirectory src/randomtools/config, it needs to be added here as randomtools.config).

In the console_scripts list, we can define all scripts that can be run from the console. The form path.to.module:function_to_run is used to define the entrypoint function. I suggest having only one entrypoint per file, which I typically call:

def main():
    # [...]

What to do then?

It’s easy.

For development, you can install the package with the following commands:

python3 -m venv env
. env/bin/activate
pip install -e .

At this point, we have a virtual environment in the env directory, which can be activated in the terminal with the . env/bin/activate command. Two things have now become possible:

  1. We can run all console tools defined in setup.py.
  2. We can edit any file in src and we do not need to reinstall anything to see the changes we’ve made. The only exception is when we change setup.py, for example by adding a new script.

We can also install the package globally in our system by issuing the following command:

sudo pip3 install .

Voilà.

ConfigParser, passwords and configuration

Python has a built-in configparser module for handling .ini files. In my scripts, I use it every time I need access to passwords and other configurable values, such as hostnames, API URLs, and so on.

I wrap the configparser.ConfigParser class in a very simple class of my own that reads the file, sets certain defined attributes and raises an exception if any required value is not defined.

It could be a function, which would make it even simpler.

Example:
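The sketch below shows one way to write such a wrapper; the section and option names ([api] url and token) and the exception name are illustrative:

```python
import configparser


class ConfigError(Exception):
    """Raised when the config file or a required value is missing."""


class Config:
    # Attributes the wrapper exposes, each mapped to a
    # (section, option) pair in the .ini file.
    REQUIRED = {
        'api_url': ('api', 'url'),
        'api_token': ('api', 'token'),
    }

    def __init__(self, path):
        parser = configparser.ConfigParser()
        if not parser.read(path):
            raise ConfigError(f'Cannot read config file: {path}')
        for attribute, (section, option) in self.REQUIRED.items():
            try:
                setattr(self, attribute, parser.get(section, option))
            except (configparser.NoSectionError,
                    configparser.NoOptionError):
                raise ConfigError(
                    f'Missing required value [{section}] {option} in {path}')
```

With this in place, Config('mytool.ini') either yields an object with api_url and api_token attributes, or fails loudly at startup instead of halfway through the script.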

Summary

The three conventions described above have saved a lot of my time when developing my own console utilities. They reduce time spent on adding common features to a program – something you should never spend too much time on. Here are the specific problems solved in this article:

  1. Program options and documentation;
  2. Installation and running in a development environment;
  3. Configuration stored in text files.

With such conventions, I spend proportionately more time on the task to be done, which means better utilization of my productive time.

I hope that this article will save some of your future development time, too. Or at least make you reflect upon your code and look for other tasks that could be simplified by common conventions; there’s no need to reinvent the wheel every time.

Check out the article Three friends of the better code style – Python, if you liked this one.


Co-founder and CIO of Makimo, deeply fascinated with philosophy, humans, technology and the future.