Example Python and Travis

Learning how to properly structure a Python repository on GitHub, with unit tests, and Travis integration.

Repository structure

Keep it simple. Have a readme.md.

Python modules are just valid Python files. When you import a module, the file is executed by the Python interpreter. Read the docs for more.

A Python package is just a directory structure containing modules. You add a __init__.py file to indicate that a directory is a package. An empty __init__.py does nothing on import, so it is common to include further import commands in it.

  • For example, import points runs two commands:
    • import points.nn which means that points.nn is imported into your namespace
    • from points.point import Point which means that Point now becomes a type in your namespace
    • see example.py for usage. An interesting exercise is to change __init__.py to be empty, and see how then example.py needs to be modified.
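To make the re-exporting behaviour of __init__.py concrete, here is a runnable sketch that builds a tiny points package in a temporary directory; the body of the Point class is an assumption, kept minimal for illustration.

```python
# A runnable sketch of how __init__.py lifts names into the package namespace.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
pkg = os.path.join(root, "points")
os.makedirs(pkg)

# points/point.py defines a class (a minimal stand-in here)...
with open(os.path.join(pkg, "point.py"), "w") as f:
    f.write("class Point:\n    def __init__(self, x, y):\n        self.x, self.y = x, y\n")

# ...and __init__.py imports it, so users can write points.Point directly.
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from points.point import Point\n")

sys.path.insert(0, root)
import points

p = points.Point(1, 2)
print(p.x, p.y)
```

With an empty __init__.py instead, you would have to write from points.point import Point yourself in example.py.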

To learn more, browse around GitHub and see how others structure imports. Or look at packages in your local Python install.

Tests

Here I use pytest; the docs are good, though I have barely explored the features pytest offers. The Python standard library comes with the unittest module, which is more object-oriented. The mocking abilities in unittest.mock are very useful, however.

I have put tests in the tests directory:

  • Read and re-read Conventions for Python test discovery
  • I decided to put an empty __init__.py file in tests. This makes pytest walk back up to the project root before running tests, which is where you need to be to import the package.
  • An alternative is to put tests in the same directory as the module they test. Coming from a Java/Maven world, this seems cluttered to me.

You can run the tests by executing pytest or py.test in the root directory of the project.
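As a concrete sketch, a test module under tests might look like the following; the Point class here is a self-contained stand-in, since the real points.point.Point is not reproduced in this readme.

```python
# tests/test_point.py -- a minimal pytest-style test module (sketch).
# pytest discovers files named test_*.py and functions named test_*.

class Point:  # stand-in for points.point.Point, so this sketch is self-contained
    def __init__(self, x, y):
        self.x, self.y = x, y

def test_point_coordinates():
    p = Point(3, 4)
    assert p.x == 3
    assert p.y == 4
```

Plain assert statements are enough: pytest rewrites them to give informative failure messages, with no need for unittest-style assertEqual methods.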

See tests readme for more.

Travis CI integration

The Travis documentation is very clear.

If it all works, you should see your project on the Travis page: here's mine. To finish, we should of course add the build badge to readme.md.

What's going on under the hood is that Travis builds your project on a fixed virtual machine image "in the cloud" (i.e. on one of their servers). For Python there is no compile stage, so the "build" stage is actually running all of the tests.
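As a sketch, a minimal .travis.yml for a project like this might look as follows; the exact Python version and install steps are illustrative assumptions, not a copy of this repository's file:

```yaml
# .travis.yml -- minimal sketch; the version listed is an assumption
language: python
python:
  - "3.6"
install:
  - pip install -r requirements.txt
script:
  - pytest
```

Travis reads this file from the repository root on every push, installs the dependencies, and reports the build as passed or failed depending on the pytest exit code.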

Code coverage

So now we have unit tests, and we're running them on every push thanks to Travis. How do we know that we're testing everything we should? For this, we need a code coverage tool, and a common choice to use with GitHub is:

  • Codecov.io: visit this and sign up with your GitHub account
  • Follow the Codecov Python Example.
  • I ended up adjusting requirements.txt and .travis.yml.
  • See the source for this readme.md file for how to add the codecov badge.
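Following the Codecov example, the kind of adjustment involved looks roughly like this; the lines below are a sketch, not a copy of this repository's files:

```yaml
# .travis.yml additions (sketch): run the tests under coverage, then upload
script:
  - pytest --cov=points
after_success:
  - codecov
```

Correspondingly, requirements.txt would gain the pytest-cov and codecov packages so that Travis can install them.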

Pip setup

We should list the packages we require, both to help users, and to allow Travis to correctly install the dependencies.

  • As an example, fork this repository, and change requirements.txt to an empty file
  • The build should then fail, as scipy will no longer be found.
  • Travis installs numpy automatically, along with pytest, but little else.
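Given the above, the requirements.txt for this project can be very short; this is a sketch (unpinned versions — pinning exact versions is also common):

```
# requirements.txt (sketch) -- pytest comes preinstalled on Travis, so
# only scipy needs listing
scipy
```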

Lots more documentation here.

Documentation

Things were going so nicely... but making nice documentation seems hard.

Write docstrings

This at least is easy.

  • TODO: Lots more to say here

Sphinx

To produce, say, HTML documentation which includes discussion, and extracted docstrings, the current best practice seems to be to use Sphinx.

Clearly I need to spend some more time playing with Sphinx. To get it at least vaguely working, I needed to slightly adapt Sam Nicholls's comment. You need to ensure that when conf.py is run, from whatever directory it ends up in, the source code for our project can be imported. As laid out here, we have docs/sources/conf.py, while the package points can be imported from the root, so relative to where conf.py is we need to step back two directories. To do this, I added/edited the following in conf.py:

import os
import sys
sys.path.insert(0, os.path.abspath(os.path.join('..','..')))

This makes the Python interpreter search two directories up for imports. Also follow Sam's suggestion regarding running sphinx-apidoc (the generated files are now in source control, but by default, after running sphinx-quickstart, they are not).

The end result is not yet pretty, but at least works.

Publishing your package

We have no intention of doing this for our example, but the standard pip supported way is to visit pypi.python.org and follow the instructions.

Once you have a correctly written setup.py file, you can run python setup.py install, which will install the package points. You can then run python anywhere and import points will work, using the installed version instead of the development version. At least on my Windows install of Anaconda, you can see where the files are installed, and to "uninstall" you simply delete them.
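A minimal setup.py for a package like points might look as follows; the name, version, and metadata are illustrative assumptions, not this repository's actual file:

```python
# setup.py -- minimal sketch; all metadata values are assumptions
from setuptools import setup, find_packages

setup(
    name="points",
    version="0.1.0",
    packages=find_packages(exclude=["tests"]),
    install_requires=["scipy"],
)
```

The find_packages call picks up any directory with an __init__.py, which is why excluding tests matters if you gave that directory its own __init__.py as above.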

If you have uploaded your package to PyPI correctly, then you can run pip install <package-name>, and pip uninstall <package-name> to remove it again.
