
Easy Docs For Node And Python


Let’s say you are shipping a library. And let’s say that you are developing it in JavaScript, TypeScript or Python. As a developer, you probably want to have it documented. But documentation is complicated, right? You have to pick a tool, integrate a bunch of plugins into it, and then have a process that somehow transforms the output into the proper formats. A lot of hassle. But what if it didn’t have to be? Enter doctainers. The goal of this mini-project was to wrap all the necessary tools to generate library documentation into portable Dockerfiles - so all you need to do to get instantly better docs is run docker build.

The benefit of this is that you can easily package up everything and “ship it” to the cloud.

Getting Ready #

Before we start, let’s talk about what you will need installed on your machine. The list is below:

- Docker

No, really - that is it. The only software you will need to have installed locally to work with doctainers is, in fact, Docker. Once you have that installed on your computer, you are good to go.

The choice of Docker for this project was a really easy one to make - it allowed me to abstract all the other requirements and considerations for the target platform into a set of container definitions. Whether you are running Windows, Linux or macOS, you will get the exact same end result, without having to deal with dependency and misconfiguration issues.

Types of Containers #

Out of the box, I am currently providing three types of containers that help you generate documentation for your library. They all come pre-packaged with DocFX, the same tooling that powers docs.microsoft.com, and a set of extensions that convert platform-specific documentation into YAML, which is then built into a static site and served from the container.

Vanilla JavaScript #

Download the JavaScript Dockerfile for DocFX.

This image is helpful if you are using standard *.js files inside your npm package. It’s also worth mentioning that the image is tailored to pull content from npm - it wouldn’t be hard to modify it to read local source code instead (and I will eventually get to building an image that does just that).

There are two arguments you need to pass to docker build in this case: LIBRARY, the npm package ID, and LIBRARY_FOLDER, the folder within the package where the library code is located. By default, LIBRARY_FOLDER is set to src, so if that matches your package layout, you don’t need to do anything extra.
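As an illustration, here is what a build might look like for a hypothetical package (the package name, folder and image tag below are placeholders, and this assumes the Dockerfile sits in the current directory):

```bash
# Document the npm package "my-lib", whose sources live under "lib"
docker build \
  --build-arg LIBRARY=my-lib \
  --build-arg LIBRARY_FOLDER=lib \
  -t my-lib-docs .
```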

The image is using the node2docfx plugin, developed by the team I work on. Through some special incantations, we create a configuration file on the fly that specifies which *.js files need to be documented, and we produce DocFX-compatible YAML once the processing is complete.

Make sure you ship JSDoc comments in your code to make the generated documentation richer!
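For example, a function annotated like this (a generic JSDoc sketch, not taken from any particular package) gives the tooling descriptions, parameter types and return types to surface in the docs:

```js
/**
 * Pads a string to the given length with a fill character.
 * @param {string} input - The string to pad.
 * @param {number} length - The desired total length.
 * @param {string} [fill=" "] - The character to pad with.
 * @returns {string} The padded string.
 */
function pad(input, length, fill = " ") {
  return input.padStart(length, fill);
}
```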

TypeScript #

Download TypeScript Dockerfile for DocFX.

This image is mostly the same as the vanilla JavaScript one, the difference being that I’ve integrated the type2docfx plugin for DocFX. This plugin converts the JSON output produced by TypeDoc into DocFX-ready YAML (you can see a pattern here - we like YAML).

And just like with the vanilla JavaScript image, you can pass the LIBRARY and LIBRARY_FOLDER arguments to customize the build job.
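The invocation looks the same as before - only the Dockerfile changes (the file name and package below are, again, placeholders):

```bash
# Assumes the TypeScript Dockerfile was saved as Dockerfile.typescript
docker build -f Dockerfile.typescript \
  --build-arg LIBRARY=my-ts-lib \
  --build-arg LIBRARY_FOLDER=src \
  -t my-ts-lib-docs .
```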

Python #

Download Python Dockerfile for DocFX.

Hands-down, my favorite image, because I 💚 Python. Under the hood, it still uses Sphinx, the beloved Python documentation tool. But it also uses sphinx-docfx-yaml, a plugin that was co-developed with the co-founder of Read the Docs, Eric Holscher. Unlike the other images, the Python Dockerfile assumes that the library you need to document is stored in a Git repo somewhere. When building the image, you can pass LIBRARY - the URL to the Git repo, and LIBRARY_FOLDER - the folder within the repository where the source code is located.
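So a Python build might look like this (the repository URL and folder are placeholders):

```bash
# For the Python image, LIBRARY is a Git URL rather than an npm package ID
docker build \
  --build-arg LIBRARY=https://github.com/example/my-py-lib \
  --build-arg LIBRARY_FOLDER=src \
  -t my-py-lib-docs .
```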

Now, you might be asking - what if there are dependent packages that need to be installed? You can add those within the Dockerfile - there is a designated section just for that. The build output will tell you which packages are missing once you run docker build.
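A sketch of what that section might look like, assuming the image installs Python dependencies with pip (the package names are just examples):

```dockerfile
# --- Library dependencies go here ---
# Replace with whatever packages your library imports
RUN pip install numpy requests
```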

Deploying to the Cloud #

Once you have the container, you can easily deploy it to a cloud service. Take Azure as an example - you can push the image to Azure Container Registry, and subsequently spin up a container instance. Given that all the required steps are already documented and there is nothing specific to doctainers (deployment is typical - that is the beauty of the project), I won’t go too in-depth in this blog post. If you run into any issues, make sure to contact me through the comment section.
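For reference, the flow looks roughly like this with the Azure CLI (the registry, resource group and image names are placeholders, and depending on your setup you may also need to pass registry credentials to the container instance):

```bash
# Push the local image to Azure Container Registry
az acr login --name myregistry
docker tag my-lib-docs myregistry.azurecr.io/my-lib-docs:latest
docker push myregistry.azurecr.io/my-lib-docs:latest

# Spin up an Azure Container Instance from the pushed image
az container create \
  --resource-group my-docs-rg \
  --name my-lib-docs \
  --image myregistry.azurecr.io/my-lib-docs:latest \
  --ports 1900
```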

One thing you might notice is that the container today is bound to port 1900. Whenever you deploy, make sure the desired port is open, whether it’s the default one or one you modified.
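And if you want to check the result locally before deploying, you can simply map the container port to one on your host (8080 here is just an example):

```bash
# Serve the generated docs at http://localhost:8080
docker run -p 8080:1900 my-lib-docs
```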