Dockerizing Your OpenStudio Workflow
Use Docker Containers to Automate your Workflow and Provide Portability for OpenStudio
Docker + OpenStudio
Why should you use containers? There's certainly a learning curve when it comes to developing and modeling in containers, but once you get the general workflow down, Docker makes a lot of things much, much easier.
First off, you will never have to mess around with environment issues again. Don't have the right version of OpenStudio? The right version of Ruby? Can't get OpenStudio-Standards installed? Did your environment break between Friday afternoon and Monday morning for no apparent reason? With Docker, the image is the same every single time. If something weird happens, you can rebuild the container and lose absolutely nothing.
Second, sharing models and applications is much easier with containers. If you email a co-worker an OpenStudio model that you've been working on locally, the weather files and measures will surely have different paths that are going to break the second she opens it in OpenStudio. When you run models in a container, everything is local to the container's operating system, so all you have to do is map the results back to your machine.
Third, the Docker build process can be used to automate model development and measure analysis. You can create an entire pipeline in a Docker container to build your model from geometry and weather assets, use measures to configure the systems, and use a workflow file to analyze energy efficiency options.
As a bonus, containers make developing applications with OpenStudio or running models in the cloud far easier, which we'll get to in follow up articles.
This article outlines the basics of working with OpenStudio inside a Docker container, along with some examples to help you get started. If you aren't sure what Docker is or how containers work, head over to www.docker.com to get familiar with the technology and install docker and docker-compose. We won't go into installation here; the team at Docker has already put together some great getting-started documentation.
OpenStudio in a Container
NREL publishes a basic image for working with OpenStudio; it comes with the OpenStudio command line tools and Ruby dependencies built in. If a vanilla OpenStudio install is all you need, a basic Dockerfile to get you going will look like:
```dockerfile
FROM nrel/openstudio:latest

RUN mkdir -p /code/model
COPY ./model/* /code/model/

CMD ["openstudio", "run", "--workflow", "/code/model/workflow.json"]
```
This Dockerfile uses the NREL OpenStudio image as a base ("latest" means it will pull in the newest version, currently OS 3.0.0). It then creates a directory for your model assets, copies the files over, and runs an OpenStudio CLI command to execute a workflow. One thing missing, however, is a mapped directory for storing results; in the above container, the model is run and the container exits once it's finished. If you want to run a pure Docker container, you'll need to use a VOLUME statement to access the run results.
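As a sketch of that approach, you could declare where the run results land (the path below is illustrative, not fixed by OpenStudio):

```dockerfile
# Declare the results directory as a volume so it survives the container
# and can be inspected or bound to the host (path is an assumption)
VOLUME /code/model/run
```

At run time you can then bind that directory to a host folder with `docker run -v`, or copy results out of a stopped container with `docker cp`.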
I personally don't like mounting volumes via the Dockerfile itself. A tool called Docker-Compose provides orchestration that makes this process easier and more flexible.
Docker-Compose lets you define a service for your Dockerized application. Its real purpose in life is to make developing applications easier by integrating all of your services and taking care of the networking. If you are an energy modeler you might not care about that part, but there are a few things Compose makes really easy for you. For one, it's easier to use sensible naming when working with Compose.
For instance, this command will run a basic OpenStudio Docker container and execute a model run:
```shell
docker run -v /Users/sam/model:/model -w /model nrel/openstudio openstudio run --workflow workflow.json
```
(This runs the base OpenStudio image, maps a local volume into the container, sets the working directory, and then runs the model via a workflow.)
Versus doing it with Docker-Compose:
```shell
docker-compose run model
```
This command does the exact same thing, but defines how the service is built and run in a configuration file called docker-compose.yml. To get this command to work, make sure both Docker and Docker-Compose are installed on your machine, then create a compose file in the same directory as your Dockerfile and model assets; it should look like this:
```yaml
version: "3"
services:
  model:
    build:
      context: .
      dockerfile: Dockerfile
    working_dir: /code/model
    volumes:
      - ./model:/code/model
    command: openstudio run --workflow /code/model/workflow.json
volumes:
  model:
```
Naming our service gives us a convenient entry point (docker-compose run model), while build instructions and volume mounts let us easily tailor the modeling environment in a repeatable (and portable) fashion.
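Compose also merges a docker-compose.override.yml automatically if one exists, which is a convenient way to remap results to the host without touching the main file. A minimal sketch (the ./results path is illustrative):

```yaml
# docker-compose.override.yml — merged on top of docker-compose.yml by Compose
version: "3"
services:
  model:
    volumes:
      - ./results:/code/model/run   # map the run output back to the host
```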
SuperStudio provides a nice way to automate model builds, and this extends to containers as well. Instead of launching an app to run models like we typically would, we need to define an OpenStudio workflow so the command line tools can run the model. The SuperStudio CLI can generate a basic workflow file for you if you aren't familiar with the format:
```shell
superstudio workflow --new --model /path/to/model.osm --weather /path/to/weather.epw
```
This generates a new workflow.json in the current directory.
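For reference, an OpenStudio workflow (OSW) file is JSON that points at a seed model and weather file and lists measure steps to apply. A rough sketch is below; the measure name and arguments are illustrative, not something the superstudio command necessarily emits:

```json
{
  "seed_file": "denver-model.osm",
  "weather_file": "USA_CO_Denver.Intl.AP.725650_TMY3.epw",
  "steps": [
    {
      "measure_dir_name": "SetWindowToWallRatioByFacade",
      "arguments": {
        "wwr": 0.4,
        "facade": "South"
      }
    }
  ]
}
```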
Using a custom Dockerfile, we can automate model development using a combination of superstudio and regular measures. The sarocu/superstudio image is freely available on Docker Hub; it's based on the latest OpenStudio image and installs OpenStudio-Standards and the SuperStudio CLI by default:
```dockerfile
FROM sarocu/superstudio:latest

RUN mkdir -p /code/model

# Copy over geometry + weather files:
COPY ./model/floorplan.json /code/model/
COPY ./model/USA_CO_Denver.Intl.AP.725650_TMY3.epw /code/model/

WORKDIR /code/model

# Create a standards based model:
# -t sets an OpenStudio-Standards building type
# -z sets the climate zone
# -s sets an OpenStudio-Standards standard template
# -p sets the path to save the model to
# --geometry --json /path/... tells superstudio to merge geometry from a FloorPlanJS file and sets the path to the file
# -a true tells superstudio to use assumptions from OS-Standards
RUN superstudio --create -t 'LargeOffice' -z 'ASHRAE 169-2006-5A' -s '90.1-2013' \
    -p /code/model/denver-model --geometry --json /code/model/floorplan.json -a true

# Create a workflow.json file:
RUN superstudio workflow --new --model ./denver-model.osm --weather ./USA_CO_Denver.Intl.AP.725650_TMY3.epw
```
You can also use superstudio to scale your model and change your window-wall ratio by adding one line:
```dockerfile
RUN superstudio --update --model /code/model/denver-model.osm --geometry --wwr 0.3 --floor_area 10000
```
This command uses the base model and scales it to 10,000 square feet while applying a 30% WWR.
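To make the image runnable end to end, the Dockerfile could close with a CMD that executes the generated workflow when the container starts (a sketch, assuming the workflow step above wrote workflow.json into /code/model):

```dockerfile
# Run the generated workflow on container start
CMD ["openstudio", "run", "--workflow", "/code/model/workflow.json"]
```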
Docker and Docker-Compose are great tools for automating your modeling process, and if you're developing applications that use OpenStudio and EnergyPlus under the hood, they simplify your development process and give you an easy way to get models running in the cloud. Once you're up and running on a basic Dockerized modeling environment, plug in the superstudio CLI to bootstrap models and automate the boring stuff.
If you have any questions or are interested in contributing to SuperStudio development, please reach out here or drop me an email at email@example.com