A Composable Dashboard for your Dev Environment with Jupyter

sharat87


Description:

Automating routine manual tasks is an important part of a developer's workflow. It obviously saves time, especially for tasks that come up frequently, but it is also a habit that helps one grow as a developer. In this talk, I intend to show a different way of approaching this problem: turning a Jupyter Notebook into a dashboard interface that knows how to get things done.

Solving an automation problem usually follows three steps: first, identifying what needs to be automated; second, inspecting how it is currently being done manually; and finally, writing a Python script to execute that process. This is a great way to do it because it has an element of self-evaluation: the problem being solved is very specific to the developer's needs, so the solution doesn't need to be very complex or configurable. This is okay, but can we do better?

If we inspect our own typical workflow, particularly the tasks besides coding, they usually fall under one of the following (a short code sketch of one such service follows the list):

  1. A routine modification to a file on local disk. — standard library
  2. An HTTP request (GET or POST, localhost or elsewhere). — requests
  3. Starting a process, or running a shell command. — subprocess
  4. Running a command on a remote endpoint over SSH. — paramiko
  5. Acting on a database (SQL or otherwise, local or remote). — sqlalchemy or db-specific drivers
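To make this concrete, here is a minimal sketch of what one such service could look like, an SSH helper built on paramiko. The module name, function signature, and defaults are illustrative choices, not a fixed design.

    # base_services/ssh.py -- a minimal sketch of one base service (names are illustrative).
    import paramiko

    def run(host, command, username=None, key_filename=None, timeout=30):
        """Run a shell command on a remote host over SSH and return (stdout, stderr)."""
        client = paramiko.SSHClient()
        # Trust hosts already in ~/.ssh/known_hosts; auto-add new ones for convenience.
        client.load_system_host_keys()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username=username, key_filename=key_filename, timeout=timeout)
        try:
            _stdin, stdout, stderr = client.exec_command(command)
            return stdout.read().decode(), stderr.read().decode()
        finally:
            client.close()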

So I started to wonder: if we create an interface, a language with these services making up its nomenclature, can we get such tasks done just by telling the computer what to do? Turns out we can.

The interface is a Jupyter notebook. Each of the services above ends up being a Python module; I call this API the base services layer. A second layer of modules uses these base services to define functions that do the tasks I need. Let's see some examples of how this might work (a code sketch follows the list):

  • pull_all_repos() — Runs git pull on all (or some) repos you know it's safe to do that on, bringing them up to date.
  • check_processes_on_servers() — Logs into servers over SSH, checks for the processes you expect to be running, and reports what's missing.
  • docker_pull("staging") — Pulls docker images on your staging server(s).
  • delete_all_test_users_in_db() — Runs an SQL delete query on the dev and staging databases to clear all users created for experimentation purposes, along with their associated data.
  • check_errors_in_server_logs() — Logs in to servers over SSH, scans the last hour of the log files for errors, and reports them in the output.
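As one possible shape for the first of these, here is a sketch of a second-layer function built on subprocess; the repo list and paths are hypothetical and would come from your own setup.

    # tasks.py -- sketch of a second-layer function; REPOS is a hypothetical list of local paths.
    import subprocess
    from pathlib import Path

    REPOS = [Path.home() / "work" / name for name in ("api-server", "web-client", "infra")]

    def pull_all_repos():
        """Run `git pull` in each known repo and report which ones succeeded."""
        for repo in REPOS:
            result = subprocess.run(
                ["git", "pull", "--ff-only"],
                cwd=repo,
                capture_output=True,
                text=True,
            )
            status = "ok" if result.returncode == 0 else "FAILED"
            print(f"{repo.name}: {status}")
            if result.stdout.strip():
                print(result.stdout.strip())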

All these examples assume a lot about how you work: where your code lives on disk, how to connect to your servers, and so on. That is okay. Sensitive details like these can easily be moved to a yaml file located elsewhere that the base services know how to read.
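One way this could look, purely as an assumed layout: a yaml file outside the project tree, read by a small config helper in the base services layer. The path and keys here are hypothetical.

    # base_services/config.py -- sketch of keeping sensitive details out of the notebook.
    # Assumes a file such as ~/.config/devdash/config.yaml with a layout like:
    #
    #   servers:
    #     staging:
    #       host: staging.example.com
    #       username: deploy
    #       key_filename: ~/.ssh/id_staging
    #
    from pathlib import Path

    import yaml  # PyYAML

    CONFIG_PATH = Path.home() / ".config" / "devdash" / "config.yaml"

    def load():
        """Read the config file; base services look up hosts and credentials here."""
        with open(CONFIG_PATH) as f:
            return yaml.safe_load(f)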

This then turns into a powerful paradigm, where doing something routine translates into executing the relevant cell in my Jupyter Notebook. Additionally, now that there's a working base services layer, developing further automations is as easy as writing a function that leverages some or all of these services to get the job done.
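In the notebook itself, each routine task then reduces to a one-line cell, roughly along these lines (the tasks module name here is hypothetical):

    # Cell 1: bring the task layer into the notebook.
    from tasks import pull_all_repos, docker_pull

    # Cell 2: a routine task is a one-line cell, run on demand.
    pull_all_repos()

    # Cell 3: the same goes for any other task.
    docker_pull("staging")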

Prerequisites:

Fairly comfortable with Python syntax and the standard library. Optionally, some experience with Jupyter.

Speaker Info:

Hello! I am Shrikant. I have been involved with software development, directly or indirectly, for close to eleven years, and have been writing Python on and off for close to nine. I have used Python for a wide range of problems, including automation, web development, DevOps, command-line and GUI applications, server monitoring, and more. Currently, together with the team at Appsmith, I'm helping build a developer tool for building practical and useful apps quickly.

Section: Developer tools and automation
Type: Talks
Target Audience: Intermediate
Last Updated: