Databricks Jsonnet Guide

With over 1000 jsonnet files and templates, Databricks is, to the best of our knowledge, one of the larger users of Jsonnet. This guide draws from our experience coaching and working with engineers at Databricks.

Jsonnet is a language used most commonly to describe a finite number of complex, differentiated resources. For example, we may be describing services deployed within a Kubernetes cluster, differentiated by running in development versus production. As another example, we may be describing resources within a Cloud Provider, such as an Amazon RDS or Google Cloud SQL database, deployed across different regions.

Because we are most commonly describing a finite and somewhat fixed set of resources, it is useful to think of jsonnet as code which is executed at commit time (in the code repository sense) to materialize specific resources such as Kubernetes Deployment JSONs or AWS CloudFormation templates. In this way, the materialized templates are production code and source code diffing tools are unit tests, which means that we can be more certain in the correctness of jsonnet templates without writing specialized tests.
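As a small, hypothetical illustration of this workflow, the template below describes one concrete resource and materializes to plain JSON (field names here are invented for the example):

```jsonnet
// service.jsonnet (hypothetical): describes one concrete resource
{
  service: "webapp",
  replicas: 3,
  port: 8080,
}
```

Running `jsonnet service.jsonnet` emits the corresponding JSON (with fields in alphabetical order), which is committed alongside the source; any change to the template then shows up as a reviewable diff in the materialized output.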

Jsonnet is a relatively constrained language, but we have found that sometimes the most obvious way to build and extend jsonnet also leads to significant headache down the line for people to understand and extend code. We have found that the following guidelines work well for us on projects with high velocity -- depending on the needs of your team, your mileage might vary.

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Table of Contents

  1. Document History

  2. Syntactic Style

  3. Defining and Using Abstractions

  4. Best Practices

Document History

  • 2017-06-17: Initial version.

Syntactic Style

Autoformatting

Use jsonnet fmt to format files. This will fix basic style errors.

Variable Declaration

  • Variables should be named in camelCase style, and should have self-evident names.
    local serverPort = 1000;
    local clientPort = 2000;
    
  • Prefer local to :: syntax for private/local variables. Unlike ::, variables defined with local cannot be overridden by children, nor accessed by other files.
    {
      // CORRECT
      local myVariable = 3,
      result: myVariable + 1,
    
      // INCORRECT
      myVariable:: 3,
      result: $.myVariable + 1,
    }
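To see why this matters, a field declared with `::` can be silently overridden by anything that extends the object, whereas a `local` cannot. A minimal sketch (names are hypothetical):

```jsonnet
local base = {
  myVariable:: 3,
  result: self.myVariable + 1,  // late-bound: resolves against the final object
};

// `::` fields are open to override; this materializes to { "result": 11 }
base + { myVariable:: 10 }
```

Had `myVariable` been a `local`, the override would have had no effect on `result`.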
    

Line Length

  • Limit lines to 100 characters.
  • The only exceptions are import statements and URLs (although even for those, try to keep them under 100 chars).

Spacing and Indentation

  • Put one space before and after operators

    local c = a + b;
    
  • Put one space after commas.

    ["a", "b", "c"] // CORRECT
    
    ["a","b","c"] // INCORRECT
    
  • Put one space after colons.

    {
      // CORRECT
      foo:: "bar",
      baz: "taz",
      nested: { hello: "world" },

      // INCORRECT
      foo :: "bar",
      baz:"taz",
      nested: { hello : "world" },
    }

  • Use 2-space indentation in general.

  • Only method or class parameter declarations use 4-space indentation, to visually differentiate parameters from the method body.

    // CORRECT
    local multiply(
        number1,
        number2) = {
      result: number1 * number2
    }
    
  • Do NOT use vertical alignment. It draws attention to the wrong parts of the code and makes the aligned code harder to change in the future.

    // Don't align vertically
    local plus     = "+";
    local minus    = "-";
    local multiply = "*";
    
    // Do the following
    local plus = "+";
    local minus = "-";
    local multiply = "*";
    

Blank Lines (Vertical Whitespace)

  • A single blank line appears:
    • Within method bodies, as needed to create logical groupings of statements.
    • Optionally before the first member or after the last member of a class or method.
  • Use one or two blank lines to separate class definitions.
  • An excessive number of blank lines is discouraged.

Defining and Using Abstractions

Defining Classes

  • Rather than defining a concrete JSON file, it is often useful to define a template which takes some set of parameters before being materialized into JSON. We can liken named functions which take a set of parameters and result in a fixed schema to "classes" in object-oriented languages, and so we will use that terminology.
  • When defining a class, use the following syntax:
    local newAnimal(name, age) = {
      name: name,
      age: age,
    };
    {
      newAnimal:: newAnimal,
    }
    
  • Returning a dictionary with a "newXXX" method (rather than just returning the constructor directly) allows exposing constants, static methods, or related class constructors from the same file. In other words, it allows extending this class in the future without refactoring all downstream consumers.
  • When defining a class with both required and optional parameters, put required parameters first. Optional parameters should have a default, or null if a sentinel value is needed.
    local newAnimal(name, age, isCat = true) = { ... }
    
  • Wrap parameter declarations by putting one per line with 2 extra spaces of indentation, to differentiate from the method body. Doing this is always acceptable, even if the definition would not wrap.
    local newAnimal(
        name,
        age,
        isCat = true) = { 
      name: name,
      ...
    }
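For instance, the wrapper object can later grow constants or related constructors without touching existing callers of `newAnimal` (a hypothetical sketch; the added names are illustrative):

```jsonnet
local newAnimal(name, age) = {
  name: name,
  age: age,
};

{
  newAnimal:: newAnimal,
  // Added later; existing `animalTemplate.newAnimal(...)` call sites are unchanged.
  maxKnownAge:: 100,
  newKitten(name):: newAnimal(name, 0),
}
```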
    

Defining Methods

  • Method definitions follow the same syntactic style as class definitions.
  • Methods defined within a class should always be defined with ::, as they fail to render with :.
  • Methods which return single values (rather than a dictionary) should use parentheses () to enclose their bodies if they are multi-line, identically to how braces would be used.
    {
      multiply(number1, number2):: (
        number1 * number2
      ),
    }
    

Using Classes

  • Import all dependencies at the top of the file and give them names related to the imported file itself. This makes it easy to see what other files you depend on as the file grows.
    // CORRECT
    local animalTemplate = import "animal.jsonnet.TEMPLATE";
    animalTemplate.newAnimal("Finnegan", 3)
    
    // AVOID
    (import "animal.jsonnet.TEMPLATE").newAnimal("Finnegan", 3)
    
  • Prefer using named parameters, one per line, when constructing classes or invoking methods, especially when they wrap beyond one line:
    // PREFERRED
    animalTemplate.newAnimal(
      name = "Finnegan",
      age = 3,
    )
    
    // ACCEPTABLE, since it does not wrap
    animalTemplate.newAnimal("Finnegan", 3)
    

File Structure

  • Jsonnet files which can be materialized with no further inputs should end with the ".jsonnet" suffix.
  • Jsonnet files which require parameters to be materialized, or which are libraries, should end with the ".jsonnet.TEMPLATE" suffix.
  • Additional suffixes can be appended to file names to indicate the type of resource a Jsonnet file describes, for clarity:
    instance.cfn.jsonnet.TEMPLATE <-- A Cloudformation template
    manager-deploy.jjb.jsonnet <-- A Jenkins Pipeline Job template
    core-vnet.tf.jsonnet.TEMPLATE <-- A Terraform template
    
  • Structuring libraries and imports is not a solved problem, so use your best judgement. In general if you have one common template and many individual instantiations, a workable pattern is:
    central-database/database.jsonnet.TEMPLATE <-- common template
    central-database/dev/database.jsonnet <-- dev instantiation which imports common template
    central-database/prod/database.jsonnet <-- prod instantiation which also imports common template
    
  • Jsonnet files used to describe Kubernetes resources (e.g. a deployment or service) should be put inside the package of the service or component itself, for better component isolation:
    myservice/deploy/myservice-dev.jsonnet <-- A Kubernetes deployment for 'dev' environment
    webapp/deploy/webapp-service.jsonnet.TEMPLATE <-- A Kubernetes service template
    

Documentation Style

  • Use // for comments.
  • Document parameters using a description, @param, and @returns similar to JavaDoc:
    // Multicellular, eukaryotic organism of the kingdom Animalia 
    // @param name Name by which this animal may be called.
    // @param age Number of years (rounded to nearest int) animal has been alive.
    local Animal(name, age) = { ... } 
    
  • Always put documentation at the top of each jsonnet file or template to indicate its purpose.

Best Practices

The Golden Pattern

In most cases, you can define a single class which outputs the entire template you're looking for, and have a single concrete jsonnet file per actual resource to create. For example, you might define a common database template, plus a single dev jsonnet and a single prod jsonnet which import from the common template.

In this case, a pattern like the following is most preferred:

// Common Kubernetes Deployment template for a specific web application (webapp.jsonnet.TEMPLATE)
local newWebApp(customerName, releaseTag) = {
  ... Kubernetes Deployment definition ...
  serviceName: customerName + "-webapp",
  dockerImage: "webapp:" + releaseTag,
  ... etc ...
};

{
  newWebApp:: newWebApp,
}


// A dev webapp deployment (in dev/ericl-webapp.jsonnet)
local webAppTemplate = import "../webapp.jsonnet.TEMPLATE";
webAppTemplate.newWebApp(
  customerName = "dev-test-1",
  releaseTag = "bleeding-edge",
)


// A production webapp deployment (in prod/foocorp-webapp.jsonnet)
local webAppTemplate = import "../webapp.jsonnet.TEMPLATE";
webAppTemplate.newWebApp(
  customerName = "foocorp",
  releaseTag = "2.42-rc1",
)

For a more complete example, see the examples and blog post.

Use this pattern as far as it will get you. Avoid implementing further abstractions and avoid default parameters for as long as possible. Keeping the number of abstractions low usually makes templates easier to understand. Avoiding default parameters means you have to explicitly choose a value in every situation (reducing correctness bugs).

On the flip side, new additions may require significant mechanical work due to repetition. Templates do not have to be DRY (don't repeat yourself) because they are fully materialized at commit time, so the correctness risks of repetition are reduced and readability is more important. Use your best judgement when deciding when to build out a new abstraction to avoid repetition.

Parameter Objects

In situations where sets of parameters are shared between multiple templates or objects, define parameter objects which extract out the common set.

local newAwsVpcParams(region, accountId, virtualNetworkId, encryptionKeyId) = {
  ...
};


local newUsersDatabase(awsVpcParams, instanceSize, highAvailability) = {
  ...
}

local newFrontendVirtualMachine(awsVpcParams, instanceName) = {
  ...
} 
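A call site would then build the shared parameter object once and thread it through each constructor (a sketch relying on the hypothetical definitions above; argument values are illustrative):

```jsonnet
local vpcParams = newAwsVpcParams(
  region = "us-west-2",
  accountId = "123456789012",
  virtualNetworkId = "vpc-example",
  encryptionKeyId = "key-example",
);

{
  usersDatabase: newUsersDatabase(vpcParams, instanceSize = "large", highAvailability = true),
  frontend: newFrontendVirtualMachine(vpcParams, instanceName = "frontend-1"),
}
```

This keeps the shared VPC settings defined in exactly one place, so a change to, say, the region is a one-line diff.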

Overriding and Inserting Fields

Jsonnet has excellent support for supplementing existing JSON structures with new fields, for example:

local protoCat = animalTemplate.newAnimal(name = "Finnegan", age = 3);
// ... many lines later ...
local myCat = protoCat + {
  catYears: 28,
}

This pattern is very convenient for implementing arbitrary transformers on data structures, but it should be used with caution because it makes it hard to reason about how code within the parent class is executing and what fields are provided on "myCat".

This becomes especially confusing if you have multiple layers of such additions, or helper functions that transform properties within the input. In the rare cases where parameters are determined at various points within the code, prefer helper functions that construct and mutate parameter lists, which are ultimately passed to an Animal object.

For example:

// Better, but still to be avoided
local protoCat = animalTemplate.newAnimalParams(name = "Finnegan", age = 3);
// ... many lines later ...
local myCatParams = protoCat + {
  catYears: 28,
};
local myCat = animalTemplate.newAnimal(myCatParams)

This pattern ensures that inputs and outputs are fully determined by the code within Animal, rather than split between Animal and callers.
