The Spark Notebook enables reproducible analysis with Scala, Apache Spark and the Big Data ecosystem.
Apache Spark is available out of the box, and is accessible through a predefined variable in every notebook.
Multiple Spark Context Support
One of the most useful features of the Spark Notebook is the isolation of running notebooks: each started notebook spawns a new JVM with its own SparkSession instance. This allows maximal flexibility to:
- manage dependencies without clashes
- access different clusters
- tune each notebook differently
- schedule notebooks externally (on the roadmap)
This flexibility around multiple SparkContexts is achieved through metadata-driven configuration: each notebook carries its own configuration in its metadata.
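As a sketch of the idea (the key names and helper below are illustrative assumptions, not the notebook's actual metadata schema): each notebook's metadata overrides a shared default Spark configuration, so two notebooks can run side by side with different settings.

```scala
// Illustrative sketch of metadata-driven configuration (names are assumptions):
// shared defaults, overridden per notebook by the notebook's own metadata.
val defaultConf = Map(
  "spark.master"          -> "local[*]",
  "spark.executor.memory" -> "512m"
)

// Per-notebook metadata wins on conflicts, so each JVM/SparkSession
// can be tuned independently.
def effectiveConf(notebookConf: Map[String, String]): Map[String, String] =
  defaultConf ++ notebookConf

val conf = effectiveConf(Map("spark.executor.memory" -> "2g"))
// conf("spark.executor.memory") == "2g", while the master stays the default
```

This is why one notebook can point at a remote cluster with generous executor memory while another runs locally, without either affecting the other.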
The Spark Notebook supports the Scala programming language exclusively, the Unpredicted Lingua Franca for Data Science, and extensively exploits the JVM ecosystem of libraries to drive a smooth evolution of data-driven software from exploration to production.
The Spark Notebook is available for *NIX and Windows systems in easy-to-use ZIP/TAR, Docker and DEB packages.
All components in the Spark Notebook are dynamic and reactive.
The Spark Notebook comes with dynamic charts, and most (if not all) components can listen for and react to events. This is very helpful in many cases, for example:
- data entering the system live at runtime
- live plots of events
- multiple interconnected visual components

Dynamic and reactive components mean that you don't have to write HTML, JavaScript, or server code just for basic use cases.
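To illustrate the reactive model (this is a generic sketch in plain Scala, not the Spark Notebook's actual widget API): a component exposes an observable value, and other components subscribe to it and react whenever new data arrives.

```scala
// Generic sketch of the observer pattern behind reactive components
// (hypothetical names; not the Spark Notebook widget API).
import scala.collection.mutable.ListBuffer

final class Observable[A](initial: A) {
  private var current: A = initial
  private val listeners = ListBuffer.empty[A => Unit]

  def value: A = current

  // Register a callback that fires on every new value.
  def subscribe(f: A => Unit): Unit = listeners += f

  // Push a new value and notify all subscribers.
  def update(next: A): Unit = {
    current = next
    listeners.foreach(_(next))
  }
}

// A chart could subscribe to data entering the system live at runtime:
val points = Observable(Seq.empty[Double])
val seen = ListBuffer.empty[Int]
points.subscribe(ps => seen += ps.length)

points.update(Seq(1.0, 2.0))
points.update(Seq(1.0, 2.0, 3.0))
// each update re-renders the subscriber with the latest data
```

Chaining several such observables is what lets multiple visual components stay interconnected without hand-written plumbing.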
Go to Quick Start for our 5-minute guide to get up and running with the Spark Notebook.
Come over to Gitter to discuss things, get some help, or start contributing!
- Explore the Spark Notebook
- HTML Widgets
- Visualization Widgets
- Notebook Browser
- Running on Clusters and Clouds
- Advanced Topics
- Using Releases
- Building from Sources
- Creating Specific Distributions
- Creating your own custom visualizations
- User Authentication
- Advanced: How to Develop/improve
Spark Notebook gives us a clean, useful way to mix code and prose when we demo and explain our tech to customers. The Spark ecosystem needed this.
It allows our analysts and developers (15+ users) to run ad-hoc queries, perform complex data analysis and visualisations, and prototype machine learning pipelines. In addition, we use it to power our BI dashboards.
|Name|Link|Description|
|---|---|---|
|Kensu|website|Lifting Data Science to the Enterprise level|
|Agile Lab|website|The only Italian Spark Certified systems integrator|
|CloudPhysics|website|Data-Driven Insights for Smarter IT|
|Aliyun|product|Spark runtime environment on ECS and management tool for Spark clusters running on Aliyun ECS|
|EMBL European Bioinformatics Institute|website|EMBL-EBI provides freely available data from life science experiments, performs basic research in computational biology, and offers an extensive user training programme, supporting researchers in academia and industry.|
|Metail|website|The best body shape and garment fit company in the world, creating and empowering everyone’s online body identity.|
|kt NexR|website|kt NexR has been one of the leading Big Data companies in Korea since 2007.|
|Skymind|website|At Skymind, we’re tackling some of the most advanced problems in data analysis and machine intelligence. We offer state-of-the-art, flexible, scalable deep learning for industry.|
|Amino|website|A new way to get the facts about your health care choices.|
|Vinted|website|Online marketplace and social network focused on young women’s lifestyle.|
|Vingle|website|Vingle is the community where you can meet someone like you.|
|47 Degrees|website|47 Degrees is a global consulting firm and certified Typesafe & Databricks Partner specializing in Scala & Spark.|
|Barclays|website|Barclays is a British multinational banking and financial services company headquartered in London.|
|Swisscom|website|Swisscom is the leading mobile service provider in Switzerland.|
|Knoldus|website|Knoldus is a global consulting firm and certified "Select" Lightbend & Databricks Partner specializing in the Scala & Spark ecosystem.|