# Spark Monitor - An extension for Jupyter Lab

This project was originally written by krishnan-r as a Google Summer of Code project for Jupyter Notebook.

Features:

- A timeline which shows jobs, stages, and tasks.
- A graph showing the number of active tasks and executor cores vs. time.
- A notebook server extension that proxies the Spark UI and displays it in an iframe popup for more details.
- For a detailed list of features, see the use case notebooks.
- Support for multiple SparkSessions (default port is 4040).
## Quick Start

To do a quick test of the extension, run:

```bash
docker run -it -p 8888:8888 itsjafer/sparkmonitor
```

This docker image has pyspark and several other related packages installed alongside the sparkmonitor extension.

## Setting up the extension

```bash
pip install jupyterlab-sparkmonitor # install the extension

# set up ipython profile and add our kernel extension to it
ipython profile create --ipython-dir=.ipython
echo "c.InteractiveShellApp.extensions.append('sparkmonitor.kernelextension')" >> .ipython/profile_default/ipython_config.py

# run jupyter lab
IPYTHONDIR=.ipython jupyter lab --watch
```
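The `echo` command above only appends a single line to the profile's `ipython_config.py`, which is an ordinary Python file that IPython reads when a kernel starts. As a rough sketch (assuming a freshly generated profile; `get_config()` is made available by IPython inside config files), the relevant part of that file ends up looking like this:

```python
# .ipython/profile_default/ipython_config.py -- sketch of the relevant lines only
c = get_config()  # noqa: F821 -- IPython provides get_config() inside config files

# Load the sparkmonitor kernel extension into every kernel started under this profile.
c.InteractiveShellApp.extensions.append('sparkmonitor.kernelextension')
```

Because the extension is registered at the profile level, any kernel launched with `IPYTHONDIR=.ipython` picks it up without per-notebook setup.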
With the extension installed, a SparkConf object called `conf` will be usable from your notebooks. You can use it as follows:

```python
from pyspark import SparkContext

# start the spark context using the SparkConf the extension inserted
sc = SparkContext.getOrCreate(conf=conf)
```

```python
# Monitor should spawn under the cell with 4 jobs
sc.parallelize(range(0, 100)).count()
sc.parallelize(range(0, 100)).count()
sc.parallelize(range(0, 100)).count()
sc.parallelize(range(0, 100)).count()
```
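The feature list mentions support for multiple SparkSessions, with 4040 as the default UI port. As a hedged sketch (using only the stock PySpark `SparkConf.set` API and Spark's standard `spark.ui.port` setting, nothing specific to sparkmonitor), a notebook whose application needs its UI on a different port could adjust `conf` before starting the context; whether the monitor's UI proxy follows a non-default port is worth verifying against the extension's documentation.

```python
from pyspark import SparkContext

# Example only: move this application's Spark UI off the default port 4040,
# e.g. when another notebook's Spark application already occupies it.
# `spark.ui.port` is a standard Spark setting; 4041 is an arbitrary choice.
conf.set("spark.ui.port", "4041")

sc = SparkContext.getOrCreate(conf=conf)
```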