spark-pager

265, You got Mail

A tongue-in-cheek pager of sorts that notifies users via e-mail about the status of Spark jobs during and after execution. https://pypi.org/project/spark-pager/

Prerequisites

  • An E-mail Address and Password

Installation

Install spark-pager with pip

  pip install spark-pager

Usage

  • Import and instantiate the package with your e-mail credentials and Spark context
from spark_pager import Pager
from pyspark import SparkContext

sc = SparkContext()

pager = Pager(spark_context=sc, 
              username = <username>, 
              password = <password>,
              host = <host>,
              port = <port>)

## monitor the spark-context when spark-jobs are initiated 
## and notify users on its status.

pager.listener()

💥 Note: the default host is smtp.gmail.com and the default port is 587; feel free to change these to the host and port of your choosing.
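For example, if your mailbox is not on Gmail, you can point the pager at your provider's SMTP server. The address, host, and port below are illustrative Outlook/Office 365 values, not spark-pager defaults:

## A minimal sketch assuming an Outlook/Office 365 mailbox;
## substitute your own provider's SMTP settings.
pager = Pager(spark_context=sc,
              username="user@outlook.com",
              password="password",
              host="smtp.office365.com",
              port=587)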

  • To stop the pager, run this:
# To stop the pager
pager.stop()

# To stop the spark-context
sc.stop()
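
An exception in your job code would skip the two calls above, so one option is to wrap the job in a try/finally block. This is only a sketch using the calls shown above, and run_spark_job() is a hypothetical placeholder for your own job code:

# Sketch: guarantee clean-up even if the job raises
pager.listener()
try:
    run_spark_job()   # hypothetical placeholder for your own Spark job code
finally:
    pager.stop()      # stop the pager
    sc.stop()         # stop the spark-context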

Example Code

## Import Packages
from spark_pager import Pager
from pyspark import SparkContext
from pyspark.sql import SparkSession

## Set Spark Configuration
sc = SparkContext()
spark = SparkSession.builder \
                    .enableHiveSupport() \
                    .getOrCreate() 

spark.sparkContext.setLogLevel("ERROR")
spark.conf.set("spark.sql.repl.eagerEval.enabled", True)

## Initialize Pager 
pager = Pager(spark_context=sc, 
              username="user@gmail.com", 
              password="password")

## Set Listener
pager.listener()

df = spark.createDataFrame([("john-doe", 34), 
                            ("jane-doe", 22)], 
                            ["name", "age"])

## An action such as show() triggers a Spark job for the pager to report on
df.show()

# Stop Pager
pager.stop()

# Stop Spark-Context
sc.stop()          

💥 Note: the job status can be Running, Failed, or Succeeded.

Now, if everything goes well, you should receive a mail notification that looks something like this:

(screenshot of a sample e-mail notification)
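
Rather than hard-coding credentials as in the example above, you could read them from environment variables with Python's standard library. The variable names SPARK_PAGER_USER and SPARK_PAGER_PASS below are only examples, not names the library looks for:

import os

## Sketch: pull credentials from the environment instead of the source file
pager = Pager(spark_context=sc,
              username=os.environ["SPARK_PAGER_USER"],
              password=os.environ["SPARK_PAGER_PASS"])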

Appendix

This project is open to contributions and additions from the community.

Authors
