apache spark - What is the Pythonic way to share SparkContext/Session between classes?


Say I have two classes, each of which uses Spark. I initialize a SparkSession in the __init__ method of one of the classes. Now I want to write a new class that also makes Spark calls. What's the Pythonic way to do this?

You can pass the SparkContext (or SparkSession) into the __init__ method, like:

    class mysparkcallingclass:
        def __init__(self, sc):
            self.sc = sc
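For completeness, here is a minimal sketch of that pattern with a SparkSession shared between two classes. The class names (DataLoader, DataAnalyzer) and methods are illustrative, not from the original question; the point is simply that the session is created once at the entry point and injected into every class that needs it:

    from pyspark.sql import SparkSession

    class DataLoader:
        """Illustrative class that reads data with a shared SparkSession."""
        def __init__(self, spark):
            self.spark = spark

        def load(self, path):
            return self.spark.read.csv(path, header=True)

    class DataAnalyzer:
        """Illustrative class that runs queries with the same session."""
        def __init__(self, spark):
            self.spark = spark

        def count_rows(self, df):
            return df.count()

    # Create the session once at the application entry point...
    spark = SparkSession.builder.appName("shared-session-example").getOrCreate()

    # ...and pass the same instance to every class that needs it.
    loader = DataLoader(spark)
    analyzer = DataAnalyzer(spark)

Note that SparkSession.builder.getOrCreate() returns the already-existing session if one is active, so even code that calls it independently will end up sharing the same session; explicitly passing the instance just makes the dependency visible.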
