Multiple Spark versions / groups in Zeppelin

Multiple Spark versions / groups in Zeppelin

Fabian Böhnlein
Hi all,

we're looking to support multiple Spark versions in the same Zeppelin instance. Can this be done with multiple Spark interpreter groups, or in another way?

We already use multiple interpreters (via "Create" in the Interpreter UI) to configure different Spark environments, all using the group "spark".

How can I copy the spark group and adjust its SPARK_HOME? I could not find an interpreter/spark/interpreter-setting.json that might configure this.

Thanks,
Fabian

Re: Multiple Spark versions / groups in Zeppelin

Jeff Zhang

You can define SPARK_HOME in the interpreter setting page for each Spark version.

[attachment: image.png]
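For example, a hypothetical setup (the interpreter names and SPARK_HOME paths below are illustrative, not taken from this thread): create two interpreter settings from the Interpreter page, both using the "spark" group, and point each at a different installation via the SPARK_HOME property:

    spark_2_1  ->  SPARK_HOME = /opt/spark-2.1.1
    spark_2_2  ->  SPARK_HOME = /opt/spark-2.2.0

A notebook paragraph can then select the version by interpreter name, e.g. start the paragraph with %spark_2_1 or %spark_2_2 (or %spark_2_1.pyspark for PySpark).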



Re: Multiple Spark versions / groups in Zeppelin

Fabian Böhnlein
Indeed, that's it, thanks!
