Custom spark for zeppelin and interpreter-list

Custom spark for zeppelin and interpreter-list

Serega Sheypak
Hi, I have a few concerns I can't resolve right now. I could certainly go through the source code and find the answers, but I would like to understand the ideas behind the design.
I'm building Zeppelin from source at 0.8.0-SNAPSHOT, against a custom Cloudera CDH Spark 2.0-something.
I can't tell whether the built and started Zeppelin uses my custom zeppelin-spark interpreter or not.

interpreter-list contains Maven coordinates. What are they for? Will Zeppelin try to fetch interpreters from a remote Maven repository?

interpreter-list doesn't include Spark. How does Zeppelin figure out which Spark interpreter to use, and how to use it?
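[For reference, a from-source build against a vendor Spark typically looks something like the sketch below. The profile names and version strings are assumptions based on the Zeppelin 0.8-era build instructions, not taken from this thread; substitute your actual CDH versions.]

```shell
# Hypothetical build invocation: -Pvendor-repo adds the vendor (Cloudera) Maven
# repository, and spark.version / hadoop.version pin the CDH artifact versions.
mvn clean package -DskipTests \
  -Pspark-2.0 -Dspark.version=2.0.0.cloudera1 \
  -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.10.0 \
  -Pvendor-repo \
  -Pbuild-distr
```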
Re: Custom spark for zeppelin and interpreter-list

moon
Administrator
Hi,

'conf/interpreter-list' is just a catalogue file that 'bin/install-interpreter.sh' uses.
The information is not used anywhere else.

'bin/install-interpreter.sh' uses 'conf/interpreter-list' to 1) print the list of interpreters that the Zeppelin community provides and 2) convert a short name to group:artifact:version, so the user doesn't have to provide the -t option.
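[To make the short-name conversion concrete, here is a minimal sketch. The two-column file layout and the entries below are an approximation of what 'conf/interpreter-list' contains, not copied from a real release; the real script does more than this one lookup.]

```shell
# Approximate shape of conf/interpreter-list:
#   short-name  group:artifact:version  description
cat > /tmp/interpreter-list <<'EOF'
md    org.apache.zeppelin:zeppelin-markdown:0.8.0    Markdown support
jdbc  org.apache.zeppelin:zeppelin-jdbc:0.8.0       JDBC support
EOF

# The lookup install-interpreter.sh effectively performs: map a short name
# to the Maven coordinate the user would otherwise pass via -t.
coord=$(awk -v name=md '$1 == name { print $2 }' /tmp/interpreter-list)
echo "$coord"
```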

The Spark interpreter is included in both the zeppelin-bin-all and zeppelin-bin-netinst packages; that's why 'conf/interpreter-list' doesn't list it.

So, if you're trying to install your custom interpreter with 'bin/install-interpreter.sh', you can still do so without modifying 'conf/interpreter-list' by providing the '-t' option. If you're installing your custom interpreter without 'bin/install-interpreter.sh', then 'conf/interpreter-list' is not involved at all.
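[The two installation paths described above might look like this on the command line. The '-t' spelling comes from this thread; the '--name' flag and the custom coordinate are assumptions for illustration.]

```shell
# Install a community interpreter by short name; install-interpreter.sh
# resolves "md" to its Maven coordinate via conf/interpreter-list:
./bin/install-interpreter.sh --name md

# Install a custom interpreter by passing the coordinate directly with -t,
# bypassing conf/interpreter-list entirely (coordinate is a placeholder):
./bin/install-interpreter.sh --name myspark -t com.example:my-spark-interpreter:1.0.0
```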

Hope this helps.

Best,
moon

Re: Custom spark for zeppelin and interpreter-list

Serega Sheypak
I'm building Zeppelin from source, so I suppose the "default Spark interpreter", built with my custom Spark dependencies, is included in the resulting Zeppelin distribution. That solves my problem! Thanks for the explanation.
