Livy interpreter - external libraries and changing queue name at runtime

5 messages

Anandha L Ranganathan
We are using the Livy interpreter from Zeppelin to connect to Spark.

We want to give users an option to download external libraries. By default we have added some basic libraries in the interpreter setting.

In the Spark interpreter, users can download the external libraries they want with this command:
%spark.dep
z.reset()  // clear any previously added repositories and artifacts
z.addRepo("Spark Packages Repo").url("http://dl.bintray.com/spark-packages/maven")
z.load("com.databricks:spark-csv_2.11:1.2.0")


How can we import external libraries using Livy?


Another question: is there a way to change the YARN queue name at runtime? Some users want to use a different queue rather than the default queue assigned in the interpreter. If that feature is not available, what is the best approach to implement it?

Thanks
Anand

 

Re: Livy interpreter - external libraries and changing queue name at runtime

Jeff Zhang

You can do it via the Livy interpreter setting.

Here are two configuration properties that let you add external jars and external packages:

livy.spark.jars
livy.spark.jars.packages

And this is the property for the queue name:

livy.spark.yarn.queue
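
For example, the livy section of the Zeppelin interpreter settings page might contain entries like these (the jar path, package, and queue name below are only illustrative, not values from this thread):

```
livy.spark.jars             hdfs:///libs/my-udfs.jar
livy.spark.jars.packages    com.databricks:spark-csv_2.11:1.2.0
livy.spark.yarn.queue       analytics
```

The Livy interpreter typically needs a restart for changed settings to take effect on new sessions.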

Anandha L Ranganathan <[hidden email]> wrote on Wed, Nov 22, 2017 at 9:13 AM:

Re: Livy interpreter - external libraries and changing queue name at runtime

Anandha L Ranganathan
Thanks Jeff.

Is that something I can use in the notebook or in the interpreter setting? If it is in the notebook, can you provide the syntax? I tried it in the notebook and it throws an error.




On Tue, Nov 21, 2017 at 5:28 PM, Jeff Zhang <[hidden email]> wrote:



Re: Livy interpreter - external libraries and changing queue name at runtime

Jeff Zhang

Livy doesn't support adding dependencies in a note the way %spark.dep does; you have to do it in the interpreter setting.


Anandha L Ranganathan <[hidden email]> wrote on Thu, Nov 23, 2017 at 4:37 AM:


Re: Livy interpreter - external libraries and changing queue name at runtime

Anandha L Ranganathan
Thanks Jeff.

We will add dependencies through livy.spark.jars.packages.
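
For reference, a sketch of the value format, assuming the usual spark.jars.packages convention of comma-separated Maven coordinates (groupId:artifactId:version); the artifacts below are just examples, not ones mentioned in this thread:

```
livy.spark.jars.packages    com.databricks:spark-csv_2.11:1.2.0,com.databricks:spark-avro_2.11:3.2.0
```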


Thanks
Anand



On Wed, Nov 22, 2017 at 4:29 PM, Jeff Zhang <[hidden email]> wrote:
