how to configure livy.sparkr to run job across cluster nodes

how to configure livy.sparkr to run job across cluster nodes

Minquan Xu

Hi All,

 

My Livy SparkR (%livy.sparkr) jobs appear to run in local mode, i.e., they do not show up in the YARN ResourceManager UI, while %spark.pyspark and Jupyter Toree SparkR jobs run across the cluster nodes.

 

Platform: Hortonworks HDP 2.6, Zeppelin 0.7.0, OS: Linux (kernel 2.6).

 

In the Zeppelin interpreter settings, livy.spark.master is set to yarn-client.
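(Besides the ResourceManager UI, a quick way to check is the YARN CLI from any cluster node; a minimal sketch:)

  # List applications currently running on YARN. A Livy SparkR session
  # submitted in yarn-client or yarn-cluster mode should show up here;
  # a local-mode session will not.
  yarn application -list -appStates RUNNING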

Any suggestion is appreciated.

 

 

[Screenshot: %livy.sparkr interpreter setting, livy.spark.master = yarn-client]

[Screenshot: %spark.pyspark job shown in the YARN ResourceManager UI]

[Screenshot: Jupyter Toree SparkR job]

 

Thanks,

 

Minquan

 




Re: how to configure livy.sparkr to run job across cluster nodes

Jianfeng (Jeff) Zhang
In reply to this post by Minquan Xu

Please check the configuration of your Livy server.

livy.spark.master should be set to yarn-cluster there.

Setting it in the Zeppelin interpreter settings does not take effect.
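For example, the livy-conf on the Livy server host needs something like the line below (a minimal sketch; the exact file location depends on your install, and the Livy server must be restarted after the change):

  # livy.conf on the Livy server host
  livy.spark.master = yarn-cluster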

Best Regards,
Jeff Zhang





RE: how to configure livy.sparkr to run job across cluster nodes

Minquan Xu

 

Hi Jeff,

 

Thank you for your reply.

 

Under Spark -> Advanced livy-conf, livy.spark.master was already set to yarn-cluster when I executed my notebook.
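For what it's worth, the Livy REST API also seems to be a way to check what the session actually got (a sketch; livy-host is a placeholder and 8998 is Livy's default port):

  # List active Livy sessions. A session that really runs on YARN reports an
  # appId such as application_<timestamp>_<nnnn>; a local-mode session has no appId.
  curl -s http://livy-host:8998/sessions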

 

Thanks,

Minquan




Re: how to configure livy.sparkr to run job across cluster nodes

Jeff Zhang
In reply to this post by Jianfeng (Jeff) Zhang

Then please check the Livy server log to see whether there is any log about a YARN application submission; it should run in yarn-cluster mode instead of local mode.
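Something like the following on the Livy server host can help (the log path is only an assumption for an HDP install; adjust it to wherever Livy writes its logs):

  # Look for YARN submission messages in the Livy server log. Lines from
  # Spark's YARN client such as "Submitting application ... to ResourceManager"
  # indicate a real YARN submission; their absence points to local mode.
  grep -iE "submitting application|application_" /var/log/livy*/*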
