Re: Spark Context time out on Yarn cluster

Re: Spark Context time out on Yarn cluster

moon
Administrator
Hi,

Thanks for sharing the problem.
Could you create an issue in JIRA so we can keep track of this problem?

Thanks,
moon

On Thu, Jun 18, 2015 at 4:45 PM Sambit Tripathy (RBEI/EDS1) <[hidden email]> wrote:
Hi,
 
Recently, the dynamic allocation feature of YARN was enabled on our cluster due to an increase in workload. At the same time, I upgraded Zeppelin to work with Spark 1.3.1.
 
Now the Spark context created in the notebook is short-lived. Every time I run a command, it throws an error saying the Spark context has been stopped.
 
Do I have to provide some configuration in zeppelin-env.sh or the interpreter settings to work with YARN dynamic allocation?
 
 
 
Regards,
Sambit.
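
(For reference, dynamic allocation on YARN generally involves the standard Spark properties sketched below. This is only a minimal, illustrative example: the paths are hypothetical, and whether these are best set via SPARK_SUBMIT_OPTIONS in conf/zeppelin-env.sh or in the Spark interpreter settings depends on the Zeppelin build in use.)

    # conf/zeppelin-env.sh -- illustrative sketch only; paths are hypothetical
    export MASTER=yarn-client
    export SPARK_HOME=/path/to/spark
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    # Standard Spark dynamic-allocation properties (Spark 1.3+); the external
    # shuffle service must also be running on the YARN NodeManagers.
    export SPARK_SUBMIT_OPTIONS="--conf spark.dynamicAllocation.enabled=true \
      --conf spark.shuffle.service.enabled=true \
      --conf spark.dynamicAllocation.minExecutors=2 \
      --conf spark.dynamicAllocation.maxExecutors=10 \
      --conf spark.dynamicAllocation.executorIdleTimeout=600"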
 