unread block data

Kang Minwoo
Hello, Users

I don't think this is a Zeppelin issue, but I would like to ask for your help anyway.

I am using Zeppelin 0.7.1, Spark 2.0.2, and Hadoop 2.7.1.

When I run a Zeppelin notebook that includes Spark code, I get an error.
The error log is below.

---

Caused by: java.lang.IllegalStateException: unread block data
  at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2431)
  at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1383)
  at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
  at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
  at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
  at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
  at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
  at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
  at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:298)
  ... 3 more

---

Has anyone else encountered the same problem?

I have already checked the JDK and Spark versions on both the master and the slaves...
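
For reference, these versions can also be checked from inside Spark itself. The sketch below uses only the standard SparkContext API (run it from the Zeppelin Spark interpreter or spark-shell); a driver/executor mismatch in these values is a commonly reported cause of the "unread block data" error:

---

// Spark version seen by the driver
println(sc.version)

// JVM version seen by each executor; compare with the driver's JVM
sc.parallelize(1 to 100, 10)
  .map(_ => System.getProperty("java.version"))
  .distinct()
  .collect()
  .foreach(println)

---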

Best regards,
Minwoo Kang

Re: unread block data

Jeff Zhang

Could you try running your code in the Spark shell first?
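
For example, something like this (a sketch only; spark://<master-host>:7077 is a placeholder for whatever master URL the Zeppelin interpreter is configured with). The map forces tasks to be serialized to the executors, which is where the stack trace above is thrown:

---

# Start the shell against the same cluster that Zeppelin uses
$SPARK_HOME/bin/spark-shell --master spark://<master-host>:7077

scala> sc.parallelize(1 to 100).map(_ * 2).count()

---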


On Monday, September 25, 2017 at 7:04 PM, Kang Minwoo <[hidden email]> wrote:
> (original message quoted above)

Re: unread block data

Kang Minwoo
Thanks! I'll try.

Best regards,
Minwoo Kang

________________________________________
From: Jeff Zhang <[hidden email]>
Sent: Monday, September 25, 2017, 8:17:29 PM
To: [hidden email]
Subject: Re: unread block data

Could you try running your code in the Spark shell first?


On Monday, September 25, 2017 at 7:04 PM, Kang Minwoo <[hidden email]> wrote:
> (original message quoted above)