Comments (7)
Hi @larssk, a version that supports Spark 3.3 was only released yesterday (as a fix for this ticket). Please make sure you use the version of sparksql-scalapb that matches your Spark version, according to the list here: https://scalapb.github.io/docs/sparksql/#setting-up-your-project
from scalapb.
Thanks for the update! Looks like this triggers a new issue, a Task not serializable exception:
Caused by: java.io.NotSerializableException: scalapb.descriptors.FieldDescriptor
Similar to issue #278, but I couldn't figure out the fix for it.
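For context, the usual workaround for issue-278-style captures (short of upgrading) is to keep the non-serializable object out of the serialized closure by holding it behind a `@transient lazy val`, so it is rebuilt locally on each executor instead of being shipped. A minimal sketch, with a hypothetical `Heavy` class standing in for the descriptor (not ScalaPB's actual API):

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

// Hypothetical non-serializable class, standing in for
// scalapb.descriptors.FieldDescriptor.
class Heavy(val id: Int)

// Holder is Serializable, but its Heavy instance is @transient lazy:
// it is skipped during serialization and rebuilt on first access
// after deserialization (e.g. on a Spark executor).
class Holder(val id: Int) extends Serializable {
  @transient lazy val heavy: Heavy = new Heavy(id)
}

object TransientDemo {
  // Serialize and deserialize an object with plain Java serialization,
  // the same mechanism Spark uses for task closures.
  def roundTrip[T <: AnyRef](obj: T): T = {
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(obj)
    out.close()
    new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
      .readObject().asInstanceOf[T]
  }

  def main(args: Array[String]): Unit = {
    val copy = roundTrip(new Holder(7))
    println(copy.heavy.id) // 7: Heavy is rebuilt lazily after deserialization
  }
}
```

Serializing a `Holder` directly (without `@transient`) would fail exactly like the stack trace below, because `Heavy` is reachable from the serialized graph.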
Tested with

project/plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.1.0")
addSbtPlugin("com.thesamet" % "sbt-protoc" % "1.0.6")
libraryDependencies += "com.thesamet.scalapb" %% "compilerplugin" % "0.11.13"

build.sbt:
scalaVersion := "2.12.13"
val sparkVersion = "3.3.0"
libraryDependencies += "com.thesamet.scalapb" %% "sparksql33-scalapb0_11" % "1.0.2"
Serialization stack:
- object not serializable (class: scalapb.descriptors.FieldDescriptor, value: )
- element of array (index: 4)
- array (class [Ljava.lang.Object;, size 10)
- element of array (index: 1)
- array (class [Ljava.lang.Object;, size 3)
- field (class: java.lang.invoke.SerializedLambda, name: capturedArgs, type: class [Ljava.lang.Object;)
- object (class java.lang.invoke.SerializedLambda, SerializedLambda[capturingClass=class org.apache.spark.sql.execution.WholeStageCodegenExec, functionalInterfaceMethod=scala/Function2.apply:(Ljava/lang/Object;Ljava/lang/Object;)Ljava/lang/Object;, implementation=invokeStatic org/apache/spark/sql/execution/WholeStageCodegenExec.$anonfun$doExecute$4$adapted:(Lorg/apache/spark/sql/catalyst/expressions/codegen/CodeAndComment;[Ljava/lang/Object;Lorg/apache/spark/sql/execution/metric/SQLMetric;Ljava/lang/Object;Lscala/collection/Iterator;)Lscala/collection/Iterator;, instantiatedMethodType=(Ljava/lang/Object;Lscala/collection/Iterator;)Lscala/collection/Iterator;, numCaptured=3])
- writeReplace data (class: java.lang.invoke.SerializedLambda)
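The `capturedArgs` entries in the stack trace above are the values a closure has captured: Spark Java-serializes each task closure before shipping it to executors, and serialization fails as soon as any reachable captured object is not `Serializable`. A self-contained sketch of that failure mode, using a hypothetical `Descriptor` class standing in for `scalapb.descriptors.FieldDescriptor`:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical stand-in for scalapb.descriptors.FieldDescriptor:
// a plain class that does not implement java.io.Serializable.
class Descriptor(val name: String)

object SerializableCheck {
  // Java-serialize a value, as Spark does when shipping a task closure;
  // returns false if any reachable object is not Serializable.
  def isSerializable(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    val desc = new Descriptor("name")
    // A lambda capturing `desc` serializes (via SerializedLambda, as in
    // the stack trace) only if everything in capturedArgs does.
    val fn: String => String = s => s + desc.name
    println(isSerializable("plain string")) // true
    println(isSerializable(fn))             // false: captured Descriptor
  }
}
```

Running this reproduces the same shape of failure: the lambda itself is serializable, but its captured `Descriptor` is not, so `writeObject` throws `NotSerializableException`.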
Please provide a minimal example to reproduce the issue. Ideally this can be done in a fork of https://github.com/thesamet/sparksql-scalapb-test or as a failing test under the sparksql-scalapb/src/test/ directory in this repo.
Producing and consuming within this project works as expected:
thesamet/sparksql-scalapb-test@master...larssk:sparksql-scalapb-test-spark33:master
But the problem persists when reading from an existing topic. Checking..
Works on Spark 3.3.2. Fails on Spark 3.3.0 with a java.io.NotSerializableException.
Command to execute:
spark-3.3.0-bin-hadoop3/bin/spark-submit --jars . --class myexample.RunDemo target/scala-2.12/sparksql-scalapb-test-spark33-assembly-1.0.0.jar
I've updated the description in order to reproduce the issue:
https://github.com/larssk/sparksql-scalapb-test-spark33/blob/master/README.md
I get the same error locally as on dataproc (which is unfortunately limited to Spark 3.3.0).
Hi @larssk, thanks for creating an example - I was able to reproduce. As far as I can tell this is a Spark bug that has since been fixed. We'd have to wait for Dataproc to support the latest version.