
[Bug] NullPointer encountered when using paimon-spark #5117

Open
1 of 2 tasks
liyubin117 opened this issue Feb 20, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@liyubin117
Contributor

liyubin117 commented Feb 20, 2025

Search before asking

  • I searched in the issues and found nothing similar.

Paimon version

paimon-spark-3.3-0.9

Compute Engine

Spark 3.3.2

Exception stack trace
java.lang.NullPointerException
        at org.apache.paimon.spark.SparkInternalRow.fromPaimon(SparkInternalRow.java:274)
        at org.apache.paimon.spark.SparkInternalRow.getUTF8String(SparkInternalRow.java:157)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:760)
        at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:168)
        at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
        at org.apache.spark.scheduler.Task.run(Task.scala:136)
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
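The trace shows the NPE originating in SparkInternalRow.fromPaimon while getUTF8String converts a field value, which suggests a null value reached a conversion step that assumes non-null input. The sketch below is a hypothetical, self-contained Java illustration of that general failure pattern only; it is not Paimon's actual code, and the class and method names are invented for the example.

```java
// Hypothetical sketch of the failure pattern implied by the stack trace:
// a per-field converter dereferences its input, so a null field value
// that is not filtered out by an isNullAt-style check triggers an NPE.
public class NullConversionSketch {

    // Mimics a fromPaimon-style conversion that assumes a non-null value.
    static String toUtf8(Object value) {
        return value.toString(); // throws NullPointerException if value is null
    }

    public static void main(String[] args) {
        Object[] row = {"ok", null};

        System.out.println(toUtf8(row[0])); // non-null field converts fine

        try {
            // A null field reaching the converter without a null check
            // reproduces the kind of NPE seen in the stack trace.
            toUtf8(row[1]);
        } catch (NullPointerException e) {
            System.out.println("NPE: null field value reached the converter");
        }
    }
}
```

If this is the cause, the usual fix is to check for null (or consult the row's null bitmap) before invoking the conversion.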

Minimal reproduce step

todo

What doesn't meet your expectations?

The query fails with a NullPointerException instead of completing.

Anything else?

No response

Are you willing to submit a PR?

  • I'm willing to submit a PR!
@liyubin117 liyubin117 added the bug Something isn't working label Feb 20, 2025
@yangjf2019
Contributor

Hi @liyubin117, could you provide more information about this problem, such as code, logs, and environment details, so that we can reproduce it?
