Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Disable Databricks-generated error messages

alsetr
New Contributor

Since Databricks Runtime 12.2, Databricks has started wrapping Spark exceptions in its own exceptions.

https://learn.microsoft.com/en-us/azure/databricks/error-messages/

While for some users this might be handy, for our team it is not convenient, as we cannot see the original exception, check what is going on in the source code, etc. When I paste these stack traces into IntelliJ, I cannot find the corresponding lines of code.

For example, Databricks reports QueryExecutionErrors.scala:3372, but this file in the Spark source code has only 2,700 lines, and EXECUTOR_BROADCAST_JOIN_OOM cannot be found in the Spark source code at all.
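To illustrate the kind of inspection we would like to do, here is a minimal sketch for a Databricks Scala notebook cell: walk the caught exception's cause chain and print each level, on the assumption (not guaranteed) that the Databricks wrapper keeps the original Spark exception as a cause. The triggering query is just a placeholder.

```scala
// Minimal sketch (Databricks Scala notebook, `spark` is the predefined SparkSession):
// print every level of the cause chain of a caught exception, hoping the
// original Spark exception is preserved as a cause somewhere in the chain.
def printCauseChain(t: Throwable): Unit = {
  var cur: Throwable = t
  var depth = 0
  while (cur != null) {
    println(s"[$depth] ${cur.getClass.getName}: ${cur.getMessage}")
    // show only the first few frames of each level to keep the output readable
    cur.getStackTrace.take(5).foreach(frame => println(s"      at $frame"))
    val next = cur.getCause
    cur = if (next eq cur) null else next
    depth += 1
  }
}

try {
  // placeholder: replace with whatever operation actually fails,
  // e.g. the broadcast join that triggers EXECUTOR_BROADCAST_JOIN_OOM
  spark.sql("SELECT 1").collect()
} catch {
  case e: Throwable =>
    printCauseChain(e)
    throw e
}
```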

 

Could you please advise how to disable the Databricks error wrapping and get the raw Spark error?

1 REPLY

-werners-
Esteemed Contributor III

Databricks does not use vanilla Spark.
They have added optimizations and features such as AQE, Unity Catalog, etc.
So looking for the error in the Spark source code will not always work (though in many cases it will).
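If the wrapped exception implements org.apache.spark.SparkThrowable (the open-source error-class interface from Spark 3.3+), you can at least read the error class and SQLSTATE directly from it, even when the reported file and line numbers don't match the open-source sources. A rough sketch, assuming the Databricks-specific errors expose that interface (not guaranteed for every error):

```scala
// Rough sketch: if the exception carries an error class via SparkThrowable,
// print it (e.g. EXECUTOR_BROADCAST_JOIN_OOM) together with the SQLSTATE;
// otherwise fall back to the plain exception class name.
import org.apache.spark.SparkThrowable

def describeError(t: Throwable): Unit = t match {
  case st: SparkThrowable =>
    println(s"error class: ${st.getErrorClass}, sqlState: ${st.getSqlState}")
  case other =>
    println(s"plain exception: ${other.getClass.getName}")
}
```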
