Since Databricks Runtime 12.2, Databricks has started wrapping Spark exceptions in its own exception classes:
https://learn.microsoft.com/en-us/azure/databricks/error-messages/
While this may be handy for some users, it is not convenient for our team: we cannot see the original exception, check what is going on in the source code, etc. When I paste these stack traces into IntelliJ, I cannot find the referenced lines of code.
For example, Databricks reports QueryExecutionErrors.scala:3372, but this file in the Spark source code has only about 2700 lines, and EXECUTOR_BROADCAST_JOIN_OOM cannot be found in the Spark source code at all.
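For context, this is roughly what we do today to try to dig out the underlying Spark error by walking the exception's cause chain (a minimal sketch from a notebook; the failing query is a placeholder, and in practice the wrapped exception often carries no useful cause, which is exactly the problem):

```scala
import scala.annotation.tailrec

// Walk the cause chain to its end, guarding against self-referencing causes.
@tailrec
def rootCause(t: Throwable): Throwable =
  if (t.getCause == null || t.getCause == t) t else rootCause(t.getCause)

try {
  // placeholder for any query that fails on our cluster
  spark.sql("SELECT ...").collect()
} catch {
  case e: Throwable =>
    println(s"Wrapped: ${e.getClass.getName}: ${e.getMessage}")
    val root = rootCause(e)
    println(s"Root:    ${root.getClass.getName}: ${root.getMessage}")
    // Print the top of the root cause's stack trace to compare against Spark sources
    root.getStackTrace.take(10).foreach(println)
}
```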
Could you please advise how to disable the Databricks error wrapping and get the raw Spark error?