Databricks Spark Knowledge Base

Missing Dependencies in Jar Files

By default, Maven does not include dependency jars when it builds a target. When a Spark job runs, if the worker machines do not have the dependency jars on their classpath, the job fails with an error that a class cannot be found (a `ClassNotFoundException`).

The easiest way to work around this is to build a shaded (uber) jar that packages the dependencies alongside your own classes.
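With a shaded build configured, the typical workflow is to package the jar and submit it to the cluster. A minimal sketch follows; the main class, master URL, and jar name are placeholders for your own project:

```shell
# Build the project; with the shade plugin configured, `package`
# produces an uber jar containing the project's dependencies.
mvn clean package

# Submit the uber jar. The workers no longer need the bundled
# dependencies installed locally. (Names below are illustrative.)
spark-submit \
  --class com.example.LogAnalyzer \
  --master spark://master-host:7077 \
  target/my-app-1.0.jar
```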

You can exclude certain dependencies from the uber jar by marking them as <scope>provided</scope>. Spark dependencies should be marked as provided, since they are already present on the Spark cluster. You can likewise exclude any other jars that are already installed on your worker machines.

Here is an example Maven pom.xml file that creates an uber jar with all the code in that project and includes the commons-cli dependency, but not any of the Spark libraries. (The group IDs, artifact names, and version numbers below are illustrative; adjust them to match your project and your cluster's Spark version.)

    <project xmlns="http://maven.apache.org/POM/4.0.0"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.databricks.apps.logs</groupId>
        <artifactId>spark-logs-analyzer</artifactId>
        <version>1.0</version>
        <name>Databricks Spark Logs Analyzer</name>

        <repositories>
            <repository>
                <id>Akka repository</id>
                <url>http://repo.akka.io/releases</url>
            </repository>
        </repositories>

        <dependencies>
            <dependency> <!-- Spark -->
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.10</artifactId>
                <version>1.1.0</version>
                <!-- Already on the cluster, so keep it out of the uber jar -->
                <scope>provided</scope>
            </dependency>
            <dependency> <!-- Spark SQL -->
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-sql_2.10</artifactId>
                <version>1.1.0</version>
                <scope>provided</scope>
            </dependency>
            <dependency> <!-- Spark Streaming -->
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming_2.10</artifactId>
                <version>1.1.0</version>
                <scope>provided</scope>
            </dependency>
            <dependency> <!-- Command Line Parsing -->
                <groupId>commons-cli</groupId>
                <artifactId>commons-cli</artifactId>
                <version>1.2</version>
                <!-- Default (compile) scope: bundled into the uber jar -->
            </dependency>
        </dependencies>

        <build>
            <plugins>
                <!-- Builds the uber jar during `mvn package` -->
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-shade-plugin</artifactId>
                    <version>2.3</version>
                    <executions>
                        <execution>
                            <phase>package</phase>
                            <goals>
                                <goal>shade</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
    </project>