The famous driver not found issue

I was recently asked to export a subset of data from our big data platform to MySQL. After some careful consideration, I decided to implement the logic in Spark through the DataFrameWriter, i.e. df.write.jdbc(). Everything went well until the integration test, where I had to deploy my Spark app for a real run:

```
java.sql.SQLException: No suitable driver found for jdbc:mysql://bla:bla/bla
```

Background 1: I did follow the Spark JDBC guide

I had actually done a dry run by following the Databricks guide "Connecting to SQL Databases using JDBC" in my own Zeppelin notebook, and everything went well there. (A minimal sketch of this kind of write is included at the end of this post.)

Background 2: I did follow the MySQL JDBC guide

I followed the coding sample from the official MySQL Connector/J documentation, "6.1 Connecting to MySQL Using the JDBC DriverManager Interface":

```java
// The newInstance() call is a work around for some
// broken Java implementations
Class.forName("com.mysql.jdbc.Driver").newInstance();
```

Background 3: I did attach the jar dependency

I did a
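
For reference, here is a minimal sketch (in Scala) of the kind of write described in Background 1. The URL, credentials, source table, and target table are placeholders rather than the actual values from my job. Note that Spark's JDBC data source accepts a "driver" connection property naming the driver class explicitly, which avoids relying on DriverManager auto-discovery:

```scala
import java.util.Properties
import org.apache.spark.sql.{SaveMode, SparkSession}

object ExportToMysql {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("export-to-mysql").getOrCreate()

    // Placeholder connection details; the real host, database, user and
    // password are not the ones shown here.
    val jdbcUrl = "jdbc:mysql://host:3306/db"
    val props = new Properties()
    props.setProperty("user", "user")
    props.setProperty("password", "password")
    // Name the driver class explicitly instead of depending on
    // DriverManager discovering it from the URL.
    props.setProperty("driver", "com.mysql.jdbc.Driver")

    // Whatever DataFrame holds the extracted subset of data.
    val df = spark.table("source_table")

    // Append the rows into the MySQL target table over JDBC.
    df.write.mode(SaveMode.Append).jdbc(jdbcUrl, "target_table", props)

    spark.stop()
  }
}
```

In a notebook environment the connector jar is often already registered with the interpreter, which may be part of why a dry run there can succeed while a packaged run does not.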
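
As a rough illustration of Background 3, attaching the connector jar to a Spark application is commonly done at submit time. The class name, jar path, and version below are made up for the example and are not necessarily what my setup used:

```bash
# Ship a local copy of the connector jar to the driver and executors
spark-submit \
  --class com.example.ExportToMysql \
  --jars /path/to/mysql-connector-java-5.1.46.jar \
  my-spark-app.jar

# Or let spark-submit resolve it as a Maven dependency
spark-submit \
  --class com.example.ExportToMysql \
  --packages mysql:mysql-connector-java:5.1.46 \
  my-spark-app.jar
```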