There are two common ways to connect to a MySQL database from Spark:
- Using the `spark.read.jdbc` method:

```scala
import java.util.Properties

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("MySQLExample")
  .getOrCreate()

// JDBC connection settings
val url = "jdbc:mysql://hostname:port/databaseName"
val table = "tableName"
val properties = new Properties()
properties.put("user", "username")
properties.put("password", "password")

// Read the table into a DataFrame
val df = spark.read.jdbc(url, table, properties)
df.show()
```
- Using the `format("jdbc")` options API with the MySQL Connector/J driver:
First, add the path to the MySQL Connector/J jar to the `spark-submit` command:

```shell
spark-submit --jars /path/to/mysql-connector-java.jar --class your_class your_jar.jar
```

Then connect to the MySQL database in code through the generic JDBC data source:
```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("MySQLExample")
  .getOrCreate()

val url = "jdbc:mysql://hostname:port/databaseName"
val table = "tableName"

// Read the table via the DataFrameReader options API
val df = spark.read.format("jdbc")
  .option("url", url)
  .option("dbtable", table)
  .option("user", "username")
  .option("password", "password")
  .load()
df.show()
```
Both approaches ultimately go through Spark's JDBC data source and the MySQL driver; they differ only in API style, so pick whichever fits your project.
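For large tables, the same `format("jdbc")` options also accept partitioning parameters so Spark can read in parallel instead of through a single connection. A minimal sketch, assuming a numeric `id` column and the same placeholder host, database, and credentials used above:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("MySQLPartitionedRead")
  .getOrCreate()

// Parallel read: Spark issues one query per partition over the id range.
// hostname, databaseName, tableName, and the credentials are placeholders;
// lowerBound/upperBound should roughly span the actual id values.
val df = spark.read.format("jdbc")
  .option("url", "jdbc:mysql://hostname:port/databaseName")
  .option("dbtable", "tableName")
  .option("user", "username")
  .option("password", "password")
  .option("partitionColumn", "id")  // must be a numeric, date, or timestamp column
  .option("lowerBound", "1")
  .option("upperBound", "1000000")
  .option("numPartitions", "8")
  .load()

df.show()
```

Without these options Spark reads the whole table in one task, which can be a bottleneck for big tables.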