In Spark, the DataFrameWriter's jdbc method can only write DataFrames (append to or overwrite a table); it cannot execute a DELETE statement. To delete rows from a JDBC data source, open a plain java.sql connection from the driver and run the DELETE yourself, as shown below:
import java.sql.DriverManager
import org.apache.spark.sql._

val spark = SparkSession.builder()
  .appName("Delete JDBC data")
  .config("spark.master", "local")
  .getOrCreate()

val jdbcUrl = "jdbc:mysql://localhost:3306/mydatabase"
val jdbcUsername = "username"
val jdbcPassword = "password"
val table = "my_table"
val condition = "id > 100"
val deleteQuery = s"DELETE FROM $table WHERE $condition"

// DataFrameWriter cannot run DELETE, so execute the statement over a
// plain JDBC connection on the driver.
val connection = DriverManager.getConnection(jdbcUrl, jdbcUsername, jdbcPassword)
try {
  val statement = connection.createStatement()
  val rowsDeleted = statement.executeUpdate(deleteQuery)
  println(s"Deleted $rowsDeleted rows from $table")
} finally {
  connection.close()
}

// Optionally verify the result by reading the table back into a DataFrame.
val connectionProperties = new java.util.Properties()
connectionProperties.put("user", jdbcUsername)
connectionProperties.put("password", jdbcPassword)
val df = spark.read.jdbc(jdbcUrl, table, connectionProperties)
In the code above, deleteQuery is the DELETE statement to run, built from table and condition. Because DataFrameWriter's jdbc method only writes DataFrames and cannot pass arbitrary SQL to the database, the statement is executed through java.sql.DriverManager with executeUpdate, which returns the number of rows removed. connectionProperties holds the user and password for the JDBC connection and is then used with spark.read.jdbc to read the table back and confirm the rows are gone.
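If you would rather stay inside the DataFrame API, a common workaround is to read the table, filter out the rows you want removed, and overwrite the table with what remains. The following is a minimal sketch reusing the variables from the example above; it is only practical for small tables, since the whole table gets rewritten, and the cache/count step matters because Spark reads lazily, so without it the truncate could wipe the data before it is actually read:

// Keep only the rows that should survive (the inverse of the delete condition).
val remaining = spark.read.jdbc(jdbcUrl, table, connectionProperties)
  .filter("id <= 100")
  .cache()
remaining.count() // materialize the rows now, before the table is truncated

// Overwrite the table with the surviving rows. The "truncate" option makes
// Spark empty the existing table instead of dropping and recreating it.
remaining.write
  .mode(SaveMode.Overwrite)
  .option("truncate", "true")
  .jdbc(jdbcUrl, table, connectionProperties)

Between the two approaches, the plain JDBC DELETE is usually preferable: it removes only the targeted rows on the database side, while the overwrite variant moves the entire table through Spark.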