RDD Programming Guide - Spark 3.3.1 Documentation
Spark 3.3.1 is built and distributed to work with Scala 2.12 by default. (Spark can be built to work with other versions of Scala, too.) To write applications in Scala, you will need to use a compatible Scala version (e.g. 2.12.X). To write a Spark application, you need to add a Maven dependency on Spark.
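As a sketch, the Maven dependency would look like the following in your `pom.xml`; the artifact name carries the Scala binary version suffix (`_2.12`), and the version shown here assumes Spark 3.3.1:

```xml
<!-- Spark core for Scala 2.12; adjust the version to match your cluster -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.3.1</version>
</dependency>
```

If you use sbt instead of Maven, the equivalent coordinate is typically written as `"org.apache.spark" %% "spark-core" % "3.3.1"`, where `%%` appends the Scala binary version for you.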