PySpark DataFrame withColumn multiple when conditions
stackoverflow.com › questions › 61926454 · Jul 2, 2021

How can I achieve the below with multiple when conditions?

    from pyspark.sql import functions as F

    df = spark.createDataFrame(
        [(5000, 'US'), (2500, 'IN'), (4500, 'AU'), (4500, 'NZ')],
        ["Sales", "Region"],
    )

    df.withColumn('Commission',
        F.when(F.col('Region') == 'US', F.col('Sales') * 0.05).\
        F.when(F.col('Region') == 'IN', F.col('Sales') * 0.04).\