Spark – Divide int with column?

I’m trying to divide a constant by a column. I know I can do

df.col("col1").divide(90)

but how can I do the reverse, (90).divide(df.col("col1"))? (Obviously that syntax is incorrect.) Thank you!

Answer

Use org.apache.spark.sql.functions.lit:

lit(90).divide(df.col("col1"))

or org.apache.spark.sql.functions.expr:

expr("90 / col1")