
Spark – Divide an int by a column?

I’m trying to divide a constant by a column. I know I can do

df.col("col1").divide(90)

but how can I do the reverse, (90).divide(df.col("col1"))? (Obviously this is incorrect, since 90 is not a Column.) Thank you!


Answer

Use o.a.s.sql.functions.lit:

lit(90).divide(df.col("col1"))

or o.a.s.sql.functions.expr:

expr("90 / col1")