Compare schema of dataframe with schema of other dataframe

I have schemas from two datasets read from an HDFS path; one of them is defined below:

val df = spark.read.parquet("/path")

df.printSchema()

root
 |-- name: string (nullable = true)
 |-- id: integer (nullable = true)
 |-- dept: integer (nullable = true)


Answer

Since your schema file appears to be a CSV:

// Read the schema file and convert it into a Map of (fieldName -> dataType)
val csvSchemaDf = spark.read.csv("/testschemafile")
val schemaMap = csvSchemaDf.rdd
  .map(x => (x(0).toString.trim, x(1).toString.trim))
  .collectAsMap

var isSchemaMatching = true

// Iterate through the schema fields of df and compare each against the map
for (field <- df.schema.toList) {
  if (!(schemaMap.contains(field.name) &&
        field.dataType.toString.equals(schemaMap.get(field.name).get))) {
    // Mismatch found
    isSchemaMatching = false
  }
}
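The loop above can also be written more compactly with forall, which stops at the first mismatch. This is a sketch using the same df and schemaMap as above; it relies on StructType being a Seq of StructField, so forall is available directly:

```scala
// Every field of df must appear in schemaMap with a matching type string
val isSchemaMatching = df.schema.forall { field =>
  schemaMap.get(field.name).contains(field.dataType.toString)
}
```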

Use isSchemaMatching in your further logic.
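If the second schema is also available as a DataFrame rather than a CSV file, the two StructTypes can be compared directly. A sketch, where df2 and its path are assumptions:

```scala
// Hypothetical second DataFrame read from another HDFS path
val df2 = spark.read.parquet("/otherpath")

// Strict comparison: names, types, nullability, and field order must all match
val strictMatch = df.schema == df2.schema

// Looser comparison: ignore field order and nullability, compare (name, type) pairs
def pairs(s: org.apache.spark.sql.types.StructType): Set[(String, String)] =
  s.fields.map(f => (f.name, f.dataType.toString)).toSet
val looseMatch = pairs(df.schema) == pairs(df2.schema)
```

The set-based variant is often what you want in practice, since column order in Parquet files is not always stable.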

User contributions licensed under: CC BY-SA