PySpark: How to judge column type of dataframe

TL;DR Use external data types (plain Python types) to test values, and internal data types (DataType subclasses) to test the schema.


First and foremost, you should never use

type(123) == int

The correct way to check types in Python, which handles inheritance, is

isinstance(123, int)
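
For instance, as a quick illustration: bool is a subclass of int, so isinstance follows inheritance while a direct type comparison does not:

type(True) == int
# False
isinstance(True, int)
# True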

With that out of the way, let's talk about the question itself:

Basically I want to know the way to directly get the object of the class like IntegerType, StringType from the dataframe and then judge it.

This is not how it works. DataTypes describe the schema (internal representation), not the values. An external type is a plain Python type: if the internal type is IntegerType, then the external type is int, and so on, according to the rules defined in the Spark SQL Programming Guide.

The only place where IntegerType (or other DataType) instances exist is your schema:

from pyspark.sql.types import *

df = spark.createDataFrame([(1, "foo")])

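# Internal types: inspect each column's DataType in the schema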
isinstance(df.schema["_1"].dataType, LongType)
# True
isinstance(df.schema["_2"].dataType, StringType)
# True

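# External types: inspect the plain Python values returned to the driver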
_1, _2 = df.first()

isinstance(_1, int)
# True
isinstance(_2, str)
# True