get datatype of column using pyspark

Your question is broad, thus my answer will also be broad.

To get the data types of all your DataFrame columns, you can use dtypes, e.g.:

>>> df.dtypes
[('age', 'int'), ('name', 'string')]

This means your column age is of type int and name is of type string.
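If you want something self-contained to try, here is a minimal sketch, assuming a local SparkSession and sample data/column names of my own choosing:

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").getOrCreate()

# A DDL-style schema string forces 'age' to int; plain Python ints
# would otherwise be inferred as bigint.
df = spark.createDataFrame([(25, "Alice"), (30, "Bob")], "age int, name string")

print(df.dtypes)  # [('age', 'int'), ('name', 'string')]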


For anyone else who came here looking for an answer to the exact question in the post title (i.e. the data type of a single column, not of all columns): I haven't found a dedicated method for that.

Luckily it's trivial to get the type using dtypes:

def get_dtype(df, colname):
    # df.dtypes is a list of (column_name, type_string) tuples;
    # return the type string of the first column whose name matches.
    return [dtype for name, dtype in df.dtypes if name == colname][0]

get_dtype(my_df, 'column_name')

(Note that this returns only the first matching column's type if there are multiple columns with the same name, and it raises an IndexError if the column does not exist.)
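If you would rather fail loudly on a missing column, one variant (a sketch only; get_dtype_strict is a name I made up) is to build a dict from dtypes first:

def get_dtype_strict(df, colname):
    # dict(df.dtypes) maps column name -> type string; with duplicate
    # column names the last one wins (unlike the list version above,
    # which returns the first match).
    dtypes = dict(df.dtypes)
    return dtypes[colname]  # raises KeyError if colname is absent

get_dtype_strict(my_df, 'column_name')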