How to check if a string column in a PySpark DataFrame is all numeric

I agree with @Steven's answer, but with a slight modification, since I want the whole table to be filtered. Please find below:

from pyspark.sql import functions as F

df2.filter(F.col("id").cast("int").isNotNull()).show()

Also, there is no need to create a new column called Value.
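For completeness, here is a minimal end-to-end sketch of this approach; the sample data and the df2 name are assumptions chosen to mirror the table in the accepted answer below:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# hypothetical sample data matching the IDs shown further down
df2 = spark.createDataFrame(
    [("25q36",), ("75647",), ("13864",), ("8758K",), ("07645",)], ["id"]
)

# keep only the rows whose id casts cleanly to an integer
df2.filter(F.col("id").cast("int").isNotNull()).show()

# +-----+
# |   id|
# +-----+
# |75647|
# |13864|
# |07645|
# +-----+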


An alternative solution, similar to the one above:

display(df2.filter("CAST(id AS INT) IS NOT NULL"))
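Note that display() is a Databricks notebook helper, not part of PySpark itself; outside Databricks the same SQL-expression filter works with .show():

df2.filter("CAST(id AS INT) IS NOT NULL").show()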

A simple cast would do the job:

from pyspark.sql import functions as F

my_df.select(
  "ID",
  F.col("ID").cast("int").isNotNull().alias("Value ")
).show()

+-----+-----+
|   ID|Value|
+-----+-----+
|25q36|false|
|75647| true|
|13864| true|
|8758K|false|
|07645| true|
+-----+-----+
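If you want a single yes/no answer for the whole column rather than a per-row flag, you can count the rows that fail the cast; a minimal sketch, reusing my_df and the ID column from above:

from pyspark.sql import functions as F

# True only if every ID casts cleanly to int, i.e. no row fails the cast
all_numeric = my_df.filter(F.col("ID").cast("int").isNull()).count() == 0

print(all_numeric)  # False here, since 25q36 and 8758K are not numeric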