ValueError: Cannot convert column into bool

It is complaining because you are passing your calc_dif function the whole Column objects, not the actual data of the respective rows.
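
Calling it directly on the Columns reproduces the error, because the if statement tries to convert a Spark Column into a Python boolean (a minimal reproduction, assuming your existing df and calc_dif):

calc_dif(df["_1"], df["_2"])
# ValueError: Cannot convert column into bool: please use '&' for 'and',
# '|' for 'or', '~' for 'not' when building DataFrame boolean expressions.

You need to use a udf to wrap your calc_dif function: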

from pyspark.sql.types import IntegerType
from pyspark.sql.functions import udf

l = [(2, 1), (1, 1)]
df = spark.createDataFrame(l)

def calc_dif(x, y):
    # wrapped in a udf, calc_dif is called once per row of the dataframe;
    # x and y receive the actual values of the two columns for that row
    if (x > y) and (x == 1):
        return x - y

udf_calc = udf(calc_dif, IntegerType())

dfNew = df.withColumn("calc", udf_calc("_1", "_2"))
dfNew.show()

# neither row satisfies (x > y) and (x == 1), so calc_dif returns None
+---+---+----+
| _1| _2|calc|
+---+---+----+
|  2|  1|null|
|  1|  1|null|
+---+---+----+
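
For a row where the condition actually holds the udf returns a value — a quick check, using a new row (1, 0) purely for illustration:

df2 = spark.createDataFrame([(1, 0)])
df2.withColumn("calc", udf_calc("_1", "_2")).show()

# here x > y and x == 1, so calc_dif returns x - y = 1
+---+---+----+
| _1| _2|calc|
+---+---+----+
|  1|  0|   1|
+---+---+----+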

Either use udf:

from pyspark.sql.functions import udf

@udf("integer")
def calc_dif(x, y):
    if (x > y) and (x == 1):
        return x - y
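
The decorated function is itself the udf, so it can be applied directly (assuming the same df as above):

dfNew = df.withColumn("calc", calc_dif("_1", "_2"))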

or case when (recommended):

from pyspark.sql.functions import when

def calc_dif(x, y):
    return when((x > y) & (x == 1), x - y)

The first one computes on Python objects row by row; the second one operates directly on Spark Columns, which avoids the Python udf overhead and is why it is recommended.
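
Because the when version builds a Column expression, you pass it Columns rather than row values — a usage sketch with the same df:

from pyspark.sql.functions import col

dfNew = df.withColumn("calc", calc_dif(col("_1"), col("_2")))
dfNew.show()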