What is the difference between a layer with a linear activation and a layer with no activation at all?

If you don't specify an activation in a Dense layer, it defaults to the linear activation. From the Keras documentation:

activation: Activation function to use (see activations). If you don't specify anything, no activation is applied (ie. "linear" activation: a(x) = x)
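You can verify this equivalence directly. A minimal sketch (assuming a recent Keras version, where a layer can be called on a NumPy array):

import numpy as np
from keras.layers import Dense

x = np.random.rand(4, 8).astype('float32')

dense_default = Dense(3)                      # no activation specified
dense_linear = Dense(3, activation='linear')  # explicit linear activation

y_default = dense_default(x)                  # builds the layer on first call
dense_linear.build(x.shape)                   # build, then copy the weights over
dense_linear.set_weights(dense_default.get_weights())

print(np.allclose(y_default, dense_linear(x)))  # True: identical outputs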

You only need to add a separate Activation layer if you want something other than 'linear':

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(1500))
model.add(Activation('relu'))
model.add(Dense(1500))
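Passing the activation directly to Dense is equivalent to adding it as a separate Activation layer, so the same stack can also be written as:

model.add(Dense(1500, activation='relu'))
model.add(Dense(1500))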

You are right: there is no difference between your snippets; both use the linear activation.

The activation function determines whether the layer is non-linear (e.g. sigmoid is a non-linear activation function):

model.add(Dense(1500))                        # linear activation (default)
model.add(Dense(1500, activation='sigmoid'))  # non-linear activation
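Non-linearity is what makes depth useful: stacking purely linear layers collapses into a single linear map. A minimal NumPy sketch (shapes chosen arbitrarily for illustration):

import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
W1, b1 = rng.standard_normal((8, 5)), rng.standard_normal(5)
W2, b2 = rng.standard_normal((5, 3)), rng.standard_normal(3)

# Two stacked linear layers...
two_layers = (x @ W1 + b1) @ W2 + b2
# ...equal one linear layer with W = W1 @ W2 and b = b1 @ W2 + b2.
one_layer = x @ (W1 @ W2) + (b1 @ W2 + b2)

print(np.allclose(two_layers, one_layer))  # True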

Further reading: 7 Common Nonlinear Activation Functions and How to Choose an Activation Function