Does the fact that there are two different mathematical definitions of entropy imply there are two different kinds of entropy?

One important point is that statistical entropy is defined as a function of the total energy of the system $$ S(E) = k_B \ln \Omega(E).$$ Now assume that your system, which starts with total energy $E$, is brought to energy $E' = E + \delta Q$ by heat exchange. The heat exchanged here is $\delta Q$, and for an infinitesimal change you have $$ S(E') - S(E) = \mathrm{d} S = \frac{\mathrm{d} S}{\mathrm{d} E}\,\delta Q.$$ Temperature is actually defined in statistical mechanics by $\frac{1}{T} = \frac{\mathrm{d} S}{\mathrm{d} E}$, so you recover the classical formula from statistical mechanics

$$\mathrm{d} S = \frac{\delta Q}{T} $$

Hence both formulas are indeed connected.
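
As a quick illustration, here is a minimal numerical sketch (assuming the monatomic-ideal-gas density of states $\Omega(E) \propto E^{3N/2}$ up to constants, with illustrative values for $N$ and $E$) checking that $1/T = \mathrm{d}S/\mathrm{d}E$ reproduces the equipartition result $E = \frac{3}{2} N k_B T$:

```python
import numpy as np

# Sketch: for a monatomic ideal gas, Omega(E) ~ E^(3N/2) up to constants
# that drop out of derivatives, so S(E) = k_B * (3N/2) * ln(E) + const.
# Then 1/T = dS/dE gives E = (3/2) * N * k_B * T (equipartition).

k_B = 1.380649e-23  # Boltzmann constant, J/K
N = 1e22            # number of particles (illustrative)

def S(E):
    """Boltzmann entropy S(E) = k_B ln Omega(E), with Omega(E) ~ E^(3N/2)."""
    return k_B * (1.5 * N) * np.log(E)

E = 1.0             # total energy in joules (illustrative)
dE = 1e-6
inv_T = (S(E + dE) - S(E)) / dE          # numerical dS/dE = 1/T
print("T from 1/T = dS/dE:      ", 1.0 / inv_T, "K")
print("T from E = (3/2) N k_B T:", E / (1.5 * N * k_B), "K")
```

Both prints agree (about 4.83 K for these illustrative numbers), which is just the statement that the statistical definition of $T$ is consistent with the thermodynamic one.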

A point about the difference between thermodynamics and statistical mechanics.

Thermodynamics is about what can be said about the system from the outside: the postulates of thermodynamics assume the existence of certain state functions (internal energy, entropy, ...) and state that those functions are enough to describe the exchanges of the system with its surroundings. But thermodynamics never provides a way to compute those functions and the associated quantities (such as the heat capacity).

Statistical mechanics, however, is concerned with computing such quantities from first principles (you start from the Hamiltonian of the system).
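
As an example of that program, here is a minimal sketch for a two-level system, where the two energy levels $0$ and $\varepsilon$ stand in for the spectrum of the Hamiltonian ($\varepsilon$ and the temperatures are illustrative values): one computes the canonical partition function and from it the entropy.

```python
import numpy as np

# First-principles sketch: entropy of a two-level system computed from its
# energy levels (the spectrum of the Hamiltonian).
# Z = sum_i exp(-E_i/(k_B T)), then S = k_B ln Z + <E>/T.

k_B = 1.380649e-23   # Boltzmann constant, J/K
eps = 1e-21          # level spacing in joules (illustrative)

def entropy(T):
    beta = 1.0 / (k_B * T)
    levels = np.array([0.0, eps])       # spectrum of the Hamiltonian
    weights = np.exp(-beta * levels)    # Boltzmann factors
    Z = weights.sum()                   # canonical partition function
    E_mean = (levels * weights).sum() / Z
    return k_B * np.log(Z) + E_mean / T

for T in (10.0, 100.0, 1000.0):
    print(f"T = {T:7.1f} K  ->  S = {entropy(T):.3e} J/K")
```

At high temperature the result approaches $k_B \ln 2$, the entropy of two equiprobable microstates.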

So there is no a priori incompatibility between the definitions of entropy in thermodynamics and statistical mechanics, as thermodynamics never explains how to compute the entropy without measuring something. (If you measure the heat capacity $C(T)$ you can retrieve the entropy via $S(T) = \int_0^T \frac{C(T')}{T'}\,\mathrm{d}T'$, but you still have to measure something.)
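
Here is a sketch of that measurement route, with synthetic heat-capacity data standing in for an actual measurement (a Debye-like low-temperature solid, $C(T) = aT^3$, chosen because the exact answer $S(T) = C(T)/3$ is known):

```python
import numpy as np

# Thermodynamic route: given measured C(T), recover the entropy from
# S(T) = integral_0^T C(T')/T' dT'. Synthetic Debye-like data: C = a*T^3,
# for which the exact result is S(T) = a*T^3/3 = C(T)/3.

a = 1e-6                             # J/K^4 (illustrative coefficient)
T = np.linspace(1e-3, 20.0, 2000)    # temperature grid, K
C = a * T**3                         # "measured" heat capacity

# Trapezoid-rule integration of C(T')/T' over the grid.
integrand = C / T
S_20K = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T))

print(f"numerical S(20 K) = {S_20K:.6e} J/K")
print(f"exact     S(20 K) = {a * 20.0**3 / 3:.6e} J/K")
```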


Here is how I consider the entropy definitions to be connected:

Classical thermodynamics: Entropy is a measure of the amount of energy which is unavailable to do work.

Statistical mechanics (Boltzmann entropy): Entropy is a measure of the amount of information which is unavailable about the many-particle system (i.e. entropy is a measure of potential information, and the Boltzmann entropy equals the Shannon entropy when the microstates are equiprobable; a quick numerical check follows below).
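
Here is the promised numerical check that the Shannon entropy of a uniform distribution over $\Omega$ microstates reduces to the Boltzmann form $\ln \Omega$ (in units of $k_B$; the value of $\Omega$ is arbitrary):

```python
import numpy as np

# Shannon entropy H = -sum_i p_i ln p_i for equiprobable microstates
# p_i = 1/Omega reduces to the Boltzmann form ln(Omega).

Omega = 1024                       # number of accessible microstates (illustrative)
p = np.full(Omega, 1.0 / Omega)    # uniform (equiprobable) distribution

shannon = -np.sum(p * np.log(p))   # H = -sum p_i ln p_i
boltzmann = np.log(Omega)          # S/k_B = ln Omega

print(shannon, boltzmann)          # both equal ln(1024) = 6.9314...
```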

So - if this is the same entropy - a measure of unavailable energy or information

then energy must be proportional to information, right?

Sure it is: Landauer's principle provides the mathematical connection.
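
For concreteness, a one-line evaluation of Landauer's bound, which says that erasing one bit of information at temperature $T$ dissipates at least $k_B T \ln 2$ of energy as heat:

```python
import numpy as np

# Landauer's bound: minimum dissipation per erased bit is k_B * T * ln 2.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

print(f"Landauer limit at {T} K: {k_B * T * np.log(2):.3e} J per bit")  # ~2.87e-21 J
```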