How could we see microwave radiation with our eyes?

Microwave radiation is the range of EM wavelengths between $1$ mm and $1$ m.

There are several reasons that the human eye cannot detect microwave radiation:

  • The cornea, aqueous humor, and crystalline lens collectively act as a bandpass filter, passing only wavelengths between 400 nm and 1400 nm. In other words, these materials are opaque to microwave radiation, so it never reaches the retina and you cannot see it. The above transmission data is from the SPIE Visual and Ophthalmic Optics Field Guide.

  • The retina itself is what detects electromagnetic radiation. There are two types of detectors in your retina: cones detect color (red, green, and blue) and rods detect low-light signals in black and white only. In a very dim room or outside on a dark night you will only see in black and white because your cones, which provide color detection, are inactive; only your rods detect light. Neither your cones nor your rods are sensitive in the microwave spectrum; in fact their sensitivity tapers off around 700 nm (for comparison, microwave radiation begins at 1 mm, i.e. $10^6$ nm). The sensitivity of your rods and cones is depicted here (each colored curve represents its respective cone; the black dashed line is the sensitivity of your rods).

You mentioned an equation that discusses the size of the eye. With the current materials in your eye (cornea, aqueous humor, and crystalline lens), as well as the response of the rods and cones, you will not be able to see microwave radiation regardless of how large your eye is.

However, an equation that might be what you're thinking of is the following:

$$ \sin\theta = \frac{1.22 \lambda }{D}$$

In this equation, $\theta$ is the diffraction-limited (best achievable) angular resolution of the system: the smaller $\theta$ is, the finer the detail you can resolve.

On the right-hand side of this equation are $\lambda$, the wavelength of light in question, and $D$, the diameter of the entrance pupil of your optical system. In the case of the human eye, $\lambda \approx 500\text{ nm}$ and $D_\text{pupil}\approx 2.5\text{ mm}$.

This computes to a diffraction limited angular resolution of

$$ \theta = \sin^{-1} \left( \frac{1.22\lambda}{D} \right) = \sin^{-1} \left( \frac{1.22 \times 500\times 10^{-9}\text{ m}}{2.5\times 10^{-3} \text{ m}} \right) \approx 0.014^\circ$$
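As a sanity check, the arithmetic above can be reproduced in a couple of lines of Python (the variable names here are just illustrative):

```python
import math

# Rayleigh criterion: sin(theta) = 1.22 * lambda / D
wavelength = 500e-9   # m, roughly the peak sensitivity of the eye
pupil_diameter = 2.5e-3  # m, typical human pupil

theta_deg = math.degrees(math.asin(1.22 * wavelength / pupil_diameter))
print(f"{theta_deg:.3f} degrees")  # 0.014 degrees
```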

To maintain this diffraction-limited angular resolution while increasing $\lambda$ by a factor of $2000$ (to $1$ mm, the edge of the microwave regime), you would have to increase the pupil diameter $D$ by the same factor of $2000$: your pupil would have to be $5$ m in diameter, and your eye as a whole even larger.

If you did not increase the pupil size, but did increase your wavelength to $1$ mm, your diffraction-limited resolution would be about $30^\circ$. At this wavelength you would have difficulty distinguishing two people standing at opposite sides of a medium-sized room.
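Both of these closing numbers follow from the same Rayleigh-criterion formula; a short sketch (with illustrative variable names) makes the scaling explicit:

```python
import math

pupil_diameter = 2.5e-3       # m, unchanged human pupil
microwave_wavelength = 1e-3   # m, the 1 mm edge of the microwave band

# Resolution with a normal pupil at 1 mm wavelength:
theta_uw = math.degrees(math.asin(1.22 * microwave_wavelength / pupil_diameter))
print(f"{theta_uw:.0f} degrees")  # ~29 degrees, i.e. about 30

# Pupil diameter needed to recover the visible-light resolution (~0.014 deg):
sin_theta_vis = 1.22 * 500e-9 / pupil_diameter
pupil_needed = 1.22 * microwave_wavelength / sin_theta_vis
print(f"{pupil_needed:.1f} m")  # 5.0 m, the factor-of-2000 scaling from the text
```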