How can I use robots.txt to disallow a subdomain only?

You can serve a different robots.txt file based on the subdomain through which the site is accessed. One way of doing this on Apache is to internally rewrite the URL using mod_rewrite in .htaccess. For example:

RewriteEngine On
# If the requested host is anything other than example.com or www.example.com...
RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
# ...internally rewrite requests for robots.txt to robots-disallow.txt
RewriteRule ^robots\.txt$ robots-disallow.txt [L]

The above states that for all requests for robots.txt where the host is anything other than example.com or www.example.com, the request is internally rewritten to robots-disallow.txt, which then contains the Disallow: / directive.
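Note that Disallow: / on its own is not a complete robots.txt file; it must belong to a record introduced by a User-agent line. So robots-disallow.txt would contain something like:

User-agent: *
Disallow: /

This tells all compliant crawlers to stay out of the entire subdomain.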

If you have other directives in your .htaccess file, this rule will need to be nearer the top, before any routing directives; otherwise an earlier rule flagged with [L] could catch the request for robots.txt and the rewrite above would never be reached.
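As a rough sketch of that ordering (assuming a typical front-controller setup that routes non-file requests to a hypothetical index.php), it would look something like:

RewriteEngine On

# Serve the disallow-all file on subdomains (must come before the routing rules)
RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
RewriteRule ^robots\.txt$ robots-disallow.txt [L]

# Other routing directives follow, e.g. a front controller
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]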