Why AT command set?

brhans is correct - Legacy.

In the early 1980s, Hayes began making the "Smartmodem 1200". It was obsolete almost immediately, and Hayes rushed out the Smartmodem 2400. In that rush there was no time to redesign the command interface between the two models. As a result, Hayes were the first to make two different-speed modems that accepted the same programming commands! Any software that could get a Smartmodem 1200 to dial a telephone number could also dial a Smartmodem 2400.

At the time, every new modem required months of waiting while an updated driver was written. When the Smartmodem 2400 came on the market, there was already a working driver for the Smartmodem 1200, so there was no wait at all. Suddenly other manufacturers realised the advantage of new modems having the same command set as older ones. Within six months, vendors were offering "Hayes compatible" modems as the only choice - which got them sued by Hayes. So everyone started calling their modems "AT Command Set compatible", but continued to use the Hayes command set.

By the mid-80s, no consumer modems were made that could not use the AT command set. As a result, every modem-like comms system uses AT commands. There are other advantages too: since the command set is plain ASCII, anyone can manually type AT commands into a terminal window to control a modem. Because my own modem had a dicey RJ11 connection, I used to start every session in Procomm Plus with:

AT                     <- typed: attention check
OK                     <- the modem's reply
ATH1                   <- typed: take the line off hook
[dial tone]            <- heard through the modem's speaker
ATDT [phone number]    <- typed: tone-dial the number

Just to make sure I got the dial tone. If I didn't, I'd go around and wiggle the wires a bit!
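The same ritual is easy to script, because the whole exchange is just ASCII over a serial port. Here's a minimal sketch using Python's pyserial package - the device path, speed and timeout are assumptions for illustration, not anything specific to Procomm or Hayes:

    import serial  # pyserial: pip install pyserial

    # Hypothetical device path and speed - adjust for your modem.
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=2) as port:
        port.write(b"AT\r")                 # attention check
        if b"OK" not in port.read(16):      # most modems echo, then reply OK
            raise RuntimeError("no answer - go wiggle the wires")
        port.write(b"ATH1\r")               # take the line off hook
        print(port.read(16).decode("ascii", "replace"))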


You're talking only about the downsides of the command set. Consider the upsides:

  1. By using the AT command set, your communication device can immediately be put on any IP network via the OS's PPP implementation (there's a sketch of that dial handshake after this list). The alternative is that in addition to designing a custom protocol interface, you have to write your own network device driver for every OS you want to support before that OS can use your device to join the Internet.

  2. Any competent engineer is going to know this protocol already. Take it from one whose day job requires him to understand and implement dozens of nonstandard serial protocols: one well-engineered common protocol is better.

  3. While it is true that the AT protocol is fairly complex and takes more memory to implement than a task-specific purpose-built protocol, it is also the case that someone who chooses to implement this protocol gets to avoid spending a bunch of time reinventing a perfectly good wheel. He's got decades of design expertise to draw from. He knows it will work before he commits development time to it. Good protocol design is surprisingly difficult.

    (One of these days, I'm going to publish my magnum opus, "Your Protocol Sucks," in the hopes of preventing the perpetration of more terrible half-considered one-off protocols.)
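To make point 1 concrete, here is a rough sketch of the expect/send dance a PPP dialer runs before handing the line over to the OS. The port, phone number and timeout are made up for illustration; real dialers (pppd's chat helper, for instance) do essentially this:

    import serial  # pyserial

    # Each step: wait for an expected string, then send the next command.
    SCRIPT = [
        (b"",        b"AT\r"),           # wake the modem
        (b"OK",      b"ATDT5551234\r"),  # tone-dial (number is made up)
        (b"CONNECT", None),              # carrier is up - stop here
    ]

    def run_chat(port):
        for expect, send in SCRIPT:
            buf = b""
            while expect and expect not in buf:
                chunk = port.read(64)
                if not chunk:
                    raise TimeoutError("never saw %r" % expect)
                buf += chunk
            if send:
                port.write(send)

    with serial.Serial("/dev/ttyS0", 57600, timeout=30) as port:
        run_chat(port)
        # From here the OS's PPP implementation takes over the line.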


I'll expand on the other side of the question ... why not just add another signalling line to the interface?

That can only be asked by someone who didn't live through all the permutations of signalling lines on a genuine 25-pin RS232 interface. In addition to TXD, RXD and Gnd, there were several other pairs of signals already: RTS/CTS (Request To Send, Clear To Send) and DSR/DTR (Data Set Ready, Data Terminal Ready, with a drop of DTR doubling as the hardware hang-up). And others. And there was no clear universal agreement between manufacturers on exactly which signal did what - why did you need two sets of hardware handshaking signals in the first place? (And there was the software XON/XOFF protocol on top of all that.) (And why did Diablo printers insist - uniquely as far as I know - on handshaking on pin 11?)

Some equipment required the full interface. Some were happy with just TXD/RXD/Gnd. Some could be fooled into working by shorting pins 4 and 5 (thus looping their own RTS back into CTS). And some devices that should have been DCE were DTE, or vice-versa, and would only talk to anything else via a "null modem" cable with each pair of signals crossed over.
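If you want to watch those lines wiggle rather than argue about them, pyserial exposes all of them. A hedged sketch, with a hypothetical port; the pin numbers in the comments are the standard DB-25 assignments:

    import serial  # pyserial

    port = serial.Serial("/dev/ttyS0", 9600)

    port.rts = True   # assert Request To Send (DB-25 pin 4)
    port.dtr = True   # assert Data Terminal Ready (pin 20)

    # Read back what the far end - or a loopback short - asserts:
    print("CTS:", port.cts, "DSR:", port.dsr, "DCD:", port.cd, "RI:", port.ri)
    # With pins 4 and 5 shorted, your own RTS comes straight back as CTS.

    port.dtr = False  # dropping DTR is the classic hardware hang-up
    port.close()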

Then, to simplify all this, the IBM PC/AT introduced a new 9-pin connector for RS232 - meaning your entire existing collection of cables was obsolete and you had to start again...

All of which made life difficult even without considering that both ends may have been set to different baud rates...

All of this supported an entire industry built around RS232 breakout boxes, adapter cables and test/debugging tools.

Adding another signal, in this context, probably wasn't going to fly...