Why didn't GNU Info succeed man?

To answer your question with at least a hint of factual background, I propose to start by looking at the timeline of the creation of man, info, and other documentation systems.

The first man page was written in 1971 using roff (nroff and troff did not exist yet), at a time when working on a CRT-based terminal was not common and printing manual pages was the norm. Man pages use a simple linear structure and normally give a quick overview of a command, including its command-line options/switches.
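
To make that simple linear structure concrete, here is a minimal man page written with the classic man macro package; the frob command and its option are invented for illustration:

    .TH FROB 1
    .SH NAME
    frob \- frobnicate a file
    .SH SYNOPSIS
    .B frob
    .RB [ \-v ]
    .I file
    .SH DESCRIPTION
    .B frob
    frobnicates the named
    .IR file .
    .SH OPTIONS
    .TP
    .B \-v
    Report what is being done.

Format it with 'nroff -man frob.1 | less' on a terminal (or troff for the typesetter); a modern man can display it directly with 'man ./frob.1'.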

The info command actually displays output generated from Texinfo source. Texinfo had its initial release in February 1986, a time when working on a text-based CRT was the norm for Unix users, but graphical workstations were still exclusive. The .info output from Texinfo provides basic navigation of text documents, and from the outset it had a different goal: providing complete documentation (for the GNU Project). Things like the usage of a command and its command-line switches are only a small part of what a Texinfo file for a program contains.
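
For comparison, here is a minimal Texinfo source for the same hypothetical frob; note that the node/menu structure, not the option list, is the heart of the format:

    \input texinfo
    @setfilename frob.info
    @settitle Frob Manual

    @node Top
    @top Frob

    This manual documents @command{frob} in full.

    @menu
    * Invoking frob::    Command-line options.
    @end menu

    @node Invoking frob
    @chapter Invoking frob

    Run @samp{frob -v @var{file}} to frobnicate @var{file} verbosely.

    @bye

Running 'makeinfo frob.texi' produces frob.info, which you can read with 'info -f ./frob.info'; a real manual would hang concepts, examples and indices off the same node tree.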

Although there is overlap, the (Tex)info system was designed to complement the man pages, not to replace them.

HTML and web browsers came into existence in the early 90s and relatively quickly displaced text-based information systems such as WAIS and Gopher. Web browsers utilised the by then available graphical systems, which allow for more information (like underlined text for a hyperlink) than text-only systems do. The functionality info provides can be emulated in HTML in a web browser (possibly after conversion), and the browser-based system allows for greater ease of navigation (or at least requires less experience/learning).

HTML was then extended and can do more things than Texinfo can. So for new projects (other than GNU software) a whole range of documentation systems has evolved (and is still evolving), most of them generating HTML pages. A recent trend among these is to make their input (i.e. what the human documenter has to write) human-readable, whereas Texinfo (and troff) input is geared more toward efficient processing by the programs that transform it.¹

info was not intended to be a replacement for the man pages, but it might have replaced them if the GNU software had included an info2man-like program to generate the man pages from a (subset of a larger) Texinfo file.
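
Such a pipeline might have looked like the sketch below; the info2man command and its option are hypothetical, made up here purely to illustrate the idea of one Texinfo source feeding both systems:

    # One Texinfo source, two outputs (info2man is hypothetical):
    makeinfo frob.texi                                  # full manual -> frob.info
    info2man --node='Invoking frob' frob.texi > frob.1  # quick reference -> man page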

Combine that with the fact that fully utilising the facilities that systems like Texinfo, (La)TeX, troff, HTML (+CSS) and reStructuredText provide takes time to learn, and that some of those are arguably easier to learn and/or more powerful, and there was little chance of (Tex)info gaining market dominance.

¹ E.g. reStructuredText, which can also be used to write man pages.
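
As an illustration of that footnote: with docutils installed, a man page can be written as almost-plain text and converted with rst2man (rst2man.py on older docutils). The frob example again, this time in reStructuredText:

    ======
     frob
    ======

    -------------------
     frobnicate a file
    -------------------

    :Manual section: 1

    SYNOPSIS
    ========

    frob [-v] file

    DESCRIPTION
    ===========

    frob frobnicates the named file.

'rst2man frob.rst frob.1' then yields a troff man page; note how much closer this source is to ordinary prose than the man macros above.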


GNU info was preceded and influenced by XINFO on TOPS-20.

XINFO was preceded and influenced by INFO on MIT's ITS.

Back in the day, disks were tiny, terminals were slow, many terminals were still paper, and so-called 'glass TTYs' did not offer things like cursor addressing. Man pages were supposed to be short: reminders, not real documentation. They were small enough that your sysadmin probably didn't remove them from the machine to save space, but might well keep just the compressed, preformatted output around. They could be displayed on the crudest of terminals or typeset nicely. You could write new man pages using only software that came with your Unix distribution, and read them quickly, without having to navigate blindly through a twisty maze of nodes and edges, not entirely unlike playing rogue or zork.

Eventually programs like tkman made it possible to get some of the benefits of info or HTML while retaining the man format, and even when some vendors, like Sun, switched over to providing documentation in SGML or XML (DocBook), it was still processed by converting it to man format, because you needed the man tools for things not provided by the vendor.
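
That conversion route still exists; a minimal DocBook refentry for the hypothetical frob looks like this, and a DocBook toolchain (e.g. 'xmlto man frob.xml') turns it into frob.1:

    <!DOCTYPE refentry PUBLIC "-//OASIS//DTD DocBook XML V4.5//EN"
      "http://www.oasis-open.org/docbook/xml/4.5/docbookx.dtd">
    <refentry id="frob.1">
      <refmeta>
        <refentrytitle>frob</refentrytitle>
        <manvolnum>1</manvolnum>
      </refmeta>
      <refnamediv>
        <refname>frob</refname>
        <refpurpose>frobnicate a file</refpurpose>
      </refnamediv>
      <refsynopsisdiv>
        <cmdsynopsis>
          <command>frob</command>
          <arg choice="opt">-v</arg>
          <arg choice="req"><replaceable>file</replaceable></arg>
        </cmdsynopsis>
      </refsynopsisdiv>
      <refsect1>
        <title>DESCRIPTION</title>
        <para><command>frob</command> frobnicates the named file.</para>
      </refsect1>
    </refentry>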

That thing about sysadmins removing or not installing documentation files? To this day in Linux, 'apt install foobar' too often requires a separate 'apt install foobar-doc'.