microcontroller / cpu design book?

I would suggest Building a Modern Computer from First Principles. As the title suggests, it goes through the process of building a computer from the fundamental logic gates on up. Check out the reviews on Amazon here.

The author's website actually has quite a lot of info on it, so you can decide for yourself if it looks like it would meet your needs.


I happen to be working on something like this, but I am not ready to release any of it yet (at least not publicly).

I think you need to edit/refine your question, though. Taking a general stab at it: I would start with something simpler than either ARM or x86, and would avoid x86 altogether.

Having an assembler book implies you are a programmer or have some programming skills. I don't know if your question is coming more from the programming side or the digital design side.

If you are coming from the programmer side, I would start by learning assembler for the MSP430, and ARM or Thumb (don't mess with x86 just yet; it is not a good architecture), and maybe PIC (the older PIC, not the dsPIC nor the PIC32). Then write a disassembler and/or an instruction set simulator. If it is just a disassembler, choose a fixed-word-length instruction set like ARM or Thumb (yes, if you think about it, you can handle that branch instruction) or the PIC. Eventually write an instruction set simulator, at least enough of one to implement a handful of instructions and a branch; make a loop and see it execute. Then go study up on state machines and rewrite the simulator using state machines. I have an instruction set simulator that is not necessarily, or completely, state-machine based, but it might give you a feel for what I am talking about: search for thumbulator at github. My username is dwelch67 there.
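To make that concrete, here is a minimal sketch in C of the core loop of such a simulator, using a made-up 16-bit encoding (purely hypothetical, not Thumb, ARM, or PIC): a fetch/decode/execute loop with a load, an add, a conditional branch, and a halt, just enough to run a countdown loop and watch it execute.

```c
/* Minimal sketch of an instruction set simulator for a made-up
 * (hypothetical) 16-bit ISA -- not a real architecture, just enough
 * to show fetch/decode/execute and a backwards branch forming a loop. */
#include <stdio.h>
#include <stdint.h>

#define OP_LOADI 0x1  /* reg[d]  = imm8                         */
#define OP_ADDI  0x2  /* reg[d] += imm8 (wraps at 8 bits)       */
#define OP_BNZ   0x3  /* if (reg[d] != 0) pc += (int8_t)imm8    */
#define OP_HALT  0xF

static uint16_t mem[256];   /* program memory, one instruction per word */
static uint8_t  reg[4];     /* four 8-bit registers */

int main(void)
{
    /* encoding: [15:12]=opcode, [11:10]=register, [7:0]=immediate */
    mem[0] = (OP_LOADI << 12) | (0 << 10) | 5;            /* r0 = 5          */
    mem[1] = (OP_ADDI  << 12) | (0 << 10) | (uint8_t)-1;  /* r0 -= 1         */
    mem[2] = (OP_BNZ   << 12) | (0 << 10) | (uint8_t)-2;  /* loop back to 1  */
    mem[3] = (OP_HALT  << 12);

    uint16_t pc = 0;
    for (;;)
    {
        uint16_t inst = mem[pc++];           /* fetch   */
        unsigned op   = (inst >> 12) & 0xF;  /* decode  */
        unsigned rd   = (inst >> 10) & 0x3;
        uint8_t  imm  =  inst & 0xFF;

        switch (op)                          /* execute */
        {
            case OP_LOADI: reg[rd]  = imm; break;
            case OP_ADDI:  reg[rd] += imm; break;
            case OP_BNZ:   if (reg[rd]) pc += (int8_t)imm; break;
            case OP_HALT:  printf("halted, r0 = %d\n", reg[0]); return 0;
            default:       printf("undefined instruction at %d\n", pc - 1); return 1;
        }
    }
}
```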

If you are approaching this from an electrical engineer's perspective, you might want to look at opencores. The readability and other qualities of the HDL there can be rough, but it is out there to be consumed. Verilator and Icarus Verilog are good Verilog simulators. I like Verilator more, partly because it is so rigid with its Verilog parsing, and partly because it is super easy to cross the C/C++ and HDL boundary for writing software that talks to the hardware being simulated, among other reusable things. There is a tiny core called mcpu, or something like that, where the whole thing fits on about a page of paper if printed. You really have to wrap your head around how to write a program with only one or two real operations (I think it only has a NOR). If you are an electrical engineer and need a crash course in digital: the transistors are always saturated, so basically they are used as electrically controlled switches. Take a pair of them with some wiring and other components and you can make AND and OR gates; add an inverting transistor and you can make a NOT gate, allowing NAND and NOR. You will want to learn about state machines if you don't already know them.
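As a side note on the "it only has a NOR" point: NOR by itself is functionally complete, which is what makes a machine like that workable at all. Here is a tiny C sketch (hypothetical one-bit "gates" modeled as functions, nothing more) showing NOT, OR, and AND built from nothing but NOR.

```c
/* Sketch: NOR is functionally complete. Each helper below is a
 * hypothetical one-bit gate built only from nor(). */
#include <stdio.h>

static int nor(int a, int b)  { return !(a | b); }

static int not_(int a)        { return nor(a, a); }              /* NOT from NOR */
static int or_ (int a, int b) { return not_(nor(a, b)); }        /* OR  from NOR */
static int and_(int a, int b) { return nor(not_(a), not_(b)); }  /* AND from NOR */

int main(void)
{
    /* print the truth tables to check the gates behave as expected */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("a=%d b=%d  NOT a=%d  a OR b=%d  a AND b=%d\n",
                   a, b, not_(a), or_(a, b), and_(a, b));
    return 0;
}
```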

In general you need to understand basic logic: AND, OR, NOT, etc. Learn about state machines, which work just fine in either software or hardware; the concept is the same. Reset and clocking are actually part of the state machine learning: getting the state machine going, thinking of each pass through the logic/code as one clock cycle in the CPU, and using your state variables on each pass to indicate what to do during that clock cycle, what the current state is, and, based on the current state and the inputs, what the next state is going to be, and so on.
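Continuing the earlier toy-simulator idea, here is a sketch of the same thing restructured as a state machine, where each call to step() is one "clock cycle" and the state variable decides what that cycle does and what the next state will be (again a hypothetical toy, not a real core).

```c
/* Sketch: the simulator as a state machine.  Each call to step() is
 * one clock cycle; the current state selects this cycle's work and
 * the next state. */
#include <stdio.h>
#include <stdint.h>

enum state { FETCH, DECODE, EXECUTE };

struct cpu {
    enum state state;          /* state variable examined every cycle */
    uint16_t   pc;
    uint16_t   inst;
    unsigned   op, rd, imm;
    uint8_t    reg[4];
};

static uint16_t mem[256];      /* program memory (left empty in this sketch) */

static void step(struct cpu *c)
{
    switch (c->state)
    {
        case FETCH:
            c->inst  = mem[c->pc++];
            c->state = DECODE;
            break;
        case DECODE:
            c->op    = (c->inst >> 12) & 0xF;
            c->rd    = (c->inst >> 10) & 0x3;
            c->imm   =  c->inst & 0xFF;
            c->state = EXECUTE;
            break;
        case EXECUTE:
            /* ALU / branch work for the decoded instruction goes here */
            c->state = FETCH;
            break;
    }
}

int main(void)
{
    struct cpu c = { .state = FETCH };
    for (int cycle = 0; cycle < 9; cycle++)   /* run a few clock cycles */
    {
        step(&c);
        printf("cycle %d: state=%d pc=%d\n", cycle, c.state, c.pc);
    }
    return 0;
}
```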


In the textbook domain, Computer Organization and Design is one of the classics. I'm not sure how well it works for self-study, though.