#124 in Computers & technology books

Reddit mentions of Computer Organization and Design: The Hardware/Software Interface (The Morgan Kaufmann Series in Computer Architecture and Design)

Sentiment score: 16
Reddit mentions: 21

We found 21 Reddit mentions of Computer Organization and Design: The Hardware/Software Interface (The Morgan Kaufmann Series in Computer Architecture and Design). Here are the top ones.

Specs:
Height: 9 inches
Length: 7.5 inches
Number of items: 1
Weight: 3.46 pounds
Width: 1.5 inches



u/raz_c · 30 pointsr/programming

First of all, you don't need to write a billion lines of code to create an OS; the first version of Minix was 6,000 lines long. The creator of Minix, Andrew Tanenbaum, has a great book called Modern Operating Systems, in which he explains the inner workings of several famous operating systems.

Considering the emphasis on "from scratch", you should also have a very good understanding of the underlying hardware. A pretty good starter book for that is Computer Organization and Design by David A. Patterson and John L. Hennessy. I suggest reading this one first.

I hope this can get you started.

u/Fruitcakey · 10 pointsr/learnprogramming

Well, I'm entering my final year of my degree. I did it the hard way, with lots of fanny-arsing and making life difficult for myself. I strongly recommend you don't do it my way.
In my experience, in your first year you won't get exposed to a great deal of code, and nothing a clever university student can't cope with.

If I were to re-do my degree, I would get a grasp on the more theoretical side early on. Over the next 4 years you'll be doing plenty of programming; whatever programming you can cram into the summer ultimately won't count for much.

Early on you will be exposed to logic gates: AND, NOR, XOR, NAND, etc. Think of these as the smallest possible building blocks, at the circuit level, that computers run on. You can construct all types of logic gates using just the universal gates NOR and NAND. If you can teach yourself that (which I don't think would be too difficult) then you'll be ahead in at least one of your classes.
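
A minimal sketch of that universality claim, in Python rather than hardware (illustrative only; all names are generic):

```python
# Building the other basic gates out of NAND alone, which is what
# makes NAND a "universal" gate.

def NAND(a, b):
    return 0 if (a and b) else 1

def NOT(a):
    return NAND(a, a)            # NAND a signal with itself to invert it

def AND(a, b):
    return NOT(NAND(a, b))       # AND is just NAND followed by NOT

def OR(a, b):
    return NAND(NOT(a), NOT(b))  # De Morgan: a OR b == NOT(NOT a AND NOT b)

def XOR(a, b):
    return AND(OR(a, b), NAND(a, b))

# Truth-table check:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b), XOR(a, b))
```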

At some point you will have to learn a functional programming language, and if you're looking at one for the first time, it's a complete mind-fuck.
http://learnyouahaskell.com/
That's an excellent resource for learning Haskell, arguably the most popular functional language. If you manage to work through some of it, you'll be miles ahead of your classmates still struggling with the concept.

You will likely have some sort of data structures and algorithms class, so if I were you, I'd familiarise myself with the main ones.
Learn the difference between an array and a linked list, how quicksort and merge sort work, trees and binary trees, breadth-first search vs. depth-first search, amongst others. Don't exhaust yourself over it, but at least get a flavour for it.
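
For a flavour of the BFS-vs-DFS distinction, here's a small illustrative Python sketch (the tree and node names are made up for the example):

```python
# Breadth-first vs depth-first traversal of the same small tree:
# the only real difference is a queue (FIFO) versus a stack (LIFO).
from collections import deque

tree = {"A": ["B", "C"], "B": ["D", "E"], "C": ["F"],
        "D": [], "E": [], "F": []}

def bfs(root):
    order, queue = [], deque([root])
    while queue:
        node = queue.popleft()        # FIFO: visit level by level
        order.append(node)
        queue.extend(tree[node])
    return order

def dfs(root):
    order, stack = [], [root]
    while stack:
        node = stack.pop()            # LIFO: dive down one branch first
        order.append(node)
        stack.extend(reversed(tree[node]))
    return order

print(bfs("A"))  # ['A', 'B', 'C', 'D', 'E', 'F']
print(dfs("A"))  # ['A', 'B', 'D', 'E', 'C', 'F']
```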

Maybe in first year, maybe in second, you'll start learning about instruction set architectures, caches, operating systems, and some assembly. If you're keen, check out this:
http://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503
I genuinely can't recommend it enough.

Learning about the Internet Protocol Suite is probably a good idea. I found it really interesting and not too complex.


These are just my suggestions. In my opinion they are manageable, can be self-taught, and will give you a good head start and cover a few bases. Sure, at some point you will need to delve into number theory, graph theory, and proofs by induction, but don't jump into the deep end too soon or you'll end up overwhelmed, with big gaps in your knowledge. Hope it helps.

u/cesclaveria · 6 pointsr/learnprogramming

One book that I really liked, which helps you understand a lot of the "under the hood" stuff on the computer (from the hardware level up), is Computer Organization and Design: The Hardware/Software Interface. It's quite approachable, and I feel it helped me learn the MIPS R3000 assembly language. I used an earlier edition than the one I linked (I think the 2nd); I see the new one added sections on multicore and multiprocessor architectures.

The book focuses a lot on MIPS but also has exercises/examples for ARM and x86/IA-32.

u/old_TA · 6 pointsr/berkeley

Former 61C ugrad TA here. 61C is broken into 6 main ideas, which you can find on the last slide of the first lecture: http://www-inst.eecs.berkeley.edu/~cs61c/sp13/lec/01/2013Sp-CS61C-L01-dg-intro.pdf

From personal experience, 61C seems to be more difficult for most people than 61A or 61B. On the other hand, if you've been struggling with 61A or 61B, then 61C provides a much more level playing field - the material is new for pretty much everyone, so someone who's been programming since the beginning of high school doesn't have as much of an advantage as they do in the earlier classes.

Also, I realize the advice I'm about to give is devalued since I'm a former staff member, but if you want any type of A, READ THE BOOK CAREFULLY (the book I'm referencing is http://www.amazon.com/Computer-Organization-Design-Fourth-Edition/dp/0123747503/ref=dp_ob_title_bk). There are tons of subtleties in the material that we simply don't have enough time to cover in lecture/discussion/lab but that are essential to doing well on projects and exams. The book is meaty, but it's probably the best book in the world for this material.

Feel free to respond to this if you have more questions.

u/speaktothehand · 5 pointsr/learnprogramming

A book called Software Engineering by Ian Sommerville.
It explains key concepts like scheduling, inter-process communication, synchronization, etc.
It was one of the books recommended for my embedded systems course.

Ian Sommerville's website:
http://www.software-engin.com/

You should also use a book called Computer Organization and Design by David Patterson and John Hennessy, which explains computer architecture very well. They use the MIPS architecture as an example, but all the concepts are easily applied to other architectures.
http://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503

u/zxcdw · 4 pointsr/hardware
  • Learn how to program, in any language just to get the hang of the way to think algorithmically
  • Learn how your OS executes your programs on your CPU
  • Learn how to program in assembly language

That's just the beginning, but even from that you can infer a great deal about hardware. The low-level details of how AMD or Intel implement their FPUs or ALUs, or the communication protocols of their memory controllers, are utterly irrelevant unless you're doing something really specific. Studying such matters leads you nowhere, but actually having some understanding of how a computer operates in general, and actual experience of making a computer operate by programming it, is a huge deal.

Obviously this doesn't help you reason about individual products based on one specific microarchitecture versus another, but it creates the foundation for you to dig deeper into the subject.

Really, there's so much to it. Many subjects in computer science, electrical engineering, and even software development come into play. It's not about individual facts and rote learning, but about being able to generalize and synthesize ideas from your knowledge, with enough critical-thinking skill to recognize bullshit.

But if you just need one piece of advice, here it is: read Computer Organization and Design, Fourth Edition by Patterson & Hennessy. Or really, any other book on the subject. ...and learn to program, for real.

Wikipedia also has good articles on many of these subjects, and it's a great source of further sources. Some clever Google-fu can get good course materials, research papers, or just about anything else into your hands; for example, site:*.edu is your friend.

u/the_omega99 · 3 pointsr/learnprogramming

It's somewhat old and uses the MIPS architecture (still very applicable, although you probably wouldn't directly work with any MIPS hardware), but I thought Computer Organization and Design was pretty good. MIPS is a very good ISA for learning purposes due to its simplicity, and there are a number of simulators available for trying out programs.

The book will not only build up assembly language but also the very design of the processor, and then some. And of course that does include digital logic. For example, it covers the design of an ALU, as well as optimizations such as carry look-ahead. So you'd see things like one-bit adders as a set of basic logic gates (ANDs, ORs, etc.), while an ALU would be built from multiple one-bit adders, and so on. At the CPU level, things are usually a little higher level, since we end up with multiple ALUs and numerous multiplexers (not to mention complicated subsystems like RAM and the registers). Overall, it's pretty good at managing the abstractions and specifics.
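
As a software-level illustration of that adder story (a toy model only; the book's hardware diagrams are the real treatment):

```python
# A one-bit full adder expressed as basic gates, then chained into a
# ripple-carry adder. A carry look-ahead unit computes the carries in
# parallel instead of waiting for them to ripple stage by stage.

def full_adder(a, b, cin):
    s = a ^ b ^ cin                         # sum bit from two XORs
    cout = (a & b) | (a & cin) | (b & cin)  # carry out from ANDs and ORs
    return s, cout

def ripple_carry_add(x, y, width=8):
    result, carry = 0, 0
    for i in range(width):                  # chain one-bit adders; each
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i                  # carry "ripples" to the next stage
    return result, carry

print(ripple_carry_add(13, 7))  # (20, 0)
```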

u/zhay · 3 pointsr/AskComputerScience

I'm pretty sure this book has some diagrams.

Here are some lecture notes that might help.

u/ACoderGirl · 3 pointsr/IAmA

No idea about online resources, but this is the text my class used (for MIPS32 assembly; it should be pretty transferable to other RISC architectures like ARM). The focus really is on the architecture of the CPU and how that relates to assembly, which I feel is the important thing to learn anyway. I feel like the text is enough; the lectures in that class didn't really go beyond what it covered.

For exercises with assembly specifically, all the standard beginner programming problems can be used (basics of looping and conditionals, say). Really, anything from a first-year textbook should be an interesting challenge in assembly simply because assembly is sooo much simpler. It's not like there's very much to learn, because assembly is pretty minimal in what it can do; everything complex is just a shit ton of code (also why few would build anything large in pure assembly). You could have your compiler compile, say, a C program to assembly (gcc's -S flag) to practice reverse engineering. It'll let you see what assembly your compiler is generating and try to understand it (it'll be more complex and optimized than human-written assembly). Or you could even grab a disassembler, disassemble some existing binary, and try to understand some small portion of its assembly to see what it's doing.

u/binary_is_better · 3 pointsr/AskEngineers

> let's say I want to do 5*5. How does it take 5, and multiply the same thing in binary

How this is done varies from processor to processor. MIPS is usually the easiest processor to understand; when I learned this stuff in school we started with MIPS. I work mostly on ARM processors now (smartphones), but at a high enough level that I don't worry about the kind of details you're asking about now.

000000 00100 00101 00010 00000 100000

In MIPS, that binary sequence means add two numbers together. If the CPU saw that sequence, it would first look at the first six digits; this is called the opcode. My memory of what these do exactly is fuzzy, so I'll leave it to someone else to answer. The next five digits tell the CPU to grab the binary digits it is storing in register 4, and the five after that tell it to grab the digits it is storing in register 5. The next five digits tell the CPU that when it is done working with the numbers, it should store the result in register 2. The next five digits are ignored in this example. The last six digits tell the CPU that it should add the numbers together.

If you previously stored the numbers 3 and 17 in registers 4 and 5, register 2 should now hold 20. (It's a different MIPS instruction to store a number, and yet another instruction to retrieve the number.)
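
Here's a small Python sketch (an illustration, following the standard MIPS R-type field layout) that decodes that exact bit pattern into the fields described above:

```python
# Decode the 32-bit MIPS R-type instruction from the comment above.
word = int("00000000100001010001000000100000", 2)  # the sequence above

op    = (word >> 26) & 0x3F   # bits 31-26: opcode (000000 = R-type)
rs    = (word >> 21) & 0x1F   # bits 25-21: first source register
rt    = (word >> 16) & 0x1F   # bits 20-16: second source register
rd    = (word >> 11) & 0x1F   # bits 15-11: destination register
shamt = (word >> 6)  & 0x1F   # bits 10-6:  shift amount (unused here)
funct = word         & 0x3F   # bits 5-0:   function (100000 = add)

print(op, rs, rt, rd, shamt, funct)  # 0 4 5 2 0 32  ->  add $2, $4, $5
```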

I should note that most computer scientists never work at this low level of detail. If we want to add two numbers together and store the result, we just type "a = b + c;". That takes the number stored in location b, adds it to the number stored in location c, and stores the result in location a. We wouldn't care whether a, b, or c are in registers, cache, or RAM; those details are handled by the computer (well, the compiler), not us.

As to how the processor adds the numbers together, ask a hardware guy. I don't really remember, and to be honest I never really understood it that well either.

If you want to delve deeper into this subject, this is a good book, but be warned it assumes you already have a decent grasp of computer science.

As for the second part of your question, it has to do with the number of cores and what they specialize in. CPUs generally have just a few cores, maybe 1 to 8. They are also general purpose, so they can do a lot of things, and they are very powerful. This monster video card from AMD has 2048 stream processing units on it. None of those processing units is very powerful, and they can really only do a few tasks (which just so happen to be the ones that graphics needs), but it can do 2048 of them at a time versus 1 to 8 on a CPU. That's the difference between a CPU and a GPU.

Take the Mythbusters example. Their "GPU" can only paint the Mona Lisa, nothing else, but it can paint it very fast. The "CPU" could be programmed to paint anything; it just takes a lot longer. Actually, that's a bad example: a CPU will beat a GPU at almost everything. GPUs can only do a few tasks, but the tasks they can do, they are much better at than the CPU.

u/nosrednaekim · 3 pointsr/AskEngineers

I studied computer engineering with a focus on computer architecture.

A good book for the more advanced topics is Computer Organization and Design

It jumps in at a fairly deep level, so you'd better already have a working knowledge of microprocessor architecture, assembly language, state machines, etc.

u/Blue_Kill · 2 pointsr/learnprogramming

^ This is probably the book you want. It's available on Kindle too. If you want a more thorough overview, then Computer Organization and Design is the one a lot of universities use for their architecture course. But definitely don't get that one in the Kindle version, because it's too big and the layout is not right for an ebook reader.

u/gotomstergo · 2 pointsr/compsci

Remember that CS is as much about the physical computer as anything else. Computers are made possible by multiple layers of abstraction: they begin with semiconductors and boolean logic, then machine language, assembly language, and the compiler/linker, leading up to high-level languages like Python and C. Computers organize memory into a hierarchy to trade off the access time and capacity of different types of memory (cache, DRAM, hard drive).
In addition, the current trend seems to be heavily focused on multi-core, parallel systems, since engineers can't get enough performance improvement just by deepening pipelines or raising clock speeds.

So that's that. If you enjoy this realm of CS (it's more computer engineering, to be precise), you should read these books. This knowledge will "expand", as you put it, your understanding of computing systems in general.

http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686/ref=pd_sim_b_27

http://www.amazon.com/Computer-Organization-Design-Revised-Edition/dp/0123747503/ref=sr_1_1?ie=UTF8&qid=1344489171&sr=8-1&keywords=hennessy+computer+organization

u/aherpiesderpies · 2 pointsr/compsci

I'm more vocational than academic; with the experience you have, you can probably jump straight into work if that's what you want to do. I planned to work for a while and then go on to a master's, but a few years later it became clear that employers do not look past your vocational experience once you have a couple of years. Part of the reason I wanted to go back and do a master's after working was that during my undergrad we were taught a lot of concepts with nowhere to tie them to without real-world experience; this leads me to downplay the value of academic qualifications beyond somebody demonstrating they can get shit done.

That said, you almost certainly can; it looks like GU would take you. If you just want to get a better understanding of software development, you'd be better off joining in some open-source projects; if you want a better fundamental understanding of computers, then get this book. My copy is more than 5 years old and computers still work pretty much the same way, so don't bother splashing out :)

I do apologise for answering a different question from the one you asked, but from your question it looks like you are self-motivated and do a lot of learning on your own. If that's true, it's likely you can achieve more outwith an academic context than in it, and save a pile of cash along the way.

All the best :)

u/pybaws · 2 pointsr/cmu

https://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503 Is that the one you were referring to when you said "undergrad version"?

u/gimpwiz · 2 pointsr/osdev

https://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123747503

I thought this book was quite good. It also answers everything you've asked in this thread.

u/AlphaMotel · 1 pointr/compsci

Math-wise, you could start with some basic number theory. I found Rosen's Discrete Mathematics textbook to be really helpful.


You could also start with boolean algebra (AND, OR, NOT, XOR), bit shifting, and so on, since it will be absolutely useful later on.


For computer hardware and assembly language, I used The Art of Assembly Language by Randall Hyde and Computer Organization and Design by Patterson and Hennessy.

For cryptography, you might learn all about prime numbers, algorithms to find really large primes, and random number generators and why some are more random (cryptographically strong) than others.

Using that, you can work toward public/private-key encryption, one-way hash functions, and the main public-key and hash algorithms in common use (RSA, MD5, SHA-512), how they compare against each other, and how one-way hash functions are used to verify data integrity.
I found Gary Kessler's site to be really helpful
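
A tiny illustrative sketch with Python's standard hashlib, showing digest sizes and the avalanche effect (MD5 is long broken and appears for comparison only):

```python
# Same input, different algorithms, different digest sizes; and a
# one-character change in the input scrambles the digest completely.
import hashlib

msg = b"hello world"
print(hashlib.md5(msg).hexdigest())      # 128-bit digest (demo only)
print(hashlib.sha512(msg).hexdigest())   # 512-bit digest

# Flip one character and the SHA-512 digest changes beyond recognition:
print(hashlib.sha512(b"hello worle").hexdigest())
```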


For password security, you can then understand how to use a one-way hash function with a salt and a nonce to build a reasonably secure password storage system, and how one could safely store password hashes in a database like MySQL (www.mysql.com).
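
A minimal sketch of the salted-hash idea using only Python's standard library (the function names here are just for illustration; a real system should use a vetted scheme such as bcrypt, scrypt, or Argon2):

```python
# Salted password hashing: a random per-user salt plus a deliberately
# slow key-derivation function (the stdlib's PBKDF2).
import hashlib, hmac, os

def hash_password(password, salt=None):
    salt = salt or os.urandom(16)          # random per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest                    # store both in the database

def verify_password(password, salt, stored):
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored)  # constant-time comparison

salt, stored = hash_password("hunter2")
print(verify_password("hunter2", salt, stored))  # True
print(verify_password("guess", salt, stored))    # False
```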


And once you understand one-way hash functions and public and private keys, you'd already be 90% of the way to understanding how the Bitcoin protocol works, how CPUs mine bitcoins, and how the public-ledger blockchain works.
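
A toy proof-of-work sketch (vastly simplified compared to real Bitcoin, which hashes block headers with double SHA-256), just to show why mining is expensive while verification is cheap:

```python
# "Mining" = searching for a nonce whose hash has enough leading zeros.
# Finding the nonce takes many hashes; verifying it takes exactly one.
import hashlib

def mine(block_data, difficulty=4):
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):      # found a valid proof of work
            return nonce, digest
        nonce += 1

nonce, digest = mine(b"block #1: alice pays bob 5")
print(nonce, digest)
```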

As for other languages, one you could easily pick up is Java via Processing. I really enjoy using it; it was easy and fun to learn, and I use it a lot for rapid prototyping.

u/samsmith453 · 1 pointr/computerscience

What interests you about CS? What would you like to build / know / work on?

I would always recommend starting at the very bottom when it comes to learning computer science: build a knowledge of computing based on what is really happening under the hood, in the hardware. This is the approach I took, and it gave me a great foundation and accelerated my career!

This book is great: https://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503

I have just started a youtube series on understanding hardware aimed at beginners which you might find helpful:

https://www.youtube.com/playlist?list=PLH4a1-PgdkBTKkSSNx63uVkQG1Qs6GmYv

u/shinyhare · 1 pointr/ECE

After checking out some popular books besides the ones I learned from, for digital logic I found Schaum's Outline of Digital Principles to be surprisingly good, and concise. You could definitely get by with that, googling anything that doesn't "click" right away.

There are many books that go beyond basic digital logic to build things like microprocessors and embedded systems, so it's hard to give a solid recommendation (and in retrospect, all the ones I've read were way too verbose, imo). The one I'm most familiar with is this one. It's cool since it explains how programming languages are translated down to the hardware level, and it covers different processor architectures.

In any case, doing projects as you go along is probably going to be more important, and will teach you more than the reading itself.

u/PM_ME_UR_OBSIDIAN · 1 pointr/compsci

The "last interface" between hardware and software is the CPU. Think of it as made from a few universal shift registers connected to various other components (adders, etc.) Instructions will assert/deassert the control lines of the registers and components, and cause their contents to change accordingly. It's very interesting. Look at this book for more.