
Reddit mentions of Computer Organization and Design MIPS Edition: The Hardware/Software Interface (The Morgan Kaufmann Series in Computer Architecture and Design)

Sentiment score: 20
Reddit mentions: 41

We found 41 Reddit mentions of Computer Organization and Design MIPS Edition: The Hardware/Software Interface (The Morgan Kaufmann Series in Computer Architecture and Design). Here are the top ones.

Computer Organization and Design MIPS Edition: The Hardware/Software Interface (The Morgan Kaufmann Series in Computer Architecture and Design)
Features:
  • Morgan Kaufmann Publishers

Specs:
  • Height: 9.1 inches
  • Length: 7.4 inches
  • Width: 1.3 inches
  • Weight: 3.04 pounds
  • Number of items: 1


Found 41 comments on Computer Organization and Design MIPS Edition: The Hardware/Software Interface (The Morgan Kaufmann Series in Computer Architecture and Design):

u/hell_onn_wheel · 13 pointsr/Python

Good on you for looking to grow yourself as a professional! The best folks I've worked with are still working on professional development, even 10-20 years into their profession.

Programming languages can be thought of as tools. Python, say, is a screwdriver. You can learn everything there is to know about screwdrivers, but this only gets you so far.

To build something you need a good blueprint. For this you can study object-oriented design (OOD) and programming (OOP). Once you have the basics, take a look at design patterns like those from the Gang of Four. This book is a good resource to learn about much of the above.

What parts do you specify for your blueprint? How do they go together? Study up on abstract data types (ADTs) and the algorithms that manipulate those data types. This is the definitive book on algorithms; it takes some work to get through, but it's worth the effort. (Side note: this is the book Google expects you to master before interviewing.)

How do you run your code? You may want to study general operating system concepts if you want to know how your code interacts with the system on which it is running. Want to go even deeper with code performance? Take a look at computer architecture. Another topic that should be covered is computer networking, since many applications these days don't work without a network.

What are some good practices to follow while writing your code? Two books that are widely recommended are Code Complete and The Pragmatic Programmer. Though they cover a very wide range of topics (everything from organizational hacks to unit testing to user design), it wouldn't hurt to check out Code Complete at the least, as it gives great tips on organizing functions and classes, modules and programs.

All these techniques and technologies are just bits and pieces you put together with your programming language. You'll likely need to learn about other tools and languages, debuggers and linters and optimizers; the list is endless. What helps light the path ahead is finding a mentor, someone who is well steeped in the craft and willing to show you how they work. This is best done in person, watching someone design and code. Also spend some time reading the code of others (GitHub is a great place for this) and interacting with them on public mailing lists and IRC channels. I hang out on Hacker News to hear about the latest tools and technologies (many posts to /r/programming come from Hacker News). See if there are any local programming clubs or talks you can join; it'd be a great forum to find yourself a mentor.

Lots of stuff here, happy to answer questions, but hope it's enough to get you started. Oh, yeah, the books, they're expensive but hopefully you can get your boss to buy them for you. It's in his/her best interest, as well as yours!

u/khedoros · 11 pointsr/EmuDev

I don't know about "beginner", but I was introduced to a lot of the key ideas when I took my Computer Architecture course in college, using a book like this.

Emulator 101 should be a good guide getting started, and other posts like Imran Nazar's about emulating the Game Boy in Javascript would be useful.

Chip-8 is a simple starting point (just a few operations, very simple architecture, only expects to run at about 500-1000 Hz, there are timers but not really interrupts, etc). Makes sense that it's simple and slow; it's actually a VM that ran on some microcomputers in the latter half of the 70s.
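
To see just how small Chip-8 is, here's a minimal fetch/decode sketch in C (a sketch only: registers, timers, and most of the ~35 opcode bodies are omitted, but the 0x200 load address and the opcode encodings shown are standard Chip-8):

```c
#include <stdint.h>

uint8_t  mem[4096];   /* Chip-8 has 4 KB of RAM; programs load at 0x200 */
uint16_t pc = 0x200;

void step(void) {
    /* Fetch: every instruction is two bytes, high byte first. */
    uint16_t op = (uint16_t)((mem[pc] << 8) | mem[pc + 1]);
    pc += 2;

    /* Decode: the top nibble selects the instruction family. */
    switch (op & 0xF000) {
    case 0x1000: pc = op & 0x0FFF; break;  /* 1NNN: jump to address NNN */
    case 0x6000: /* 6XNN: VX = NN  */      break;
    case 0x7000: /* 7XNN: VX += NN */      break;
    default:     /* ...roughly 35 opcodes in total */ break;
    }
}
```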

Space Invaders (the arcade game) has a more complex CPU, straightforward graphics and audio, and predictable timing on the interrupts.

Game Boy is a cleaner design than the NES, and the CPU can very nearly be adapted from the Space Invaders one. It introduces interrupts, interrupt priorities, memory bank-switching, more complex graphics and audio.

NES is similar to the Game Boy in some ways, but I feel like the quirkiness is even closer to the surface. Fewer interrupts, split memory map between CPU and PPU (the graphics chip), and a horrendous number of bank-switchers used in different games.

A lot of Sega's hardware, the SNES, or even something more obscure might make sense at this point.

My own path, so far, has been NES, started Game Boy (took a small break to build Chip-8), then finished Game Boy, added Color. Took a bit more time, then jumped into Game Boy Advance, which is my current thing (and being fair, I've taken a lot of breaks...I think I was seriously looking into GBA over a year ago).

u/srnull · 7 pointsr/hardware

> Textbooks aren't much of a thing because so much information is available online and technology changes so fast.

That's really not true, and I'm not just pointing out that those books exist. They're really good resources!
u/Y0tsuya · 7 pointsr/hardware

Low level stuff is pretty academic. You need textbooks just to get started.

These were my college textbooks (I have earlier editions):

Computer Architecture: A Quantitative Approach

Computer Organization and Design: The Hardware/Software Interface

The material is pretty dry but if you can slog through it you will gain good insight into how and why chips and systems are designed the way they are.

Below this level is logic gate design, which you'd never get through without a background in semiconductor physics.

u/Salyangoz · 7 pointsr/Turkey

First of all, good luck.

Stanford and MIT have online courses on iTunes U; you can start by looking there if you want.

To avoid wasting time, you can grab the syllabus of the schools you like off the internet and start studying from the books they list. If you want, I can PM you the courses from my own transcript, in order.

You can use the .edu email you get from your school to use most programs for free. (GitHub gives out an incredible pack; I recommend you exploit it 100%.)

Read books and write lots and lots of code. Write even if it's crappy, even if it's broken. Don't shy away from writing with pen and paper either (in job interviews they'll put you at a whiteboard; unfortunately there won't be a debugger/syntax checker).

Of the standard books that come to mind, these are at the top:

  • Introduction to Algorithms: Facebook and Google ask interview questions straight out of this book. A lot of it can feel very hard at first; no need to panic. If you're getting through 2 pages every 3 days or so, you're doing great.


  • For computer fundamentals, Computer Organization.

    If you're worried about your high school math, there's nothing too wild (it can vary by sector, of course). Refresh your linear algebra (image processing/game development, etc.) and probability (AI, machine learning, data analysis, etc.). If you want to do machine learning down the road, your probability knowledge needs to be strong.

    You don't need to follow a very strict path. The learn-it-as-you-need-it approach has worked out for me so far, but that's debatable.

    If anything else comes up, I'll try to answer.
u/Gibborim · 6 pointsr/EngineeringStudents

It seems like you are looking for a textbook on computer architecture?

If so, this book would be pretty standard.

u/joatmon-snoo · 6 pointsr/explainlikeimfive

Disclaimer: I don't know the EE stuff very well, but I do know enough to explain everything that comes after.

Here are two explanations of how you build logic gates from transistors: a simple one and, courtesy of the EE StackExchange, a more technical one. (The value of an input and output is taken relative to V-/GND.)

Before you can build a CPU with logic gates, there are two concepts you need: (1) Boolean algebra and (2) memory cells.

----

If you look up Boolean algebra, you're going to get a lot of results that only math majors really understand (e.g. the Wikipedia page). To simplify it all, Boolean algebra is essentially the field of study that asks "if I only have two values to work with, TRUE and FALSE, what kind of math can I do?" Notice that TRUE and FALSE map neatly to 1 and 0 (hello, binary math!) as well as HIGH and LOW (V+ and 0V).

This means that you can make all sorts of circuits, like binary adders, multipliers, dividers, and so on. (Subtraction involves some extra logical tricks.)

At this point, what you essentially have is the ability to create any function.
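
To make that concrete, here's a one-bit full adder written as Boolean operations in C (a sketch of the logic itself, not of any particular hardware):

```c
#include <stdio.h>

/* One-bit full adder from pure Boolean operations:
   sum = a XOR b XOR carry-in; carry-out = majority(a, b, carry-in). */
void full_adder(int a, int b, int cin, int *sum, int *cout) {
    *sum  = a ^ b ^ cin;
    *cout = (a & b) | (a & cin) | (b & cin);
}

int main(void) {
    /* Chain two adders for a 2-bit add; chain 32 and you have the
       adder at the heart of a 32-bit ALU. Here: 11b + 01b = 100b. */
    int s0, c0, s1, c1;
    full_adder(1, 1, 0, &s0, &c0);   /* low bits           */
    full_adder(1, 0, c0, &s1, &c1);  /* high bits + carry  */
    printf("%d%d%d\n", c1, s1, s0);  /* prints 100         */
    return 0;
}
```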

----

Now what we need is some way to remember data: that's where memory cells come into play. (This is basically your RAM.)

The primitive form that gets taught in introductory EE courses is the flip-flop circuit: a circuit with two stable states. The stable part here is important: it means that if such a circuit enters this state, it will not leave this state until an input changes. (Similarly, if such a circuit enters an unstable state, generally, it will eventually transition into a stable state.) There are a lot more ways to construct memory cells, of course, but flip-flops are a simple way to see how you can store and manipulate data in a circuit.
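
A quick software sketch of that idea (a NOR-based SR latch simulated in C; real gates settle asynchronously, but iterating the two cross-coupled gates to a fixed point shows the same hold/set/reset behavior):

```c
#include <stdio.h>

static int q = 0, qn = 1;   /* the two cross-coupled NOR outputs */

void latch(int s, int r) {
    for (int i = 0; i < 4; i++) {      /* iterate until stable */
        int new_q  = !(r | qn);        /* Q  = NOR(R, Q')      */
        int new_qn = !(s | q);         /* Q' = NOR(S, Q)       */
        q = new_q; qn = new_qn;
    }
}

int main(void) {
    latch(1, 0); printf("set:   Q=%d\n", q);   /* Q=1              */
    latch(0, 0); printf("hold:  Q=%d\n", q);   /* still 1: memory! */
    latch(0, 1); printf("reset: Q=%d\n", q);   /* Q=0              */
    return 0;
}
```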

With memory cells and Boolean algebra, you can now build state machines. Again, if you google this you're going to end up finding a lot of academic technical nitty-gritty, but at its most basic, a state machine has a finite number of states (A, B, C, ...) and each state corresponds to some function of its inputs.

----

The canonical example is a vending machine (keep in mind, all the electromechanical stuff is abstracted away here - we're only thinking about the control logic).

Let's start with a really simple vending machine. It only accepts $1 coins, it only dispenses one type of soda, and all sodas are $1 each. It's not our job to worry about restocking or counterfeit money or whatnot: our job is just the dispensing logic circuit. We know we're going to have one input and one output: an input for "is there a dollar coin being inserted" and an output for "dispense one can of soda". And if we think about it, the circuit should only have two states: dispensing a soda and not dispensing a soda.

That's pretty simple, then: we use one memory cell to distinguish between the dispensing and not-dispensing states. The output always reflects our internal state (i.e. output goes HIGH when dispensing, LOW when not dispensing); if our input goes HIGH when we're not dispensing, we transition to dispensing, and no matter what our input is when we're dispensing, we transition back to not dispensing.
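
Written out in C, that two-state machine might look like this (a sketch of a Moore machine, where the output depends only on the current state, so the can drops on the tick after the coin goes in):

```c
#include <stdio.h>

enum state { IDLE, DISPENSING };

/* One clock tick: input = "coin inserted?", output = "dispense?". */
int step(enum state *s, int coin_in) {
    int out = (*s == DISPENSING);          /* output = f(state) only */
    switch (*s) {
    case IDLE:       if (coin_in) *s = DISPENSING; break;
    case DISPENSING: *s = IDLE;                    break;
    }
    return out;
}

int main(void) {
    enum state s = IDLE;
    int coins[] = {0, 1, 0, 0};            /* nothing, coin, wait... */
    for (int i = 0; i < 4; i++)
        printf("coin=%d dispense=%d\n", coins[i], step(&s, coins[i]));
    return 0;      /* dispense goes HIGH on the tick after the coin */
}
```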

Now we can start adding some complexity to our vending machine: let's accept pennies, nickels, dimes, and quarters too. How about dollar bills? To deal with this, clearly our state machine is going to need some kind of internal counter for how much money has been inserted. We're also going to need logic to compare how much money has been inserted to how much soda costs right now ($1), and also logic to dispense change.

But not everyone's a fan of Generic Soda™ so we're going to need some variety. Now we need a way for people to choose a soda. And since some people are snobs and want pricey stuff - they're willing to pay $2 for their canned beverage of choice (gasp! shock! horror!) - we need to add logic to handle different prices.

----

CPUs are built up in much the same way as the hypothetical vending machine above. A program is supplied as input, in the form of a list of instructions, and the CPU is basically a really big state machine that goes through the program line by line.
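
In miniature, that looks something like the toy interpreter below (C, with a made-up four-instruction ISA just for illustration; a real CPU does the same fetch/decode/execute cycle in hardware):

```c
#include <stdio.h>

enum op { LOAD, ADD, PRINT, HALT };        /* hypothetical mini-ISA */
struct insn { enum op op; int arg; };

void run(const struct insn *prog) {
    int acc = 0, pc = 0;                   /* accumulator + program counter */
    for (;;) {
        struct insn i = prog[pc++];        /* fetch */
        switch (i.op) {                    /* decode + execute */
        case LOAD:  acc = i.arg;         break;
        case ADD:   acc += i.arg;        break;
        case PRINT: printf("%d\n", acc); break;
        case HALT:  return;
        }
    }
}

int main(void) {
    struct insn prog[] = { {LOAD, 2}, {ADD, 3}, {PRINT, 0}, {HALT, 0} };
    run(prog);                             /* prints 5 */
    return 0;
}
```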

Explaining the details of how a basic CPU is designed is a full undergraduate course (Computer Organization/Architecture, usually), and seeing as how I've already outlined its prerequisite (Digital Logic) above, I'm going to stop here. The text I learned from was Patterson and Hennessy's Computer Organization and Design (you can find free PDFs of older versions floating around if you just google it).

----

Aside: if you have Steam and are interested in assembly-level programming, I've heard great things about Shenzhen I/O.

u/Gaff_Tape · 6 pointsr/ECE

Not sure about EE-related topics, but for CE you're almost guaranteed to use these textbooks:

u/Echohawkdown · 6 pointsr/TechnologyProTips

In the interim, I suggest the following books:

  • Digital Design and Computer Architecture, by Harris & Harris - covers the circuitry & hardware logic used in computers. Should also cover how data is handled on a hardware level - memory's a bit rusty on this one, and I can't find my copy of it right now. Recommend that you read this one first.

  • Computer Organization and Design, by Patterson & Hennessy - covers the conversion of system code into assembly language, which itself turns into machine language (in other words, covers the conversion of programs from operating system code into hardware, "bare metal" code). Knowledge of digital circuitry is not required before reading, but strongly recommended.

  • Operating System Concepts, by Silberschatz, Galvin & Gagne - covers all the basic Operating System concepts that each OS today has to consider and implement. While there are Linux-based ones, there are so many different Linux "flavors" that, IMO, a book that covers a specific Linux base (called a Linux kernel) exclusively would be incomplete and fail to address all the key aspects you'll find in modern OSes. Knowledge of coding is required for this one, and therefore should be read last.

     

    As for the coding books, I suggest you pick one up on Python or Java - I'm personally biased towards Python over Java, since I think Python's syntax and code style looks nicer, whereas Java makes you say pretty much everything you're doing. Both programming languages have been out for a long time and see widespread usage, so there's plenty of resources out there for you to get started with. Personally, I'd suggest going with this book for Java and this book for Python, but if you go to Coursera or Codecademy, you might be able to get better, more interactive learning experiences with coding.

    Or you can just skip reading all of the books I recommended in favor of MIT's OpenCourseWare. Your choice.
u/ayequeue · 6 pointsr/learnprogramming

If you're trying to learn any assembly language (not specifically x86 based) I know there are several books out there for MIPS. I've used [Computer Organization and Design](http://www.amazon.com/Computer-Organization-Design-Fifth-Edition/dp/0124077269/ref=sr_1_1?ie=UTF8&qid=1396238245&sr=8-1&keywords=computer+organization+and+design) by Patterson and can say I found it very helpful. On top of that, [MARS](http://courses.missouristate.edu/kenvollmar/mars/), a combination IDE/emulator, can be used with it (and is open source/free).

u/MushinZero · 4 pointsr/ComputerEngineering

Do you understand a current ISA? This will become clearer once you do.

MIPS or RISC-V are the recommended ones to start with.

https://smile.amazon.com/dp/0124077269/ref=cm_sw_em_r_mt_dp_U_kfG4Db0D3JR91

https://smile.amazon.com/dp/0128122757/ref=cm_sw_em_r_mt_dp_U_hfG4DbCTTF7H3

Also, it is going to be much faster to implement a processor in an HDL than in Minecraft.

u/KibblesNKirbs · 4 pointsr/hardware

I used this book for my first comp-arch course; it's written by the guys who pioneered RISC and MIPS.

It goes over just about everything for the basics of how computer processors work. You can find a PDF online pretty easily.

u/nimblerabit · 3 pointsr/compsci

I learned mostly through reading textbooks at university, but not many of the books we were assigned stood out as particularly great. Here are a few that I did enjoy:

u/morto00x · 3 pointsr/embedded

Are you familiar with logic design (multiplexers, decoders, registers, logic gates, etc)? Computer Organization and Design covers a lot of it and is relatively easy to read. But having some background in digital logic will help a lot.

u/QuoteMe-Bot · 3 pointsr/ComputerEngineering

> We use vivado in school and they teach verilog. My impression is that VHDL is more of an industry standard, but I'm still a student so don't quote me on that. The way my university introduced digital logic was by having us start at logic gate level then use those modules to make state machines and last semester we made a MIPS processor.

> Vivado (web pack should be free)
> https://www.xilinx.com/products/design-tools/vivado.html

> Here is the book we used for the processor
> https://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269

~ /u/laneLazerBeamz

u/Opheltes · 3 pointsr/learnprogramming

Patterson and Hennessy's textbooks, Computer Architecture and Computer Organization and Design, are pretty much the standard textbooks used in every computer architecture class, everywhere.

u/Quinnjaminn · 3 pointsr/cscareerquestions

Copy pasting my response to a similar question:

Edited to have more resources and be easier to read.

It's hard to draw the line between "essential" and "recommended." That depends a lot on what you want to do. So, I will present a rough outline of core topics covered in the 4 year CS program at my university (UC Berkeley). This is not a strict order of topics, but prerequisites occur before topics that depend on them.

Intro CS

Topics include Environments/Scoping, abstraction, recursion, Object oriented vs functional programming models, strings, dictionaries, Interpreters. Taught in Python.

The class is based on the classic MIT text, "Structure and Interpretation of Computer Programs." Of course, that book is from 1984 and uses Scheme, which many people don't want to learn due to its rarity in industry. We shifted recently to reading materials based on SICP, but presented in python. I believe this is the reading used now. This course is almost entirely posted online. The course page is visible to public, and has the readings, discussion slides / questions and solutions, project specs, review slides, etc. You can find it here.

Data Structures and basic algorithms

DS: Arrays, Linked Lists, Trees (Binary search, B, Splay, Red-Black), Hash Tables, Stacks/Queues, Heaps, Graphs. Algorithms: Search (breadth-first vs depth-first), Sorting (bubble, radix, bucket, merge, quick, selection, insert, etc), Dijkstra's and Kruskal's, Big-O analysis.

This class uses two books: "Head First Java" and "Data Structures and Algorithms in Java" (any edition except 2). The class doesn't presupposed knowledge in any language, so the first portion is covering Object Oriented principles and Java from a java book (doesn't really matter which), then moving to the core topics of data structures and algorithms. The course page has some absolutely fantastic notes -- I skim through these before every interview to review. You can also check out the projects and homeworks if you want to follow along. The course page is available here (note that it gets updated with new semesters, and links will be removed -- download them soon if you want to use them).

Machine Structures (Intro Architecture)

Warehouse scale computing (Hadoop Map-Reduce). C language, basics of assemblers/compilers/linkers, bit manipulation, number representation. Assembly Language (MIPS). CPU Structure, pipelining, threading, virtual memory paging systems. Caching / memory hierarchy. Optimization / Performance analysis, parallelism (Open MP), SIMD (SSE Intrinsics).

This class uses two books: "The C Programming Language" and "Computer Organization and Design". This class is taught primarily in C, so the first few weeks are spent as a crash course in C, along with a discussion/project using Map-Reduce. From there in jumps into Computer Organization and Design. I personally loved the projects I did in this class. As with above, the lecture slides, discussion notes, homeworks, labs, solutions, and projects are all available on an archived course page.

Discrete Math / Probability Theory

Logic, Proofs, Induction, Modular Arithmetic (RSA / Euclid's Algorithm). Polynomials over finite fields. Probability (expectation / variance) and its applicability to hashing. Distributions, Probabilistic Inference. Graph Theory. Countability.

Time to step away from coding! This is a math class, plain and simple. As for a book, well, we really didn't have one. The class is based on a series of "Notes" developed for it; taken as a whole, these notes serve as the official textbook. The notes, homeworks, etc. are here.

Efficient Algorithms and Intractable Problems

Designing and analyzing algorithms. Lower bounds. Divide and Conquer problems. Search problems. Graph problems. Greedy algorithms. Linear and Dynamic programming. NP-Completeness. Parallel algorithms.

The Efficient Algorithms class stopped posting all of the resources online, but an archived version from 2009 has homeworks, reading lists, and solutions. This is the book used.

Operating Systems and System Programming

Concurrency and Synchronization. Memory and Caching. Scheduling and Queuing theory. Filesystems and databases. Security. Networking.

The Operating Systems class uses this book, and all of the lectures and materials are archived here (Spring 2013).

Math

Those are the core classes, not including about 4 (minimum) required technical upper division electives to graduate with a B.A. in CS. The math required is:

  • Calculus 1 and 2 (Calc AB/BC, most people test out, though I didn't)

  • Multivariable calculus (not strictly necessary, just recommended)

  • Linear Algebra and Differential Equations.

    Those are the core classes you can expect any graduate from my university to have taken, plus 4 CS electives related to their interests. If you could tell me more about your goals, I might be able to refine it more.
u/schreiberbj · 3 pointsr/compsci

This question goes beyond the scope of a reddit post. Read a book like Code by Charles Petzold, or a textbook like Computer Organization and Design or Introduction to Computing Systems.

In the meantime you can look at things like datapaths which are controlled by microcode.

This question is usually answered over the course of a semester long class called "Computer Architecture" or "Computing Systems" or something like that, so don't expect to understand everything right away.

u/[deleted] · 2 pointsr/learnprogramming

Welp - here's how they used to do it...
http://www.brackeen.com/vga/basics.html

At a basic level, computer graphics are output as a system of matrices correlating to pixel RGB values and state; coefficients are passed in as mapping transformations, which apply changes to the relative values. The fundamental level of graphics forms a complicated system of linear equations, and it's very dense math. Some really smart people spent their life's work figuring out how this works. If you're still interested, nobody here is going to give you a satisfying answer, as you'll have some college linear algebra courses to catch up on before you can even talk about it in the context of graphics.
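
To give just a flavor of that math, here's about the smallest possible sketch in C (a 2D rotation applied to one point; real pipelines apply 4x4 matrix products to every vertex, but it's the same idea):

```c
#include <stdio.h>
#include <math.h>

/* Rotate a point (x, y) about the origin by theta radians:
   [x']   [cos t  -sin t] [x]
   [y'] = [sin t   cos t] [y]   */
void rotate(double theta, double *x, double *y) {
    double nx = cos(theta) * *x - sin(theta) * *y;
    double ny = sin(theta) * *x + cos(theta) * *y;
    *x = nx; *y = ny;
}

int main(void) {
    double x = 1.0, y = 0.0;
    rotate(3.14159265358979 / 2.0, &x, &y);   /* 90 degrees */
    printf("(%.1f, %.1f)\n", x, y);           /* (0.0, 1.0) */
    return 0;
}
```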

These calculations are now aided by software interfaced with video processing hardware at a very basic level on the actual chip. The reason you aren't finding a straight answer is that we just don't do graphics programming in assembly anymore. For that matter, very few people outside of proprietary microcontrollers program in assembly. Even the assembly your high-level code compiles to is accessing graphics libraries on the chip before it's compiled to machine language. The fact is that this stuff is handled at a very high level of abstraction; hell, most of the process from assembly to machine code is incredibly abstract, even at an otherwise basic level, and you're not going to get a hard and fast answer.


If you're interested in this stuff I would highly recommend this book :
http://www.amazon.com/gp/product/0124077269?keywords=computer%20organization%20and%20design&qid=1457111982&ref_=sr_1_1&sr=8-1


And of course, if you're in college you should take a course called Computer Architecture, I think you'll like it a lot.


u/Nullsrc · 2 pointsr/unexpectedfactorial

Find it here on Amazon. It's actually a pretty good textbook and worth reading even if you're mostly a software developer.

u/0x5345414E · 2 pointsr/webdev

You shouldn't worry so much about different programming languages. They all more or less work the same. I would recommend you learn how a computer works at a low level and work your way up. You could start here, then move on to this. This kind of knowledge makes a lot of things clearer.

u/ziptofaf · 2 pointsr/learnprogramming

> This book could have been useful also for C++ real-time programmers of course, because I would include also HW information used in that field... probably I'm asking too much now...

It wouldn't be. You misunderstand how that field of programming works. Differences can be HUGE and you would end up using at least something like https://www.amazon.com/Computer-Organization-Design-MIPS-Architecture/dp/0124077269.

Why? Because the hardware used there can be fundamentally different from your typical computer. How different? Well... some CPUs don't support recursion. No, really. Make more than 2-3 recursive calls and the CPU runs out of memory. You also end up using FPGAs and ASICs. Explaining all that takes more than one book.

You seem to want a hypothetical book on "current PC hardware and its performance". Frankly, that doesn't exist in the form of a book; it comes from visiting places like Guru3d and AnandTech. The actual low-level differences that WILL matter for a programmer are hidden in CPU spec sheets, and to read those you need resources that target computer architectures and your problem domain specifically. Well, that and practice really - someone working in game engine development is likely to know the graphics pipeline like the back of their hand and can easily talk about performance on several GPUs and pinpoint what makes one better than the other. But that came from experimenting and plenty of articles, not a singular resource. There are too many different requirements to create a single resource stating that "X is good for Y branch of programming but bad for Z".

I mean, even within desktop CPUs themselves: would you rather have an 18-core i9-9980XE or a 32-core Threadripper 2990WX? The answer is - it depends. One has far superior single-threaded performance due to higher clocks; the latter will eat it alive in heavily multithreaded, independent processes (the 2990WX has 32 cores, but only 16 are connected to the rest of your computer; this can cause very visible delays, so there are multithreaded scenarios where it will underperform). And in some cases you will find that an 8-core 9900K is #1 in the world. It ALL depends on the specific application and its profile.

u/ctcampbell · 2 pointsr/netsec

Or "Computer Organization and Design, Fifth Edition: The Hardware/Software Interface"

http://www.amazon.com/Computer-Organization-Design-Fifth-Edition/dp/0124077269

u/lordyod · 1 pointr/UCSC

That was a typo; it's CE12. The past three quarters it has focused on digital logic structures, binary/hex math, the basics of building a processor, and the MIPS assembly language. If you want to get a head start, pick up the book: Computer Organization and Design.

CS101 will depend on the instructor. If you are assigned to Tantalo's class, you will be doing a mix of programming assignments and proof stuff. I'm not super familiar with the details, but luckily his materials are all posted on his course websites; just google UCSC CMPS 101 and find it. If on the other hand you are assigned to Sesh's class, then (at least based on this last quarter) you won't be doing coding; you'll be doing very thorough proofs about algorithms. Both of these classes use CLRS which, if you're serious about CS, you'll probably want to have as a desk reference regardless.

u/Flynzo · 1 pointr/RPI

https://www.amazon.com/Computer-Organization-Design-MIPS-Fifth/dp/0124077269

I believe this is the one. I had just rented it for the semester from Amazon, but they also have it in the bookstore. I don't doubt there are PDFs of it you could find online too.

u/stillinmotionmusic · 1 pointr/utdallas

The assignments changed in difficulty a lot; some were very nit-picky things from the textbook, others were writing general assembly programs.

The topics in general were what was difficult; writing assembly itself wasn't that hard, but understanding how everything fits together with the diagrams was the difficult part.

We used Computer Organization and Design as our textbook; however, the book is not designed very well and contains errors. It has a lot of information, but making sense of that information is what was difficult for me.

That's why I am trying to learn digital logic: I feel being grounded in that would make reading the diagrams and everything easier. Appendix B in that textbook tries to cover digital logic, but it isn't explained very well, and the book assumes you already know it when going through the actual material.

u/LogBaseE · 1 pointr/ECE

It's Verilog-based, but I like Ciletti, Mano, and Patterson:

https://www.amazon.com/Advanced-Digital-Design-Verilog-HDL/dp/0136019285

https://www.amazon.com/Digital-Design-Introduction-Verilog-HDL/dp/0132774208

https://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269


I just went through a project course and here were some good project ideas:

  • Conway's Game of Life with VGA/LED panel
  • Single-cycle CPU
  • 2D convolution with systolic arrays (really cool)


u/interiorcrocodile666 · 1 pointr/learnprogramming

Sounds like you need to read a book on Computer Organization and Design. This book will teach you how to build a computer from basic logic gates. I can't recommend it highly enough.

u/empleadoEstatalBot · 1 pointr/argentina

> # Teach Yourself Computer Science
>
>
>
> If you’re a self-taught engineer or bootcamp grad, you owe it to yourself to learn computer science. Thankfully, you can give yourself a world-class CS education without investing years and a small fortune in a degree program 💸.
>
> There are plenty of resources out there, but some are better than others. You don’t need yet another “200+ Free Online Courses” listicle. You need answers to these questions:
>
> - Which subjects should you learn, and why?
> - What is the best book or video lecture series for each subject?
>
> This guide is our attempt to definitively answer these questions.
>
> ## TL;DR:
>
> Study all nine subjects below, in roughly the presented order, using either the suggested textbook or video lecture series, but ideally both. Aim for 100-200 hours of study of each topic, then revisit favorites throughout your career 🚀.
>
>
>
>
>
> | Subject | Why study? | Best book | Best videos |
> | --- | --- | --- | --- |
> | Programming | Don’t be the person who “never quite understood” something like recursion. | Structure and Interpretation of Computer Programs | Brian Harvey’s Berkeley CS 61A |
> | Computer Architecture | If you don’t have a solid mental model of how a computer actually works, all of your higher-level abstractions will be brittle. | Computer Organization and Design | Berkeley CS 61C |
> | Algorithms and Data Structures | If you don’t know how to use ubiquitous data structures like stacks, queues, trees, and graphs, you won’t be able to solve hard problems. | The Algorithm Design Manual | Steven Skiena’s lectures |
> | Math for CS | CS is basically a runaway branch of applied math, so learning math will give you a competitive advantage. | Mathematics for Computer Science | Tom Leighton’s MIT 6.042J |
> | Operating Systems | Most of the code you write is run by an operating system, so you should know how those interact. | Operating Systems: Three Easy Pieces | Berkeley CS 162 |
> | Computer Networking | The Internet turned out to be a big deal: understand how it works to unlock its full potential. | Computer Networking: A Top-Down Approach | Stanford CS 144 |
> | Databases | Data is at the heart of most significant programs, but few understand how database systems actually work. | Readings in Database Systems | Joe Hellerstein’s Berkeley CS 186 |
> | Languages and Compilers | If you understand how languages and compilers actually work, you’ll write better code and learn new languages more easily. | Compilers: Principles, Techniques and Tools | Alex Aiken’s course on Lagunita |
> | Distributed Systems | These days, most systems are distributed systems. | Distributed Systems, 3rd Edition by Maarten van Steen | 🤷‍ |
>
> ## Why learn computer science?
>
> There are 2 types of software engineer: those who understand computer science well enough to do challenging, innovative work, and those who just get by because they’re familiar with a few high level tools.
>
> Both call themselves software engineers, and both tend to earn similar salaries in their early careers. But Type 1 engineers grow into more fulfilling and well-remunerated work over time, whether that’s valuable commercial work or breakthrough open-source projects, technical leadership or high-quality individual contributions.
>
>
>
> Type 1 engineers find ways to learn computer science in depth, whether through conventional means or by relentlessly learning throughout their careers. Type 2 engineers typically stay at the surface, learning specific tools and technologies rather than their underlying foundations, only picking up new skills when the winds of technical fashion change.
>
> Currently, the number of people entering the industry is rapidly increasing, while the number of CS grads is essentially static. This oversupply of Type 2 engineers is starting to reduce their employment opportunities and keep them out of the industry’s more fulfilling work. Whether you’re striving to become a Type 1 engineer or simply looking for more job security, learning computer science is the only reliable path.
>
>
>
>
>
> ## Subject guides
>
> ### Programming
>
> Most undergraduate CS programs start with an “introduction” to computer programming. The best versions of these courses cater not just to novices, but also to those who missed beneficial concepts and programming models while first learning to code.
>
> Our standard recommendation for this content is the classic Structure and Interpretation of Computer Programs, which is available online for free both as a book, and as a set of MIT video lectures. While those lectures are great, our video suggestion is actually Brian Harvey’s SICP lectures (for the 61A course at Berkeley) instead. These are more refined and better targeted at new students than are the MIT lectures.
>
> We recommend working through at least the first three chapters of SICP and doing the exercises. For additional practice, work through a set of small programming problems like those on exercism.
>
> For those who find SICP too challenging, we recommend How to Design Programs. For those who find it too easy, we recommend Concepts, Techniques, and Models of Computer Programming.
>
>
>
> [Structure and Interpretation of Computer Programs](https://teachyourselfcs.com//sicp.jpg)
>
>
>
> ### Computer Architecture
>
> Computer Architecture—sometimes called “computer systems” or “computer organization”—is an important first look at computing below the surface of software. In our experience, it’s the most neglected area among self-taught software engineers.
>
> The Elements of Computing Systems, also known as “Nand2Tetris” is an ambitious book attempting to give you a cohesive understanding of how everything in a computer works. Each chapter involves building a small piece of the overall system, from writing elementary logic gates in HDL, through a CPU and assembler, all the way to an application the size of a Tetris game.
>
> We recommend reading through the first six chapters of the book and completing the associated projects. This will develop your understanding of the relationship between the architecture of the machine and the software that runs on it.
>
> The first half of the book (and all of its projects), are available for free from the Nand2Tetris website. It’s also available as a Coursera course with accompanying videos.
>
> In seeking simplicity and cohesiveness, Nand2Tetris trades off depth. In particular, two very important concepts in modern computer architectures are pipelining and memory hierarchy, but both are mostly absent from the text.
>
> Once you feel comfortable with the content of Nand2Tetris, our next suggestion is Patterson and Hennessy’s Computer Organization and Design, an excellent and now classic text. Not every section in the book is essential; we suggest following Berkeley’s CS61C course “Great Ideas in Computer Architecture” for specific readings. The lecture notes and labs are available online, and past lectures are on the Internet Archive.
>
>
>
>
>
> ### Algorithms and Data Structures
>
> We agree with decades of common wisdom that familiarity with common algorithms and data structures is one of the most empowering aspects of a computer science education. This is also a great place to train one’s general problem-solving abilities, which will pay off in every other area of study.
>
> There are hundreds of books available, but our favorite is The Algorithm Design Manual by Steven Skiena. He clearly loves this stuff and can’t wait to help you understand it. This is a refreshing change, in our opinion, from the more commonly recommended Cormen, Leiserson, Rivest & Stein, or Sedgewick books. These last two texts tend to be too proof-heavy for those learning the material primarily to help them solve problems.
>

> (continues in next comment)

u/dnabre · 1 pointr/asm

Apparently the sidebar needs additions; there is nothing there for MIPS.

edit: Asked a colleague about online courses; Programmed Introduction to MIPS Assembly Language was recommended. Looks well paced, and it even has little quizzes to test your understanding as you go.

Books though, I can help with:

For high-level theory and general architecture, the go-to book is Computer Organization and Design by Patterson & Hennessy. It uses MIPS for the examples and the circuit diagrams throughout. I think it's in its 5th edition. There are a few chapters at the end about where computer architecture is going and such (unrelated to MIPS) that change between editions. University libraries will definitely have this, possibly even public ones. This text has been the standard for college-level computer science courses on computer architecture for something in the ballpark of 20 years.

For practical coding, I'd recommend [See MIPS Run by Dominic Sweetman](https://smile.amazon.com/Morgan-Kaufmann-Computer-Architecture-Design/dp/0120884216). It's in its 2nd edition, which I haven't read, so I don't know if it offers anything more than the first. The [first edition](https://smile.amazon.com/Morgan-Kaufmann-Computer-Architecture-Design/dp/1558604103) can be had used for next to nothing. It's especially good if you're writing real MIPS assembly on Linux as opposed to writing it on a simulator.

u/laneLazerBeamz · 1 pointr/ComputerEngineering

We use vivado in school and they teach verilog. My impression is that VHDL is more of an industry standard, but I'm still a student so don't quote me on that. The way my university introduced digital logic was by having us start at logic gate level then use those modules to make state machines and last semester we made a MIPS processor.


Vivado (web pack should be free)
https://www.xilinx.com/products/design-tools/vivado.html

Here is the book we used for the processor
https://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269

u/zweischeisse · 1 pointr/jhu

These are what I had. Different professors might use different books, obviously.

u/ravenorl · 1 pointr/ucf

I had to look it up. It's the intro to computer architecture class.

Here's the textbook on Amazon without an affiliate link -- https://www.amazon.com/Computer-Organization-Design-MIPS-Architecture/dp/0124077269

Not because I think you're going to buy it on Amazon, but because you can "Look Inside" from that link.

It looks like they're still teaching from an old edition of that textbook. I would have converted to the RISC-V edition by now.

Sad.

That's a good textbook, though. The authors are legends (literally, look them up) in computer architecture. I actually prefer their other textbook (Computer Architecture: A Quantitative Approach), but it's not suited for an intro class.

Have fun.

u/throwdemawaaay · 1 pointr/AskComputerScience

https://www.amazon.com/Computer-Organization-Design-MIPS-Architecture/dp/0124077269

After that:

https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055

These authors are the foremost authorities in the field. The second book is *the* textbook for computer architecture. These are the people that invented RISC.

u/AsteriskTheServer · 1 pointr/learnprogramming

IMO the best book that gives a general overview of computer architecture is Computer Organization and Design by David A. Patterson and John L. Hennessy. That being said, it is a difficult book. However, it covers everything from how the memory hierarchy works, to virtual memory, to the datapath along which instructions are executed. Is this going to tell you everything you need to do pen testing and so forth? Not a chance.

u/hell_0n_wheel · 1 pointr/Python

All I can recommend are the texts I picked up in college. I haven't looked for any other resources. That being said:

http://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269

This text is really only useful after learning some discrete math, but is THE book to learn algorithms:

http://www.amazon.com/Introduction-Algorithms-3rd-MIT-Press/dp/0262033844

This one wasn't given to me in college, but at my first job. Really opened my eyes to OOD & software architecture:

http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612

u/MrGreggle · 1 pointr/AskMen

This book will explain how you go from a transistor to all of the basic components of a CPU: https://www.amazon.com/Digital-Design-RTL-VHDL-Verilog/dp/0470531088

This book will explain all kinds of advanced processor design and largely explain how the vast majority of modern processors work: https://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269/ref=sr_1_1?s=books&ie=UTF8&qid=1484689485&sr=1-1&keywords=computer+organization+and+design

There really aren't any prerequisites; you can just read those two books.

u/ChineseCracker · -1 pointsr/AndroidMasterRace

honest question: are you trolling?

I have no idea how you could've read the text that you've just quoted and still think that flash storage somehow works "like paper".


Let me break it down for you, so you can understand... ELI5:

flash storage are maaaaany different small disks
small disks can write veeeery fast
but small disks also get broken, after writing a looooot of times
hurts
cheap disks get broken faaaaast
expensive disks get broken sloooow
but, there is good news:
if one disk dies, others are still there to work
yaaay
but they are not as fast as before :'-(



And now ELI17 (which is probably your actual age):

You've just quoted that there are different types of flash storage that have different PE cycles (according to the manufacturer), so you understand that there is a quality difference in flash storage.

I think your problem is that you think 100,000 PE cycles lasts for 1000 years or something...

First of all: the flash storage you buy from the store (in any shape or form) isn't the same as what device manufacturers use as components for their devices. Especially when we're talking about embedded systems (you think they put a Samsung SSD inside their smartphones? They don't!), manufacturers want to save money and therefore usually go for the rock-bottom cheapest option they can get (just like they use the shittiest reject fruit to put into jelly).

And secondly: a PE cycle doesn't only happen when something gets deleted off the disk, like when you uninstall an app or delete a music file. An operating system that works entirely on flash storage is constantly writing and deleting files! Every time you leave an app in Android (and the onStop() method of the activity is called), the app gets cached on storage to free up RAM (DRAM).

Therefore you have hundreds or even thousands of delete operations every day, especially with TRIM enabled.
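
If you want to sanity-check claims like this, the endurance arithmetic is simple. Here's a back-of-the-envelope sketch in C (every number below is hypothetical; real ratings and write-amplification factors vary a lot by device):

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical numbers, just to show the arithmetic. */
    double capacity_gb = 32.0;    /* storage size                   */
    double pe_cycles   = 3000.0;  /* rated P/E cycles (cheap NAND)  */
    double gb_per_day  = 8.0;     /* data the OS writes per day     */
    double write_amp   = 4.0;     /* small writes cost whole blocks */

    double total_gb = capacity_gb * pe_cycles;  /* ideal total writes */
    double days     = total_gb / (gb_per_day * write_amp);
    printf("~%.1f years until rated wear-out\n", days / 365.0);
    return 0;
}
```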


-----

Don't be a retard. Either fully read the links I've provided, or just stop arguing about something you know absolutely nothing about. Don't just Ctrl+F through an article to find things that vaguely sound like they'd support your argument. And stop claiming that flash storage is like paper. I'm usually a big fan of analogies, but I have no idea where you pulled this one from, especially because it doesn't make any sense.

If you're actually interested in this subject matter... I purchased this book when I was studying computer science:

http://www.amazon.com/Computer-Organization-Design-Fifth-Edition/dp/0124077269/ref=dp_ob_title_bk (this is the newest edition, I have an older one)

It's a pretty good point of reference for understanding how computer hardware works. It is very dry, though.