#20 in Computers & technology books
Reddit mentions of The Elements of Computing Systems: Building a Modern Computer from First Principles
Sentiment score: 51
Reddit mentions: 93
We found 93 Reddit mentions of The Elements of Computing Systems: Building a Modern Computer from First Principles. Here are the top ones.
Buying options
View on Amazon.com
- MIT Press MA
Specs:
Color: Navy
Height: 9 inches
Length: 8 inches
Number of items: 1
Release date: January 2008
Weight: 1.32 pounds
Width: 0.6875 inches
Here's my list of the classics:
General Computing
Computer Science
Software Development
Case Studies
Employment
Language-Specific
C
Python
C#
C++
Java
Linux Shell Scripts
Web Development
Ruby and Rails
Assembly
We did something similar as well. The labs were tons of fun. I remember having to run a couple dozen lines of code through the CPU cache on a test once, including some sneakery of using code as data at one point. I do appreciate having done it, but I'm not sure how much practical lasting value that really contributed.
That said, for those who are interested in this there is The Elements of Computing Systems: Building a Modern Computer from First Principles, more commonly known as "NAND to Tetris".
Petzold's Code is excellent as well.
Edit: Actually, while I've suggested those two, let me throw Computer Systems: A Programmer's Perspective into the mix. It's a book we used across two courses and I really enjoyed it. We used the 2nd edition (and I have no issue recommending people get a cheaper, used copy of that), but there is a 3rd edition now. Being a proper textbook it's stupidly priced (you can get Knuth's 4-book box set for $30 more), but it's a good book.
Anyone have suggestions similar to that Computer Systems text? I've always wanted to revisit/re-read it, but could always use a different perspective.
Sorry to see this getting downvoted. Read the about page to get an idea of why /u/r00nk made the page.
I have to agree with one of the other comments that it is way too terse at the moment. I remember when we learnt about e.g. D-latches in school, and it was a lot more magical and hard to wrap your head around at first than the page gives credit for. That AND, OR, and XOR gates can be built up from just NAND gates (the only logic gate properly explained) is also glossed over. Either go over it, or don't show the interiors of the other logic gates.
The interactive stuff is really neat. Good work on that.
Edit: If anyone reading wants to learn this stuff in more detail, two good books are
Edit, 9 hours later: Just so people don't think I'm bitching that this post is only 84% upvoted, it was struggling at below zero points and 42% upvoted when I first commented.
Along those same lines I can recommend: The Elements of Computing Systems. With the book as a guide you build a whole computer, starting with NAND to build the basic logic gates, and finishing by writing a simple OS.
I just finished reading Code: The Hidden Language of Computer Hardware and Software and will state unequivocally that this book is the most satisfying read I've experienced. It starts with flashlights blinking through windows, moves to Morse code, introduces electrical relays and demonstrates how they can be connected to form logic gates, then uses those gates to construct an ALU/counter/RAM and multiplexors. It goes on to describe the development of an assembly language and the utilization of input and output devices.
This book can be described as knowledge hose flooding the gaps in my understanding of computer hardware/software at an extremely enjoyable pace. It may help satisfy your interest in the concepts and technology that led to modern computers. Check out the reviews for more info.
If you haven't already studied logic gates in depth in your formal education, I would suggest using a logic simulator to actually build the combinational logic structures. I now feel very comfortable with logic gates and have a strong understanding of their application in computing from my time spent building the described logic.
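In that spirit, here's a small Python sketch (not any particular simulator — the function names are just for illustration) that builds a half-adder out of NAND-derived gates and checks it exhaustively against ordinary integer addition:

```python
from itertools import product

def nand(a, b): return 1 - (a & b)

# XOR from four NANDs, AND from two — the classic constructions.
def xor(a, b):
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

# A half-adder: sum = a XOR b, carry = a AND b.
# Verify against integer addition for every possible input.
for a, b in product((0, 1), repeat=2):
    s, carry = xor(a, b), and_(a, b)
    assert 2 * carry + s == a + b
print("half-adder verified")
```

Exhaustive checking like this is feasible for small combinational circuits and is a good habit before wiring gates together in a real simulator.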
I went through the book very slowly, rereading chapters and sections until I felt confident that I understood the content. I cannot recommend this book enough.
After reading CODE, I have been working through The Elements of Computing Systems: Building a Modern Computer from First Principles. If you are looking to gain a better understanding of the functions of hardware components, this is the book to read. The book's companion site http://www.nand2tetris.org has the first chapters free, along with the entire open-source software suite used in the book's projects. Starting from Nand gates in a hardware description language, you build every logic gate and every part of a computing system, all the way up to a modern high-level language of your own design: your programs compile, via a compiler you designed, into an assembly language you specified, which is turned into binary that runs on a processor you built from Nand gates and flip-flops. This book was very challenging before reading CODE; now I feel like I'm simply applying everything I learned in CODE, with even more detail. For somebody who hasn't attended college for computing yet, this has been a life-changing experience.
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319
http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686
Thanks ;). I'm not so skilled on that and my advice might be misleading, though I do have a background in CS. This would be my suggestion for someone beginning.
You can also search for those books pdf by using google hacks eg
filetype:pdf "title of the book here"
or
intitle:index.of "title of the book here"
You might want to start by looking here : http://www.nand2tetris.org/
It's an "online class" based on this book http://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686
They start with simple logical units (In the first chapter you'll build NOT, AND, OR, XOR from NAND) and end with Tetris.
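That first chapter's idea can be sketched in ordinary Python (a toy model, not the book's HDL): every gate below is built only out of a two-input NAND.

```python
def nand(a, b):
    """The primitive gate: outputs 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    # De Morgan: a OR b == NOT(NOT a AND NOT b)
    return nand(not_(a), not_(b))

def xor_(a, b):
    # True when at least one input is 1 but not both.
    return and_(or_(a, b), nand(a, b))

# Print the full truth table for all four derived gates.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", not_(a), and_(a, b), or_(a, b), xor_(a, b))
```

Seeing every gate reduce to NAND is the whole point of project 1: one primitive is functionally complete.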
To get from transistors to NAND check this out http://www.cs.bu.edu/~best/courses/modules/Transistors2Gates/
Without the affiliate link.
This book:
Elements of Computing Systems - https://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/ref=sr_1_1?ie=UTF8&qid=1518118693&sr=8-1&keywords=elements+of+computing+systems
Will do what you are looking to do.
Riding on your top post coattails...
The Elements of Computing Systems and Code by Charles Petzold are exactly what you want.
Code goes through number systems, basic information theory, circuits (from gates on up), memory, machine code and programming languages, all with accessible diagrams and explanations.
TECS has you build an actual working computer from the ground up.
The other guys are right that they are multiplexers.
Check out The Elements of Computing Systems if you really want to learn from the ground up. I've just started reading it, and by the second chapter you have to figure out how to make an ALU using the logic gates made in the previous chapter from a NAND gate, including 16-bit multiplexers.
Amazon Link
Book Website
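For a rough idea of what those multiplexers do, here's a minimal Python sketch (not the book's HDL; names are illustrative) of a 1-bit mux and the 16-bit variant the book has you build:

```python
def mux(a, b, sel):
    """2-way multiplexer: returns a when sel == 0, b when sel == 1."""
    return (a & (1 - sel)) | (b & sel)

def mux16(a_bits, b_bits, sel):
    """16-bit variant: one mux per bit position, sharing one select line."""
    return [mux(a, b, sel) for a, b in zip(a_bits, b_bits)]

a = [1] * 16
b = [0] * 16
assert mux16(a, b, 0) == a  # sel=0 passes the first bus through
assert mux16(a, b, 1) == b  # sel=1 passes the second bus through
```

Inside the ALU, muxes like these are what let control bits choose between inputs and operations.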
If you're interested in reading up on this, I highly recommend https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319
Or for a more hands-on approach: https://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/
The link to the book is;
http://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/ref=sr_1_1?ie=UTF8&s=books&qid=1217450690&sr=8-1
or uk amazon
http://www.amazon.co.uk/Elements-Computing-Systems-Building-Principles/dp/0262640686/ref=sr_1_1?ie=UTF8&s=books&qid=1217450662&sr=8-1
the title is Elements of Computing Systems: Building a Modern Computer from First Principles.
It looks like a really interesting book. Shame I have banned myself from buying books until I get through some more of the ones sitting on my shelf.
This is awesome! I've been slowly getting more and more interested in hardware, and this is something I would absolutely love to do. I just don't know where to start.
I've been reading a couple of books about learning lower level stuff, and planned on working my way up.
I'd really like to get out of webdev and into low-level programming, or even hardware design and implementation. There's sooooo goddamn much to learn, that I doubt I'll be ready without getting a BS in Comp. Engineering, and maybe a master's as well.
(I'm absolutely a beginner, and if anyone is interested in the books I've been reading, these are they:
The Elements of Computing Systems, aka Nand2Tetris. The book's blurb (from Amazon):
>In the early days of computer science, the interactions of hardware, software, compilers, and operating systems were simple enough to allow students to see an overall picture of how computers worked. With the increasing complexity of computer technology and the resulting specialization of knowledge, such clarity is often lost. Unlike other texts that cover only one aspect of the field, The Elements of Computing Systems gives students an integrated and rigorous picture of applied computer science, as it comes into play in the construction of a simple yet powerful computer system.
>
>Indeed, the best way to understand how computers work is to build one from scratch, and this textbook leads students through twelve chapters and projects that gradually build a basic hardware platform and a modern software hierarchy from the ground up. In the process, the students gain hands-on knowledge of hardware architecture, operating systems, programming languages, compilers, data structures, algorithms, and software engineering. Using this constructive approach, the book exposes a significant body of computer science knowledge and demonstrates how theoretical and applied techniques taught in other courses fit into the overall picture.
>
>Designed to support one- or two-semester courses, the book is based on an abstraction-implementation paradigm; each chapter presents a key hardware or software abstraction, a proposed implementation that makes it concrete, and an actual project. The emerging computer system can be built by following the chapters, although this is only one option, since the projects are self-contained and can be done or skipped in any order. All the computer science knowledge necessary for completing the projects is embedded in the book, the only pre-requisite being a programming experience.
>
>The book's web site provides all tools and materials necessary to build all the hardware and software systems described in the text, including two hundred test programs for the twelve projects. The projects and systems can be modified to meet various teaching needs, and all the supplied software is open-source.
Check out TECS, it's the definitive course on that. You start with logic gates and build your way up to an ALU, then a CPU, an assembler, a compiler, an OS and a game :)
I have no idea, but apparently the interviewer did. If I am not mistaken, The Elements of Computing Systems: Building a Modern Computer from First Principles had a good explanation of things like that.
You could go through the projects in:
The Elements of Computing Systems
http://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/ref=sr_1_1?ie=UTF8&qid=1449165576&sr=8-1&keywords=elements+of+computing+systems
I think I learned more about Computer Science from these projects than many of the books I've read.
The projects are made for self study and include:
Designing the hardware for a basic computer
Building an assembler for your hardware
Building a compiler for an object oriented language
Building a virtual machine to run programs
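To give a flavor of the assembler project, here's a toy Python sketch that translates the simplest Hack instruction form, an @-constant A-instruction, into 16-bit binary. The real project also handles C-instructions, labels, and symbols; this fragment is just a taste.

```python
def assemble_a_instruction(line):
    """Translate a Hack-style A-instruction like '@21' into a
    16-bit binary string: a leading 0, then the value in 15 bits."""
    assert line.startswith("@"), "A-instructions begin with '@'"
    value = int(line[1:])
    assert 0 <= value < 2 ** 15, "constant must fit in 15 bits"
    return format(value, "016b")

print(assemble_a_instruction("@21"))  # -> 0000000000010101
```

Each project in the book is like this: a well-specified translation from one layer's language to the next layer down.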
That edition is just out of print. Here's a link to a much cheaper copy.
http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686/ref=tmm_pap_title_0
I have an internship lined up, but I'm really excited to be learning outside of that too. You should check out The Elements of Computing Systems by Nisan and Schocken. I'm going to be working through this book throughout spring semester and the summer. I think it will provide a foundation for every low-level part of CS and help fill in some gaps that I'm missing.
If you're excited about web dev, you could make a website with Ruby on Rails, Django, Flask, Node, Meteor, etc. There's always another good web framework that you could learn.
If you're into system programming, programming languages, or compilers, there are tons of great tutorials and guides online. I'm currently working through Learn C: Build Your Own Lisp. I'm really looking forward to doing Implementing a Language with LLVM. If you didn't already know, LLVM was started here!
If you haven't finished core math yet, there's Linear Algebra on Khanacademy. I think Salman Khan is one of the best teachers I've had. The videos are very concise and very clear. There's also a great series on ML on YouTube. It explains the theoretical underpinnings of the algorithms, but doesn't really show how to use them. If you want to use them, your best bet is the Python library scikit-learn.
For reverse engineering, here's a fantastic challenge site, and here's a good book that you can view online.
There's so much to do, and not enough time to do it! If you constantly work on a few things, little by little, it will all start to accumulate. Good luck and have fun this summer!
I would personally recommend this book. (There's also an associated course available on edX, I believe? You can also check out the book's site at www.nand2tetris.org)
If you're trying to get into the emulator development scene, my recommendation would be to start writing an emulator for a simple system (such as the Game Boy or NES). I, too, wanted to read computer architecture books before I started my own emulator, and it turned out to be a huge waste of time. I already knew 80%+ of the material in the early chapters, but I insisted on slogging through anyway. But when I'm doing boring stuff like that, I make terrible progress on anything.
Maybe you and I learn differently, but to learn efficiently I need some kind of direction to keep me motivated and moving along. Sifting through pages and pages of dense textbook material without an end goal or a particular concept I'm trying to understand, and instead hoping that I'll just absorb general information, doesn't work out. Much of what you read won't stick with you, and you'll space out a lot (or at least I did).
Once I started actually working on my Game Boy emulator, my productivity shot through the roof. I was making so much more progress in learning and programming when I had actual, tangible goals to achieve. So, if you have the programming ability, start doing that sooner rather than later and you'll learn what you need to know as you move along. Start by emulating the memory, then the cpu, then the rest of the stuff. If you're actually interested, I can provide you with some links to help you get started.
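As a rough illustration of that "memory first" advice, here's a minimal Python sketch of a read-only memory bus. The address ranges follow the commonly documented Game Boy memory map, but everything else (class and method names, the 0xFF fallback for unmapped regions) is just for illustration, not any real emulator's API:

```python
class Memory:
    """Toy memory bus: routes each read to the right region,
    the first piece you'd build when starting an emulator."""
    def __init__(self, rom):
        self.rom = rom                   # cartridge ROM, 0x0000-0x7FFF
        self.vram = bytearray(0x2000)    # video RAM,     0x8000-0x9FFF
        self.wram = bytearray(0x2000)    # work RAM,      0xC000-0xDFFF

    def read(self, addr):
        if addr < 0x8000:
            return self.rom[addr]
        if 0x8000 <= addr < 0xA000:
            return self.vram[addr - 0x8000]
        if 0xC000 <= addr < 0xE000:
            return self.wram[addr - 0xC000]
        return 0xFF  # regions this sketch doesn't model

mem = Memory(rom=bytes([0x00] * 0x8000))
assert mem.read(0x0000) == 0x00   # falls in the ROM region
assert mem.read(0xFE00) == 0xFF   # not modeled here
```

Once reads and writes dispatch correctly, the CPU loop on top of it is mostly a big switch over opcodes.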
----------------------------------------------
Also if you've already taken an intro to digital design course, then you won't learn anything from https://www.amazon.com/dp/0262640686/ref=cm_sw_su_dp.
That's kind of beside the point of this post, but is nonetheless awesome, and you posting that does let me use it as a springboard to say these two things:
>Jonathan Blow, the creator of Braid is making a compiler
For the interested, the link to watch that is here.
>I know nothing about making compilers, but it was awesome to watch
Compilers *are* cool, and if you want to make a simple one for fun, try it as the last part of this project. As for needing to know anything before you do it, I went through that book at the same time I was taking my first college CS course, and I managed alright. By the end of it you have a complete platform, from a machine you make from scratch (NAND) up to a compiler for an object-oriented language that runs on the machine you designed yourself.
The first thing you have to realize is that computers are the most complex systems ever produced by humans: not only the multiple levels of hardware, but also the large, complex software systems required just to start one up in Windows.
The core design idea is that every layer of hardware and software on a computer is a form of translation, changing one formal language into another.
At the very bottom are the laws of physics, which allow us to arrange semiconductors and other materials to build a small device (a transistor) that encodes negated-AND logic on wires carrying electric impulses. Making large circuits of exclusively NAND gates allows you to create any complex computer logic you want. In theory you can use anything that encodes such logic; you could make a computer that works with billiard balls. Transistors are just the currently most economic way of building one.
The layer above the bare circuits is the design of those circuits. You cannot just wire them arbitrarily; there are physical and economic limits. The RAM modules stuck in your motherboard are a design tradeoff: DRAM allows engineers to use fewer transistors per encoded bit, but access is slower. The multiple levels of 'caches' on your CPU are also RAM, but encode every bit with 8 or more transistors for absolute speed. Early on it was thus seen as economically interesting to design different modules separately and have them communicate in some way.
> So, I want to know all about the CPU, RAM
A certain design became popular in the 1950's: the Von Neumann architecture. This is a simple design for a computer with roughly three modules: control, arithmetic and memory. A computer program is basically a language, with its own formal words and verbs. All those words and verbs are somewhere in the memory module of the computer. A specific memory location will always hold the next instruction or verb that the computer has to process. The control module thus continuously retrieves the next instruction, that instruction will ... well instruct the control module what it needs to do. Typically an instruction will make the control module put some words in memory and kick the arithmetic module into action that will transform those words into something else.
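That fetch-execute loop can be illustrated with a toy Python machine (the instruction names here are entirely hypothetical, not any real ISA), where program and data share one memory as in the Von Neumann design:

```python
def run(memory):
    """Toy Von Neumann machine: one accumulator, one program counter,
    and a control loop that just fetches, decodes, and executes."""
    acc, pc = 0, 0
    while True:
        op, arg = memory[pc]        # fetch the next instruction
        pc += 1
        if op == "LOAD":            # acc <- memory cell `arg`
            acc = memory[arg]
        elif op == "ADD":           # acc <- acc + memory cell `arg`
            acc += memory[arg]
        elif op == "STORE":         # memory cell `arg` <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program in cells 0-3, data in cells 4-6: compute 4 + 5 into cell 6.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None), 4, 5, 0]
print(run(memory)[6])  # -> 9
```

Because instructions live in the same memory as data, a program could in principle rewrite itself, which is exactly the "code as data" trick mentioned in an earlier comment.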
Over the years with miniaturization, more and more discrete modules were added to the control chip to make it go faster. We call this now typically the CPU, but it's really a lot of multiple integrated modules that are really good at doing one thing.
> how the computer (software?) communicates with those components.
How do the multiple components communicate with each other? By sending electric signal over wires. How does it know to send what signals? Software controls what signals are sent over what wires. When a company makes a computer component, it will include drivers, which are basically pieces of software that control how it communicates with the CPU and the rest of your computer. This is true for the GPU, your hard-drive, audio chip and every component on your motherboard. A lot of modules communicate through a known protocol, such as the BIOS on your motherboard that starts up your computer. Others are company secrets or just too complex to make standard.
> and GPU, how they work; can use the GPU? Are the streaming units kind of like CPU cores?
The GPU is still essentially a computer, but it started out with a completely different architecture. First up, everything a GPU does can be done by a CPU, and for many years it was. The only reason companies like 3dfx started selling discrete GPU modules is speed. By taking advantage of the specific way graphics are computed, GPU companies were able to design chips that could complete graphics computations a lot faster than the CPU. At first GPUs were basically fixed implementations of tasks that had previously been done in software, baked into circuits: you pushed it data and poked some registers to trigger certain graphics operations, but there were no arbitrary programs made from instructions like on the CPU. NVIDIA introduced programmable shaders on the GeForce 3 in 2001, which were basically little programs that could only use certain limited instructions. The restrictions on those programs grew less and less, and companies kept adding more and more of the little processors that execute them. These days there is only a limited amount of fixed-hardware functionality left on GPUs. They have basically become like a massively parallel CPU, only the design tradeoffs are completely different.
CPU vs GPU. In its most basic form, a CPU is made for fast reaction and fast sequential execution. GPUs are slow, lumbering beasts, but can do many tasks at once, as long as those tasks don't need to cooperate too much. Using an analogy: when you need to move a thousand boxes, a CPU is a nimble sprinter that will run really fast back and forth for each box; the GPU is a big fat professional mover that will load everything on a lorry and move only once.
Obviously, I'm lying here. Everything is grossly simplified because reality is so much more complex. The basics can be really simple: a CPU talks to a GPU using wires. Similarly, your GPU or CPU talks to your screen using electric wires. That is probably not the insight you were looking for, though. But it is on very basic principles that a computer is built up, through layers of abstraction in both hardware and software. Some layers are simply unknown to us; we don't really know how NVIDIA makes your CPU talk to your GPU. Sometimes you don't want to know, as long as there is a common language or interface that shields you from the gritty details.
If you want to know essentially how computers work, look through the recommendations in this thread. I would just want to add to that list Elements of Computing Systems for hardware and Structure and Interpretation of Computer Programs for software.
Check out this book: https://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686
It is heavy reading, but the reading is short and you'll spend most of your time implementing the projects, which are non-trivial but very fun and rewarding IMO. For lighter reading, check out CODE by Charles Petzold. It reads like a novel and requires no CS background but will leave you with a better understanding of the hardware/software stack than most CS grads have.
Maybe you should read The Elements of Computing Systems: Building a Modern Computer from First Principles. With this book you will learn how to build a computer from scratch. Watch the talk by prof. Shimon Schocken at Google here.
My second choice is, of course, Structure and Interpretation of Computer Programs. Videos here
I don't know if it would be too complicated, but there's the book The Elements of Computing Systems: Building a Modern Computer from First Principles (for R$ 99 on Amazon).
It shows how to build a "computer", including an OS, bootloader, etc., that runs on a processor you also build virtually. And to build that processor you start from the most basic logic gates and work your way up.
It's not exactly "logic for dummies", but it looks quite rewarding. (It's still on my wishlist.)
Data Structures & Algorithms is usually the second course after Programming 101. Here is a progression (with the books I'd use) I would recommend to get started:
Edit: If you're feeling adventurous then after those you should look at
Obviously the biggest gripe with Magento is Speed.
That said, I recall Alan Storm mentioning that performance was not a target during initial development; flexibility and developer friendliness would be the key to market penetration, as that is obviously what helped make it as popular as it is.
Fast forward, and now we are all having to deal with that initial disregard for performance. We install layers upon layers of caching and indexing to squeeze as much out of it as we can.
Personally I find Magento over-architected. It feels like spaghetti code at times, except it's just a big massive plate of lasagna now. Everything has to go through hundreds of layers to build out a simple request.
With that said.
Wrong:
Right:
What I'm NOT looking forward to in Magento2:
After using Magento since version .6b (and X-Cart for years before that), I've slowly come to the realization that the majority of implementations seem to come from the Java world.
http://i.qkme.me/3u7vuq.jpg
I don't know if it's because BASIC or even C isn't part of the curriculum at universities anymore and they just drop you into Java, but I wish people would take more time to figure out the basic principles FIRST before saying Java, PHP, C, etc. is better. They all have their pluses and minuses. But understanding the underlying layers generally helps you understand how your PHP code is working inside the machine. I don't think this is taught anymore, or no one cares. For those who want it, I'd suggest picking up http://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686 for a good read.
With that said, anyone who thinks they can rebuild a better Magento from scratch, I salute you; I realize such a task would require years to achieve, and by then the train may have already left the station. Varien, who were a consulting company before Magento, realized the many pitfalls of OSCommerce for their clients. Hence why we have Magento today.
Disclaimer: These are just my opinions, which are a lot like butts we all have one and they all stink. ;)
The speaker wrote this as a book, which was actually published in 2005.
Bought my own copy some time ago, definitely recommended!
Don't start with the dragon book as someone else suggested - it's too much for someone new. I'd suggest reading The Elements of Computing Systems (which you can also read online). It starts from the ground up, but in the end has you creating a "high level" compiled language.
Start here:
http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686
This is an incredibly broad and complex set of questions.
Here is a good video describing why there are so many programming languages.
Wikipedia has multiple pages comparing programming languages. Here is the beginning.
Students will spend years in school learning about the different layers of abstraction in programming and how code gets turned into something the computer will understand. This website along with the companion book is an excellent overview of the subject.
If you have more specific questions after perusing through the resources, I can answer them. The links the other poster and I have posted will give you a high-level overview of what you asked, but if you want all the details, you'd be halfway to a computer science bachelors degree.
Further reading/research: (Not all of which I've gotten to read yet. Some of which may be quite tangentially relevant to the discussion at hand along with the books and sites I mentioned above. Consider this more a list of books pertaining to the history of technology, machining, metrology, some general science and good engineering texts.)
Dan Gelbart's Youtube Channel
Engineerguy's Youtube Channel
Nick Mueller's Youtube Channel
mrpete222/tubalcain's youtube channel
Tom Lipton (oxtools) Youtube Channel
Suburban Tool's Youtube Channel
NYCNC's Youtube Channel
Computer History Museum's Youtube Channel
History of Machine Tools, 1700-1910 by Steeds
Studies in the History of Machine Tools by Woodbury
A History of Machine Tools by Bradley
Tools for the Job: A History of Machine Tools to 1950 by The Science Museum
A History of Engineering Metrology by Hume
Tools and Machines by Barnard
The Testing of Machine Tools by Burley
Modern machine shop tools, their construction, operation and manipulation, including both hand and machine tools: a book of practical instruction by Humphrey & Dervoort
Machine-Shop Tools and Methods by Leonard
A Measure of All Things: The Story of Man and Measurement by Whitelaw
Handbook of Optical Metrology: Principles and Applications by Yoshizawa
Angle of Attack: Harrison Storms and the Race to the Moon by Gray
Machine Shop Training Course Vol 1 & 2 by Jones
A Century of Electrical Engineering and Computer Science at MIT, 1882-1982
Numerical Control: Making a New Technology by Reintjes
History of Strength of Materials by Timoshenko
Rust: The Longest War by Waldman
The Companion Reference Book on Dial and Test Indicators: Based on our popular website www.longislandindicator.com by Meyer
Optical Shop Testing by Malacara
Lost Moon: The Perilous Voyage of Apollo 13 by Lovell and Kluger
Kelly: More Than My Share of It All by Johnson & Smith
Skunk Works: A Personal Memoir of My Years at Lockheed by Rich & Janos
Unwritten Laws of Engineering by King
Advanced Machine Work by Smith
Accurate Tool Work by Goodrich
Optical Tooling, for Precise Manufacture and Alignment by Kissam
The Martian: A Novel by Weir
Roark's Formulas for Stress and Strain by Young, Budynas & Sadegh
Materials Selection in Mechanical Design by Ashby
Slide Rule: The Autobiography of an Engineer by Shute
Cosmos by Sagan
Nuts, Bolts, Fasteners and Plumbing Handbook by Smith. Carol Smith wrote a number of other great books, such as Engineer to Win.
Tool & Cutter Sharpening by Hall
Handbook of Machine Tool Analysis by Marinescu, Ispas & Boboc
The Intel Trinity by Malone
Manufacturing Processes for Design Professionals by Thompson
A Handbook on Tool Room Grinding
Tolerance Design: A Handbook for Developing Optimal Specifications by Creveling
Inspection and Gaging by Kennedy
Precision Engineering by Evans
Procedures in Experimental Physics by Strong
Dick's Encyclopedia of Practical Receipts and Processes or How They Did it in the 1870's by Dick
Flexures: Elements of Elastic Mechanisms by Smith
Precision Engineering by Venkatesh & Izman
Metal Cutting Theory and Practice by Stephenson & Agapiou
American Lathe Builders, 1810-1910 by Cope. As mentioned in the above post, Kenneth Cope did a series of books on early machine tool builders. This is one of them.
Shop Theory by Henry Ford Trade Shop
Learning the Lost Art of Hand Scraping: From Eight Classic Machine Shop Textbooks. A small collection of articles combined in one small book. Lindsay Publications was a smallish company that would collect, reprint or combine public-domain source material related to machining and sell it at reasonable prices. They retired a few years ago and sold what rights and materials they had to another company.
How Round Is Your Circle?: Where Engineering and Mathematics Meet by Bryant & Sangwin
Machining & CNC Technology by Fitzpatrick
CNC Programming Handbook by Smid
Machine Shop Practice Vol 1 & 2 by Moltrecht
The Elements of Computing Systems: Building a Modern Computer from First Principles A fantastic book with tons of free online material, labs, and courses built around it. This book could take an interested 6th grader and teach them the fundamentals from scratch, through designing a basic computer processor, programming a simple OS, etc.
Bosch Automotive Handbook by Bosch
Trajectory Planning for Automatic Machines and Robots by Biagiotti & Melchiorri
The Finite Element Method: Its Basis and Fundamentals by Zhu, Zienkiewicz and Taylor
Practical Treatise on Milling and Milling Machines by Brown & Sharpe
Grinding Technology by Krar & Oswald
Principles of Precision Engineering by Nakazawa & Takeguchi
Foundations of Ultra-Precision Mechanism Design by Smith
I.C.S. Reference Library, Volume 50: Working Chilled Iron, Planer Work, Shaper and Slotter Work, Drilling and Boring, Milling-Machine Work, Gear Calculations, Gear Cutting
I. C. S. Reference Library, Volume 51: Grinding, Bench, Vise, and Floor Work, Erecting, Shop Hints, Toolmaking, Gauges and Gauge Making, Dies and Die Making, Jigs and Jig Making
and many more ICS books on various engineering, technical and non-technical topics.
American Machinists' Handbook and Dictionary of Shop Terms: A Reference Book of Machine-Shop and Drawing-Room Data, Methods and Definitions, Seventh Edition by Colvin & Stanley
Modern Metal Cutting: A Practical Handbook by Sandvik
Mechanical Behavior of Materials by Dowling
Engineering Design by Dieter and Schmidt
Creative Design of Products and Systems by Saeed
English and American Tool Builders by Roe
Machine Design by Norton
Control Systems by Nise
That doesn't include some random books I've found when traveling and visiting used book stores. :)
Take a look at these:
Intro to Logic Design
and
Elements of Computing Systems
Elements of Computing Systems is a really good book to learn from the ground up.
I have yet to read it myself, but I hear good things about The Elements of Computing Systems: Building a Modern Computer from First Principles.
This video is related. If you have an hour to spare I'd really recommend it.
Edit:
As a point of interest the minecraft computer is based on the "Hack machine" described in the book.
I was discussing computers with a friend. I'm a Computer Engineer and learned all about the design and ground-up construction of computers, from the principles of electronics up through logic gates, operating systems and programming, as part of my college curriculum. He said he read through this book and found it very insightful, and we were able to have a good, proper discussion about it. I'm not sure if linking is allowed, but here it is on Amazon.
http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686
It's The Elements of Computing Systems: Building a Modern Computer from First Principles by Noam Nisan.
From there if you're further interested you'll know which topics might be more interesting to you and you'll be able to better investigate it. It's really a lot of material, and after 4 years of school, I know quite a bit about it, but people tend to specialize due to the sheer amount of material there is to learn about each specific topic. Lifetimes have been devoted to singular aspects of computer design so don't feel bad if some of it is overwhelming.
You know how you wish you could relive the experience of your first time watching your favorite movie or listening to a mind-blowing song -- that's how I feel about reading Code and The Elements of Computing Systems. I'm a self-taught programmer with impostor syndrome who's spent a career playing catch-up to formally trained comp sci-ers. When I first read these books, they utterly blew my mind.
Complete this book (don't just read it, do everything).
The Elements of Computing Systems: Building a Modern Computer from First Principles
I was interested in the same topic, I purchased this book:
http://www.amazon.com/gp/product/0262640686
I haven't gone all the way through it yet, but it helps you learn where assembly code came from, and the low-level interactivity between hardware and software.
Elements of Computing Systems: Building a Modern Computer From First Principles is a great introduction to a wide range of computer science topics. It covers principles from the bottom up, and is a good place to start for things like computer architecture and programming language design. It doesn't cover things like networking or algorithms/data structures, but if you're interested in systems, this is a great book.
You should be able to find the book on Amazon.
Here. It's not called nand2tetris.
Edit: I was a little unclear. The book is not called nand2tetris, but the associated courses are/were. You can go to nand2tetris.org to find the authors' website and links to their Coursera courses. You really don't need the courses. The book is great.
This is an excellent book, I enjoyed reading it. I would also recommend the Elements of Computing Systems.
As others have said, K&R is a great introduction to C, but not a great introduction to programming/computer science. I think more people should try C as their first language, as it gives the student a better idea of what the computer is actually doing than high-level languages do. I wish I had a modern book I could refer you to for learning C as a first language, but I am out of the loop; however, I have heard great things about Harvard's free online course, Introduction to Computer Science, which uses C (and some other languages).
As far as learning how to be a better programmer, I think one of the key things is to 1) strive to understand what is happening under the hood. 2) Break large problems into smaller ones 3) Logically order the operations needed to complete the tasks that solve the problem, 4) Learn multiple programming languages.
Some tips for becoming a better programmer
Strive to understand what the computer is doing when you execute your program
Understanding what the compiler/interpreter is doing with your source code, how the processor executes the binary and how information is stored/accessed in memory will help you write more efficient code. The C language is great for learning these things once you start to wrap your mind around it. I would also recommend learning computer organization and hardware. One book I found that really helped me learn what a computer does is The Elements of Computing Systems: Building a Modern Computer from First Principles. I would recommend a casual reading of it; don't get too hung up if you don't quite 'get' it. Read it and see what sinks in. After you get better at C and maybe learn another language, come back to this book and read it again with closer scrutiny.
Break large problems into smaller ones. Logically order the operations needed to complete the tasks that solve the BIG problem
Before I write a single line of code I will spend days, weeks or even months just planning the program out. Flow charts, lists and pseudocode are your friends. Identify your large problem, think about all the different steps needed to get there, and write them all down. Determine what you need to do to complete each step. Determine if you are doing the same task multiple times (I keep writing data to this log file, I keep checking to see if this array is full, etc.); if so, you need a function for that. Write this all down as a human-readable list of steps. Once you think you have solved the big problem, start coding the small stuff. Write small programs that complete each step you identified and test each little program. Once you've written all those little programs, put the pieces together. Check out How to Think Like A Programmer. It's an excellent book in this area.
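The "repeated task becomes a function" advice above might look like this in Python. This is a toy sketch with hypothetical logging and batch-checking tasks, purely for illustration:

```python
def log(message):
    """One small, testable piece: every log line goes through here."""
    print(f"[log] {message}")

def is_full(items, capacity):
    """Another small piece: the repeated 'is this array full?' check."""
    return len(items) >= capacity

def process(records, capacity=100):
    """The big problem, written as an ordered sequence of the small steps."""
    batch = []
    for record in records:
        batch.append(record)
        if is_full(batch, capacity):
            log(f"batch full at {len(batch)} records, flushing")
            batch.clear()
    return batch  # whatever didn't fill a final batch
```

Each helper can be written and tested on its own first; `process` then reads like the human-readable list of steps described above.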
Learn multiple programming languages
Again, stick with C until some things are really clicking for you. Eventually though you need to learn another language or two before the "thinking like a programmer" will really sink in. The more languages you learn, the easier it is to learn even more languages. You will begin to see the patterns in languages. You will notice the different approaches that different programming paradigms take. There is a reason that nearly every book, course or tutorial on learning a language follows a very similar trajectory: What datatypes exist in this language? How do you declare a variable of a particular type? How do you output text to the screen, cast a variable to a different type, use arrays, write If/Then/Else, write loops? These are things (nearly) every language has, and they are the first steps to learning how to work with that language.
Hope some of this helps!
Remember that CS is as much about the physical computer as anything else. Computers are made possible by multiple layers of abstraction. They begin with semiconductors and boolean logic, then machine language, assembly language and the compiler/linker, leading up to high-level languages like Python and C. Computers organize memory into a hierarchy to balance the access time and capacity of different types of memory (cache, DRAM, hard drive).
In addition, the current trend seems to be focused on multi-core, parallel systems, since engineers can't get enough performance improvement just by implementing pipelines or faster clock cycles.
So that's that. If you enjoy this realm of CS (it's more computer engineering, to be precise), you should read these books. This knowledge will "expand", as you put it, your understanding of computing systems in general.
http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686/ref=pd_sim_b_27
http://www.amazon.com/Computer-Organization-Design-Revised-Edition/dp/0123747503/ref=sr_1_1?ie=UTF8&qid=1344489171&sr=8-1&keywords=hennessy+computer+organization
Elements of Computing Systems. Any lower level than that and you're moving closer to analog circuit territory.
LondonPilot has an excellent explanation.
If you want everything explained in a book, look at Code by Charles Petzold. It's basically an ELI5 Computers through all the layers of abstraction, literally starting with binary.
If you want a hands on approach, check out The Elements of Computing Systems. It will take you through building the gates that LondonPilot is talking about to building Memory, ALUs, and even a CPU. It continues on to assembly language programming, writing a compiler, writing a high level language, etc.
These books combined are really a large chunk of what you should know with a BS in Computer Science.
https://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/
Build a simple system from scratch :)
You might want to look at The Elements of Computing Systems, also known as "nand2tetris", It will start you out with nothing more than a
NAND
logic gate and you'll build the whole stack, from ALU to CPU to compiler to simple OS. Quite fun and kept reasonably simple.

I'm a self taught programmer, so I don't know what CS degrees entail, but I highly recommend the book Code and also another one called The Elements of Computing Systems.
The former pretty much teaches you how a computer physically works and the latter teaches you how to build a processor and then write an OS for it. After reading those two books you pretty much know how computers work at every level of abstraction. I think that's the way programming should be taught.
You might want to check out something like this book to make it all come together.
Io non sono molto un tipo da libri, ma tra i miei preferiti avevo salvato :
you could start here. or here.
Another good place to start: http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686
It is not just a text, it's a course that walks through building a virtual CPU using Hardware Description Language and a Hardware Emulator, and then coding Tetris on top of it.
It starts with a single NAND gate and goes on from there.
Actually to give you some better advice try this:
https://www.amazon.co.uk/Elements-Computing-Systems-Building-Principles/dp/0262640686
It's available free online, but I've linked Amazon as I don't know the free-download sites. I bought a hard copy.
This is a university course written by the book's professor authors. You start with basic binary logic and end up writing a basic operating system. It will teach you how a computer works from the ground up, as you basically build your own virtual CPU in software and write a compiler for it.
Yes, it's really hardcore, but you don't need to do the whole thing in a few weeks. Stretch it out over 5 years while you learn on the job.
The point is, it teaches you HOW a computer works. With that understanding, everything else becomes a lot easier. It grounds the concepts into you that are fundamental to everything, from hardware to software to networking. All of it makes much more sense when you have the foundation to understand it.
Others I'd recommend:
Any of Peter Norton's books on the IBM PC from the 80's. You'd think they would be irrelevant by now, but we still use IRQs on modern hardware, even if it's far abstracted by the OS now.
I'm a systems architect now, after 19 years in the industry, but pulling the books from my shelf that got me here:
Peter Norton: Inside the IBM PC and PS/2
Peter Norton: Programmers guide to the IBM PC
Brian Kernighan/Rob Pike: The Unix Programming Environment
Wil Allsopp: Unauthorised Access
Street/Nabors/Baskin: Dissecting the Hack
Gerald Weinberg: The psychology of computer programming.
Oh and anything written by Kevin Mitnick
You want to read Charles Petzold's book Code: The hidden language of Computer Hardware and Software. This is another good one.
Got THIS, it's amazing.
It starts out with the fundamental principles of logic gates and transistors, moving through memory addresses and basic assembly, through to making your own compiler and a basic OS. Each chapter has a nice little simulator program you can download and use together with a number of exercises, to help you consolidate what you've learnt.
And no, I didn't write it! Or sell it for a living. Nor am I in any way affiliated with anybody having anything to do with the production or sale of this book. I just read it. It's great.
Read this book. The first half, in which it shows you how to design and instantiate (in an included simulator) flip-flops, registers, an ALU and RAM, are extremely enlightening.
The second half is about software, so I didn't bother to continue. But the first half is well worth the price of admission. Oh, related to /u/panda_burgers' comment below, this is the book for the NAND2Tetris course. But their site is throwing malware warnings at the moment.
The two starting books that gave me a great deal of understanding on systems (which I think is one of the toughest things to grasp and CLRS and the Art of Programming have already been mentioned):
[Computer Systems: A Programmer's Perspective] (http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040/ref=sr_1_2?ie=UTF8&qid=1407529949&sr=8-2&keywords=systems+computer)
This along with its labs served as a crash course in how the system works, particularly a lot about assembly and low-level networking.
The Elements of Computing Systems: Building a Modern Computer from First Principles
I've mostly only done the low-level stuff but it is the most fun way I have found to learn starting all the way at gate architecture. It pairs well if you have read Petzold's Code. A great introduction to the way computers work from the ground up.
Senior Level Software Engineer Reading List
Read This First
Fundamentals
Development Theory
Philosophy of Programming
Mentality
Software Engineering Skill Sets
Design
History
Specialist Skills
DevOps Reading List
Maybe too light for anyone with a CS/CE degree, but I'm presently enjoying The Elements of Computing Systems.
I consider myself C competent but I've never done any assembly programming. I ordered these two books today to supplement any internet resources I might come across:
Code: The Hidden Language of Computer Hardware and Software
The Elements of Computing Systems: Building a Modern Computer from First Principles
One of those (or maybe both) were mentioned in some forum (or maybe on Reddit) in reference to preparing to learn the language.
I am reading a book about this right now. If you really want to learn more, I highly recommend it! One of the most enlightening books I've ever read.
http://www.amazon.com/gp/product/0262640686/ref=oh_details_o03_s00_i00?ie=UTF8&psc=1
I really liked this:
https://www.amazon.co.uk/Elements-Computing-Systems-Building-Principles/dp/0262640686/ref=sr_1_1?ie=UTF8&qid=1526995864&sr=8-1&keywords=elements+of+computer+systems&dpID=51h%252BQaiDsvL&preST=_SX218_BO1,204,203,200_QL40_&dpSrc=srch
I found it helped me put a lot of the theory I learned in my CompSci degree together, and helped me make more sense of it all.
It uses AND and OR logic.
I suggest you read this book and you'll be able to find out a lot about the internals of a CPU
If you have the time/resources, take a 2 year diploma program specializing in programming. It's what I did (BA in History - staring at the same future of awful administrative type jobs). The program I took through a local tech college focussed on how to code, the basics of the math behind it, and some extra courses for basic management, accounting and communication (i.e. tech writing), and project management - all key skills in software development. All in all, it was a great program and got me way further than I could have done on my own.
What it didn't do is give me a solid foundation for the underlying concepts in Computer Science - basically how exactly a CPU works, how a compiler works, some of the more abstracted mathematics behind computational logic. While you don't necessarily need this knowledge to have a career coding, I feel it holds me back - so I'm trying to bolt it on through self learning and am finding it a huge challenge.
But for basic computer science, if you're serious, check out this book: The Elements of Computing Systems - http://www.amazon.ca/Elements-Computing-Systems-Building-Principles/dp/0262640686/
The Learn X the Hard Way are great primers on coding - I use them to teach myself languages I don't use in work (Python, C).
Other books to read include Code Complete 2, The Pragmatic Programmer and Clean Code - while these texts are largely for people already familiar with software development, I do recommend them to all beginner (and experienced) coders as they contain extremely handy insights into the complicated beast that is software development, as a craft.
Finally, the most important thing you can do if you want to learn coding is to CODE. Practice is the best way to learn. Wanna figure out how something works? Don't read about it, do it. Make hundreds of throw away programs tinkering with concepts. Coding is a craft - only way you're going to get better is by actually doing it.
seconded, book and coursework
If you're looking to get further into the subject, you might like The Elements of Computing Systems by Noam Nisan and Shimon Schocken. I have not read it myself, but it is meant to be very good. The companion website is: http://www.nand2tetris.org/.
Check out: http://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/ref=sr_1_fkmr0_1?ie=UTF8&qid=1292467785&sr=1-1-fkmr0
Anyone interested in a very basic book that guides you through building a computer should check out Nisan and Schocken's Elements of Computing Systems
I picked this book up on a recommendation from someone on youtube who had built an ALU in Minecraft and was delighted to find just how easy it was to get a good understanding of the basics of computer architecture.
First note that Career/Job/Market is quite different from Knowledge/Intellectual satisfaction. So you have to keep "earning money" separate from "gaining knowledge" but do both in parallel. If you are one of the lucky few who has both aligned in a particular job, you have got it made. Mostly that is never the case and hence you have to work on your Motivation/Enthusiasm and keep hammering away at the difficult subjects. There are no shortcuts :-)
I prefer Books to the Internet for study since they are more coherent and less distracting, allowing you to focus better on a subject. Unless a newer edition is required, buy used/older editions to save money and build a large library. So here is a selection from my library (in no particular order);
I had never heard of But How Do It Know?, thank you for bringing it to my attention. From a related link, another title The Elements of Computing Systems: Building a Modern Computer from First Principles got good reviews as well.
https://www.nand2tetris.org/course
https://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/ref=ed_oe_p
It's a bit too late to reply here, but you should checkout the book NAND2Tetris, The Elements of Computing Systems[1]. It's a really good book to help you understand the basics of computers in a simple manner.
[1] http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686/ref=sr_1_1?ie=UTF8&qid=1413824881&sr=8-1&keywords=nand2tetris
So I looked at the course you mentioned, and I'm not seeing where it falls short for your goals.
I haven't read this, but it looks like an excellent architecture book, comes with software to implement all projects on. It also says it includes hardware projects. You'll just need to buy a breadboard and a bunch of logic gates, probably.
"This book guides the reader on a journey from the basics of boolean logic and elementary gates through CPU design, assembly, virtual machines, high level languages, compilers and operating systems. How can such a task be accomplished in one 300-page volume? Simple - you do most of the work yourself. The relatively short chapters introduce each concept and suggest an approach to implementation. The reader is then given a project to complete and test. Intimidated by assembly language? You probably won't be after you've written a symbolic assembler. Confused by compilers? Imagine how you'll feel when you realize you've created one for a simple (but completely usable) high-level language." link to the Amazon review
"The Elements of Computing Systems" by Noam Nisan and Shimon Schocken
http://www1.idc.ac.il/tecs/
http://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686
If we're recommending books to help grasp the concept, I'd like to recommend this one that's meant for use with a free online course called Nand2Tetris. It walks you through the building of your own 16-bit computer from the ground-up, teaching you many of the ins and outs of what makes a computer tick. /u/reeecheee suggested Code: The Hidden Language of Computer Hardware and Software by Charles Petzold, which is also an excellent read, but if you're more of a hands-on learner, try this:
The Elements of Computing Systems: Building a Modern Computer from First Principles, by Noam Nisan
http://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/
I can say I actually understood how CPUs work after reading this book.
I only heard of the project you mentioned from your post, so I can't comment on the quality or coverage, but you said you are looking for more resources, so here's my suggestion.
There is a book called "The Elements of Computing Systems" that I worked with a couple of years ago. They start at an even lower level than CS from the Bottom up, it seems, but don't nearly cover all the OS mechanisms. I'd say both projects complement each other well.
Basically, they introduce a very easy to learn hardware description language and then have you first build important parts of the hardware. You start out by building simple logic gates, then an ALU, RAM and so on. Then you start to actually program this machine. This all happens inside a neat little interpreter/emulator.
The two 2-star-reviews on amazon are not wrong, though: this is only an introduction and if you want to go really deep, you need additional literature, but TEoCS covers a lot of ground in a practical way and is a really good introduction to the topic.
The Elements of Computing Systems: Building a Modern Computer from First Principles
If you don't do anything else, buy this book and go through the course exercises. You will do it all from first principles. Build virtual hardware. Implement an OO language. Write an OS.
http://www.hackszine.com/blog/archive/2008/03/from_nand_to_tetris_in_12_step.html
I'm kind of in the same position as you, OP. Thinking of getting CLRS, New Turing Omnibus, The Elements of Computing Systems, and Algorithmics!
So excited.
For an n-bit adder, just chain together a few full adders by linking the carry output (Cout) into the carry in of the next adder like this
To make the adder subtract, use the two's complement method: invert the bits of input B and hold the initial carry-in input high (A - B = A + ~B + 1) :)
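As a rough sketch of the chaining and the two's-complement trick (in Python rather than actual gates, with bit lists stored least-significant first, purely as an illustration):

```python
def full_adder(a, b, cin):
    """One-bit full adder: returns (sum bit, carry-out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_add(a_bits, b_bits, cin=0):
    """n-bit ripple-carry adder: each stage's Cout feeds the next stage's Cin."""
    out, carry = [], cin
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry

def ripple_sub(a_bits, b_bits):
    """A - B via two's complement: invert B's bits and hold the initial Cin high."""
    return ripple_add(a_bits, [1 - b for b in b_bits], cin=1)
```

A final carry-out of 1 from `ripple_sub` just means no borrow occurred; in fixed-width arithmetic it is discarded.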
For more advanced "computers", I prompt you to buy The Elements of Computing Systems - it's a fantastic book that will take you through everything from basic combinational and sequential logic (like adders) all the way to a full-blown computer, complete with compiler, operating system and programming-language virtual machine, known as the "Hack platform".
You can even read some of it online here!
Good luck!
Like /r/falafel_eater says, check into nand2tetris. Here is the companion book: The Elements of Computing Systems: Building a Modern Computer from First Principles.
This is really good stuff. The book uses Java, but you can use another language (I used python). It starts by implementing a nand gate, and using that to implement and, or, xor gates. These gates are used to implement simple chips, then simple cpus. Further abstractions include a simple machine language, assembler, and finally a compiler for a high level language and a simple graphics library that is used to implement a Tetris game. Hence, nand2tetris.
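The NAND-to-gates progression described above is easy to sketch in Python (an illustrative sketch, not the book's own HDL):

```python
def nand(a, b):
    """The single primitive; everything else is composed from it."""
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    # De Morgan: a OR b == NOT(NOT a AND NOT b) == NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

def xor_(a, b):
    # Classic four-NAND construction of XOR
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))
```

Checking the four truth tables against these helpers verifies, in spirit, the book's first project.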
This stuff is awesome. You will have a great time!
Further things you can look at (sorry, I am too lazy to provide links) are 'Bebop to the Boolean Boogie' and 'How Computers do Math', both by Clive Maxfield, and 'Code' by Charles Petzold.
Here's a couple books I have recommended in the past. Code by Charles Petzold is a nice intro to what your computer is doing deep down. It may be too basic for you but it's actually a fun read. The second book is The Elements of Computing Systems: Building a Modern Computer from First Principles. I've had this book for a while and just haven't taken the time to read and work my way through all of it. It's basically a course, it walks you through basically building an emulator for a fictional computer, the HACK computer. I found out about that book because there was a guy who read it and then made a working computer in minecraft. Crazy stuff.
http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686
This text book goes through building a computer starting at logic gates and going all the way to building a CPU and writing a compiler. It might take a while to get through, but after you do you will have a really good understanding of how computers work.
Get a spare and start playing with it. Don't ever play around with your main machine.
If you really want to know how computers work, I recommend two books: Code by Charles Petzold, and The Elements of Computing Systems by Nisan and Schocken. Read them in that order.
As for the rest, you're just going to have put the time in. I got a Commodore 64 when I was eight years old. I got a modem for my twelfth birthday. I got my first PC at sixteen. I've put in at least an hour a day on a computer ever since, and I'm 39 and don't have a computer-related job.
> This book must exist.
Think about how big a book this would be for a bit. The closest you'll get is this book, but of course that's nowhere close to 'everything'.
Code is also a good book. If you want textbooks:
This is a good one for computer architecture. And there's a companion course/website at www.nand2tetris.org. https://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/ref=ed_oe_p
I like this one as far as operating systems go: https://www.amazon.com/gp/aw/d/1118063333/ref=dp_ob_neva_mobile
For learning programming, I would check out courses at www.udemy.com. But be mindful of ratings & reviews because the quality of courses can vary pretty drastically. But I've had good experiences there. www.udacity.com also has great courses. They offer paid nanodegrees but you can take every individual course free of charge. www.teamtreehouse.com is another good website
If you're interested in networking, this is a good book for starters: https://www.amazon.com/gp/aw/d/0768685761/ref=dp_ob_neva_mobile
Any A+/Network+ certification books or courses would also be a great way to learn networking and computer hardware
Those are pretty big topics in tech & computer science. There's a ton of stuff to learn. I've been studying this stuff for probably 2-3 years and sometimes I feel like I've barely scratched the surface. Let me know if that helps & if there are other topics you'd want book recommendations on! :)
It sounds like you can't admit what you don't know during your problem-solving process. If you see sample code with strange syntax, you can always break it down by running parts of it in an interactive interpreter. (Python is recommended for this.)
Also, perhaps you want to start from the ground up.
http://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686/
There's a video on Google Video with the authors describing the intent of the book.
Perhaps your course was bad, rather than microprocessors? Purchase and read The Elements of Computing Systems: Building a Modern Computer from First Principles. It's only $25, and if you can't afford that, most of the materials for the book are available for free online.
After Gödel, Escher, Bach: An Eternal Golden Braid, it's one of my favorite books ever. I already knew a lot about programming, but this helped me understand how circuits, timing, and compilers worked in a very nice way.
My favorite CS book. Reading this made it all click into place for me. Here is a link to the original book on Amazon: https://www.amazon.com/dp/0262640686/ref=cm_sw_r_fm_apa_aEzPAb1SG38F7.
Computer Engineering, all else is trash. You should probably do something in your free time first though, you'll learn a lot more if you've given yourself a head start. I also suggest this book to get a general theory of how computers work. Once you understand how an assembler works, you can begin to dive into how software is made.