Reddit mentions: The best computer hardware & diy books
We found 1,470 Reddit comments discussing the best computer hardware & diy books. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 449 products and ranked them based on the amount of positive reactions they received. Here are the top 20.
1. Compilers: Principles, Techniques, and Tools (2nd Edition)
- O'Reilly Media
Features:
Specs:
Height | 9.52754 Inches |
Length | 6.49605 Inches |
Number of items | 1 |
Weight | 3.2848877038 Pounds |
Width | 1.73228 Inches |
2. Patterns of Enterprise Application Architecture
Features:
Specs:
Height | 9.4 Inches |
Length | 7.7 Inches |
Number of items | 1 |
Weight | 2.43831261772 pounds |
Width | 1.6 Inches |
3. Clean Architecture: A Craftsman's Guide to Software Structure and Design (Robert C. Martin Series)
Features:
Specs:
Height | 9 Inches |
Length | 0.8 Inches |
Number of items | 1 |
Release date | September 2017 |
Weight | 1.4109584768 Pounds |
Width | 6.9 Inches |
4. Operating Systems Design and Implementation (3rd Edition)
Features:
Specs:
Height | 9.55 Inches |
Length | 7.6 Inches |
Number of items | 1 |
Weight | 0.220462262 Pounds |
Width | 2.45 Inches |
5. High Speed Digital Design: A Handbook of Black Magic
Features:
Specs:
Height | 9.6 Inches |
Length | 7.35 Inches |
Number of items | 1 |
Weight | 1.9180216794 Pounds |
Width | 1.2 Inches |
6. Compilers: Principles, Techniques, and Tools
Specs:
Height | 9.5 Inches |
Length | 6.75 Inches |
Number of items | 1 |
Weight | 2.61468242732 Pounds |
Width | 1.5 Inches |
7. Thing Explainer: Complicated Stuff in Simple Words
- Houghton Mifflin Harcourt
Features:
Specs:
Height | 13 Inches |
Length | 9 Inches |
Number of items | 1 |
Release date | November 2015 |
Weight | 1.8 Pounds |
Width | 0.585 Inches |
8. Computer Organization and Design: The Hardware/Software Interface (The Morgan Kaufmann Series in Computer Architecture and Design)
Specs:
Height | 9 Inches |
Length | 7.25 Inches |
Number of items | 1 |
Weight | 3.48109911698 Pounds |
Width | 1.75 Inches |
9. Building Microservices: Designing Fine-Grained Systems
- O'Reilly Media
Features:
Specs:
Height | 9.19 Inches |
Length | 7 Inches |
Number of items | 1 |
Weight | 1.04499112188 Pounds |
Width | 0.59 Inches |
10. The Design and Implementation of the FreeBSD Operating System (2nd Edition)
- Addison-Wesley Professional
Features:
Specs:
Height | 9.55 Inches |
Length | 6.75 Inches |
Number of items | 1 |
Weight | 3.527396192 Pounds |
Width | 2.15 Inches |
11. Structured Computer Organization (5th Edition)
Features:
Specs:
Height | 9.25 Inches |
Length | 7.25 Inches |
Number of items | 1 |
Weight | 2.9101018584 Pounds |
Width | 1.5 Inches |
12. Programming Arduino: Getting Started With Sketches
Specs:
Height | 8 Inches |
Length | 5.5 Inches |
Number of items | 1 |
Weight | 0.4519476371 Pounds |
Width | 0.25 Inches |
13. Understanding Digital Signal Processing (3rd Edition)
Specs:
Height | 1.4 Inches |
Length | 9.2 Inches |
Number of items | 1 |
Weight | 3.747858454 Pounds |
Width | 7.1 Inches |
14. But How Do It Know? - The Basic Principles of Computers for Everyone
Features:
Specs:
Height | 9 Inches |
Length | 6 Inches |
Number of items | 1 |
Weight | 0.6503636729 Pounds |
Width | 0.5 Inches |
15. Advanced Compiler Design and Implementation
Features:
Specs:
Height | 9.75 Inches |
Length | 7.75 Inches |
Number of items | 1 |
Weight | 3.60014873846 Pounds |
Width | 1.75 Inches |
16. Structured Computer Organization (6th Edition)
- Used Book in Good Condition
Features:
Specs:
Height | 9.2 Inches |
Length | 7 Inches |
Number of items | 1 |
Weight | 2.4471311082 Pounds |
Width | 1.2 Inches |
17. The Scientist & Engineer's Guide to Digital Signal Processing
- Used Book in Good Condition
Specs:
Height | 0 Inches |
Length | 0 Inches |
Number of items | 1 |
Weight | 0 Pounds |
Width | 0 Inches |
18. Making Embedded Systems: Design Patterns For Great Software
- O'Reilly Media
Features:
Specs:
Height | 9.19 Inches |
Length | 7 Inches |
Number of items | 1 |
Release date | November 2011 |
Weight | 1.18 Pounds |
Width | 0.76 Inches |
19. Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture
- Used Book in Good Condition
Features:
Specs:
Height | 9.25 Inches |
Length | 7 Inches |
Number of items | 1 |
Release date | December 2006 |
Weight | 1.67 Pounds |
Width | 1 Inches |
20. An Introduction to Quantum Computing
- Oxford University Press USA
Features:
Specs:
Height | 0.57 Inches |
Length | 9.2 Inches |
Number of items | 1 |
Release date | January 2007 |
Weight | 0.97444319804 Pounds |
Width | 6.24 Inches |
🎓 Reddit experts on computer hardware & diy books
The comments and opinions expressed on this page are written exclusively by redditors. To provide you with the most relevant data, we sourced opinions from the most knowledgeable Reddit users, based on the total number of upvotes and downvotes received across comments on subreddits where computer hardware & diy books are discussed. For your reference and for the sake of transparency, here are the specialists whose opinions mattered the most in our ranking.
> It’s hard to consolidate databases theory without writing a good amount of code. CS 186 students add features to Spark, which is a reasonable project, but we suggest just writing a simple relational database management system from scratch. It will not be feature rich, of course, but even writing the most rudimentary version of every aspect of a typical RDBMS will be illuminating.
>
> Finally, data modeling is a neglected and poorly taught aspect of working with databases. Our suggested book on the topic is Data and Reality: A Timeless Perspective on Perceiving and Managing Information in Our Imprecise World.
>
>
>
>
>
> ### Languages and Compilers
>
> Most programmers learn languages, whereas most computer scientists learn about languages. This gives the computer scientist a distinct advantage over the programmer, even in the domain of programming! Their knowledge generalizes; they are able to understand the operation of a new language more deeply and quickly than those who have merely learnt specific languages.
>
> The canonical introductory text is Compilers: Principles, Techniques & Tools, commonly called “the Dragon Book”. Unfortunately, it’s not designed for self-study, but rather for instructors to pick out 1–2 semesters’ worth of topics for their courses. It’s almost essential, then, that you cherry-pick the topics, ideally with the help of a mentor.
>
> If you choose to use the Dragon Book for self-study, we recommend following a video lecture series for structure, then dipping into the Dragon Book as needed for more depth. Our recommended online course is Alex Aiken’s, available from Stanford’s MOOC platform Lagunita.
>
> As a potential alternative to the Dragon Book we suggest Language Implementation Patterns by Terence Parr. It is written more directly for the practicing software engineer who intends to work on small language projects like DSLs, which may make it more practical for your purposes. Of course, it sacrifices some valuable theory to do so.
>
> For project work, we suggest writing a compiler either for a simple teaching language like COOL, or for a subset of a language that interests you. Those who find such a project daunting could start with Make a Lisp, which steps you through the project.
>
>
>
> [Compilers: Principles, Techniques & Tools](https://teachyourselfcs.com//dragon.jpg) [Language Implementation Patterns](https://teachyourselfcs.com//parr.jpg)
>
> > Don’t be a boilerplate programmer. Instead, build tools for users and other programmers. Take historical note of textile and steel industries: do you want to build machines and tools, or do you want to operate those machines?
>
> — Ras Bodik at the start of his compilers course
>
>
>
>
>
> ### Distributed Systems
>
> As computers have increased in number, they have also spread. Whereas businesses would previously purchase larger and larger mainframes, it’s typical now for even very small applications to run across multiple machines. Distributed systems is the study of how to reason about the tradeoffs involved in doing so, an increasingly important skill.
>
> Our suggested textbook for self-study is Maarten van Steen and Andrew Tanenbaum’s Distributed Systems, 3rd Edition. It’s a great improvement over the previous edition, and is available for free online thanks to the generosity of its authors. Given that distributed systems is a rapidly changing field, no textbook will serve as a trail guide, but Maarten van Steen’s is the best overview we’ve seen of well-established foundations.
>
> A good course for which some videos are online is MIT’s 6.824 (a graduate course), but unfortunately the audio quality in the recordings is poor, and it’s not clear if the recordings were authorized.
>
> No matter the choice of textbook or other secondary resources, study of distributed systems absolutely mandates reading papers. A good list is here, and we would highly encourage attending your local Papers We Love chapter.
>
>
>
> [Distributed Systems 3rd edition](https://teachyourselfcs.com//distsys.png)
>
>
>
> ## Frequently asked questions
>
> #### What about AI/graphics/pet-topic-X?
>
> We’ve tried to limit our list to computer science topics that we feel every practicing software engineer should know, irrespective of specialty or industry. With this foundation, you’ll be in a much better position to pick up textbooks or papers and learn the core concepts without much guidance. Here are our suggested starting points for a couple of common “electives”:
>
> - For artificial intelligence: do Berkeley’s intro to AI course by watching the videos and completing the excellent Pacman projects. As a textbook, use Russell and Norvig’s Artificial Intelligence: A Modern Approach.
> - For machine learning: do Andrew Ng’s Coursera course. Be patient, and make sure you understand the fundamentals before racing off to shiny new topics like deep learning.
> - For computer graphics: work through Berkeley’s CS 184 material, and use Computer Graphics: Principles and Practice as a textbook.
>
> #### How strict is the suggested sequencing?
>
> Realistically, all of these subjects have a significant amount of overlap, and refer to one another cyclically. Take for instance the relationship between discrete math and algorithms: learning math first would help you analyze and understand your algorithms in greater depth, but learning algorithms first would provide greater motivation and context for discrete math. Ideally, you’d revisit both of these topics many times throughout your career.
>
> As such, our suggested sequencing is mostly there to help you just get started… if you have a compelling reason to prefer a different sequence, then go for it. The most significant “pre-requisites” in our opinion are: computer architecture before operating systems or databases, and networking and operating systems before distributed systems.
>
> #### Who is the target audience for this guide?
>
> We have in mind that you are a self-taught software engineer, bootcamp grad or precocious high school student, or a college student looking to supplement your formal education with some self-study. The question of when to embark upon this journey is an entirely personal one, but most people tend to benefit from having some professional experience before diving too deep into CS theory. For instance, we notice that students love learning about database systems if they have already worked with databases professionally, or about computer networking if they’ve worked on a web project or two.
>
> #### How does this compare to Open Source Society or freeCodeCamp curricula?
>
> The OSS guide has too many subjects, suggests inferior resources for many of them, and provides no rationale or guidance around why or what aspects of particular courses are valuable. We strove to limit our list of courses to those which you really should know as a software engineer, irrespective of your specialty, and to help you understand why each course is included.
>
> freeCodeCamp is focused mostly on programming, not computer science. For why you might want to learn computer science, see above.
>
> #### What about language X?
>
> Learning a particular programming language is on a totally different plane to learning about an area of computer science — learning a language is much easier and much less valuable. If you already know a couple of languages, we strongly suggest simply following our guide and fitting language acquisition in the gaps, or leaving it for afterwards. If you’ve learned programming well (such as through Structure and Interpretation of Computer Programs), and especially if you have learned compilers, it should take you little more than a weekend to learn the essentials of a new language.
>
> #### What about trendy technology X?
>
> (continues in next comment)
The resource seems extensive enough that it should serve you plenty on the way to becoming a good software engineer. I hope you don't get exhausted by it. I understand that some people can "hack" the technical interview process by memorizing a plethora of computer science and software engineering knowledge, but I hope you pay close attention to the important theoretical topics.
If you want a list of books to read over the summer to build a strong computer science and software engineering foundation, then I recommend to read the following:
The general theme of this list of books is to teach a hierarchy of abstract solutions, techniques, patterns, heuristics, and advice that can be applied across software engineering to solve a wide variety of problems. I believe a great software engineer should never be blocked by the availability of tools. Tools come and go, so I hope software engineers build strong problem-solving skills, grounded in computer science theory, and become the people who can create the next big tools. Nonetheless, a software engineer should not reinvent the wheel by recreating solutions to well-solved problems; rather, a great software engineer can be the one to invent the wheel when a problem is not well-solved by the industry.
P.S. It's also a lot of fun being able to create the tools everyone uses; I had a lot of fun by implementing Promises and Futures for a programming language or writing my own implementation of Cassandra, a distributed database.
I've posted this before but I'll repost it here:
Now in terms of the question that you ask in the title - this is what I recommend:
Job Interview Prep
Junior Software Engineer Reading List
Read This First
Fundamentals
Understanding Professional Software Environments
Mentality
History
Mid Level Software Engineer Reading List
Read This First
Fundamentals
Software Design
Software Engineering Skill Sets
Databases
User Experience
Mentality
History
Specialist Skills
In spite of the fact that many of these won't apply to your specific job, I still recommend reading them for the insight they'll give you into programming language and technology design.
>I do have a textbook called "C: A modern approach" by King, but like I said before, I think it focuses more on the coding aspect.
Most books that focus on C are going to be about learning the language. If you want to learn low level stuff, you need to find books that focus on them (and they'll usually incidentally use C). The language itself is quite small and minimalistic in what it can do. Most heavy handed things like networking and GUIs require interaction with the OS.
Eg, if you wanted to do networking, you could use the Windows API or the POSIX socket API (POSIX being the standards that *nix systems follow -- and certain versions of Windows). Or you could use a higher level library like curl for cross platform support (and a wealth of nicer features).
>Can somebody please guide me on where to start?
Firstly, as much of a linux fanboy I am, I do want to make sure you know that you don't need to use Linux for any of the other things you wanted to learn (low-level programming, command lines, networking, etc). In fact, my OS class mostly used Linux, but we started out with a project using Windows threads (I guess the prof wanted us to see the difference from POSIX threading).
All that said, I do think Linux is something you'd want to learn and that a lot of low level things just seem more natural in Linux. But I'm biased. Linux fanboy, remember?
I'd start with downloading a Linux OS. Doesn't really matter which. I'd recommend going with Ubuntu. It's the most popular, easiest to find help with, and seems to be what most web servers are running, to boot. You can play around with the GUI for a bit if you want. It won't feel that different. Modern OSes sort of converged into the same high level ideas.
My favourite book for getting into the command line, while ever so slightly touching the low-level aspects of OSes, is Mark Sobell's A Practical Guide to Linux Commands, Editors, and Shell Programming. It includes some basic knowledge of Linux, but mostly focuses on the command line. This is very useful, because not only is the command line very practical to learn, but you'll end up learning a lot about Linux in the process (eg, by learning how everything is a file, how pipes work, etc). And the command line is arguably a super big part of Linux, anyway. It makes sense as the first step.
Now, for the next step, you need to know C very well. So finish with your class, first. Read ahead if you have to. Yes, you already know if statements and functions and all, but do you understand pointers well? How about function pointers and void pointers? Do you understand how C's arrays work and the usage of pointer arithmetic? How about how arguments are passed to functions and when you'd want to pass a pointer to a function instead? As a rough skill testing question, you should implement a linked list for arbitrary data types with functions such as prepending, appending, concatenating lists, searching, removing, and iterating through the list. Make sure that your list can be allocated and freed correctly (no memory leaks).
Anyway, the next step is to learn OSes. Now, I said OSes and not Linux, because the Linux OS is a bit constrained if you want to learn low level programming (which would include a knowledge of what OSes in general do, and alternatives to OSes like Linux). But never fear, pretty much any OS book will heavily use Linux as an example of how things work and consequently explain a great deal of Linux internals. I can't recommend a class because mine was a regular university class, but Tanenbaum's Modern Operating Systems is a good book on the subject.
In particular, you can expect an OS class to not merely be focused on building an OS yourself (my class worked on aspects of OS101 to implement portions of our own OS), but also on utilizing low level aspects of existing OSes. Eg, as mentioned, my class involved working with Linux threading, as well as processes. We later implemented the syscalls for `fork`, `join`, etc ourselves, which was a fascinating exercise. Nothing gets you to understand how Linux creates processes like doing it yourself.
Do note, however, that I had taken a class on computer architecture (I found Computer Organization and Design a good book there, although note that I never did any of the exercises in the book, which seem to be heavily criticized in the reviews). It certainly helps in understanding OSes. It's basically as low as you can go with programming (and a bit lower, entering the domain of computer engineering). I cannot say for sure if it's absolutely necessary. I would recommend it first, but it's probably skippable if you're not interested (personally, I found it phenomenally interesting).
For learning networking, Beej's book is well written. You don't need to know OSes before this or anything.
Here's my advice, as a recent grad who was first exposed to arduino in school two years ago.
Get a starter kit that has a nice amount of sensors, jumpers, resistors. Nothing worse than seeing a project online and realizing you'd have to make a trip to radio shack just for some 30 cent resistor.
Amazon - $125
Sparkfun - $60
Jameco - $99
These are all a little pricey, but if you have a decent amount of confidence that you'll stick with things, I think this is a good way to get started. You could get one of the cheaper starter kits, but pushing a button to light an LED is only impressive for like a second. After that you're going to want to start moving and sensing things and it's nice to already have that at your fingertips.
Word of advice on tutorials. If you're anything like me, the internet can be your best friend and worst enemy. There are so many tutorials for stuff like arduino with varying levels of quality. It can be super distracting to look through a long tutorial and then see 100 other things you might want to do. At this point, that's bad because you're just chasing after a cool project, not actually learning. I'd encourage you to commit to buying a book, plugging away through every single tutorial in it, and then looking online. You'll start to see quicker which projects you actually want to dive into when you know a bit more about the process.
That first kit from Amazon comes with a book that I'm sure is great. Here's the one we went through at school: Programming Arduino - $12
That said, I'd very strongly encourage you to do it. Save up some money, get one of those kits, and start learning! It's incredibly rewarding, and after even a few months you'll have projects lying around that will impress pretty much anyone who doesn't know what arduino is. I really wish I had started at your age. Good luck!
I started from scratch on the formal CS side, with an emphasis on program analysis, and taught myself the following starting from 2007. If you're in the United States, I recommend BookFinder to save money buying these things used.
On the CS side:
On the math side, I was advantaged in that I did my undergraduate degree in the subject. Here's what I can recommend, given five years' worth of hindsight studying program analysis:
Final bit of advice: you'll notice that I heavily stuck to textbooks and Ph.D. theses in the above list. I find that jumping straight into the research literature without a foundational grounding is perhaps the most ill-advised mistake one can make intellectually. To whatever extent that what you're interested in is systematized -- that is, covered in a textbook or thesis already, you should read it before digging into the research literature. Otherwise, you'll be the proverbial blind man with the elephant, groping around in the dark, getting bits and pieces of the picture without understanding how it all forms a cohesive whole. I made that mistake and it cost me a lot of time; don't do the same.
While being a self-taught sys admin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares. It's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sys admin, but it may quench your thirst. I'm both a programmer and unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.
OS Internals
You're obviously successful at running and maintaining unix-like systems, but how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System
Linux Kernel Development
Advanced Programming in the UNIX Environment
Networking
Learning the actual function of networking at the code level is really interesting. There's a whole other world below the implementation. You likely know a lot of this.
Computer Networks
TCP/IP Illustrated, Vol. 1: The Protocols
Unix Network Programming, Volume 1: The Sockets Networking API
Compilers/Low Level computer Function
Knowing how a computer actually works - from electricity, to EE principles, through assembly, to compilers - may also interest you.
Code: The Hidden Language of Computer Hardware and Software
Computer Systems: A Programmer's Perspective
Compilers: Principles, Techniques, and Tools
OK, a few things:
It looks like you're trying to build a shift/reduce parser, which is a form of an LR parser, for your language. LR parsers try to reduce symbols into more abstract terms as soon as possible. To do this, an LR parser "remembers" all the possible reductions that it's pursuing, and as soon as it sees the input symbols that correspond to a specific reduction, it will perform that reduction. This is called "handle finding".
> If I am correct, my Automaton is a DFA?
When the parser is pursuing a reduction, it's looking for sequences of symbols that match the right-hand sides of the relevant (to our current parse state) productions in our grammar. Since the right-hand sides of all the productions in a grammar are simple sequences, all the handle finding work can be done by a DFA. Yes, the handle recognizer of your parser is a DFA. But keep in mind that it needs to be combined with other parts to make a full parser, and your actual grammar can't be recognized with just a DFA.
In particular, you've shown the `ACTION` table for a shift/reduce parser. It determines what to do when you encounter a symbol in the input stream. But a shift/reduce parser typically needs a second table as well - the `GOTO` table - that determines what to do after a reduction has taken place.

One other thing that's worth mentioning: you've expressed your `ACTION` table as a plain DFA transition table. That's not necessarily wrong, but it's not commonly done that way. Instead of reducing when you reach a certain state, it's common to instead attach an action - either 'shift' or 'reduce' ('accept') - to each transition itself. So in a shift/reduce parser, your table might look more like this:

    | [   | ]   | <   | >   | id   | /   | attr
----+-----+-----+-----+-----+------+-----+--------
0 | S1 | | S4 | | | |
1 | | | | | S2 | | R3 : Reduce Tag -> [ id ]
2 | | R3 | | | | | R7 : Reduce Tag -> < id ??? / >
4 | | | | | S5 | S10 | R9 : Reduce Tag -> < id ??? >
5 | | | | R9 | | S6 | S8 R12 : Reduce Tag -> < / id >
6 | | | | R7 | | |
8 | | | | R9 | | S6 | S8
10 | | | | | S11 | |
11 | | | | R12 | | |
Note that `R7` and `R9` aren't well-formed, since multiple sequences of input tokens might cause you to reach these actions. While it would be possible to construct a shift / reduce parser this way, it's not commonly done. Typically, the DFA to recognize handles is an acyclic graph, but you have a self-transition in state 8.

> What would be the best way of implementing this automaton in C++? Do I really have to make a huge array?
In general, yes, you need a big array (or, as suggested before, two big arrays). But you can use any space-saving technique you want. For example, since most entries in the `ACTION` table are invalid, one could represent that data with a sparse array data structure. Also, both The Dragon Book and Cooper and Torczon briefly cover parser-specific ways to compress those tables. For example, notice that rows 5 and 8 in your example have the same entries. Most real grammars have multiple instances of identical rows, so factoring out this commonality can save enough space that the extra complexity is worth it.

---
I'm a little surprised that you're building a parser like this by hand, though. Typically people do one of two things: they either use a parser generator to build the tables and driver automatically, or they write a recursive descent parser by hand.
You're sort of doing a mix of the two, which means you have the downsides of both approaches. You need to track all the states and transitions by hand, instead of relying on tools to automate that process, yet you don't get the flexibility of a hand-coded recursive descent parser.
If you're doing this for education's sake, then by all means proceed. I'd highly encourage you to pick up a book on parsing; I think Cooper and Torczon is a great source. But if you just want a parser that works, I'd definitely recommend using a tool or using a more direct approach, like recursive-descent.
Okay, you're definitely at the beginning. I'll clarify a few things and then recommend some resources.
I feel like I've gone off on a few tangents, but just ask for clarification if you want. I'd be happy to point you towards other resources.
I did learn all of this stuff from experience. Honestly, I had a little bit of a tough time right out of college because I didn't have much practical circuit design experience. I now feel like I have a very good foundation for that, and it came through experience, learning from my peers, and lots of research. I have no affiliation with Henry Ott, but I treat his book like a bible. I refer to it just about every time I do a board design. Why? Because it's packed with this type of practical information. Here's his book. I bought mine used as cheap as I could. At my previous job, they just had one in the library. Either way, it was good to have around.
So why should you care about electromagnetic compatibility (EMC)? A couple reasons:
Anyways, it's definitely worth looking at and is a huge asset if you can follow those guidelines. Be prepared to enter the workforce and see rampant disregard for EMC best practices as well as rampant EMC problems in existing products. This is common because, as I said, it's not taught and engineers often don't know what tools to use to fix it. It often leads to expensive solutions where a few extra caps and a better layout would have sufficed.
A couple more books I personally like and use:
Howard Johnson, High Speed Digital Design (it's from 1993, but still works well)
Horowitz and Hill, The Art of Electronics (good for understanding just about anything, good for finding tricks and ideas to help you for problems you haven't solved before but someone probably has)
Last thing since I'm sitting here typing anyways:
When I first got out of college, I really didn't trust myself even when I had done extensive research on a particular part of a design. I was surrounded by engineers who also didn't have the experience or knowledge to say whether I was on the right path or not. It's important to use whatever resources you have to gain experience, even if those resources are books alone. It's unlikely that you will be lucky and get a job working with the world's best EE who will teach you everything you need to know. When I moved on from my first job after college, I found out that I was on the right path on many things thanks to my research and hard work. That ran contrary to what I'd believed before then, since my colleagues at my first job were never confident in our ability to "do EE the right way" - as in, the way engineers at storied, big companies like Texas Instruments and Google do it. Hope that anecdote pushes you to keep going and learning more!
Any engineering job is going to have a significant amount of domain knowledge that is specific to that company's products, services, or research. Getting an engineering degree is just the beginning. Once you get a job at a company, you will need to learn a shit load of new terms, IP, history, and procedures that are specific to that company. It's the next level of your education, and will take years to fully assimilate. School doesn't teach you anywhere near enough to walk into most engineering jobs and be independently productive. You are there to learn as much as do. The senior engineers are your teachers and gaining their knowledge and experience is the key to building a successful career. You need to look at them as a valuable resource that you should be taking every opportunity to learn from. If you don't understand what they are saying, then ask, take notes, and do independent research to fill in your knowledge gaps. Don't just dismiss what they say as techo-babble.
!!!!!! TAKE THIS TO HEART !!!!! - The single biggest challenge you will have in your engineering career is learning how to work well with your peers, seniors, and managers. Interpersonal skills are ABSOLUTELY critical. Engineering is easy: math, science, physics, chemistry, software, electronics... all of that is logical, learnable, and a piece of cake compared to dealing with the numerous and often quirky personalities of the other engineers and managers. Your success will be determined by your creativity, productivity, initiative, and intelligence. Your failure will be determined by everyone else around you. If they don't like you, no amount of cleverness or effort on your part will get you ahead. Piss off your peers or managers, and you will be stepped on, marginalized, criticized, and sabotaged. It's the hard truth about the work world that they don't teach you in school. You aren't going anywhere without the support of the people around you. You are much more likely to be successful as a moron that everyone loves than a genius that everyone hates. It sucks, but that's the truth.
You are the new guy, you have lots to learn, and that is normal and expected. It's going to be hard and frustrating for a while, but you will get the hang of it and find your footing. Learn as much as you can, and be appreciative for any help or information that you can get.
As for digitizing a signal, it is correct that you should stick with powers of 2 for a number of technical reasons. At the heart of the FFT algorithm, the signal processing is done in binary. This is part of the "Fast" in Fast Fourier Transform. By sticking with binary and powers of 2, you can simply shift or drop bits to multiply or divide by 2, which is lightning fast in hardware. If you use non-power-of-2 sizes or fractional sampling rates, the algorithm would need to do extensive floating-point math, which can be much slower for DSPs, embedded CPUs, and FPGAs with fixed-point ALUs. It's about the efficiency of the calculations on a given platform, not what is theoretically possible. Power-of-2 sizes are much more efficient to calculate with integer math for almost all digital signal processing.
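A quick Python illustration of why power-of-2 arithmetic is cheap (the shifts below are the same single-instruction operations the hardware performs):

```python
# Multiplying/dividing by 2 is a single bit shift -- cheap on any
# fixed-point ALU -- which is why radix-2 FFT sizes are so convenient.
x = 1024
assert x >> 1 == x // 2 == 512    # divide by 2 via shift
assert x << 1 == x * 2 == 2048    # multiply by 2 via shift

# Common bit trick to check that an FFT length is a power of 2:
def is_power_of_two(n):
    return n > 0 and (n & (n - 1)) == 0

assert is_power_of_two(4096)
assert not is_power_of_two(1000)
```

The `n & (n - 1)` test works because a power of 2 has exactly one bit set, so subtracting 1 flips every bit below it and the AND comes out zero.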
I highly recommend reading the book "The Scientist and Engineer's Guide to Digital Signal Processing" by Steven W. Smith. It is by far the best hand-holding, clearly-explained, straight-to-the-point, introductory book for learning the basics of digital signal processing, including the FFT.
You can buy the book from Amazon [here](https://www.amazon.com/Scientist-Engineers-Digital-Signal-Processing/dp/0966017633/ref=sr_1_1?ie=UTF8&qid=1492940980&sr=8-1&keywords=The+Scientist+and+Engineer%27s+Guide+to+Digital+Signal+Processing). If you can afford it, the physical book is great for flipping through and learning tons about different signal processing techniques.
Or you can download the entire book in PDF form legally for free here. The author is actually giving the book away for free in electronic form (chapter by chapter).
Chapter 12 covers FFTs.
I've been playing around with writing a programming language and compiler in my spare time for a while now (shameless plug: http://eigenstate.org/myrddin.html; source: http://git.eigenstate.org/git/ori/mc.git). Lots of fun, and it can be as shallow or as deep as you want it to be.
Where are you with the calculator? Have you got a handle on tokenizing and parsing? Are you intending to use tools like lex and yacc, or do you want to do a recursive descent parser by hand? (Neither option is too hard; hand written is far easier to comprehend, but it doesn't give you any correctness guarantees)
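Since you mentioned a calculator, here's a minimal sketch of what a hand-written recursive descent parser can look like (Python, illustrative only; a real compiler would build an AST rather than evaluating inline):

```python
# Minimal recursive-descent evaluator for + - * / and parentheses.
# One function per grammar rule; precedence falls out of the call structure.
import re

def tokenize(src):
    return re.findall(r'\d+|[()+\-*/]', src)

def parse(tokens):
    pos = [0]
    def peek():
        return tokens[pos[0]] if pos[0] < len(tokens) else None
    def eat():
        tok = tokens[pos[0]]; pos[0] += 1; return tok
    def expr():                      # expr := term (('+'|'-') term)*
        val = term()
        while peek() in ('+', '-'):
            op, rhs = eat(), term()
            val = val + rhs if op == '+' else val - rhs
        return val
    def term():                      # term := factor (('*'|'/') factor)*
        val = factor()
        while peek() in ('*', '/'):
            op, rhs = eat(), factor()
            val = val * rhs if op == '*' else val / rhs
        return val
    def factor():                    # factor := NUMBER | '(' expr ')'
        if peek() == '(':
            eat(); val = expr(); eat()   # consume ')'
            return val
        return int(eat())
    return expr()

assert parse(tokenize("2+3*4")) == 14
assert parse(tokenize("(2+3)*4")) == 20
```

Note how `expr` calling `term` calling `factor` encodes operator precedence directly in the function hierarchy; that's the part that makes hand-written recursive descent so easy to read.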
The tutorials I'd suggest depend on exactly where you are and what you're trying to do. As far as books, the three that I would go with are, in order:
For basic recursive descent parsing:
For general compiler knowledge, here are the books that I'd recommend in order:
This is a great book. It's fairly modern, written reasonably well, covers most topics you'll need. Just stick to the ML version, because it's pretty clear that Appel wrote first in that language, and then did pretty poor translations to C and Java for the other books.
Another good compiler textbook. Covers language design, a large number of various programming models, and so on. Too light on optimization, IMO.
This is the classic compiler book. It's one of the oldest, and it focuses a bit heavily on parsing, but it covers a large number of topics. It's, in my opinion, dense but very well written and surprisingly easy to follow.
This book's main claim to fame is that it's a decent overview, but it's available online for free.
This is the fastest zero-to-compiler book out there, but it's ridiculously simplistic, and doesn't lead to any next steps. It just tells you how to generate terrible machine code. But it's hands on.
And for later,
More a collection of papers on optimization than a textbook, it's a good summary of relatively advanced techniques to explore once you have a compiler that works and does what you want, and generates tolerably good code.
The Stanford algorithms book is complete overkill, in my opinion. Do NOT read that book; that's insane. Read it when you've been programming for a while and have a grasp of how it even applies.
Here's my list, it's a "wanna be a decent junior" list:
Reasoning: So, the first book is to give you a sense of all that's out there. It's short and sweet and primes you for what's ahead. It helps you understand most of the basic industry buzz words and whatnot. It answers a lot of unknown unknowns for a newbie.
Next is just a list of languages off the top of my head. But you can pick anything, seriously, it's not a big deal. I did put Java first because that's the most popular and you'll likely find a mountain of resources.
Then after some focused practice, I suggest grabbing some SQL. You don't need to be an expert but you gotta know about DBs to some degree.
Then I put an analysis book that's OOP focused. The nifty thing about that book is that it leads nicely into design patterns, introducing a few very simple ones along with GRASP.
Then I put in a legit Design Patterns book that explains and explores design patterns and principles associated with many of them.
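To make "design pattern" concrete before diving into those books, here's a minimal sketch of one classic pattern, Strategy, where a behavior is swapped in at runtime (hypothetical toy names):

```python
# Minimal Strategy pattern: the sorting policy is an object passed in
# at runtime, so the calling code never changes when policies are added.
class ByLength:
    def key(self, item):
        return len(item)

class Alphabetical:
    def key(self, item):
        return item

def sort_names(names, strategy):
    return sorted(names, key=strategy.key)

names = ["Alexandra", "Bo", "Cy"]
assert sort_names(names, ByLength()) == ["Bo", "Cy", "Alexandra"]
assert sort_names(names, Alphabetical()) == ["Alexandra", "Bo", "Cy"]
```

Most patterns in those books boil down to this kind of move: isolating the part that varies behind a small, stable interface.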
Now that you know how code is structured, you're ready for a conversation about Architecture. Clean architecture is a simple primer on the topic. Nothing too crazy, just preps you for the idea of architecture and dealing with it.
Finally, refactoring is great for working devs. Often your early work will be focused on working with legacy code. Then knowing how to deal with those problems can be helpful.
FINAL NOTE: Read the soft skills books first.
The reason for reading the soft skills books first is it helps develop a mental framework for learning all the stuff.
Good luck! I get this isn't strictly computer science and it's likely focused more toward software development. But I hope it helps. If it doesn't, my apologies.
He sounds like a younger version of myself! Technical and adventurous in equal measure. My girlfriend and I tend to organise surprise activities or adventures we can do together as gifts, which I love - it doesn't have to be in any way extravagant, but having someone put time and thought into something like that is amazing.
You could get something to do with nature and organise a trip or local walk that would suit his nature photography hobby. I love to learn about new things and how stuff works, so if he's anything like me, he'd appreciate something informative that fits his photography style, like a guide to local wildflowers or a bug guide. I don't know much about parkour but I do rock climb, and a beginners' bouldering or climbing session might also be fun and something you can do together.
For a more traditional gift Randall Munroe from the web comic XKCD has a couple of cool books that might be of interest - Thing Explainer and What If. Also the book CODE is a pretty good book for an inquisitive programmer and it isn't tied to any particular language, skillset or programming level.
Agree. It depends on what you want to know, and how much you're willing to commit to learning. It's a big world. Code is a nice book if you want a very, very simple explanation of the basics of bits and bytes and logic gates. It might be a good place to start, though it's intended for a non-technical audience and you may find it a little TOO simple. A proper digital systems book will go into much more detail about digital logic (AND gates, flip-flops, etc.). You might be surprised just how easy to learn the fundamentals are. I learned from Tocci, which I found to be excellent, but that was a long time ago and I'm sure there are many other good ones around.
That's pretty low-level digital circuits though. If you are really serious about learning computer architecture, I'd highly recommend Patterson and Hennessy. It covers the guts of how processors execute instructions, pipelining, caches, virtual memory and more.
If you're more interested in specific, modern technologies... then obviously Wikipedia, or good tech review sites. Especially reviews that focus on major new architectures. I remember reading lots of good in depth stuff about Intel's Nehalem architecture back when it was new, or nvidia's Fermi. There's a wealth of information out there about CUDA and GPU computing which may give you a sense of how GPUs are so different to CPUs. Also when I first started learning many years ago, I loved my copy of Upgrading and Repairing PCs , great for a less technical, more hobbyist perspective.
Lastly, ask questions! For example, you ask about DDR vs GDDR. Deep inside the memory chips themselves, there's actually not a great deal of difference. But the interface between the memory and the processor is quite different; they're designed for very different purposes. I'm simplifying here, but CPUs have relatively low levels of parallelism, they tend to operate on small units of memory (say a single value) at a time, they have quite unpredictable access patterns so low latency is essential, and the cores often work tightly together so coherency has to be maintained. With GPUs, the access pattern is very predictable, so you can load much larger chunks at a time; latency is less important since you can easily keep your processors busy while memory is streamed in; and the GPU's many tiny processors for the most part all work on separate words of memory, so coherence usually does not need to be maintained and they have much less need for caches.
The "L" (Level) naming for caches is quite simple. Memory that is closer to the core is faster to access. Generally each core has its own L1 and L2, with L2 being slightly slower but larger, and all cores share an L3, slower still but way bigger. Memory on the CPU is made out of transistors and is super fast but also takes up a lot of space. Look how big the L3 is [here](http://www.anandtech.com/show/8426/the-intel-haswell-e-cpu-review-core-i7-5960x-i7-5930k-i7-5820k-tested), and that's just 20MB. External RAM is obviously much slower, but it is made out of capacitors and has much higher densities.
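As a rough sketch of why those levels matter, here's the standard average-memory-access-time calculation, with made-up illustrative latencies and hit rates (not real numbers for any particular chip):

```python
# Illustrative (made-up) latencies in cycles and hit rates per cache level.
latency = {"L1": 4, "L2": 12, "L3": 40, "RAM": 200}
hit = {"L1": 0.90, "L2": 0.70, "L3": 0.50}

# Average memory access time: each miss falls through to the next level,
# so the slow RAM latency is paid only on the rare全-miss path.
amat = (latency["L1"]
        + (1 - hit["L1"]) * (latency["L2"]
        + (1 - hit["L2"]) * (latency["L3"]
        + (1 - hit["L3"]) * latency["RAM"])))

print(round(amat, 2))
```

With these numbers the average comes out under 10 cycles even though RAM costs 200, which is the whole point of the hierarchy: the fast, small levels absorb most accesses.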
If you want to just know buzzwords to throw around, spend a bunch of time clicking around on Wikipedia, and watch stuff like Crash Course on YouTube. It's easy to absorb, and you'll learn stuff, even if it's biased, but at least you'll be learning.
If you want to become SMARTER, one of my biggest pieces of advice is to either carry a notebook with you, or find a good note taking app you like on your phone. When someone makes a statement you don't understand, write it down and parse it up.
So for instance, write down "Social Democracy", and write down "The New Deal", and go look them up on simple.wikipedia.com (puts all of it in the simplest language possible). It's a great starting point for learning about any topic, and provides you a jumping-off point to look more deeply into it.
If you are really curious about starting an education, and you absolutely aren't a reader, some good books to start on are probably:
"Thing Explainer: Complicated Stuff in Simple Words" by Randall Munroe
"A Short History of Nearly Everything" by Bill Bryson
"Philosophy 101" by Paul Kleinman, in fact the ____ 101 books are all pretty good "starter" books for people that want an overview of a topic they are unfamiliar with.
"The World's Religions" by Huston Smith
"An Incomplete Education" by Judy Jones and Will Wilson
Those are all good jumping off points, but great books that I think everyone should read... "A History of Western Philosophy" by Bertrand Russell, "Western Canon" by Harold Bloom, "Education For Freedom" by Robert Hutchins, The Norton Anthology of English Literature; The Major Authors, The Bible.
Read anything you find critically, don't just swallow what someone else says, read into it and find out what their sources were, otherwise you'll find yourself quoting from Howard Zinn verbatim and thinking you're clever and original when you're just an asshole.
Hi,
do you want to become a computer scientist or a programmer? That's the question you have to ask yourself. Just recently someone asked about some self-study courses in cs and I compiled a list of courses that focuses on the theoretical basics (roughly the first year of a bachelor class). Maybe it's helpful to you so I'm gonna copy&paste it here for you:
I think before you start you should ask yourself what you want to learn. If you're into programming or want to become a sysadmin you can learn everything you need without taking classes.
If you're interested in the theory of cs, here are a few starting points:
Introduction to Automata Theory, Languages, and Computation
The book you should buy
MIT: Introduction to Algorithms
The book you should buy
Computer Architecture<- The intro alone makes it worth watching!
The book you should buy
Linear Algebra
The book you should buy <- Only scratches the surface but is a good starting point. Also, it's extremely informal for a math book. The MIT channel offers many more courses and is great for self-directed study.
Everything I've posted requires no or only minimal previous education.
You should think of this as a starting point. Maybe you'll find lessons or books you'll prefer. That's fine! Make your own choices. If you've understood everything in these lessons, you just need to take a programming class (or just learn it by doing), a class on formal logic and some more advanced math classes, and you will have developed a good understanding of the basics of cs. The materials I've posted roughly cover the first year of studying cs. I wish I could tell you where you can find some more math/logic books, but I'm German and always used German books for math because they usually follow a more formal approach (which isn't necessarily a good thing).
I really recommend learning these thing BEFORE starting to learn the 'useful' parts of CS like sql,xml, design pattern etc.
Another great book that will broaden your understanding is this Bertrand Russell: Introduction to mathematical philosophy
If you've understood the theory, the rest will seem 'logical' and you'll know why some things are the way they are. Your working environment will keep changing and 20 years from now, we will be using different tools and different languages, but the theory won't change. If you've once made the effort to understand the basics, it will be a lot easier for you to switch to the next 'big thing' once you're required to do so.
One more thing: PLEASE, don't become one of those people who need to tell everyone how useless a university is and that they know everything they need just because they've been working with python for a year or two. Of course you won't need 95% of the basics unless you're planning on staying in academia and if you've worked instead of studying, you will have a head start, but if someone is proud of NOT having learned something, that always makes me want to leave this planet, you know...
EDIT: almost forgot about this: use Unix, use Unix, and I can't emphasize this enough: USE UNIX! Building your own Linux from scratch is something every computer scientist should have done at least once in their life. It's the only way to really learn how a modern operating system works. Also try to avoid Apple/Microsoft products, since they're usually closed source and don't give you the chance to learn how they work.
There are a ton of books, but i guess the main question is: what are you interested in? Concepts or examples? Because many strong conceptual books are using examples from java, c++ and other languages, very few of them use php as example. If you have the ability to comprehend other languages, then:
http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&qid=1322476598&sr=8-1 definitely a must-read. Beware not to memorize it; it is more like a dictionary. It should be pretty easy to read, a little harder to comprehend, and you need to work with the patterns presented in that book.
http://www.amazon.com/PHP-5-Objects-Patterns-Practice/dp/1590593804 - has already been mentioned, is related directly to the above mentioned one, so should be easier to grasp.
http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/ref=sr_1_1?ie=UTF8&qid=1322476712&sr=8-1 - one of the most amazing books I read some time ago. Needs a lot of time and good prior knowledge.
http://www.amazon.com/Refactoring-Improving-Design-Existing-Code/dp/0201485672/ref=sr_1_4?ie=UTF8&qid=1322476712&sr=8-4 - another interesting read; unfortunately I cannot give details because I haven't had the time to read it all.
You need to show that you know your stuff. Just because you're doing something more applied like Network Security in grad school doesn't mean that you won't have a base level of knowledge you're expected to understand. In that case, you need to learn some basic stuff a CS student at a good school would know. I'm not "dumbing down" anything on my list here, so if it seems hard, don't get discouraged. I'm just trying to cut the bullshit and help you. (:
Again, don't be discouraged, but you'll need to work hard to catch up. If you were trying for something like mathematics or physics while doing this, I'd call you batshit insane. You may be able to pull it off with CS though (at least for what you want to study). Make no mistake: getting through all these books I posted on your own is hard. Even if you do, it might be the case that still no one will admit you! But if you do it, and you can retain and flaunt your knowledge to a sympathetic professor, you might be surprised.
Best of luck, and post if you need more clarification. As a side note, follow along here as well.
Netsec people feel free to give suggestions as well.
Self-taught software engineer checking in to add on to this.
Everything u/TOM_BRADYS_PET_GOAT said is true.
I'll add a few specific resources:
Computer science fundamentals are really scary and overwhelming if you're self-taught. I'd highly recommend reading The Imposter's Handbook to get started with this topic. You'll want more in-depth material afterwards on each of the various subtopics, but this book is absolutely fantastic as a (surprisingly deep) introduction to all the concepts that's framed specifically to get self-taught programmers up to speed.
After you're familiar with the concepts at a conceptual level, and it's time to just get down to dedicated practice, Cracking the Coding Interview will be an invaluable resource. This book exists for the sole purpose of helping people become better at the types of questions most commonly asked during coding interviews. It's not just a list of a bunch of questions with solutions, it actually explains the theory in-depth, provides drill and smaller practice questions, as well as questions designed to emulate specific interview scenarios at real tech companies like Google, Microsoft, Amazon, etc. It'll even talk about the interview process at those companies outside of just the questions and theory behind them.
As a more general resource that you'll reach for repeatedly throughout your career, I'd recommend The Complete Software Developer's Career Guide. This book covers everything. How to learn, how to interview, how to negotiate salary, how to ask for raises, how to network, how to speak at conferences and prepare talks, how to build your personal brand, how to go into business for yourself if you want, etc., and that's just scratching the surface of what's covered in that book. I didn't even buy this book until I was 10 years into my career and it's still very insightful.
And let's not forget: being a good developer isn't just a matter of making things that work, it's a matter of writing code that is readable, extensible, and a pleasure for other developers to work on. So to this end, I'd recommend any developer read both Clean Code and Clean Architecture: A Craftsman's Guide to Software Structure and Design.
I'm an embedded software developer who used to use C and now primarily works with C++.
Learning C is relatively easy when you start off and gives you a better appreciation of memory handling and its complexities than C++ does, in my opinion. The C knowledge will also transfer well to C++.
C++ is definitely a much more powerful language and you can get your tasks done quicker with it. There are a lot of things to learn in C++, but you can pick them up with time. A lot of embedded processors, particularly the ARM-based ones, support C++ as well, so that is not a problem.
Like someone else mentioned though, embedded development relies on a good knowledge of programming as well as a good understanding of computer architecture.
Here's a nice book I've read which is useful for new embedded developers - Making Embedded Systems: Design Patterns for Great Software https://www.amazon.com/dp/1449302149/ref=cm_sw_r_cp_apa_i_MuFhDb1WWXK3W
What field do you want to specialize in? Embedded? Web? Mobile?
The best way to learn is by practicing, but if you want more of an abstract, design level read, there are lots of options.
I have a web background, so here are three that I've read recently as examples.
I enjoyed this book on microservice design and I think everyone who uses OOP should at least familiarize themselves with the common OOP design patterns.
If you are into JavaScript, Eloquent JavaScript is my go-to for a good mix of summary/detail of the language. It's well written, and comes with fun exercises at the end of each chapter to help solidify your understanding of each concept.
I'm sure there are other great books, but these are some of my favorites so far.
Agreed. There are plenty of resources out there that will help you understand design patterns. If you're new to the concept, I would recommend Head First: Design Patterns. It might be based on Java, but the examples are simple to understand and can mostly apply to PHP as well. When you feel like you've grasped the basic concepts of design patterns, you can move on to more advanced texts, like Martin Fowler's Patterns of Enterprise Application Architecture - this is a great reference for a lot of the more common patterns. There is also Refactoring: Improving the Design of Existing Code. These are great investments that will help you with any project you work on, and will help you if you decide to use a framework like Zend which uses design patterns very heavily.
It largely depends on which Computer Science degree you are going to do. There can be some that focus heavily on software and very little on hardware and some that get a nice balance between the two. If the degree is going to focus on hardware I would recommend reading up on the underlying logic of a computer and then reading this book (Inside the machine). ITM isn't a very technical book(I would label it as the computer science equivalent of popular science) but it gives a nice clear overview of the what happens in a processor.
When it comes to programming, I would recommend starting with Java and Eclipse. Java gets quite a bit of hate but for a newcomer, I think Java would be easier to grasp than the likes of C/C++. C/C++ are nice languages but a newcomer may find their error messages a little bit obscure and may get confused with the nitty-gritty nuances of the languages.
Though the one thing you should realise is that programming is a skill that isn't confined to one language. If you understand the basic concepts of recursion, arrays, classes, generics/templates, inheritance, etc., you can apply this knowledge to almost any language. Ideally I would recommend two books on programming (Algorithmics) and (Introduction to Algorithms). Algorithmics is another book I would label as the cs equivalent of popular science, but the early chapters give a nice overview of exactly what algorithms actually are. Introduction to Algorithms is a more technical book that I would recommend to someone once they know how to program and want a deeper understanding of algorithms.
The rest is personal preference, personally I prefer to use a Unix machine with Sublime Text 2 and the command line. Some will try to convince you to use Vim or Emacs but you should just find whichever you are most comfortable with.
By biology I don't mean what they teach you in college or med-school, I mean understanding the basic processes (physiology-esque) that underlie living things, and understanding how those systems interact and build into more complex systems. Knowing the names of organs or parts of a cat is completely worthless, understanding the process of gene-activation, and how that enables living organisms to better adapt to their environments, especially, for instance, for stress factors activating responses due to new stimuli, can be very valuable, especially as a function of applied neurology.
Also, what we call biology and medicine today will be so pathetically obsolete in 10 years as to be comical, similar to how most mechanics can rebuild a carburetor, but not design and build a hybrid drivetrain, complete with controller software.
Economics and politics are controversial, but it is a question of seeing the underlying forces that is important, similar to not understanding how gravity works, but still knowing that dropping a lead ball will accelerate downwards at 9.78m/s^2. This is a field that can wait till later though, and probably should.
For systems analysis, I'm sorry but I can't recommend anything. I tended to learn it by experience more than anything.
I think I understand what you are looking for better now though, and think you might be headed in the right direction as it is.
For CS I highly recommend the dragon book, and design patterns, and if you need ASM The worst designed website ever.
For the other fields I tend to wiki subjects then google for papers, so I can't help you there. :(
Best of luck in your travels however! :)
edit: For physics, if your math is bad, get both of his books. They break it down well. If your math is better, try one of Witten's books, but they are kinda tough; the guy is a fucking genius.
also, Feynman QED is great, but his other book is awesome just as a happy intellectual read
also try to avoid both Kaku and Hawking for anything more complicated than primers.
edit no. 9: MIT's OCW is win itself.
edit no. 10: Differential equations (prolly take a class depending on your math, they are core to almost all these fields)
It kind of sounds like you'd be good just getting a textbook. I think any book will be fine since you mainly just want questions (and presumably answers), but try to find one that implements code in a language that you're comfortable with, or that you want to learn.
There are a lot of different "final year" DSP courses, but it sounds like you want something covering the fundamentals rather than anything too advanced. I started off with The Scientist & Engineer's Guide to Digital Signal Processing and then used Signals and Systems for my first undergraduate course, but we used it largely because he co-authored it. I would recommend scouring the web for some free books though. There are books like ThinkDSP popping up that seem pretty neat.
Edit: Oppenheim is always mentioned also.
So, I think I am the kind of person you are describing. I have a pretty great job, so I usually just buy my own technology stuff. Not only that, but I am rather picky with technology stuff, so even if someone did get me something like that, I would act excited and happy, but in the back of my mind I would secretly be wishing they had done more research before buying the thing that they did.
That said! If I were buying for me, I would go with something like the Hyperbole and a Half book (http://www.amazon.com/Hyperbole-Half-Unfortunate-Situations-Mechanisms/dp/1451666179), or something by the creator of the XKCD comics (http://www.amazon.com/Thing-Explainer-Complicated-Stuff-Simple/dp/0544668251/ref=sr_1_1?s=books&ie=UTF8&qid=1449202837&sr=1-1&keywords=xkcd).
If it has to be tech related, there is always http://www.thinkgeek.com - they have tons of fun, nerdy gifts that I would like. All of these things combined are probably way less than $1,000. That is just a lot of money.
Another random suggestion - if they were ever into pokemon, this is a dream come true: Gym Badges! https://www.etsy.com/listing/128753018/pokemon-kanto-gym-badges-gen-1?utm_source=google&amp;utm_medium=cpc&amp;utm_campaign=shopping_us_b-accessories-patches_and_pins-pins_and_pinback_buttons&amp;utm_custom1=a91c90fb-48c1-4024-87f9-fb14aadac033&amp;gclid=CjwKEAiA7f-yBRDAgdv4jZ-78TwSJAA_WdMaz_NXsXrFH_0f-Mb6ovmqqcCHto-b7S6zm1DplssHQhoCNuvw_wcB
-How long after completing the camp did it take for you to get hired?
Within 10 days.
-Who do you work for?
~16 person consulting company in the bay.
-Did you have any prior coding experience before enrolling at the camp?
Yes, a full year of self-study and some classes in high school and college.
-Are you happy with your current earnings?
I was, until I realized the cost of living where I am and how much Uncle Sam takes.
-Do employers consider the camps as sufficient to warrant upward mobility potential?
There is another person in my company who also went to my code camp. Our camp (App Academy) discouraged revealing our participation in the camp until late in the hiring process.
-Best strategy to get accepted?
Apply.
What kind of students are they looking for? Can I, with my limited background, become successful?
In my experience you can have the ability to think in that way or not.
What sort of students are most successful both during the camp and then in the job search following the camp?
The ones you would expect.
-Recommendations for pre-study?
Keep trying different tools until you really find something that works.
A great book is http://www.amazon.com/But-How-Know-Principles-Computers/dp/0615303765.
If I was gonna put forward one online resource it would be http://www.tutorialspoint.com/.
If you have a little time try some of the assembler stuff.
One final tip. There will be a time (or thousands) when you will be staring at some concept and drawing a blank. It may feel like nothing is happening. It may well be that lots of things are, and you just have to give yourself time to process the concepts.
Good luck.
>Do you know of a book or a website that teach useful optimization techniques?
I'm only an enthusiast, I've never needed really optimised code (truth be told, most of what I do day to day is quick-and-dirty, appallingly inefficient scripts, because it "needs to be done yesterday"), so I can't give you a canonical list, but here's what I do know;
For books, there's this /r/compsci reddit thread from a while ago. Something on compilers like The Dragon Book might be your best bet, especially the optimisation chapter. And obviously jotux's "How Computers Do Maths" - though never having even flicked through it, I can't say if it's any good.
You could try your luck in /r/ReverseEngineering (or the quieter /r/asm and /r/compilers), there are a lot of low-level guys there who'd know a lot more than me. You could also try /r/compsci or /r/algorithms, although they'd be more useful for algorithms than for optimisation. And of course, /r/quantfinance.
I would suggest that the carlh programming guides are not a bad idea then!
I would heavily suggest learning C well - this is a language that was designed to stay close to the hardware while being portable, and it is a very small language. So, buy a copy of the K&R Book; every C programmer has one.
Then, Patterson's book is a tome for computer engineering. It'll show you assembly, all the way down to NAND gates.
I would suggest you start by watching and working through Berkeley's CS61C course. It's logically the second course in CS, and after a quick overview of C it dives into the machine itself. Website here, videos here. Also, Dan Garcia is an excellent lecturer.
Once you have all the machine details down, you'll probably feel hampered by your actual program wizardry. This is where you start looking into algorithms and data structures. Your go-to guide here is probably Cormen's Introduction to Algorithms since it handles both data structures and algorithms. It's definitely more of a theoretical/CS-ey book, so if this is not what you want, then Head First Java will teach you a new language (and learning more languages is one of the best ways to grow as a programmer!) and also do many data structures. In fact, you can get both those books and have the light side and the serious side of programming books.
At this point you should be well equipped to go off in whatever direction you want with programming. Start contributing to open source projects! Find things that interest you and try to solve problems! Being a part of the programming community will be your biggest aid in both learning programming and starting to make money through it. People pay for programmers that they know can deliver, and success in the open source world means a lot, and you don't need to go to school for it to get to this point!
Lastly, many CS/programming folks hang out on IRC. If you have questions, find the appropriate IRCS channels and go talk to people. Good luck and welcome to programming!
Thanks for the great reply!
The Lessons In Electric Circuits was already on my radar, and I believe it will be the first electronics resource I go through after having it beaten into my head yet again!
That DSP book I have not seen. I just grabbed a copy and it looks like a great text. I mentioned this post to a fellow electronics enthusiast and he loaned me a copy of a book he said was exceptional for entry into the world of DSP: http://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419/ DSP is pretty complex; more than likely I will go through both to fully absorb this topic.
EMRFD sounds like a cookbook. Given that it's by the ARRL, I expect its quality to be superb. I am not against this type of text (I have a few already); however, I'd rather have more of the theory at this point. I imagine it will be great once I am satisfied with the basics and want to build an actual radio with its operation documented.
When you say you want to make a simplistic OS, do you mean you want to put together a simplistic Linux distro, or you want to code an OS from scratch?
In the former case DSL might be your friend (Damn Small Linux):
http://www.damnsmalllinux.org/. There are other similar distros that might fit under 25 megabytes; Google is your friend. As somebody else already mentioned, linuxfromscratch.org is another option. If you go with LFS, you want to look at minimal libraries instead of the standard GNU libraries for your C library and standard system applications. For example, you would want https://www.uclibc.org/ for the C library (or something similar, there are a few) and, say, busybox https://www.busybox.net/ for your system apps. There are other "micro" versions of some popular software (X server, etc.) which you might wish to consider if you are going the completely custom route.
If I were you, I wouldn't do it, since many others have had the same thoughts and have already put effort and hours into making it, so why repeat all that work when you can just get a distro like DSL, install it, and simply customize/change what you dislike? If you want it as an educational experience then certainly go for it; LFS might be very rewarding in that case.
If you want to code your own kernel and OS, then you might wish to take a CS class about OSes. Tanenbaum is your eternal friend:
https://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/013359162X/ref=sr_1_1?ie=UTF8&amp;qid=1498831929&amp;sr=8-1&amp;keywords=andrew+tanenbaum
https://www.amazon.com/Structured-Computer-Organization-Andrew-Tanenbaum/dp/0132916525/ref=sr_1_4?ie=UTF8&amp;qid=1498831929&amp;sr=8-4&amp;keywords=andrew+tanenbaum
And don't forget Google ...
And heck! If you don't have an Arduino just yet, you can try one out virtually first! I like 123D Circuits by AutoDesk; however, there are many other simulators with Arduinos built in. Google them! :-)
Like it but don't like the $$$? You can make your own! There are many tutorials online for making a bare-bones Arduino with cheap electronics components.
I really like @schorhr book suggestions. To add on, the following books are great for Arduino beginners: Programming Arduino: Getting Started with Sketches & Make: Getting Started with Arduino. Also, great tutorials can be found here: tronixstuff Arduino Tutorials & Ladyada's Arduino Tutorials.
Good luck!
Basically any SRE advice for a normal service, but replace/complement HAproxy / nginx / ingress controller / ELB with the Tor daemon / OnionBalance.
I run Ablative Hosting and we have a few people who value uptime over anonymity etc and so we follow the usual processes for keeping stuff online.
Have multiples of everything (especially stuff that doesn't keep state), and ensure you have monitoring of everything: connections, memory pressure, open files, free RAM, etc.
Think of the Tor daemon's onion service as a TCP reverse proxy with load-balancing capability, and then follow any other advice when it comes to building reliable infrastructure.
Once you've got to grips with running a reliable service then you can start layering your Onion reverse proxy / load balancing on top.
All of this aside, check /u/alecmuffett's "Onions that don't suck" repo for examples that are both well setup and stable.
TL;DR: Tor is just a TCP reverse proxy with load-balancing capabilities; go learn some DevOps doodads.
Edit: As per Alec's comment - clarify that Tor is technically a reverse proxy with load-balancing capabilities rather than a straight up TCP load balancer.
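For concreteness, the onion-service side of that reverse-proxy setup comes down to two standard torrc directives. The directory and backend address below are placeholders, not taken from the original comment:

```
# Minimal torrc sketch: publish a local reverse proxy as an onion service.
# Tor writes the service's keys and .onion hostname into this directory
# (placeholder path):
HiddenServiceDir /var/lib/tor/my_service/
# Map virtual port 80 on the onion address to a local backend,
# e.g. HAproxy/nginx listening on 127.0.0.1:8080:
HiddenServicePort 80 127.0.0.1:8080
```

Everything behind that 127.0.0.1:8080 backend is then ordinary SRE territory, exactly as the comment says.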
Interesting!
It looks to me like you can "feel" what good code looks like, but you're not able to rationalise it enough to write it on your own.
Couple of suggestions:
When you see elegant code, ask yourself: why is it elegant? Is it because it's simple? Easy to understand? Try to recognise the desired attributes so you can reproduce them in your own code.
Try to write really short classes/methods that have only one responsibility. For more about this, search for Single Responsibility Principle.
How familiar are you with unit testing and TDD? It should help you a lot to write better designed code.
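To make the Single Responsibility Principle suggestion concrete, here is a small Python sketch (class names invented for illustration): instead of one "manager" class that holds report data, formats it, and saves it, each job gets its own small class.

```python
class Report:
    """Single responsibility: hold the report data."""
    def __init__(self, title, lines):
        self.title = title
        self.lines = lines

class ReportFormatter:
    """Single responsibility: turn a report into text."""
    def format(self, report):
        return "\n".join([report.title] + report.lines)

class ReportSaver:
    """Single responsibility: persist formatted text."""
    def save(self, text, path):
        with open(path, "w") as f:
            f.write(text)
```

Each class now has exactly one reason to change: the data shape, the output format, or the storage mechanism.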
Some other resources:
Since you know Java, I would suggest that you read one of the best programming books ever written: [K&R, The C Programming Language] (http://www.amazon.com/The-Programming-Language-Brian-Kernighan/dp/0131103628/). It was written by the people who made the C language, and it is a must-read for every C programmer. [Computer Systems: A Programmer's Perspective (3rd Edition)] (http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/013409266X/) is a great book to learn about computer systems. But I would recommend [Operating Systems Design and Implementation (3rd Edition)] (http://www.amazon.com/Operating-Systems-Design-Implementation-Edition/dp/0131429388) because it has some Minix source code, which will go really well with learning C.
Best of luck buddy :)
Here's my list of the classics:
General Computing
Computer Science
Software Development
Case Studies
Employment
Language-Specific
C
Python
C#
C++
Java
Linux Shell Scripts
Web Development
Ruby and Rails
Assembly
This is a great question! It's also one that every serious CS person will ask at some point. As others here have noted, to really understand this question you must understand how compilers work. However, it isn't necessary to understand the gory details of compiler internals to see what a compiler does for you. Let's say you have a file called hello.cpp that contains the quintessential C++ program
#include <iostream>
The first thing the compiler does is called preprocessing. Part of this process includes expanding the
#include
statements into their proper text. Assuming you are using gcc, you can have it show you the output of this step:
gcc -E -o hello.pp hello.cpp
For me, the hello.cpp files explodes from 4 lines to nearly 18000! The important thing to note here is that the contents of the iostream library header occur before the
int main
lines in the output.
The next several steps for the compiler are what you will learn about in compiler design courses. You can take a peek at gcc-specific representations using some flags as discussed on SO. However, I pray you give heed. For there be dragons!
Now let's take a look at the compiler's output. To do this, I am going to not
#include
anything so the output is very simple. Let's use a file called test.cpp for the rest of the tests.
int main() {
int i = 3, j = 5;
float f = 13.6 / i;
long k = i<<j;
}
To see the compiler's output, you can use
g++ -S -masm=intel test.cpp
The
-S
flag asks gcc to just output the generated assembly code and
-masm=intel
requests the intel dialect (by default, gcc uses the AT&T dialect, but everyone knows the intel one is superior. :) ) The output on my machine (ignoring setup and teardown code) is outlined below.
push rbp
mov rbp, rsp
/* int i = 3, j = 5; */
mov DWORD PTR [rbp-20], 3
mov DWORD PTR [rbp-16], 5
/* float f = 13.6 / i; */
pxor xmm0, xmm0
cvtsi2sd xmm0, DWORD PTR [rbp-20]
movsd xmm1, QWORD PTR .LC0[rip]
divsd xmm1, xmm0
movapd xmm0, xmm1
cvtsd2ss xmm2, xmm0
movss DWORD PTR [rbp-12], xmm2
/* long k = i<<j; */
mov eax, DWORD PTR [rbp-16]
mov edx, DWORD PTR [rbp-20]
mov ecx, eax
sal edx, cl
mov eax, edx
cdqe
mov QWORD PTR [rbp-8], rax
/* implicit return 0; */
mov eax, 0
pop rbp
ret
There are lots of details to learn in here, but you can generally see how each simple C++ statement translates into many assembly instructions. For fun, try compiling that program with the optimizer turned on (with g++, you can use
-O3
). What is the output?
There is still much to see from the binary that is assembled. You can use
nm
and objdump
to see symbols, or ldd
to see what other libraries were (dynamically) linked into the executable. I will leave that as an exercise for the reader. :)
As someone else mentioned, the Hennessy and Patterson Computer Architecture: A Quantitative Approach, and the Patterson and Hennessy Computer Organization and Design are the de facto standards (I used both in my Comp. Eng. undergrad) and are really fantastic books (the latter being more "software" oriented, so to speak).
They are not EE textbooks (as far as I know) but they are text books nonetheless. A great book I found that is slightly dated but gives a simplified review of many processors is Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture which is less technical but I enjoyed it very much all the same. It is NOT a textbook, and I highly, highly recommend it.
Hope that helps!
The object you're interested in is the call graph of the program. As you've observed, this is a DAG iff there is no recursion in the program. If function A calls B and B calls A, this is called mutual recursion and still counts as recursion :)
A related graph is the control flow graph (CFG) of a function. Again, the CFG is a DAG iff the function doesn't contain loops.
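Both observations above boil down to cycle detection: a graph is a DAG iff it has no cycle. Here is a small Python sketch (the dict representation is my own, purely illustrative) that reports whether a call graph contains direct or mutual recursion:

```python
def has_recursion(call_graph):
    """Return True iff the call graph has a cycle, i.e. the program
    contains direct or mutual recursion.
    call_graph: dict mapping a function name to a list of callees."""
    visiting, done = set(), set()

    def visit(f):
        if f in visiting:          # back edge: we found a cycle
            return True
        if f in done:
            return False
        visiting.add(f)
        if any(visit(g) for g in call_graph.get(f, [])):
            return True
        visiting.remove(f)
        done.add(f)
        return False

    return any(visit(f) for f in call_graph)
```

For `{"A": ["B"], "B": ["A"]}` (mutual recursion) this returns True; for a straight chain of calls it returns False, and the graph is a DAG.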
An execution trace of a program can certainly be represented as a DAG. In fact, since an execution trace does not have any branching, it is just a straight line! However you are very rarely interested in a single trace through a program -- you usually want to reason about all the traces. This is more difficult because if you have any looping structure in the global CFG, there is no (obvious) upper bound on the size of a trace, and so you can't capture them all with a finite structure that you can map into SMT.
Every program can be put into SSA form. The trick is that when you have joins in the control flow graph (such as at the head of a loop), you need a phi node to fix up the SSA indices. If you don't have it already, the dragon book is pretty much required reading if you're interested in any kind of program analysis.
In general, if you have a loop-free control flow graph of any kind (a regular CFG or a call graph), then you can translate that graph directly into SAT or SMT in a fairly obvious way. If you have loops in the graph then you can't do this (because of the halting problem). To reason about programs containing loops, you're going to need some more advanced techniques than just symbolic execution. The big names in verification algorithms are bounded model checking, abstract interpretation, predicate abstraction, and interpolation.
A good overview of the field is this survey paper. To give an even briefer idea of the flavour of each of these techniques:
Bounded model checking involves unwinding all the loops in the program a fixed number of times [; k ;]. This gives you a DAG representing all of the traces of length up to [; k ;]. You bitblast this DAG (i.e. convert it to SAT/SMT) and hand off the resulting problem to a SMT solver. If the problem is SAT, you've found a concrete bug in the program. If it's UNSAT, all you know is that there is no bug within the first [; k ;] steps of the program.
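As a crude illustration of the unwinding idea (this is only an analogy: brute force over small inputs stands in for bitblasting and the SMT solver, and the transition function and bug predicate are invented for the example):

```python
def bmc(k, bug, step, inputs):
    """Unwind the loop k times for every initial state.
    A 'SAT' result is a concrete trace hitting the bug within k steps;
    'UNSAT' only means no bug within the first k steps."""
    for x0 in inputs:
        x = x0
        for depth in range(k + 1):
            if bug(x):
                return ("SAT", x0, depth)
            x = step(x)
    return ("UNSAT-up-to", k)
```

For example, with `step = lambda x: (3 * x + 1) % 16` and `bug = lambda x: x == 7`, a bound of k=1 already finds a concrete counterexample trace, while an impossible bug stays "UNSAT up to k" no matter how far you unwind.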
Abstract interpretation is about picking an abstract domain to execute your program on, then running the program until you reach a fixed point. This fixed point tells you some invariants of your program (i.e. things which are always true in all runs of the program). The hope is that one of these invariants will be strong enough to prove the property you're interested in.
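A toy version of this on the interval domain (entirely my own sketch): analysing `i = 0; while i < 10: i += 1` by iterating the loop body's transfer function and joining at the loop head until nothing changes.

```python
def interval_fixpoint():
    """Interval analysis of: i = 0; while i < 10: i += 1.
    Returns the interval (lo, hi) that holds for i at the loop head."""
    lo, hi = 0, 0                     # abstract state before the loop: i = 0
    while True:
        # One trip through the body: filter by the guard i < 10, then i += 1.
        body_lo, body_hi = lo, min(hi, 9)
        body_lo, body_hi = body_lo + 1, body_hi + 1
        # Join the body's result with the incoming state at the loop head.
        new = (min(lo, body_lo), max(hi, body_hi))
        if new == (lo, hi):           # fixed point: this is an invariant
            return new
        lo, hi = new
```

This converges to (0, 10), i.e. the invariant 0 <= i <= 10. On domains where the chain of abstract states could grow forever, real analysers add widening to force termination.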
Predicate abstraction is just a particular type of abstract interpretation where your abstract domain is a bunch of predicates over the variables of the program. The idea is that you get to keep refining your abstraction until it's good enough to prove your property using counterexample guided abstraction refinement.
Interpolation can be viewed as a fancy way of doing predicate refinement. It uses some cool logic tricks to do your refinement lazily. The downside is that we don't have good methods for interpolating bitvector arithmetic, which is pretty crucial for analyzing real programs (otherwise you don't take into account integer overflow, which is a problem).
A final wildcard technique that I'm just going to throw out there is loop acceleration. The idea here is that you can sometimes figure out a closed form for a loop and replace the loop with that. This means that you can sometimes remove a loop altogether from the CFG without losing any information or any program traces. You can't always compute these closed forms, but when you can you're in real good shape.
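A hand-picked example of what acceleration buys you: the counting loop below has the closed form n(n-1)/2, so an accelerator could delete the loop from the CFG entirely without losing any program traces.

```python
def sum_loop(n):
    # The loop the analyser would like to remove.
    s = 0
    for i in range(n):
        s += i
    return s

def sum_accelerated(n):
    # Closed form of the same computation: no loop left to unwind.
    return n * (n - 1) // 2
```

Both functions agree on every input, which is exactly the correctness condition acceleration has to establish before it can rewrite the graph.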
Drop me a message if you want to know anything else. I'm doing a PhD in this exact area & would be happy to answer any questions you have.
Sounds like what you're interested in is computer architecture. This is the study of how a computer system (whether it's chip level or system level) is organized and designed from a higher-level abstraction (usually at the register-transfer level or above). There are plenty of good resources on this, including many books (this one comes to mind). Not knowing your background, I can't say if this would be much of a stretch for you. I would say prior to jumping to this level you should have an idea of basic MOS logic design, sequential and combinational logic as well as some background in delays and timing.
Your best bet is probably to find a good old book on amazon or ebay and read to your hearts content. Feel free to PM me if you have any questions (I design microprocessors for a living).
Your best bet is to read an introductory text first and wrap your head around what quantum computing is.
I suggest this one: Intro Text
I like it because it isn't very long and still gives a good overview.
My former supervisor has a web tutorial: here
Lastly, Michael Nielson has a set of video lectures: here
The issue is, there is a decent sized gap between what these introductions and tutorials will give you and the current state of the art (like the articles you read on arxiv). A good way to bridge this gap is to find papers that are published in something like the Physical Review Letters here is their virtual journal on quantum information and see what they cite. When you don't understand something either refer to a text, or start following the citations.
Basically, if you can start practicing this kind of activity (the following of references) now, you'll already have a good grasp on a large part of what grad school is about.
Best of luck!
When I started getting interested in compilers, the first thing I did was skim issues and PRs in the GitHub repositories of compilers, and read every thread about compiler construction that I came across on reddit and Hacker News. In my opinion, reading the discussions of experienced people is a nice way to get a feel of the subject.
As for 'normal' resources, I've personally found these helpful:
In addition, just reading through the source code of open-source compilers such as Go's or Rust's helped immensely. You don't have to worry about understanding everything - just read, understand what you can, and try to recognize patterns.
For example, here's Rust's parser. And here's Go's parser. These are for different languages, written in different languages. But they are both hand-written recursive descent parsers - basically, this means that you start at the 'top' (a source file) and go 'down', making decisions as to what to parse next as you scan through the tokens that make up the source text.
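To give a feel for the style, here is a minimal hand-written recursive descent parser in Python (a toy grammar invented for illustration; the Rust and Go parsers build syntax trees rather than evaluating, but the descent structure is the same): you start at the top rule and descend, deciding what to parse next by peeking at the next token.

```python
def parse(tokens):
    """Evaluate a token list for the grammar:
    expr   := term ('+' term)*
    term   := factor ('*' factor)*
    factor := NUMBER | '(' expr ')'"""
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(tok):
        nonlocal pos
        assert peek() == tok, f"expected {tok!r}, got {peek()!r}"
        pos += 1

    def expr():                       # the 'top' rule: start here, descend
        value = term()
        while peek() == "+":
            eat("+")
            value += term()
        return value

    def term():
        value = factor()
        while peek() == "*":
            eat("*")
            value *= factor()
        return value

    def factor():
        nonlocal pos
        if peek() == "(":
            eat("(")
            value = expr()
            eat(")")
            return value
        value = int(peek())
        pos += 1
        return value

    return expr()
```

Each grammar rule becomes one function, and operator precedence falls out of which rule calls which.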
I've started reading the 'Dragon Book', but so far, I can't say it has been immensely helpful. Your mileage may vary.
You may also find the talk 'Growing a language' interesting, even though it's not exactly about compiler construction.
EDIT: grammar
The Design of the Unix Operating System is a classic. It's from the 80's, but still plenty relevant. It's very well written, with plenty of diagrams to help you along.
It doesn't quite start from the very beginning. If you're looking for information on how to start with absolutely nothing (ie, write a bootloader, implement basic device drivers, etc), then you'll need to supplement with other sources. It does, however, do a really great job of explaining things like processes, threads, memory management, and other basic concepts. It doesn't give you source code (though it contains a bit of pseudocode), but it explains in succinct, legible prose, the data structures and algorithms that drive core functionality. Again, it's an old book - $6.00 plus shipping used. Can't really go wrong.
Operating Systems Design and Implementation covers basically the same ground. I prefer the former, as it treats you a little more like an adult and skips straight to explaining how concepts are implemented (and the cover art is just so undeniably classic).
When I said "I can see how maybe they could be useless to you.", that's because I instantly knew what kind of programmer you were. You're a low-level guy.
I have a copy of "Algorithms in a Nutshell" (http://www.amazon.com/Algorithms-Nutshell-In-OReilly/dp/059651624X) but I never finished it. My favorite programming book may be "Patterns of Enterprise Application Architecture" (http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420). Neither of these books is language-specific, but I don't think they could be further apart in every way. Both are very valuable and I appreciate that they both exist.
There are a good number of reasons why you should maximize your use of the built-in PHP functions (http://webandphp.com/5reasonstomaximizeyouruseofPHP%E2%80%99sbuiltinfeatures). My book is an attempt to come up with a system that will help you learn all of the built-in PHP functions by giving a realistic use case for each that could be applied in your everyday work.
As a PHP programmer, it is much more useful to know what functions PHP has for array sorting than it is to know how to implement array sorting in PHP code.
This is so thoughtful! Very similar to Hyperbole and a Half is The Oatmeal, which is another sardonic blog with funny cartoons that has a book of its best content. I also highly recommend XKCD's book "Thing Explainer", which is a highly informative and entertaining read. Wishing your friend the best!
Course 1 was definitely useful but I also found it pretty easy. I've been busy with other things for the last several days (mostly learning Fusion 360) so I'm still on the UART lesson. I think the most useful part so far has been learning more about design patterns. I've been concurrently reading Making Embedded Systems and the combination of the book and the course has been great.
Here's a list I compiled; the books hit a pretty wide range. I don't really have any given list of talks I like. I'll just randomly google for videos by a speaker I like or a subject I'm interested in.
Books:
Watch:
Podcasts:
Blogs:
News Letter:
Misc:
It seems that most introductory texts focus on parsing. However, in my experience, the dragon book does a good job on introductory code generation. Appel's Tiger Book had good information as well. As a heads up, the C and Java versions of the same book were done as an afterthought, and you can tell that the code was translated after the fact. Stick with the ML version.
For optimization algorithms, I've heard good (and bad) things about Muchnick: Advanced Compiler Design and Implementation.
However, I've had better luck just reading various papers. If there's a specific part of code generation and emission, I can point you to plenty of good papers.
The book, Head First Design Patterns, is actually pretty good.
You could also read the book that started it all, Design Patterns: Elements of Reusable Object-Oriented Software. Although good, it is a dull read - I had to force myself to get through it.
Martin Fowler is also really good; in particular, I thoroughly enjoyed his book Patterns of Enterprise Application Architecture.
If you want more of an MS/.NET slant of things, you should also check out Dino Esposito. I really enjoyed his book Microsoft .NET: Architecting Applications for the Enterprise.
My recommendation would be to start with the Head First book first, as this will give you a good overview of the major design patterns.
Hey, that is the million dollar question. Because software is not really an engineering discipline, there is actually no single reference book on software architecture. Certainly there are books talking about it, but they usually cover only some aspects and lack real application examples.
Notice that in iOS programming the system imposes a great part of the architecture, so these guys are usually less concerned. But in Android we have more freedom, and the API actually encourages really bad practices (thanks Google). Because of this we are all a bit lost. Nowadays layered architecture and MVP seems to be the most popular approach, but then again everybody produces a different implementation...
Specifically for Clean Architecture you should read its author, Robert C. Martin. AFAIK this is not covered in detail in his books. You can read this blog post and watch this video. Other designs usually coming up in conferences are the Onion Architecture and the Hexagonal Architecture. But make no mistake: there's no route map on how to implement any of those, and examples claiming to follow this or that approach are usually not written by the authors of the architecture.
For DDD there is a very good book by Scott Millett with actual examples. But this style is meant for large enterprise backend apps, and the author himself advises against using it in small apps. So I'd say it is overkill for Android, but of course you could reuse some concepts successfully.
There's also Software Architecture in Practice, 3rd edition, but having read the 2nd edition I can tell you it's just smoke.
Probably the best book to date is Fowler's, but it is more a patterns compilation than an architecture guide.
You might have some better luck if you go top down. Start out with an abstracted view of reality as provided by the computer, and then peel off the layers of complexity like an onion.
I would recommend a "bare metal" approach to programming to start, so C is a logical choice. I would recommend Zed Shaw's intro to C: http://c.learncodethehardway.org/book/
I would proceed to learning about programming languages, to see how a compiler transforms code to machine instructions. For that, the classical text is the dragon book: http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811
After that, you can proceed to operating systems, to see how many programs and pieces of hardware are managed on a single computer. For that, the classical text is the dinosaur book: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/1118063333 Alternatively, Tannenbaum has a good one as well, which uses its own operating system (Minix) as a learning tool: http://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/0136006639/ref=sr_1_1?s=books&amp;ie=UTF8&amp;qid=1377402221&amp;sr=1-1
Beyond this, you get to go straight to the implementation details of architecture. Hennessy has one of the best books in this area: http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X/ref=sr_1_1?s=books&amp;ie=UTF8&amp;qid=1377402371&amp;sr=1-1
Edit: Got the wrong Hennessy/Patterson book...
Science mode limits the available parts until you do the science to unlock more, without having to deal with restrictions like funding. You're almost literally forced to start simple, which is very useful given the steep curve of this game.
I'm no education expert, but I've been playing games since the NES came out. And what I've seen of this coming generation, they're pretty sharp, even if their reading skills are limited. Don't expect a 4 year old to understand delta-v, but fully expect them, after a few weeks of play, to not need to worry about it. If they can survive the steep learning curve, they'll know what engine they want by the picture (most of us do anyway) and they'll know what it does because they tried it and saw for themselves. It might be useful at the very least to explain "this one makes you go fast but uses up all your fuel, this one makes you go slow but uses less fuel" and stuff like that. Basically, talk to them as if you're quoting this book.
A child's mind is a very wondrous machine. If nothing else, trust that, if their interest is strong enough to overcome their failures, they will blow you away sooner than you could ever realize.
The POSA books by Buschmann were considered the textbooky, Knuth/SICP-level stuff when I studied, and are the architecture equivalent of the GoF design patterns book, but as a consequence they're also huge on OOP. The Fowler book is even more preachy and OOPy, but many things are still relevant. The third would be Uncle Bob's Clean Architecture. (Sorry for the cryptic refs, am on mobile; just Google them and you'll find them.)
On the systems design front one should learn ESB and SOA, as they are patterns still relevant in this microservices world, but most books on the subject are tied to particular tech (and it's often the wrong tech: IBM, Oracle, or MS proprietary, untranslatable/untransferable stuff). I've heard good things about Thomas Erl's books [1].
I've recently read Sam Newman's book on microservices, and while it does have a lot of zeitgeist in it, at least it's current zeitgeist and the book is decent.
Edits:
I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.
Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.
Full disclosure: I haven't actually read more than the preface of any of those books. Software engineering topics are more directly applicable to me than CS topics right now, so here are some that I've actually started reading:
Honestly, I haven't read a "PHP book" in ages, so I am a very bad source for critiquing them. The majority of the ones I have come across are painfully outdated, and some are outright inaccurate. My suggestion for learning PHP programming better is to try books that have to do with programming in general. Books on object orientation and patterns, like GoF http://en.wikipedia.org/wiki/Design_Patterns or PoEAA http://www.amazon.com/Enterprise-Application-Architecture-Addison-Wesley-Signature/dp/0321127420, are great for learning object-oriented principles.
But those will only help somewhat. What really helped me become a better PHP programmer was studying other languages, then studying their web frameworks, then taking what I learned back to PHP. Find out why one aspect is pretty common in language X's frameworks, but not PHP frameworks. How do other languages' frameworks solve the dependency issue, etc.? Some languages I suggest learning are other ones that are pretty mainstream in web programming: Python, for its typing and how it compares to PHP; Ruby, for how mixins relate to PHP traits; and Java, as there are quite a few aspects that PHP took from Java in its OO design.
I'll throw out some of my favorite books from my book shelf when it comes to Computer Science, User Experience, and Mathematics - all will be essential as you begin your journey into app development:
Universal Principles of Design
Dieter Rams: As Little Design as Possible
Rework by 37signals
Clean Code
The Art of Computer Programming
The Mythical Man-Month
The Pragmatic Programmer
Design Patterns - "Gang of Four"
Programming Language Pragmatics
Compilers - "The Dragon Book"
The Language of Mathematics
A Mathematician's Lament
The Joy of x
Mathematics: Its Content, Methods, and Meaning
Introduction to Algorithms (MIT)
If time isn't a factor, and you're not needing to steamroll into this to make money, then I'd highly encourage you to start by using a lower-level programming language like C first - or, start from the database side of things and begin learning SQL and playing around with database development.
I feel like truly understanding data structures from the lowest level is one of the most important things you can do as a budding developer.
Heh, that's a loaded phrase, because people haven't agreed on what it means.
So I agree with both the other posters in that it can include the stack but usually implies a deeper design understanding.
To me, it doesn't make much sense to ask about a Rails app's architecture without going into the tech stack, precisely because 1) Rails apps have the same basic architecture (MVC) and 2) the rest of the stack is actually part of the application. Do you use MySQL or Postgres or something else? How many Rails servers do you have? How many database servers are there, and how are they replicated? Etc., etc.
However, when you're talking about apps that don't have a given, accepted base design then it's really important to know how it's designed.
I'm going to use the phrase design and architecture interchangeably here, but one could argue they're slightly different.
The architecture of an app influences the "non-functional" characteristics it embodies (also called quality attributes). Furthermore, and more importantly, the architecture itself is (or should be) influenced by the desired non-functional characteristics.
What do I mean by non-functional characteristics? Stuff like:
If you think about it, these things are difficult and expensive to change down the road. If you want to add security to an app that's highly modular, you will have a lot of work due to the high amount of decoupling throughout the app. Or imagine trying to add performance to a highly modifiable app. Modifiability usually implies low coupling between parts which also, usually, impacts performance.
So when you think about the architecture of an app, it's how the larger parts are put together to express these non-functionals. This can get down to the level of design patterns like MVC (modifiability) and dependency injection (testability), but it starts at a higher level, where you look at things like Java packages instead of classes, as an example.
There are a number of books on amazon about this but here are 2 (I've read the first, but not the second):
We wrote a compiler for one of my CS classes in college. The language was called YAPL (yet another programming language).
First things first: as others have mentioned, a compiler translates from one language to another...typically assembly, but it could be any other language. Our compiler compiled YAPL, which was a lot like Pascal, into C, which we then fed to the C compiler...which in turn fed the assembler. We actually wrote working programs in YAPL. For my final project, I wrote a functional--albeit VERY basic--web server.
With that said, it's quite a bit different for an interpreted language, but the biggest part for each is still the same. By far, the most complicated part of a compiler is the parser.
The parser is what reads a source code file and does whatever it's going to do with it. Entire bookshelves have been written on this subject, and PhDs granted on the matter, so parsing can be extremely complicated.
In a theoretical sense, higher-level languages abstract common or more complicated tasks away from the lower-level languages. For example, to a CPU, variables don't have sizes or names, and neither do functions. On one hand, this greatly speeds up development because the code is far more understandable. On the other hand, certain tricks you can pull off in the lower-level languages (that can vastly improve performance) are abstracted away. This trade-off is mostly considered acceptable. An extra $500 web server (or 100 of them, for that matter) to handle some of the load is far less expensive than 10 extra $100,000-a-year x86 assembly developers to develop, optimize, and debug lower-level code.
So generally speaking, the parser looks for what are called tokens, which is why there are reserved words in languages. You can't name a variable `int` in C, because `int` is a reserved word for a type. So when you name a variable, you're simply telling the compiler "when I reference this name again, I'm talking about the same variable." The compiler knows an `int` is 4 bytes, and so does the developer. When it makes it into assembly, it's just some 4 bytes somewhere in memory.
So the parser starts looking for keywords or symbols. When it sees `int`, the next thing it's going to expect is a label, and if that label is followed by `(`, it knows it's a function; if it's followed by `;`, it's a variable--it's more complicated than this, but you get the idea.
The parser builds a big structure in memory of what's what and, essentially, the functionality. From there, either the interpreter goes through and interprets the language, or, for a compiler, that structure gets handed to what's called the emitter. The emitter is the function that spits out the assembly (or whatever other language) equivalent of whatever `a = b + c;` happens to be.
This is complicated, but if you take it in steps, it's not really that hard. This is the book we used. There's a much newer version out now. If I can find my copy, I'll give it to you if you pay shipping. PM me.
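The token-scanning step described above can be sketched in a few lines of Python (a toy illustration I made up, nothing like a real C compiler's lexer -- for instance, it would happily split `integer` into `int` + `eger`):

```python
import re

# Toy tokenizer for C-like declarations. Keyword matching is naive on
# purpose; a real lexer checks word boundaries.
TOKEN_RE = re.compile(r"\s*(?:(int)|([A-Za-z_]\w*)|([();=+]))")

def tokenize(src):
    tokens, pos = [], 0
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if not m:
            raise SyntaxError(f"bad token at position {pos}")
        keyword, name, symbol = m.groups()
        if keyword:
            tokens.append(("KEYWORD", keyword))
        elif name:
            tokens.append(("NAME", name))
        else:
            tokens.append(("SYMBOL", symbol))
        pos = m.end()
    return tokens

def classify_decl(tokens):
    # After "int name", peek at the next symbol, exactly as described
    # above: "(" means a function, ";" means a variable.
    if tokens[0] == ("KEYWORD", "int") and tokens[1][0] == "NAME":
        return "function" if tokens[2][1] == "(" else "variable"
    return "unknown"
```

So `classify_decl(tokenize("int x;"))` reports a variable, while `int main()` is recognized as a function, purely from the token that follows the name.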
Not a magic bullet but these helped me:
Disclaimer: I'm a pretty terrible programmer, but I used to be a lot worse.
It's on my to-do list, but this is something that I want to get right. I don't think I could fully appreciate it without a more formal approach. I'm currently working through this, and will try my hand at the subject afterward. I will definitely check out Professor Might's insights on the subject, and I would gladly take up any other resources you might have to offer!
I really would recommend Randall Munroe's Thing Explainer. When I started doing propulsion work, I actually used it as a reference because it's easy to reference and it has a pretty strong foundation on a number of things at a very accessible level. As u/zaures mentioned, The Way Things Work (any edition) is excellent and in much the same vein.
I'm mostly self-taught, so I've learned to lean heavily on App Notes, simulations, and experience, but I also like these books:
The Howard Johnson Books:
High Speed Digital Design: A Handbook of Black Magic
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_api_I0Iwyb99K9XCV
High Speed Signal Propagation: Advanced Black Magic
https://www.amazon.com/dp/013084408X/ref=cm_sw_r_cp_api_c3IwybKSBFYVA
Signal and Power Integrity - Simplified (2nd Edition)
https://www.amazon.com/dp/0132349795/ref=cm_sw_r_cp_api_J3IwybAAG9BWV
Also, another thing that can be overlooked is PCB manufacturability. It's vitally important to understand exactly what can and can't be manufactured so that you can make design trade-offs, and in order to do that you need to know how boards are made. As a fairly accurate intro, I like the Eurocircuits videos:
http://www.eurocircuits.com/making-a-pcb-pcb-manufacture-step-by-step
Check out Simon Monk's book, Programming Arduino: Getting Started with Sketches. I found it a great starter book; it was easy to understand and follow.
As for your keyboard interface... it sounds like you will need the serial monitor running and waiting for a key-press: see Serial.available() in the Arduino reference.
Hope that gets you moving in the right direction! GL!
Thanks, I'm sure you will. It's just a question of getting that first success. Afterwards, it gets much easier, once you can point at a company and say "Their customers are using my code every day."
As for the interviews, I don't know, I'm honestly not the type to get nervous at interviews, either because I know my skill level is most likely too low and I take it as a learning experience, or because I know I can do it. I'd say that you should always write down all the interview questions you couldn't answer properly and afterwards google them extensively.
Besides, if you're from the US, you have a virtually unlimited pool of jobs to interview for. I live in a tiny European country that has 2 million people and probably somewhere in the range of 20 actual IT companies, so I had to be careful not to exhaust the pool too soon.
Funnily enough, right now, my CTO would kill for another even halfway competent nodejs developer with potential, but we literally can't find anyone.
Anyway, I'm nowhere near senior level, but I can already tell you that the architecture-over-language part is something your bootcamp got right. To that I would add a book my CTO gave me to read (I'm not finished yet myself, but it is a great book): Patterns of Enterprise Application Architecture. Give it a look. I suspect that, without ever having tried to implement a piece of architecture like that, it won't make much sense beyond the theoretical, but I promise you, it's worth its weight in gold once you start building something more complex and have to decide how to actually do it.
I was pretty happy with "Programming Arduino: Getting Started With Sketches" by Simon Monk. It provides a good overview of C and the various steps needed to get to working code. I already had a lot of coding experience but knew nothing about Arduino when I started, so it brought me up to speed quickly, but I think it would be useful for beginners as well.
It's available on Amazon for sub-$9: -> and he has a site which has a fair amount of errata, etc.: ->
It's hard to say because that's a more advanced section of computer science called architecture and I learned it in a college setting. With that being said, I've heard good things about this textbook. Hopefully whichever book you pick up on ARM assembly will have a few chapters going over how the processor functions.
Good luck! :)
Not sure if it's quite what you're looking for but Computer Organization & Design - The HW/SW Interface is a fantastic book on processor architecture and uses the MIPS design as an example through the entire text including good stuff on MIPS assembly programming. The link is for the latest edition (fourth) but if you want to go cheaper the third edition is still available. I used (and still use, about to tutor a course on CompArch) the third edition and it's one of the most useful texts I have ever owned.
Upvote for Domain-Driven Design, it's a great book. Depending on the size of the system, Martin Fowler's PoEAA might also be helpful.
Also what dethswatch said: what's the audience & scope; i.e. what's in the previous document? If you're presenting three architectures you probably need enough detail that people can choose between them. That means knowing how well each will address the goals, some estimate on implementation effort (time & cost), limitations, future-proofing, etc.
Finally, IMHO, this really isn't computer science. You might have better luck asking in /r/programming/ or the new r/SWArchitecture/
Amazing? These look like they were swiped from an overview lecture, there isn't any really good explanation in here. If this is all new to you they might be a good starting point for learning some basic concepts and vocabulary of signal integrity.
Johnson's Black Magic book is the general reference for this. There are many other (well written) white papers out there. Ott and Bogatin have good books as well.
Software dev checking in. If you want to go into plugin design, make sure you read books like The Scientist And Engineer's Guide to Digital Signal Processing, and have a heavy focus on algorithms, physics, and matrix math.
There are SDKs and APIs to help, though. The Steinberg VST SDK is how VST plugins are made, and it removes a lot of the underlying math that you need to know. Writing multi-threaded C code with a library like OpenMP will also help, as your plugins will be more efficient, resulting in less latency.
I also recommend the book named Thing Explainer which uses the 1000 most common words to explain complicated things. It has a lot of simple, funny pictures and is fun to read overall (the tone is not serious at all). I'd be happy to ship you a copy if you like. Just PM me your address.
Well, some books can help:
These books on patterns tend to be good on teaching a lot of what you're asking. Largely because you've named some patterns in your question, but also because many patterns are about:
There are several ways to say what I just did in there. You're allowing X to vary independently from Y. This makes X a parameter of Y, which is yet another way to say it. You're separating what is likely to change often (X) from what doesn't need to be affected by that change (Y).
Some benefits of this: a reason to change X now doesn't affect Y, because X can be changed independently of Y. Another is that X can largely be understood without looking at Y. This is a core guiding rule of the separation of concerns principle: the concern X is separated from the concern Y. Now, a lot of the activities you want to perform on X can be done independently of Y.
You probably know all of this, so I'm sorry if this isn't much helpful. But just to finish, a classic example of this is a sorting function (the Y) and the comparison criteria (the X). Many people, in many projects, would like to have that a change in the comparison criteria not lead to a change in the sorting function. They're 2 separate concerns we'd like to deal with separately. Therefore, the comparison criteria, as commonly done today, is a parameter of sorting. In this case, the word "parameter" is being used both in the sense of a function parameter in the source code, but also in the more general sense of something being a parameter of something else, in which case something can be one of many, and may change over time.
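The sorting example can be made concrete. Here the comparison criterion (the X) is an ordinary key-function parameter of the sort (the Y), so the two concerns vary independently:

```python
# The sort routine (Y) never changes; the comparison criterion (X) is a
# parameter, so changing "how we compare" never touches "how we sort".
people = [("alice", 34), ("bob", 21), ("carol", 28)]

by_name = sorted(people, key=lambda p: p[0])
by_age = sorted(people, key=lambda p: p[1])

# A new criterion is a new parameter value, nothing more:
by_age_desc = sorted(people, key=lambda p: p[1], reverse=True)
```

The word "parameter" works in both senses here: it's literally a function parameter of `sorted`, and more generally the comparison is "a parameter of" the sorting concern.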
> I want to learn how linux (and computers) work.
If you want to learn how Linux (and computers) work, take a course on operating system design and development. It's offered at any university that has a respectable computer science program, and you can probably find online courses that teach it for free. If you're more of a self-starter, grab a textbook and work your way through it. A book on the internal workings of Linux in particular might also be helpful, but IMO the development of the Linux kernel is too rapid for a book to provide a useful up-to-date reference.
If you want to learn Linux as a day-to-day user (which is what I suspect you're looking for), pick Ubuntu or one of its derivatives. They are easy to get up and running, while still allowing you to "spread your wings" when you're ready.
Ah gotcha, yeah to be honest this approach probably won't be terribly illuminating. The problem is that the D-Wave really doesn't work in any kind of classically equivalent way. When you think about algorithms classically, the procedure is highly linear. First you do this, then that, and finally the other. The D-Wave One involves nothing of the sort.
Here's a quick rundown of what a quantum annealing machine actually does, with analogies to (hopefully) clarify a few things. In fact, an analogy is where I'll start. Suppose you had a problem you were working on, and in the course of trying to find the solution you notice that the equation you need to solve looks just like the equation describing how a spring moves with a mass hanging from it. Now you could continue your work, ignoring this coincidence, and solve out the equation on your own. Alternatively, you could go to the storage closet, grab a spring and a mass, and let the physics do the work for you. By observing the motion of the spring, you have found the solution to your original problem (because the equations were the same to begin with).
This is the same process used by the D-Wave One, but instead of a spring and a mass, the D-Wave system uses the physics of something called an Ising system (or model, or problem, etc.). In an Ising system, you have a series of particles^ with nonzero spin that can interact with each other. You arrange this system so that you can easily solve for the ground state (lowest energy) configuration. Now with the system in this ground state, you very, very slowly vary the parameters of the system so that the ground state changes from the one you could easily solve to one that you can't. Of course this new ground state, if you've done things correctly, will be the solution to the problem you were actually concerned with in the first place, just like the spring-mass example above.
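To see concretely what "ground state of an Ising system" means, here is a purely classical, brute-force toy (my own sketch, nothing D-Wave-specific): it enumerates all spin configurations of a tiny chain and picks the minimum-energy one. The annealer arrives at the same kind of minimum physically rather than by enumeration.

```python
from itertools import product

# Ising chain energy: E = -sum_i J[i] * s[i] * s[i+1], spins = +/-1.
# With ferromagnetic couplings (J > 0), aligned spins minimize the energy.
def energy(spins, J):
    return -sum(j * a * b for j, (a, b) in zip(J, zip(spins, spins[1:])))

def ground_state(n, J):
    # Try all 2^n configurations; only feasible for toy sizes, which is
    # exactly why physical annealing is interesting for large problems.
    return min(product((-1, 1), repeat=n), key=lambda s: energy(s, J))
```

For a 3-spin chain with both couplings positive, the ground state is one of the two all-aligned configurations; the "programming" of an annealer amounts to choosing the couplings so that this minimum encodes your answer.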
So perhaps now I have explained at least a little bit of why I don't call the D-Wave One a "computer". It doesn't compute things. Rather, by a happy coincidence, it sets up an experiment (i.e. the Ising system) which results in a measurement that gives you the answer to the problem you were trying to solve. Unfortunately for you, the software engineer, this resembles precisely nothing of the usual programming-based approach to solving problems on a classical computer.
My advice is this: if you want to learn some quantum computing, check out An Introduction to Quantum Computing by Kaye, Laflamme, and Mosca, or the classic Quantum Computation and Quantum Information by Nielsen and Chuang.
^ They don't actually have to be single particles (e.g. electrons), but rather they are only required to have spin interactions with each other, as this is the physical mechanism on which computations are based.
Edit: Okay, this was supposed to be a reply to achille below, but apparently I'm not so good with computers.
I've peeked at this free online book a few times when implementing things. I think it's a pretty solid reference with more discussion of these sorts of things!
Another option is a "real" textbook.
My programming languages course in university followed Programming Languages: Application and Interpretation (which is available online for free). It's more theory-based, which I enjoyed more than compilers.
But the dragon book is the go-to reference on compilers; it's slightly old but still good. Another option is this one, which is a bit more modern. The latter was used in my compilers course.
Outside of that, you can read papers! The older papers are actually pretty accessible because they're fairly fundamental. Modern papers in PL theory can be tricky because they build on so much other material.
I've tried doing something similar for x86 about a decade ago.
I wrote this GUI a while back from scratch that could work from DOS (for things like loading images into memory). All the controls that you see in the picture worked well and as they should. I believe it only took 3,000-5,000 lines of code to write. How I started: I already knew x86 assembly (not really needed for the GUI part, unless you want to optimize) and C. I found some sort of Linux bootloader that had a GUI. I looked through its code and got a basic idea of how to write one. Its author used C++ (just for classes and inheritance), which is what I used. So for a GUI I recommend learning C and a bit of C++, then finding this bootloader (I don't remember the name) or some other relatively small project that has its own GUI and seeing how it's made.
For the OS part, I recommend a book called "Operating Systems Design and Implementation (3rd Edition)". You will need to know C and x86 assembly to understand it. It discusses how a particular OS named Minix was made. Linus Torvalds used the first edition of this book to write the first version of Linux.
You could google for something like "osdev", "osdev gui".
I've read both books. The latter sits on my bookshelf; it was a gift from my girlfriend. Please don't waste your time trying to implement a compiler. It's a PhD-level endeavor that will take years of dedicated 60-hour work weeks.
Here are the same links linked from my Amazon affiliates account:
You are better off implementing an algebraic calculator using an LR parser. Start with Tom Torf's Programmer's Calculator (PCalc). It's written in C and pretty simple. You can fork it from my GitHub account if you have trouble finding Tom's source archive. Tom (may he rest in peace) also wrote several programming tutorials and contributed to comp.lang.c, alt.lang.c, and the comp.lang.c FAQ.
This here is my patterns bible:
https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420
As for .Net and SQL Server, it really depends on what you want to get into. Both have such a huge field of applications. In general MS Press is really good for books on their own stuff and written well enough that you can actually read through it.
Edit: But yeah, just realized that the book from Fowler also already is 14 years old. I need to update that as well :)
This is as good a place to start as any. https://www.youtube.com/watch?v=wgdBVIX9ifA
This video was helpful too: https://www.youtube.com/watch?v=e0E6po7tG-8&t=533s
This book is pretty good https://www.amazon.ca/Building-Microservices-Designing-Fine-Grained-Systems/dp/1491950358/ref=sr_1_1?ie=UTF8&qid=1549050596&sr=8-1&keywords=microservices (updated version coming out in august)
Regarding sandboxing: at least in Lua, from what I know, you can have minute control over which libraries code has access to, and users can only import other libraries if you allow them to (by including a "library" that imports other libraries :-).
Perhaps you should look into formal languages and parser generators, so you can create more complex languages if you feel like it. Even if you build the parsers yourself having the language specified, factorized and so on, helps a lot. The dragon book is a good choice, although it presupposes you know a bit about specifying a formal language IIRC. If you're a student (I know how it is!) then even the old dragon book is an excellent read and it's very cheap.
The chapter on quadrature signals in this book is really good. It has some of the best illustrations of the concept that I have come across. The amazon link also lets you browse that chapter for free.
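For a taste of what that chapter covers: a quadrature (I/Q) pair is just the cosine and sine components of one signal, which together preserve both amplitude and phase. A minimal stdlib-only sketch (my own, not from the book):

```python
import math

# A quadrature sample: I = A*cos(phase), Q = A*sin(phase). Treating the
# pair as one complex number is the standard I/Q representation.
def iq_sample(amplitude, phase):
    return complex(amplitude * math.cos(phase),
                   amplitude * math.sin(phase))

def recover(sample):
    # Amplitude and phase come straight back from the complex sample;
    # with only I (a real signal) the phase would be ambiguous.
    return abs(sample), math.atan2(sample.imag, sample.real)
```

The point the illustrations in the book make visually is that a single real channel loses the sign of the phase, while the I/Q pair keeps it.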
Just going from a bunch of hardware to the point where you can input machine code to be executed is a vast topic in itself (and something I don't have knowledge of). Once you can input machine language and have it execute though, I at least have an idea.
You can use machine code to write an assembler, which is a lot of work but not particularly complex.
You can use an assembler to write a compiler (good luck with this one, I'm in compiler design right now and it's a mind blow).
You can use a compiler to write pong.
There are many topics that you can really get acquainted with by just wandering the web. I don't think this is one of them. Once you get it you can really go some complex places, so what you're likely to find online is either too simple, or too complex for the understanding you seek. With dedication a book can probably help you, but if you can make nice with a teacher--auditing a computer organization/assembly language class will really open your eyes to what is going on in there.
Take a look at the course listing at a local college and e-mail the teacher, see if they'll let you audit their class.
This was my textbook for that class, it's decent. Maybe you can find an early edition for cheap:
http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938/ref=sr_1_2?ie=UTF8&qid=1302110540&sr=8-2-spell
> Is this a realistic goal
Yes, quite. The bits you are going to be missing are some of the mathematical underpinnings. Depending on what you're programming, you'll also want to grab books on the particular topic at hand that don't try to teach you programming at the same time.
For example, if you want to learn why C# is object-oriented and what that means and how to use it, grab a copy of this book: http://en.wikipedia.org/wiki/Object-Oriented_Software_Construction
If you want to learn how relational databases work, read this one http://www.amazon.com/Introduction-Database-Systems-8th-Edition/dp/0321197844 (You can easily find online versions, but I didn't investigate whether they were legally released or not.)
You want to write a compiler? Grab the "dragon book": http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811
None of those teach you how to program. They teach you the math and background behind major inventions in programming. Keep up with those, find a local mentor who enjoys talking about this stuff, and you'll do fine.
I would very much recommend http://www.amazon.com/Black-Video-Game-Console-Design/dp/0672328208 ; it goes really far in explaining everything from the quantum level up to a working game console. And after that: http://www.amazon.com/Operating-Systems-Design-Implementation-3rd/dp/0131429388/ref=sr_1_1?s=books&ie=UTF8&qid=1405346881&sr=1-1&keywords=minix . Then you'll be set.
Edit: Although the Black Art book is about game consoles, if you work through it you can build your own computer in the end, or, what I really like, you can pick up old machines (80s/early 90s) from eBay for < $5, open them up, understand them, and change them. As they are not 'one chip' with some power supply stuff, and almost everything is held in separate ICs, you can follow the PCB and actually see what it is doing and how. Great fun. And it scales: I have no issue making digital things with FPGAs etc. because I know them at this level.
Cheers man! The Dragon Book is a great place to start, and there's always this, but mainly it's about facing each problem as you come to it and hoping for the best :P
"Introduction to Algorithms"by Cormen et.al. Is for me the most important one.
The "Dragon" book is maybe antoher one I would recommend, although it is a little bit more practical (it's about language and compiler design basically). It will also force you to do some coding, which is good.
Concrete Mathematics by Knuth and Graham (you should know these names) is good for mathematical basics.
Modern Operating Systems by Tanenbaum is a little dated, but I guess anyone should still read it.
SICP (although married to a language) teaches very, very good fundamentals.
Be aware that the stuff in the books above is independent of the language you choose (or the book chooses) to outline the material.
For lexing and parsing, you should just pick up a compiler book. You could bang your head against it your whole life without figuring it out, and the Right Answer is not that hard if you have someone to show it to you. There are lots of good ones; the classic is the "dragon book" (http://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886).
Beyond that, VMs are a big topic. They include all of compilers, and almost all of systems programming. The Smith and Nair book (http://www.amazon.com/Virtual-Machines-Versatile-Platforms-Architecture/dp/1558609105) is a great jumping off point. But so is playing around with a project that means something to you. It depends what you find more rewarding.
I would strongly recommend reading this book
https://www.amazon.com/Clean-Architecture-Craftsmans-Software-Structure/dp/0134494164
It should be mandatory reading in all CS and SE degrees IMO.
It will not answer your specific question, but it will provide you with the tools and knowledge to understand how best to approach the problem and ensure your architecture and design are well thought through and draw on the learnings of those who have come before us.
-----
-----
BOOKS
Children Electronics and Electricity books:
Newbie Electronics books:
Basic Circuit Theory books:
Analog Design books:
Digital Design books:
(download old edition)
Digital Signal Processing books:
Computer Design books:
6502,
6800,
6809,
8080,
8085,
Z80,
68000,
x86
processors on Wikipedia.
8051,
ARM,
AVR,
PIC,
RISC-V
microcontrollers on Wikipedia.
Electronics Reference books:
Historical books:
-----
-----
MAGAZINES
Current Electronics Magazines: (subscribe now)
Historical Electronics Magazines: (archives)
Historical Computer Magazines: (archives)
"Kilobaud"
-----
Going to a private but non-profit institution; it's cool.
(As a matter of fact, a friend has friends who go to Bent State University, and after comparing physics homework we found that, well, our curriculum is much harder. I guess their intro final had a "draw a line to match the term to its definition" type thing.)
Anywho, one of our compsci upper-level courses is based on this book: http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938/ref=la_B000APBUAE_1_2?s=books&ie=UTF8&qid=1406428553&sr=1-2 It goes through and explains computer architecture for an actual CPU; I don't recall how easy it is to read, however. (If you buy it and it makes no sense, the intro book we used was called "An Invitation to Computer Science", but get an edition or two back from current if you buy.)
Finally, you can find a bunch of info here: http://ocw.mit.edu/index.htm
Initially I watched an episode of lunduke hour that featured a freeBSD dev https://www.youtube.com/watch?v=cofKxtIO3Is
I like the documentation available to help me learn. I got my hands on the FreeBSD Handbook and can't wait to get into the design and implementation textbook (Addison-Wesley, 928 pages): https://www.amazon.com/Design-Implementation-FreeBSD-Operating-System/dp/0321968972/ref=pd_lpo_sbs_14_t_0/143-0574353-4482766?_encoding=UTF8&psc=1
I appreciate the focus on servers and research computing that is BSD's strong suit.
> Yes, the Native Query and access to Connection is always THE Hibernate's answer to all the lacking support of basic SQL features like Window Functions or being able to count aggregated results.
That's a very common misconception. Hibernate is not a replacement for SQL. It's an alternative to the JDBC API that implements the enterprise patterns described by Martin Fowler in his book.
There are many alternatives to JPA or Hibernate. In fact, I'm also using jOOQ, and I like it a lot. I wrote about it. I'm using it in my training and workshops as well.
There are things you can do in jOOQ that you can't do with Hibernate, and there are also things you can do with Hibernate that you can't do with jOOQ.
There are books specifically on language design, syntax trees, and unambiguous grammars.
The classic book on compiler design is "the Dragon Book". Careful design matters because a statement in the language should mean exactly one thing, and the language should be able to be compiled efficiently. This is more difficult than it sounds.
Second, you need to understand language design, variable binding, etc. This is the subject of a programming language paradigms course. I'll figure out a good book for this and edit to add it. The best book probably covers languages like Ada, Haskell, C, and Java and gives an overview of their design and the reasons behind it.
edit: The book for design is Concepts of Programming Languages 9th ed, by Robert W. Sebesta.
C targets a virtual memory system and instruction set architecture (ISA). It's an abstraction over the hardware implementation of the ISA. Those worlds are just different, and you'll gain a better understanding if you just study them separately.
For computer architecture, I've found two books to be most helpful.
https://www.amazon.com/Digital-Design-Computer-Architecture-Harris-ebook/dp/B00HEHG7W2
https://www.amazon.com/Structured-Computer-Organization-Andrew-Tanenbaum/dp/0132916525/ref=sr_1_1?ie=UTF8&amp;qid=1536687062&amp;sr=8-1&amp;keywords=tanenbaum+computer+architecture&amp;dpID=41B7uYANs%252BL&amp;preST=_SX218_BO1,204,203,200_QL40_&amp;dpSrc=srch
There is a low-level operating systems book that uses a lot of C code to explain how to build a kernel. This might interest you:
https://www.amazon.com/Operating-System-Design-Approach-Second/dp/1498712436
The code you posted was generated from a grammar definition, here's a copy of it:
http://www.opensource.apple.com/source/bc/bc-21/bc/bc/bc.y
As such, to answer the question in your title, this is the best code you've ever seen, in the sense that it embodies some very powerful computer science concepts.
It [edit: the Bison parser generator] takes a definition of a language grammar in a high-level, domain-specific language (the link above) and converts it to a custom state machine (the generated code that you linked) that can extremely efficiently parse source code that conforms to the defined grammar.
This is actually a very deep topic, and what you are looking at here is the output of decades of computer science research, which all modern programming language compilers rely on. For more, the classic book on the subject is the so-called Dragon Book, Compilers: Principles, Techniques, and Tools.
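To make the idea concrete, here's a hedged miniature in Python (a hypothetical toy grammar, nothing to do with bc's actual grammar): the grammar rule, written as a comment, directly drives a tiny hand-written parser. Bison automates exactly this step, except it emits a table-driven LALR state machine instead of recursive calls.

```python
# Grammar-directed parsing in miniature (illustrative sketch only).
# Grammar:  expr -> term (('+'|'-') term)*    term -> NUMBER
import re

def tokenize(src):
    # Split input into numbers and +/- operators
    return re.findall(r"\d+|[+\-]", src)

def parse_expr(tokens):
    value = int(tokens.pop(0))          # term -> NUMBER
    while tokens:                       # (('+'|'-') term)*
        op = tokens.pop(0)
        rhs = int(tokens.pop(0))
        value = value + rhs if op == "+" else value - rhs
    return value

print(parse_expr(tokenize("1+2-3")))    # evaluates as it parses -> 0
```

A generated parser does the same grammar-following walk, just encoded as state-transition tables rather than function calls, which is why the generated code looks so alien.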
If you're interested in learning a surprising amount about this without needing heavy technical background, might I recommend a fantastic book, But How Do It Know?
I should point out that this (I'm pretty sure at least) comes from Randall Munroe's book Thing Explainer which uses the xkcd art style etc. to explain complicated concepts using the 1,000 most common English words. It's pretty great, check it out if you can!
Yes. Do it. It's great to know, and useful occasionally - especially grammars. The Dragon Book is the only college text I've kept.
https://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886/ref=pd_lpo_sbs_14_img_0?_encoding=UTF8&amp;psc=1&amp;refRID=6GT8HPHEKPGJX9GGVMNR
I have this as well, but don't really have any remarks for you. That said, maybe you should look through some of the reviews for it on Amazon or the like. The reviews there seem pretty authentic.
https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/
Thank you all for your responses! I have compiled a list below of books mentioned by at least three different people. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the", this was done by hand, and as a result it may contain errors.
edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's.
edit: Updated with links to Amazon.com. These are not affiliate - Amazon was picked because they provide the most uniform way to compare books.
edit: Updated up to redline6561
There seem to be two approaches to learning DSP: the mathematically rigorous approach and the conceptual approach. I think most university textbooks are the former. While I'm not going to understate the importance of understanding the mathematics behind DSP, it's less helpful if you don't have a general understanding of the concepts.
There are two books I can recommend that take a conceptual approach: The Scientist and Engineer's Guide to Digital Signal Processing, which is free. There's also Understanding Digital Signal Processing, which I've never seen a bad word about. It recently got its third edition.
Here is an interesting video where they build a cpu up from the transistor level.
The CPU is only a theoretical one called a "Scott CPU", which was designed by John Scott, who is the author of the book, But How Do It Know?, which is an amazingly straight-forward, easy-to-digest book about computing.
I would recommend it as it was the first thing I read that gave me a deep understanding of computers on an abstract level. It completely demystified them and got me well on my way to programming.
Edit: The video doesn't go down to the transistor level, just goes over each component of a CPU. The book does go down to the transistor level, however, and again, I would highly recommend it.
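For a taste of the book's approach, here's a minimal sketch (my own illustration, not the book's actual code) of its central idea: every logic gate, and eventually the whole CPU, can be built out of NAND gates.

```python
# Building the classic gates from NAND alone, as the book does with
# transistors and diagrams. Bits are represented as plain 0/1 ints.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

# A 1-bit half adder: the first step on the road to an ALU.
def half_adder(a, b):
    return xor_(a, b), and_(a, b)   # (sum, carry)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```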
I also recommend reading this book: Patterns of Enterprise Application Architecture
The following books would be good suggestions irrespective of the language you're developing in:
Patterns of Enterprise Application Architecture was certainly an eye-opener on first read-through, and remains a much-thumbed reference.
Domain-Driven Design is of a similar vein & quality.
Refactoring - another fantastic Martin Fowler book.
I thought Programming Arduino: Getting Started with Sketches by Simon Monk was very good. It starts from the very basics of what a microcontroller is and the concepts of how it works, then steps you through the example sketches in the Arduino IDE, explaining how and why they work. It's written in a way that's very easy to understand, even for the absolute beginner.
Once you've gone through those you'll have a good understanding of what is and isn't possible and how to make your own projects around it. After that Google.
Do you want to know which parts make up an OS, or how it actually runs at runtime? The former is easy: just install Arch, Gentoo, or Linux From Scratch. The latter is a lot more complicated nowadays, but https://www.amazon.de/Operating-Systems-Implementation-Prentice-Software/dp/0131429388/ref=tmm_hrd_swatch_0?_encoding=UTF8&amp;qid=&amp;sr= is a great start, or there's https://wiki.osdev.org/Tutorials if you want to go deep.
If you do both and combine the knowledge, your beard will grow 100x.
Source: I did but I am hairless 0*100 = 0 :/.
It's a little old, but Operating Systems Design and Implementation by Andrew Tanenbaum is a pretty good book, and it includes a lot of the MINIX source code.
LOL it seems interesting to me. I'm reading https://www.amazon.ca/Design-Implementation-FreeBSD-Operating-System/dp/0321968972/ref=dp_ob_title_bk right now.
Maybe it's good in theory, and not in practice.
In that pile-o-stuff there are really two main subjects: architecture and operating systems. I'd pick up recent copies of the dinosaur book and where's waldo. Silberschatz and Tanenbaum are seminal authors on both subjects.
There are numerous resources to learn C. Since I seem to be recommending books, Kernighan and Ritchie's book is pretty much the gold standard.
Good luck.
Oppenheim & Schafer is the usual standard text, as others have said. However, it's pretty theory-intensive and may not be that much of an improvement over your current book, if you are looking for alternative explanations.
I'd say you should look at Lyons' Understanding Digital Signal Processing instead of O&S. Also the Steven Smith guide that mostly_complaints mentioned is very accessible. Between Smith and Lyons you will get most of the knowledge that you need to actually do useful DSP work, if not pass a test in it.
If you are looking for a very easy to read introduction to how computers work, I can recommend the book "But How Do It Know?". Strange title, but book is great. https://www.amazon.com/But-How-Know-Principles-Computers/dp/0615303765
> Writing an interpreter is an order of magnitude easier for a beginner, especially if they write it in a high level language like Scheme, OCaml or Haskell.
Depends on your target language. If you're compiling to a reasonably high level language (like C or ActionScript) and don't care much about optimization, they're basically the same thing. Instead of maintaining a stack of environments and evaluating to a value, you maintain a stack of symbol tables and evaluate to a program fragment in the target language.
If you need to do instruction selection, calculate branch offsets, or do any sort of optimization, it gets more complicated. But you can still find nearly everything you need between the Dragon Book, Steven Muchnick's Advanced Compiler Design and Implementation, and Appel's Modern Compiler Implementation in ML.
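The interpreter/compiler parallel described above can be sketched in a few lines of Python (an assumed toy AST, not from any real compiler): the interpreter walks the tree and evaluates to a value, while the compiler walks the same tree and evaluates to a program fragment in the target language.

```python
# Same traversal, different result type: a value vs. a target-language string.
# Nodes are tuples: ("num", n), ("var", name), ("add", lhs, rhs).
def interp(node, env):
    kind = node[0]
    if kind == "num": return node[1]
    if kind == "var": return env[node[1]]           # environment lookup
    if kind == "add": return interp(node[1], env) + interp(node[2], env)

def compile_to_c(node, symtab):
    kind = node[0]
    if kind == "num": return str(node[1])
    if kind == "var": return symtab[node[1]]        # symbol-table lookup
    if kind == "add": return "(%s + %s)" % (compile_to_c(node[1], symtab),
                                            compile_to_c(node[2], symtab))

ast = ("add", ("var", "x"), ("num", 2))
print(interp(ast, {"x": 40}))           # 42
print(compile_to_c(ast, {"x": "x0"}))   # (x0 + 2)
```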
> A good tutorial, if you know Haskell
There's also ArcLite, which implements a Lisp in JavaScript.
Also note that it's possible to blur the lines between compiler and interpreter significantly. For example, one of the exercises in SICP splits the evaluator into one pass to analyze it into closures that do whatever needs to be done with the environment, and then a second pass that actually evaluates that function with the provided environment.
Another trick you can do is to depend on the data structures of the host language for some of your processing. For example, variable lookup in ArcLite is implemented in C. How? Each activation record is a JavaScript object, and then the __proto__ property is used as the static link to knit them together. So when the interpreter goes to look up a variable, it's just a hash lookup, and then the JavaScript runtime automatically consults that activation record's prototype if it fails, exactly as if it were looking up a normal JavaScript variable.
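A rough Python analog of that trick (an illustration of the idea, not ArcLite's actual implementation) can be built with `collections.ChainMap`, which plays the role of the `__proto__` chain: each activation record is a dict linked to its enclosing scope, and a variable lookup walks the chain automatically using the host language's own machinery.

```python
# Host-language scope chains standing in for interpreter environments.
from collections import ChainMap

globals_ = {"x": 1, "y": 2}
frame1 = ChainMap({"y": 20}, globals_)   # a call frame that shadows y
frame2 = frame1.new_child({"z": 300})    # a nested frame on top of frame1

print(frame2["z"])  # 300: found in the innermost frame
print(frame2["y"])  # 20:  found one level up, shadowing the global
print(frame2["x"])  # 1:   falls all the way through to globals
```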
Thing Explainer by Randall Munroe is my favorite coffee table book. (He’s the guy that makes XKCD comics.)
Giant detailed drawings of complex things explained using common language, and a candy coating of humor. Really fun book!
The good news is that MIPS is possibly the most straightforward assembly language to code or read. Compared to x86, it's a dream.
If your class is computer architecture related, you will probably be learning from Hennessy and Patterson, which is an excellent book covering hardware and software.
If you want to learn MIPS from a software perspective, check out the classic See MIPS Run.
Can't use free Eagle for this (the board's too big), but KiCad or probably other tools would work. With a few good books you can lay out a big board without advanced tools, although it can take longer. With cheap/free tools you'll usually have to use some finicky or kludgy methods to do really complex routing (blind/buried vias, free vias, heat transfer, trace-length matching), but that usually isn't too big a deal. Here's a timelapse of a guy using Altium to route a high-speed, large (a bit smaller than OP's) data board for a high-speed camera. The description has rough steps with timestamps - 38 hours total to lay out.
No problem. Good luck finding a class, compilers are a really fun subject!
If you are just generally interested (I don't know your experience level), the Dragon Book is still highly regarded and might be a good entryway into the theory of it all.
If you're looking to expand your knowledge and learn some of the more advanced concepts, then read the following books:
Here's some leaders in the EMC field, to start:
Henry Ott - lots of good info on his website (and his book is a classic)
[Eric Bogatin](https://www.bethesignal.com/bogatin/)
Howard Johnson (not the hotel chain) - High-Speed Digital Design: A Handbook of Black Magic
If you understand how a VNA and TX lines work, you are most of the way there to understanding signal integrity.
For designing programming languages, my favorites are
If you only get one then go with Pierce. But If you want to get serious about semantics, then Winskel is the way to go.
On the implementation side my favorites are
My compilers course in college used the Dragon Book, which is one of the quintessential books on the subject.
But you might also consider Basics of Compiler Design which is a good and freely available resource.
I'd also suggest that you have familiarity with formal languages and automata, preferably through a Theory of Computation course (Sipser's Introduction to the Theory of Computation is a good resource). But these texts provide a brief primer.
I'd suggest an Arduino Uno to start out. I don't know what the kits include, but I'd suggest against them; you'll do better ordering your parts for projects separately so you only buy what you need. I'd suggest starting with an LED cube: it's easy to solder and there are code sequences already written for patterns. You'll need a soldering iron. I'd suggest this book also.
http://www.amazon.com/gp/aw/d/0071784225/ref=pd_aw_sims_5?pi=SL500_SY115&amp;simLd=1
The books referenced by the most presenters and PCB design conferences are
Right the First Time by Lee Ritchey http://www.thehighspeeddesignbook.com/
High-Speed Digital Design: A Handbook of Black Magic - Howard Johnson https://www.amazon.ca/High-Speed-Digital-Design-Handbook/dp/0133957241
Note that A Handbook of Black Magic reads like a textbook; it is very long and very boring.
The subject of PCB design is complicated and requires an in-depth understanding of the physics, because just knowing the rules isn't enough to convince other engineers that something is the right way to do it. More importantly, in my experience PCB design is always about the least bad solution: you have to understand when you can break the rules, what the implications will be, and whether the trade-off is acceptable.
I recommend you grab this book; I used it at university and it gives a pretty good explanation of how computers work.
That being said, you would need input from materials physicists, electronics engineers, chemists, and a bunch of other professionals to really understand how a computer works. It is a complex machine, and building one combines knowledge from many, many disciplines.
Yeah, Su's not just reading off a script. Her English has come really far; she's at the point of having enough vocabulary to feel like she can express what she wants to express once she picks the right words out of her dictionary. So she still has to do a translation of concepts into a smaller set of words (sort of like the book Thing Explainer) but she's got the confidence to do so.
Actually I would suggest you start by learning OOP and maybe investigate the MVC design pattern, since those are both subjects which the average CodeIgniter user will be quite inexperienced in. While you might keep on "learning" frameworks, it is much more important to actually learn programming.
Here are a few lectures that might help you with it:
This should give you some overview of the subject .. yeah, it's a lot. But I suspect that you will prefer videos over books. Otherwise, some reading materials:
You will notice that a lot of materials are language-agnostic. That's because the theory, for class-based object oriented languages, is the same.
Also I would recommend for you to look into following concepts of object oriented programming:
[2]: http://www.youtube.com/watch?v=4F72VULWFvc
[3]: https://vimeo.com/21173483
[4]: http://qafoo.com/talks/11_02_phpuk_advanced_oo_patterns.pdf
[5]: http://www.youtube.com/watch?v=wEhu57pih5w
[6]: http://www.infoq.com/presentations/principles-agile-oo-design
[7]: http://www.youtube.com/watch?v=-FRm3VPhseI
[8]: http://www.youtube.com/watch?v=RlfLCWKxHJ0
[9]: https://vimeo.com/21145583
[10]: http://www.slideshare.net/stuartherbert/beyond-frameworks
[11]: https://vimeo.com/20610390
[12]: http://www.slideshare.net/sebastian_bergmann/agility-and-quality-php-uk-2011
[13]: https://vimeo.com/13439458
[14]: https://vimeo.com/12643301
[15]: http://www.amazon.com/PHP-Object-Oriented-Solutions-David-Powers/dp/1430210117
[16]: http://www.amazon.com/Design-Patterns-Explained-Perspective-Object-Oriented/dp/0201715945
Have you seen the MIPS WikiBooks site?
It's not complete, but after a cursory inspection it looks as though it may be useful for beginners.
What textbook is your class using? I believe that the textbook my course assigned (though I may or may not be able to recommend it as I may or may not have ever read it...) was this one.
Clean Code: Clean Code
Clean Architecture: Clean Arch
Just picked these two up myself.. not sure if it's what you are looking for, but they seem to be very valuable for software design as a whole.
If you want to know how interpreters/compilers work:
Bonus: http://journal.stuffwithstuff.com/2011/03/19/pratt-parsers-expression-parsing-made-easy/
Edit: have fun!
I started early on with Arduino and moved into lower level embedded with the stm32 discovery line of development boards. Attached link of a good starting board that has tons of example code from ST.
https://www.mouser.com/ProductDetail/STMicroelectronics/STM32F407G-DISC1?qs=mKNKSX85ZJejxc9JOGT45A%3D%3D
If you want a decent intro book on embedded topics, this one does a good job of introducing the different parts of working on an embedded project:
https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149
Between the Arduino and the Pi, the Arduino is more representative of an embedded device. It introduces you to a resource-constrained system, which can be a good starting point for working with digital and analog I/Os, and it lets you hook up to communication-bus peripherals that use UART, I2C, and SPI. The biggest problem is that it will not immediately introduce you to debugging and standard compilation tools. However, Arduino has been a starting point for many developers. Good luck!
Don't give up just yet, keep looking.
Do you have a portfolio? If not, try to work on a project of your own so you can have something to show.
And if you are considering improving your Java skills, try working with libraries like:
With spare time I would also recommend you to read:
Thanks for your advice. The O'Reilly book you mentioned is this one? (Building Microservices, Sam Newman). And could you send me some material that you like, please? (Blog posts included.)
Start with the Dragon Book.
When it actually comes time to implement the language, I would recommend just writing the frontend and reusing the backend from another compiler. LLVM is a good option (it's becoming popular to use as a backend, it now has frontends for C, C++, Objective C, Java, D, Pure, Hydra, Scheme, Rust, etc). See here for a case study on how to write a compiler using LLVM as the backend.
This is a good book on the subject. I would personally work with a 4-layer board with a GND and VCC layer. It sounds like you already have a bunch of layers as it is so yes I would recommend a VCC layer.
This book Structured Computer Organization is also very good at explaining in detail how the computer works, it's the one I used in college... Pretty expensive, I know, but at least the cover has nice drawings!
Much of your thinking seems to be based on a confusion of levels. If you knew more specifically how the firing together of neurons strengthens the probability they'll fire together in the future; or if you'd examined a program simulating physics, you wouldn't be using confusion as building blocks for arguments.
For instance, you would not be as confused right here if you were a systems developer instead of a philosopher; one read-through of the Dragon Book would clear everything right up. I'll try to summarize, but please understand this is not rigorous:
Your mind is running the algorithm "Step 1: Move to front of house. Step 2: Move to back of house. Step 3: Go to Step 1." Your mind is something your brain does. Your brain is implemented on physics. Exactly like the boulder.
The most legitimate question related to this post is that of substrate. Note: I do not agree with everything in this essay, but it presents the problem better than writings on "dust theory" (unless you're willing to read the whole Greg Egan novel Permutation City).
Your foes are kids in their twenties with a degree that takes years to achieve, so this will be tough! But I think your age and your willingness to learn will help you a lot.
Other things to learn:
If there's one framework to look at, it would be spring: spring.io provides dozens of frameworks, for webservices, backends, websites, and so on, but mainly their core technology for dependency injection.
(edit: other important things)
The fact that you mentioned that it'd be cool to work on a DAW tells me that you want to go low level. What you want to study is digital signal processing, or DSP. I recommend Understanding Digital Signal Processing. Also watch this talk by Timur Doumler, or anything by him. I recommend that you pick a programming language and try to output a sine wave to the speakers, then go on from there.
Also check those out:
https://theaudioprogrammer.com/
https://jackschaedler.github.io/circles-sines-signals/
https://blog.demofox.org/#Audio
Good luck.
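For anyone wanting to try the sine-wave exercise, here's a minimal sketch using only the Python standard library (the file name, frequency, and amplitude are arbitrary choices). Playing the result through the speakers is then up to your OS or an audio library.

```python
# Generate one second of a 440 Hz tone and write it as 16-bit PCM WAV.
import math
import struct
import wave

RATE = 44100   # samples per second
FREQ = 440.0   # A4
AMP  = 0.5     # amplitude in 0..1, leaving headroom

# One second of samples: s[n] = A * sin(2*pi*f*n / rate)
samples = [AMP * math.sin(2 * math.pi * FREQ * n / RATE) for n in range(RATE)]

with wave.open("tone.wav", "wb") as w:
    w.setnchannels(1)        # mono
    w.setsampwidth(2)        # 2 bytes = 16-bit samples
    w.setframerate(RATE)
    w.writeframes(b"".join(
        struct.pack("<h", int(s * 32767)) for s in samples))
```

Once this works, the natural next steps are mixing two tones, applying an envelope, and then a simple filter, which is where the DSP books above pick up.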
I think The Scientist and Engineer's Guide to Digital Signal Processing and Understanding Digital Signal Processing are generally considered the most accessible introductions. I've gotten more mileage out of Understanding DSP; I feel like it goes into a little more detail and really works to walk you through concepts, step by step.
http://www.dspguide.com/pdfbook.htm
https://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419
Aside from searching out good learning resources, IMO nothing is more helpful for learning than setting up your environment with Matlab, Jupyter notebooks, or whatever you're going to use, and getting comfortable with the tools you'll be using to explore these topics.
Yeah, he ate a lot of the front cover and destroyed the first 20-30 pages of my hardback HP and the Deathly Hallows. But he removed the dust jacket first without damaging it, so at least I can put that on and cover the damage.
He also destroyed the Thing Explainer by Randall Munroe, which I highly recommend as a gift for anyone, including kids, who likes cool drawings and nerdy things. Or maybe for dogs who eat hardback books. My dog found it extra tasty and super chewy.
Hey, last year I followed a course in Operating Systems where we used MINIX as an example OS, which is one of the most understandable OS's out there, and great for learning.
This is a good (and quite pricey, unfortunately) book for MINIX and Operating Systems in general.
I'd check out these two books from the local library and read the first 2-3 chapters. It might contain more than what you need, but these are pretty well written books and don't assume a lot of previous knowledge.
http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210
http://www.amazon.com/Computer-Networks-5th-Andrew-Tanenbaum/dp/0132126958/ref=la_B000AQ1UBW_sp-atf_title_1_1?s=books&amp;ie=UTF8&amp;qid=1376126566&amp;sr=1-1
Or you could just check out your network settings and search for the terms that you encounter (IP address, DNS, DHCP, gateway, proxy, router, firewall)
Make use of that intelligence and get her an Arduino Uno.
She'll be able to make anything from simple robots to a light up dress that changes colors. A simple guide to get started will help as well. Guide.
I highly recommend Structured Computer Organization by Andrew Tanenbaum. It is a great book for beginners
OOP is a very general concept, and it doesn't go much further than the SOLID principles. As for how this actually gets done, that is somewhat a matter of opinion and open to interpretation. I'm a fan of this book:
Clean Architecture A Craftsman's Guide to Software Structure and Design
It has very strong opinions and because of that it gives a consistent message and direction. It should by no means be the only opinion or book you take in to account (learn from as many people as you can). But it's a very good start.
I literally have a book called "A Handbook of Black Magic". It's a little old, but it's still one of the best books on the subject.
For compilers:
Types and Programming Languages by Pierce is also a must read for theory people.
Almost every PCB/EDA package does length matching automatically, so you don't need to worry about that. If you want to know how the software does it, it's more of a mathematical problem. I think they use parametric curves like Beziers: you can calculate the length of a Bezier curve easily, so you can match them.
https://en.wikipedia.org/wiki/B%C3%A9zier_curve
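To make the length calculation concrete, here's a rough Python sketch (a naive sampling approximation; real EDA tools presumably use something more refined, but the idea is the same):

```python
# Approximate the arc length of a cubic Bezier by summing the distances
# between many closely spaced points along the curve.
import math

def bezier_point(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier (Bernstein form) at parameter t in [0, 1]."""
    u = 1 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

def bezier_length(p0, p1, p2, p3, steps=1000):
    pts = [bezier_point(p0, p1, p2, p3, i / steps) for i in range(steps + 1)]
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

# Sanity check: collinear control points trace a straight 10-unit segment.
print(round(bezier_length((0, 0), (3, 0), (7, 0), (10, 0)), 3))
```

With a length function like this, matching two traces reduces to tweaking the serpentine's control points until the lengths agree within tolerance.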
If you wanna know more about high-speed PCB design, I recommend this book.
https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241
Clean Architecture, for sure.
Clean Architecture: A Craftsman's Guide to Software Structure and Design (Robert C. Martin Series) https://www.amazon.com/dp/0134494164/ref=cm_sw_r_cp_apa_i_yJ83Db51V9ZZS
I will also recommend High-Speed Digital Design: A Handbook of Black Magic; it definitely has some good stuff!
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_tai_O05TBb9HPRG90#
Highly endorsed, first book I read out of school:
Code Complete - Steve McConnell
Bonus, engineers at my office were just given this book as recommended reading:
Clean Architecture - Robert C. Martin
There aren't any that I'd recommend, unfortunately.
This book is not specifically about embedded C, but about embedded in general:
https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149
Anything by Jack Ganssle is good as well.