Reddit mentions: The best computer hardware & DIY books

We found 1,470 Reddit comments discussing the best computer hardware & diy books. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 449 products and ranked them based on the amount of positive reactions they received. Here are the top 20.

1. Compilers: Principles, Techniques, and Tools (2nd Edition)

Specs:
Height 9.52754 Inches
Length 6.49605 Inches
Number of items 1
Weight 3.2848877038 Pounds
Width 1.73228 Inches

3. Clean Architecture: A Craftsman's Guide to Software Structure and Design (Robert C. Martin Series)

Specs:
Height 9 Inches
Length 0.8 Inches
Number of items 1
Release date September 2017
Weight 1.4109584768 Pounds
Width 6.9 Inches

6. Compilers: Principles, Techniques, and Tools

Specs:
Height 9.5 Inches
Length 6.75 Inches
Number of items 1
Weight 2.61468242732 Pounds
Width 1.5 Inches

7. Thing Explainer: Complicated Stuff in Simple Words

    Features:
  • Houghton Mifflin Harcourt
Specs:
Height 13 Inches
Length 9 Inches
Number of items 1
Release date November 2015
Weight 1.8 Pounds
Width 0.585 Inches

8. Computer Organization and Design: The Hardware/Software Interface (The Morgan Kaufmann Series in Computer Architecture and Design)

Specs:
Height 9 Inches
Length 7.25 Inches
Number of items 1
Weight 3.48109911698 Pounds
Width 1.75 Inches

9. Building Microservices: Designing Fine-Grained Systems

    Features:
  • O'Reilly Media
Specs:
Height 9.19 Inches
Length 7 Inches
Number of items 1
Weight 1.04499112188 Pounds
Width 0.59 Inches

10. The Design and Implementation of the FreeBSD Operating System (2nd Edition)

    Features:
  • Addison-Wesley Professional
Specs:
Height 9.55 Inches
Length 6.75 Inches
Number of items 1
Weight 3.527396192 Pounds
Width 2.15 Inches

12. Programming Arduino: Getting Started With Sketches

Specs:
Height 8 Inches
Length 5.5 Inches
Number of items 1
Weight 0.4519476371 Pounds
Width 0.25 Inches

13. Understanding Digital Signal Processing (3rd Edition)

Specs:
Height 1.4 Inches
Length 9.2 Inches
Number of items 1
Weight 3.747858454 Pounds
Width 7.1 Inches

14. But How Do It Know? - The Basic Principles of Computers for Everyone

Specs:
Height 9 Inches
Length 6 Inches
Number of items 1
Weight 0.6503636729 Pounds
Width 0.5 Inches

16. Structured Computer Organization (6th Edition)

    Features:
  • Used Book in Good Condition
Specs:
Height 9.2 Inches
Length 7 Inches
Number of items 1
Weight 2.4471311082 Pounds
Width 1.2 Inches

17. The Scientist & Engineer's Guide to Digital Signal Processing

    Features:
  • Used Book in Good Condition
Specs:
Number of items 1

18. Making Embedded Systems: Design Patterns For Great Software

    Features:
  • O'Reilly Media
Specs:
Height 9.19 Inches
Length 7 Inches
Number of items 1
Release date November 2011
Weight 1.18 Pounds
Width 0.76 Inches

19. Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture

    Features:
  • Used Book in Good Condition
Specs:
Height 9.25 Inches
Length 7 Inches
Number of items 1
Release date December 2006
Weight 1.67 Pounds
Width 1 Inches

20. An Introduction to Quantum Computing

    Features:
  • Oxford University Press USA
Specs:
Height 0.57 Inches
Length 9.2 Inches
Number of items 1
Release date January 2007
Weight 0.97444319804 Pounds
Width 6.24 Inches

🎓 Reddit experts on computer hardware & DIY books

The comments and opinions expressed on this page are written exclusively by redditors. To provide you with the most relevant data, we sourced opinions from the most knowledgeable Reddit users, based on the total number of upvotes and downvotes received across comments on subreddits where computer hardware & DIY books are discussed. For your reference and for the sake of transparency, here are the specialists whose opinions mattered the most in our ranking.
Total score: 808 · Number of comments: 8 · Relevant subreddits: 1
Total score: 144 · Number of comments: 19 · Relevant subreddits: 4
Total score: 136 · Number of comments: 17 · Relevant subreddits: 2
Total score: 48 · Number of comments: 23 · Relevant subreddits: 2
Total score: 48 · Number of comments: 6 · Relevant subreddits: 1
Total score: 38 · Number of comments: 6 · Relevant subreddits: 1
Total score: 24 · Number of comments: 6 · Relevant subreddits: 2
Total score: 20 · Number of comments: 14 · Relevant subreddits: 4
Total score: 17 · Number of comments: 8 · Relevant subreddits: 1
Total score: 14 · Number of comments: 8 · Relevant subreddits: 2


Top Reddit comments about Computer Hardware & DIY:

u/empleadoEstatalBot · 1 point · r/argentina

> It’s hard to consolidate databases theory without writing a good amount of code. CS 186 students add features to Spark, which is a reasonable project, but we suggest just writing a simple relational database management system from scratch. It will not be feature rich, of course, but even writing the most rudimentary version of every aspect of a typical RDBMS will be illuminating.
>
> Finally, data modeling is a neglected and poorly taught aspect of working with databases. Our suggested book on the topic is Data and Reality: A Timeless Perspective on Perceiving and Managing Information in Our Imprecise World.
>
>
>
>
>
> ### Languages and Compilers
>
> Most programmers learn languages, whereas most computer scientists learn about languages. This gives the computer scientist a distinct advantage over the programmer, even in the domain of programming! Their knowledge generalizes; they are able to understand the operation of a new language more deeply and quickly than those who have merely learnt specific languages.
>
> The canonical introductory text is Compilers: Principles, Techniques & Tools, commonly called “the Dragon Book”. Unfortunately, it’s not designed for self-study, but rather for instructors to pick out 1-2 semesters’ worth of topics for their courses. It’s almost essential, then, that you cherry-pick the topics, ideally with the help of a mentor.
>
> If you choose to use the Dragon Book for self-study, we recommend following a video lecture series for structure, then dipping into the Dragon Book as needed for more depth. Our recommended online course is Alex Aiken’s, available from Stanford’s MOOC platform Lagunita.
>
> As a potential alternative to the Dragon Book we suggest Language Implementation Patterns by Terence Parr. It is written more directly for the practicing software engineer who intends to work on small language projects like DSLs, which may make it more practical for your purposes. Of course, it sacrifices some valuable theory to do so.
>
> For project work, we suggest writing a compiler either for a simple teaching language like COOL, or for a subset of a language that interests you. Those who find such a project daunting could start with Make a Lisp, which steps you through the project.
>
>
>
> [Compilers: Principles, Techniques & Tools](https://teachyourselfcs.com//dragon.jpg) [Language Implementation Patterns](https://teachyourselfcs.com//parr.jpg)
>
> Don’t be a boilerplate programmer. Instead, build tools for users and other programmers. Take historical note of textile and steel industries: do you want to build machines and tools, or do you want to operate those machines?
>
> — Ras Bodik at the start of his compilers course
>
>
>
>
>
> ### Distributed Systems
>
> As computers have increased in number, they have also spread. Whereas businesses would previously purchase larger and larger mainframes, it’s typical now for even very small applications to run across multiple machines. Distributed systems is the study of how to reason about the tradeoffs involved in doing so, an increasingly important skill.
>
> Our suggested textbook for self-study is Maarten van Steen and Andrew Tanenbaum’s Distributed Systems, 3rd Edition. It’s a great improvement over the previous edition, and is available for free online thanks to the generosity of its authors. Given that distributed systems is a rapidly changing field, no textbook will serve as a trail guide, but Maarten van Steen’s is the best overview we’ve seen of well-established foundations.
>
> A good course for which some videos are online is MIT’s 6.824 (a graduate course), but unfortunately the audio quality in the recordings is poor, and it’s not clear if the recordings were authorized.
>
> No matter the choice of textbook or other secondary resources, study of distributed systems absolutely mandates reading papers. A good list is here, and we would highly encourage attending your local Papers We Love chapter.
>
>
>
> [Distributed Systems 3rd edition](https://teachyourselfcs.com//distsys.png)
>
>
>
> ## Frequently asked questions
>
> #### What about AI/graphics/pet-topic-X?
>
> We’ve tried to limit our list to computer science topics that we feel every practicing software engineer should know, irrespective of specialty or industry. With this foundation, you’ll be in a much better position to pick up textbooks or papers and learn the core concepts without much guidance. Here are our suggested starting points for a couple of common “electives”:
>
> - For artificial intelligence: do Berkeley’s intro to AI course by watching the videos and completing the excellent Pacman projects. As a textbook, use Russell and Norvig’s Artificial Intelligence: A Modern Approach.
> - For machine learning: do Andrew Ng’s Coursera course. Be patient, and make sure you understand the fundamentals before racing off to shiny new topics like deep learning.
> - For computer graphics: work through Berkeley’s CS 184 material, and use Computer Graphics: Principles and Practice as a textbook.
>
> #### How strict is the suggested sequencing?
>
> Realistically, all of these subjects have a significant amount of overlap, and refer to one another cyclically. Take for instance the relationship between discrete math and algorithms: learning math first would help you analyze and understand your algorithms in greater depth, but learning algorithms first would provide greater motivation and context for discrete math. Ideally, you’d revisit both of these topics many times throughout your career.
>
> As such, our suggested sequencing is mostly there to help you just get started… if you have a compelling reason to prefer a different sequence, then go for it. The most significant “pre-requisites” in our opinion are: computer architecture before operating systems or databases, and networking and operating systems before distributed systems.
>
> #### Who is the target audience for this guide?
>
> We have in mind that you are a self-taught software engineer, bootcamp grad or precocious high school student, or a college student looking to supplement your formal education with some self-study. The question of when to embark upon this journey is an entirely personal one, but most people tend to benefit from having some professional experience before diving too deep into CS theory. For instance, we notice that students love learning about database systems if they have already worked with databases professionally, or about computer networking if they’ve worked on a web project or two.
>
> #### How does this compare to Open Source Society or freeCodeCamp curricula?
>
> The OSS guide has too many subjects, suggests inferior resources for many of them, and provides no rationale or guidance around why or what aspects of particular courses are valuable. We strove to limit our list of courses to those which you really should know as a software engineer, irrespective of your specialty, and to help you understand why each course is included.
>
> freeCodeCamp is focused mostly on programming, not computer science. For why you might want to learn computer science, see above.
>
> #### What about language X?
>
> Learning a particular programming language is on a totally different plane to learning about an area of computer science — learning a language is much easier and much less valuable. If you already know a couple of languages, we strongly suggest simply following our guide and fitting language acquisition in the gaps, or leaving it for afterwards. If you’ve learned programming well (such as through Structure and Interpretation of Computer Programs), and especially if you have learned compilers, it should take you little more than a weekend to learn the essentials of a new language.
>
> #### What about trendy technology X?
>

> (continues in next comment)

u/Cohesionless · 17 points · r/cscareerquestions

The resource seems extensive enough that it should serve you well on the way to becoming a good software engineer. I hope you don't get exhausted by it. I understand that some people can "hack" the technical interview process by memorizing a plethora of computer science and software engineering knowledge, but I hope you pay close attention to the important theoretical topics.

If you want a list of books to read over the summer to build a strong computer science and software engineering foundation, then I recommend reading the following:

  • Introduction to Algorithms, 3rd Edition: https://www.amazon.com/Introduction-Algorithms-3rd-MIT-Press/dp/0262033844. A lot of people do not like this classic book because it is very theoretical, very mathematical, and very abstract, but I think that is its greatest strength. I find a lot of algorithms books either focus too much on how to implement an algorithm in a certain language or underplay the theoretical foundation of the algorithm, such that their readers can only recite the algorithms to their interviewers. This book forced me to think algorithmically to be able to design my own algorithms from all the techniques and concepts learned to solve very diverse problems.

  • Design Patterns: Elements of Reusable Object-Oriented Software, 1st Edition: https://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/. This is the original book on object-oriented design patterns. There are other more accessible books to read for this topic, but this is a classic. I don't mind if you replace this book with another.

  • Clean Code: A Handbook of Agile Software Craftsmanship, 1st Edition: https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882. This is the classic book that teaches software engineers how to write clean code. A lot of best practices in software engineering are derived from this book.

  • Java Concurrency in Practice, 1st Edition: https://www.amazon.com/Java-Concurrency-Practice-Brian-Goetz/dp/0321349601. As a software engineer, you need to understand concurrent programming. These days there are various great concurrency abstractions, but I believe everyone should know how to use low-level threads and locks.

  • The Architecture of Open Source Applications: http://aosabook.org/en/index.html. This website features 4 volumes of books available to purchase or to read online for free. Its content focuses on over 75 case studies of widely used open-source projects, often written by the creators of said projects, about the design decisions and the like that went into creating their popular projects. It is inspired by this statement: "Architects look at thousands of buildings during their training, and study critiques of those buildings written by masters."

  • Patterns of Enterprise Application Architecture, 1st Edition: https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/. This is a good read to start learning how to architect large applications.

    The general theme of this list of books is to teach a hierarchy of abstract solutions, techniques, patterns, heuristics, and advice which can be applied to all fields in software engineering to solve a wide variety of problems. I believe a great software engineer should never be blocked by the availability of tools. Tools come and go, so I hope software engineers have strong problem solving skills, trained in computer science theory, to be the person who can create the next big tools to solve their problems. Nonetheless, a software engineer should not reinvent the wheel by recreating solutions to well-solved problems, but I think a great software engineer can be the person to invent the wheel when problems are not well-solved by the industry.

    P.S. It's also a lot of fun being able to create the tools everyone uses; I had a lot of fun by implementing Promises and Futures for a programming language or writing my own implementation of Cassandra, a distributed database.
u/CSMastermind · 4 points · r/learnprogramming

I've posted this before but I'll repost it here:

Now in terms of the question that you ask in the title - this is what I recommend:

Job Interview Prep


  1. Cracking the Coding Interview: 189 Programming Questions and Solutions
  2. Programming Interviews Exposed: Coding Your Way Through the Interview
  3. Introduction to Algorithms
  4. The Algorithm Design Manual
  5. Effective Java
  6. Concurrent Programming in Java™: Design Principles and Patterns
  7. Modern Operating Systems
  8. Programming Pearls
  9. Discrete Mathematics for Computer Scientists

    Junior Software Engineer Reading List


    Read This First


  10. Pragmatic Thinking and Learning: Refactor Your Wetware

    Fundamentals


  11. Code Complete: A Practical Handbook of Software Construction
  12. Software Estimation: Demystifying the Black Art
  13. Software Engineering: A Practitioner's Approach
  14. Refactoring: Improving the Design of Existing Code
  15. Coder to Developer: Tools and Strategies for Delivering Your Software
  16. Perfect Software: And Other Illusions about Testing
  17. Getting Real: The Smarter, Faster, Easier Way to Build a Successful Web Application

    Understanding Professional Software Environments


  18. Agile Software Development: The Cooperative Game
  19. Software Project Survival Guide
  20. The Best Software Writing I: Selected and Introduced by Joel Spolsky
  21. Debugging the Development Process: Practical Strategies for Staying Focused, Hitting Ship Dates, and Building Solid Teams
  22. Rapid Development: Taming Wild Software Schedules
  23. Peopleware: Productive Projects and Teams

    Mentality


  24. Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency
  25. Against Method
  26. The Passionate Programmer: Creating a Remarkable Career in Software Development

    History


  27. The Mythical Man-Month: Essays on Software Engineering
  28. Computing Calamities: Lessons Learned from Products, Projects, and Companies That Failed
  29. The Deadline: A Novel About Project Management

    Mid Level Software Engineer Reading List


    Read This First


  30. Personal Development for Smart People: The Conscious Pursuit of Personal Growth

    Fundamentals


  31. The Clean Coder: A Code of Conduct for Professional Programmers
  32. Clean Code: A Handbook of Agile Software Craftsmanship
  33. Solid Code
  34. Code Craft: The Practice of Writing Excellent Code
  35. Software Craftsmanship: The New Imperative
  36. Writing Solid Code

    Software Design


  37. Head First Design Patterns: A Brain-Friendly Guide
  38. Design Patterns: Elements of Reusable Object-Oriented Software
  39. Domain-Driven Design: Tackling Complexity in the Heart of Software
  40. Domain-Driven Design Distilled
  41. Design Patterns Explained: A New Perspective on Object-Oriented Design
  42. Design Patterns in C# - Even though this is specific to C#, the patterns can be used in any OO language.
  43. Refactoring to Patterns

    Software Engineering Skill Sets


  44. Building Microservices: Designing Fine-Grained Systems
  45. Software Factories: Assembling Applications with Patterns, Models, Frameworks, and Tools
  46. NoEstimates: How To Measure Project Progress Without Estimating
  47. Object-Oriented Software Construction
  48. The Art of Software Testing
  49. Release It!: Design and Deploy Production-Ready Software
  50. Working Effectively with Legacy Code
  51. Test Driven Development: By Example

    Databases


  52. Database System Concepts
  53. Database Management Systems
  54. Foundation for Object / Relational Databases: The Third Manifesto
  55. Refactoring Databases: Evolutionary Database Design
  56. Data Access Patterns: Database Interactions in Object-Oriented Applications

    User Experience


  57. Don't Make Me Think: A Common Sense Approach to Web Usability
  58. The Design of Everyday Things
  59. Programming Collective Intelligence: Building Smart Web 2.0 Applications
  60. User Interface Design for Programmers
  61. GUI Bloopers 2.0: Common User Interface Design Don'ts and Dos

    Mentality


  62. The Productive Programmer
  63. Extreme Programming Explained: Embrace Change
  64. Coders at Work: Reflections on the Craft of Programming
  65. Facts and Fallacies of Software Engineering

    History


  66. Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software
  67. The New Turing Omnibus: 66 Excursions in Computer Science
  68. Hacker's Delight
  69. The Alchemist
  70. Masterminds of Programming: Conversations with the Creators of Major Programming Languages
  71. The Information: A History, A Theory, A Flood

    Specialist Skills


    In spite of the fact that many of these won't apply to your specific job, I still recommend reading them for the insight they'll give you into programming language and technology design.

  72. Peter Norton's Assembly Language Book for the IBM PC
  73. Expert C Programming: Deep C Secrets
  74. Enough Rope to Shoot Yourself in the Foot: Rules for C and C++ Programming
  75. The C++ Programming Language
  76. Effective C++: 55 Specific Ways to Improve Your Programs and Designs
  77. More Effective C++: 35 New Ways to Improve Your Programs and Designs
  78. More Effective C#: 50 Specific Ways to Improve Your C#
  79. CLR via C#
  80. Mr. Bunny's Big Cup o' Java
  81. Thinking in Java
  82. JUnit in Action
  83. Functional Programming in Scala
  84. The Art of Prolog: Advanced Programming Techniques
  85. The Craft of Prolog
  86. Programming Perl: Unmatched Power for Text Processing and Scripting
  87. Dive into Python 3
  88. why's (poignant) guide to Ruby
u/the_omega99 · 18 points · r/learnprogramming

>I do have a textbook called "C: A modern approach" by King, but like I said before, I think it focuses more on the coding aspect.

Most books that focus on C are going to be about learning the language. If you want to learn low level stuff, you need to find books that focus on them (and they'll usually incidentally use C). The language itself is quite small and minimalistic in what it can do. Most heavy handed things like networking and GUIs require interaction with the OS.

Eg, if you wanted to do networking, you could use the Windows API or the POSIX socket API (POSIX being the standards that *nix systems follow -- and certain versions of Windows). Or you could use a higher level library like curl for cross platform support (and a wealth of nicer features).

>Can somebody please guide me on where to start?

Firstly, as much of a Linux fanboy as I am, I do want to make sure you know that you don't need to use Linux for any of the other things you wanted to learn (low-level programming, command lines, networking, etc). In fact, my OS class mostly used Linux, but we started out with a project using Windows threads (I guess the prof wanted us to see the difference from POSIX threading).

All that said, I do think Linux is something you'd want to learn and that a lot of low level things just seem more natural in Linux. But I'm biased. Linux fanboy, remember?

I'd start with downloading a Linux OS. Doesn't really matter which. I'd recommend going with Ubuntu. It's the most popular, easiest to find help with, and seems to be what most web servers are running, to boot. You can play around with the GUI for a bit if you want. It won't feel that different. Modern OSes sort of converged into the same high level ideas.

My favourite book for getting into the command line while ever so slightly touching the low level aspects of OSes is Mark Sobel's A Practical Guide to Linux Commands, Editors, and Shell Programming. It will include some basic knowledge of Linux, but mostly focuses on the command line. But this is very useful because not only is the command line very practical to learn, but you'll end up learning a lot about Linux in the process (eg, by learning how everything is a file, how pipes work, etc). And arguably the command line is a super big part of Linux, anyway. It makes sense as the first step.

Now, for the next step, you need to know C very well. So finish with your class, first. Read ahead if you have to. Yes, you already know if statements and functions and all, but do you understand pointers well? How about function pointers and void pointers? Do you understand how C's arrays work and the usage of pointer arithmetic? How about how arguments are passed to functions and when you'd want to pass a pointer to a function instead? As a rough skill testing question, you should implement a linked list for arbitrary data types with functions such as prepending, appending, concatenating lists, searching, removing, and iterating through the list. Make sure that your list can be allocated and freed correctly (no memory leaks).

Anyway, the next step is to learn OSes. Now, I said OSes and not Linux, because the Linux OS is a bit constrained if you want to learn low level programming (which would include a knowledge of what OSes in general do, and alternatives to OSes like Linux). But never fear, pretty much any OS book will heavily use Linux as an example of how things work and consequently explain a great deal of Linux internals. I can't recommend a class because mine was a regular university class, but Tanenbaum's Modern Operating Systems is a good book on the subject.

In particular, you can expect an OS class to not merely be focused on building an OS yourself (my class worked on aspects of OS101 to implement portions of our own OS), but also on utilizing low level aspects of existing OSes. Eg, as mentioned, my class involved working with Linux threading, as well as processes. We later implemented the syscalls for fork, join, etc ourselves, which was a fascinating exercise. Nothing gets you to understand how Linux creates processes like doing it yourself.

Do note, however, that I had taken a class on computer architecture (I found Computer Organization and Design a good book there, although note that I never did any of the exercises in the book, which seem to be heavily criticized in the reviews). It certainly helps in understanding OSes. It's basically as low as you can go with programming (and a bit lower, entering the domain of computer engineering). I cannot say for sure if it's absolutely necessary. I would recommend it first, but it's probably skippable if you're not interested (personally, I found it phenomenally interesting).

For learning networking, Beej's book is well written. You don't need to know OSes before this or anything.

u/NateRudolph · 4 points · r/arduino

Here's my advice, as a recent grad who was first exposed to arduino in school two years ago.

Get a starter kit that has a nice amount of sensors, jumpers, resistors. Nothing worse than seeing a project online and realizing you'd have to make a trip to radio shack just for some 30 cent resistor.

Amazon - $125

Sparkfun - $60

Jameco - $99

These are all a little pricey, but if you have a decent amount of confidence that you'll stick with things, I think this is a good way to get started. You could get one of the cheaper starter kits, but pushing a button to light an LED is only impressive for like a second. After that you're going to want to start moving and sensing things and it's nice to already have that at your fingertips.

Word of advice on tutorials. If you're anything like me, the internet can be your best friend and worst enemy. There are so many tutorials for stuff like arduino with varying levels of quality. It can be super distracting to look through a long tutorial and then see 100 other things you might want to do. At this point, that's bad because you're just chasing after a cool project, not actually learning. I'd encourage you to commit to buying a book, plugging away through every single tutorial in it, and then looking online. You'll start to see quicker which projects you actually want to dive into when you know a bit more about the process.

That first kit from Amazon comes with a book that I'm sure is great. Here's the one we went through at school: Programming Arduino - $12

That said, I'd very strongly encourage you to do it. Save up some money, get one of those kits, and start learning! It's incredibly rewarding, and after even a few months you'll have projects lying around that will impress pretty much anyone who doesn't know what arduino is. I really wish I had started at your age. Good luck!

u/rolfr · 57 pointsr/ReverseEngineering

I started from scratch on the formal CS side, with an emphasis on program analysis, and taught myself the following starting from 2007. If you're in the United States, I recommend BookFinder to save money buying these things used.

On the CS side:

  • Basic automata/formal languages/Turing machines; Sipser is recommended here.
  • Basic programming language theory; I used University of Washington CSE P505 online video lectures and materials and can recommend it.
  • Formal semantics; Semantics with Applications is good.
  • Compilers. You'll need several resources for this; my personal favorites for an introductory text are Appel's ML book or Programming Language Pragmatics, and Muchnick is mandatory for an advanced understanding. All of the graph theory that you need for this type of work should be covered in books such as these.
  • Algorithms. I used several books; for a beginner's treatment I recommend Dasgupta, Papadimitriou, and Vazirani; for an intermediate treatment I recommend MIT's 6.046J on Open CourseWare; for an advanced treatment, I liked Algorithmics for Hard Problems.

    On the math side, I was advantaged in that I did my undergraduate degree in the subject. Here's what I can recommend, given five years' worth of hindsight studying program analysis:

  • You run into abstract algebra a lot in program analysis as well as in cryptography, so it's best to begin with a solid foundation along those lines. There's a lot of debate as to what the best text is. If you've never touched the subject before, Gallian is very approachable, if not as deep and rigorous as something like Dummit and Foote.
  • Order theory is everywhere in program analysis. Introduction to Lattices and Order is the standard (read at least the first two chapters; the more you read, the better), but I recently picked up Lattices and Ordered Algebraic Structures and am enjoying it.
  • Complexity theory. Arora and Barak is recommended.
  • Formal logic is also everywhere. For this, I recommend the first few chapters in The Calculus of Computation (this is an excellent book; read the whole thing).
  • Computability, undecidability, etc. Not entirely separate from previous entries, but read something that treats e.g. Goedel's theorems, for instance The Undecidable.
  • Decision procedures. Read Decision Procedures.
  • Program analysis, the "accessible" variety. Read the BitBlaze publications starting from the beginning, followed by the BAP publications. Start with these two: TaintCheck and All You Ever Wanted to Know About Dynamic Taint Analysis and Forward Symbolic Execution. (BitBlaze and BAP are available in source code form, too -- in OCaml though, so you'll want to learn that as well.) David Brumley's Ph.D. thesis is an excellent read, as is David Molnar's and Sean Heelan's. This paper is a nice introduction to software model checking. After that, look through the archives of the RE reddit for papers on the "more applied" side of things.
  • Program analysis, the "serious" variety. Principles of Program Analysis is an excellent book, but you'll find it very difficult even if you understand all of the above. Similarly, Cousot's MIT lecture course is great but largely unapproachable to the beginner. I highly recommend Value-Range Analysis of C Programs, which is a rare and thorough glimpse into the development of an extremely sophisticated static analyzer. Although this book is heavily mathematical, it's substantially less insane than Principles of Program Analysis. I also found Gogul Balakrishnan's Ph.D. thesis, Johannes Kinder's Ph.D. thesis, Mila Dalla Preda's Ph.D. thesis, Antoine Mine's Ph.D. thesis, and Davidson Rodrigo Boccardo's Ph.D. thesis useful.
  • If you've gotten to this point, you'll probably begin to develop a very selective taste for program analysis literature: in particular, if it does not have a lot of mathematics (actual math, not just simple concepts formalized), you might decide that it is unlikely to contain a lasting and valuable contribution. At this point, read papers from CAV, SAS, and VMCAI. Some of my favorite researchers are the Z3 team, Mila Dalla Preda, Joerg Brauer, Andy King, Axel Simon, Roberto Giacobazzi, and Patrick Cousot. Although I've tried to lay out a reasonable course of study hereinbefore regarding the mathematics you need to understand this kind of material, around this point in the course you'll find that the creature we're dealing with here is an octopus whose tentacles spread in every direction. In particular, you can expect to encounter topology, category theory, tropical geometry, numerical mathematics, and many other disciplines. Program analysis is multi-disciplinary and has a hard time keeping itself shoehorned in one or two corners of mathematics.
  • After several years of wading through program analysis, you start to understand that there must be some connection between theorem-prover based methods and abstract interpretation, since after all, they both can be applied statically and can potentially produce similar information. But what is the connection? Recent publications by Vijay D'Silva et al (1, 2, 3, 4, 5) and a few others (1 2 3 4) have begun to plough this territory.
  • I'm not an expert at cryptography, so my advice is basically worthless on the subject. However, I've been enjoying the Stanford online cryptography class, and I liked Understanding Cryptography too. Handbook of Applied Cryptography is often recommended by people who are smarter than I am, and I recently picked up Introduction to Modern Cryptography but haven't yet read it.

    Final bit of advice: you'll notice that I heavily stuck to textbooks and Ph.D. theses in the above list. I find that jumping straight into the research literature without a foundational grounding is perhaps the most ill-advised mistake one can make intellectually. To whatever extent that what you're interested in is systematized -- that is, covered in a textbook or thesis already, you should read it before digging into the research literature. Otherwise, you'll be the proverbial blind man with the elephant, groping around in the dark, getting bits and pieces of the picture without understanding how it all forms a cohesive whole. I made that mistake and it cost me a lot of time; don't do the same.
u/coned88 · 1 pointr/linux

While being a self-taught sys admin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares. It's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sys admin, but it may quench your thirst. I'm both a programmer and a Unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.

OS Internals

While you obviously are successful at running and maintaining Unix-like systems, how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System

Linux Kernel Development
Advanced Programming in the UNIX Environment

Networking

Learning the actual function of networking at the code level is really interesting. There's a whole other world below the implementation. You likely know a lot of this.
Computer Networks

TCP/IP Illustrated, Vol. 1: The Protocols

Unix Network Programming, Volume 1: The Sockets Networking API

Compilers/Low Level computer Function

Knowing how a computer actually works, from electricity, to EE principles, through assembly, to compilers, may also interest you.
Code: The Hidden Language of Computer Hardware and Software

Computer Systems: A Programmer's Perspective

Compilers: Principles, Techniques, and Tools

u/balefrost · 1 pointr/AskProgramming

OK, a few things:

It looks like you're trying to build a shift/reduce parser, which is a form of an LR parser, for your language. LR parsers try to reduce symbols into more abstract terms as soon as possible. To do this, an LR parser "remembers" all the possible reductions that it's pursuing, and as soon as it sees the input symbols that correspond to a specific reduction, it will perform that reduction. This is called "handle finding".

> If I am correct, my Automaton is a DFA?

When the parser is pursuing a reduction, it's looking for sequences of symbols that match the right-hand sides of the relevant (to our current parse state) productions in our grammar. Since the right-hand sides of all the productions in a grammar are simple sequences, all the handle finding work can be done by a DFA. Yes, the handle recognizer of your parser is a DFA. But keep in mind that it needs to be combined with other parts to make a full parser, and your actual grammar can't be recognized with just a DFA.
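The "handle finding can be done by a DFA" point can be sketched with a toy transition table. This is a purely hypothetical example (the states, symbols, and handle are made up for illustration, not taken from the question's grammar):

```python
# Toy DFA that recognizes the handle "[ id ]" -- a made-up
# right-hand side, not the grammar from the question.
# Keys are (state, symbol) pairs; missing keys mean "no transition".
DFA = {
    (0, "["): 1,
    (1, "id"): 2,
    (2, "]"): 3,   # state 3 = handle found, time to reduce
}
ACCEPT = 3

def finds_handle(tokens):
    """Run the DFA over tokens; True if it ends in the accepting state."""
    state = 0
    for tok in tokens:
        state = DFA.get((state, tok))
        if state is None:
            return False
    return state == ACCEPT

print(finds_handle(["[", "id", "]"]))  # True
print(finds_handle(["[", "]"]))        # False
```

In a real LR parser this recognizer runs over the parse stack rather than the raw input, and is combined with the ACTION/GOTO machinery described below, but the table-driven core is the same.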

In particular, you've shown the ACTION table for a shift/reduce parser. It determines what to do when you encounter a symbol in the input stream. But a shift/reduce parser typically needs a second table as well - the GOTO table - that determines what to do after a reduction has taken place.

One other thing that's worth mentioning: you've expressed your ACTION table as a plain DFA transition table. That's not necessarily wrong, but it's not commonly done that way. Instead of reducing when you reach a certain state, it's common to attach an action - 'shift', 'reduce', or 'accept' - to each transition itself. So in a shift/reduce parser, your table might look more like this:

| [ | ] | < | > | id | / | attr
----+-----+-----+-----+-----+------+-----+--------
0 | S1 | | S4 | | | |
1 | | | | | S2 | | R3 : Reduce Tag -> [ id ]
2 | | R3 | | | | | R7 : Reduce Tag -> < id ??? / >
4 | | | | | S5 | S10 | R9 : Reduce Tag -> < id ??? >
5 | | | | R9 | | S6 | S8 R12 : Reduce Tag -> < / id >
6 | | | | R7 | | |
8 | | | | R9 | | S6 | S8
10 | | | | | S11 | |
11 | | | | R12 | | |

Note that R7 and R9 aren't well-formed, since multiple sequences of input tokens might cause you to reach these actions. While it would be possible to construct a shift/reduce parser this way, it's not commonly done. Typically, the DFA to recognize handles is an acyclic graph, but you have a self-transition in state 8.

> What would be the best way of implementing this automaton in C++? Do I really have to make a huge array?

In general, yes, you need a big array (or, as suggested before, two big arrays). But you can use any space-saving technique you want. For example, since most entries in the ACTION table are invalid, one could represent that data with a sparse array data structure. Also, both The Dragon Book and Cooper and Torczon briefly cover parser-specific ways to compress those tables. For example, notice that rows 5 and 8 in your example have the same entries. Most real grammars have multiple instances of identical rows, so factoring out this commonality can save enough space that the extra complexity is worth it.
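As a sketch of the sparse-table idea, here is a minimal shift/reduce loop driven by ACTION and GOTO tables stored as dictionaries, for the toy grammar S -> S + n | n (a hypothetical grammar chosen for brevity, not the tag grammar above):

```python
# Minimal table-driven shift/reduce parser for the toy grammar
#   1: S -> S + n
#   2: S -> n
# Sparse dicts stand in for the big 2-D ACTION and GOTO arrays;
# any (state, symbol) pair not present is a parse error.
ACTION = {
    (0, "n"): ("shift", 1),
    (1, "+"): ("reduce", 2), (1, "$"): ("reduce", 2),
    (2, "+"): ("shift", 3),  (2, "$"): ("accept",),
    (3, "n"): ("shift", 4),
    (4, "+"): ("reduce", 1), (4, "$"): ("reduce", 1),
}
GOTO = {(0, "S"): 2}
# production number -> (left-hand side, length of right-hand side)
PRODUCTIONS = {1: ("S", 3), 2: ("S", 1)}

def parse(tokens):
    """Return True if tokens (without the '$' end marker) form a valid S."""
    stack = [0]                      # stack of DFA states
    stream = tokens + ["$"]
    pos = 0
    while True:
        act = ACTION.get((stack[-1], stream[pos]))
        if act is None:
            return False             # no entry: parse error
        if act[0] == "shift":
            stack.append(act[1])
            pos += 1
        elif act[0] == "reduce":
            lhs, length = PRODUCTIONS[act[1]]
            del stack[-length:]      # pop one state per RHS symbol
            stack.append(GOTO[(stack[-1], lhs)])  # GOTO on the LHS
        else:                        # accept
            return True

print(parse(["n", "+", "n", "+", "n"]))  # True
print(parse(["n", "+"]))                 # False
```

Note how the GOTO lookup after each reduction is exactly the second table mentioned above: the ACTION table alone isn't enough once symbols start being reduced.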

---

I'm a little surprised that you're building a parser like this by hand, though. Typically people do one of two things:

  1. Build, by hand, a modified LL(1) recursive descent parser (or variant, like a packrat parser)
  2. Build, using a tool like YACC or Bison, a LR(1) shift/reduce parser

    You're sort of doing a mix of the two, which means you have the downsides of both approaches. You need to track all the states and transitions by hand, instead of relying on tools to automate that process, yet you don't get the flexibility of a hand-coded recursive descent parser.

    If you're doing this for education's sake, then by all means proceed. I'd highly encourage you to pick up a book on parsing; I think Cooper and Torczon is a great source. But if you just want a parser that works, I'd definitely recommend using a tool or using a more direct approach, like recursive-descent.
u/MrAureliusR · 2 pointsr/ElectricalEngineering

Okay, you're definitely at the beginning. I'll clarify a few things and then recommend some resources.

  1. Places to buy components: Depending on where you live in the world, the large component suppliers are almost always the way to go, with smaller suppliers like Adafruit/Sparkfun if you need development boards or specialised things. I buy almost exclusively from Digikey -- they have $8 flat shipping to Canada, which typically arrives the next day, with no customs fees. They have some sort of agreement in place where they cover these costs. This *always* saves money over going to my local stores where the prices are inflated. It's crazy how cheap some things are. If I need a few 2.2K 1206 resistors for a project, I just buy a reel of 1000 because they are so cheap.
  2. "Steer a joystick with an app" Do you mean connect motors to it and have them move the joystick for you? You're going to want some sort of microcontroller platform, along with a motor controller and way to communicate with a smartphone app. You mention you know C++ so it will be easy to switch to C. This is both true and false. Programming for microcontrollers is not the same as programming for computers. You are much closer to the hardware, typically manipulating many registers directly instead of abstracting it away. Each microcontroller vendor has their own tools and compilers, although *some* do support GCC or alternatives. You mentioned PIC, which is a line of microcontrollers by a large company called Microchip. There are 8-bit, 16-bit, and 32-bit PICs, all at different price points and with hugely differing capabilities. Selecting the microcontroller for a project can be half the battle sometimes. Or, like me, you can just go with whatever you have on hand (which is usually MSP430s or PIC32MX's)
  3. A lot of people will recommend the book The Art of Electronics. It's decent, but it's not for everyone. Some really like the conversational style, others don't. Many people who want to get into microcontroller programming and embedded development want to skip over the fundamentals and just get something working. For those, I point them to Arduino and let them on their merry way. However, if you actually want to learn something, I highly recommend buying an actual microcontroller development board, learning the fundamentals about electrical circuits, and programming in actual C with actual IDEs.
  4. As far as resources go, again it depends on your actual goal. Whenever I want to learn a new tool (like a PCB layout software, or a new IDE) I always start with a simple project. Having an end point to reach will keep you motivated when things seem complicated. Your controlling a joystick with motors is a great starting point. I would buy a development board; Microchip PICs are popular, as are STM32s and MSP430s. It doesn't really matter that much in the long run. Just don't tie yourself too hard to one brand. Then pick up some stepper motors, and a stepper motor control board (grab one from Sparkfun/Adafruit, etc). Get yourself a breadboard, and some breadboard jumpers, a cheap power supply (there are tons available now for cheap that are pretty decent), and then jump in head first!
  5. I highly recommend the book Making Embedded Systems by Elecia White, once you've covered the basics. It's a great way to learn more about how professionals actually design things. For the basics, you can watch *EARLY* EEVBlog videos (anything past around video 600/650 he gets progressively more annoying and set in his ways, another topic entirely, but the early stuff is decent). I'd also recommend picking up your choice of books about the fundamentals -- Electronics for Dummies, the aforementioned Art of Electronics, Making Embedded Systems, The Art of Designing Embedded Systems, and even stuff like Design Patterns for Embedded Systems in C. Again, it all depends on what your goal is. If you want to do embedded design, then you'll need to focus on that. If you're more into analog circuits, then maybe check out The Art and Science of Analog Circuit Design. Either way, grounding yourself in the fundamentals will help a LOT later on. It will make reading schematics way easier.

    I feel like I've gone off on a few tangents, but just ask for clarification if you want. I'd be happy to point you towards other resources.
u/Beagles_are_da_best · 9 pointsr/PrintedCircuitBoard

I did learn all of this stuff from experience. Honestly, I had a little bit of a tough time right out of college because I didn't have much practical circuit design experience. I now feel like I have a very good foundation for that, and it came through experience, learning from my peers, and lots of research. I have no affiliation with Henry Ott, but I treat his book like a bible. I refer to it just about every time I do a board design. Why? Because it's packed with this type of practical information. Here's his book. I bought mine used as cheap as I could. At my previous job, they just had one in the library. Either way, it was good to have around.

So why should you care about electromagnetic compatibility (EMC)? A couple reasons:

  1. EMC compliance is often regulated by industry and can become a product requirement. The types of tests your product has to pass typically depend on the industry, but in general there are tests where bad things are injected into your board and tests where they measure how noisy your board is. You have to pass both.
  2. EMC compliance, in my opinion, is very well correlated with the reliability and quality of a product. If a product is destroyed "randomly" or stops working when the microwave is on, you're not likely to have a good opinion of that product. Following guidelines like the ones I gave above is the path to avoiding problems like that.
  3. EMC design is usually not taught in schools, and yet it is the most important part of the design (besides making it perform the required product function in the first place). It is also very hard to understand, because many of the techniques for improving your design do not necessarily show up on your schematics. Often, it's about how well you lay out your board, how the mechanical design for the enclosure of your board is considered, etc.

    Anyways, it's definitely worth looking at and is a huge asset if you can follow those guidelines. Be prepared to enter the workforce and see rampant disregard for EMC best practices as well as rampant EMC problems in existing products. This is common because, as I said, it's not taught and engineers often don't know what tools to use to fix it. It often leads to expensive solutions where a few extra caps and a better layout would have sufficed.

    A couple more books I personally like and use:

    Howard Johnson, High Speed Digital Design (it's from 1993, but still works well)

    Horowitz and Hill, The Art of Electronics (good for understanding just about anything, good for finding tricks and ideas to help you for problems you haven't solved before but someone probably has)

    Last thing since I'm sitting here typing anyways:

    When I first got out of college, I really didn't trust myself even when I had done extensive research on a particular part of design. I was surrounded by engineers who also didn't have the experience or knowledge to say whether I was on the right path or not. It's important to use whatever resources you have to gain experience, even if those resources are books alone. It's unlikely that you will be lucky and get a job working with the world's best EE who will teach you everything you need to know. When I moved on from my first job after college, I found out that I was on the right path on many things thanks to my research and hard work. This was in opposition to my thinking before then as my colleagues at my first job were never confident in our own ability to "do EE the right way" - as in, the way that engineers at storied, big companies like Texas Instruments and Google had done. Hope that anecdotal story pushes you to keep going and learning more!
u/DeepMusing · 23 pointsr/engineering

Any engineering job is going to have a significant amount of domain knowledge that is specific to that company's products, services, or research. Getting an engineering degree is just the beginning. Once you get a job at a company, you will need to learn a shit load of new terms, IP, history, and procedures that are specific to that company. It's the next level of your education, and will take years to fully assimilate. School doesn't teach you anywhere near enough to walk into most engineering jobs and be independently productive. You are there to learn as much as do. The senior engineers are your teachers and gaining their knowledge and experience is the key to building a successful career. You need to look at them as a valuable resource that you should be taking every opportunity to learn from. If you don't understand what they are saying, then ask, take notes, and do independent research to fill in your knowledge gaps. Don't just dismiss what they say as techo-babble.

!!!!!! TAKE THIS TO HEART !!!!! - The single biggest challenge you will have in your engineering career is learning how to work well with your peers, seniors, and managers. Interpersonal skills are ABSOLUTELY critical. Engineering is easy: Math, science, physics, chemistry, software, electronics.... all of that is a logical, and learnable, and a piece of cake compared to dealing with the numerous and often quirky personalities of the other engineers and managers. Your success will be determined by your creativity, productivity, initiative, and intelligence. Your failure will be determined by everyone else around you. If they don't like you, no amount of cleverness or effort on your part will get you ahead. Piss off your peers or managers, and you will be stepped on, marginalized, criticized, and sabotaged. It's the hard truth about the work world that they don't teach you in school. You aren't going anywhere without the support of the people around you. You are much more likely to be successful as a moron that everyone loves, than a genius that everyone hates. It sucks, but that's the truth.

You are the new guy, you have lots to learn, and that is normal and expected. It's going to be hard and frustrating for a while, but you will get the hang of it and find your footing. Learn as much as you can, and be appreciative for any help or information that you can get.

As for digitizing a signal, it is correct that you should stick with powers of 2, for a number of technical reasons. At the heart of the FFT algorithm, the signal processing is done in binary. This is part of the "Fast" in Fast Fourier Transforms. By sticking with binary and powers of 2, you can simply shift bits or drop bits to multiply or divide by 2, which is lightning fast for hardware. If you use non-power-of-2 integers or fractional sampling rates, then the algorithm would need to do extensive floating-point math, which can be much slower for DSPs, embedded CPUs, and FPGAs with fixed-point ALUs. It's about the efficiency of the calculations on a given platform, not what is theoretically possible. Power-of-2 sample rates are much more efficient to calculate with integer math for almost all digital signal processing.
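To make the power-of-2 point concrete, here's a minimal sketch (in Python, for illustration only; not taken from the book): a textbook recursive radix-2 Cooley-Tukey FFT, where every stage halves the length with a single bit shift, plus the usual zero-padding trick for a signal whose length isn't a power of 2.

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of 2.
    The even/odd split is why power-of-2 lengths are so cheap:
    each level just halves the length, a single bit shift (n >> 1)."""
    n = len(x)
    if n == 1:
        return list(x)
    assert n & (n - 1) == 0, "length must be a power of 2"
    even = fft(x[0::2])
    odd = fft(x[1::2])
    half = n >> 1  # divide by 2 with a bit shift, as in the hardware
    t = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(half)]
    return [even[k] + t[k] for k in range(half)] + \
           [even[k] - t[k] for k in range(half)]

def next_pow2(n):
    """Smallest power of 2 >= n: how a non-power-of-2 signal is
    usually handled in practice (zero-pad up to this length)."""
    p = 1
    while p < n:
        p <<= 1
    return p

signal = [1.0, 2.0, 3.0]               # length 3: not a power of 2
padded = signal + [0.0] * (next_pow2(len(signal)) - len(signal))
spectrum = fft(padded)
print(len(padded))  # 4
```

A production DSP would use an iterative, in-place version of the same idea (often with bit-reversed indexing), but the divide-by-shifting structure is identical.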

I highly recommend reading the book "The Scientist and Engineer's Guide to Digital Signal Processing" by Steven W. Smith. It is by far the best hand-holding, clearly-explained, straight-to-the-point, introductory book for learning the basics of digital signal processing, including the FFT.

You can buy the book from Amazon [here](https://www.amazon.com/Scientist-Engineers-Digital-Signal-Processing/dp/0966017633/ref=sr_1_1?ie=UTF8&qid=1492940980&sr=8-1&keywords=The+Scientist+and+Engineer%27s+Guide+to+Digital+Signal+Processing). If you can afford it, the physical book is great for flipping through and learning tons about different signal processing techniques.

Or you can download the entire book in PDF form legally for free here. The author is actually giving the book away for free in electronic form (chapter by chapter).

Chapter 12 covers FFTs.



u/oridb · 2 pointsr/learnprogramming

I've been playing around with writing a programming language and compiler in my spare time for a while now (shameless plug: http://eigenstate.org/myrddin.html; source: http://git.eigenstate.org/git/ori/mc.git). Lots of fun, and it can be as shallow or as deep as you want it to be.

Where are you with the calculator? Have you got a handle on tokenizing and parsing? Are you intending to use tools like lex and yacc, or do you want to do a recursive descent parser by hand? (Neither option is too hard; hand written is far easier to comprehend, but it doesn't give you any correctness guarantees)

The tutorials I'd suggest depend on exactly where you are and what you're trying to do. As far as books, the three that I would go with are, in order:

For basic recursive descent parsing:

u/CodeTamarin · 2 pointsr/computerscience

The Stanford Algorithm book is complete overkill, in my opinion; do NOT read that book. That's insane. Read it when you've been programming for a while and have a grasp of how it even applies.

Here's my list, it's a "wanna be a decent junior" list:

  • Computer Science Distilled
  • Java/ C# / PHP/ JS (pick one)
  • Do some Programming Challenges
  • SQL
  • Maybe build a small web app. Don't worry about structure so much, just build something simple.
  • Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  • Head First Design Patterns
  • Clean Architecture
  • Refactoring: Improving the Design of Existing Code
  • If you're interested in Web
  • Soft Skills: Power of Habit , A Mind for Numbers , Productivity Project

    ​

    Reasoning: So, the first book is to give you a sense of all that's out there. It's short and sweet and primes you for what's ahead. It helps you understand most of the basic industry buzz words and whatnot. It answers a lot of unknown unknowns for a newbie.

    Next is just a list languages off the top of my head. But you can pick anything, seriously it's not a big deal. I did put Java first because that's the most popular and you'll like find a mountain of resources.

    Then after some focused practice, I suggest grabbing some SQL. You don't need to be an expert but you gotta know about DBs to some degree.

    Then I put an analysis book that's OOP focused. The nifty thing about that book is that it leads nicely into design patterns, introducing a few very simple ones along with GRASP.

    Then I put in a legit Design Patterns book that explains and explores design patterns and principles associated with many of them.

    Now that you know how code is structured, you're ready for a conversation about Architecture. Clean architecture is a simple primer on the topic. Nothing too crazy, just preps you for the idea of architecture and dealing with it.

    Finally, refactoring is great for working devs. Often your early work will be focused on working with legacy code. Then knowing how to deal with those problems can be helpful.

    FINAL NOTE: Read the soft skills books first.

    The reason for reading the soft skills books first is it helps develop a mental framework for learning all the stuff.

    Good luck! I get this isn't strictly computer science and it's likely focused more toward Software Development. But I hope it helps. If it doesn't. My apologies.
u/JimWibble · 1 pointr/Gifts

He sounds like a younger version of myself! Technical and adventurous in equal measure. My girlfriend and I tend to organise surprise activities or adventures we can do together as gifts, which I love - it doesn't have to be in any way extravagant, but having someone put time and thought into something like that is amazing.

You could get something to do with nature and organise a trip or local walk that would suit his nature photography hobby. I love to learn about new things and how stuff works, so if he's anything like me, something informative that fits his photography style, like a guide to local wildflowers or a bug guide, would go down well. I don't know much about parkour, but I do rock climb, and a beginners' bouldering or climbing session might also be fun and something you can do together.

For a more traditional gift Randall Munroe from the web comic XKCD has a couple of cool books that might be of interest - Thing Explainer and What If. Also the book CODE is a pretty good book for an inquisitive programmer and it isn't tied to any particular language, skillset or programming level.

u/loubs001 · 2 pointsr/hardware

Agree. It depends on what you want to know, and how much you're willing to commit to learning. It's a big world. Code is a nice book if you want a very very simple explanation of the basics of bits and bytes and logic gates. It might be a good place to start, though it's intended for a non-technical audience and you may find it a little TOO simple. A proper digital systems book will go in to much more detail about digital logic (AND gates, flip-flops etc.). You might be surprised just how easy to learn the fundamentals are. I learned from Tocci which I found to be excellent, but that was a long time ago and I'm sure there's many other good ones around.

That's pretty low-level digital circuitry, though. If you are really serious about learning computer architecture, I'd highly recommend Patterson and Hennessy. It covers the guts of how processors execute instructions, pipelining, caches, virtual memory and more.

If you're more interested in specific, modern technologies... then obviously Wikipedia, or good tech review sites. Especially reviews that focus on major new architectures. I remember reading lots of good in-depth stuff about Intel's Nehalem architecture back when it was new, or Nvidia's Fermi. There's a wealth of information out there about CUDA and GPU computing which may give you a sense of how GPUs are so different to CPUs. Also, when I first started learning many years ago, I loved my copy of Upgrading and Repairing PCs, great for a less technical, more hobbyist perspective.

Lastly, ask questions! For example, you ask about DDR vs GDDR. Deep inside the memory chips themselves, there's actually not a great deal of difference. But the interfaces between the memory and the processor are quite different; they're designed for very different purposes. I'm simplifying here, but CPUs have relatively low levels of parallelism, they tend to operate on small units of memory (say a single value) at a time, they have quite unpredictable access patterns so low latency is essential, and the cores often work tightly together so coherency has to be maintained. With GPUs, they have a very predictable access pattern, so you can load much larger chunks at a time; latency is less important since you can easily keep your processors busy while memory is streamed in; and the GPU's many tiny processors for the most part all work on separate words of memory, so coherence usually does not need to be maintained and they have much less need for caches.

The "L" (Level) naming for caches is quite simple. Memory that is closer to the core is faster to access. Generally each core has its own L1 and L2, with L2 being slightly slower but there's more of it, and all cores share an L3, slower still but way more of it. Memory on the CPU is made out of transistors and is super fast, but also takes up a lot of space. Look how big the L3 is [here](http://www.anandtech.com/show/8426/the-intel-haswell-e-cpu-review-core-i7-5960x-i7-5930k-i7-5820k-tested), and that's just 20MB. External RAM is obviously much slower, but it is made out of capacitors and has much higher densities.

u/CricketPinata · 1 pointr/milliondollarextreme

If you want to just know buzzwords to throw around, spend a bunch of time clicking around on Wikipedia, and watch stuff like Crash Course on YouTube. It's easy to absorb, and you'll learn stuff, even if it's biased, but at least you'll be learning.

If you want to become SMARTER, one of my biggest pieces of advice is to either carry a notebook with you, or find a good note taking app you like on your phone. When someone makes a statement you don't understand, write it down and parse it up.

So for instance, write down "Social Democracy", and write down "The New Deal", and go look them up on simple.wikipedia.com (puts all of it in the simplest language possible). It's a great starting point for learning about any topic, and provides you a jumping-off point to look more deeply into it.

If you are really curious about starting an education, and you absolutely aren't a reader, some good books to start on are probably:

"Thing Explainer: Complicated Stuff in Simple Words" by Randall Munroe

"A Short History of Nearly Everything" by Bill Bryson

"Philosophy 101" by Paul Kleinman, in fact the ____ 101 books are all pretty good "starter" books for people that want an overview of a topic they are unfamiliar with.

"The World's Religions" by Huston Smith

"An Incomplete Education" by Judy Jones and Will Wilson

Those are all good jumping off points, but great books that I think everyone should read... "A History of Western Philosophy" by Bertrand Russell, "Western Canon" by Harold Bloom, "Education For Freedom" by Robert Hutchins, The Norton Anthology of English Literature; The Major Authors, The Bible.

Read anything you find critically, don't just swallow what someone else says, read into it and find out what their sources were, otherwise you'll find yourself quoting from Howard Zinn verbatim and thinking you're clever and original when you're just an asshole.

u/IjonTichy85 · 2 pointsr/compsci

Hi,
do you want to become a computer scientist or a programmer? That's the question you have to ask yourself. Just recently someone asked about self-study courses in CS and I compiled a list of courses that focus on the theoretical basics (roughly the first year of a bachelor's class). Maybe it's helpful to you, so I'm gonna copy&paste it here for you:



I think before you start you should ask yourself what you want to learn. If you're into programming or want to become a sysadmin you can learn everything you need without taking classes.

If you're interested in the theory of cs, here are a few starting points:

Introduction to Automata Theory, Languages, and Computation

The book you should buy

MIT: Introduction to Algorithms

The book you should buy


Computer Architecture <- The intro alone makes it worth watching!

The book you should buy

Linear Algebra

The book you should buy <- Only scratches the surface but is a good starting point. Also, it's extremely informal for a math book. The MIT channel offers many more courses and is great for autodidactic studying.

Everything I've posted requires no or only minimal previous education.
You should think of this as a starting point. Maybe you'll find lessons or books you'll prefer. That's fine! Make your own choices. If you've understood everything in these lessons, you just need to take a programming class (or just learn it by doing), a class on formal logic and some more advanced math classes, and you will have developed a good understanding of the basics of CS. The materials I've posted roughly cover the first year of studying CS. I wish I could tell you where you can find some more math/logic books, but I'm German and always used German books for math because they usually follow a more formal approach (which isn't necessarily a good thing).
I really recommend learning these things BEFORE starting to learn the 'useful' parts of CS like SQL, XML, design patterns etc.
Another great book that will broaden your understanding is Bertrand Russell's Introduction to Mathematical Philosophy.
If you've understood the theory, the rest will seem 'logical' and you'll know why some things are the way they are. Your working environment will keep changing and 20 years from now we will be using different tools and different languages, but the theory won't change. If you've once made the effort to understand the basics, it will be a lot easier for you to switch to the next 'big thing' when you're required to do so.

One more thing: PLEASE, don't become one of those people who need to tell everyone how useless a university is and that they know everything they need just because they've been working with python for a year or two. Of course you won't need 95% of the basics unless you're planning on staying in academia and if you've worked instead of studying, you will have a head start, but if someone is proud of NOT having learned something, that always makes me want to leave this planet, you know...

EDIT: almost forgot about this: use Unix, use Unix, and I can't emphasize this enough: USE UNIX! Building your own Linux from scratch is something every computer scientist should have done at least once in their life. It's the only way to really learn how a modern operating system works. Also try to avoid Apple/Microsoft products, since they're usually closed source and don't give you the chance to learn how they work.

u/xnoise · 1 pointr/PHP

There are a ton of books, but I guess the main question is: what are you interested in? Concepts or examples? Many strong conceptual books use examples from Java, C++ and other languages; very few of them use PHP. If you have the ability to comprehend other languages, then:

http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&qid=1322476598&sr=8-1 - definitely a must-read. Beware not to memorize it; it is more like a dictionary. It should be pretty easy to read, a little harder to comprehend, and you need to work with the patterns presented in that book.
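One of the patterns catalogued in that GoF book, Strategy, can be sketched in a few lines. This is just an illustrative Python translation (the book's own examples are in C++ and Smalltalk, and the class names here are invented):

```python
# Strategy pattern sketch: the sorting policy is an object you can swap.
class Ascending:
    def sort(self, data):
        return sorted(data)

class Descending:
    def sort(self, data):
        return sorted(data, reverse=True)

class Report:
    """Context object: delegates the sorting decision to its strategy."""
    def __init__(self, strategy):
        self.strategy = strategy

    def render(self, rows):
        return self.strategy.sort(rows)

print(Report(Descending()).render([3, 1, 2]))  # [3, 2, 1]
```

The point is that `Report` never changes when you add a new ordering; you just hand it a different strategy object.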

http://www.amazon.com/PHP-5-Objects-Patterns-Practice/dp/1590593804 - has already been mentioned and is directly related to the one above, so it should be easier to grasp.

http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/ref=sr_1_1?ie=UTF8&qid=1322476712&sr=8-1 - one of the most amazing books I have read. Needs a lot of time and good prior knowledge.

http://www.amazon.com/Refactoring-Improving-Design-Existing-Code/dp/0201485672/ref=sr_1_4?ie=UTF8&qid=1322476712&sr=8-4 - another interesting read; unfortunately I cannot give details because I haven't had the time to read it all.

u/[deleted] · 9 pointsr/programming

You need to show that you know your stuff. Just because you're doing something more applied like Network Security in grad school doesn't mean there won't be a base level of knowledge you're expected to have. In that case, you need to learn some basic stuff a CS student at a good school would know. I'm not "dumbing down" anything on my list here, so if it seems hard, don't get discouraged. I'm just trying to cut the bullshit and help you. (:

  • Redo your introduction to Computer Science. If you finish this, picking up a new language is cake.

  • Discrete Mathematics, A.K.A. "Math for Computer Scientists" This is the standard text for this, but this is pretty good for a cheap book.

  • Algorithms

  • Compilers

  • Operating Systems

  • Networking

  • For basic CS theory, "Introduction to Theory of Computation by Michael Sipser" is what I used to recommend, but Amazon doesn't seem to have a sanely priced copy. Either buy that used, or get the classic "Cinderella Book". Get an older edition if you can!

    Again, don't be discouraged, but you'll need to work hard to catch up. If you were trying for something like mathematics or physics while doing this, I'd call you batshit insane. You may be able to pull it off with CS though (at least for what you want to study). Make no mistake: getting through all these books I posted on your own is hard. Even if you do, it might still be the case that no one will admit you! But if you do it, and you can retain and flaunt your knowledge to a sympathetic professor, you might be surprised.

    Best of luck, and post if you need more clarification. As a side note, follow along here as well.

    Netsec people feel free to give suggestions as well.
u/spoonraker · 4 pointsr/personalfinance

Self-taught software engineer checking in to add on to this.

Everything u/TOM_BRADYS_PET_GOAT said is true.

I'll add a few specific resources:

Computer science fundamentals are really scary and overwhelming if you're self-taught. I'd highly recommend reading The Imposter's Handbook to get started with this topic. You'll want more in-depth material afterwards on each of the various subtopics, but this book is absolutely fantastic as a (surprisingly deep) introduction to all the concepts that's framed specifically to get self-taught programmers up to speed.

After you're familiar with the material at a conceptual level, and it's time to just get down to dedicated practice, Cracking the Coding Interview will be an invaluable resource. This book exists for the sole purpose of helping people get better at the types of questions most commonly asked during coding interviews. It's not just a list of questions with solutions; it actually explains the theory in depth, provides drills and smaller practice questions, as well as questions designed to emulate specific interview scenarios at real tech companies like Google, Microsoft, and Amazon. It'll even talk about the interview process at those companies beyond just the questions and the theory behind them.

As a more general resource that you'll reach for repeatedly throughout your career, I'd recommend The Complete Software Developer's Career Guide. This book covers everything: how to learn, how to interview, how to negotiate salary, how to ask for raises, how to network, how to speak at conferences and prepare talks, how to build your personal brand, how to go into business for yourself if you want, etc., and that's just scratching the surface of what's covered. I didn't even buy this book until I was 10 years into my career, and it's still very insightful.

And let's not forget, being a good developer isn't just a matter of making things that work; it's a matter of writing code that is readable, extensible, and a pleasure for other developers to work on. So to this end, I'd recommend any developer read both Clean Code and Clean Architecture: A Craftsman's Guide to Software Structure and Design

u/adi123456789 · 2 pointsr/cpp

I'm an embedded software developer who used to use C and now primarily works with C++.

Learning C is relatively easier when you start off, and in my opinion it gives you a better appreciation of memory handling and its complexities than C++ does. The C knowledge will also transfer well to C++.

C++ is definitely a much more powerful language and you can get your tasks done quicker with it. There are a lot of things to learn in C++, but you can pick them up with time. A lot of embedded processors, particularly ARM-based ones, support C++ as well, so that is not a problem.

Like someone else mentioned though, embedded development relies on a good knowledge of programming as well as a good understanding of computer architecture.

Here's a nice book I've read which is useful for new embedded developers - Making Embedded Systems: Design Patterns for Great Software https://www.amazon.com/dp/1449302149/ref=cm_sw_r_cp_apa_i_MuFhDb1WWXK3W

u/welfare_pvm · 1 pointr/SoftwareEngineering

What field do you want to specialize in? Embedded? Web? Mobile?

The best way to learn is by practicing, but if you want more of an abstract, design level read, there are lots of options.

I have a web background, so here are three that I've read recently as examples.

I enjoyed this book on microservice design and I think everyone who uses OOP should at least familiarize themselves with the common OOP design patterns.

If you are into JavaScript, Eloquent JavaScript is my go-to for a good mix of summary/detail of the language. It's well written, and comes with fun exercises at the end of each chapter to help solidify your understanding of each concept.

I'm sure there are other great books, but these are some of my favorites so far.

u/dohpaz42 · 3 pointsr/PHP

Agreed. There are plenty of resources out there that will help you understand design patterns. If you're new to the concept, I would recommend Head First Design Patterns; it might be based on Java, but the examples are simple to understand and mostly apply to PHP as well. When you feel like you've grasped the basic concepts of design patterns, you can move on to more advanced texts, like Martin Fowler's Patterns of Enterprise Application Architecture - this is a great reference for a lot of the more common patterns. There is also Refactoring: Improving the Design of Existing Code. These are great investments that will help you with any project you work on, and will help you if you decide to use a framework like Zend, which uses design patterns very heavily.

u/PinPinIre · 1 pointr/learnprogramming

It largely depends on which Computer Science degree you are going to do. Some focus heavily on software and very little on hardware, and some strike a nice balance between the two. If the degree is going to focus on hardware, I would recommend reading up on the underlying logic of a computer and then reading this book (Inside the Machine). ITM isn't a very technical book (I would label it the computer science equivalent of popular science) but it gives a nice clear overview of what happens in a processor.

When it comes to programming, I would recommend starting with Java and Eclipse. Java gets quite a bit of hate but for a newcomer, I think Java would be easier to grasp than the likes of C/C++. C/C++ are nice languages but a newcomer may find their error messages a little bit obscure and may get confused with the nitty-gritty nuances of the languages.

Though the one thing you should realise is that programming is a skill that isn't confined to one language. If you understand the basic concepts of recursion, arrays, classes, generics/templates, inheritance, etc., you can apply this knowledge to almost any language. Ideally I would recommend two books on programming (Algorithmics) and (Introduction to Algorithms). Algorithmics is another book I would label as the CS equivalent of popular science, but the early chapters give a nice overview of exactly what algorithms actually are. Introduction to Algorithms is a more technical book that I would recommend to someone once they know how to program and want a deeper understanding of algorithms.

The rest is personal preference, personally I prefer to use a Unix machine with Sublime Text 2 and the command line. Some will try to convince you to use Vim or Emacs but you should just find whichever you are most comfortable with.

u/ntr0p3 · 3 pointsr/AskReddit

By biology I don't mean what they teach you in college or med school; I mean understanding the basic processes (physiology-esque) that underlie living things, and understanding how those systems interact and build into more complex systems. Knowing the names of organs or the parts of a cat is completely worthless; understanding the process of gene activation, and how it enables living organisms to better adapt to their environments (for instance, stress factors triggering responses to new stimuli), can be very valuable, especially as a function of applied neurology.

Also, what we call biology and medicine today will be so pathetically obsolete in 10 years as to be comical, similar to how most mechanics can rebuild a carburetor, but not design and build a hybrid drivetrain, complete with controller software.

Economics and politics are controversial, but what matters is learning to see the underlying forces, similar to not understanding how gravity works but still knowing that a dropped lead ball will accelerate downwards at 9.78m/s^2. This is a field that can wait till later though, and probably should.

For systems analysis, I'm sorry but I can't recommend anything. I tended to learn it by experience more than anything.

I think I understand what you are looking for better now though, and think you might be headed in the right direction as it is.

For CS I highly recommend the Dragon Book and Design Patterns, and if you need ASM, The worst designed website ever.

For the other fields I tend to wiki subjects then google for papers, so I can't help you there. :(

Best of luck in your travels however! :)

edit: For physics, if your math is bad, get both of his books. They break it down well. If your math is better, try one of Witten's books, but they are kinda tough; guy is a fucking genius.

also, Feynman's QED is great, but his other book is awesome just as a happy intellectual read

also try to avoid both Kaku and Hawking for anything more complicated than primers.

edit no. 9: mit's ocw is win itself.

edit no. 10: Differential equations (prolly take a class depending on your math, they are core to almost all these fields)

u/humanmanguy · 9 pointsr/AmazonTopRated
  • Fire TV Stick, which is a lower-cost alternative to the awesome Fire TV. (think Apple TV, but actually good)

  • Raspberry Pi which is a tiny fully-functional/fully-featured ARM computer.

  • Arduino, which is an easy-to-use electronics prototyping platform, great if you're interested in learning how to make your own electronics and whatnot. (you might also want this, this, this, this, and this. Should be less than $40 altogether, though you could also probably find like a starter kit that comes with an arduino, book, and components.)

  • Huion drawing tablet, great for if you want to do digital art. I haven't used this model specifically, but I do have the (bigger/more expensive) Huion 610 Pro, which I love.

  • Amazon Prime student was like $40 IIRC, not sure if that has changed though.
u/albatrossy · 2 pointsr/DSP

It kind of sounds like you'd be good just getting a textbook. I think any book will be fine since you mainly just want questions (and presumably answers), but try to find one that implements code in a language that you're comfortable with, or that you want to learn.

There are a lot of different "final year" DSP courses, but it sounds like you want something covering the fundamentals rather than anything too advanced. I started off with The Scientist & Engineer's Guide to Digital Signal Processing and then used Signals and Systems for my first undergraduate course, but we used it largely because he co-authored it. I would recommend scouring the web for some free books though. There are books like ThinkDSP popping up that seem pretty neat.

Edit: Oppenheim is always mentioned also.

u/nvincent · 1 pointr/GiftIdeas

So, I think I am the kind of person you are describing. I have a pretty great job, so I usually just buy my own technology stuff. Not only that, but I am rather picky about technology, so even if someone did get me something like that, I would act excited and happy, but in the back of my mind I would secretly wish they had done more research before buying the thing that they did.

That said! If I were buying for me, I would go with something like the hyperbole and a half book (http://www.amazon.com/Hyperbole-Half-Unfortunate-Situations-Mechanisms/dp/1451666179), or something by the creator of the XKCD comics (http://www.amazon.com/Thing-Explainer-Complicated-Stuff-Simple/dp/0544668251/ref=sr_1_1?s=books&ie=UTF8&qid=1449202837&sr=1-1&keywords=xkcd).

If it has to be tech related, there is always http://www.thinkgeek.com - they have tons of fun, nerdy gifts that I would like. All of these things combined are probably way less than $1,000. That is just a lot of money.

Another random suggestion - if they were ever into pokemon, this is a dream come true: Gym Badges! https://www.etsy.com/listing/128753018/pokemon-kanto-gym-badges-gen-1?utm_source=google&utm_medium=cpc&utm_campaign=shopping_us_b-accessories-patches_and_pins-pins_and_pinback_buttons&utm_custom1=a91c90fb-48c1-4024-87f9-fb14aadac033&gclid=CjwKEAiA7f-yBRDAgdv4jZ-78TwSJAA_WdMaz_NXsXrFH_0f-Mb6ovmqqcCHto-b7S6zm1DplssHQhoCNuvw_wcB

u/poorbowelcontrol · 4 pointsr/cscareerquestions

-How long after completing the camp did it take for you to get hired?
Within 10 days.
-Who do you work for?
~16 person consulting company in the bay.
-Did you have any prior coding experience before enrolling at the camp?
Yes full year of self study and some classes in high school and college.
-Are you happy with your current earnings?
I was until I realized the cost of living where I am and how much Uncle Sam takes.
-Do employers consider the camps as sufficient to warrant upward mobility potential?
There is another person in my company who also went to my code camp. Our camp (App Academy) discouraged revealing our participation in the camp till late in the hiring process.
-Best strategy to get accepted?
Apply.
What kind of students are they looking for? Can I, with my limited background become successful?
In my experience you can have the ability to think in that way or not.

What sort of students are most successful both during the camp and then in the job search following the camp?
The ones you would expect.
-Recommendations for pre-study?
Keep trying different tools until you really find something that works.

A great book is http://www.amazon.com/But-How-Know-Principles-Computers/dp/0615303765.
If i was gonna put forward one online resource it would be http://www.tutorialspoint.com/.

If you have a little time try some of the assembler stuff.

One final tip. There will be a time (or thousands) when you will be staring at some concept and drawing a blank. It may feel like nothing is happening. It may well be that lots of things are, and you just gotta process the concepts.

Good luck.

u/name_censored_ · 1 pointr/learnprogramming

>Do you know of a book or a website that teach useful optimization techniques?

I'm only an enthusiast; I've never needed really optimised code (truth be told, most of what I do day to day is quick-and-dirty, appallingly inefficient scripts, because it "needs to be done yesterday"), so I can't give you a canonical list, but here's what I do know:

For books, there's this /r/compsci reddit thread from a while ago. Something on compilers like The Dragon Book might be your best bet, especially the optimisation chapter. And obviously jotux's "How Computers Do Maths" - though, never having even flicked through it, I can't say if it's any good.

You could try your luck in /r/ReverseEngineering (or the quieter /r/asm and /r/compilers), there are a lot of low-level guys there who'd know a lot more than me. You could also try /r/compsci or /r/algorithms, although they'd be more useful for algorithms than for optimisation. And of course, /r/quantfinance.

u/njoubert · 1 pointr/compsci

I would suggest that the carlh programming guides are not a bad idea then!

I would heavily suggest learning C well - this is a language that was designed to stay close to the hardware while being portable, and is a very small language. So, buy a copy of the K&R Book; every C programmer has one.

Then, Patterson's book is a tome for computer engineering. It'll show you assembly, all the way down to NAND gates.

I would suggest you start by watching and working through Berkeley's CS61C course. It's logically the second course in CS, and after a quick overview of C it dives into the machine itself. Website here, videos here. Also, Dan Garcia is an excellent lecturer.

Once you have all the machine details down, you'll probably feel hampered by your actual program wizardry. This is where you start looking into algorithms and data structures. Your go-to guide here is probably Cormen's Introduction to Algorithms since it handles both data structures and algorithms. It's definitely more of a theoretical/CS-ey book, so if this is not what you want, then Head First Java will teach you a new language (and learning more languages is one of the best ways to grow as a programmer!) and also do many data structures. In fact, you can get both those books and have the light side and the serious side of programming books.

At this point you should be well equipped to go off in whatever direction you want with programming. Start contributing to open source projects! Find things that interest you and try to solve problems! Being a part of the programming community will be your biggest aid in both learning programming and starting to make money through it. People pay for programmers that they know can deliver, and success in the open source world means a lot, and you don't need to go to school for it to get to this point!

Lastly, many CS/programming folks hang out on IRC. If you have questions, find the appropriate IRC channels and go talk to people. Good luck and welcome to programming!

u/necr0tik · 1 pointr/amateurradio

Thanks for the great reply!

The Lessons In Electric Circuits was already on my radar, and I believe it will be the first electronics resource I go through after hearing it beaten into my head yet again!

That DSP book I have not seen. I just grabbed a copy, and it looks like a great text. I mentioned this post to a fellow electronics enthusiast and he loaned me a copy of a book he said was exceptional for entry into the world of DSP: http://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419/ DSP is pretty complex; more than likely I will go through both to fully absorb this topic.

EMRFD sounds like a cookbook. Given that it's by the ARRL, I expect its quality to be superb. I am not against this type of text, I have a few already; however, I'd rather have more of the theory at this point. I imagine this will be great once I am satisfied with the basics and want to build an actual radio with its operation noted.

u/arthurno1 · 1 pointr/linuxquestions

When you say you want to make a simplistic OS, do you mean you want to put together a simplistic Linux distro, or you want to code an OS from scratch?

In the former case, DSL (Damn Small Linux) might be your friend:
http://www.damnsmalllinux.org/. There are other similar distros that might fit under 25 megabytes; Google is your friend. As somebody else already mentioned, linuxfromscratch.org is another option. If you go with LFS, you'll want to look at minimal libraries instead of the standard GNU libs for the C library and the standard system applications. For example, you would want https://www.uclibc.org/ for the C library (or some other similar one, there are a few) and, say, BusyBox https://www.busybox.net/ for your system apps. There are other "micro" versions of some popular software (X server etc.) which you might wish to consider if you are going the completely custom route.

If I were you, I wouldn't do it, since many others have had the same thoughts and have already put effort and hours into making it, so why repeat all that work when you can just get a distro like DSL, install it, and simply customize/change what you dislike? If you want it as an educational experience, then certainly go for it; LFS might be very rewarding in that case.

If you want to code your own kernel and OS, then you might wish to take a CS class on operating systems; Tanenbaum is your eternal friend:
https://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/013359162X/ref=sr_1_1?ie=UTF8&qid=1498831929&sr=8-1&keywords=andrew+tanenbaum

https://www.amazon.com/Structured-Computer-Organization-Andrew-Tanenbaum/dp/0132916525/ref=sr_1_4?ie=UTF8&qid=1498831929&sr=8-4&keywords=andrew+tanenbaum

And don't forget Google ...

u/uptocode · 1 pointr/arduino

And heck! If you don't have an Arduino just yet, you can try one out virtually first! I like 123D Circuits by AutoDesk; however, there are many other simulators with Arduinos built in. Google them! :-)

Like it but don't like the $$$? You can make your own! There are many tutorials online for making a bare-bones Arduino with cheap electronics components.

I really like @schorhr's book suggestions. To add on, the following books are great for Arduino beginners: Programming Arduino: Getting Started with Sketches & Make: Getting Started with Arduino. Also, great tutorials can be found here: tronixstuff Arduino Tutorials & Ladyada's Arduino Tutorials.

Good luck!

u/NAMOS · 10 pointsr/onions

Basically any SRE advice for a normal service, but replace/complement HAProxy / nginx / ingress controller / ELB with the Tor daemon / OnionBalance.

I run Ablative Hosting and we have a few people who value uptime over anonymity etc and so we follow the usual processes for keeping stuff online.

Have multiples of everything (especially stuff that doesn't keep state), and ensure you have monitoring of everything: connections, memory pressure, open files, free RAM, etc.

Just think of the Tor daemon onion service as a TCP reverse proxy with load-balancing capability, and then follow any other advice when it comes to building reliable infrastructure.
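As a sketch, the Tor side of one backend in such a setup is just a couple of torrc lines; the directory and ports here are placeholders, not a recommendation:

```
# Hypothetical torrc fragment for one backend instance behind an
# OnionBalance frontend; adjust the directory and ports to your setup.
HiddenServiceDir /var/lib/tor/my_service/
HiddenServicePort 80 127.0.0.1:8080
```

Everything in front of 127.0.0.1:8080 (nginx, your app servers, health checks) is then ordinary reverse-proxy territory.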

u/guifroes · 2 pointsr/learnprogramming

Interesting!

Looks to me like you can "feel" what good code looks like, but you're not able to rationalise it enough to write it on your own.

Couple of suggestions:

When you see elegant code, ask yourself: why is it elegant? Is it because it is simple? Easy to understand? Try to recognise the desired attributes so you can reproduce them in your own code.

Try to write really short classes/methods that have only one responsibility. For more about this, search for Single Responsibility Principle.

How familiar are you with unit testing and TDD? It should help you a lot to write better designed code.
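The Single Responsibility Principle and unit-testing ideas above can be sketched together in a few lines of Python; the class and method names are invented for illustration:

```python
# SRP sketch: each class has exactly one reason to change.
class PriceCalculator:
    """Computes totals -- no I/O, no formatting."""
    def total(self, items):
        return sum(price * qty for price, qty in items)

class ReceiptFormatter:
    """Turns a number into display text -- no arithmetic."""
    def format(self, amount):
        return f"Total: ${amount:.2f}"

# TDD-style checks: because each class does one thing,
# each can be tested in complete isolation.
calc = PriceCalculator()
assert calc.total([(2.50, 2), (1.00, 3)]) == 8.00
assert ReceiptFormatter().format(8.00) == "Total: $8.00"
print("ok")
```

If pricing rules change, only `PriceCalculator` changes; if the display changes, only `ReceiptFormatter` does, and each test keeps pinning down one behavior.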

Some other resources:

u/llFLAWLESSll · 3 pointsr/learnprogramming

Since you know Java, I would suggest you read one of the best programming books ever written: [K&R The C Programming Language](http://www.amazon.com/The-Programming-Language-Brian-Kernighan/dp/0131103628/). This book was written by the people who made the C language, and it is a must-read for every C programmer. [Computer Systems: A Programmer's Perspective (3rd Edition)](http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/013409266X/) is a great book to learn about computer systems. But I would recommend [Operating Systems Design and Implementation (3rd Edition)](http://www.amazon.com/Operating-Systems-Design-Implementation-Edition/dp/0131429388) because it has some Minix source code, which will go really well with learning C.

Best of luck buddy :)

u/samort7 · 257 pointsr/learnprogramming

Here's my list of the classics:

General Computing

u/OmegaNaughtEquals1 · 3 pointsr/cpp_questions

This is a great question! It's also one that every serious CS person will ask at some point. As others here have noted, to really understand this question you must understand how compilers work. However, it isn't necessary to understand the gory details of compiler internals to see what a compiler does for you. Let's say you have a file called hello.cpp that contains the quintessential C++ program

#include <iostream>

int main() {
    std::cout << "Hello, world!\n";
}


The first thing the compiler does is called preprocessing. Part of this process includes expanding the #include statements into their proper text. Assuming you are using gcc, you can have it show you the output of this step

gcc -E -o hello.pp hello.cpp

For me, the hello.cpp file explodes from 4 lines to nearly 18000! The important thing to note here is that the contents of the iostream library header occur before the int main lines in the output.

The next several step for the compiler are what you will learn about in compiler design courses. You can take a peek at gcc-specific representations using some flags as discussed on SO. However, I pray you give heed. For there be dragons!

Now let's take a look at the compiler's output. To do this, I am going to not #include anything so the output is very simple. Let's use a file called test.cpp for the rest of the tests.

int main() {
    int i = 3, j = 5;
    float f = 13.6 / i;
    long k = i << j;
}

To see the compiler's output, you can use

g++ -S -masm=intel test.cpp

The -S flag asks gcc to just output the generated assembly code and -masm=intel requests the intel dialect (by default, gcc uses the AT&T dialect, but everyone knows the intel one is superior. :) ) The output on my machine (ignoring setup and teardown code) is outlined below.

push rbp
mov rbp, rsp

/* int i = 3, j = 5; */
mov DWORD PTR [rbp-20], 3
mov DWORD PTR [rbp-16], 5

/* float f = 13.6 / i; */
pxor xmm0, xmm0
cvtsi2sd xmm0, DWORD PTR [rbp-20]
movsd xmm1, QWORD PTR .LC0[rip]
divsd xmm1, xmm0
movapd xmm0, xmm1
cvtsd2ss xmm2, xmm0
movss DWORD PTR [rbp-12], xmm2

/* long k = i << j; */
mov eax, DWORD PTR [rbp-16]
mov edx, DWORD PTR [rbp-20]
mov ecx, eax
sal edx, cl
mov eax, edx
cdqe
mov QWORD PTR [rbp-8], rax

/* implicit return 0; */
mov eax, 0
pop rbp
ret

There are lots of details to learn in here, but you can generally see how each simple C++ statement translates into many assembly instructions. For fun, try compiling that program with the optimizer turned on (with g++, you can use -O3). What is the output?

There is still much to see from the binary that is assembled. You can use nm and objdump to see symbols or ldd to see what other libraries were (dynamically) linked into the executable. I will leave that as an exercise for the reader. :)

u/Caret · 2 pointsr/hardware

As someone else mentioned, the Hennessy and Patterson Computer Architecture: A Quantitative Approach, and the Patterson and Hennessy Computer Organization and Design are the de facto standards (I used both in my Comp. Eng. undergrad) and are really fantastic books (the latter being more "software" oriented so to speak).

They are not EE textbooks (as far as I know) but they are text books nonetheless. A great book I found that is slightly dated but gives a simplified review of many processors is Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture which is less technical but I enjoyed it very much all the same. It is NOT a textbook, and I highly, highly recommend it.

Hope that helps!

u/blexim · 5 pointsr/REMath

The object you're interested in is the call graph of the program. As you've observed, this is a DAG iff there is no recursion in the program. If function A calls B and B calls A, this is called mutual recursion and still counts as recursion :)

A related graph is the control flow graph (CFG) of a function. Again, the CFG is a DAG iff the function doesn't contain loops.

An execution trace of a program can certainly be represented as a DAG. In fact, since an execution trace does not have any branching, it is just a straight line! However you are very rarely interested in a single trace through a program -- you usually want to reason about all the traces. This is more difficult because if you have any looping structure in the global CFG, there is no (obvious) upper bound on the size of a trace, and so you can't capture them all with a finite structure that you can map into SMT.

Every program can be put into SSA form. The trick is that when you have joins in the control flow graph (such as at the head of a loop), you need a phi node to fix up the SSA indices. If you don't have it already, the dragon book is pretty much required reading if you're interested in any kind of program analysis.
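To make the phi-node idea concrete, here's a hand-worked sketch (my own toy example in C-like pseudocode, not from the comment): the join at the loop header is the only place a phi is needed, because two definitions of `i` reach it.

```
/* original */
i = 0;
while (i < n)
    i = i + 1;

/* SSA form: every assignment gets a fresh name, and the loop-header
   join merges the entry value with the back-edge value via a phi */
i0 = 0;
loop:
    i1 = phi(i0, i2);        /* i0 from entry, i2 from the back edge */
    if (!(i1 < n)) goto done;
    i2 = i1 + 1;
    goto loop;
done:
```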

In general, if you have a loop free control flow graph of any kind (a regular CFG or a call graph), then you can translate that graph directly into SAT or SMT in a fairly obvious way. If you have loops in the graph then you can't do this (because of the halting problem). To reason about programs containing loops, you're going to need some more advanced techniques than just symbolic execution. The big names in verification algorithms are:

  • Bounded model checking
  • Abstract interpretation
  • Predicate abstraction
  • Interpolation based methods

    A good overview of the field is this survey paper. To give an even briefer idea of the flavour of each of these techniques:

    Bounded model checking involves unwinding all the loops in the program a fixed number of times [; k ;]. This gives you a DAG representing all of the traces of length up to [; k ;]. You bitblast this DAG (i.e. convert it to SAT/SMT) and hand off the resulting problem to a SMT solver. If the problem is SAT, you've found a concrete bug in the program. If it's UNSAT, all you know is that there is no bug within the first [; k ;] steps of the program.

    Abstract interpretation is about picking an abstract domain to execute your program on, then running the program until you reach a fixed point. This fixed point tells you some invariants of your program (i.e. things which are always true in all runs of the program). The hope is that one of these invariants will be strong enough to prove the property you're interested in.

    Predicate abstraction is just a particular type of abstract interpretation where your abstract domain is a bunch of predicates over the variables of the program. The idea is that you get to keep refining your abstraction until it's good enough to prove your property using counterexample guided abstraction refinement.

    Interpolation can be viewed as a fancy way of doing predicate refinement. It uses some cool logic tricks to do your refinement lazily. The downside is that we don't have good methods for interpolating bitvector arithmetic, which is pretty crucial for analyzing real programs (otherwise you don't take into account integer overflow, which is a problem).

    A final wildcard technique that I'm just going to throw out there is loop acceleration. The idea here is that you can sometimes figure out a closed form for a loop and replace the loop with that. This means that you can sometimes remove a loop altogether from the CFG without losing any information or any program traces. You can't always compute these closed forms, but when you can you're in real good shape.

    Drop me a message if you want to know anything else. I'm doing a PhD in this exact area & would be happy to answer any questions you have.
u/tramast · 4 pointsr/ECE

Sounds like what you're interested in is computer architecture. This is the study of how a computer system (whether it's chip level or system level) is organized and designed from a higher-level abstraction (usually at the register-transfer level or above). There are plenty of good resources on this, including many books (this one comes to mind). Not knowing your background, I can't say if this would be much of a stretch for you. I would say prior to jumping to this level you should have an idea of basic MOS logic design, sequential and combinational logic as well as some background in delays and timing.

Your best bet is probably to find a good old book on amazon or ebay and read to your heart's content. Feel free to PM me if you have any questions (I design microprocessors for a living).

u/jmct · 9 pointsr/Physics

Your best bet is to read an introductory text first and wrap your head around what quantum computing is.

I suggest this one: Intro Text

I like it because it isn't very long and still gives a good overview.

My former supervisor has a web tutorial: here

Lastly, Michael Nielson has a set of video lectures: here

The issue is, there is a decent sized gap between what these introductions and tutorials will give you and the current state of the art (like the articles you read on arXiv). A good way to bridge this gap is to find papers published in something like Physical Review Letters (here is their virtual journal on quantum information) and see what they cite. When you don't understand something, either refer to a text or start following the citations.

Basically, if you can start practicing this kind of activity (the following of references) now, you'll already have a good grasp on a large part of what grad school is about.

Best of luck!

u/yberreby · 4 pointsr/programming

When I started getting interested in compilers, the first thing I did was skim issues and PRs in the GitHub repositories of compilers, and read every thread about compiler construction that I came across on reddit and Hacker News. In my opinion, reading the discussions of experienced people is a nice way to get a feel of the subject.

As for 'normal' resources, I've personally found these helpful:

  • This list of talks about compilers in general.
  • The LLVM Kaleidoscope tutorial, which walks you through the creation of a compiler for a simple language, written in C++.
  • The Super Tiny Compiler. A really, really simple compiler, written in Go. It helps with understanding how a compilation pipeline can be structured and what it roughly looks like.
  • Anders Hejlsberg's talk on Modern Compiler Construction. Helps you understand the difference between the traditional approach to compilation and new approaches, with regards to incremental recompilation, analysis of incomplete code, etc. It's a bit more advanced, but very interesting nevertheless.

    In addition, just reading through the source code of open-source compilers such as Go's or Rust's helped immensely. You don't have to worry about understanding everything - just read, understand what you can, and try to recognize patterns.

    For example, here's Rust's parser. And here's Go's parser. These are for different languages, written in different languages. But they are both hand-written recursive descent parsers - basically, this means that you start at the 'top' (a source file) and go 'down', making decisions as to what to parse next as you scan through the tokens that make up the source text.

    I've started reading the 'Dragon Book', but so far, I can't say it has been immensely helpful. Your mileage may vary.

    You may also find the talk 'Growing a language' interesting, even though it's not exactly about compiler construction.

    EDIT: grammar
u/wicker0 · 1 pointr/osdev

The Design of the Unix Operating System is a classic. It's from the 80's, but still plenty relevant. It's very well written, with plenty of diagrams to help you along.

It doesn't quite start from the very beginning. If you're looking for information on how to start with absolutely nothing (ie, write a bootloader, implement basic device drivers, etc), then you'll need to supplement with other sources. It does, however, do a really great job of explaining things like processes, threads, memory management, and other basic concepts. It doesn't give you source code (though it contains a bit of pseudocode), but it explains in succinct, legible prose, the data structures and algorithms that drive core functionality. Again, it's an old book - $6.00 plus shipping used. Can't really go wrong.

Operating Systems Design and Implementation covers basically the same ground. I prefer the former, as it treats you a little more like an adult and skips straight to explaining how concepts are implemented (and the cover art is just so undeniably classic).

u/SamHennessy · 1 pointr/PHP

When I said "I can see how maybe they could be useless to you.", that's because I instantly knew what kind of programmer you are. You're a low-level guy.

I have a copy of "Algorithms in a Nutshell" (http://www.amazon.com/Algorithms-Nutshell-In-OReilly/dp/059651624X) but I never finished it. My favorite programming book may be "Patterns of Enterprise Application Architecture" (http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420). Neither of these books is language-specific, but I don't think they could be further apart in every way. Both are very valuable and I appreciate that they both exist.

There is a good number of reasons that you should maximize your use of the built-in PHP functions (http://webandphp.com/5reasonstomaximizeyouruseofPHP%E2%80%99sbuiltinfeatures). My book is an attempt to come up with a system that will help you learn all of the built-in PHP functions by giving a realistic use case that could be applied in your everyday work.

Being a PHP programmer, it is much more useful to know what functions PHP has for array sorting, than it is to know how to implement array sorting in PHP code.

u/lifelongintent · 3 pointsr/suggestmeabook

This is so thoughtful! Very similar to Hyperbole and a Half is The Oatmeal, which is another sardonic blog with funny cartoons that has a book of its best content. I also highly recommend XKCD's book "Thing Explainer", which is a highly informative and entertaining read. Wishing your friend the best!

u/Yelneerg · 1 pointr/embedded

Course 1 was definitely useful but I also found it pretty easy. I've been busy with other things for the last several days (mostly learning Fusion 360) so I'm still on the UART lesson. I think the most useful part so far has been learning more about design patterns. I've been concurrently reading Making Embedded Systems and the combination of the book and the course has been great.

u/case-o-nuts · 3 pointsr/compsci

It seems that most introductory texts focus on parsing. However, in my experience, the dragon book does a good job on introductory code generation. Appel's Tiger Book had good information as well. As a heads up, the C and Java versions of the same book are done as an afterthought, and you can tell that the code was translated after the fact. Stick with the ML version.

For optimization algorithms, I've heard good (and bad) things about Muchnick: Advanced Compiler Design and Implementation.

However, I've had better luck just reading various papers. If there's a specific part of code generation and emission, I can point you to plenty of good papers.

u/FattyBurgerBoy · 6 pointsr/webdev

The book, Head First Design Patterns, is actually pretty good.

You could also read the book that started it all, Design Patterns: Elements of Reusable Object-Oriented Software. Although good, it is a dull read - I had to force myself to get through it.

Martin Fowler is also really good; in particular, I thoroughly enjoyed his book Patterns of Enterprise Application Architecture.

If you want more of an MS/.NET slant of things, you should also check out Dino Esposito. I really enjoyed his book Microsoft .NET: Architecting Applications for the Enterprise.

My recommendation would be to start with the Head First book first, as this will give you a good overview of the major design patterns.

u/st4rdr0id · 2 pointsr/androiddev

Hey, that is the million dollar question. But because software is not an engineering discipline, there is actually no reference book on SW architecture. Certainly there are books talking about this, but usually covering only some aspects and without real application examples.

Notice that in iOS programming the system imposes a great part of the architecture, so these guys are usually less concerned. But in Android we have more freedom, and the API actually encourages really bad practices (thanks Google). Because of this we are all a bit lost. Nowadays layered architecture and MVP seems to be the most popular approach, but then again everybody produces a different implementation...

Specifically for Clean Architecture you should read its author, Robert C. Martin. AFAIK this is not covered in detail in his books. You can read this blog post and watch this video. Other designs usually coming up in conferences are the Onion Architecture and the Hexagonal Architecture. But make no mistake: there's no route map on how to implement any of those, and examples claiming to follow this or that approach are usually not written by the authors of the architecture.


For DDD there is a very good book by Scott Millett with actual examples. But this style is meant for large enterprise backend apps, and the author himself advises against using it in small apps. So I'd say it is overkill for Android, but of course you could reuse some concepts successfully.


There's also Software Architecture in Practice 3rd, but having read the 2nd edition I can tell you this is just smoke.


Probably the best book to date is Fowler's, but it is more a patterns compilation than an architecture guide.

u/mcur · 14 pointsr/linux

You might have some better luck if you go top down. Start out with an abstracted view of reality as provided by the computer, and then peel off the layers of complexity like an onion.

I would recommend a "bare metal" approach to programming to start, so C is a logical choice. I would recommend Zed Shaw's intro to C: http://c.learncodethehardway.org/book/

I would proceed to learning about programming languages, to see how a compiler transforms code to machine instructions. For that, the classical text is the dragon book: http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

After that, you can proceed to operating systems, to see how many programs and pieces of hardware are managed on a single computer. For that, the classical text is the dinosaur book: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/1118063333 Alternatively, Tanenbaum has a good one as well, which uses its own operating system (Minix) as a learning tool: http://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/0136006639/ref=sr_1_1?s=books&ie=UTF8&qid=1377402221&sr=1-1

Beyond this, you get to go straight to the implementation details of architecture. Hennessy has one of the best books in this area: http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X/ref=sr_1_1?s=books&ie=UTF8&qid=1377402371&sr=1-1

Edit: Got the wrong Hennessy/Patterson book...

u/FallingStar7669 · 2 pointsr/KerbalSpaceProgram

Science mode limits the available parts until you do the science to unlock more, without having to deal with restrictions like funding. You're almost literally forced to start simple, which is very useful given the steep curve of this game.

I'm no education expert, but I've been playing games since the NES came out. And what I've seen of this coming generation, they're pretty sharp, even if their reading skills are limited. Don't expect a 4 year old to understand delta-v, but fully expect them, after a few weeks of play, to not need to worry about it. If they can survive the steep learning curve, they'll know what engine they want by the picture (most of us do anyway) and they'll know what it does because they tried it and saw for themselves. It might be useful at the very least to explain "this one makes you go fast but uses up all your fuel, this one makes you go slow but uses less fuel" and stuff like that. Basically, talk to them as if you're quoting this book.

A child's mind is a very wondrous machine. If nothing else, trust that, if their interest is strong enough to overcome their failures, they will blow you away sooner than you could ever realize.

u/bmarkovic · 6 pointsr/webdev

POSA books by Buschmann are considered the textbooky, Knuth/SICP-level stuff from when I studied, and are the architecture equivalent of GoF's design thing, but as a consequence they're also huge on OOP. The Fowler Book is even more preachy and OOPy, but many things are still relevant. The third would be Uncle Bob's Clean Architecture (sorry for cryptic refs, am on mobile; just Google the refs, you'll find them).

On the systems design front, one should learn ESB and SOA as they are patterns still relevant in this microservices world, but most books on the subject are tied to particular tech (and it's often the wrong tech, like IBM, Oracle, or MS proprietary, untranslatable/untransferable stuff). I've heard good things about Thomas Erl's books [1].

I've recently read Sam Newman's book on Microservices, and while it does have a lot of zeitgeist in it, at least it's current zeitgeist, and the book is decent.

Edits:

  • On keyboard now, added links.
  • [1] Arcitura, Thomas Erl's company, has informative (if a bit ugly) websites on both classical SOA patterns and Microservice patterns. Again, it's buzzwordy, preachy, enterprisey CIO-talk, but if you cut through it there are some good overviews of various systems design patterns there, and they're a quick way to ingest the concepts before dedicating your time to these huge tomes.
  • The mentioned books have aged well in general but some of the ideas they propose haven't aged that well so I'd like to dedicate a few bullet points to those:
    • MVC in particular has lately fallen out of grace as a UI pattern and has become relegated to the backend as a data-presentation pattern (i.e. how you design, say, JSON API backends wrt DB access and transforming to JSON), whereas front-end UI has migrated to the MOVE pattern which, in GoF-speak, mostly relates to MVC by replacing MVC's core Observer pattern with a Reactive-programming Observable.
    • Active Record ORMs (think Hibernate) have fallen from grace and are becoming replaced with SQL building DSLs like Linq or ORMs with SQL builders below them. DTOs have also given way to either Monad-ic data access objects or swung back to the pre-AR/pre-DTO concept of Data Gateways (more common with Linq-style DSLs).
    • Reactive design in combination with Message Queuing has become more and more the method of choice for managing distributed state in SOAs.
u/abstractifier · 22 pointsr/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/cythrawll · 1 pointr/PHP

Honestly I haven't read a "PHP book" in ages, so I am a very bad source for critique; the majority of the ones I have come across are painfully outdated and some are outright inaccurate. My suggestion for learning PHP programming better is to try books that have to do with programming in general. Books on object orientation and patterns, like GOF http://en.wikipedia.org/wiki/Design_Patterns or PoEAA http://www.amazon.com/Enterprise-Application-Architecture-Addison-Wesley-Signature/dp/0321127420, are great for learning Object Oriented principles.

But those will only help somewhat. What really helped me become a better PHP programmer was to study other languages, then study their web frameworks, then take what I learned back to PHP. Find out why one aspect is pretty common in language X's frameworks, but not PHP frameworks. How do other languages' frameworks solve the dependency issue, etc. Some languages I suggest learning are actually other ones that are pretty mainstream in web programming: Python for its typing and how it compares to PHP; Ruby for how mixins relate to PHP traits; and Java, as there are quite a few aspects that PHP stole off Java in its OO design.

u/Elynole · 1 pointr/nfl

I'll throw out some of my favorite books from my book shelf when it comes to Computer Science, User Experience, and Mathematics - all will be essential as you begin your journey into app development:

Universal Principles of Design

Dieter Rams: As Little Design as Possible

Rework by 37signals

Clean Code

The Art of Computer Programming

The Mythical Man-Month

The Pragmatic Programmer

Design Patterns - "Gang of Four"

Programming Language Pragmatics

Compilers - "The Dragon Book"

The Language of Mathematics

A Mathematician's Lament

The Joy of x

Mathematics: Its Content, Methods, and Meaning

Introduction to Algorithms (MIT)

If time isn't a factor, and you're not needing to steamroll into this to make money, then I'd highly encourage you to start by using a lower-level programming language like C first - or, start from the database side of things and begin learning SQL and playing around with database development.

I feel like truly understanding data structures from the lowest level is one of the most important things you can do as a budding developer.


u/e7hz3r0 · 2 pointsr/learnprogramming

Heh, that's a loaded phrase because people haven't agreed on what it means.

So I agree with both the other posters in that it can include the stack but usually implies a deeper design understanding.

To me, it doesn't make much sense to ask about a rails app's architecture without going into the tech stack, precisely because 1) rails apps have the same basic architecture (MVC) and 2) the rest of the stack is actually part of the application. Do you use MySQL or Postgres or something else? How many rails servers do you have? How many database servers are there and how are they replicated? Etc etc.

However, when you're talking about apps that don't have a given, accepted base design then it's really important to know how it's designed.

I'm going to use the phrase design and architecture interchangeably here, but one could argue they're slightly different.

The architecture of an app influences the "non-functional" characteristics it embodies (also called quality attributes). Furthermore, and more importantly, the architecture itself is (or should be) influenced by the desired non-functional characteristics.

What do I mean by non-functional characteristics? Stuff like:

  • Performance
  • Security
  • Modifiability
  • Modular
  • Testability
  • etc.

    If you think about it, these things are difficult and expensive to change down the road. If you want to add security to an app that's highly modular, you will have a lot of work due to the high amount of decoupling throughout the app. Or imagine trying to add performance to a highly modifiable app. Modifiability usually implies low coupling between parts which also, usually, impacts performance.

    So when you think about the architecture of an app, it's how the larger parts are put together to express these non-fuctionals. This can get down to the level of design patterns like MVC (modifiability) and dependency injection (testability) but it starts at a higher level where you look at things like Java packages instead of classes, as an example.

    There are a number of books on amazon about this but here are 2 (I've read the first, but not the second):
  • Software Architecture in Practice
  • Clean Architecture
u/lordvadr · 2 pointsr/AskComputerScience

We wrote a compiler for one of my CS classes in college. The language was called YAPL (yet another programming language).

First things first: as others have mentioned, a compiler translates from one language to another...typically assembly...but could be any other language. Our compiler compiled YAPL, which was a lot like Pascal, into C, which we then fed to the C compiler...which in turn was fed to the assembler. We actually wrote working programs in YAPL. For my final project, I wrote a functional--albeit VERY basic--web server.

With that said, it's quite a bit different for an interpreted language, but the biggest part for each is still the same. By far, the most complicated part of a compiler is the parser.

The parser is what reads a source code file and does whatever it's going to do with it. Entire bookshelves have been written on this subject, and PhD's given out on the matter, so parsing can be extremely complicated.

In a theoretical sense, higher level languages abstract common or more complicated tasks from the lower level languages. For example, to a CPU, variables don't have sizes or names, neither do functions, etc. On one hand, it greatly speeds up development because the code is far more understandable. On the other hand, certain tricks you can pull of in the lower-level languages (that can vastly improve performance) can be abstracted away. This trade-off is mostly considered acceptable. An extra $500 web server (or 100 for that matter) to handle some of the load is far less expensive than 10 extra $100,000 a year x86 assembly developers to develop, optimize, and debug lower level code.

So generally speaking, the parser looks for what are called tokens, which is why there are reserved words in languages. You can't name a variable int in C because int is a reserved word for a type. So when you name a variable, you're simply telling the compiler "when I reference this name again, I'm talking about the same variable." The compiler knows an int is 4 bytes, and so does the developer. When it makes it into assembly, it's just 4 bytes somewhere in memory.

So the parser starts looking for keywords or symbols. When it sees int, the next thing it's going to expect is a label, and if that label is followed by (, it knows it's a function, if it's followed by ; it's a variable--it's more complicated than this but you get the idea.

The parser builds a big structure in memory describing what's what and, essentially, the functionality. From there, either the interpreter goes through and interprets the language, or, for a compiler, that structure gets handed to what's called the emitter. The emitter is the function that spits out the assembly (or whatever other language) equivalent of whatever a = b + c; happens to be.

This is complicated, but if you take it in steps, it's not really that hard. This is the book we used. There's a much newer version out now. If I can find my copy, I'll give it to you if you pay shipping. PM me.

u/pop-pop-pop-pop-pop · 14 pointsr/javascript

Not a magic bullet but these helped me:

  • Study the structure of big popular open source projects like lodash or JQuery that have been around for awhile, learn how they structure and organize their code and borrow from it.

  • A book like Clean Architecture might help too.

  • Understand how JavaScript works under the hood and computers in general so you have a better understanding of the whole system, this involves learning low level documentation.

  • Get really good with OOP.

  • Code->Refactor->Code->Refactor, apply and reiterate all the stuff you've learned and see if it works.

    Disclaimer: I'm a pretty terrible programmer, but I used to be a lot worse.
u/110100100_Blaze_It · 2 pointsr/learnprogramming

It's on my to-do list, but this is something that I want to get right. I don't think I could fully appreciate it without a more formal approach. I'm currently working through this, and will try my hand at the subject afterward. I will definitely check out Professor Might's insight on the subject, and I would gladly take up any other resources you might have to offer!

u/shittyNaturalist · 2 pointsr/engineering

I really would recommend Randall Munroe's Thing Explainer. When I started doing propulsion work, I actually used it as a reference because it's easy to reference and has a pretty strong foundation on a number of things at a very accessible level. As u/zaures mentioned, The Way Things Work (any edition) is excellent and in much the same vein.

u/greenlambda · 9 pointsr/ECE

I'm mostly self-taught, so I've learned to lean heavily on App Notes, simulations, and experience, but I also like these books:
The Howard Johnson Books:
High Speed Digital Design: A Handbook of Black Magic
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_api_I0Iwyb99K9XCV
High Speed Signal Propagation: Advanced Black Magic
https://www.amazon.com/dp/013084408X/ref=cm_sw_r_cp_api_c3IwybKSBFYVA

Signal and Power Integrity - Simplified (2nd Edition)
https://www.amazon.com/dp/0132349795/ref=cm_sw_r_cp_api_J3IwybAAG9BWV

Also, another thing that can be overlooked is PCB manufacturability. It's vitally important to understand exactly what can and can't be manufactured so that you can make design trade offs, and in order to do that you need to know how they are made. As a fairly accurate intro, I like the Eurocircuits videos:
http://www.eurocircuits.com/making-a-pcb-pcb-manufacture-step-by-step

u/LiquidLogic · 2 pointsr/arduino

Check out Simon Monk's book: Programming Arduino, Getting Started with Sketches. I found it a great starter book; it was easy to understand and follow.

As for your keyboard interface.. it sounds like you will need the serial monitor running and waiting for a key-press. Arduino: SerialAvailable

Hope that gets you moving in the right direction! GL!

u/LXXXVI · 2 pointsr/learnprogramming

Thanks, I'm sure you will. It's just a question of getting that first success. Afterwards, it gets much easier, once you can point at a company and say "Their customers are using my code every day."

As for the interviews, I don't know, I'm honestly not the type to get nervous at interviews, either because I know my skill level is most likely too low and I take it as a learning experience, or because I know I can do it. I'd say that you should always write down all the interview questions you couldn't answer properly and afterwards google them extensively.

Besides, if you're from the US, you have a virtually unlimited pool of jobs to interview for. I live in a tiny European country that has 2 million people and probably somewhere in the range of 20 actual IT companies, so I had to be careful not to exhaust the pool too soon.

Funnily enough, right now, my CTO would kill for another even halfway competent nodejs developer with potential, but we literally can't find anyone.

Anyway, I'm nowhere near senior level, but I can already tell you that the architecture:language part is something your bootcamp got right. To that I would add a book my CTO gave me to read (I'm not finished yet myself, but it is a great book): Patterns of Enterprise Application Architecture. Give it a look. I suspect that, without ever having tried to implement a piece of architecture like that, it won't make much sense beyond the theoretical, but I promise you it's worth its weight in gold once you start building something more complex and have to decide how to actually do it.

u/EngineerBill · 3 pointsr/arduino

I was pretty happy with "Programming Arduino - Getting Started With Sketches" by Simon Monk. It provides a good overview of C and the various steps needed to get to working code. I already had a lot of coding experience but knew nothing about Arduino when I started, so it brought me up to speed quickly, but I think it would be useful for beginners as well.

It's available on Amazon for sub-$9: -> and he has a site which has a fair amount of errata, etc.: ->

u/smith7018 · 1 pointr/jailbreak

It's hard to say because that's a more advanced area of computer science called computer architecture, and I learned it in a college setting. With that being said, I've heard good things about this textbook. Hopefully whichever book you pick up on ARM assembly will have a few chapters going over how the processor functions.

Good luck! :)

u/frenchy_999 · 6 pointsr/learnprogramming

Not sure if it's quite what you're looking for, but Computer Organization & Design - The HW/SW Interface is a fantastic book on processor architecture and uses the MIPS design as an example through the entire text, including good stuff on MIPS assembly programming. The link is for the latest edition (fourth), but if you want to go cheaper the third edition is still available. I used (and still use; I'm about to tutor a course on CompArch) the third edition and it's one of the most useful texts I have ever owned.

u/ndanger · 1 pointr/compsci

Upvote for Domain-Driven Design, it's a great book. Depending on the size of the system, Martin Fowler's PoEAA might also be helpful.

Also what dethswatch said: what's the audience & scope; i.e. what's in the previous document? If you're presenting three architectures you probably need enough detail that people can choose between them. That means knowing how well each will address the goals, some estimate of implementation effort (time & cost), limitations, future-proofing, etc.

Finally, IMHO, this really isn't computer science. You might have better luck asking in /r/programming/ or the new r/SWArchitecture/

u/frankenbeans · 2 pointsr/ECE

Amazing? These look like they were swiped from an overview lecture; there isn't any really good explanation in here. If this is all new to you they might be a good starting point for learning some basic concepts and vocabulary of signal integrity.

Johnson's Black Magic book is the general reference for this. There are many other (well written) white papers out there. Ott and Bogatin have good books as well.

u/mitchell271 · 2 pointsr/audioengineering

Software dev checking in. If you want to go into plugin design, make sure you read books like The Scientist And Engineer's Guide to Digital Signal Processing, and have a heavy focus on algorithms, physics, and matrix math.

There are SDKs and APIs to help though. The Steinberg VST SDK is how VST plugins are made, and it removes a lot of the underlying math that you need to know. Writing multi-threaded C code with a library like OpenMP will also help, as your plugins will be more efficient, resulting in less latency.

u/dishayu · 2 pointsr/NoStupidQuestions

I also recommend the book named Thing Explainer which uses the 1000 most common words to explain complicated things. It has a lot of simple, funny pictures and is fun to read overall (the tone is not serious at all). I'd be happy to ship you a copy if you like. Just PM me your address.

u/phao · 2 pointsr/java

Well, some books can help:

  • There is the design patterns book (Gang of Four) => http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/
  • Another book on patterns, targeting enterprise applications (i.e. information systems) => http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/

    These books on patterns tend to be good on teaching a lot of what you're asking. Largely because you've named some patterns in your question, but also because many patterns are about:

  • Identifying the need for something X to be changed without affecting another thing Y with which X is coupled; and
  • separating X from Y in a way that allows X to change independently from Y.

    There are several ways to say what I just did in there. You're allowing X to vary independently from Y. This makes X a parameter of Y, which is yet another way to say it. You're separating what is likely to change often (X) from what doesn't need to be affected by that change (Y).

    One benefit of this is that a reason to change X no longer forces a change in Y, because X can be changed independently of Y. Another is that X can largely be understood without looking at Y. This is a core guiding rule of the separation of concerns principle: the concern X is separated from the concern Y. Now, a lot of activities you want to do with X can be performed independently of Y.

    You probably know all of this, so I'm sorry if this isn't much help. But just to finish, a classic example of this is a sorting function (the Y) and the comparison criterion (the X). Many people, in many projects, would like a change in the comparison criterion not to force a change in the sorting function. They're two separate concerns we'd like to deal with separately. Therefore, the comparison criterion, as commonly done today, is a parameter of sorting. Here the word "parameter" is used both in the sense of a function parameter in the source code and in the more general sense of something being a parameter of something else, which can be one of many things and may change over time.
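The sorting example above can be made concrete. Here is a minimal Python sketch (the data is invented for illustration): the sort routine (Y) stays fixed while the criterion (X) is passed in as the `key` parameter.

```python
# The comparison criterion (X) is a parameter of the sorting routine (Y),
# so either can change without touching the other.

people = [("Alice", 34), ("Bob", 28), ("Carol", 41)]

# Sort by age: the criterion lives in the `key` argument, not in the sort.
by_age = sorted(people, key=lambda p: p[1])

# A new requirement (sort by name) changes only the parameter:
by_name = sorted(people, key=lambda p: p[0])

print(by_age)   # [('Bob', 28), ('Alice', 34), ('Carol', 41)]
print(by_name)  # [('Alice', 34), ('Bob', 28), ('Carol', 41)]
```

Swapping criteria never touches the sorting implementation, which is exactly the separation of concerns being described.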
u/theevilsharpie · 0 pointsr/linux

&gt; I want to learn how linux (and computers) work.

If you want to learn how Linux (and computers) work, take a course on operating system design and development. It's offered at any university that has a respectable computer science program, and you can probably find online courses that teach it for free. If you're more of a self-starter, grab a textbook and work your way through it. A book on the internal workings of Linux in particular might also be helpful, but IMO the development of the Linux kernel is too rapid for a book to provide a useful up-to-date reference.

If you want to learn Linux as a day-to-day user (which is what I suspect you're looking for), pick Ubuntu or one of its derivatives. They are easy to get up and running, while still allowing you to "spread your wings" when you're ready.

u/dolphinrisky · 5 pointsr/Physics

Ah gotcha, yeah to be honest this approach probably won't be terribly illuminating. The problem is that the D-Wave really doesn't work in any kind of classically equivalent way. When you think about algorithms classically, the procedure is highly linear. First you do this, then that, and finally the other. The D-Wave One involves nothing of the sort.

Here's a quick rundown of what a quantum annealing machine actually does, with analogies to (hopefully) clarify a few things. In fact, an analogy is where I'll start. Suppose you had a problem you were working on, and in the course of trying to find the solution you notice that the equation you need to solve looks just like the equation describing how a spring moves with a mass hanging from it. Now you could continue your work, ignoring this coincidence, and solve out the equation on your own. Alternatively, you could go to the storage closet, grab a spring and a mass, and let the physics do the work for you. By observing the motion of the spring, you have found the solution to your original problem (because the equations were the same to begin with).

This is the same process used by the D-Wave One, but instead of a spring and a mass, the D-Wave system uses the physics of something called an Ising system (or model, or problem, etc.). In an Ising system, you have a series of particles^ with nonzero spin that can interact with each other. You arrange this system so that you can easily solve for the ground state (lowest energy) configuration. Now with the system in this ground state, you very, very slowly vary the parameters of the system so that the ground state changes from the one you could easily solve to one that you can't. Of course this new ground state, if you've done things correctly, will be the solution to the problem you were actually concerned with in the first place, just like the spring-mass example above.

So perhaps now I have explained at least a little bit of why I don't call the D-Wave One a "computer". It doesn't compute things. Rather, by a happy coincidence, it sets up an experiment (i.e. the Ising system) which results in a measurement that gives you the answer to the problem you were trying to solve. Unfortunately for you, the software engineer, this resembles precisely nothing of the usual programming-based approach to solving problems on a classical computer.
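As a toy illustration of the Ising idea above (this is a classical brute-force sketch with made-up couplings, nothing like what the D-Wave hardware actually does): the "answer" is whatever spin configuration minimizes the system's energy.

```python
from itertools import product

# Toy 4-spin Ising chain: E(s) = -sum_i J[i]*s[i]*s[i+1], with s[i] in {-1, +1}.
# The couplings J are invented for illustration; a real problem instance
# would encode its answer in the couplings, and the ground state reads it out.
J = [1.0, -1.0, 1.0]

def energy(spins):
    return -sum(J[i] * spins[i] * spins[i + 1] for i in range(len(J)))

# Enumerate all 2^4 configurations classically; an annealer instead lets
# the physics relax into this minimum without enumerating anything.
ground = min(product([-1, 1], repeat=4), key=energy)
print(ground, energy(ground))
```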

My advice is this: if you want to learn some quantum computing, check out An Introduction to Quantum Computing by Kaye, Laflamme, and Mosca, or the classic Quantum Computation and Quantum Information by Nielson and Chuang.

^ They don't actually have to be single particles (e.g. electrons); they are only required to have spin interactions with each other, as this is the physical mechanism on which computations are based.

Edit: Okay, this was supposed to be a reply to achille below, but apparently I'm not so good with computers.

u/DonaldPShimoda · 8 pointsr/ProgrammingLanguages

I've peeked at this free online book a few times when implementing things. I think it's a pretty solid reference with more discussion of these sorts of things!

Another option is a "real" textbook.

My programming languages course in university followed Programming Languages: Application and Interpretation (which is available online for free). It's more theory-based, which I enjoyed more than compilers.

But the dragon book is the go-to reference on compilers; it's slightly old but still good. Another option is this one, which is a bit more modern. The latter was used in my compilers course.

Outside of that, you can read papers! The older papers are actually pretty accessible because they're fairly fundamental. Modern papers in PL theory can be tricky because they build on so much other material.

u/JonasY · 1 pointr/raspberry_pi

I've tried doing something similar for x86 about a decade ago.

I wrote this GUI a while back from scratch; it could run from DOS (with support for things like loading images into memory). All the controls you see in the picture worked as they should. I believe it only took 3000-5000 lines of code to write. How I started: I already knew x86 assembly (not really needed for the GUI part, unless you want to optimize) and C. I found some sort of Linux bootloader that had a GUI, looked through its code, and got a basic idea of how to write one. Its author used C++ (just for classes and inheritance), which is what I used. So for a GUI I recommend learning C and a bit of C++, then finding this bootloader (I don't remember the name) or some other relatively small project that has its own GUI and seeing how it's made.

For the OS part, I recommend a book called "Operating Systems Design and Implementation (3rd Edition)". You will need to know C and x86 assembly to understand it. It discusses how a particular OS named Minix was made. Linus Torvalds used the first edition of this book to write the first version of Linux.

You could google for something like "osdev", "osdev gui".

u/levu-webworks · 0 pointsr/learnprogramming
  • The "Red Dragon Book of Compiler Design"
  • Compiler Design in C

    Both books I've read. The latter sits on my bookshelf. It was a gift from my girlfriend. Please don't waste your time trying to implement a compiler. It's a PhD level endeavor that will take years of dedicated 60 hour work weeks.

    Here are the same links linked from my Amazon affiliates account:

  • The Red Dragon Book of Compiler Design
  • Compiler Design in C


    You are better off implementing an algebraic calculator using LR parsing. Start with Tom Torf's Programmer's Calculator - PCalc. It's written in C and pretty simple. You can fork it from my GitHub account if you have trouble finding Tom's source archive. Tom (may he rest in peace) also wrote several programming tutorials and contributed to comp.lang.c, alt.lang.c, and the comp.lang.c FAQ.
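A calculator like that is a nice-sized first project. The sketch below uses recursive descent rather than LR parsing (easier to hand-write; an LR version would normally come from a generator like yacc/Bison), with the assumed grammar noted in comments:

```python
import re

# Minimal four-function calculator, no error handling.
# Grammar (illustrative):
#   expr   := term (('+'|'-') term)*
#   term   := factor (('*'|'/') factor)*
#   factor := NUMBER | '(' expr ')'

def tokenize(src):
    return re.findall(r"\d+\.?\d*|[-+*/()]", src)

def parse_expr(toks):
    val = parse_term(toks)
    while toks and toks[0] in "+-":
        op = toks.pop(0)
        rhs = parse_term(toks)
        val = val + rhs if op == "+" else val - rhs
    return val

def parse_term(toks):
    val = parse_factor(toks)
    while toks and toks[0] in "*/":
        op = toks.pop(0)
        rhs = parse_factor(toks)
        val = val * rhs if op == "*" else val / rhs
    return val

def parse_factor(toks):
    tok = toks.pop(0)
    if tok == "(":
        val = parse_expr(toks)
        toks.pop(0)  # discard ')'
        return val
    return float(tok)

def calc(src):
    return parse_expr(tokenize(src))

print(calc("2 + 3 * (4 - 1)"))  # 11.0
```

Extending it (unary minus, variables, exponentiation) maps directly onto new grammar rules.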
u/farox · 1 pointr/cscareerquestions

This here is my patterns bible:

https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420

As for .Net and SQL Server, it really depends on what you want to get into. Both have such a huge field of applications. In general MS Press is really good for books on their own stuff and written well enough that you can actually read through it.

Edit: But yeah, I just realized that the Fowler book is also already 14 years old. I need to update that as well :)

u/Scaliwag · 2 pointsr/gamedev

Regarding sandboxing: in Lua, at least, from what I know you can have minute control over what libraries scripts have access to, and users can only import other libraries if you allow them to (by exposing a "library" that imports other libraries :-).

Perhaps you should look into formal languages and parser generators, so you can create more complex languages if you feel like it. Even if you build the parsers yourself having the language specified, factorized and so on, helps a lot. The dragon book is a good choice, although it presupposes you know a bit about specifying a formal language IIRC. If you're a student (I know how it is!) then even the old dragon book is an excellent read and it's very cheap.

u/apcragg · 3 pointsr/RTLSDR

The chapter on quadrature signals in this book is really good. It has some of the best illustrations of the concept that I have come across. The amazon link also lets you browse that chapter for free.

u/MatrixManAtYrService · 2 pointsr/IWantToLearn

Just going from a bunch of hardware to the point where you can input machine code to be executed is a vast topic in itself (and something I don't have knowledge of). Once you can input machine language and have it execute though, I at least have an idea.

You can use machine code to write an assembler, which is a lot of work but not particularly complex.

You can use an assembler to write a compiler (good luck with this one, I'm in compiler design right now and it's a mind blow).

You can use a compiler to write pong.

There are many topics that you can really get acquainted with by just wandering the web. I don't think this is one of them. Once you get it you can really go some complex places, so what you're likely to find online is either too simple or too complex for the understanding you seek. With dedication a book can probably help you, but if you can make nice with a teacher, auditing a computer organization/assembly language class will really open your eyes to what is going on in there.

Take a look at the course listing at a local college and e-mail the teacher, see if they'll let you audit their class.

This was my textbook for that class, it's decent. Maybe you can find an early edition for cheap:
http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938/

u/dnew · 1 pointr/worldnews

> Is this a realistic goal

Yes, quite. The bits you are going to be missing are some of the mathematical underpinnings. Depending on what you're programming, you'll also want to grab books on the particular topic at hand that don't try to teach you programming at the same time.

For example, if you want to learn why C# is object-oriented and what that means and how to use it, grab a copy of this book: http://en.wikipedia.org/wiki/Object-Oriented_Software_Construction

If you want to learn how relational databases work, read this one http://www.amazon.com/Introduction-Database-Systems-8th-Edition/dp/0321197844 (You can easily find online versions, but I didn't investigate whether they were legally released or not.)

You want to write a compiler? Grab the "dragon book": http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

None of those teach you how to program. They teach you the math and background behind major inventions in programming. Keep up with those, find a local mentor who enjoys talking about this stuff, and you'll do fine.

u/tluyben2 · 1 pointr/programming

I would very much recommend http://www.amazon.com/Black-Video-Game-Console-Design/dp/0672328208 ; it goes really far, explaining everything from the quantum level up to a working game console. And after that: http://www.amazon.com/Operating-Systems-Design-Implementation-3rd/dp/0131429388/ . Then you'll be set.

Edit: Although the Black Art book is about game consoles, if you work through it you can build your own computer in the end. Or, what I really like, you can pick up old machines (80s/early 90s) from eBay for < $5, open them up, understand them, and change them. As they are not 'one chip' with some power supply stuff, almost everything is held in separate ICs, so you can follow the PCB and actually see what it is doing and how. Great fun. And it scales: I have no issue making digital things with FPGAs etc. because I know it at this level.

u/BaconWraith · 2 pointsr/compsci

Cheers man! The Dragon Book is a great place to start, and there's always this, but mainly it's about facing each problem as you come to it and hoping for the best :P

u/Lericsui · 26 pointsr/learnprogramming

"Introduction to Algorithms" by Cormen et al. is for me the most important one.

The "Dragon" book is maybe another one I would recommend, although it is a little bit more practical (it's about language and compiler design, basically). It will also force you to do some coding, which is good.


Concrete Mathematics by Graham, Knuth, and Patashnik (you should know these names) is good for mathematical basics.


Modern Operating Systems by Tanenbaum is a little dated, but I guess anyone should still read it.


SICP (although married to a language) teaches very very good fundamentals.


Be aware that the stuff in the books above is independent of the language you choose (or the book chooses) to outline the material.

u/kmafb · 0 pointsr/IAmA

For lexing and parsing, you should just pick up a compiler book. You could bang your head against it your whole life without figuring it out, and the Right Answer is not that hard if you have someone to show it to you. There are lots of good ones; the classic is the "dragon book" (http://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886).

Beyond that, VMs are a big topic. They include all of compilers, and almost all of systems programming. The Smith and Nair book (http://www.amazon.com/Virtual-Machines-Versatile-Platforms-Architecture/dp/1558609105) is a great jumping off point. But so is playing around with a project that means something to you. It depends what you find more rewarding.

u/big-ookie · 3 pointsr/csharp

I would strongly recommend reading this book

https://www.amazon.com/Clean-Architecture-Craftsmans-Software-Structure/dp/0134494164

It should be mandatory reading in all CS and SE degrees IMO.

It will not answer your specific question, but it will provide you with the tools and knowledge to understand how best to approach the problem and ensure your architecture and design are well thought through and draw on the learnings of those who have come before us.

u/Enlightenment777 · 42 pointsr/ECE

-----
-----

BOOKS


Children Electronics and Electricity books:

u/The_Masked_Lurker · 1 pointr/talesfromtechsupport

Going to a private, but non-profit institution, its cool.

(as a matter of fact, a friend has friends that go to bent state university and after comparing physics hw found that, well our curriculum is much harder, I guess their intro final had a "draw a line to match the term to its definition" type thing)

Anywho, one of our compsci upper-level courses is based on this book: http://www.amazon.com/Computer-Organization-Design-Fourth-Architecture/dp/0123744938/ It goes through and explains computer architecture for an actual CPU; I don't recall how easy it is to read, however. (If you buy it and it makes no sense, the intro book we use was called "An Invitation to Computer Science", but get an edition or two back from current if you buy.)

Finally, you can find a bunch of info here: http://ocw.mit.edu/index.htm

u/a-schaefers · 3 pointsr/unixporn

Initially I watched an episode of lunduke hour that featured a freeBSD dev https://www.youtube.com/watch?v=cofKxtIO3Is

I like the documentation available to help me learn. I got my hands on the FreeBSD handbook and can't wait to get into the design and implementation textbook (Addison-Wesley, 928 pages): https://www.amazon.com/Design-Implementation-FreeBSD-Operating-System/dp/0321968972/

I appreciate the focus on servers and research computing that is BSD's strong suit.

u/vladmihalceacom · 2 pointsr/java

> Yes, the Native Query and access to Connection is always THE Hibernate's answer to all the lacking support of basic SQL features like Window Functions or being able to count aggregated results.

That's a very common misconception. Hibernate is not a replacement for SQL. It's an alternative to the JDBC API that implements the enterprise patterns described by Martin Fowler in his book.

There are many alternatives to JPA or Hibernate. In fact, I'm also using jOOQ, and I like it a lot. I wrote about it, and I'm using it in my training and workshops as well.

There are things you can do in jOOQ that you can't do with Hibernate, and there are also things you can do with Hibernate that you can't do with jOOQ.

u/HotRodLincoln · 3 pointsr/IWantToLearn

There are books specifically on language design, syntax trees, and unambiguous grammars.

The classic book on compiler design is "the Dragon Book". Designing a compiler carefully is important because a statement in the language should mean exactly one thing, and a language should be able to be compiled efficiently. This is more difficult than it sounds.

Second, you need to understand language design, variable binding, etc. This is a topic of Programming Language Paradigms. I'll figure out a good book for this and edit to add it. The best book probably covers languages like Ada, Haskell, C, and Java and gives an overview of their design and reasons.

edit: The book for design is Concepts of Programming Languages 9th ed, by Robert W. Sebesta.

u/ToTimesTwoisToo · 12 pointsr/C_Programming

C targets a virtual memory system and instruction set architecture (ISA). It's an abstraction over the hardware implementation of the ISA. Those worlds are just different, and you'll gain a better understanding if you just study them separately.

for computer architecture, I've found two books to be most helpful.

https://www.amazon.com/Digital-Design-Computer-Architecture-Harris-ebook/dp/B00HEHG7W2

https://www.amazon.com/Structured-Computer-Organization-Andrew-Tanenbaum/dp/0132916525/

there is a low level operating system book that uses a lot of C code to explain how to build a kernel. This might interest you

https://www.amazon.com/Operating-System-Design-Approach-Second/dp/1498712436

u/antonivs · 18 pointsr/badcode

The code you posted was generated from a grammar definition, here's a copy of it:

http://www.opensource.apple.com/source/bc/bc-21/bc/bc/bc.y

As such, to answer the question in your title, this is the best code you've ever seen, in the sense that it embodies some very powerful computer science concepts.

It [edit: the Bison parser generator] takes a definition of a language grammar in a high-level, domain-specific language (the link above) and converts it to a custom state machine (the generated code that you linked) that can extremely efficiently parse source code that conforms to the defined grammar.

This is actually a very deep topic, and what you are looking at here is the output of decades of computer science research, which all modern programming language compilers rely on. For more, the classic book on the subject is the so-called Dragon Book, Compilers: Principles, Techniques, and Tools.

u/FastEddieTheG · 1 pointr/explainlikeimfive

If you're interested in learning a surprising amount about this without needing heavy technical background, might I recommend a fantastic book, But How Do It Know?

u/powerclaw1 · 1 pointr/mildlyinteresting

I should point out that this (I'm pretty sure at least) comes from Randall Munroe's book Thing Explainer which uses the xkcd art style etc. to explain complicated concepts using the 1,000 most common English words. It's pretty great, check it out if you can!

u/johnweeder · 2 pointsr/learnprogramming

Yes. Do it. It's great to know, and occasionally useful, especially grammars. The dragon book is the only college text I've kept.

https://www.amazon.com/Compilers-Principles-Techniques-Alfred-Aho/dp/0201100886/

u/vplatt · 1 pointr/java

I have this as well, but don't really have any remarks for you. That said, maybe you should look through some of the reviews for it on Amazon or the like. The reviews there seem pretty authentic.

https://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/

u/cronin1024 · 25 pointsr/programming

Thank you all for your responses! I have compiled a list of books mentioned by at least three different people below. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the" this was done by hand and as a result it may contain errors.

edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's.

edit: Updated with links to Amazon.com. These are not affiliate - Amazon was picked because they provide the most uniform way to compare books.

edit: Updated up to redline6561


u/waaaaaahhhhh · 7 pointsr/ECE

There seems to be two approaches to learning DSP: the mathematically rigorous approach, and the conceptual approach. I think most university textbooks are the former. While I'm not going to understate the importance of understanding the mathematics behind DSP, it's less helpful if you don't have a general understanding of the concepts.

There are two books I can recommend that take a conceptual approach: The Scientist and Engineer's Guide to Digital Signal Processing, which is free. There's also Understanding Digital Signal Processing, which I've never seen a bad word about. It recently got its third edition.

u/eitauisunity · 3 pointsr/learnpython

Here is an interesting video where they build a cpu up from the transistor level.

The CPU is only a theoretical one called a "Scott CPU", which was designed by John Scott, who is the author of the book, But How Do It Know?, which is an amazingly straight-forward, easy-to-digest book about computing.

I would recommend it as it was the first thing I read that gave me a deep understanding of computers on an abstract level. It completely demystified them and got me well on my way to programming.

Edit: The video doesn't go down to the transistor level, just goes over each component of a CPU. The book does go down to the transistor level, however, and again, I would highly recommend it.

u/bitcycle · 1 pointr/Database
  1. Use a relational data store first: MySQL/PostgreSQL/M$ SQL Server.
  2. Put CRUD operations behind a web service.
  3. Instead of arbitrary key/value pairs, try to get as specific as possible about the data (instead of storing everything as strings). Being more specific will help things (usually) to be more performant.
  4. Build the application on top of the service.
  5. Scale to the # of users you want to be able to support
  6. At this point, if you need to move part of the data into a non-relational data store, you can.

    I also recommend reading this book: Patterns of Enterprise Application Architecture
u/pitiless · 3 pointsr/PHP

The following books would be good suggestions irrespective of the language you're developing in:

Patterns of Enterprise Application Architecture was certainly an eye-opener on first read-through, and remains a much-thumbed reference.

Domain-Driven Design is of a similar vein & quality.

Refactoring - another fantastic Martin Fowler book.

u/chopsuwe · 9 pointsr/arduino

I thought Programming Arduino Getting Started with Sketches by Simon Monk was very good. It starts from the very basics of what a micro controller is and the concepts of how it works. Then steps you through the example sketches in the Arduino IDE explaining how and why they work. It's written in a way that's very easy to understand even for the absolute beginner.

Once you've gone through those you'll have a good understanding of what is and isn't possible and how to make your own projects around it. After that Google.

u/idontchooseanid · 1 pointr/linux

Do you want to know which parts make up an OS, or how it actually runs at runtime? The former is easy: just install Arch, Gentoo, or Linux From Scratch. The latter is a lot more complicated nowadays, but https://www.amazon.de/Operating-Systems-Implementation-Prentice-Software/dp/0131429388/ is a great start, or there's https://wiki.osdev.org/Tutorials if you want to go deep.

If you do both and combine the knowledge, your beard will grow 100x.

Source: I did both, but I am hairless, and 0*100 = 0 :/.

u/postmodern · 1 pointr/programming
  • ActiveRecord is a pattern according to Patterns of Enterprise Application Architecture. It's definitely the easiest pattern to implement, and thus the most popular amongst ORMs; hence the common misconception that ORM == ActiveRecord.
  • Your own description of the DataMapper pattern would imply that DataMapper does in fact differ from ActiveRecord. :) ActiveRecord has no concept of mapping, nor separation between the schema representation and the model. ActiveRecord simply instantiates models from rows.
  • Note: I am not quoting Martin Fowler as an Appeal to Authority, but simply because Martin Fowler wrote Patterns of Enterprise Application Architecture (PoEAA), in which the ActiveRecord and DataMapper patterns are explained. :) See also the Wikipedia entry for ActiveRecord, which lists Martin Fowler as the creator.
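A toy Python sketch of the distinction being drawn above (every name here is made up for illustration; this is not any real ORM's API). In the ActiveRecord style the model is built straight from a row and knows how to persist itself; in the DataMapper style a separate mapper owns the row-to-object translation:

```python
# Toy illustration only -- all class names here are invented, not a real ORM.

# ActiveRecord style: the model *is* a row; persistence lives on the object.
class ActiveRecordUser:
    def __init__(self, row):
        self.__dict__.update(row)        # instantiated straight from a row

    def save(self, db):
        db[self.id] = {"id": self.id, "name": self.name}

# DataMapper style: the domain object knows nothing about storage;
# a separate mapper translates between rows and objects.
class User:
    def __init__(self, id, name):
        self.id, self.name = id, name

class UserMapper:
    def to_row(self, user):
        return {"id": user.id, "name": user.name}

    def from_row(self, row):
        return User(row["id"], row["name"])

db = {}
ActiveRecordUser({"id": 1, "name": "ada"}).save(db)   # object persists itself

mapper = UserMapper()
db[2] = mapper.to_row(User(2, "lin"))                 # mapper does the I/O
restored = mapper.from_row(db[2])
print(restored.name)  # lin
```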
u/mrjaguar1 · 2 pointsr/learnprogramming

It's a little old, but Operating Systems Design and Implementation by Andrew Tanenbaum is a pretty good book, and it includes a lot of the MINIX source code in the book.

u/Yunath_ · 1 pointr/uwaterloo

LOL it seems interesting to me. I'm reading https://www.amazon.ca/Design-Implementation-FreeBSD-Operating-System/dp/0321968972/ref=dp_ob_title_bk right now.


Maybe it's good in theory, and not in practice.

u/solid7 · 3 pointsr/learnprogramming

In that pile-o-stuff there are really two main subjects: architecture and operating systems. I'd pick up recent copies of the dinosaur book and where's waldo. Silberschatz and Tanenbaum are seminal authors on both subjects.

There are numerous resources to learn C. Since I seem to be recommending books, Kernighan and Ritchie's book is pretty much the gold standard.

Good luck.

u/NoahFect · 5 pointsr/ECE

Oppenheim & Schafer is the usual standard text, as others have said. However, it's pretty theory-intensive and may not be that much of an improvement over your current book, if you are looking for alternative explanations.

I'd say you should look at Lyons' Understanding Digital Signal Processing instead of O&S. Also the Steven Smith guide that mostly_complaints mentioned is very accessible. Between Smith and Lyons you will get most of the knowledge that you need to actually do useful DSP work, if not pass a test in it.

u/killver · 2 pointsr/videos

If you are looking for a very easy to read introduction to how computers work, I can recommend the book "But How Do It Know?". Strange title, but the book is great. https://www.amazon.com/But-How-Know-Principles-Computers/dp/0615303765

u/nostrademons · 8 pointsr/programming

> Writing an interpreter is an order of magnitude easier for a beginner, especially if they write it in a high level language like Scheme, OCaml or Haskell.

Depends on your target language. If you're compiling to a reasonably high-level language (like C or ActionScript) and don't care much about optimization, they're basically the same thing. Instead of maintaining a stack of environments and evaluating to a value, you maintain a stack of symbol tables and evaluate to a program fragment in the target language.
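That symmetry can be shown with a toy expression tree (my own made-up example, in Python for brevity): the same tree walk either evaluates against an environment, or emits a fragment of the target language.

```python
# Toy AST: a node is a number, a variable name (str), or ("+", left, right).
def interpret(node, env):
    """Walk the tree with an environment, producing a value."""
    if isinstance(node, tuple):
        _op, left, right = node          # only '+' exists in this toy language
        return interpret(left, env) + interpret(right, env)
    return env[node] if isinstance(node, str) else node

def compile_to_source(node):
    """Walk the same tree, producing target-language source instead."""
    if isinstance(node, tuple):
        _op, left, right = node
        return f"({compile_to_source(left)} + {compile_to_source(right)})"
    return node if isinstance(node, str) else str(node)

tree = ("+", "x", ("+", 2, 3))
value = interpret(tree, {"x": 10})    # evaluate now
fragment = compile_to_source(tree)    # or emit code to evaluate later
print(value)     # 15
print(fragment)  # (x + (2 + 3))
```

Running the emitted fragment in the target language gives the same answer the interpreter produced directly.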

If you need to do instruction selection, calculate branch offsets, or do any sort of optimization, it gets more complicated. But you can still find nearly everything you need between The Dragon Book, Steven Muchnick's Advanced Compiler Design and Implementation, and Appel's Modern Compiler Implementation in ML.

> A good tutorial, if you know Haskell

There's also ArcLite, which implements a Lisp in JavaScript.

Also note that it's possible to blur the lines between compiler and interpreter significantly. For example, one of the exercises in SICP splits the evaluator into one pass that analyzes the expression into closures that do whatever needs to be done with the environment, and a second pass that actually evaluates those closures with the provided environment.

Another trick you can do is to depend on the data structures of the host language for some of your processing. For example, variable lookup in ArcLite is implemented in C. How? Each activation record is a JavaScript object, and then the __proto__ property is used as the static link to knit them together. So when the interpreter goes to look up a variable, it's just a hash lookup, and then the JavaScript runtime automatically consults that activation record's prototype if it fails, exactly as if it were looking up a normal JavaScript variable.
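A rough analogue of that delegation trick in Python (my own sketch, not ArcLite's actual code) is `collections.ChainMap`: each activation record is a plain dict, and lookup falls through the chain automatically, much like the JS runtime walking the `__proto__` link.

```python
from collections import ChainMap

# Each activation record is a plain dict; ChainMap searches them in order,
# the way the JS runtime walks the __proto__ chain in the ArcLite trick.
global_scope = {"x": 1, "y": 2}
outer_frame = ChainMap({"y": 20}, global_scope)   # outer call's record shadows y
inner_frame = outer_frame.new_child({"z": 300})   # nested call's record

print(inner_frame["z"])   # 300 -- found in the innermost record
print(inner_frame["y"])   # 20  -- shadowed by the outer frame
print(inner_frame["x"])   # 1   -- falls through to the global scope
```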

u/HarmlessSnack · 3 pointsr/TheCulture

Thing Explainer by Randall Munroe is my favorite coffee table book. (He’s the guy that makes XKCD comics.)

Giant detailed drawings of complex things explained using common language, and a candy coating of humor. Really fun book!

u/pdq · 6 pointsr/programming

The good news is that MIPS is possibly the most straightforward assembly language to code or read. Compared to x86, it's a dream.

If your class is computer architecture related, you will probably be learning from Hennessy and Patterson, which is an excellent book covering hardware and software.

If you want to learn MIPS from a software perspective, check out the classic See MIPS Run.

u/hwillis · 6 pointsr/electronics

Can't use free Eagle (too big) for this, but KiCad or probably other tools would work. With a few good books you can lay out a big board without advanced tools, although it can take longer. With cheap/free tools you'll usually have to use some finicky or kludgy methods to do really complex routing (blind/buried vias, free vias, heat transfer, trace length), but that usually isn't too big a deal. Here's a timelapse of a guy using Altium to route a high-speed, large (a bit smaller than OP's) data board for a high-speed camera. The description has rough steps with timestamps: 38 hours total to lay out.

u/blahdom · 1 pointr/learnpython

No problem. Good luck finding a class, compilers are a really fun subject!

If you are just generally interested (I don't know your experience level), the dragon book is still highly regarded, and it might be a good entryway into the theory of it all.

u/erasmus42 · 1 pointr/rfelectronics

Here's some leaders in the EMC field, to start:

Henry Ott - lots of good info on his website (and his book is a classic)

[Eric Bogatin](https://www.bethesignal.com/bogatin/)

Howard Johnson (not the hotel chain) - High-Speed Digital Design: A Handbook of Black Magic

If you understand how a VNA and TX lines work, you are most of the way there to understanding signal integrity.

u/poincareDuality · 10 pointsr/compsci

For designing programming languages, my favorites are

u/fbhc · 5 pointsr/AskComputerScience

My compilers course in college used the Dragon Book, which is one of the quintessential books on the subject.


But you might also consider Basics of Compiler Design which is a good and freely available resource.


I'd also suggest that you have familiarity with formal languages and automata, preferably through a Theory of Computation course (Sipser's Introduction to the Theory of Computation is a good resource). But these texts provide a brief primer.

u/DocAtDuq · 0 pointsr/todayilearned

I'd suggest an Arduino Uno to start out. I don't know what the kits include, but I'd suggest against them; you'll do better ordering your parts for projects separately so you only buy what you need. I'd suggest starting with an LED cube: it's easy to solder, and there are code sequences already written for patterns. You'll need a soldering iron. I'd suggest this book also.
http://www.amazon.com/gp/aw/d/0071784225/ref=pd_aw_sims_5?pi=SL500_SY115&simLd=1

u/kevlarcoated · 1 pointr/PrintedCircuitBoard

The books referenced by the most presenters at PCB design conferences are:
Right the First Time by Lee Ritchey http://www.thehighspeeddesignbook.com/
High-Speed Digital Design: A Handbook of Black Magic by Howard Johnson https://www.amazon.ca/High-Speed-Digital-Design-Handbook/dp/0133957241

Note that A Handbook of Black Magic reads like a textbook: it is very long and very boring.
PCB design is complicated and requires an in-depth understanding of the physics, because just knowing the rules isn't enough to convince other engineers that it's the right way to do something. More importantly, in my experience PCB design is always the least bad solution: you have to understand when you can break the rules and what the implications will be, and judge whether the trade-off is acceptable.

u/jalagl · 1 pointr/explainlikeimfive

I recommend you grab this book; I used it in university and it gives a pretty good explanation of how computers work.

That being said, you would need input from materials physicists, electronics engineers, chemists, and a bunch of other professionals to really understand how a computer works. It is a complex machine, and building one combines knowledge from many disciplines.

u/MrPopoGod · 18 pointsr/BABYMETAL

Yeah, Su's not just reading off a script. Her English has come really far; she's at the point of having enough vocabulary to feel like she can express what she wants to express once she picks the right words out of her dictionary. So she still has to do a translation of concepts into a smaller set of words (sort of like the book Thing Explainer) but she's got the confidence to do so.

u/teresko · 12 pointsr/PHP

Actually I would suggest you start learning OOP and maybe investigate the MVC design pattern, since those are both subjects in which the average CodeIgniter user will be quite inexperienced. While you might keep on "learning" frameworks, it is much more important to actually learn programming.

Here are a few lectures that might help you with it:

u/BlackDeath3 · 3 pointsr/csMajors

Have you seen the MIPS WikiBooks site?

It's not complete, but after a cursory inspection it looks as though it may be useful for beginners.

What textbook is your class using? I believe that the textbook my course assigned (though I may or may not be able to recommend it as I may or may not have ever read it...) was this one.

u/CaffinatedSquirrel · 2 pointsr/learnpython

Clean Code: Clean Code

Clean Architecture: Clean Arch


Just picked these two up myself.. not sure if it's what you are looking for, but they seem to be very valuable for software design as a whole.

u/If_you_just_lookatit · 7 pointsr/embedded

I started early on with Arduino and moved into lower-level embedded work with the STM32 Discovery line of development boards. Here's a link to a good starting board that has tons of example code from ST:

https://www.mouser.com/ProductDetail/STMicroelectronics/STM32F407G-DISC1?qs=mKNKSX85ZJejxc9JOGT45A%3D%3D

If you want a decent intro book into embedded topics, this book does a decent job of introducing the different parts of working on an embedded project:

https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149


Between the Arduino and Pi, the Arduino is more representative of an embedded device. It introduces you to resource-constrained systems, which is a good starting point for working with digital and analog I/Os, and lets you hook up to communication-bus peripherals that use UART, I2C, and SPI. The biggest problem is that it will not immediately introduce you to debugging and standard compilation tools. However, Arduino has been a starting point for many developers. Good luck!

u/Ispamm · 21 pointsr/androiddev

Don't give up just yet; keep looking.
Do you have a portfolio? If not, try to work on a project of your own so you can have something to show.
And if you are considering improving your Java skills, try working with libraries like:

u/PedroMutter · 2 pointsr/typescript

Thanks for your advice. Is the O'Reilly book you mentioned this one? (Building-Microservices-Sam-Newman). And could you please send me some material that you like? (blog posts included).

u/boredcircuits · 5 pointsr/learnprogramming

Start with the Dragon Book.

When it actually comes time to implement the language, I would recommend just writing the frontend and reusing the backend from another compiler. LLVM is a good option (it's becoming popular to use as a backend, it now has frontends for C, C++, Objective C, Java, D, Pure, Hydra, Scheme, Rust, etc). See here for a case study on how to write a compiler using LLVM as the backend.

u/jayknow05 · 2 pointsr/AskElectronics

This is a good book on the subject. I would personally work with a 4-layer board with a GND and VCC layer. It sounds like you already have a bunch of layers as it is so yes I would recommend a VCC layer.

u/Dhekke · 8 pointsr/programming

This book Structured Computer Organization is also very good at explaining in detail how the computer works, it's the one I used in college... Pretty expensive, I know, but at least the cover has nice drawings!

u/khafra · 1 pointr/DebateReligion

Much of your thinking seems to be based on a confusion of levels. If you knew more specifically how the firing together of neurons strengthens the probability that they'll fire together in the future, or if you'd examined a program simulating physics, you wouldn't be using confusion as a building block for arguments.

For instance, you would not be as confused right here if you were a systems developer instead of a philosopher; one read-through of the Dragon Book would clear everything right up. I'll try to summarize, but please understand this is not rigorous:

Your mind is running the algorithm "Step 1: Move to front of house. Step 2: Move to back of house. Step 3: Go to Step 1." Your mind is something your brain does. Your brain is implemented on physics. Exactly like the boulder.

The most legitimate question related to this post is that of substrate. Note: I do not agree with everything in this essay, but it presents the problem better than writings on "dust theory" (unless you're willing to read the whole Greg Egan novel Permutation City).

u/slowfly1st · 2 pointsr/learnprogramming

Your foes are kids in their twenties with a degree that takes years to achieve, so this will be tough! But I think your age and your willingness to learn will help you a lot.


Other things to learn:

  • JDK - you should be at least aware of what APIs the JDK provides, or better, have used them (https://docs.oracle.com/javase/8/docs/). I think (personal preference / experience) these are the minimum: JDBC, Serialization, Security, Date and Time, I/O, Networking, (Internationalization - I'm from a country with more than one official language), Math, Collections, Concurrency.
  • DBMS: How to create databases and how to access them via JDBC. (I like postgreSQL). Learn SQL.
  • Learn how to use an ORM Mapper. (I like jOOQ, I dislike JPA/hibernate)
  • Requirements Engineering. I think without someone who has the requirements you can't really practice it, but the theory should be present. It's an essential part of software development: get the customer's requirements and bring them to paper. Bad RE can lead to tears.
  • Writing Unit Tests / TDD. Having working code means the work is 50% done - book recommendation: Growing Object-Oriented Software, Guided by Tests
  • CI/CD (Continuous Integration / Delivery) - book recommendation: Continuous Delivery.
  • Read Clean Code (mandatory!)
  • Read Design Patterns (also mandatory!)
  • (Read Patterns of Enterprise Application Architecture (bit outdated, I think it's probably a thing you should read later, but I still love it!))
  • Get familiar with a build tool, such as maven or gradle.

If there's one framework to look at, it would be Spring: spring.io provides dozens of frameworks for web services, backends, websites, and so on, but mainly their core technology for dependency injection.

(edit: other important things)
u/cbrpnk · 2 pointsr/AskProgramming

The fact that you mentioned it'd be cool to work on a DAW tells me that you want to go low level. What you want to study is digital signal processing, or DSP. I recommend Understanding Digital Signal Processing. Also watch this talk by Timur Doumler, or anything by him. I recommend that you pick a programming language and try to output a sine wave to the speakers, then go on from there.
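That first step might look something like this in Python (my own stdlib-only sketch; it writes a one-second 440 Hz sine to a WAV file you can then play with any audio player, rather than streaming to the speakers directly):

```python
import math
import struct
import wave

RATE = 44100        # samples per second
FREQ = 440.0        # concert A
AMPLITUDE = 0.5     # half of full scale, to leave headroom

# One second of a 440 Hz sine, as 16-bit signed samples.
samples = [
    int(32767 * AMPLITUDE * math.sin(2 * math.pi * FREQ * n / RATE))
    for n in range(RATE)
]

with wave.open("sine.wav", "wb") as f:
    f.setnchannels(1)      # mono
    f.setsampwidth(2)      # 16-bit samples
    f.setframerate(RATE)
    f.writeframes(struct.pack(f"<{len(samples)}h", *samples))
```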

Also check those out:

https://theaudioprogrammer.com/

https://jackschaedler.github.io/circles-sines-signals/

https://blog.demofox.org/#Audio


Good luck.

u/kwaddle · 1 pointr/DSP

I think The Scientist and Engineer's Guide to Digital Signal Processing and Understanding Digital Signal Processing are generally considered the most accessible introductions. I've gotten more mileage out of Understanding DSP; I feel like it goes into a little more detail and really works to walk you through concepts, step by step.

http://www.dspguide.com/pdfbook.htm

https://www.amazon.com/Understanding-Digital-Signal-Processing-3rd/dp/0137027419


Aside from searching out good learning resources, IMO nothing is more helpful for learning than setting up your environment with Matlab, Jupyter notebooks, or whatever you're going to use, and getting comfortable with the tools you'll be using to explore these topics.

u/anyones_ghost27 · 1 pointr/funny

Yeah, he ate a lot of the front cover and destroyed the first 20-30 pages of my hardback HP and the Deathly Hallows. But he removed the dust jacket first without damaging it, so at least I can put that on and cover the damage.

He also destroyed the Thing Explainer by Randall Munroe, which I highly recommend as a gift for anyone, including kids, who likes cool drawings and nerdy things. Or maybe for dogs who eat hardback books. My dog found it extra tasty and super chewy.

u/GPSMcAwesomeville · 2 pointsr/compsci

Hey, last year I followed a course in Operating Systems where we used MINIX as an example OS; it's one of the most understandable OSes out there, and great for learning.

This is a good (and quite pricey, unfortunately) book for MINIX and Operating Systems in general.

u/oldsecondhand · 1 pointr/technology

I'd check out these two books from the local library and read the first 2-3 chapters. It might contain more than what you need, but these are pretty well written books and don't assume a lot of previous knowledge.

http://www.amazon.com/Structured-Computer-Organization-5th-Edition/dp/0131485210

http://www.amazon.com/Computer-Networks-5th-Andrew-Tanenbaum/dp/0132126958/ref=la_B000AQ1UBW_sp-atf_title_1_1?s=books&ie=UTF8&qid=1376126566&sr=1-1

Or you could just check out your network settings and search for the terms that you encounter (IP address, DNS, DHCP, gateway, proxy, router, firewall)

u/jodraws · 1 pointr/hearthstone

Make use of that intelligence and get her an arduino uno.

She'll be able to make anything from simple robots to a light up dress that changes colors. A simple guide to get started will help as well. Guide.

u/godlikesme · 1 pointr/learnprogramming

I highly recommend Structured Computer Organization by Andrew Tanenbaum. It is a great book for beginners

u/EvasiveBeaver · 1 pointr/learnprogramming

OOP is a very general concept, and it doesn't go much further than the SOLID principles. As for how this actually gets done, that's somewhat a matter of opinion and open to interpretation. I'm a fan of this book....

Clean Architecture A Craftsman's Guide to Software Structure and Design

It has very strong opinions and because of that it gives a consistent message and direction. It should by no means be the only opinion or book you take in to account (learn from as many people as you can). But it's a very good start.

u/m85476585 · 2 pointsr/AskEngineers

I literally have a book called "A Handbook of Black Magic". It's a little old, but it's still one of the best books on the subject.

u/ElectricRebel · 5 pointsr/compsci

For compilers:

u/somekindofdevil · 1 pointr/AskElectronics

Almost every PCB/EDA package does length matching automatically, so you don't need to worry about that. If you wanna know how the software does it, it's more of a mathematical problem. I think they use parametric curves like Béziers. You can calculate the length of a Bézier curve easily, so you can match them.

https://en.wikipedia.org/wiki/B%C3%A9zier_curve
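For illustration (my own sketch, not how any particular EDA tool does it): a cubic Bézier's arc length has no closed form, but summing many short chords converges quickly, which is all a length-matching pass needs.

```python
import math

def cubic_bezier(p0, p1, p2, p3, t):
    """Point on a cubic Bézier at parameter t (standard Bernstein form)."""
    u = 1 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

def bezier_length(p0, p1, p2, p3, steps=1000):
    """Approximate arc length by summing many short chords."""
    total = 0.0
    prev = cubic_bezier(p0, p1, p2, p3, 0.0)
    for i in range(1, steps + 1):
        cur = cubic_bezier(p0, p1, p2, p3, i / steps)
        total += math.dist(prev, cur)
        prev = cur
    return total

# Sanity check: control points on a straight 10-unit segment give length 10.
length = bezier_length((0, 0), (3, 0), (7, 0), (10, 0))
print(round(length, 6))  # 10.0
```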

If you wanna know more about high speed pcb design, I recommend this book.
https://www.amazon.com/High-Speed-Digital-Design-Handbook/dp/0133957241

u/realjoeydood · 5 pointsr/csharp

Clean Architecture, for sure.

Clean Architecture: A Craftsman's Guide to Software Structure and Design (Robert C. Martin Series) https://www.amazon.com/dp/0134494164/ref=cm_sw_r_cp_apa_i_yJ83Db51V9ZZS

u/Skipper_Jos · 3 pointsr/engineering

I will also recommend High Speed Digital Design: A Handbook of Black Magic; it definitely has some good stuff!
https://www.amazon.com/dp/0133957241/ref=cm_sw_r_cp_tai_O05TBb9HPRG90#

u/turtlepot · 2 pointsr/AskComputerScience

Highly endorsed, first book I read out of school:

Code Complete - Steve McConnell


Bonus, engineers at my office were just given this book as recommended reading:

Clean Architecture - Robert C. Martin

u/RadioactiveAardvark · 2 pointsr/embedded

There aren't any that I'd recommend, unfortunately.

This book is not specifically about embedded C, but about embedded in general:

https://www.amazon.com/Making-Embedded-Systems-Patterns-Software/dp/1449302149

Anything by Jack Ganssle is good as well.