Reddit mentions: The best machine theory books

We found 787 Reddit comments discussing the best machine theory books. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 61 products and ranked them based on the number of positive reactions they received. Here are the top 20.

1. Code: The Hidden Language of Computer Hardware and Software

Features:
• Microsoft Press

Specs:
Height: 8.9 inches
Length: 6 inches
Number of items: 1
Weight: 1.04 pounds
Width: 1 inch

2. Grokking Algorithms: An illustrated guide for programmers and other curious people

Features:
• Manning Publications

Specs:
Height: 9.25 inches
Length: 7.38 inches
Number of items: 1
Release date: May 2016
Weight: 0.88 pounds
Width: 0.4 inches

3. Introduction to the Theory of Computation

Features:
• Used Book in Good Condition

Specs:
Height: 9.5 inches
Length: 6.5 inches
Number of items: 1
Weight: 1.65 pounds
Width: 1 inch

5. Introduction to the Theory of Computation

Specs:
Height: 9.21 inches
Length: 5.71 inches
Number of items: 1
Weight: 1.65 pounds
Width: 0.91 inches

6. Feynman Lectures On Computation (Frontiers in Physics)

Features:
• Used Book in Good Condition

Specs:
Height: 9.42 inches
Length: 6.12 inches
Number of items: 1
Weight: 1.08 pounds
Width: 0.81 inches

7. Write Great Code: Volume 1: Understanding the Machine

Features:
• Used Book in Good Condition

Specs:
Height: 9.25 inches
Length: 7 inches
Number of items: 1
Weight: 1.95 pounds
Width: 1.12 inches

8. Introduction to the Theory of Computation

Features:
• Used Book in Good Condition

Specs:
Height: 9.75 inches
Length: 6.75 inches
Number of items: 1
Weight: 1.6 pounds
Width: 1 inch

9. Introduction to Automata Theory, Languages, and Computation (3rd Edition)

Specs:
Height: 9.55 inches
Length: 6.7 inches
Number of items: 1
Weight: 2.14 pounds
Width: 1.4 inches

10. Machine Learning for Hackers

Features:
• O'Reilly Media

Specs:
Height: 9.19 inches
Length: 7 inches
Number of items: 1
Release date: February 2012
Weight: 1.14 pounds
Width: 0.75 inches

12. Type Theory and Formal Proof: An Introduction

Features:
• Cambridge University Press

Specs:
Height: 9.61 inches
Length: 6.69 inches
Number of items: 1
Weight: 2.16 pounds
Width: 1 inch

13. Discrete Mathematics, 2nd Edition

Features:
• Oxford University Press USA

Specs:
Height: 0.97 inches
Length: 9.32 inches
Number of items: 1
Weight: 1.82 pounds
Width: 6.54 inches

16. Write Great Code, Volume 2: Thinking Low-Level, Writing High-Level

Features:
• Used Book in Good Condition

Specs:
Height: 9.25 inches
Length: 7 inches
Number of items: 1
Weight: 2.3 pounds
Width: 1.5 inches

17. The New Hacker's Dictionary - 3rd Edition

Features:
• Used Book in Good Condition

Specs:
Color: Yellow
Height: 9 inches
Length: 6 inches
Number of items: 1
Release date: October 1996
Weight: 1.7 pounds
Width: 1.43 inches

18. Introduction to Machine Learning (Adaptive Computation and Machine Learning series)

Specs:
Height: 9 inches
Length: 8 inches
Number of items: 1
Weight: 2.65 pounds
Width: 0.94 inches

19. Randomized Algorithms

Features:
• Used Book in Good Condition

Specs:
Height: 10 inches
Length: 7 inches
Number of items: 1
Weight: 2.4 pounds
Width: 1.06 inches

🎓 Reddit experts on machine theory books

The comments and opinions expressed on this page are written exclusively by redditors. To provide you with the most relevant data, we sourced opinions from the most knowledgeable Reddit users, based on the total number of upvotes and downvotes received across comments on subreddits where machine theory books are discussed. For your reference and for the sake of transparency, here are the specialists whose opinions mattered the most in our ranking.
• Total score: 143 · Number of comments: 21 · Relevant subreddits: 3
• Total score: 53 · Number of comments: 8 · Relevant subreddits: 4
• Total score: 46 · Number of comments: 7 · Relevant subreddits: 2
• Total score: 46 · Number of comments: 4 · Relevant subreddits: 1
• Total score: 20 · Number of comments: 5 · Relevant subreddits: 3
• Total score: 18 · Number of comments: 8 · Relevant subreddits: 2
• Total score: 16 · Number of comments: 4 · Relevant subreddits: 3
• Total score: 16 · Number of comments: 4 · Relevant subreddits: 2
• Total score: 11 · Number of comments: 4 · Relevant subreddits: 3
• Total score: 10 · Number of comments: 6 · Relevant subreddits: 3


Top Reddit comments about Machine Theory:

u/ffualo · 3 pointsr/askscience

Hi RandomNumber37,

So here's a little bit about me first; I don't want to misrepresent myself. My background is in economics and political science, where I was interested in statistical models that predict rare international events like war and state failure. It's here I became obsessed with statistics, machine learning, etc. Also, I've been programming in many languages since I was a kid, so after my undergraduate work in the social sciences and statistics, I took a job with a bioinformatics group doing coding. I thought this would be a temporary job until graduate school in economics or quantitative political science.

However, working with large-scale biological and sequencing data was way more awesome than I expected. This caused me to shift focus. I also did a fair amount of work on computational statistics, i.e. ways of trying to make R better, understanding compiler technologies, etc. Afterwards, I became more purely interested in statistics and computational biology, and I thought I would go to graduate school for pure statistics so I could also devote some time to computational statistics. However, now I work in a plant breeding lab (which I absolutely love). I will do this for about another 2-3 years before I transition into a graduate program. This would mean I've worked in the field about 6 years before applying to graduate programs.

So, with that out of the way here are answers to your questions and some advice I offer:

  1. How much of your time is spent working with the plants themselves vs with computer-organized data?

    Being that my background isn't in biology, I don't currently work with plants much. However, this is why I moved towards plant biology. Before getting obsessed about social science methods, I loved plants. I worked at an orchid greenhouse, and actually went to UC Davis thinking I'd study plant biology (until an awesome political science professor got me excited about science applied to political data). However, the scientists I work with are often not doing too much work with plants: many grow the plants, do the wet lab work, then spend more than half the time (sometimes up to 90%) analyzing the huge amount of data. I spend my full day in front of a computer, except when a colleague wants me to check out something cool in the lab, etc.

  2. With what kind of operations does your computer aid you?

    Everything. We get raw sequencing data, I have to analyze it from start to finish. Or, from raw sequencing files until the point where the numbers behind it tell a story. I also spend a huge amount of my time writing programs that do certain things for biologists in our group. Everything — protein prediction, data quality analysis, statistical modeling, etc.

  3. Do you see a full cycle... from plant, to data, to application of knowledge to your specimens (and back to data)?

    Yes, at this current position I am starting to (which is why I sought work in plant biology). It depends on what plant you work with (Arabidopsis = short life cycle, you can do lots of stuff, vs citrus tree = long life cycle, you can't do lots of stuff). But some of the more awesome longer term projects will take 4 years to fully materialize.

    So now, what steps were most important? I will tell you the four things that have helped me the most. As a point of how much they've helped me, I'll just mention that despite not having a PhD (yet), or much of a background in biology other than what I've taught myself or learned on the job (which is actually quite a lot after 4 years in the field), I've had (and continue to receive) really nice job offers.

  1. Learn programming really, really, really well. If you want to be a step above the rest, learn Python and R. Perl is huge in bioinformatics, but it's a disgusting ugly language that's dying out in my opinion. It sucks for reproducibility; no one can read anyone else's code. It was great when everyone was racing to get the human genome sequenced and had to write quick scripts constantly. Now, we have larger software platforms for that stuff, and what will count most in the future is the distribution of your scientific code. Reproducibility problems will soon be primarily dry lab, not wet lab. If you doubt that, read the "Forensic Bioinformatics" paper (http://projecteuclid.org/euclid.aoas/1267453942), which was a game changer for me. I've always been passionate about open science and reproducibility, but that made me realize that we'll have a huge problem in a few years if we're not careful.

    Anyways, I'd recommend learning:

  • Python (with BioPython). Also, with Django if you're building web apps to interface with scientific databases.
  • R (with Bioconductor).
  • Unix command line (sed/awk, bash)
  • Know your editor. I use emacs. Even if it takes you 80 hours to learn emacs or your editor well, you will regain that time over a year of work. I promise. People watch me use emacs and they say it makes them dizzy because they can't keep up. That's dozens of hours saved each week.

    Now, optionally (but highly, highly recommended):

  • C. Absolutely necessary to debug compiling programs or writing high-usage programs that need to be fast.
  • SQL. You'll be storing biological data in databases. SQL is important. Use SQLite a lot. People like huge PostgreSQL or MySQL databases for even small things, but this is a waste of time IMO if you're just going to be the one accessing it. Bioconductor leverages a huge amount of SQLite because it's so easy and awesome; see the quick sketch below.
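
To show how lightweight that is, here's a minimal sketch using Python's standard library (the file name, table name, and rows are made up for illustration):

```python
import sqlite3

# The whole database is one file on disk; no server to set up.
conn = sqlite3.connect("genes.db")  # hypothetical filename
cur = conn.cursor()

cur.execute("CREATE TABLE IF NOT EXISTS genes (id TEXT PRIMARY KEY, chrom TEXT, length INTEGER)")
cur.executemany(
    "INSERT OR REPLACE INTO genes VALUES (?, ?, ?)",
    [("AT1G01010", "1", 2269), ("AT1G01020", "1", 1871)],  # toy example rows
)
conn.commit()

# Query it like any other SQL database.
for row in cur.execute("SELECT id, length FROM genes WHERE chrom = ?", ("1",)):
    print(row)
conn.close()
```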

    Now, even more optionally:

  • Lisp. Lisp will change the way you think about programming. It's also used with AraCyc, MetaCyc, and PlantCyc data. I've used it extensively in these applications. The ratio of how Lisp has changed my thinking to how much I use it in production code is HUGE. Learn functional programming concepts; then concepts like map/reduce will fall easily into place. Know object orientation too.

  • Javascript. I love JS. It's doing amazing things too. And part of being a very effective bioinformatician/statistician is being able to easily convey your data. There is no easier and more interactive medium than a browser. Check out d3.js. Even old scientists can click a link and interact with data via Javascript. In contrast, they wouldn't want to install some old dusty Java application. Of course, with this comes HTML, XML, JSON, etc, etc. so learn those too.

  2. Learn statistics REALLY WELL. Honestly, try to pick up a statistics minor (over a CS minor IMO). Lots of brilliant programmers buy the Cormen algorithm book and are set for data structures and algorithms. But understanding statistics at a deeper level — that takes intimate study via courses. I would recommend taking courses on probability theory and mathematical statistics. I took two courses as part of our mathematical statistics series and I cannot even begin to emphasize how helpful they were. I heard a quote once: at Google they use Bayes' theorem like other programmers use "if" statements. Same thing in bioinformatics. Look at the best SNP callers, software, etc., and they're using population genetics models and Bayesian approaches. Know math stats early, and it will permeate your thinking in the best ways.

    Another quick story: I had a statistics graduate student come tell me he was working for a rather well known genomics professor on campus. He asked me how to analyze RNA-seq data. He said he wanted to use ANOVA. Even though he was a statistics graduate student, he went immediately to normality-assuming models, which definitely did not fit this data. So know your Poisson, negative binomial, gamma, etc. distributions. A probability course should introduce them all to you. It will also mean that when you start learning more theoretical population genetics, you'll be set.
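
To see the pitfall in code, here's a small simulated sketch (toy numbers, not a real RNA-seq analysis) showing how overdispersed count data violates the assumptions behind ANOVA-style normal models:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated read counts for one gene: overdispersed, as RNA-seq counts tend to be.
counts = rng.negative_binomial(n=2, p=0.05, size=1000)

print("mean:", counts.mean(), "variance:", counts.var())
# For Poisson data, mean == variance; here variance >> mean (overdispersion),
# and a normal fit is symmetric while the counts are skewed and non-negative.
print("skewness:", stats.skew(counts))
print("normality test p-value:", stats.normaltest(counts).pvalue)  # tiny => reject normality
```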

    Also, buy a book on machine learning. The Elements of Statistical Learning II is good, and a free PDF is available; it's dense, but don't let that discourage you. I also like this book, though again, it's dense stuff.

  3. Learn data structures and algorithms well. I think a single course, or doing this on your own, is sufficient. However, if you want to do what Heng Li does (author of BWA, samtools, and the fermi assembler), you need much, much more. Compression-based data structures are huge in bioinformatics now. I love this stuff, but it's too removed from the biology to be very interesting to me. But if that's the direction you want to move in, hang around the CS department more.

  4. Learn to code well. This is vastly underemphasized in the sciences. Learn about test-driven development. Get in the habit of writing unit tests early, and writing good documentation. Learn Git too; this is a must.

u/shred45 · 6 pointsr/gatech

So, when I was younger, I did attend one computer science related camp,

https://www.idtech.com

They have a location at Emory (which I believe I did one year) that was OK (not nearly as "nerdy"), and one in Boston which I really enjoyed (perhaps because I had to sleep on site). That being said, the stuff I learned there was more in the areas of graphic design and/or system administration, and not computer science. They are also quite expensive for only 1-2 weeks of exposure.

I felt it was a good opportunity to meet some very smart kids though, and it definitely lead me to push myself. Knowing and talking to people that are purely interested in CS, and are your age, is quite rare in high school. I think that kind of perspective can make your interests and hobbies seem more normal and set a much higher bar for what you expect for yourself.

On the other side of things, I believe that one of the biggest skills in any college program is an openness to just figure something out yourself if it interests you, without someone sitting there with you. This can be very helpful in life in general, and I think was one of the biggest skills I was missing in high school. I remember tackling some tricky stuff when I was younger, but I definitely passed over stuff I was interested in just because I figured "that's for someone with a college degree". The fact is that experience will make certain tasks easier, but you CAN learn anything you want. You just may have to learn more of the fundamentals behind it than someone with more experience.

With that in mind, I would personally suggest a couple of things which I think would be really useful to someone his age, give him a massive leg up over the average freshman when he does get to college, and be a lot more productive than a summer camp.

One would be to pick a code-challenge site (I like http://www.codewars.com) and simply try to work through the challenges. Another, much more math heavy, option is https://projecteuler.net. This, IMO, is one of the best ways to learn a language, and I will often go there to get familiar with the syntax of a new language. I think he should pick Python and Clojure (or Haskell) and do challenges in both. Python is object-oriented, whilst Clojure (or Haskell) is functional. These are two very fundamental and interesting "schools of thought" and if he can wrap his head around both at this age, that would be very valuable.
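
To make the contrast concrete, here is Project Euler's first problem (sum the multiples of 3 or 5 below 1000) solved both ways; a small Python sketch, with the functional version using the filter/reduce vocabulary that carries over to Clojure or Haskell:

```python
from functools import reduce

# Imperative style: mutate an accumulator step by step.
total = 0
for n in range(1000):
    if n % 3 == 0 or n % 5 == 0:
        total += n
print(total)  # 233168

# Functional style: describe the result as a pipeline of pure transformations.
total_fp = reduce(lambda acc, n: acc + n,
                  filter(lambda n: n % 3 == 0 or n % 5 == 0, range(1000)),
                  0)
print(total_fp)  # 233168, same answer
```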

A second option, and how I really got into programming, is to do some sort of web application development. This is pretty light on the CS side of things, but it allows you to be creative and manage more complex projects. He could pick a web framework in Python (flask), Ruby (rails), or NodeJS. There are numerous tutorials on getting started with this stuff. For Flask: http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world. For Rails: https://www.railstutorial.org. This type of project could take a while, there are a lot of technologies which interact to make a web application, but the ability to be creative when designing the web pages can be a lot of fun.

A third, more systems level, option (which is probably a bit more opinionated on my part) is that he learn to use Linux. I would suggest that he install VirtualBox on his computer, https://www.virtualbox.org/wiki/Downloads. He can then install Linux in a virtual machine without messing up the existing OS (also works with Mac). He COULD install Ubuntu, but this is extremely easy and doesn't really teach much about the inner workings. I think he could install Arch. https://wiki.archlinux.org. This is a much more involved distribution to install, but their documentation is notoriously good, and it exposes you to a lot of command line (Ubuntu attempts to be almost exclusively graphical). From here, he should just try to use it as much as possible for his daily computing. He can learn general system management and Bash scripting. There should be tutorials for how to do just about anything he may want. Some more advanced stuff would be to configure a desktop environment, he could install Gnome by default, it is pretty easy, but a lot of people really get into this with more configurable ones ( https://www.reddit.com/r/unixporn ). He could also learn to code and compile in C.

Fourth, if he likes C, he may like seeing some of the ways in which programs which are poorly written can be broken. A really fun "game" is https://io.smashthestack.org. He can log into a server and basically "hack" his way to different levels. This can also really expose you to how Linux maintains security (user permissions, etc. ). I think this would be much more involved approach, but if he is really curious about this stuff, I think this could be the way to go. In this similar vein, he could watch talks from Defcon and Chaos Computer Club. They both have a lot of interesting stuff on youtube (it can get a little racy though).

Finally, there are textbooks. These can be really long, and kinda boring. But I think they are much more approachable than one might think. These will expose you much more to the "Science" part of computer science. A large portion of the classes he will take in college look into this sort of stuff. Additionally, if he covers some of this stuff, he could look into messing around with AI (neural networks, etc.) and machine learning (I would check out scikit-learn for Python). Here I will list different broad topics, and some of the really good books in each. (Almost all can be found for free...)

General CS:

• Algorithms and Data Structures: https://mitpress.mit.edu/books/introduction-algorithms
• Theory of Computation: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X
• Operating Systems: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/0470128720

Some Math:

• Linear Algebra: http://math.mit.edu/~gs/linearalgebra/
• Probability and Stats: http://ocw.mit.edu/courses/mathematics/18-05-introduction-to-probability-and-statistics-spring-2014/readings/

I hope that stuff helps, I know you were asking about camps, and I think the one I suggested would be good, but this is stuff that he can do year round. Also, he should keep his GPA up and destroy the ACT.

u/shhh-quiet · 2 pointsr/learnprogramming

Your mileage with certifications may vary depending on your geographical area and type of IT work you want to get into. No idea about Phoenix specifically.

For programming work, generally certifications aren't looked at highly, and so you should think about how much actual programming you want to do vs. something else, before investing in training that employers may not give a shit about at all.

The more your goals align with programming, the more you'll want to acquire practical skills and be able to demonstrate them.

I'd suggest reading the FAQ first, and then doing some digging to figure out what's out there that interests you. Then, consider trying to get in touch with professionals in the specific domain you're interested in, and/or ask more specific questions on here or elsewhere that pertain to what you're interested in. Then figure out a plan of attack and get to it.

A lot of programming work boils down to:

  • Using appropriate data structures, and algorithms (often hidden behind standard libraries/frameworks as black boxes), that help you solve whatever problems you run into, or tasks you need to complete. Knowing when to use a Map vs. a List/Array, for example, is fundamental.
  • Integrating 3rd party APIs. (e.g. a company might use Stripe APIs for abstracting away payment processing... or Salesforce for interacting with business CRM... countless 3rd party APIs out there).
  • Working with some development framework. (e.g. a web app might use React for an easier time producing rich HTML/JS-driven sites... or a cross-platform mobile app developer might use React-Native, or Xamarin to leverage C# skills, etc.).
  • Working with some sort of platform SDKs/APIs. (e.g. native iOS apps must use 1st party frameworks like UIKit, and Foundation, etc.)
  • Turning high-level descriptions of business goals ("requirements") into code. Basic logic, as well as systems design and OOD (and a sprinkle of FP for perspective on how to write code with reliable data flows and cohesion), is essential.
  • Testing and debugging. It's a good idea to write code with testing in mind, even if you don't go whole hog on something like TDD - the idea being that you want it to be easy to ask your code questions in a nimble, precise way. Professional devs often set up test suites that examine inputs and expected outputs for particular pieces of code. As you gain confidence learning a language, take a look at simple assertion statements, and eventually try dabbling with a tdd/bdd testing library (e.g. Jest for JS, or JUnit for Java, ...). With debugging, you want to know how to do it, but you also want to minimize having to do it whenever possible. As you get further into projects and get into situations where you have acquired "technical debt" and have had to sacrifice clarity and simplicity for complexity and possibly bugs, then debugging skills can be useful.

    As a basic primer, you might want to look at Code for a big-picture view of what's going on with computers.

    For basic logic skills, the first two chapters of How to Prove It are great. Being able to think about conditional expressions symbolically (and not get confused by your own code) is a useful skill. Sometimes business requirements change and require you to modify conditional statements. With an understanding of Boolean Algebra, you will make fewer mistakes and get past this common hurdle sooner. Lots of beginners struggle with logic early on while also learning a language, framework, and whatever else. Luckily, Boolean Algebra is a tiny topic. Those first two chapters pretty much cover the core concepts of logic that I saw over and over again in various courses in college (programming courses, algorithms, digital circuits, etc.)
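
For a tiny concrete taste (a made-up can_ship example, not from any real codebase), De Morgan's laws let you invert a compound condition without guessing, because not (A and B) is exactly (not A) or (not B):

```python
# Suppose a requirement changes and you must invert this check.
def can_ship(order_paid: bool, address_valid: bool) -> bool:
    return order_paid and address_valid

# De Morgan: not (A and B) == (not A) or (not B)
def must_hold(order_paid: bool, address_valid: bool) -> bool:
    return (not order_paid) or (not address_valid)

# The two formulations agree on every input: a quick exhaustive check.
for paid in (True, False):
    for valid in (True, False):
        assert must_hold(paid, valid) == (not can_ship(paid, valid))
```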

    Once you figure out a domain/industry you're interested in, I highly recommend focusing on one general purpose programming language that is popular in that domain. Learn about data structures and learn how to use the language to solve problems using data structures. Try not to spread yourself too thin with learning languages. It's more important to focus on learning how to get the computer to do your bidding via one set of tools - later on, once you have that context, you can experiment with other things. It's not a bad idea to learn multiple languages, since in some cases they push drastically different philosophies and practices, but give it time and stay focused early on.

    As you gain confidence there, identify a simple project you can take on that uses that general purpose language, and perhaps a development framework that is popular in your target industry. Read up on best practices, and stick to a small set of features that helps you complete your mini project.

    When learning, try to avoid haplessly jumping from tutorial to tutorial if it means that it's an opportunity to better understand something you really should understand from the ground up. Don't try to understand everything under the sun from the ground up, but don't shy away from 1st party sources of information when you need them. E.g. for iOS development, Apple has a lot of development guides that aren't too terrible. Sometimes these guides will clue you into patterns, best practices, pitfalls.

    Imperfect solutions are fine while learning via small projects. Focus on completing tiny projects that are just barely outside your skill level. It can be hard to gauge this yourself, but if you ever went to college then you probably have an idea of what this means.

    The feedback cycle in software development is long, so you want to be unafraid to make mistakes, and prioritize finishing stuff so that you can reflect on what to improve.

u/PunsForHire · 5 pointsr/math

It sounds like you might perhaps want a background in Number Theory and/or Basic Logic and/or Set Theory. The thing about math is that there is a lot...

My advice for a text that might serve you well is N.L. Biggs' Discrete Mathematics (http://www.amazon.com/Discrete-Mathematics-Norman-L-Biggs/dp/0198507178). If you are at all interested in computer science, this is also a great book for that because it introduces some of the mathematical rigor behind it. Some people have a smidgen of difficulty with this text because it doesn't give some names to proofs/algorithms that maybe you've heard whispered (e.g. Dijkstra's shortest path and Prim's minimal spanning tree). A text that I tend to think is on par with Biggs', but many think is vastly superior (I love both, but for different reasons) that covers some (most) of the same topics is Eccles' An Introduction to Mathematical Reasoning (http://www.amazon.com/Introduction-Mathematical-Reasoning-Peter-Eccles/dp/0521597188/ref=pd_sim_b_4?ie=UTF8&refRID=1BB6VKRP59S2420M132F). This book has a wonderful focus on building from the ground up and emphasizes clearly worded and mathematically rigorous proofs.

You seem genuinely interested in mathematics, but I do want to warn you about some more ahem esoteric (read: improperly worded, perhaps?) problems that ask such things as why 1 is greater than 0. The mathematics here is largely armchair - lacking any fundamental logic. There would be no issue with redefining a set of bases such that "0" is greater than "1". However, if you want a rationale for the concept of things being greater than one another, that's more like number theory. You can learn the 10 axioms of natural numbers and then build from there.

Both of the books I mentioned will cover stuff like this. For example, they both (unless I'm not remembering correctly) delve into Euclid's proof of infinite primes, something which may interest you.

Briefly (and not so rigorously), assume that the number of primes, p1, p2, p3, ..., pN, is finite. Then there exists a number P which is the product of these primes. Based on the axioms of natural numbers, since all primes p1, p2, ..., pN are natural numbers, P is a natural number and so is P+1. Consider S = P+1. If S is prime then our list is incomplete, so assume S isn't prime. Then some number in our list, say pI, divides S, because any natural number can be written as a product of primes. pI must also divide P, because P is the product of all the primes and pI is one of them. Therefore if pI divides S and pI divides P, then pI divides S-P = 1. That's a contradiction because no prime evenly divides 1.
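
The argument is easy to play with in code. A small Python sketch: no prime in the list divides P + 1, and the classic example 2·3·5·7·11·13 + 1 = 30031 = 59 × 509 shows that P + 1 need not itself be prime; it merely has prime factors outside the list:

```python
def is_prime(n: int) -> bool:
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

primes = [2, 3, 5, 7, 11, 13]   # pretend this list were *all* the primes
P = 1
for p in primes:
    P *= p                       # P = product of the list = 30030
S = P + 1                        # S = 30031

# No prime in the list divides S, since each divides P and would have to divide S - P = 1.
assert all(S % p != 0 for p in primes)

# S itself happens to be composite here: its prime factors lie outside the list.
d = next(d for d in range(2, S) if S % d == 0 and is_prime(d))
print(S, "=", d, "*", S // d)    # 30031 = 59 * 509
```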

Stuff like this is super cool, super simple, and super beautiful and you absolutely can learn it. These two books would be a great place to start.

u/TheAdventMaster · 3 pointsr/learnprogramming

Something like Code: The Hidden Language of Computer Hardware and Software may be up your alley.

So might From NAND to Tetris, a course where you build a computer (hardware architecture, assembler, OS, C-like compiler, and programs to run on the OS / written in the compiler) starting with just NAND gates.

At the end of the day though, the way things work is like this: Protocols and specifications.

Everything follows the same published IPO (input, processing, output) standards. Stuff is connected to and registers expected values on expected peripherals. The CPU, motherboard, graphics card, wireless modem, etc. all connect in the right, mostly pre-ordained places on the hardware.

In this vein, there are firmware-level APIs for communicating with all of these at the BIOS level. Although, as far as I'm aware, "actual" BIOS is no longer used; UEFI is used instead: https://en.wikipedia.org/wiki/Unified_Extensible_Firmware_Interface

This is what firmware is / is built on top of. Operating systems build on top of these. System calls. Operating systems communicate under the hood and expose some number of system calls that perform low-level actions, like talking to devices for things like file access or network I/O. A lot of this stuff is asynchronous / non-blocking, so the OS or system will then have to respond to an interrupt, or continuously check a register or some other means of getting a response from the device, to see if an operation completed and what its result was.

Loading the OS is one thing the BIOS is responsible for. This is through the bootstrapping process. The OSs are located at very specific locations on the partitions. In the past, the only command you had enough room for within BIOS / pre-operating system execution was to load your OS, and then the OS's startup scripts had to do everything else from there.

Once you have an operating system, you can ask the OS to make system calls and invoke low-level API requests to get information about your computer and computer system, such as the file system, networks, connected drives and partitions, etc. These calls are usually exposed via OS-specific APIs (think the win32 API) as well as through a command-line interface the OS provides.
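
As a concrete illustration, Python's os module exposes thin wrappers around exactly these system calls; a minimal sketch (the path is just an example, any readable file works):

```python
import os

# open(2), read(2), close(2): thin wrappers over the kernel's system calls.
fd = os.open("/etc/hostname", os.O_RDONLY)  # example Linux path; pick any readable file
data = os.read(fd, 4096)                    # ask the kernel for up to 4096 bytes
os.close(fd)

print(data.decode().strip())
print("pid from the kernel:", os.getpid())  # another one-line system call
```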

New devices and I/O from/to those devices communicate through firmware, and interrupts, and low-level system calls that are able to communicate with these firmware APIs and respond to them.

Just about anything you can think of - graphics, audio, networking, file systems, other i/o - have published standards and specifications. Some are OS-specific (X windowing system for Linux, DirectX win32 API or GDI on Windows, Quartz on Mac, etc.). Others are vendor-specific but don't seem to be going anywhere (OpenGL, then nVidia vs AMD driver support which varies across operating systems, etc.).

The biggest hardware vendors and specification stakeholders will work with the biggest operating system vendors on their APIs and specifications. It's usually up to device manufacturers to provide OS-compatible drivers along with their devices.

Drivers are again just another specification. Linux has one driver specification. Windows has another. Drivers are a way that the OS allows devices and users to communicate, with the OS as a middle-manager of sorts. Drivers are also often proprietary, allowing device manufacturers to protect their intellectual property while providing free access to use their devices on the OS of your choice.

I'm not an expert in how it all works under the hood, but I found comfort in knowing it's all the same IPO and protocol specifications as the rest of computing. No real hidden surprises, although a lot of deep knowledge and learning sometimes required.

When we get to actually executing programs, the OS doesn't have too much to work with, just the hardware... So the responsibility of slicing up program execution into processes and threads is up to the OS. How that's done depends on the OS, but pretty much every OS supports the concept in some sense.

As far as how programs are multi-tasked, both operating systems and CPUs are pretty smart. Instructions get sent to the chips, batched and divided by them, and the computational results placed into registers and RAM. Again, something I'm not a huge expert in, and it honestly surprised me to find out that the OS is responsible for threading etc. I for some reason always thought this was at the chip level.

When you include libraries (especially system / OS / driver libraries) in your code, you're including copies of or references to OS native functions and definitions to help you reference these underlying OS or system calls to do all the cool things you want to do, like display graphics on the screen, or play audio. This is all possible because of the relationship between OS's and device manufacturers and the common standards between them, as well as the known and standard architectures of programs designed for OS's and programs themselves.

Inter-program compatibility is where many things start to become high level, such as serialization standards like JSON or XML, but not always. There are some low-level things to care about for some programs, such as big- vs little-endian. Or the structure of ASM-level function calls.
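
A quick way to see the big- vs little-endian issue for yourself is Python's struct module, which packs values into explicit byte layouts (a small sketch):

```python
import struct

n = 0x0A0B0C0D  # one 32-bit integer, four bytes

print(struct.pack(">I", n).hex())  # big-endian:    0a0b0c0d
print(struct.pack("<I", n).hex())  # little-endian: 0d0c0b0a

# Two programs exchanging raw binary data must agree on byte order,
# which is why network protocols standardize on big-endian ("network order").
print(struct.pack("!I", n).hex())  # network order == big-endian
```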

And then you have things like bytecode, which languages like Java or JavaScript compile to: a system-independent representation of code that most often uses a simple heap or stack to describe things that might instead be register access or a low-level heap or stack if it had been written in ASM. Again, just more standards, and programs are written according to specifications and know how to interface with these.

The modularity of programming thanks to this IPO model and the fact that everything follows some standards / protocols was a real eye opener for me and made me feel like I understood a lot more about systems. What also helped was not only learning how to follow instructions when setting up things on my computer or in my programs, but learning how to verify that those instructions worked. This included a lot of 'ls' on the command-line and inspecting things in my debugger to ensure my program executed how I expected. These days, some might suggest instead using unit tests or integration tests to do the same.

u/SofaAssassin · 1 pointr/cscareerquestions

What kind of jobs are you applying for? Low-level stuff is typically applicable for things like engine work, graphics, optimization, networking and audio. Okay, that covers a lot of the game development process, but there are certainly jobs that aren't deep into that, like peripheral tooling (making tools for developers to use) or working on stuff like the web services that power the online community.

However, if your goal really is core game development, then you need to be a lot more targeted in how you learn. I have interviewed for and was hired by a game company that worked in C++, and have also worked in distributed, networked military simulations (think of it like boring, more realistic Starcraft), so here is how I gained the various knowledge I had in getting through those types of interviews (including a 90-minute written test for the game company where I had to debug C++ code on paper, answer various gotchas, etc.).

I don't know how far you have covered, but this is how I would approach the learning now, were I to start over again.


  • Become really good at C++ - During my first job, I mostly used Java with Python/C++/Perl/TCL on the side. I learned a lot of C++ in short order to prepare for interviews and move jobs (to simulation).

  • Read Accelerated C++ and/or C++ Primer. These are probably the best books for getting introduced to C++ and starting off in a good place (as in, not learning C++ in the form of C), getting familiar with using the OO system of C++ and using the standard library. Also remember to do the exercises to really reinforce the concepts.

  • Read Effective C++ SUPER COLLECTION - In honesty, you can make do with just Effective C++, Volume 1, but these cover good practices for using C++.

  • Read the C++ FAQ - lots of gotchas there and corner cases of C++.

  • If you want to go beyond those books and resources, there are Herb Sutter's Exceptional C++ books.

  • Understand the machine - this covers the low-level component, helping you to understand the machine itself, how your code runs, and how it's executed.
  • Read Randall Hyde's Write Great Code - This is one of my favorite technical books, and is language agnostic.

    It covers low-level concepts like CPU pipelining, memory, and how code interacts with the machine. I read this years after I started my job building simulations, and it reinforced a lot of what I learned previously and in college. I also recommended this book to a friend of mine who credits it with giving him an edge over his fellow college grads (he's years younger than I am) in low-level knowledge. If you don't know concepts like cache locality, cache lines and how memory is allocated, this book will cover that and more.

  • Read Randall Hyde's Art of Assembly Language - I have only briefly touched upon this book, but it takes a unique approach to introducing you to x86 ASM (by using a higher-level form of ASM).


  • Understand the algorithms and data structures - I took multiple classes in this in college, as well as periodically read CLRS to refresh my knowledge. But CLRS is too mathematically rigorous and theoretical here if you just want to get familiar with algorithms.

  • Skiena's Algorithm Design Manual is a more practical approach to refreshing yourself on algorithms and also learning complexity theory.

  • Skiena's Data Structures Lectures are helpful for data structures. In general, though, know these (I include whatever C++ has as well):
    • Dynamic array - std::vector<T> in C++.
    • Associative structures - std::map and std::unordered_map
    • Sets - std::set and std::unordered_set
    • Linked List - std::list<T> and std::forward_list<T>
    • Stacks and Queues - std::stack and std::queue
    • std::deque - The C++ implementation of a double-ended queue.
    • Trees - binary trees, red-black, heaps, tries (no standard C++ implementations of these, though stuff like std::set is typically implemented with a red-black tree behind the scenes)
    • Graphs

    • Understand the complexities of actions on each data structure (insertion, deletion, modification, searching, etc.); see the quick timing demonstration after this list.

  • Read the wiki on Pathfinding, because this class of algorithms is very important in game development, as well as network communication.
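
To show why those complexities matter, here's a small illustrative sketch (in Python for brevity; the same asymptotics apply to std::vector versus std::unordered_set) timing membership tests in a dynamic array versus a hash set:

```python
import timeit

n = 100_000
as_list = list(range(n))   # dynamic array: membership is an O(n) scan
as_set = set(as_list)      # hash set: membership is O(1) on average

probe = n - 1  # worst case for the list: element at the far end
print("list:", timeit.timeit(lambda: probe in as_list, number=200))
print("set: ", timeit.timeit(lambda: probe in as_set, number=200))
# The gap widens as n grows: that's the difference between O(n) and O(1).
```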

    -----

    The above covers the 'core' stuff you'd have to learn. If you wanted to get into stuff like network programming or graphics programming rather than just core gameplay development, I can expound further.
u/[deleted] · 5 pointsr/cscareerquestions

I have some advice for you. I'm speaking more from the position of someone who didn't do a lot of these things and regrets it than someone who can say for sure what you need to do, but I still think I have some helpful advice:

First and foremost, take a little time to enjoy your last summer before going off to university. I'm not saying you shouldn't also learn some CS, but honestly if you're the type of person who knows how to study and is willing to go to class and put in the work you already have a "head-start" on a good percentage of your classmates. And frankly, life gets a lot more stressful after high school. Enjoy yourself. You're right that AP courses aren't representative of the college experience, but your first year's coursework isn't going to be too scary. If you can handle AP, you can handle your freshman level coursework. Go to class, take good notes, and be an active participant in your courses, and then study afterwards. That's 90% of college success in the classroom. Doing well in the classroom isn't the only thing you need to worry about though (more below).

If you'd like a little summer reading, I personally found this book really enjoyable. Picked it up during my freshman year IIRC: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1541924346&sr=8-1&keywords=code

It is both very informative and a fun read.

If you've taken AP Comp. Sci. then I'm assuming you can code a bit. If you want to do some programming I'd recommend sticking to what you know and deepening your understanding of it or trying out the language that your university starts you off with. They might even be the same language. Don't worry about learning multiple languages yet. You will likely be introduced to several languages in undergrad. Focus on the fundamentals.

Get connected at your university. Universities have a lot of resources for people interested in research. For example, my university had an undergraduate research program. They also had a student success center with people equipped to help you develop a plan (which is what you're trying to do now) to get to where you want to be, including graduate school and beyond. Be an active participant in your CS classes. You're going to have some shitty professors but you'll likely meet some really cool people too, who are passionate about what they do and who have the connections to help you if you show some initiative. If it's a larger school there will also be plenty of student organizations for you to check out, both explicitly CS-related and basically anything you can imagine. Don't be afraid to check out non-CS stuff. See what your university has to offer. If you don't know where to start, check out the website and once you get on campus seek out the equivalent of what I called the student success center. They will be familiar with all of the programs and services that your university offers at a broad level. Should be pretty similarly named. Also, ask your professors! At my university student organizations had faculty advisors and even if your professor isn't that person they will probably know who is.

Final note: while you should be active, have a plan, and get involved, don't try to do too much at once. You will burn out. You need to be honest about your limits. It's healthy to push yourself, but only if you are mindful of what you can handle. And if you find yourself struggling, do not be afraid to seek out academic coaching. If you have trouble adjusting to the university experience (very common and nothing to be ashamed of) your university might also offer access to therapists. Finally, if you have a diagnosed disability, make sure to register with disability services to receive accommodations!

u/coned88 · 1 pointr/linux

While being a self-taught sysadmin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares. It's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sysadmin, but it may quench your thirst. I'm both a programmer and Unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.

OS Internals

While you obviously are successful at running and maintaining Unix-like systems, how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System

Linux Kernel Development
Advanced Programming in the UNIX Environment

Networking

Learning the actual function of networking at the code level is really interesting. There's a whole other world below the implementation. You likely know a lot of this.
Computer Networks

TCP/IP Illustrated, Vol. 1: The Protocols

Unix Network Programming, Volume 1: The Sockets Networking API

Compilers/Low Level computer Function

Knowing how a computer actually works, from electricity to EE principles, through assembly to compilers, may also interest you.
Code: The Hidden Language of Computer Hardware and Software

Computer Systems: A Programmer's Perspective

Compilers: Principles, Techniques, and Tools

u/ephrion · 4 pointsr/haskellquestions

> How long did it take you to learn haskell to a useful degree?

Useful Haskell is a much lower bar than social media Haskell: this excellent post shows the "Haskell Pyramid," where useful and practical code can be achieved without learning anything too crazy.

Here's a brief list of things you don't need to know to write useful Haskell:

  • technical terms like the various -morphisms,
  • what exactly a functor is,
  • what linear types entail,
  • why type checking is necessary and how it is implemented in Haskell.

    Here's some great resources for Useful Haskell:

  • Write an app! you're doing that already, so cool.
  • The Yesod Book is a great intro to web development in Haskell. Yesod is lighter and more modular than Ruby on Rails, but has more batteries included than any other framework in the language.
  • Parallel and Concurrent Programming in Haskell provided a fantastic overview of the two topics, along with GHC's exceptions and a bit of a dive into the IO execution model.
  • Stanford CS240H course is excellent with lower level information on GHC and IO.

    ---

    I don't mean to suggest that an interest in the things you mentioned is bad or not worth pursuing -- they're really cool! But my study of category theory, type theory, etc. has not meaningfully improved my performance at my day job of writing Haskell.

  • For an explanation of morphisms, you'll enjoy Bartosz Milewski's book on category theory, or alternatively recursion schemes.
  • Haskellbook is a great resource for the main type classes; I find that it has the best explanations and exercises for drilling intuition. "What exactly is a functor?" is a question that's only answerable by specifying the context -- C++, prolog, category theory, OCaml, and Haskell functors are all different.
  • A value with a linear type must be used exactly once. You can't duplicate it or drop it. There's not a lot of books on this, mostly papers and blog posts. Rust has affine types, which are similar, but they only require that a value is used at most once.
  • Why is type checking necessary? Strictly speaking, it isn't -- there are dynamically typed languages, after all. Types and Programming Languages is a great book that teaches about a wide variety of type systems, though it spends quite a lot of time on subtyping and object oriented type systems. If you're mostly interested in FP and Haskell type systems, you can skip those sections. I really enjoy Type Theory and Formal Proof as an introduction to sequent notation; it builds up the lambda calculus and explores each variant/extension until it develops the final Boss Mode "Calculus of Constructions" that is dependently typed and forms the basis of theorem proving languages like Coq and Agda. Haskell's type system is based on System Fw. Types are also extremely useful in optimization and compilation.
u/frostmatthew · 3 pointsr/WGU

tl;dr version:

  1. yes
  2. no, but that will be the case at any school

    Quick background to validate the above/below: I was a 30y/o banquet manager when I decided to change careers. I had no prior experience [unless you want to count a single programming class I took in high school] but did get a job in tech support at a medium size startup while I was in school and wrote a couple apps for our department. Just before I graduated I started working at a primarily Google & Mozilla funded non-profit as their sole software engineer. I moved on after a little over two years and am now a software engineer at VMware.

  3. The degree is a huge boost in getting past HR and/or having [good] recruiters work with you. You'll also learn the skills/knowledge necessary to get hired as a developer, which is obviously the more important part - but for the most part this is all stuff you can learn on your own. Without a degree, though, you'll greatly reduce the number of places that will even give you a phone screen [I'm not saying this is how it should be, but this is how it is].

  4. I typed out a lot before remembering New Relic had a great blog post a few months ago about all the stuff you don't learn in school [about software development], ha. So I would highly recommend you not only read it but also try to learn a little on your own (especially regarding SQL and version control) http://blog.newrelic.com/2014/06/03/10-secrets-learned-software-engineering-degree-probably-didnt/ Being a good developer (or good anything) takes time/experience - but knowing what they don't cover in school (and trying to learn it on your own) will help.

    Two books I'd suggest reading are The Pragmatic Programmer and Code: The Hidden Language of Computer Hardware and Software. Pragmatic Programmer is one of those classics that every good dev has read (and follows!). Code is great at giving you some insight into what's actually happening at a lower level - though it gets a bit repetitive/boring about halfway through so don't feel bad about putting it down once you reach that point.

    The best thing you can do to help you land a job is have some open-source side-projects (ideally on GitHub). Doesn't have to be anything major or unique - but it will help a lot for potential employers to see what your code looks like.

u/bhrgunatha · 6 pointsr/AskComputerScience

A famous artefact of early computing is the bootstrapping process, where the goal is a self-hosting compiler - which lets you write the compiler for a new language in the new language itself. However, to get to that point a lot of earlier innovations were needed.

Take all of this with a pinch of salt - the order and the details may be wildly inaccurate, but the overall ideas viewed from afar give an idea of how we got to the point that we can choose our own language to write a compiler for another language.

To start with, raw binary values had to be set in order to define and run a program. Those raw binary values represent instructions that tell the hardware what to do and data that the program needs to operate. This is now usually referred to as machine code.

At first you would enter values into computer storage using switches.

Since that's so tedious and error prone, punched cards were developed along with the necessary hardware to read them, so you could represent lots of values that could be read together. They had their own problems, but it was a step forward from switches.

After some time, symbolic instructions were defined as a shortcut for machine code - now usually called assembly language. For example, "store the value 8 into memory location 58" could be written as ST 8, [58]. This might take 3 machine code values: one representing the store instruction, one the value 8, and one the location 58. Since assembly language could be written down, it was easier to understand what the computer was being instructed to do. Naturally someone had the bright idea to make that conversion automatic, so that, for example, you could write down the instructions by hand, then create punched cards representing those instructions, convert them to machine code and then run the program. The conversion from the symbolic instructions to machine code was handled by a program called an assembler - people still write programs in assembly code and use assemblers today.
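
To show how mechanical that translation is, here's a toy assembler sketch in Python for the ST example above (the opcodes are invented for illustration, not any real machine's):

```python
# Invented opcodes for a made-up machine: ST = 0x01, LD = 0x02.
OPCODES = {"ST": 0x01, "LD": 0x02}

def assemble(line: str) -> list[int]:
    """Translate one line like 'ST 8, [58]' into machine-code values."""
    mnemonic, rest = line.split(maxsplit=1)
    value, address = rest.split(",")
    return [
        OPCODES[mnemonic],                 # the instruction itself
        int(value),                        # the operand value
        int(address.strip().strip("[]")),  # the memory address
    ]

print(assemble("ST 8, [58]"))  # [1, 8, 58]: three stored values, as described above
```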

The next logical step was to make the symbolic instructions more useful and less aimed at the mundane, physical processes that tell the computer exactly how to operate, and more friendly for people to represent ideas. This is really the birth of programming languages. Since programming languages allowed you to do more abstract things symbolically - like saving the current instruction's location and branching off to another part of the same program to return later - the conversion to machine code became more complex. Those programs are called compilers.

Compilers allow you to write more useful programs - for example, the first programs that let you connect a keyboard to enter numbers and characters, print them on a device, and later display them on another device like a screen. From there you are quite free to write other programs. More languages and their compilers developed that were more suitable for representing more abstract ideas like variables, procedures and functions.

During the whole process, both hardware (the physical electronic machines and devices) and software (the instructions to get the machines to do useful work) were developed, and that process still continues.

There's a wonderful book called Code by Charles Petzold that details all of these developments, but properly researched and accurate.



u/random012345 · 1 pointr/learnprogramming

Books on project management, software development lifecycle, history of computing/programming, and other books on management/theory. It's hard to read about actual programming if you can't practice it.

Some of my favorites:

  • Code: The Hidden Language of Computer Hardware and Software - GREAT choice I notice you already have listed. Possibly one of my favorites, and this should be on the reading list of everyone who is involved in IT somehow. It's basically how computers and programming evolved, and it gets you into a great way of thinking.

  • The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography - Another great history book on code and how things came to be. It's more about crypto, but realistically computing's history is deeply rooted into security and crypto and ways to pass hidden messages.

  • Software Project Survival Guide - It's a project management book that specifically explains it in terms of software development.

  • The Art of Intrusion: The Real Stories Behind the Exploits of Hackers, Intruders and Deceivers - A fun collection of short hacking stories compiled and narrated by Kevin Mitnick, one of the most infamous hackers. Actually, any of Mitnick's books are great. There's a story in there about a guy who learned to hack while in jail and got all kinds of special privileges with his skills.

  • Beautiful Data: The Stories Behind Elegant Data Solutions - Most of the books in the "Beautiful" series are great and insightful. This is one of my more favorite ones.

  • A Guide to the Project Management Body of Knowledge: PMBOK(R) Guide - THE guide to project management from the group that certifies PMP... boring, dry, and great to help you get to sleep. But if you're committed enough, reading it inside and out can help you get a grasp or project management and potentially line you up to get certified (if you can get the sponsors and some experience to sit for the test). This is one of the only real certifications worth a damn, and it actually can be very valuable.

    You can't exactly learn to program without doing, but hopefully these books will give you good ideas on the theory and management side to give you the best understanding when you get out. They should give you an approach many here don't have: realizing that programming is just a tool to get to the end, and that you can work out, before you even touch any code, how to best organize things.

    IF you have access to a computer and the internet, look into taking courses on Udacity, Coursera, and edX. Don't go to or pay for any for-profit technical school, no matter how enticingly their marketing tells you you'll be a CEO out of their program.
u/fuckjeah · 1 pointr/todayilearned

Yes, I know - that is why I mentioned general-purpose computation. Turing wrote a paper about making such a machine, but British intelligence, which funded him during the war, needed a machine to crack codes through brute force, so it didn't need general computation (his invention) - though the machine still used fundamental parts of computation invented by Turing.

The ENIAC is a marvel, but it is an implementation of his work; the invention was his. Even Grace Hopper mentions this.

What the Americans did invent there, though, was the higher-level language and the compiler. That was a brilliant bit of work, but the credit for computation goes to Turing, and for general-purpose computation too (this is why the award in my field of comp. sci. is the Turing Award, why a machine with all 8 operations needed to be a general computer is called Turing complete, and why Turing, along with Babbage, is called a father of computation). This conversation is a bit like crediting Edison with the lightbulb. He certainly did not invent the lightbulb; what he did was make the lightbulb a practical utility by creating a longer-lasting one (the lightbulb's first patent was filed 40 years earlier).

I didn't use a reference to a film as a historical reference; I used it because it is in popular culture, which I imagine you are more familiar with than the history of computation - as shown by your not mentioning Babbage once, even though the original assertion was about the invention of "computation" and not the first implementation of the general-purpose computer.

> The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.

Here is a bit about von Neumann (American creator of the von Neumann architecture we use to this day) and what he had to say:

> The principle of the modern computer was proposed by Alan Turing, in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called "Universal Computing machine" that is later known as a Universal Turing machine. He proved that such machine is capable of computing anything that is computable by executing instructions (program) stored on tape, allowing the machine to be programmable.

> The fundamental concept of Turing's design is stored program, where all instruction for computing is stored in the memory.

> Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.

TLDR: History is not on your side, I'm afraid. Babbage invented computation; Turing invented the programmable computer. Americans invented the memory pipelines, the transistor, the compiler, and the first compilable programming language. Here is an American book by a famous Nobel Prize-winning physicist (Richard Feynman) where the roots of computation are discussed and the invention credit is awarded to Alan Turing. It's called Feynman Lectures on Computation - you should read it (or perhaps the silly movie is more your speed).

u/root_pentester · 3 pointsr/blackhat

No problem. I am by no means an expert in writing code or buffer overflows, but I have written several myself and even found a few in the wild, which was pretty cool. A lot of people want to jump right into the fun stuff but find out rather quickly that they are missing the skills to perform those tasks. I always suggest people start from the ground up when learning to do anything like this. Before going into buffer overflows you need to learn assembly language. Yes, it can be excellent sleep material, but it is certainly a must. Once you get an understanding of assembly, you should learn basic C++. You don't have to be an expert or even intermediate level - just learn the basics of it and be familiar with it. The same goes for assembly. Once you have that, writing things like shellcode should be no problem. I'll send you some links for a few books I found very helpful. I own these myself and they helped me tremendously.

Jumping into C++: Alex Allain

Write Great Code, Volume 1: Understanding the Machine

Write Great Code, Volume 2: Thinking Low-Level, Writing High-Level

Reversing: Secrets of Reverse Engineering

Hacking: The Art of Exploitation - I used this for an IT security college course. The professor taught us using this book.

The Shellcoder's Handbook - This book covers EVERYTHING you need to know about shellcode and is filled with lots of tips and tricks. I mostly plug in shells from Metasploit, but this goes really deep.


If you have a strong foundation of knowledge and know the material from the ground-up you will be very successful in the future.

One more thing: I recently took and passed the course from Offensive Security to get my OSCP (Offensive Security Certified Professional). I learned more from that class than from years in school. It was worth every penny spent on it. You get to VPN into their lab and run your tools from Kali Linux against a LOT of machines, ranging from Windows to Linux, and find real vulnerabilities of all kinds. They have training videos that you follow along with and a PDF that teaches you all the knowledge you need to be a pentester. Going in I only had my CEH from EC-Council and felt nowhere close to being a pentester. After this course I knew I was ready. At the end you take a 24-hour test to pass. No questions or anything - just hands-on hacking. You have 24 hours to hack into a number of machines and then another 24 hours to write a real pentest report like you would give a client. You even write your own buffer overflow in the course, and they walk you through it step by step in a very clear way. The course may seem a bit pricey, but I have to say it was really worth it. http://www.offensive-security.com/information-security-certifications/oscp-offensive-security-certified-professional/

u/JBlitzen · 2 pointsr/learnprogramming

You're asking a very complex question that the best current minds in the fields of sociology, politics, psychology, technology, futurology, neuroscience, education, and many others, cannot answer.

We just don't know what it is that makes a good programmer different from a bad one. We all have theories and ideas and notes and observations and anecdotes, but compiling them together doesn't generate an actual understanding of the subject.

So I'm sorry but there's no real way for anyone to answer your question.

I would look for local resources to double check wtf you're doing.

Befriend a trusted professor and visit them during office hours, or a trusted student or advisor or professional or something. Explain your concern and ask them to walk you through how they approach problems. Not the solution; just watch how they approach the problem and pay very close attention to it.

That's quite invaluable.

For instance, take SQL. If I get really stuck on a SQL problem, I go back to my root approach to it, which is to ask this question: if I had several tables of data printed out on paper, what would I tell a monkey to do to collate them and generate the output I want? That's all that SQL usually does; it explains to the computer how to collate and present diverse data tables.

And that's easy to forget when you're trying to think about join syntax or something and you're grasping at straws and trying to construct a pattern in your mind and it keeps unraveling because you don't have a good sense at the root level of what it is that you're trying to do.
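To make that root-level picture concrete, here's a minimal sketch using Python's built-in sqlite3 module. The tables and names are invented for the example, but the join is exactly the "collate the printed tables" instruction described above:

```python
import sqlite3

# Two "printed tables": customers and their orders.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.execute("CREATE TABLE orders (customer_id INTEGER, item TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?)",
               [(1, "Ada"), (2, "Linus")])
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [(1, "keyboard"), (2, "monitor"), (1, "mouse")])

# "For each order, find the customer row with the matching id and
# write out both" -- that's all the join is really asking for.
rows = db.execute("""
    SELECT customers.name, orders.item
    FROM orders
    JOIN customers ON customers.id = orders.customer_id
""").fetchall()
print(rows)  # e.g. [('Ada', 'keyboard'), ('Linus', 'monitor'), ('Ada', 'mouse')]
```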

Better programmers aren't defined by what they know, they're defined by how they think.

And I sense your problem is that you're trying to apply knowledge not without understanding it, but without understanding how to understand it.

Watch this, I'm about to do a cool programmer mental trick of segueing to what seems like a completely unrelated subject, but I'm actually following a mental thread that connects the two.

When linguists try to decipher ancient languages, they sometimes run into an interesting kind of trouble. Take Linear A as an example. They have the symbols. They have a pretty good idea of what some of them mean. But they have no idea how they go together or what most of them mean, because they have no context for deciphering the thing. It's completely inaccessible. For all scientists can tell, Linear A might as well be encoded symbols of sound waves which can only be translated intelligibly when played aloud. If they converted a Linear A script to MP3, it might come out a perfect rendition of All Along The Watchtower.

The problem isn't that they haven't unlocked the words, the problem is that they haven't figured out how the writers thought. And without knowing that, the words are probably unknowable. They could throw some together that have likely translations, but what sounds like "there is the sun" might actually mean "my dog ate my banana".

So the key to unlocking the language isn't to stare at the language and to try to wrestle words into place.

Instead, it's to research the culture that used the language, to try to learn more about the culture and how it functioned. Did it have a seat of government? Was it patriarchal or matriarchal? Was it militant or trade oriented or hunter/gatherer or what?

By understanding how the people lived, you get a sense of how they thought. And by understanding how they thought, you can start to figure out how they communicated. And more than one language over time has been deciphered in that way.

Or how about this: you don't speak French, but encountering a French person on the street, you're able to use hand gestures to ask them directions to something, and they're able to respond intelligently using hand gestures. How did that happen? Because you both thought the same way.

This psychological question consumes exobiologists, because they're tasked with figuring out how to communicate with aliens who, by definition, don't think like us.

So what do they do? They return to the basic roots. And the simplest roots they can find are the hydrogen atom and the set of prime numbers. And maybe pi. Things like that. And people hear "math is the universal language" and sneer dismissively because math is boring, but it's actually true.

So I'm curious whether you're fluent in the universal language of computers, or whether you just think you are because you practiced writing linked lists 37 times.

Charles Petzold wrote a great book called "Code: The Hidden Language of Computer Hardware and Software". It doesn't teach you how to program, but it teaches you WHY you have to program. And by understanding the WHY, you get great inroads to the HOW.

I'd recommend taking a look at that if you can find it. "http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319". Notice that the second book in the "Customers Who Bought This Item Also Bought" section is titled "Think Like A Programmer". Code has better reviews so I'd start there.

But that might help.

I don't think you're stupid, I just think that you aren't thinking the right way about the problems.

Or maybe it's a problem solving thing altogether. How do you do with Sudoku or other games like that? Or chess puzzles?

I dunno. This is all food for thought, and maybe some of it will stick.

ETA: Damn, I wrote a lot. I'll save it and blog it someday, maybe. Anyway, I did a search on "how programmers think" and came up with this interesting article: "http://www.techrepublic.com/blog/programming-and-development/how-to-think-like-a-programmer-part-1/43". It's interesting even though I don't entirely agree with it (and I couldn't find the other two parts anyway), because at the bottom of the first section, in bold, the author says that the big problem with some programmers is that they've been taught the HOW but not the WHY. So here again is someone supporting my point. Tricks and habits and stuff will only take you so far; at some point you have to learn the WHY in a way that lets you apply it and start coming up with your own HOWs as the need arises.

u/looeee · 1 pointr/math

some amazing books I would suggest to you are:

  • Godel Escher Bach

  • Road to Reality By Roger Penrose.

  • Code by
    Charles Petzold.

  • Pi in the Sky by John Barrow.

    All of these I would love to read again, if I had the time, but none more so than Godel, Escher, Bach, which is one of the most beautiful books I have ever come across.

    Road to Reality is the most technical of these books, but gives a really clear outline of how mathematics is used to describe reality (in the sense of physics).

    Code, basically, teaches you how you could build a computer (minus, you know, all the engineering. But that's trivial surely? :) ). The last chapter on operating systems is pretty dated now but the rest of it is great.

    Pi in the Sky is more of a casual read about the philosophy of mathematics. But it's very well written - good nighttime reading!

    You have a really good opportunity to get an intuitive understanding of the heart of mathematics, which even at a college level is somewhat glossed over, in my experience. Use it!
u/JimWibble · 1 pointr/Gifts

He sounds like a younger version of myself! Technical and adventurous in equal measure. My girlfriend and I tend to organise surprise activities or adventures we can do together as gifts, which I love - it doesn't have to be in any way extravagant, but having someone put time and thought into something like that is amazing.

You could get something to do with nature and organise a trip or local walk that would suit his nature photography hobby. I love to learn about new things and how stuff works, so if he's anything like me, something informative that fits his photography style - like a guide to local wildflowers or a bug guide - would go down well. I don't know much about parkour, but I do rock climb, and a beginner's bouldering or climbing session might also be fun and something you can do together.

For a more traditional gift, Randall Munroe of the webcomic XKCD has a couple of cool books that might be of interest - Thing Explainer and What If. Also, the book CODE is a pretty good one for an inquisitive programmer, and it isn't tied to any particular language, skillset or programming level.

u/anachronic · 3 pointsr/AskNetsec

> I have zero Linux experience. How should I correct this deficiency?

First, install a VM host (Oracle VirtualBox is free), download a Linux ISO, and boot from it. Debian and Ubuntu are two of my favorites. Both are totally free (as are most Linux distros). Once installed, start reading some beginner Linux tutorials online (or get "Linux in a Nutshell" by O'Reilly).


Just fuck around with it... if you screw something up, blow it away and reinstall (or restore from a previous image)

> Is it necessary? Should I start trying to make Linux my primary OS instead of using windows, or should that come later?

It's not necessary, but will help you learn faster. A lot of security infrastructure runs on Linux and UNIX flavors. It's important to have at least a basic understanding of how a Linux POSIX system works.

> If you can, what are some good books to try to find used or on PDF to learn about cissp and cisa? Should I be going after both? Which should I seek first?

You don't need to worry about taking & passing them until you've been working in the field for at least 3-5 years, but if you can get some used review materials second-hand, it'll give you a rough idea what's out there in the security landscape and what a security professional is expected to know (generally)


CISSP - is more detailed and broader and is good if you're doing security work day-to-day (this is probably what you want)


CISA - is focused on auditing and IT governance and is good if you're an IT Auditor or working in compliance or something (probably not where you're headed)


> What are good books I can use to learn about networking? If you noticed I ask for books a lot its because the only internet I have is when I connect my android to my laptop by pdanet, and service is sketchy at my apartment.

O'Reilly is a reliable publisher of quality tech books. An Amazon search for "O'Reilly networking" pulls up a bunch. Also, their "In a Nutshell" series are great reference books for Windows, Linux, networking, etc. You can probably find older/used copies online for a decent price (check eBay and half.com too)

> How would you recommend learning about encryption? I just subscribed to /r/crypto so I can lurk there. Again, can you point me at some books?

Try "The Code Book" for a very accessible intro to crypto from ancient times thru today
http://www.amazon.com/The-Code-Book-Science-Cryptography/dp/0385495323


Also, for basics of computer architecture, read "CODE", which is absolutely excellent and shows how computers work from the ground up in VERY accessible writing.
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/reddilada · 1 pointr/learnprogramming

Being fond of problem solving is a good indicator. Problem solving and executing a solution are essentially what programming is all about in the end - pretty much any engineering degree, for that matter. The good news is that most STEM coursework is pretty much the same for the first couple of years of college, so you won't really have to commit straight away. Your classes will apply to multiple degree paths, and having a few intro compsci courses under your belt will help in literally any major.

A computer science degree is (should be) geared to problem solving more than learning to write code. Writing code is the easy bit and the tech changes so quickly it is something best learned on the fly. You will be taking tons of math, studying algorithms, data structures, learning to play well with others -- that sort of thing.

Being fond of computers alone can lead one astray. The classic example is that liking listening to music doesn't necessarily lead to liking making music.

The Harvard cs50x extension course will give you a straight up taste of what an intro to CS class will be like in university. The pace is fast so fair warning.

A good armchair book is CODE. Nice overview of how computers compute.

It's a great career choice IMO. I've been at it for a long long (long) time with zero regrets. Alongside getting to play with all the shiny bits, you get a constant supply of feel-good moments when you see your work actually doing something in the wild and impacting people's lives in a positive way.

u/ryantriangles · 2 pointsr/webdev

> Recently released books? Udemy courses? Free stuff online like W3Schools or CodeAcademy?

Mozilla Developer Network has fantastic documentation for JS, HTML, CSS, and the browser APIs, and a section intended to guide you through them for the first time in a comfortable order.

If JS is your first language, I'd recommend checking out the book "Code" by Charles Petzold, a great and relatively short book that answers the fundamental beginner questions like "So what's a CPU actually doing in there?", "How does source code make stuff happen?" and "How can music be 1s and 0s?"

There's a great series of books, available for free online, called "You Don't Know JS", that teach you how the language operates. It might be too involved if it's the first thing you read, but definitely at least start it and bookmark the later volumes to come back to.

Marijn Haverbeke's book "Eloquent JavaScript" is available to read freely online. I'd really recommend that one, too. And when you want to stop reading theory and start working on actual projects, grab "JavaScript Cookbook" by Shelley Powers.

> Did I choose right languages for what I wish to create? Maybe I should also use something else, like Bootstrap, Node.js or Typescript?

Don't worry about those for now. Node will be useful if you decide you need a server component to your game. Bootstrap is nice, and it's helpful, but if you're still trying to learn HTML and CSS yourself, it will only get in the way and obscure things. TypeScript is something you'd look at only once you're comfortable and confident with regular old JS. Stick to plain old HTML/CSS/JS of your own to start.

u/FearMonstro · 3 pointsr/compsci

Nand to Tetris (coursera)

The first half of the book is free. You read a chapter, then you write programs that simulate hardware modules (like memory, an ALU, registers, etc.). It's pretty insightful and gives you a much richer understanding of how computers work. You could benefit from just the first half of the book. The second half focuses more on building assemblers, compilers, and then a Java-like programming language. From there, it has you build a small operating system that can run programs like Tetris.
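For a rough taste of what those exercises feel like, here's a minimal sketch in Python of simulating one such module - a tiny ALU. (The book actually has you describe chips in its own hardware description language; this is just the flavor of it.)

```python
# Sketch of an ALU module like the ones you simulate in the course:
# the control input 'op' selects the computation, like control bits
# on the real chip; outputs wrap to 16 bits.
def alu(x, y, op):
    ops = {
        "add": (x + y) & 0xFFFF,
        "and": x & y,
        "not": ~x & 0xFFFF,
        "neg": (-x) & 0xFFFF,
    }
    out = ops[op]
    zero = (out == 0)      # status flag output, as on the real chip
    return out, zero

print(alu(5, 3, "add"))    # (8, False)
print(alu(7, 0, "and"))    # (0, True)
```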

Code: The Hidden Language of Hardware and Software

This book is incredibly well written. It's intended for a casual audience and will guide the reader to an understanding of how a microcontroller works, from the ground up. It's not a textbook, which makes it even more impressive.

Computer Networking Top Down Approach

One of the best-written textbooks I've read. Very clear and concise language. This will give you a pretty good understanding of modern-day networking. I appreciated that the book is filled to the brim with references to other books and academic papers for a more detailed look at subtopics.

Operating System Design

A great OS book. It actually shows you the C code used to design and implement the Xinu operating system. It's written by a Purdue professor. It offers a top-down look, but backs everything up with C code, which really solidifies understanding. The Xinu source code can be run on emulators or real hardware for you to tweak (and the book encourages that!)

Digital Design Computer Architecture

Another good "build a computer from the ground up" book. The strength of this book is that it gives you more background on how real-life circuits are built (it uses VHDL and Verilog), and it provides a nice overview chapter on transistor design. A lot less casual than the Code book, but easily digestible for someone who appreciates this stuff. It culminates in designing and describing a microarchitecture to implement a MIPS processor. The diagrams used in this book are really nice.

u/maholeycow · 1 pointr/SoftwareEngineering

Alright man, let's do this. Sorry, had a bit of a distraction last night so didn't get around to this. By the way, if you look hard enough, you can find PDF versions of a lot of these books for free.

Classic computer science principle books that are actually fun and a great read (This is the kind of fundamental teachings you would learn in school, but I think these books teach it better):

  1. https://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2 - this one will teach you, at a low level, about 1s and 0s and logic and all sorts of good stuff - the interoperation of hardware and software. This was a fun book to read.
  2. https://www.nand2tetris.org/book - This book is a must in my opinion. It touches on so many things, such as Boolean logic, machine language, architecture, compiling code, etc. And it is f*cking fun to work through.

    Then, if you want to get into frontend web development for example, I would suggest the following two books for the fundamentals of HTML, CSS, and JavaScript. What I like about these books is they have little challenges in them:

  3. https://www.amazon.com/Murachs-HTML5-CSS3-Boehm-Ruvalcaba/dp/1943872260/ref=sr_1_2_sspa?keywords=murach%27s+html5+and+css3&qid=1557323871&s=books&sr=1-2-spons&psc=1
  4. https://www.amazon.com/Murachs-JavaScript-jQuery-3rd-Ruvalcaba/dp/1943872058/ref=sr_1_1_sspa?keywords=murach%27s+javascript&qid=1557323886&s=books&sr=1-1-spons&psc=1

    Another great book that will teach you the fundamentals of coding - using Python, an extremely flexible programming language - and how to think like a programmer is this book (disclaimer: I haven't read this one, but I have read other Head First books, and they rock. My roommate read this one and loved it, though):

  5. https://www.amazon.com/Head-First-Learn-Code-Computational/dp/1491958863

    Let me know if you want any other recommendations when it comes to books on certain areas of software development. I do full stack web app development using .NET technology on the backend (C# and T-SQL) and React in the frontend. For my personal blog, I use vanilla HTML, CSS, and Javascript in the frontend and power backend content management with Piranha CMS (.NET Core based). I often times do things like pick up a shorter course or book on mobile development, IoT, etc. (Basically other areas from what I get paid to do at work that interest me).

    If I were to recommend the very first book on this list to read, it would be the Head First book. Then I would move to the first book listed in the classic computer science section if you want to understand low-level details; if that's not the case, move toward implementing something with Python, or take a Python web dev course on Udemy.

    Other really cool languages IMO: Go, C#, Ruby, Javascript, amongst many more

    P.S. Another book from someone that was in a similar situation to you: https://www.amazon.com/Self-Taught-Programmer-Definitive-Programming-Professionally-ebook/dp/B01M01YDQA/ref=sr_1_2?keywords=self+taught+programmer&qid=1557324500&s=books&sr=1-2
u/rispe · 3 pointsr/javascript

Congratulations! That's a big step. Be proud that you were able to make the switch. Not many people manage to transform ideas into results.

I think there are four areas on which you need to focus, in order to go from mediocre to great. Those areas are:

  1. Theoretical foundation.
  2. Working knowledge.
  3. Software engineering practices.
  4. Soft skills.

    Now, these areas don't include things like marketing yourself or building valuable relationships with coworkers or your local programming community. I see those as being separate from being great at what you do. However, they're at least as influential in creating a successful and long-lasting career.

    Let's take a look at what you can do to improve yourself in those four areas. I'll also suggest some resources.

    ​

    1. Theoretical foundation

    Foundational computer science. Most developers without a formal degree have some knowledge gaps here. I suggest taking a MOOC to remediate this. After that, you could potentially take a look at improving your data structures and algorithms knowledge.

  • CS50: Introduction to Computer Science
  • Grokking Algorithms
  • Algorithms by Sedgewick

    ​

    2. Working knowledge.

    I'd suggest doing a JavaScript deep-dive before focusing on your stack. I prefer screencasts and video courses for this, but there are also plenty of books available. After that, focus on the specific frameworks that you're using. While you're doing front-end work, I also suggest you explore the back-end.

    ​

  • FunFunFunction on Youtube
  • You Don't Know JS
  • JavaScript Allonge
  • JavaScript Design Patterns

    3. Software engineering practices.

    Design patterns and development methodologies. Read up on testing, agile, XP, and other aspects of how good software is developed. You could do this by reading the 'Big Books' of software, like Code Complete 2 or The Pragmatic Programmer, in your downtime. Or, if you can't be bothered, just read assorted blog posts/Wikipedia articles.

    ​

    4. Soft skills.

  1. Actively seek to mentor and teach others (perhaps an intern at work or someone in a local tech community, or through blog posts or videos online).
  2. Get mentorship and learn from others. Could be at work, or open source.
  3. Go to programming meetups.
  4. Try public speaking; go to a Toastmasters meetup.
  5. Learn more about and practice effective communication.
  6. Learn more about business and the domain that you're working in at your company.
  7. Read Soft Skills or Passionate Programmer for more tips.

    ​

    Some closing notes:

    - For your 'how to get started with open source' question, see FirstTimersOnly.

    - If you can't be bothered to read or do large online courses, or just want a structured path to follow, subscribe to FrontendMasters and go through their 'Learning Paths'.

    - 4, combined with building relationships and marketing yourself, is what will truly differentiate you from a lot of other programmers.

    ​

    Sorry for the long post, and good luck! :)
u/christianitie · 17 pointsr/math

I would guess that career prospects are a little worse than CS for undergrad degrees, but since my main concern is where a phd in math will take me, you should get a second opinion on that.

Something to keep in mind is that "higher" math (the kind most students start to see around junior level) is in many ways very different from the stuff before. I hated calculus and doing calculations in general, and was pursuing a math minor because I thought it might help with job prospects, but when I got to the more abstract stuff, I loved it. It's easily possible that you'll enjoy both, I'm just pointing out that enjoying one doesn't necessarily imply enjoying the other. It's also worth noting that making the transition is not easy for most of us, and that if you struggle a lot when you first have to focus a lot of time on proving things, it shouldn't be taken as a signal to give up if you enjoy the material.

This wouldn't be necessary, but if you like, here are some books on abstract math topics that are aimed towards beginners you could look into to get a basic idea of what more abstract math is like:

  • theoretical computer science (essentially a math text)

  • set theory

  • linear algebra

  • algebra

  • predicate calculus

    Different mathematicians gravitate towards different subjects, so it's not easy to predict which you would enjoy more. I'm recommending these five because they were personally helpful to me a few years ago and I've read them in full, not because I don't think anyone can suggest better. And of course, you could just jump right into coursework like how most of us start. Best of luck!

    (edit: can't count and thought five was four)
u/p7r · 4 pointsr/NoStupidQuestions

I've taught a lot of people how computers work, or more precisely how to program them. I am sure you can learn too.

First, let's make it fun.

There is a lot of material for people who like the Raspberry Pi out there that is fun and simple. You don't even need to own a Raspberry Pi to understand what they're talking about.

It's fun and simple because it's designed for youngsters who find long/complex books a bit too boring. I think you might enjoy it, because you've said you've found the books you've tried too boring.

Here is a load of magazines about the Pi - on each issue you can click on "Get Issue" and then under the cover "download the PDF" and read it and see if you enjoy that.

Next, have a play with Scratch. It's designed for kids but the exact same concepts are in professional programming languages.

The reason I recommend it is not because I think you are a child, but because it's a lot of fun and makes a lot of the dull and boring bits of programming go away so you can focus on the fun bits.

You have to remember that the things going on inside a real computer are basically the same things going on in Scratch - just a lot more complex.

If you ever want to learn a programming language that professional developers use, I think you'll like Ruby.

It's very forgiving for new developers, but still lets you do what we would call "production grade" code. It's what I work in most days.

Also, why's poignant guide is quite funny, but you might find it a bit weird and confusing - I know I did the first time I read it. :-)

I also recommend this book to you: Code by Charles Petzold. The first few chapters don't seem like they're about computers, because they talk about flags and electrical circuits - that's because you need to understand those things first.

If you can read and understand the whole thing you will know more about how computers work than half of the professional software engineers out there. And they're normally quite a clever bunch.

If you find it too difficult, slow down and think. Each paragraph has something in it worth thinking about and letting it mull over in your mind.

IQ is not a measure of how much you can learn, but perhaps of how quickly you can see patterns and understand things.

You having a lower IQ than somebody else does not mean you can't see those patterns or understand things, it just means it might take you a little more thinking to get there. I'm sure you will.

If you ever have any questions about computers, I'd love to try and help answer them - feel free to ask me.

u/dnew · 1 pointr/Unity3D

There was a book I read 40 years ago that covered basically everything from vacuum tubes and semiconductors up to basically chips. It was in the library, and it was like 800 pages long. I asked on reddit if anyone knew what it was, and someone pointed me at the newest edition. But I don't really have time to go through all my comment history looking for "electronics book" or to write a program to do same, but you should feel free to do so. :-) Then I got into assembly for the 8-bit CPUs, picked up the 16-bit and 32-bit CPUs of the day, and the mainframe stuff. Then I went back to school. :-)

However, all that said, this looks like what I read, and the intro sounds like he's describing the first edition I remember: https://smile.amazon.com/Electronic-Devices-Circuit-Theory-11e-ebook/dp/B01LY6238B/ref=mt_kindle

If you want more about assembler, just flipping through this, it seems like it starts with the very fundamentals and goes through a fair amount. https://smile.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_5 If you already know how to program, and you understand the basics of how (for example) basic assembly language works, how the chip accesses memory, what an interrupt does, etc., then learning a new assembly language is pretty straightforward. Sort of like "I know Java, now I need to learn C#."

But honestly, at this point, I'd look online. When I learned all this stuff, textbooks were the way to go. Nowadays, everything moves so fast that you're probably better off finding a decent description online, or looking up an online class or something and seeing what texts they use.

If you don't want to learn assembler or hardware, but you still want to challenge yourself, the other thing to look into is unusual programming languages and operating systems. Things that are unlike what people now use for doing business programming. Languages like APL (or "J"), or Hermes, or Rust, or Erlang, or Smalltalk, or even Lisp or Forth if you've been steeped in OOP for too long. Operating systems like Eros or Amoeba or Singularity. Everything stretches your mind, everything gives you tools you can use in even the most mundane situations, and everything wonderful and wild helps you accept that what you're doing now is tedious and mundane but that's where you're at for the moment. :-) (Or, as I often exclaim at work, "My kingdom for a Java list comprehension!")

u/IjonTichy85 · 2 pointsr/compsci

Hi,
do you want to become a computer scientist or a programmer? That's the question you have to ask yourself. Just recently someone asked about self-study courses in CS, and I compiled a list of courses that focus on the theoretical basics (roughly the first year of a bachelor's program). Maybe it's helpful to you, so I'm gonna copy & paste it here for you:



I think before you start you should ask yourself what you want to learn. If you're into programming or want to become a sysadmin you can learn everything you need without taking classes.

If you're interested in the theory of cs, here are a few starting points:

Introduction to Automata Theory, Languages, and Computation

The book you should buy

MIT: Introduction to Algorithms

The book you should buy


Computer Architecture<- The intro alone makes it worth watching!

The book you should buy

Linear Algebra

The book you should buy <- Only scratches the surface, but it's a good starting point. Also, it's extremely informal for a math book. The MIT channel offers many more courses and is great for autodidactic studying.

Everything I've posted requires no or only minimal previous education.
You should think of this as a starting point. Maybe you'll find lessons or books you'll prefer. That's fine! Make your own choices. If you've understood everything in these lessons, you just need to take a programming class (or just learn it by doing), a class on formal logic and some more advanced math classes and you will have developed a good understanding of the basics of cs. The materials I've posted roughly cover the first year of studying cs. I wish I could tell you were you can find some more math/logic books but I'm german and always used german books for math because they usually follow a more formal approach (which isn't necessarily a good thing).
I really recommend learning these thing BEFORE starting to learn the 'useful' parts of CS like sql,xml, design pattern etc.
Another great book that will broaden your understanding is this Bertrand Russell: Introduction to mathematical philosophy
If you've understood the theory, the rest will seam 'logical' and you'll know why some things are the way they are. Your working environment will keep changing and 20 years from now, we will be using different tools and different languages, but the theory won't change. If you've once made the effort to understand the basics, it will be a lot easier for you to switch to the next 'big thing' once you're required to do so.

One more thing: PLEASE, don't become one of those people who need to tell everyone how useless a university is and that they know everything they need just because they've been working with python for a year or two. Of course you won't need 95% of the basics unless you're planning on staying in academia and if you've worked instead of studying, you will have a head start, but if someone is proud of NOT having learned something, that always makes me want to leave this planet, you know...

EDIT: almost forgot about this: use Unix, use Unix, and I can't emphasize this enough: USE UNIX! Building your own Linux from scratch is something every computer scientist should have done at least once in their life. It's the only way to really learn how a modern operating system works. Also, try to avoid Apple/Microsoft products, since they're usually closed source and don't give you the chance to learn how they work.

u/infinitelyExplosive · 1 pointr/pcmasterrace

Here are some different sources for different aspects of computers.

The book Code: The Hidden Language of Computer Hardware and Software is an excellent introduction to the low-level concepts that modern CPUs are built on.

Link hopping on Wikipedia is a totally viable method to learn many aspects of computers. Start at some page you know about, like Graphics Cards or Internet, and just keep reading and clicking links.

Hacking challenges are a great way to learn about how computers work since they require you to have enough knowledge to be able to deliberately break programs. https://picoctf.com/ is an excellent choice for beginner- to intermediate-level challenges. https://overthewire.org/wargames/ also has some good challenges, but they start off harder and progress quickly. Note that these challenges will often require some programming, so learning a powerful language like Python will be very helpful.

This site is not very active anymore, but the old posts are excellent. It's very complex and advanced though, so it's not a good place to start. https://www.realworldtech.com/

In general, google will be your best friend. If you run into a word or program or concept you don't know, google it. If the explanations have more words you don't know, google them. It takes time, but it's the best way to learn on your own.

u/dhobsd · 9 pointsr/askscience

Hooray, a question I can answer!

One of the problems here is that the question is worded backwards. Binary doesn't combine to give us programming languages, so the answer to your question is somewhat to the contrary: programming languages were invented to ease the tedium of interfacing with machines using binary codes. (Though it was still arguably tedious to work on, e.g., punched cards.) Early interfaces for programming machines in binary took the form of "front panels" with switches, where a user would program one or several instructions at a time (depending on the complexity of the machine and the front panel interface), using the switches to set the actual binary representation of the processor functions they desired to write.

Understanding how this works requires a deeper understanding of processors and computer design. I will only give a very high-level overview here (others have discussed it briefly), but you can find a much more layperson-accessible explanation in the wonderful book Code: The Hidden Language of Computer Hardware and Software. This book explains Boolean logic, logic gates, arithmetic logic units (ALUs) and more, in a very accessible way.

Basically, logic gates can be combined in a number of ways to create different "components" of a computer, but in the field of programming languages, we're really talking about the CPU, which allows us to run code to interface with the other components in the system. Each implementation of a processor has a different set of instructions, known as its machine code. This code, at its most basic level, is a series of "on" or "off" electrical events (in reality, it is not "on" and "off" but high and low voltages). Thus, different combinations of voltages instruct a CPU to do different things, depending on its implementation. This is why some of the earliest computers had switch-interfaces on the front panel: you were directly controlling the flow of electricity into memory, and then telling the processor to start executing those codes by "reading" from the memory.
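As a rough illustration of that picture - binary codes sitting in memory, fetched and executed by the processor - here's a toy fetch-decode-execute loop in Python. The opcodes are invented for illustration; real machine code is specific to each processor's implementation:

```python
# Toy machine: memory is just numbers. The "CPU" fetches an opcode
# and operand, decodes the opcode, and executes it -- the same cycle
# a front panel programmed by hand, one switch-flip at a time.
def run(memory):
    acc, pc = 0, 0                  # accumulator, program counter
    while True:
        opcode, operand = memory[pc], memory[pc + 1]
        pc += 2
        if opcode == 0x01:          # LOAD an immediate value
            acc = operand
        elif opcode == 0x02:        # ADD an immediate value
            acc += operand
        elif opcode == 0x03:        # PRINT the accumulator
            print(acc)
        elif opcode == 0x00:        # HALT
            return

# "Flipping the front-panel switches": LOAD 10, ADD 7, PRINT, HALT.
run([0x01, 10, 0x02, 7, 0x03, 0, 0x00, 0])   # prints 17
```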

It's not hard to see how programming like this would be tedious. One could easily write a book to configure a machine to solve a simple problem, and someone reading that book could easily input the code improperly.

So eventually as interfacing with the machine became easier, we got other ways of programming them. What is commonly referred to as "assembly language" or "assembler" is a processor-specific language that contains mnemonics for every binary sequence the processor can execute. In an assembly language, there is a 1:1 correlation between what is coded, and what the processor actually executes. This was far easier than programming with flip-switches (or even by writing the binary code by hand), because it is much easier for a human to remember mnemonics and word-like constructs than it is to associate numbers with these concepts.

Still, programming in assembly languages can be difficult. You have to know a lot about the processor. You need to know what side-effects a particular instruction has. You don't have easy access to constructs like loops. You can't easily work with complex datatypes that are simply explained in other languages -- you are working directly with the processor and the attached memory. So other languages have been invented to make this easier. One of the most famous of these languages, a language called "C," presents a very small core language -- so it is relatively easy to learn -- but allows you to express concepts that are quite tedious to express in assembler. As time has gone on, computers have obviously become much faster, and we've created and embraced many languages that further and further abstract any knowledge about the hardware they are running on. Indeed, many modern languages are not compiled to machine code, but instead are interpreted by a compiled binary.

The trend here tends to be making it easier for people to come into the field and get things done fast. Early programming was hard, tedious. Programming today can be very simple, fun and rewarding. But these languages didn't spring out of binary code: they were developed specifically to avoid it.

TL;DR: People keep inventing programming languages because they think programming certain things in other ones is too hard.

u/Nerdlinger · 1 pointr/geek

Oi. Disclaimer: I haven't bought a book in the field in a while, so there might be some new greats that I'm not familiar with. Also, I'm old and have no memory, so I may very well have forgotten some greats. But here is what I can recommend.

I got my start with Koblitz's Course in Number Theory and Cryptography and Schneier's Applied Cryptography. Schneier's is a bit basic, outdated, and erroneous in spots, and the guy is annoying as fuck, but it's still a pretty darned good intro to the field.

If you're strong at math (and computation and complexity theory) then Oded Goldreich's Foundations of Cryptography Volume 1 and Volume 2 are outstanding. If you're not so strong in those areas, you may want to come up to speed with the help of Sipser and Moret first.

Also, if you need to shore up your number theory and algebra, Victor Shoup is the man.

At this point, you ought to have a pretty good base for building on by reading research papers.

One other note: two books that I've not looked at, but that are written by people I really respect, are Introduction to Modern Cryptography by Katz and Lindell and Computational Complexity: A Modern Approach by Arora and Barak.

Hope that helps.

u/di0spyr0s · 1 pointr/resumes

Thanks so much!

Where do hobbies and interests go? Below Education somewhere? Sample stuff I could add:

  • I started sewing this year and have achieved my goal to knit and sew all my own clothes for 2015.
  • I play guitar, drums, and piano, and I'm learning to play bass. A friend and I started a band called OCDC, because we're n00bs and play the same thing over and over a lot.
  • I read insatiably. Most recently Code: The Hidden Language of Computer Hardware And Software and A Guide to the Good Life: The Ancient Art of Stoic Joy, but also the backs of cereal packages and the "In case of fire" escape instructions on doors if there's nothing else.
  • I'm from New Zealand and can, if necessary, butcher a sheep/pig/deer/rabbit, build a fence, milk a cow by hand (or milk several hundred, given a decent sized milking shed), TB test deer, fell trees, and use the word "munted" in a sentence.
  • I've ridden horses all my life and still volunteer occasionally as an equine masseuse for some of the carriage horses in Central Park.
  • I love automating stuff and am working on fully automating my home aquaponics setup: a combination of an aquarium and a grow bed which currently produces great quantities of grass for my cats to puke up.

    I had sort of planned to put all this stuff on my personal website - write-ups of personal projects, a Goodreads feed, an "About me" section, and maybe a page of my sewing/knitting creations.

    I'll certainly look into adding some more personality into the resume design, it is currently the result of a google template, which is pretty blah.

    Again, Thanks so much for your feedback! It's been really helpful!
u/rohit275 · 4 pointsr/hardware

I haven't read it, but it looks pretty good. I can personally vouch for this book however:

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=pd_sim_14_2?_encoding=UTF8&psc=1&refRID=C8KMCKES83EHGXS3VWSQ

It's truly amazing. I'm currently an EE PhD student but I had a pretty limited background in digital hardware and computer architecture, and I read most of this book just out of interest a little while ago and frankly learned quite a bit. It's written at a very readable level for anyone with almost no prior knowledge, yet gets technical when it needs to. It's very thorough, but approaches the topics at a wonderful and easy pace with very clear explanations. The author even says you can skip some of the more technical details if they're not of interest to you, and you'll still end up learning quite a lot. The book you posted looks pretty similar, so I'd say it's worth a shot.

u/YuleTideCamel · 3 pointsr/learnprogramming

Sure, I really enjoy these podcasts.

u/KernlPanik · 20 pointsr/learnprogramming

I'm a ~10 year sysadmin that has decided to rebuild my software dev skills that I haven't used since college. Here's what I did to reawaken that part of my brain:

  1. Harvard's CS50. I figured an entry level college course would be "beneath me" but it was a great experience and I learned a surprising amount. It's very entertaining as well so that made the "simple" parts fun to do as well.

  2. Read CODE by Charles Petzold. Great insight into the nuts and bolts of how computers work. Read through it on my lunch breaks while taking CS50 in the evenings.

  3. Read and do the problems in C Primer Plus. This is a great book for learning how to write in C, which is the basis for many modern languages and is still widely used today. A great starter book for anyone who wants to learn to program.

    3.5) After going through the last chapters of C Primer Plus, I realized that some of my math skills were not up to par, so I took this MOOC from MIT to supplement that. No idea if that's something you need.

  4. Here comes the fun one: The Structure and Interpretation of Computer Programs, aka The Wizard Book. This book is more about how to design software in general, and it is pretty difficult. That being said, if you can get through it then you have the chops to do this professionally.
u/Sigb · 8 pointsr/emulation

Hyperthreading is also a way to utilize each core more effectively. Not all programs can run as many instructions simultaneously as emulators can, so hyperthreading is there to let you run two different threads on one core. It goes without saying that it's not as effective as having more cores, but it helps a lot for mid-range laptops doing typical user workloads.

State switching is also a feature that helps with typical workloads. The processor changes (grossly simplified) between higher-performance and lower-performance states and everything in between, more quickly and more effectively. So when games suddenly give the CPU more tasks to do, the CPU will be quicker to adjust and supply the necessary "power". This improves times for things like waking from sleep and opening programs, and reduces power usage and heat build-up.

There is also a large number of minor things like branch prediction, and things we don't hear about as much (industry trade secrets), that add up to very tangible improvements. Besides my not knowing much about them, many of them are by nature secretive, since Intel has a lot of competition.

My reason for writing just a few words about these things was so you can understand that there is a plethora of ways CPUs can improve. And they do. And it might be impossible to get the typical consumer to do more than compare clock speeds (just getting them to factor in core count is hard). In short, the topic is much less simple than it looks at first glance.

If you want to learn more about processors, starting with a simple one helps a huge amount. A guy called J.C. Scott designed a fully working theoretical 8-bit CPU architecture, similar to actual 8-bit CPUs, just so he could explain in detail how CPUs "know" stuff. He starts off by showing you how you can build logic gates using transistors, then how you can build all of the components out of logic gates. And the book is very easy to follow. amazon link
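In that spirit, here's a tiny sketch in Python of that central trick: once you have one gate (NAND, which is simple to build from a pair of transistors), every other gate - and eventually the machine's arithmetic - can be composed from it. The function names here are mine, not the book's:

```python
# Everything from one gate: compose the rest of the logic from NAND.
def nand(a, b): return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

# A half adder: the first step toward the CPU's arithmetic hardware.
def half_adder(a, b):
    return xor_(a, b), and_(a, b)   # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```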

u/JumboJellybean · 1 pointr/learnprogramming

Here are my two big enthusiastic suggestions.

Sign up to Lynda. You get a 10-day free trial, and then it's $30 for a month. Watch Foundations of Programming: Fundamentals (4h 47m), Foundations of Programming: Data Structures (2h 29m), Programming Fundamentals in the Real World (3h 8m), and Python 3 Essential Training (6h 36m).

These are short crash courses, you obviously don't walk away a full-on programmer. But the main instructor Simon Allardice is excellent at explaining the basics and fundamentals in a very clear, natural way. I have taken university courses, I have watched MIT and Harvard courses, I have used a dozen tutorial sites and watched a bunch of lecturers and read three dozen books: the Lynda programs I linked are the best first-intro things I've seen. I strongly recommend that you watch at least the first two.

You might not understand it all, that's fine. Don't worry about what languages he uses for examples, 90% of stuff carries over between languages. If you can absorb a good chunk of that material it'll be a huge boost for you and a nice foundation to start on. You'll walk into your first real class in a better headspace, already knowing the gist of what you're going to flesh out and properly sink your teeth into. And if you find that the Lynda stuff really works well, look up their C and databases courses, since you'll wind up using that stuff too.

My second recommendation is that you buy and read Charles Petzold's wonderful book Code: The Hidden Language of Computer Hardware and Software. This book doesn't focus on a specific programming language, or how to implement programs, or the mathematics, or the theory. Instead it answers "So what is a computer actually doing under there when we program? What's a CPU, exactly, how does it understand words and numbers?" It does this in a very natural, accessible, for-the-layman way, starting with really simple analogies about signal flags and morse code, and before you know it, bam, you understand logic gates and binary arithmetic, and a lot of the mystery and magic of computers has dissolved for you.

u/playaspec · 1 pointr/embedded

> Would you suggest something like a Linux From Scratch or Gentoo setup to learn more about these things?

No. Stick with what you know. Mint is a derivative of Ubuntu, which is a derivative of Debian. While Mint tends to appear more polished, I've run into trouble doing "not normal" things that I'm accustomed to in Ubuntu. For a development environment, I'd stick with Ubuntu, but if you're happy with your Mint setup, use that.

>Any particular book/resource recommendation?

There's a book I adore, but you should get it after you've gotten your hands dirty and have worked your way through most of the tutorials in Arduino IDE, and are already dabbling with bare metal. It's Write Great Code: Volume 1: Understanding the Machine by Randall Hyde.

>Should I start with the arduino libraries

Yeah. With a CS background, you should be able to blow through most of the demos and get a feel for the hardware and the environment in a weekend or several evenings. When you're comfortable, start poking at the source code of the libraries, and of the Arduino itself. All the Arduino's code, including the bootloader, is buried within the IDE itself, plus there's a repo on GitHub.

When you find yourself starting to dabble with programming the bare metal (or at least direct peripheral access from within the Arduino environment), get the datasheet for the processor on your board. It's the first thing I reach for when doing microcontroller programming. The datasheet has everything you'll need to know about configuring the peripherals, configuring I/O, interrupts, and more.

>or jump straight to bare metal?

If you can, go for it! Don't feel bad if it's too much to figure out at first. The Arduino environment does hide quite a bit of complexity. One thing to note: avr-gcc (used by the Arduino IDE and available as a Linux package) uses slightly different register names in some cases than the datasheet does. You may find sample code that appears to do what you want, but fails to compile/assemble. It's probably because they're using Atmel's naming convention. Update the names to match avr-gcc and it usually builds without a hitch.

As for the books, both look pretty good, but I haven't read either, so you'll have to rely on the reviews and your wallet. The author of the first one has a fairly popular YouTube series that may supplement the book, and it is sponsored by Element14 (a big components supplier). The latter had several comments citing that it covers more advanced topics (not sure which) in later chapters. Also, the latter book is published by No Starch Press, who also publish the book I recommended. They do a terrific job of breaking down and explaining new technical concepts.

u/lazyAgnostic · 1 pointr/santashelpers

For programming, what kind of programming is he into? Here are some cool programming books and things:

  • Automate the Boring Stuff with Python This book has a lot of beginner projects that are actually useful.

  • Arduino A little microcontroller board that he can use to make cool projects. I'm a software engineer and I had fun playing around with this. Plus, you can use it for actual useful things (I'm planning on making an automatic plant waterer, but you can look online for all the awesome stuff people have made).

  • Raspberry Pi Similar to the Arduino, but it's a full computer, for more software-heavy projects than the Arduino. I'd probably recommend starting with the Arduino.

  • A great book about how code and computers actually work, geared towards the "intelligent layperson": link.

  • If he's already programming and wants to create games, I can recommend this one. Not good for beginners though.

  • If you want to give him a well-written tome about game programming, here it is. Again, not really for beginners, but really good for someone wanting to learn about game programming.

u/entropicone · 1 pointr/androiddev

You can start out with some of the New Boston Android Tutorials - http://www.youtube.com/playlist?list=PL3D7BFF1DDBDAAFE5

The best way to learn is to pick a project and see it through to fruition. Go with something simple but not too simple; I'd recommend trying to make your own clone of this tip calculator. Don't just make it kind-of work; get it to where you would be proud to release it.

If you are completely new to programming it will be slow going at first, but there is no better time to learn than now. With some google searches you can find hundreds of free online programming courses (MIT Open Courseware and UC Berkeley should get you started). You can google just about any problem and find someone who has encountered it before and solved it; sites like stackoverflow have hit the mainstream with programmers, and it has become far easier to disseminate and learn best practices.

Also, Code by Charles Petzold is by far the best introduction I have ever read on computing theory, it has very little to do with conventional programming though.

Shamelessly stealing from one of the Amazon reviews:
> The average person who uses a computer to surf the web or type letters has so little knowledge of the underlying technology he or she is using that it may as well be magic. Even programmers, who typically spend their days solving problems with the high-end abstractedness of object-orientation, may be more than a little unclear about what's actually going on inside the box when their compiled code is running.
> Petzold attempts, and largely succeeds at, writing a book that leaves the reasonably intelligent layperson with a thorough comprehension of each layer that comprises a modern electronic computer (binary coding -> electronic representation -> transistors -> logic gates -> integrated circuits -> microprocessors -> opcodes -> assembly language -> high-level language -> applications). At times, the reader must follow along carefully, but Petzold tries to avoid needless complication.

u/serimachi · 5 pointsr/computerscience

It's so great you're being so proactive with your learning! It will definitely pay off for you.

I like others' suggestion of Clean Code, but I fear that as a first year it may have mostly flown over my head--not that it would at all hurt to read. For a first-year student specifically, I'd recommend either of two books.

Structure & Interpretation of Computer Programs, also known as The Wizard Book and free on the link I just sent you, is a famous textbook formerly used in MIT's Intro to Computer Science course. However, it's conceptually useful to programmers on any level. If you really, seriously read it and do the exercises, it's gonna give you a rock-solid foundation and shoot you ahead of your peers.

It uses Scheme, a quote-unquote "useless" programming language for any real-world purpose. That's arguable, but the important thing about the book is that it's really edifying for a programmer. The skill it helps you develop is not the kind that will directly show on your resume; it's nothing you can point to, but it's the kind of skill that will show in your code and how you think and approach problems in general. That said, the book has exercises, and the MIT site I linked you to has labs that you could potentially show off on your GitHub.

Code: The Hidden Language of Computer Hardware and Software is much more approachable, is not marketed specifically for programmers, and does not contain any exercises. Read it, though, and you'll find you have a huge boost in understanding the low-level computing classes that your classmates will struggle with. What it basically does is show the reader how one can build a computer, step by step, from the very basics of logic and switches. It's readable and written for a casual audience, so you may find it easier to motivate yourself to finish it.

SICP and Code, despite both being extremely popular, can be a bit difficult conceptually. If you don't fully understand something, try reading it again, and if you still don't understand it, it's fine. Everyone experiences that sometimes. It's okay to move forward as long as you feel like you mostly get the topic. Don't let the perfect be the enemy of the good.

Best of luck to you, and be excited! It's thrilling stuff.

u/michael0x2a · 2 pointsr/learnprogramming

I've seen a lot of people recommending Code: The Hidden Language of Computer Hardware and Software

It is about 15 years old though, so it might seem a little out of date in places and omit some modern developments in technology and computer science, but the existing content should still be pretty solid.

That being said, I do agree that the best thing to do is to just jump straight in. The best way to gain a good mindset for programming is to just start programming. You'll run head-first into obstacles and bugs, and figuring out how to fix those bugs/avoid those bugs is pretty much how you acquire that sort of mindset.

u/cholland89 · 27 pointsr/compsci

I just finished reading Code: The Hidden Language of Computer Hardware and Software and will state unequivocally that this book is the most satisfying read I've experienced. It starts with flashlights blinking through windows, moves to Morse code, introduces electrical relays and demonstrates how they can be connected to form logic gates, then uses those gates to construct an ALU/counter/RAM and multiplexors. It goes on to describe the development of an assembly language and the utilization of input and output devices.

This book can be described as a knowledge hose, flooding the gaps in my understanding of computer hardware/software at an extremely enjoyable pace. It may help satisfy your interest in the concepts and technology that led to modern computers. Check out the reviews for more info.

If you haven't already studied logic gates in depth in your formal education, I would suggest using a logic simulator to actually build the combinational logic structures. I now feel very comfortable with logic gates and have a strong understanding of their application in computing from my time spent building the described logic.
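If you don't have a simulator handy, you can play with the same ideas in a few lines of Python. This is just a toy sketch (the function names are my own): a NAND primitive, the classic gates derived from it, and a half adder, the first step toward the book's ALU.

    # NAND is the only primitive; everything else is built from it,
    # the same way the book builds hardware out of relays.
    def nand(a, b):
        return 0 if (a and b) else 1

    def inv(a):
        return nand(a, a)

    def and_(a, b):
        return inv(nand(a, b))

    def or_(a, b):
        return nand(inv(a), inv(b))

    def xor(a, b):
        return and_(or_(a, b), nand(a, b))

    def half_adder(a, b):
        # Add two bits: returns (sum, carry)
        return xor(a, b), and_(a, b)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, half_adder(a, b))

Watching the truth tables fall out of nothing but NAND is a nice miniature of what the book does with relays.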

I went through the book very slowly, rereading chapters and sections until I felt confident that I understood the content. I can not recommend this book enough.

After reading CODE, I have been working through The Elements of Computing Systems: Building a Modern Computer from First Principles. If you are looking to gain a better understanding of the functions of hardware components, this is the book to read. The book's companion site http://www.nand2tetris.org has the first chapters free, along with the entire open-source software suite used in the book's projects. Starting from Nand gates, you build in a hardware description language every logic gate and every part of a computing system, all the way up to a modern high-level language: you write custom software of your own design, compile it with a compiler you designed into an assembly language you specified, and run the resulting binary on a processor you built from Nand gates and flip-flops. This book was very challenging before reading CODE; now I feel like I'm simply applying everything I learned in CODE with even more detail. For somebody who hasn't attended college for computing yet, this has been a life-changing experience.

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319


http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686

u/chunyukuo · 3 pointsr/TranslationStudies

First of all, congrats on the promotion and the learning spirit. I wish more managers had your attitude.

I had a similar situation where I went from in-house linguist to loc manager, and I wonder if my experiences might be of use to you. Like you, I definitely did not describe myself as "into programming." I'm still not into that sort of thing. But learning as much of it as I could had a direct benefit to a lot of my daily tasks, and I would recommend at least giving the more learner-friendly tutorial sites a try.

I finished a lot of modules on codecademy.com and genuinely enjoyed them because they were not particularly difficult, allowed me to automate a lot of things, and gave me a deeper understanding of how things work. I went through Learn Python the Hard Way and gained a lot from that, especially since subsequent projects included quite a lot of assets in Python. I went so far as to plow through the first half of Code: The Hidden Language of Computer Hardware and Software (the latter half was too arcane for me) and found that quite useful as well, although in hindsight it was a bit overkill.

Even after my department was given an actual programmer to code up solutions for us, I at least was able to understand how a good amount of it worked. Coding aside, a localization manager is the person that the linguists and testers go to when things break, and man do they break a lot. That said, I would also recommend training of some sort in SDL and Kilgray's products if you use them. In my experience as manager, both broke often or were fussy at best.

A few years later, I haven't really read much about code, but I still try to ask developers as many questions as I can about the technical aspects of their products and find it really helpful to follow up on Stack Overflow or just Wikipedia.

Good luck with your new position!

u/a_redditor · 1 pointr/learnprogramming

Let me just say right off the bat that it sounds like you're well on your way to being a successful programmer.

One thing I can definitely suggest (which helped me a lot) is reading Code or some other book like it. It is effectively a guide from the ground up of how computers and programming work in general. I had the fortune of reading most of it before I started my CS degree, and it really helped me breeze through my hardware courses.

As well, any books on data structures would probably be helpful, as this is one of the early topics covered in many CS programs. I can't suggest any specific books, but I'm sure others can.

Most of all I have to suggest just getting very comfortable with programming and learning several different languages. It looks like you're already well on your way with this, but the goal here is to have a strong passion for programming before college. That way, when you're up at 3 AM the night before an assignment is due, it's not because you procrastinated and you waited until the last minute to start because you loathe the thought of programming, but because you're so excited about making your code perfect and adding in additional functionality because you absolutely love programming.

u/MerlinTheFail · 1 pointr/gamedev

This is really cool! Thank you.

>A common question is whether the book is still relevant. After all it's over ten years old

I find that some old(ish) books can really hold some great significance. For example: Effective C++ and Clean Code have both given me some brilliant tips on making better code. I'm also reading Write Great Code. If you have any more books I'd love to see them :) Thank you again.

u/jeykottalam · 8 pointsr/compsci

Introduction to Algorithms by CLRS

TAOCP is a waste of time and money; it's more for adorning your bookshelf than for actually reading. Pretty much anyone who suggests TAOCP and is less than 55 years old is just parroting Standard Wisdom™.

Gödel, Escher, Bach is a nice book, but it's not as intellectually deep in today's world as it was when first published; a lot of the memes in GEB have been thoroughly absorbed into nerd culture at this point, and the book should be enjoyed more as a work of art than as something particularly informative (IMO).

If you're interested in compilers, I recommend Engineering a Compiler by Cooper & Torczon. Same thing as TAOCP applies to people who suggest the Dragon Book. The Dragon Book is still good, but it focuses too much on parser generators and doesn't really cover enough of the other modern good stuff. (Yes, even the new edition.)

As far as real programming goes, K&R's The C Programming Language is still unmatched for its quality of exposition and brevity, but these days I'd strongly suggest picking up some Python or something before diving into C. And as a practical matter, I'd suggest learning some C++ from Koenig & Moo's Accelerated C++ before learning straight C.

Sipser's Introduction to the Theory of Computation is a good theory book, but I'd really suggest getting CLRS before Sipser. CLRS is way more interesting IMHO.

u/hooj · 28 pointsr/explainlikeimfive

The whole subject is a bit too complicated and a bit too deep for a short ELI5, but I'll give a stab at the gist of it.

The reason why computers work (at least in the vein of your question) is very similar to the reason why we have language -- written, spoken, etc.

What you're reading right at this very moment is a complex system (language) simplified to symbols on the screen. The very fact that you can read these words and attain meaning from them means that each sentence, each word, and each letter represent a sort of code that you can understand.

If we take an apple for example, there are many other ways to say that in different languages. Manzana. Pomme. Apfel. And so on. Codes -- some symbol maps to some concept.

In the context of computers, well, they can only "understand" binary. Ones and zeros. On and off. Well, that's okay, because we can map those ones and zeros to codes that we (humans) care about. Like 101010111 could represent "apple" if we wanted it to.

So we build these physical circuits that either have power or don't (on and off) and we can abstract that to 1's (power flowing through that circuit) and 0's (no power flowing through it). This way, we can build physical chips that give us basic building blocks (basic instructions it can do) that we can leverage in order to ultimately make programs, display stuff, play sounds, etc. And the way we communicate that to the computer is via the language it can understand, binary.

In other words, in a basic sense, we can pass the processor binary, and it should be able to interpret that as a command. The length of the binary, and what it should contain, can vary from chip to chip. But let's say our basic chip can do basic math. We might pass it a binary number: 0001001000110100 but it might be able to slice it up as 0001 | 0010 | 0011 | 0100 -- so the first four, 0001, might map to an "add" command. The next four, 0010, might map to a memory location that holds a number. The third group of four might be the number to add it to. The last group might be where to put it. Using variables, it might look like:

c = a + b. Where "c" is 0100, "a" is 0010, "b" is 0011, and the "+" (addition operator) is 0001.
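To make that concrete, here's a toy Python sketch of the imaginary chip above. The field layout and the "add" opcode are made up for illustration, just like in the explanation:

    # Decode the made-up 16-bit instruction 0001 0010 0011 0100:
    # opcode | memory address of "a" | literal "b" | destination of "c"
    word = 0b0001001000110100

    opcode  = (word >> 12) & 0xF   # 0001 -> our "add" command
    src     = (word >> 8)  & 0xF   # 0010 -> where "a" lives
    literal = (word >> 4)  & 0xF   # 0011 -> the number "b"
    dest    = word         & 0xF   # 0100 -> where to put "c"

    memory = [0] * 16
    memory[src] = 40               # pretend "a" is 40

    if opcode == 0b0001:           # the "add" instruction
        memory[dest] = memory[src] + literal   # c = a + b

    print(memory[dest])            # 43

A real CPU does exactly this kind of decoding, in hardware, millions of times per second.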

From those basic instructions, we can layer abstractions. If I tell you to take out the trash, that's a pretty basic statement. If I were to detail all the steps needed to do that, it would get a lot longer -- take the lid off the can, pull the bag up, tie the bag, go to the big garbage can, open the lid, put the trash in. Right? Well, if I tell you to take out the trash, it rolls up all those sub-actions needed to do the task into one simple command.

In programming, it's not all that different. We layer abstractions to a point where we can call immense functionality with relatively little code. Some of that code might control the video signal being sent to the screen. Some of that code might control the logic behind an app or a game. All of the code though, is getting turned into 1's and 0's and processed by your cpu in order to make the computer do what is asked.

If you want to learn more, I highly recommend Code by Charles Petzold for a much more in depth but still layman friendly explanation of all this.

u/cunttard · 1 pointr/C_Programming

Start with getting the XCode developer tools and the command-line package.

C is an important language in computer science because it is pretty much the language for heavy-duty operating systems: the type you see in desktop OSes, network OSes (the type that runs on a networking router/switch), and server OSes (Linux, BSD, Windows, etc.).

I think C is a hard language to learn, but it is a great first serious language, especially while simultaneously learning an easier language like shell or Python to make yourself more efficient/productive.

However, fundamental to CS is the theory of computation, not languages as such. Languages are just a way to express computation. Some languages are better than others for expressing computation to solve certain problems. I would highly encourage also looking into understanding computation from first principles; a great introduction is Theory of Computation (the 2nd edition is really, really cheap used). The only background knowledge you need is high school mathematics.

u/boojit · 5 pointsr/learnprogramming

Ok here goes.

One of the challenges with answering your question directly is deciding at what scope to answer it. If we take your question at its broadest level of meaning, it almost becomes "how do computers work?" What I mean by this is when you said in another comment that you'd like to see how a program goes from outputting simple text into a command window, to a program with a GUI like Audacity, you should be asking yourself, "waitasecond...how does the program output to the console?" And for that matter, how do the characters get drawn on the screen? What is happening when I "compile" code into an executable, and what happens when I "run" an executable?

So you see how this can quickly get out of hand. One of the things that is frustrating with starting to learn how to program, at least it was for me, is you get sorta plopped right in the middle of the story. Typically, you don't get the beginning of the story until much later in your studies.

And that's because the beginning of the story, while very interesting and well worth knowing, is pretty damn complicated. It takes quite a bit of study and effort to wrap your head around.

But let me whet your appetite anyway. Since we have been talking about input/output (console output vs. a GUI), take a look at this page. What you have here is a very, very, very basic computer. It doesn't have a monitor, and it doesn't have a keyboard. It doesn't have a disk and it can't connect to the internet. But a computer it is, and its fundamental operation is exactly the same as the computer that is your laptop, or tablet, or smart phone. The way you input information into this computer is by flipping switches on the front. The output from the computer is displayed on the little LED lights on the front. But at a high level, it's not really that different from what happens on your computer: you input information into the computer that you want processed, the computer processes that information, and the output of the processing is displayed to you.

Watch the video on that page, and you'll see what I mean about even though this computer is primitive compared to your laptop, it is still quite complicated. I don't expect you to understand what he's talking about in this video (you will later as you progress), all you really need to take away is this idea that even with a primitive computer like this, explaining how you input data into it and get data back out of it, is fundamentally a complicated thing. If this sort of thing interests you and you would like to know more, I would recommend you pick up this book.

Here's the point I'm trying to make: you won't get a truly satisfying answer to your question until you understand how the toggle-switch-blinkenlight computer works, for starters. Obviously, that won't be for a while. And that's OK! Just understand that you're kind of starting somewhere in the middle of the whole "how computers really work" story, and know that eventually you will read the beginning.

Now to your specific question.

First I think reading this page will do more to put you on the right path than anything else I could say. Therein, the author will show you some very basic Java GUI programs that you can run yourself, even if you're not using an IDE. In one sense, programs like Audacity are just more complicated versions of these primitive GUI programs. So that is off to a good start.

But what this author doesn't really address is how, when you write JOptionPane.showMessage, the computer takes that line of code and is able to go "Ok, um i am now going to draw a box on the screen and make this thing called a button, etc etc".

So what's going on there? As some others have pointed out in various comments, there are things called APIs that Java can access; these are libraries of code including the code to draw those windows and buttons on the screen, made such that YOU can reuse that functionality in your own programs without having to write it yourself. Also others pointed out that you have something called an operating system or OS; similarly this provides a sort of "platform" for your code to run on; so for example you can ask the OS to load a file from disk, read from the keyboard, etc, without having to write that code yourself.

The major point here is that other industrious individuals have spent a lot of time writing tools that your little tiny program can leverage. Even the simplest "hello world" program is relying on all these tools -- otherwise the code wouldn't look so simple!

I hope this helps. Please let me know if any of this is unclear, I'll try to clarify as my schedule allows.

Edit: spelling, grammar, tightened up some sentences

u/Tyaedalis · 2 pointsr/learnprogramming

I just began reading CODE and it talks about the lowest level of computing mechanisms. This could be something of interest, although it won't teach you how to program specifically.

For that, I propose to you -- as others have -- Learn to Code the Hard Way. I would recommend the Python version, but he is also working on a C version. Another great contribution is How to Think Like a Computer Scientist, another book that focuses on Python.

I guess I could best help if I knew what your goals and intentions are. If you want to learn the basics, you can't go wrong with installing a virtual machine with some simple virtual hardware and coding at the hardware level. You could even go so far as to build a computer from individual components connected in a specific circuit and program the hardware itself. If you want to learn the more modern, abstract methods, I would strongly suggest Python, C#, or Java. There are many good books on each subject.

u/akame_21 · 5 pointsr/learnprogramming

Despite their age, the MIT lectures were great. If you're good at math and enjoy proofs this is the class for you. Same thing with the CLRS book. One of the best books on DS & Algos out there, but it's so dense it'll make your eyes glaze over, unless you love proofs and highly technical reading.

To get your feet wet, Grokking Algorithms is a good book.

A lot of people recommend Princeton's Algorithm Course. I took Algorithms in school already, but I'm probably going to take this course to round out my knowledge.

EDIT: special shout out to Geeks for Geeks. Great website.

u/AlSweigart · 1 pointr/learnprogramming

Find out what language the class is using. If it is Java, then I've heard that Thinking in Java is a good book (the 3rd edition is free while the 4th is not).

Here are some books you should read, the sooner the better:

Code: The Hidden Language of Computer Hardware and Software - This is a great book that is technical while being a fairly light read. It answers the question "how do computers physically work?" with just enough theory to understand, and it doesn't have, like, metaphors of pictures of trains carrying 1s and 0s around.

Don't Make Me Think and User Interface Design for Programmers are also light-but-still-technical books on user interface design, which give a good idea of how to create software that people actually use.

At some point after you've done some coding, read these books:

Code Complete - A great tome on the non-programming aspects of being a good software engineer.

Pragmatic Programmer is a technical book, and probably a good one to read towards the end of your first year.

Also note: You don't need to know a lot of math to program. High school level algebra and basic arithmetic are fine for the most part.

u/samort7 · 257 pointsr/learnprogramming

Here's my list of the classics:

General Computing

u/Flofinator · 0 pointsr/learnprogramming

Yikes! Well it's going to be pretty hard for you to really understand how to write Python without actually coding in it.

The one thing you could do though is get a book with examples and write them down and try to modify the examples to do something a little extra while at work.

I find the http://www.headfirstlabs.com/books/hfpython/ books the absolute best books for almost anything if you are just starting out. The Java book is especially fun!

I know this isn't exactly what you are asking but it might be a good resource for you to start using.

Another great book that will teach you parts of the theory, and has really good examples on how computers work is http://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2/ref=sr_1_1?s=digital-text&ie=UTF8&qid=1457746705&sr=1-1&keywords=code+charles+petzold .

That really helped me think about computers in a more intuitive way when I was first starting. It goes through the history, down to what an adder is, and more. I highly recommend that book if you want to understand how computers work.

u/Pally321 · 33 pointsr/mildlyinteresting

If you're serious about getting into software development, I'd recommend you start looking into data structures and algorithms as well. It's something I think a lot of people who were self-taught tend to miss because it's not required knowledge to program, but it will give you a huge competitive advantage.

While I haven't read it, this book seems like a good introduction to the concept: https://smile.amazon.com/dp/1617292230/?coliid=I34MEOIX2VL8U8&colid=MEZKMZI215ZL&psc=0

From there I'd recommend looking at MIT's Intro to Algorithms, 3rd Edition. A bit more advanced, but the topics in there will play a huge role in getting a job in software.

u/chakke_ooch · 4 pointsr/mbti

> Would you say there's more opportunity working exclusively front end and design to exercise nfp creativity or novelty?

NFP creativity and novelty in the sense that Ne has free range, period? Sure, you get more of that in web design and even more of that as you step further and further away from the sciences. There is tons of creativity in real software engineering where you can be creative to solve actually challenging problems, not figuring out what color you'd like a button to be. To me, that's not creativity – or it's a lesser version. Creativity in problem solving is much more interesting. The way I see it is like when I was in music school and all the SFs were bitching about music theory and how they thought it limited their ability to "be creative". Such bullshit. It only exposes their lack of creativity. So you're saying that someone like Chopin, who wrote amazing pieces and abided by the rules of music theory, wasn't being creative? Hardly.

> Are you a web dev?

No, I'm a software engineer at an astrodynamics company; I do a lot of orbital mechanics, back-end work with web services, high performance computing, etc.

> By hardcore I meant requiring being meticulous, detail oriented.

I think that the lack of attention to detail is never permissible in either back-end software engineering or front-end web development, honestly.

> One thing I've realized is how shit my high school was at explaining math conceptually. Which I think lead to misconceptions about its use in programming

Well, then read some books on computer science and/or mathematics like this.

u/CSMastermind · 1 pointr/learnprogramming

Senior Level Software Engineer Reading List


Read This First


  1. Mastery: The Keys to Success and Long-Term Fulfillment

Fundamentals


  2. Patterns of Enterprise Application Architecture
  3. Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions
  4. Enterprise Patterns and MDA: Building Better Software with Archetype Patterns and UML
  5. Systemantics: How Systems Work and Especially How They Fail
  6. Rework
  7. Writing Secure Code
  8. Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries

Development Theory


  9. Growing Object-Oriented Software, Guided by Tests
  10. Object-Oriented Analysis and Design with Applications
  11. Introduction to Functional Programming
  12. Design Concepts in Programming Languages
  13. Code Reading: The Open Source Perspective
  14. Modern Operating Systems
  15. Extreme Programming Explained: Embrace Change
  16. The Elements of Computing Systems: Building a Modern Computer from First Principles
  17. Code: The Hidden Language of Computer Hardware and Software

Philosophy of Programming


  18. Making Software: What Really Works, and Why We Believe It
  19. Beautiful Code: Leading Programmers Explain How They Think
  20. The Elements of Programming Style
  21. A Discipline of Programming
  22. The Practice of Programming
  23. Computer Systems: A Programmer's Perspective
  24. Object Thinking
  25. How to Solve It by Computer
  26. 97 Things Every Programmer Should Know: Collective Wisdom from the Experts

Mentality


  27. Hackers and Painters: Big Ideas from the Computer Age
  28. The Intentional Stance
  29. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine
  30. The Back of the Napkin: Solving Problems and Selling Ideas with Pictures
  31. The Timeless Way of Building
  32. The Soul Of A New Machine
  33. WIZARDRY COMPILED
  34. YOUTH
  35. Understanding Comics: The Invisible Art

Software Engineering Skill Sets


  36. Software Tools
  37. UML Distilled: A Brief Guide to the Standard Object Modeling Language
  38. Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  39. Practical Parallel Programming
  40. Past, Present, Parallel: A Survey of Available Parallel Computer Systems
  41. Mastering Regular Expressions
  42. Compilers: Principles, Techniques, and Tools
  43. Computer Graphics: Principles and Practice in C
  44. Michael Abrash's Graphics Programming Black Book
  45. The Art of Deception: Controlling the Human Element of Security
  46. SOA in Practice: The Art of Distributed System Design
  47. Data Mining: Practical Machine Learning Tools and Techniques
  48. Data Crunching: Solve Everyday Problems Using Java, Python, and more.

Design


  49. The Psychology Of Everyday Things
  50. About Face 3: The Essentials of Interaction Design
  51. Design for Hackers: Reverse Engineering Beauty
  52. The Non-Designer's Design Book

History


  53. Micro-ISV: From Vision to Reality
  54. Death March
  55. Showstopper! the Breakneck Race to Create Windows NT and the Next Generation at Microsoft
  56. The PayPal Wars: Battles with eBay, the Media, the Mafia, and the Rest of Planet Earth
  57. The Business of Software: What Every Manager, Programmer, and Entrepreneur Must Know to Thrive and Survive in Good Times and Bad
  58. In the Beginning...was the Command Line

Specialist Skills


  59. The Art of UNIX Programming
  60. Advanced Programming in the UNIX Environment
  61. Programming Windows
  62. Cocoa Programming for Mac OS X
  63. Starting Forth: An Introduction to the Forth Language and Operating System for Beginners and Professionals
  64. lex & yacc
  65. The TCP/IP Guide: A Comprehensive, Illustrated Internet Protocols Reference
  66. C Programming Language
  67. No Bugs!: Delivering Error Free Code in C and C++
  68. Modern C++ Design: Generic Programming and Design Patterns Applied
  69. Agile Principles, Patterns, and Practices in C#
  70. Pragmatic Unit Testing in C# with NUnit

DevOps Reading List


  71. Time Management for System Administrators: Stop Working Late and Start Working Smart
  72. The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services
  73. The Practice of System and Network Administration: DevOps and other Best Practices for Enterprise IT
  74. Effective DevOps: Building a Culture of Collaboration, Affinity, and Tooling at Scale
  75. DevOps: A Software Architect's Perspective
  76. The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations
  77. Site Reliability Engineering: How Google Runs Production Systems
  78. Cloud Native Java: Designing Resilient Systems with Spring Boot, Spring Cloud, and Cloud Foundry
  79. Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation
  80. Migrating Large-Scale Services to the Cloud

u/TomCoughlinHotSeat · 0 pointsr/learnprogramming

Sipser's book is basically free on Amazon if you buy used old editions. http://www.amazon.com/gp/aw/ol/053494728X/ref=olp_tab_used?ie=UTF8&condition=used

It basically just asks what restricted types of computers can do. Like what happens if you have a program but only a finite amount of memory, or if you have infinite memory but it's all stored in a stack. Or if you have infinite memory with random access.

Turns out lots of models are equal and lots are different, and you can prove this. Also, these models inspire and capture lots of your favorite programming tools, like regexes (= DFAs) and parser generators (= restricted PDAs) for your favorite programming languages.
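To give a taste of what "regex (= DFA)" means, here's a minimal Python sketch (the state names are arbitrary): a two-state DFA that accepts exactly the binary strings containing an even number of 1s, the same kind of machine Sipser starts with.

    # Transition table: (current state, input symbol) -> next state
    TRANSITIONS = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd",  "0"): "odd",  ("odd",  "1"): "even",
    }

    def accepts(s):
        state = "even"                        # start state
        for ch in s:
            state = TRANSITIONS[(state, ch)]
        return state == "even"                # the only accepting state

    print(accepts("1100"))  # True  - two 1s
    print(accepts("1101"))  # False - three 1s

An equivalent regular expression exists too; that equivalence is one of the first theorems the book proves.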

u/kecupochren · 1 pointr/computerscience

What’s your background?

Going by some curriculum sounds like a sure way to learn it all properly and in depth, though also an easy way to get bored.

The beauty of being self-taught is that you can learn the areas that actually interest you. There are of course fundamentals that you need, for which I recommend the vastly popular and very high quality free course from Harvard - CS50x.

Taking it will give you a solid foundation to learn whatever you want by yourself. Whether it’s backend, webdev, mobile apps, data science... maybe even games/graphics, but for those you need deep math knowledge.

Sure there will be gaps here and there but CS50 really does a great job at teaching you where to look. I myself took it 6+ years ago and it was the perfect gateway into this career.

On another note, get this book - Code. It takes you from Morse/Braille code through logic gates all the way up to understanding everything (from a hardware/logic point of view) about a basic computer.

u/bonesingyre · 5 pointsr/webdev

Sure! There is a lot of math involved in the WHY component of Computer Science; for the basics, it's Discrete Mathematics, so any introduction to that will help as well.
http://www.amazon.com/Discrete-Mathematics-Applications-Susanna-Epp/dp/0495391328/ref=sr_sp-atf_title_1_1?s=books&ie=UTF8&qid=1368125024&sr=1-1&keywords=discrete+mathematics

This next book is a great theoretical overview of CS as well.
http://mitpress.mit.edu/sicp/full-text/book/book.html

That's a great book on computer programming, complexity, data types etc... If you want to get into more detail, check out: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/0534950973

I would also look at Coursera.org's Algorithms lectures by Robert Sedgewick; that's essential learning for any computer science student.
His textbook: http://www.amazon.com/Algorithms-4th-Robert-Sedgewick/dp/032157351X/ref=sr_sp-atf_title_1_1?s=books&ie=UTF8&qid=1368124871&sr=1-1&keywords=Algorithms

another Algorithms textbook bible: http://www.amazon.com/Introduction-Algorithms-Thomas-H-Cormen/dp/0262033844/ref=sr_sp-atf_title_1_2?s=books&ie=UTF8&qid=1368124871&sr=1-2&keywords=Algorithms




I'm just like you as well: I'm pivoting. I graduated law school in 2012 specializing in technology law and patents, but I love comp sci too much, so I went back to school for Comp Sci, jumped into the tech field, and got a job at a tech company.

These books are theoretical, and they help you understand why you should use x versus y; those kinds of things are essential, especially on larger applications (like Google's PageRank algorithm). Once you know the theoretical info, applying it is just a matter of picking the right tool, like Ruby on Rails, or .NET, Java etc...

u/tryx · 3 pointsr/math

I would recommend (and I find that I recommend this book about every 3rd thread which should say something) a book on theoretical computer science. The book is all about the beautiful mathematics that underlie all of computing.

Computers keep getting faster and faster, but are there any questions that we can never answer with a computer, no matter how fast? Are there different types of computers? Can they answer different types of questions?

What about how long it takes to answer a question? Are some questions fundamentally harder than others? Can we classify different problems by how hard they are to solve? Is it always harder to find a solution to a problem than to check that a solution is correct? (This is the gist of the famous P=NP problem.)

The book has essentially no prerequisites and will take you through (initially) basic set theory, moving on to logic and then the fun stuff. Every proof is incredibly clearly written, with a plain-English introduction of what the author is about to do before the technical bits.

The only downsides are that it is quite expensive and you might have trouble finding it unless you have access to a university library/bookshop.

Good luck with your love of mathematics!

Edit: lol the book... Introduction to the Theory of Computation - Sipser

u/forzrin · 2 pointsr/technology

It was a few years ago, but lots of entire series of lectures are online. I got through most of the "core" CS concepts in ~2 years of intense academic learning alongside actually writing code (and releasing some stuff to small groups or to app stores). I picked straightforward, small projects with a specific academic challenge, like a game with a simple concept that needs pathfinding algorithms (and implemented them myself instead of using a framework).

e.g. Data Structures (YouTube)

You can also find series of lectures like the above on algorithms. Then do basic research on what the most common, industry-standard textbooks for these topics are, like Introduction to Algorithms (Amazon link), and buy them or download PDFs or whatever.

The important thing is to actually do the work, suggested tasks/projects, etc. Personal accountability is the driver here.

Then there are one-off books like Code: Hidden Language (Amazon link) that explore specific topics or walk you through certain ideas and concepts at a kind of introductory level. If you find the topic interesting or it is important for your work, it's a good starting point to learn about the lowest-level stuff.

u/extra_specticles · 1 pointr/AskMenOver30

Before you commit to it, read Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. If it fires your imagination, then computer programming may be for you.

Another one to read is Soul of a New Machine by Tracy Kidder, which is much older but easily readable by non-coders. Again, if it fires your imagination then coding might be for you.

CS can lead to many many careers - many more than when I did my degree (80s), but you need to understand where the world of computers is moving to and where you want to be in that space.

If you're just looking for more money, then perhaps you shouldn't be looking at coding as a panacea. Don't get me wrong, coding is a fantastic thing to do - if it floats your boat. However, its main problem is that you constantly have to keep yourself up to date with new technologies and techniques. This requires you to have the passion and self-motivation to do that training.

I've been coding since I was 11 (1978) and have seen many, many aspects of the industry and the trade. I will concur with some of the comments here that indicate that the degree itself isn't the answer, but it could be part of it.

Whatever you decide - good luck!


u/tech-ninja · 6 pointsr/ProgrammerHumor

Depends what you want to learn. Some of my favorites are

  • Code by Charles Petzold if you want to know how your computer works under the hood.

  • Peopleware if you want to learn how to manage knowledge workers.

  • Clean Code by Uncle Bob if you want to learn about good practices and program structure. Impressive content, covers much more than I expected.

  • Don't Make Me Think if you want to learn about usability.

  • Algorithms by Robert Sedgewick if you want to learn about DS & algorithms.

  • The Art of UNIX Programming by Eric S. Raymond if you want to learn about the unix philosophy. Lots of hidden gems in there. Have you ever heard "write programs that do one thing and do it well" or "don't tune for speed until you've measured"? Imagine all this knowledge distilled for you in one book.

This is a good list to get you started :) Most of my favorite books are not language-specific.

u/binary_search_tree · 2 pointsr/movies

Actually, I was just reading a great book called Code: The Hidden Language of Computer Hardware and Software, in which the reader builds an imaginary computer composed of nothing but telegraph relays and light bulbs - all tech from the 1800s.

And Morse code is actually very interesting. It's a very clever way to encode data, and quite an efficient means of communication, especially if you happen to live in the 1800s. It's touched upon in the book.

So yeah, I suppose I am fascinated by those things.

u/MelAlton · 1 pointr/cscareerquestions

Ah those are both excellent suggestions!

Code is a great book - if a newcomer to programming reads Code and doesn't find it interesting, that's not a good sign, as all the concepts in the book are vital and presented very well. Link: Code by Charles Petzold

And a medical science background is a great lead-in to bioinformatics - in that area OP would have a leg up on straight-CS programmers. And bioinformatics uses a lot of Python, which is pretty easy to learn.

u/nbksndf · 6 pointsr/haskell

Category theory is not easy to get into, and you have to learn quite a bit and use it for stuff in order to retain a decent understanding.

The best book for an introduction I have read is:

Algebra (http://www.amazon.com/Algebra-Chelsea-Publishing-Saunders-Lane/dp/0821816462/ref=sr_1_1?ie=UTF8&qid=1453926037&sr=8-1&keywords=algebra+maclane)

For more advanced stuff, and to secure the understanding better I recommend this book:

Topoi - The Categorial Analysis of Logic (http://www.amazon.com/Topoi-Categorial-Analysis-Logic-Mathematics/dp/0486450260/ref=sr_1_1?ie=UTF8&qid=1453926180&sr=8-1&keywords=topoi)

Both of these books build up from the basics, but a basic understanding of set theory, category theory, and logic is recommended for the second book.

For type theory and lambda calculus I have found the following book to be the best:

Type Theory and Formal Proof - An Introduction (http://www.amazon.com/Type-Theory-Formal-Proof-Introduction/dp/110703650X/ref=sr_1_2?ie=UTF8&qid=1453926270&sr=8-2&keywords=type+theory)

The first half of the book goes over lambda calculus, the fundamentals of type theory and the lambda cube. This is a great introduction because it doesn't go deep into proofs or implementation details.

u/CSHunter33 · 3 pointsr/cscareerquestions

Congrats! I'm on such a programme at the University of Bath in the UK right now. Bath's programme teaches C and Java, with a little Python in some elective modules. There are also theory of computation modules with a bunch of discrete maths in them.


If you let me know which specific course you're attending, what your undergrad was, and how technical a career you might want I can give more tailored advice.


I did the following for prep, after researching what modules I'd be taking:

  • MITx's Intro to CS MOOC (amazing)
  • Read and did all exercises from several relevant chapters from a "Discrete Maths for Computing" textbook I got second hand for a fiver (helped a lot in a maths-heavy module)
  • read the oft-recommended book Code (useful for awareness, but not essential, especially since we do no Comp Arch at Bath)
  • did some algorithm challenges at places like leetcode.com and firecode.io once I had done the Intro to CS MOOC


Conversion masters are generally very intense, so doing prep now and over summer is a great idea. The stuff I listed helped immensely and I would do the same again - perhaps I would switch the MITx MOOC to Harvard's CS50 instead, since CS50 has a bigger spread of languages. If I had had more time on my hands, proceeding from here to do a Java MOOC would have been really useful also.

Working on the Leetcode/Firecode challenges helped a lot for general programming practice, and will also be helpful prep for job hunting later.

u/autophage · 4 pointsr/IWantToLearn

Lots of people are recommending ways to learn a language, and I just want to pop in to say: most popular languages are much more alike than they are different. Learn any one of them for a few months (until you're no longer looking up references for how to write a for loop or getting confused by the language's comparison operators), then try your hand at a different language.

If you find something hard to grasp in one language, it's probably about equally hard to grasp in another language - so don't just think "Hmm, well, maybe this is easier in other-language" and switch over to that one instead. (There are a few exceptions - for example, you don't have to worry about memory management in Java the same way that you do in C. You can still get memory leaks in Java, but the fact that you've got garbage collection makes memory management on the whole far simpler.)

In terms of getting into hacking - the first step, hands down, is to read this book. It will teach you the really really basic stuff, on a far deeper level than most laymen ever think about, in a very gentle and even fun way. After that, start getting your hands on networking texts, security texts, and just plain writing a lot of code. Get the source to some popular open source projects (Apache, for example) and run it in a debugger, watching how the values change and looking for unexpected things.

u/awj · 4 pointsr/programming

It may be a bit of a tough slog, but Sipser's Introduction to the Theory of Computation is great. The stuff on computability theory might be right up your alley, and even if you only make it through the chapter on deterministic finite automata you will likely be better at crafting a regular expression than many of my CS student peers.

Surprisingly enough, the book should be able to help you make sense of that last sentence within 100 pages, requiring only a bit of understanding of proofs. I think if you've got predicate logic under your belt you pretty much have all you need.

u/Mydrax · 2 pointsr/learnprogramming

Inside the Machine visually illustrates concepts within a computer system so beautifully that it will make you cry.
Also, TeachYourselfCS - not really a book, but a curated list of videos and books that will help you grasp various areas of CS.

Grokking Algorithms is not a heavy CS textbook, but it's a really good book on algorithms, with funny illustrations that help you through.

The other books that have been mentioned, like Clean Code, are a must-read!

u/deiphiz · 7 pointsr/learnprogramming

Okay, I'm gonna plug this in here. I hope this doesn't get buried because when I saw someone asking about low level stuff here, I couldn't help but make this recommendation.

Anyone who wants to learn more about low-level computer stuff, such as assembly code, should read the book Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. I've been reading it recently and I'm really glad I picked it up. It really delves into how a computer actually works, like how languages relate to the circuits on a board.

So yeah, /u/DEVi4TION, I recommend picking this up if you wanna know about more stuff like this. Then maybe try your hand at actual 6502 programming later :P

u/theinternetftw · 2 pointsr/askscience

The Turing machine answer is a fantastic theoretical one, but if you want to see a practical answer for "how do you build a computer (like most people would think of a computer) from scratch", which seems to be what you were looking for when you wrote this:

> What is going on at the lowest level? How are top-level instructions translated into zeroes and ones, and how does that make the computer perform an action?

...then this book is a fantastic, down-to-earth, extremely approachable first read for such things (and designed such that you don't need *any* prior knowledge to start reading it).

Seriously, if you want to dive a little bit deeper, I highly recommend it.


edit: seems someone already recommended Code. Still, can't give it enough praise. Or The Elements of Computing Systems (TECS), which is a (only *slightly*) more technical read designed around building everything that a computer "is", piece by piece...

Edit2: And as for "what's going on with the Minecraft ALU", TECS is a good read there as well, since the machine described in that book is what I based the ALU on. Also, the fact that Minecraft can simulate logic gates is what links the "real world" and the "Minecraft world" together, because logic gates are all you need to build any computer (that's how Minecraft can let you build Turing-complete devices).

u/chalcidfly · 1 pointr/programmer

As always, read the docs first: [Code by Charles Petzold](https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319). At least give it a skim. Great book, worth the $20 for sure. That'll give you a really, really good intellectual basis for how computer hardware works.

Then, you could try getting a board like the Raspberry Pi (a small single-board computer) or an Arduino (a microcontroller) or one of the millions of other options. Don't just get the board though; if you can afford it, get as many of the add-ons and books as you can find and try to build something simple, like a computer. (Not too difficult with a Raspberry Pi.)

Or, if you want to just manipulate some LED stuff, you can buy some little LED lights and plug them straight into your microcontroller.

Whatever you do, it helps to document it through a blog or Facebook or Twitter or whatever; it'll keep you accountable.

u/FetusFeast · 1 pointr/books

let's see...

u/Shadowsoal · 11 pointsr/compsci

In the theoretical field of complexity...

The 1979 version of Introduction to Automata Theory, Languages, and Computation by Hopcroft & Ullman is fantastic and used to be the canonical book on theoretical computer science. Unfortunately the newer versions are too dumbed down, but the old version is still worth it! These days Introduction to the Theory of Computation by Sipser is considered to be the canonical theoretical computer science text. It's also good, and a better "introduction" than H&U. That said, I prefer H&U and recommend it to anyone who's interested in more than getting through their complexity class and forgetting everything.

In the theoretical field of algorithms...

Introduction to Algorithms by Cormen, Leiserson, Rivest and Stein is dynamite, pretty much everything you need to know. Unfortunately it's a bit long-winded and is not very instructive. For a more instructive take on algorithms, take a look at Algorithms by Dasgupta, Papadimitriou and Vazirani.

u/HeterosexualMail · 187 pointsr/programming

We did something similar as well. The labs were tons of fun. I remember having to run a couple dozen lines of code through the CPU cache on a test once, including some sneakery of using code as data at one point. I do appreciate having done it, but I'm not sure how much practical lasting value that really contributed.

That said, for those who are interested in this there is The Elements of Computing Systems: Building a Modern Computer from First Principles, more commonly known as "NAND to Tetris".

Petzold's Code is excellent as well.

Edit: Actually, while I've suggested those two, let me throw Computer Systems: A Programmer's Perspective into the mix. It's a book we used across two courses and I really enjoyed it. We used the 2nd edition (and I have no issue recommending people get a cheaper, used copy of that), but there is a 3rd edition now. Being a proper textbook it's stupidly priced (you can get Knuth's 4-book box set for $30 more), but it's a good book.

Anyone have suggestions similar to that Computer Systems text? I've always wanted to revisit/re-read it, but could always use a different perspective.

u/icelandica · 1 pointr/math

Work hard and you'll get there. I preferred the applied side of things, but if I had just stuck with pure math I think I would have eventually gotten a tenure-track position on the mathematics side of things.

My favorite book to this day for a beginner's course in computational complexity is still Michael Sipser's Introduction to the Theory of Computation; I highly recommend it. It might be a little too easy for you if you already have a base; let me know and I'll recommend more advanced books.

Here is a link to the book on Amazon, although any big college library should have it; if not, just have them order it for you. I've gotten my college's library to buy so many books that I wanted to read but not spend money on. You'd be surprised at how responsive they are to purchasing requests from PhD candidates.

u/ogrisel · 3 pointsr/MachineLearning

Of course R is used for machine learning. It's probably the most popular language for interactive exploratory and predictive analytics right now. For instance, most winners of kaggle.com machine learning competitions use R at one point or another (e.g. packages such as randomForest, gbm, glmnet and of course ggplot2). There is also a recent book specifically teaching how to use R for Machine Learning: Machine Learning for Hackers.

Myself, I am more of a Python fan, so I would recommend Python + numpy + scipy + scikit-learn + pandas (for data massaging and plotting).
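To give a feel for that stack, here is a minimal sketch using today's scikit-learn API (the dataset and model choice are arbitrary, just for illustration):

    # Train and score a random forest in a few lines of Python.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)   # a small built-in dataset
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)          # learn from the training split
    print(model.score(X_test, y_test))   # accuracy on held-out data

The fit/predict/score pattern is the same across nearly every estimator in the library, which is a big part of its appeal.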

Java is not bad either (e.g. using mahout or weka, or more specialized libraries like libsvm / liblinear for SVMs and OpenNLP / Stanford NLP for NLP).

I find working in C directly a bit tedious (esp. for data preparation and interactive analysis), hence it's better to use it in combination with a scripting language that has good support for writing C bindings.

u/ceol_ · 2 pointsr/TheoryOfReddit

Fantastic! I've gotta be honest, though: you're not going to learn a lot of "programming"; you're going to learn a lot of computer science. That's not a bad thing. You'll learn things like sorting algorithms, complexity, and discrete math.

A great language to start out with for this kind of thing is Python. Read Dive Into Python and Think Python to get you started. If you're having trouble wrapping your head around some concepts, I'd suggest Code: The Hidden Language. It's a great introduction to how computers work which should give you a bit of a kick into development.

Here's a quick example of using reddit's API to grab the last comment someone posted using Python:

    import urllib2   # Python 2 standard library (urllib.request in Python 3)
    import json

    # Public JSON feed of a user's most recent comments
    url = 'http://www.reddit.com/user/ceol_/comments.json'

    request = urllib2.Request(url)        # build the HTTP request
    resource = urllib2.urlopen(request)   # send it
    content = resource.read()             # raw JSON text
    decoded = json.loads(content)         # parse into Python dicts/lists

    # Drill down to the body of the newest comment
    print decoded['data']['children'][0]['data']['body']

You can fool around with the reddit API here and see what it returns in a nice hierarchy.

Hope this helps!

u/abstractifier · 22 pointsr/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/IamABot_v01 · 1 pointr/AMAAggregator


Autogenerated.

Science AMA Series: I’m Tony Hey, chief data scientist at the UK STFC. I worked with Richard Feynman and edited a book about Feynman and computing. Let’s talk about Feynman on what would have been his 100th birthday. AMA!

Hi! I’m Tony Hey, the chief data scientist at the Science and Technology Facilities Council in the UK and a former vice president at Microsoft. I received a doctorate in particle physics from the University of Oxford before moving into computer science, where I studied parallel computing and Big Data for science. The folks at Physics Today magazine asked me to come chat about Richard Feynman, who would have turned 100 years old today. Feynman earned a share of the 1965 Nobel Prize in Physics for his work in quantum electrodynamics and was famous for his accessible lectures and insatiable curiosity. I first met Feynman in 1970 when I began a postdoctoral research job in theoretical particle physics at Caltech. Years later I edited a book about Feynman’s lectures on computation; check out my TEDx talk on Feynman’s contributions to computing.



I’m excited to talk about Feynman’s many accomplishments in particle physics and computing and to share stories about Feynman and the exciting atmosphere at Caltech in the early 1970s. Also feel free to ask me about my career path and computer science work! I’ll be online today at 1pm EDT to answer your questions.


u/Wittgenstienwasright · 1 pointr/computerscience

I am sorry, but that is simply not true. The admins on this site are quite careful about such things. Without the age of the child it is impossible to provide a suitable answer. At eleven, possibly, as I recommended in my DM, perhaps an RPi and surrounding projects as shown on the site; at thirteen, then Python, and later on as experience dictates. Please do not berate a fellow father trying to educate people, especially for free. Good luck, and before your teenager tries anything new, perhaps you can find this physical tome.

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319



The first chapter is about communication.

u/intertroll · 5 pointsr/compsci

If you don’t want to read an actual textbook, this one will do the job (without skimping on details) and is more laypeople-friendly:
https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

Just as an aside, I had a non-techy friend who had a similar sense of mystification to OP’s but really wanted to understand computers better, so I pointed him at this book. The next time I saw him, we had a great conversation about logic gates and data representation. So it works! It was actually almost a cathartic experience, taking a person who doesn’t get it to someone who does, since as a developer you often have to deal with users who don’t know and don’t care.

u/n06 · 1 pointr/ECE

This isn't a textbook, but it is one of my favorite down-to-earth books about computing and electronics. My physical computing teacher gave it to me as a gift. It is called Code and it's very cool, and pretty cheap. It's well written too.

u/fossuser · 2 pointsr/compsci

If you're looking for a detailed, but approachable overview I'd highly recommend picking up this book: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

I did CS, and this book does a great job explaining how it all works from the ground up, as well as how it was figured out. Putting everything in a historical context makes things a lot easier to understand and remember. It doesn't suffer from the common high-level explanations that are useless for understanding how something really works.

It's also just a fun read.

u/bladekill97 · 1 pointr/learnprogramming

I put C++ in a similar boat to C; both are good languages to start with.

Also, HTML is not a programming language; rather, it is a markup language used for structuring web sites.

If you are going with C++ I would suggest:

u/Amablue · 1 pointr/askscience

If you want a very good overview of how computers work, you should try reading the book Code. It starts off talking about codes, like Morse Code and Binary. Then it moves on to light switches and batteries, and other neat constructions you can make with switches and relays, then it shows you how to build a simple adder. By the end of the book the author has basically given you an overview of how computers work from the logic gates all the way up to the processors and operating systems. It's a really good book, and each chapter flows pretty well to the next and it explains things in ways that are easy to understand.

u/Lhopital_rules · 2 pointsr/IWantToLearn

Possibly one of these? They were the only books about coding/computers with the name Charles that I could find. I'm guessing you're talking about the first one. It looks like a more popular version of things, but probably all still new stuff for me, so I'll check it out!

EDIT: The second one looks really promising too. Thanks for the suggestion!

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold

Fundamentals of Logic Design by Charles H. Roth, Jr. & Larry L. Kinney

u/twopoint718 · 1 pointr/compsci

A really interesting book that would complement (or be) a course in computer architecture is "The Feynman Lectures on Computation" http://www.amazon.com/Feynman-Lectures-Computation-Richard-P/dp/0738202967 This is a really fascinating book that explains computers from basic physics up to a useful machine that does work. It also has the virtue of being written by Feynman, someone with an amazing ability to explain things!

u/myrrlyn · 8 pointsr/learnprogramming

https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

This book is an excellent primer for a bottom-up look into how computers as machines function.

https://www.amazon.com/gp/aw/d/0123944244/ref=ya_aw_od_pi

This is my textbook from the class where we built a CPU. I greatly enjoy it, and it also starts at the bottom and works up excellently.

For OS development, I am following Philipp Opperman's excellent blog series on writing a simple OS in Rust, at http://os.phil-opp.com/

And as always Wikipedia walks and Reddit meanders fill in the gaps lol.

u/DrAmbulanceDriver · 5 pointsr/learnprogramming

I'm assuming you just want to learn the basic information about how computers work and the principles behind programming them, right?

In that case, I'd recommend Code by Charles Petzold

Are you looking to actually learn how to program and write code in a specific language? If so, then I'd recommend Automate the Boring Stuff with Python by Al Sweigart. It covers the basic principles of writing functions and how computer logic works, and you'll actually be able to apply it to some practical uses. And since it's Python, it'll run on a lot of different platforms. If you like it, you may want to get into working with the Raspberry Pi. Javascript is another good language to start with, but as a book, I really like this one.

If you already know a bit about programming, and just want a general reference book, then Computer Science Illuminated by Dale and Lewis is pretty good.

u/Tibio · 2 pointsr/learnprogramming

I found this book to be a great overview from the ground up. I have the physical copy, but if you want to get it for free, you know how...

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It's a fairly easy read except for the middle 40% about logic gates. Prepare to read through that part multiple times if you want to really understand it. But as you said, that would be unnecessary, just for fun. I read it for the same purpose you did: wanting some foundation.

u/bjzaba · 7 pointsr/ProgrammingLanguages

I thought the extreme use of comic sans on this site was pretty amusing. :D

The Lambda Cube is pretty cool - http://www.rbjones.com/rbjpub/logic/cl/tlc001.htm - I'm not sure the page does the best job at explaining it though. The first time I saw a good explanation was in Type Theory and Formal Proof: An Introduction.

u/Kal5 · 1 pointr/ireland

If you're unemployed for over a year, there are two state-provided education schemes called Springboard and Momentum whose courses you can apply for. By my understanding, Springboard courses are put on by colleges/universities while Momentum courses are put on by private training companies (not sure about that, though, as I think some colleges are putting out courses under Momentum). Most last about 9 months, while some are 6 months.

If you are interested in computer science, I recommend you take a course which teaches you a programming language on its own before you take any courses that promise the moon, sun and stars. Those usually try to cram too much in, so you come out with a shite knowledge of everything.

Get learning a programming language out of the way and then everything becomes easier. Your choices are usually either C# or Java, two object-oriented languages. C# is Microsoft's copy of Java and is used to make stuff that runs on their architecture. Java is used in lots of places as it is fairly free to use; Android apps are made with it. If you know one, you know the other; there are minor differences. Basically, if you know the concepts of any object-oriented language, you can learn any of them very easily.

CCT used to put out a really excellent Java-only course but I don't see it there now. I only see this one, which seems like an all-encompassing one: http://www.springboardcourses.ie/details/3478

I really would suggest you write to them and ask them if they might put one on next September, as there are two start times for programs throughout the year, so you'll see different programs on offer depending on when you search.

Currently I see two Java courses on the Momentum scheme, both outside Dublin. I can't say how good they might be.

Searching for Java on Springboard only brings up these two courses; it looks like you would need to know Java before attempting either of those. But just have a look around those sites and search for computer-related terms, "programming" etc.

If you learned Java, you would be able to take this Android course on Momentum:
www.devstream.io/android-developer
(can't see that listed on the momentum site for some reason but it is a momentum course)

You'd still have to pick up some database knowledge somewhere, but a book on MySQL would be easy enough to understand and work through yourself. If you learned Java, then databases in your own time, and then did the Android course, you could become an Android dev in about 2 years. Of course you'd be missing a lot of knowledge about how computers work, which degree courses don't teach well anyway, so to that end, just read this book:

http://www.amazon.co.uk/Code-Language-Computer-Hardware-Software/dp/0735611319/

u/Kerbobotat · 2 pointsr/videos

I highly, highly recommend [CODE - The hidden language of Hardware and Software](https://www.amazon.com/dp/0735611319/) (amazon link, non-referral)

This book explains how computers work from the fundamentals: communicating with flashlights, Morse code, tin cans on string, all the way up to modern computer hardware. It's really a great read.

I hope this helps a little bit on the way to understanding stuff. If you have any others you'd like to share as well, let me know!

u/dm0x48 · 1 pointr/Unity3D

Unity has already been well covered here by other redditors.

Nevertheless, if you are also in the process of learning how to program, I would like to contribute with two pointers.

Code: The Hidden Language of Computer Hardware and Software, by Charles Petzold

For the very basic of computer science

https://www.amazon.com/dp/0735611319

Inside C#, by Tom Archer and Andrew Whitechapel

This is quite old now but very easy to read and good for understanding the language

https://www.amazon.com/dp/0735616485

If you are not in the US, just google around for other sources.

Happy hacking

u/guiraldelli · 1 pointr/compsci

Excellent! I'm glad to know the concept is clear to you.

I would recommend using the Lewis & Papadimitriou book as well as the Sipser one: for me, the former is more formal than the latter, which is more didactic (especially for undergraduate students); however, both use simple language and are very didactic.

My advice is: take both books and keep studying from them. I learned theoretical computer science from the Lewis & Papadimitriou book, but whenever I couldn't get a concept, I went to Sipser. And vice versa.

Lastly, the 2012 (3rd) edition of Sipser is so beautiful, with good automata drawings for understanding pushdown automata. :)

u/mcandre · 1 pointr/cscareerquestions

I wouldn't try to read programming tutorials during commutes, as programming is best learned by trying out exercises in a real text editor / terminal as you follow along in the book. Only something like Petzold's Code (from Microsoft Press) would make for light commuter reading.

As for podcasts, The Verge is a fun tech podcast.

I've started listening to Welcome to Night Vale, a fun, nontech podcast.

u/jimschubert · 3 pointsr/csharp

I recommend starting by teaching some version control basics. Focus on git and quickly cover others like TFS and subversion. You can read Pro Git for free.

If you teach a hardware/software course, CODE is an excellent book.

I also recommend C# in Depth. I would also think it'd be cool to offer points for contributing to StackOverflow or the new open source .NET projects on GitHub.

If you teach design/analysis or other classes focused on architecture, Adaptive Code via C# is pretty good. I'm only a few chapters in, but it discusses Scrum methodology, layering and tiers, as well as how to follow practices for writing great code.

I would also suggest a course on JavaScript. I have had to train too many Junior and Senior developers on how to write JavaScript. It's scary that many web developers don't even understand fundamentals.

u/MicturitionSyncope · 1 pointr/MachineLearning

There have already been a few books listed focusing on theory, so I'll add Machine Learning for Hackers to the list.

It doesn't cover much of the theory, but it's a nice start to getting the programming skills you need for machine learning. When you start using these techniques on real data, you'll quickly see that it's almost never a simple task to go from messy data to results. You need to learn how to program to clean your data and get it into a usable form to do machine learning. A lot of people use Matlab, but since they're free I do all of my programming in R and Python. There are a lot of good libraries/packages for these languages that will enable you to do a lot of cool stuff.

u/ienjoybuckyballs · 1 pointr/lostgeneration

Programming is not for everyone. People like to talk about future saturation of programmers, but I don't believe it will happen, simply because the majority of people will never cut it as programmers, and of those that do, most will be terrible at it. It's not a slight on your brain power or intelligence to say you won't be a good programmer; it is simply an observation of the way your brain and mind work and the way you think. Most people just flat out do not think the way a programmer needs to think. It doesn't mean you can't give it a go, and it doesn't mean you'll fail, but it might mean you'll struggle to understand advanced concepts and may find yourself over your head and slow to adapt.

It's also important to make a distinction between a 'programmer' and a 'designer.' While there is plenty of work in web design, web design is not programming. Neither HTML nor CSS are programming languages. There is such a thing called web application development that involves HTML, CSS, and back end programming but this requires the knowledge of one or many programming languages such as PHP, Ruby, or C#.

For anyone interested in programming, I would recommend you read this book before you start trying to learn programming: http://www.amazon.com/gp/product/0735611319/ref=oh_details_o00_s00_i00?ie=UTF8&psc=1 If you find yourself lost or confused and cannot finish, programming is not for you. Again, that doesn't mean you're stupid or less intelligent than anyone else; it is simply an observation that you think differently. It is not an observation of inferiority, I can't stress that enough. If you can finish this book you will likely find success learning advanced programming concepts. Pick a language, any language, and start learning.

u/Haatveit88 · 1 pointr/learnprogramming

I understand how you feel, honestly - as someone who did poorly in school and is somewhat dyscalculic, I really feel like I can relate!

The important thing for you, in my opinion, based on your explanation there, is to look for learning materials that suit you. Some people learn easily from really academic materials; some (like me) don't - and fortunately, there are lots of materials out there trying different approaches to teaching this kind of stuff. It gets easier as you go, as well - once the ball starts rolling you find it much easier to grasp future concepts. I got a massive 1300-page book called "Introduction to Algorithms" many years ago... Introduction my arse. It might as well have been in an alien language to me. But now, years later, I can actually understand its contents. It definitely was not an introduction (but it is a great book, both physically and literally).

A few recommendations for actual introductory books on these subjects:

"A Common-Sense Guide to Data Structures and Algorithms" by Jay Wengrow (2nd Edition coming 2020)

This book says the following in the opening chapter:

>"The problem with most resources on these subjects is that they're...well...obtuse. Most texts go heavy on the math jargon, and if you're not a mathematician, it's really difficult to grasp" . . . "Because of this, many people shy away from these concepts, feeling like they are simply not 'smart' enough to understand them."

It's not a perfect book, but it goes into a lot of basic data structures and explains them in a not-insane way. It helped me a lot! Understanding not just how they work, but why they are useful, is so helpful.

"Grokking Algorithms: An illustrated guide for Programmers and other curious people" by Aditya Y. Bhargava

A similar book, though more algorithm- and less data-structure-focused, and it goes into somewhat more depth, although the extra material is usually considered optional. The author here expresses a similar concern that books and learning materials on these concepts are often very hard to understand, and it need not be that way!

You can learn these things, you just need to find the right book/method that works for you! It can take some searching to find it. I know from experience!

Read the books, try to implement some of their concepts, and then try applying those things to real problems (e.g. from HackerRank or similar sites; try more than just HR). Read the books again. Repeat. You will understand a bit more each time. That was what worked for me, at least.

u/yaymountainbiking · 1 pointr/learnprogramming

This, like all things, contains a varying amount of truthiness. I agree that beginners shouldn't use an IDE, simply because they need to learn how to build an application from the ground up. However, professionally my work definitely benefits from an IDE, especially with larger code bases or when I get stuck with something stupid like Java or Groovy. And sooner or later you WILL get stuck with something stupid like Java or Groovy.

Perhaps the best advice is to learn how a computer actually works. C, for example, is more representative of how a computer works than something like Javascript. Assembly, x86 or otherwise, is even better. Code by Charles Petzold is a good primer.

u/neop · 4 pointsr/compsci

I'm also a math major who turned into CS. There are already a lot of good recommendations here so I won't add much, but I suggest reading Code: The Hidden Language of Computer Hardware and Software by Charles Petzold.

It's not very technical and it's not in-depth, but I think it's an amazing book. You probably won't learn anything you're actually going to use by reading it, but I think this book has a unique ability to express the underlying facts that make us all find computer science so fascinating. It's a very fun read and it will give you a very broad overview of how computers work and how software gets compiled and ultimately ends up moving electrons around to make the magic happen.

u/Foryourconsideration · 1 pointr/programming

I found a great book at Chapters in Canada called Code. It's certainly very interesting, as it deals with all sorts of "code" from Morse to C# and looks at how they are related. A surprising break from tutorial books, which I also love.

u/cdubose · 1 pointr/IWantToLearn

There is actually a system to Braille. I read the first part of the book Code, and it does a great job explaining how someone might have first conceived of a system like Braille. For instance, notice the letters A though J. Then notice the letters K through T. The Braille patterns for K through T are the exact same as those for A through J but with the lower left dot filled in. Then notice the letters U through Z. With the exception of W, the last few letters of the Braille alphabet are like the first few letters, but with the bottom two dots filled in. (W doesn't match the pattern because W isn't part of nineteenth century French, the native language of Louis Braille).
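
That regularity is easy to see in code. Here's a tiny Python sketch (the dot-number table for A through J follows the standard Braille cell positions 1-6, but the snippet itself is my own illustration, not something from the book):

first_decade = {  # letter -> set of raised dots (standard Braille cell, dots 1-6)
    'a': {1}, 'b': {1, 2}, 'c': {1, 4}, 'd': {1, 4, 5}, 'e': {1, 5},
    'f': {1, 2, 4}, 'g': {1, 2, 4, 5}, 'h': {1, 2, 5}, 'i': {2, 4}, 'j': {2, 4, 5},
}

# K through T are just A through J with dot 3 (the lower-left dot) added
second_decade = {chr(ord(c) + 10): dots | {3} for c, dots in first_decade.items()}

print(sorted(second_decade['k']))  # [1, 3] -- 'a' plus the lower-left dot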

Knowing some contextual information like this will help you memorize and understand the Braille alphabet better. I would start by learning the numbers associated with the different dot positions and go from there. This page is a good introduction I think.

u/kirang89 · 2 pointsr/AskComputerScience

u/Spasnof · 17 pointsr/learnprogramming

Awesome book. Code really helps you understand things from a bottom-up perspective. Super approachable without a CS background, and it does not need a computer in front of you to appreciate. Highly recommended.

u/Bozo_The_Brown · 1 pointr/computerscience

Let's convert 65 to A:

01000001 // 65

00110000 // "A" bitmap 8x8 pixels
01111000
11001100
11001100
11111100
11001100
11001100
00000000

This can be accomplished in hardware with an EEPROM, similar to the way you implement the 7-segment display in Ben's series. But instead of 4 bits representing a number decoded to 7 bits for the display (4 to 7), this is 7/8 bits representing the char code decoded to 8x8 bits for the display (7/8 to 64).

You could have 8 EEPROMs that each represent a row of output. Feed the same 7/8-bit address (char code) into all of them, and connect the outputs to a grid of lights. This is just for one character.

For a more practical idea, maybe connect the char code output to an Arduino that does the decoding and renders on your computer monitor.

Conversion in software follows the same principle, just mapping the char codes to bitmaps. Not something that would be fun at the assembly level XD
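
In software, that decoding is just a table lookup from char code to bitmap rows. A minimal Python sketch reusing the 'A' glyph rows from above (the rendering part is my own illustration):

font = {
    65: [0b00110000, 0b01111000, 0b11001100, 0b11001100,
         0b11111100, 0b11001100, 0b11001100, 0b00000000],
}

def render(char_code):
    # walk each row of the 8x8 glyph, printing '#' for set bits
    for row in font[char_code]:
        print(''.join('#' if row & (1 << (7 - col)) else '.' for col in range(8)))

render(ord('A'))  # ord('A') == 65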

Code by Petzold
and N2T for more fun.

u/KobayashiDragonSlave · 28 pointsr/learnprogramming

Not OP, but I discovered this book, 'Grokking Algorithms', from a fantastic YouTube channel, 'The Coding Train'. The book explains a lot of the algorithms and data structures that I am learning in my first semester of CS at school. It even has the stuff that I am going to learn in the next semester. I found this book much more fun than my monotonous textbooks.
If anyone wants to get a good grasp of the fundamentals of A&DS, this is a great starting point; then move on to MOOCs by famous universities. MIT's A&DS was the one that I used. Dunno if it's still available on YouTube, because I remember that OCW courses were removed or something?

Link

u/obscure_robot · 3 pointsr/occult

HTML represents a particular line of thinking about how best to present information for machine processing and ultimately rendering, so that a human can read it. The weight that HTML carries is entirely due to its success in the marketplace; it isn't a particularly good or bad exemplar of how things should be done.

Some languages use paired symbols to indicate the beginning and end of a block of code, such as { and } or even begin and end. Others don't have an explicit beginning, but whatever valid command is first encountered is the beginning, and the end-of-file marker (a thing that no one but programmers and other very curious people ever see) signifies the end.

Hopcroft & Ullman's book on Automata Theory is a great place for someone with a magickal background to dive into computer science.

If you just want to see the many different ways of composing a very simple program (one that prints "hello, world!"), check out this list at Wikipedia.

I'm happy to answer more specific questions here or via private messages.

u/white_nerdy · 1 pointr/learnprogramming

> I want to be able to create functional programs with a bucket of transistors, a cup of magnets, pen, and paper

I've heard good things about nand2tetris which goes from logic gates to a complete system with simple assembler, compiler and OS.

One good exercise might be to create an emulator for a simple system, like CHIP-8 or DCPU-16.

If you want to go deeper:

  • If you want to build compilers, the dragon book is the go-to resource.

  • If you want to start learning about theory, I recommend Sipser.

u/el3r9 · 5 pointsr/explainlikeimfive

I would put this in a top-level comment, but it’s against the rules of the sub to do so. OP can check out this book, called “Code”; it’s a great, truly ELI5 intro to computers. If someone is interested they can check it out.

u/audionautics · 3 pointsr/videos

For the latter half, "the CPU executes instructions", there's a fantastic book called Code: The Hidden Language of Computer Hardware and Software that, through a series of scenarios, takes you all the way from talking with your friend through a string and two tin cans, to flashlights, to Morse code, to logic gates, transistors, and finally to encoding information in bits and executing it on a CPU.

It's a super fun read.

u/cbarrick · 2 pointsr/computing

Sipser's Introduction to the Theory of Computation is the standard textbook. The book is fairly small and quite well written, though it can be pretty dense at times. (Sipser is Dean of Science at MIT.)

You may need an introduction to discrete math before you get started. In my undergrad, I used Rosen's Discrete Mathematics and Its Applications. That book is very comprehensive, but that also means it's quite big.

Rosen is a great reference, while Sipser is more focused.

u/DevilsWeed · 3 pointsr/darknetplan

As someone with zero programming experience, thank you for the reading list. I was just planning on trying to learn python but I don't know if that's the best language to start with. Would you recommend just reading those books and starting with C?

Also, since I have no experience, a technical answer would probably go right over my head, but could you briefly explain how someone would go about messing around with an OS? I've always wondered what people meant by this. I have Linux installed on a VM but I have no idea what I could do to start experimenting and learning about programming with it.

Edit: Are these the books you're talking about? Physical Computing, C programming, and Writing Great Code?

u/ljcoleslaw · 2 pointsr/learnprogramming

There is a great book called Code: The Hidden Language of Computer Hardware and Software which I give to people who are just starting out and are struggling with the bigger picture on computers.

It works from the bottom up without assuming you know anything. It's not a textbook on computer architecture, but it should satisfy your need for a high-level picture of what's going on and will be much easier to read and enjoy than looking up definitions people give you and piecing it together yourself.

u/Pandasmical · 11 pointsr/computerscience

I enjoyed this one!
Code: The Hidden Language of Computer Hardware and Software

Here is someone else's detailed review on it

"Charles Petzold a does an outstanding job of explaining the basic workings of a computer. His story begins with a description of various ways of coding information including Braille, Morse code, and binary code. He then describes the development of hardware beginning with a description of the development of telegraph and relays. This leads into the development of transistors and logic gates and switches. Boolean logic is described and numerous electrical circuits are diagramed showing the electrical implementation of Boolean logic. The book describes circuits to add and subtract binary numbers. The development of hexadecimal code is described. Memory circuits are assembled by stringing logic gates together. Two basic microprocessors are described - the Intel 8080 and the Motorola 6800. Machine language, assembly language, and some higher level software languages are covered. There is a chapter on operating systems. This book provides a very nice historical perspective on the development of computers. It is entertaining and only rarely bogs down in technical detail."

u/galahadredgrave · 9 pointsr/ArtificialInteligence

I'm just beginning this journey myself, so judge what I say accordingly.

Artificial Intelligence: A Modern Approach seems to be the most popular textbook.

This article has some seemingly good advice, though it seems to be geared more toward Machine Learning (ML) than AI in general.

I think you'll want to learn a programming language. The above article recommends Python as it is well suited to ML.

There is (was?) a free online course on ML from Stanford by Andrew Ng. I started to take it a couple years ago but never finished. It is very accessible. The lectures appear to be on YouTube.

Grokking Algorithms is a highly regarded book on algorithms.

Make a free Amazon Web Services account and start playing with Sagemaker.

There really is no well defined path to learning AI, in my opinion. It is a highly interdisciplinary endeavor that will require you to be a self-starting autodidact. It's very exciting though. There is still plenty of new ground to be broken. Some might argue it is difficult for the little guy to compete with big labs at the big tech companies with their ungodly amounts of data to feed their AI, but I am optimistic.

u/dionyziz · 3 pointsr/cryptography

Hi,

You should already know most of the math you need from your math major. It helps to know number theory, group theory, and algebraic curves, depending on what you do. Discrete mathematics is also important: discrete probability theory, Markov chains, graph theory, logic, proof methods, solving recurrences, etc. are all helpful tools.

In terms of computer science, it's imperative to know how to program. You can learn a programming language such as Python. Project Euler is a good place to start for a mathematician. Knowledge of algorithms is also important, and you must understand computability and complexity theory.

Stanford's Cryptography I is a good place to start learning cryptography. I think you can go ahead and start the course without attempting prerequisites and see where you get stuck to go and learn what is required of you.

u/UncleMeat · 5 pointsr/compsci

I cannot recommend the book Code by Charles Petzold highly enough. This is the book that solidified my love of computer science, and it hits most of the major topics in CS in an easy-to-understand and thoroughly entertaining way. By the end of the book you have walked through the fundamentals of how to build and program a rudimentary computer, and had fun while doing it!

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

u/scrabbles · 1 pointr/explainlikeimfive

Further to the excellent comments already left, if you want to investigate things later on in your own time, you might enjoy this book: Code: The Hidden Language of Computer Hardware and Software. It explains programming and computer hardware fundamentals using excellent (and real, historically based) examples. I think even a 10-year-old would get a lot out of it; I would go so far as to recommend it to tech-inclined parents for themselves and their children.

u/IrishTheHobbit · 2 pointsr/explainlikeimfive

If you are truly interested in how the computer performs these functions, this is a GREAT book. I found it easy to understand, and I think it will answer the question you have.

u/tl121 · 1 pointr/btc

Sorry. The sentence is not circular; it only appears to be circular. The ideas are clearly explained in textbooks on computability theory. Or, if you are smart and patient, you can just read Turing's original paper; or, if you are really smart and really patient, Goedel's work as well. Take your time and use your own mind to form your own opinion.

https://www.cs.virginia.edu/~robins/Turing_Paper_1936.pdf

https://en.wikipedia.org/wiki/G%C3%B6del%27s_incompleteness_theorems

My introduction was in this book:
https://www.amazon.com/Computability-Unsolvability-Prof-Martin-Davis/dp/0486614719

For more original sources:
https://www.amazon.com/Undecidable-Propositions-Unsolvable-Computable-Mathematics/dp/0486432289#reader_0486432289

u/factorysettings · 2 pointsr/pics

I'm a self-taught programmer, so I don't know what CS degrees entail, but I highly recommend the book Code and also another one called The Elements of Computing Systems.

The former pretty much teaches you how a computer physically works and the latter teaches you how to build a processor and then write an OS for it. After reading those two books you pretty much know how computers work at every level of abstraction. I think that's the way programming should be taught.

u/RadioRoscoe · 1 pointr/funny

I have played with that learning circuit board thing, and it is truly awesome. The books that come with it are top notch. A nice complement to it would be Petzold's book, Code.

u/srnull · 54 pointsr/programming

Sorry to see this getting downvoted. Read the about page to get an idea of why /u/r00nk made the page.

I have to agree with one of the other comments that it is way too terse at the moment. I remember when we learnt about e.g. D-latches in school, and it was a lot more magical and harder to wrap your head around at first than the page gives credit for. The fact that AND, OR, and XOR gates can be built up from just NAND gates (the only logic gate properly explained) is also glossed over. Either go over it, or don't show the interiors of the other logic gates.
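
As a quick aside, that NAND-universality point is easy to demonstrate in code. A minimal Python sketch, treating bits as 0/1 ints (my own illustration, not something from the page under review):

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):          # NOT from a single NAND
    return nand(a, a)

def and_(a, b):       # AND = NOT(NAND)
    return not_(nand(a, b))

def or_(a, b):        # OR via De Morgan
    return nand(not_(a), not_(b))

def xor_(a, b):       # the classic four-NAND XOR
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

print([xor_(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]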

The interactive stuff is really neat. Good work on that.

Edit: If anyone reading wants to learn this stuff in more detail, two good books are

u/katyne · 3 pointsr/learnprogramming

First of all, you will never understand everything on the level that you think you want to understand it. That just doesn't happen. Even with an advanced degree, even working in the industry for 10+ years, you'll end up specializing in this or that and sort of having some idea about the other things. There's just too much stuff to learn. Those black boxes people talk about - they're called "abstractions", a way to simplify complex details in order to understand a concept. And in the beginning all you'll be doing is trying to understand concepts. When you were learning to drive a car you didn't need to know the very last mechanical detail of its engine, right? You just had an abstract idea of it, you knew you had to put in fuel and turn on the ignition and why you had to shift gears and stuff, and that was enough. Same here. First learn to hold the wheel and steer, then choose the field you'd like to specialize in. But it will take you a lifetime to learn everything about everything - and that's if all you'll be doing is learning, not making anything of your own.

If you're like me and still need some introduction to the "why" before you start learning the "how", I would recommend this book - it's very approachable and sort of encompasses the general topics without going into much detail. Other than that it's hard to say anything, because we don't know what your goal is: do you want to work as a web dev or a system programmer, or make games, or write for mobile maybe? Front end, backend, databases? It's like saying "I want to learn how to do sports" - well, what kind of sport? They're all very different directions of virtually unlimited depth.

u/tolos · 5 pointsr/IWantToLearn

First, there are two requests: one from your title, and one from your description. The request from your title is a bit easier, though I'm afraid I won't be able to answer it satisfactorily. As far as a real-world example goes, that may be a bit harder, because modern CPUs are pretty complicated. (When I was learning computer architecture at university, we never really discussed how an Intel or AMD CPU worked -- just learned about the MSP430 and a couple of hypothetical CPUs. If you really want to see how a real CPU works, I'd suggest looking into a microprocessor to get started.)

A quick and dirty summary of the MIPS CPU datapath, from Computer Organization and Design, 4th ed., page 315:

  1. The Program Counter (PC) loads the next instruction
  2. PC is incremented
  3. Instruction is parsed and the correct registers are loaded
  4. Registers are fed into the ALU if necessary, or
  5. Registers are passed into data memory (for read/write)
  6. Results from ALU/data memory are loaded back into registers
  7. next instruction

    Note that the MIPS example doesn't use pipelining, which generally makes things a bit faster. And there's only one code path executing, and no lookahead - which isn't the case for modern CPUs. (A toy sketch of this loop in code follows below.)
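
Here's that sketch: a toy fetch-decode-execute loop in Python for a made-up accumulator machine (illustrative only, not real MIPS):

LOAD, ADD, HALT = 0, 1, 2
program = [(LOAD, 5), (ADD, 7), (HALT, 0)]

pc = 0   # program counter
acc = 0  # accumulator register
while True:
    op, arg = program[pc]  # steps 1 and 3: fetch and parse the instruction
    pc += 1                # step 2: increment the PC
    if op == LOAD:         # steps 4-6: execute and write results back
        acc = arg
    elif op == ADD:
        acc += arg
    else:                  # HALT
        break

print(acc)  # prints 12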

    For further reading I highly recommend Code by Charles Petzold. I think it helped prepare me before going to college (for computer engineering).

    For video learning, a quick google search shows some videos that would probably be helpful (I haven't watched any of these).

    Sorry for the rushed response, I can expand on this more later if there's interest.

u/ThinqueTank · 3 pointsr/programming

I've actually been getting some interest lately because I just started to attend meetups last week and let some engineers see what I've done so far.

Really appreciate the algorithms/data structures advice. I picked up this book to get a light overview of it first before I really dive into something more formal:
Grokking Algorithms: An illustrated guide for programmers and other curious people

I also have enough college credits to take a data structures course called Discrete Structures for Computer Science, and I have my math completed up to Linear Algebra. Here's the description for the community college course:

> This course is an introduction to the discrete structures used in Computer Science with an emphasis on their applications. Topics covered include: Functions; Relations and Sets; Basic Logic; Proof Techniques; Basics of Counting; Graphs and Trees; and Discrete Probability. This course is compliant with the standards of the Association for Computing Machinery (ACM).

Is the above what you're referring to more or less?

Are there any books and/or online courses you'd personally recommend for data structures and algorithms?

u/Notlambda · 2 pointsr/dataisbeautiful

Sure. Without anything to go on, I'll just recommend some of my favorites. :)

  • Godel Escher Bach - Mindbending book that delves into connections between art, music, math, linguistics and even spirituality.
  • Code - The Hidden Language of Computer Hardware and Software - Ever wondered how the black box you're using to read this comment works? "Code" goes from the transistor to a fully functioning computer in a sequential way that even a child could grasp. It's not at the "How to build your own computer from Newegg.com parts" level. It's more at the "How to build your own computer if you were trapped on a desert island" level, which is more theoretical and interesting. You get strong intuition for what's actually happening.
  • The Origin of Consciousness in the Breakdown of the Bicameral Mind - An intriguing looking into the theory that men of past ages actually hallucinated the voices of their gods, and how that led to the development of modern civilization and consciousness.
u/gnuvince · 1 pointr/programming

The classic Introduction to the Theory of Computation is quite excellent, though very pricey. Also, I had the pleasure of having a university class that used this book, so having a professor that can clear things up from the book was a big help.

u/Rikkety · 6 pointsr/AskComputerScience

Check out The Annotated Turing by Charles Petzold. It's Turing's paper on the Entscheidungsproblem which introduces Turing Machines, annotated with a lot of background information and some stuff about Turing's career. Very interesting stuff.

I can also recommend Code, by the same author, which describes how a computer works from basic principles. It doesn't have a lot of material on Turing, but it's certainly an interesting read for anyone interested in comp sci.

u/perrylaj · 1 pointr/learnpython

This isn't a programming book, or even a strictly computer science book. It's about Code, the history, the mindset, the means of communicating in binary/encoded processes, the evolution of signals. It touches on electronics, computers, coding, communication and so many other things. It's great for getting some history in an enjoyable format and lays some groundwork in understanding the roots of modern computing. I read it after having some years of experience in CS and still learned a lot of neat ways to look at the field from different perspectives.

http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1418450151&sr=8-1&keywords=code+the+language+of&pebp=1418450154033

Highly recommend it.

u/yggdrasilly · 3 pointsr/learnmath

Two great introductions are:

u/Zeroe · 1 pointr/learnprogramming

I found the Build a Modern Computer: From NAND to Tetris course to be very informative regarding the basics of computer architecture. They also have a book in print that covers all the same material, and the first half of the book and course are available on their website for free.

If you'd like something that is less detailed and just covers the ideas, I recommend the book Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It starts from the basics of encoding information using systems like Morse code and binary numerals and shows how combining this with Boolean operations represented in hardware, we can construct machines that carry out general, programmable computations. He shows simple diagrams of relays and how those are assembled into gates, memory, and more complicated structures.

I highly recommend both resources.

u/shaggorama · 1 pointr/Python

Well, "machine learning" is a pretty big topic. For a lightweight crash course, I'd recommend checking out the coursera machine learning course provided through stanford taught by Andrew Ng, or check out the Will it Python? series of blog posts which ported the exercises in the book Machine Learning for Hackers from R to Python.

I'd say if you dedicated a full semester to the topic you'd still only barely be scratching the surface, so don't expect to come away with too much expertise in just a week. I definitely support pursuing machine learning though.

u/gineton2 · 6 pointsr/ComputerEngineering

For a gentle introduction, CODE: The Hidden Language of Computer Hardware and Software is a really pleasant read. It works its way up gradually, so maybe not the best fit for a physics student or someone who already understands the fundamentals. For someone new to the subject it's a great fit, however. Otherwise I see Patterson and Hennessy recommended.

u/paddingtonbear92 · 2 pointsr/cmu

Took it last spring.

The first third of the course covered basic logic stuff in a very pedantic way. The homework and notes are on OLI and are pretty much the same as Logic and Proofs (80-210, I think).

Second third was about set theory stuff and incompleteness.

Last third was about Turing and undecidability, this was the text: https://www.amazon.com/gp/product/0486614719/

As someone who struggled with 251, I didn't find the material here particularly challenging. It's definitely the least rigorous of the CS logic electives. The first midterm was online, the second midterm was half online and half in class, and the final was in class. I believe the 2nd midterm and final were curved pretty generously.

Side note: the lectures are in the potato chip lecture hall in Scaife which gets so unbearably freaking hot in the spring semester I almost dropped the class because of it.

u/bdol · 1 pointr/gaming

This was a really great book that went through the basics of computer hardware and the software that controls it, from basic switches all the way up to displays. The author gives a lot of awesome examples to explain himself. I'm an EE, so the physics of computers interest me.

u/barlister · 2 pointsr/webdev

I really like the Write Great Code series. The first one contains pretty crucial information about low-level stuff that webdev folks frequently don't learn, which kind of leaves us high and dry when it comes to number theory and other stuff that comes up occasionally.

http://www.amazon.com/gp/product/B0096FEJGQ?btkr=1

All of the volumes are very good though.

u/emcoffey3 · 3 pointsr/learnprogramming

Definitely take as many web-related classes as possible and at least one database class. The other two focus areas you mentioned are important, but probably not as important as the first two. Maximize your time on learning the important stuff; the other stuff can be learned later.

If you want to learn about UML diagrams, check out Martin Fowler's UML Distilled; it's an easy read and handy as a reference. Likewise, Charles Petzold's Code is one of my favorite hardware-related books.

u/anossov · 2 pointsr/AskProgramming

What subject exactly? Unicode is kinda complex, but try starting here.

Fundamentals like bits/bytes/numeral systems are annoyingly absent from sites like Codecademy.

Out of modern books, Charles Petzold's «Code» is regarded as very enlightening.

u/DashAnimal · 4 pointsr/compsci

I know this is a pretty common recommendation and you've probably already heard of it, or even read it, but can I recommend the book Code: The Hidden Language of Computer Hardware and Software? I think having a history of how we got to where we are today (written in an entertaining way) is a good starting point, even though it barely scratches the surface of computer science.

u/sachal10 · 2 pointsr/learnmath

Since you are a computer science student, you can start with proofs in discrete mathematics. For this you can look at Kenneth Rosen's book; it can help you with a lot of basic concepts and with constructing proofs. It's a good book for those who want to go into algorithms or theoretical CS, or even want to work on pure maths problems. I had this same confusion: I wanted to do maths, but also CS with it. After this you can try "The Art of Computer Programming" (this has 4 volumes) by Donald Knuth, but CLRS is a must along with Rosen's if you want to take CS and maths side by side. If you want to explore further you can look at Design of Approximation Algorithms and Randomised Algorithms. These books can help you with concepts of probability, number theory, geometry, linear algebra, etc. But then if you want pure math problems, search for them: go through different journals (SIAM and Combinatorica are really good ones), pick a problem you like, then find texts relevant to the problem and try to give better solutions.

u/falafel_eater · 3 pointsr/AskComputerScience

Computer Science is a pretty big field, so "strong foundation" can mean different things to different people.
You will definitely want the following:

  1. Introduction to Algorithms and Data Structures
  2. Introduction to Computability
  3. Introduction to Operating Systems

    For algorithms and data structures, a very commonly used textbook is Cormen.
    For computability, Sipser.

    Operating Systems I don't remember off the top of my head.
    That said, you are probably much better off finding a high-quality university course that is based on these textbooks instead of trying to read them cover-to-cover yourself. Check out lecture series from places like MIT on YouTube or whatever.

    After that, you can take an Intro to Artificial Intelligence, or Intro to Communication Networks, or any other intro-level course to a more specific sub-area. But if you lack a basis in computability to the point where you don't know what an NP-complete problem is, or have no idea what a Binary Search Tree is, or do not know what an Approximation Algorithm is, then it would be hard to say you have a strong foundation in CS.
u/NothingWasDelivered · 1 pointr/computerscience

If you want a good, understandable explanation of this, read [Code](http://www.amazon.com/dp/B00JDMPOK2/ref=cm_sw_r_udp_awd_7uYQub1VZ80TX) by Charles Petzold. He basically walks you through building a CPU from the ground up.

It's an excellent layperson's explanation of how computers work at a very fundamental level: how you can use relays (and transistors, their analog) to read 1s and 0s and make decisions based on that. From there you get to machine language (physically encoded into the chip), and everything above that is basically abstraction.

u/SlinkyBlue · 1 pointr/ITCareerQuestions

Thank you for the reply, that's very helpful. Obviously I have been able to benefit from the book I already have, so it makes sense to continue along that avenue. [This](https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_awdb_sq.azbES12DCH) is the book that has helped me out.

I live in Sacramento. There is a massive health care and education based economy here that has an expanding infrastructure that I want to get in on ASAP.

u/batmannigan · 1 pointr/ECE

Have you read the good book? Joking aside, Code is an amazing book, which really tied a lot of things together conceptually for me.

edit: My god I need to make a reddit bot, those are cool.

u/ShenmeNamaeSollich · 1 pointr/cscareerquestions

Why stay at a school where you're not studying what you want, and which doesn't even offer what you want, or one of the most popular in-demand majors?

Anyway, there are any number of online courses/tutorials about Data Structures, and how to build/use them in various languages. You can use C++ for them, or try to learn something else too. For speed & simplicity in interviews, a lot of people seem to prefer Python for discussing DS&A, but by their nature the concepts are fairly language-agnostic.

Try VisuAlgo, for one ... there are plenty of others.

Since a lot of algorithms require/suggest the use of specific data structures to make them work, it's probably better to learn what those are first, and then try to tackle the algorithms that rely on them.

Grokking Algorithms - illustrated & pretty basic intro to concepts

Common Sense Guide to Data Structures and Algorithms - slightly less so, but still pretty basic intro to concepts

CTCI - problems covering arrays, linked lists, stacks & queues, trees, graphs ... Actually kind of useless if you don't already know what those are though.

Introduction to Algorithms (CLRS) - 1 of 2 standard U.S. college-level algorithms textbooks

Algorithms, 4th Ed. - the other standard U.S. college-level textbook, w/free online "book site", code, and a free Coursera course to go along with it.

u/Helix_van_Boron · 3 pointsr/compsci

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. This is the book that made me decide to switch my college major to CS. It gave me great insight into what I was manipulating inside of a computer. It might not be very helpful to an experienced computer scientist, but I recommend it to anybody that's interested in getting into CS. And even if you understand all of the concepts in it, it's still an interesting read.

u/Ogi010 · 1 pointr/Python

Can confirm: using this book in my MSCS program (my uni doesn't have undergrad programs). I'm implementing a lot of the algorithms using custom objects in Python, so it's doing wonders for my object-oriented game...

The book is also relatively straightforward to read (again, keep in mind I'm a graduate student), but other books such as the Algorithm Design Manual have been known to have easier-to-understand explanations while not being as theoretically deep.

Also, lastly, when I was dipping my toes in the water, I flipped through Grokking Algorithms, which, as a short summary with explanations of how things work, is a great place to start (though of course you won't see this book in academic environments).

u/enteleform · 11 pointsr/compsci

Check out:
Grokking Algorithms: An illustrated guide for programmers and other curious people
 
I'm also pretty rusty at math right now, and have been getting by with a try-different-things-until-it-works approach.  Even for the types of problems I've become efficient at solving, in many cases I don't know the actual terminology, so it makes it difficult to expand upon concepts or communicate them with others.  I'd like to get to a point where I can mentally reason about processes & formulas without having to execute them in order to see the results, and I feel like the first step to get there is to get reacquainted with terminology & foundational concepts.  Here are Some Resources I've queued up to work through for that purpose.

u/donald-pinckney · 2 pointsr/types

Type Theory and Formal Proof covers metatheory all the way up to calculus of constructions, but I think it is at a fairly introductory level and tbh not that insightful from a mathematical perspective. However, it does give a pretty clear exposition of the type checking rules, and is easily converted to an implementation.

u/stateful · 1 pointr/programming

Some great responses here everyone, thank you. The book Write Great Code: Volume 1: Understanding the Machine helped me understand.

u/_--__ · 1 pointr/compsci

Wow. That's quite a range - normally those would be covered in two, three, or even four courses. No wonder the pass rate is so low.

It's difficult to recommend anything sensible for the whole course - I'd have suggested Rosen - I have found it somewhat useful as a reference text later on - but it seems to be unpopular with amazon reviewers. For the latter part (finite-state machines etc), Sipser is a common course textbook - but it may be too advanced for a starting point (though might be worth getting so you can get ahead before you get to that part of the course, which will almost certainly be the trickiest).

u/wannabeproprogrammer · 3 pointsr/computerscience

Computer processors have historically had instruction sets based on powers of 2 because of Boolean logic and how it relates to binary numbers. If you were building a very rudimentary computer, i.e. a circuit which is just on and off, then you could say that the circuit only represents 2 states. This can be encoded as just 0 or 1. In fact this is what transistors do: they hold either an on or an off state. Now what if you want to represent more than two states? How would you go about that with what you already have? Well, you introduce another transistor. Now you can represent 4 states. This follows the same logic as before, so these 4 states can be 00, 01, 10, 11, where each digit corresponds to the on or off state of one of the transistors. In fact you can repeat this pattern ad infinitum and keep adding more and more on-off transistors. What you'll find when you do this is that N such transistors can represent 2^N unique states, which also corresponds to the amount of memory that can be addressed at any time - hence 32-bit and 64-bit, which are powers of two.
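
To make that counting concrete, here's a quick Python sketch (my own illustration, not from the original comment) enumerating the states that N on/off units can represent:

from itertools import product

for n in (1, 2, 3):
    states = [''.join(bits) for bits in product('01', repeat=n)]
    print(n, 'bit(s):', len(states), 'states ->', states)

# the addressing consequence: 32 bits can name 2**32 distinct locations
print(2 ** 32)  # 4294967296
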
Now in modern CPUs this gets a lot more complicated in terms of architecture, as in reality CPUs may have millions of transistors and multiple cores but are still considered 32-bit or 64-bit. In reality, the 32-bit and 64-bit in this context relate to the instruction set architecture of the CPU, i.e. the format of the instructions the CPU handles to perform operations: in a 32-bit architecture, the CPU will handle instructions 32 bits long which determine whether you're accessing memory, writing a value, triggering an interrupt signal, and so on. If you really want to understand how CPUs work I recommend reading this book. It explains how CPUs work from a very rudimentary base all the way up to how machine code translates to actual CPU instructions. Hope this helped.

u/clavalle · 2 pointsr/zeroxtenc

This is a great resource.

Ninja Edit: This book is also good, and free! (PDF warning)

u/InvalidGuest · 7 pointsr/computerscience

I'd recommend this one: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319

It's very enjoyable to read and definitely increases one's understanding of how computers conceptually function. Petzold also makes it very easy to understand what he is saying in his explanations.

u/t3h2mas · 1 pointr/learnprogramming

I think your plan is just fine. You may be distracted by the sheer number of choices you find along your way -- this thread is clear proof of that. Stay focused.

If you intend on starting with C I suggest a reading list kinda like this.

  • Code (good to read before you start to code)

  • K & R - The C Programming Language

  • LCTHW - Learn C the Hard Way (free to read from the site.)

    And then begin Java study. The first two may be a bit heavy at times... Work your way through and do your best, but don't expect to get everything.

    You could start with Java. There's a fine community and great resources. I say go for the C -> Java route and get a taste of some low level fundamentals.