(Part 2) Reddit mentions: The best computer science books

We found 9,284 Reddit comments discussing the best computer science books. We ran sentiment analysis on each comment to gauge how redditors feel about different products, then identified 1,900 products and ranked them by the number of positive reactions they received. Here are the products ranked 21-40. You can also go back to the previous section.

21. Programming Game AI by Example (Wordware Game Developers Library)

Specs:
Height: 9.21 Inches
Length: 6.09 Inches
Number of items: 1
Release date: October 2004
Weight: 1.62 Pounds
Width: 1 Inch

22. Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems

    Features:
  • O'Reilly Media
Specs:
Height: 9.19 Inches
Length: 7 Inches
Number of items: 1
Release date: April 2017
Weight: 2.17 Pounds
Width: 1.29 Inches

23. Introduction to the Theory of Computation

    Features:
  • Used Book in Good Condition
Specs:
Height: 9.5 Inches
Length: 6.5 Inches
Number of items: 1
Weight: 1.65 Pounds
Width: 1 Inch

25. Mathematics for 3D Game Programming and Computer Graphics, Third Edition

    Features:
  • Used Book in Good Condition
Specs:
Height: 9 Inches
Length: 7 Inches
Number of items: 1
Weight: 4.06 Pounds
Width: 1.5 Inches

28. Designing Games: A Guide to Engineering Experiences

    Features:
  • O'Reilly Media
Specs:
Height: 9 Inches
Length: 6 Inches
Number of items: 1
Release date: February 2013
Weight: 1.23 Pounds
Width: 0.88 Inches

30. Theory of Fun for Game Design

    Features:
  • O'Reilly Media
Specs:
Height: 9.25 Inches
Length: 7.5 Inches
Number of items: 1
Weight: 1.2 Pounds
Width: 0.55 Inches

31. Fluent Python: Clear, Concise, and Effective Programming

    Features:
  • O'Reilly Media
Specs:
Height: 9.19 Inches
Length: 7 Inches
Number of items: 1
Weight: 3.03 Pounds
Width: 1.57 Inches

32. Violent Python: A Cookbook for Hackers, Forensic Analysts, Penetration Testers and Security Engineers

    Features:
  • Syngress
Specs:
Height: 9.25 Inches
Length: 7.52 Inches
Number of items: 1
Release date: November 2012
Weight: 1.03 Pounds
Width: 0.61 Inches

33. Quantum Computation and Quantum Information: 10th Anniversary Edition

    Features:
  • Cambridge University Press
Specs:
Height: 9.8 Inches
Length: 7 Inches
Number of items: 1
Weight: 3.44 Pounds
Width: 1.6 Inches

34. Where Wizards Stay Up Late: The Origins Of The Internet

    Features:
  • Simon & Schuster
Specs:
Height: 8.44 Inches
Length: 5.5 Inches
Number of items: 1
Release date: January 1998
Weight: 0.66 Pounds
Width: 0.8 Inches

35. Superintelligence

Specs:
Height: 0.5 Inches
Length: 6.75 Inches
Number of items: 1
Release date: May 2015
Weight: 0.22 Pounds
Width: 5.5 Inches

36. Superintelligence: Paths, Dangers, Strategies

    Features:
  • A history of the study of human intelligence with some new ideas
Specs:
Height: 6.2 Inches
Length: 9.3 Inches
Number of items: 1
Weight: 1.5 Pounds
Width: 1 Inch

37. Computer Organization and Design: The Hardware/Software Interface (The Morgan Kaufmann Series in Computer Architecture and Design)

Specs:
Height: 9 Inches
Length: 7.5 Inches
Number of items: 1
Weight: 3.46 Pounds
Width: 1.5 Inches

38. On Intelligence: How a New Understanding of the Brain Will Lead to the Creation of Truly Intelligent Machines

    Features:
  • St. Martin's Griffin
Specs:
Height: 8.25 Inches
Length: 5.4 Inches
Number of items: 1
Release date: July 2005
Weight: 0.54 Pounds
Width: 1.2 Inches

39. Python Programming: An Introduction to Computer Science

    Features:
  • Used Book in Good Condition
Specs:
Height: 10 Inches
Length: 7 Inches
Number of items: 1
Release date: May 2010
Weight: 2 Pounds
Width: 1.2 Inches

40. Computer Systems: A Programmer's Perspective (2nd Edition)

Specs:
Height: 9.3 Inches
Length: 7.5 Inches
Number of items: 1
Weight: 2.84 Pounds
Width: 1.6 Inches

🎓 Reddit experts on computer science books

The comments and opinions expressed on this page are written exclusively by redditors. To provide you with the most relevant data, we sourced opinions from the most knowledgeable Reddit users, based on the total number of upvotes and downvotes their comments received across subreddits where computer science books are discussed. For your reference and for the sake of transparency, here are the specialists whose opinions mattered the most in our ranking.
  • Total score: 3,136 · Number of comments: 26 · Relevant subreddits: 5
  • Total score: 158 · Number of comments: 94 · Relevant subreddits: 3
  • Total score: 153 · Number of comments: 18 · Relevant subreddits: 1
  • Total score: 144 · Number of comments: 22 · Relevant subreddits: 3
  • Total score: 92 · Number of comments: 19 · Relevant subreddits: 6
  • Total score: 74 · Number of comments: 30 · Relevant subreddits: 5
  • Total score: 52 · Number of comments: 16 · Relevant subreddits: 3
  • Total score: 49 · Number of comments: 21 · Relevant subreddits: 2
  • Total score: 28 · Number of comments: 21 · Relevant subreddits: 9
  • Total score: -230 · Number of comments: 68 · Relevant subreddits: 9

💡 Interested in what Redditors like? Check out our Shuffle feature

Shuffle: random products popular on Reddit

Top Reddit comments about Computer Science:

u/empleadoEstatalBot · 1 point · r/argentina

> # Teach Yourself Computer Science
>
>
>
> If you’re a self-taught engineer or bootcamp grad, you owe it to yourself to learn computer science. Thankfully, you can give yourself a world-class CS education without investing years and a small fortune in a degree program 💸.
>
> There are plenty of resources out there, but some are better than others. You don’t need yet another “200+ Free Online Courses” listicle. You need answers to these questions:
>
> - Which subjects should you learn, and why?
> - What is the best book or video lecture series for each subject?
>
> This guide is our attempt to definitively answer these questions.
>
> ## TL;DR:
>
> Study all nine subjects below, in roughly the presented order, using either the suggested textbook or video lecture series, but ideally both. Aim for 100-200 hours of study of each topic, then revisit favorites throughout your career 🚀.
>
>
>
>
>
> | Subject | Why study? | Best book | Best videos |
> | --- | --- | --- | --- |
> | Programming | Don't be the person who "never quite understood" something like recursion. | Structure and Interpretation of Computer Programs | Brian Harvey's Berkeley CS 61A |
> | Computer Architecture | If you don't have a solid mental model of how a computer actually works, all of your higher-level abstractions will be brittle. | Computer Organization and Design | Berkeley CS 61C |
> | Algorithms and Data Structures | If you don't know how to use ubiquitous data structures like stacks, queues, trees, and graphs, you won't be able to solve hard problems. | The Algorithm Design Manual | Steven Skiena's lectures |
> | Math for CS | CS is basically a runaway branch of applied math, so learning math will give you a competitive advantage. | Mathematics for Computer Science | Tom Leighton's MIT 6.042J |
> | Operating Systems | Most of the code you write is run by an operating system, so you should know how those interact. | Operating Systems: Three Easy Pieces | Berkeley CS 162 |
> | Computer Networking | The Internet turned out to be a big deal: understand how it works to unlock its full potential. | Computer Networking: A Top-Down Approach | Stanford CS 144 |
> | Databases | Data is at the heart of most significant programs, but few understand how database systems actually work. | Readings in Database Systems | Joe Hellerstein's Berkeley CS 186 |
> | Languages and Compilers | If you understand how languages and compilers actually work, you'll write better code and learn new languages more easily. | Compilers: Principles, Techniques and Tools | Alex Aiken's course on Lagunita |
> | Distributed Systems | These days, most systems are distributed systems. | Distributed Systems, 3rd Edition by Maarten van Steen | 🤷‍ |
>
> ## Why learn computer science?
>
> There are two types of software engineers: those who understand computer science well enough to do challenging, innovative work, and those who just get by because they're familiar with a few high-level tools.
>
> Both call themselves software engineers, and both tend to earn similar salaries in their early careers. But Type 1 engineers grow into more fulfilling and well-remunerated work over time, whether that's valuable commercial work or breakthrough open-source projects, technical leadership or high-quality individual contributions.
>
>
>
> Type 1 engineers find ways to learn computer science in depth, whether through conventional means or by relentlessly learning throughout their careers. Type 2 engineers typically stay at the surface, learning specific tools and technologies rather than their underlying foundations, only picking up new skills when the winds of technical fashion change.
>
> Currently, the number of people entering the industry is rapidly increasing, while the number of CS grads is essentially static. This oversupply of Type 2 engineers is starting to reduce their employment opportunities and keep them out of the industry’s more fulfilling work. Whether you’re striving to become a Type 1 engineer or simply looking for more job security, learning computer science is the only reliable path.
>
>
>
>
>
> ## Subject guides
>
> ### Programming
>
> Most undergraduate CS programs start with an “introduction” to computer programming. The best versions of these courses cater not just to novices, but also to those who missed beneficial concepts and programming models while first learning to code.
>
> Our standard recommendation for this content is the classic Structure and Interpretation of Computer Programs, which is available online for free both as a book, and as a set of MIT video lectures. While those lectures are great, our video suggestion is actually Brian Harvey’s SICP lectures (for the 61A course at Berkeley) instead. These are more refined and better targeted at new students than are the MIT lectures.
>
> We recommend working through at least the first three chapters of SICP and doing the exercises. For additional practice, work through a set of small programming problems like those on exercism.
>
> For those who find SICP too challenging, we recommend How to Design Programs. For those who find it too easy, we recommend Concepts, Techniques, and Models of Computer Programming.
>
>
>
>
>
>
> ### Computer Architecture
>
> Computer Architecture—sometimes called “computer systems” or “computer organization”—is an important first look at computing below the surface of software. In our experience, it’s the most neglected area among self-taught software engineers.
>
> The Elements of Computing Systems, also known as "Nand2Tetris," is an ambitious book that attempts to give you a cohesive understanding of how everything in a computer works. Each chapter involves building a small piece of the overall system, from writing elementary logic gates in HDL, through a CPU and assembler, all the way up to an application the size of a Tetris game.
>
> We recommend reading through the first six chapters of the book and completing the associated projects. This will develop your understanding of the relationship between the architecture of the machine and the software that runs on it.
>
> The first half of the book (and all of its projects) are available for free from the Nand2Tetris website. It's also available as a Coursera course with accompanying videos.
>
> In seeking simplicity and cohesiveness, Nand2Tetris trades off depth. In particular, two very important concepts in modern computer architectures are pipelining and memory hierarchy, but both are mostly absent from the text.
>
> Once you feel comfortable with the content of Nand2Tetris, our next suggestion is Patterson and Hennessy's Computer Organization and Design, an excellent and now classic text. Not every section in the book is essential; we suggest following Berkeley's CS61C course "Great Ideas in Computer Architecture" for specific readings. The lecture notes and labs are available online, and past lectures are on the Internet Archive.
>
>
>
>
>
> ### Algorithms and Data Structures
>
> We agree with decades of common wisdom that familiarity with common algorithms and data structures is one of the most empowering aspects of a computer science education. This is also a great place to train one’s general problem-solving abilities, which will pay off in every other area of study.
>
> There are hundreds of books available, but our favorite is The Algorithm Design Manual by Steven Skiena. He clearly loves this stuff and can’t wait to help you understand it. This is a refreshing change, in our opinion, from the more commonly recommended Cormen, Leiserson, Rivest & Stein, or Sedgewick books. These last two texts tend to be too proof-heavy for those learning the material primarily to help them solve problems.
>

> (continues in next comment)

u/CodyDuncan1260 · 2 points · r/gamedev

Game Engine:

Game Engine Architecture by Jason Gregory: the best you can get.

Game Coding Complete by Mike McShaffry. The book goes over the whole of making a game from start to finish, so it's a great way to learn how the engine interacts with the gameplay code. I admit I'm not a particular fan of his coding style, but I've found ways around it. The Boost library makes the code more terse, but adds complexity; the 4th edition made a point of not using it after many readers met with difficulty in the 3rd edition. The book also uses DXUT to abstract the DirectX functionality necessary to render things on screen. Although that is one approach, I found that getting DXUT set up properly can be somewhat of a pain, and the abstraction hides really interesting details about the whole task of 3D rendering. You have a strong background in graphics, so you will probably be better served by more direct access to the DirectX API calls. This leads into my suggestion for Introduction to 3D Game Programming with DirectX10 (or DirectX11).



C++:

C++ Pocket Reference by Kyle Loudon. I remember reading that it takes years, if not decades, to become a master at C++. You have a lot of C++ experience, so you might be better served by a small reference book than a large textbook. I like having this around to reference the features that I use less often. Example:

namespace
{
    // functions and variables declared here have file scope
    // (internal linkage), like `static` at namespace level
}

is an unnamed namespace, which is a preferred method for declaring functions or variables with file scope. You don't see this too often in sample textbook code, but it will crop up from time to time in samples from other programmers on the web. It's $10 or so, and I find it faster and handier than standard online documentation.



Math:

You have a solid graphics background, but just in case you need good references for math:
3D Math Primer
Mathematics for 3D Game Programming

Also, really advanced lighting techniques stretch into the field of Multivariate Calculus. Calculus: Early Transcendentals Chapters >= 11 fall in that field.



Rendering:

Introduction to 3D Game Programming with DirectX10 by Frank D. Luna.
You should probably get the DirectX11 version when it is available: not because it's newer, and not because DirectX10 is obsolete (it's not yet), but because the new DirectX11 book has a chapter on animation, which the DirectX10 book sorely lacks. But your solid graphics background may make this moot for you.

3D Game Engine Architecture (with Wild Magic) by David H. Eberly is a good book with a lot of parallels to Game Engine Architecture, but focuses much more on the 3D rendering portion of the engine, so you get a better depth of knowledge for rendering in the context of a game engine. I haven't had a chance to read much of this one, so I can't be sure of how useful it is just yet. I also haven't had the pleasure of obtaining its sister book 3D Game Engine Design.

Given your strong graphics background, you will probably want to go past the basics and get to the really nifty stuff. Real-Time Rendering, Third Edition by Tomas Akenine-Moller, Eric Haines, Naty Hoffman is a good book of the more advanced techniques, so you might look there for material to push your graphics knowledge boundaries.



Software Engineering:

I don't have a good book to suggest for this topic, so hopefully another redditor will follow up on this.

If you haven't already, be sure to read about software engineering. It teaches you how to design a process for development, the stages involved, effective methodologies for making and tracking progress, and all sorts of things that make programming and software development easier. Not all of it will be useful if you are a one-man team, because software engineering is a discipline built around teams, but much of it still applies and will help you stay on track, know when you've been derailed, and make decisions that get you back on course. Also, patterns. Patterns are great.

Note: I would not suggest Software Engineering for Game Developers. It's an ok book, but I've seen better, the structure doesn't seem to flow well (for me at least), and it seems to be missing some important topics, like user stories, Rational Unified Process, or Feature-Driven Development (I think Mojang does this, but I don't know for sure). Maybe those topics aren't very important for game development directly, but I've always found user stories to be useful.

Software engineering in general will prove to be a useful field when you are developing your engine, and even more so if you have a team. Take a look at this article to get a small taste of what software engineering is about.


Why so many books?
Game Engines are a collection of different systems and subsystems used in making games. Each system has its own background, perspective, concepts, and can be referred to from multiple angles. I like Game Engine Architecture's structure for showing an engine as a whole. Luna's DirectX10 book has a better Timer class. The DirectX book also has better explanations of the low-level rendering processes than Coding Complete or Engine Architecture. Engine Architecture and Game Coding Complete touch on Software Engineering, but not in great depth, which is important for team development. So I find that Game Coding Complete and Game Engine Architecture are your go to books, but in some cases only provide a surface layer understanding of some system, which isn't enough to implement your own engine on. The other books are listed here because I feel they provide a valuable supplement and more in depth explanations that will be useful when developing your engine.

tldr: What Valken and SpooderW said.

On the topic of XNA, does anyone know a good XNA book? I have XNA Unleashed 3.0, but it's somewhat out of date compared to the new XNA 4.0. The best-looking up-to-date one seems to be Learning XNA 4.0: Game Development for the PC, Xbox 360, and Windows Phone 7. I have the 3.0 version of this book, and it's well done.

*****
Source: Doing an Independent Study in Game Engine Development. I asked this same question months ago, did my research, got most of the books listed here, and omitted ones that didn't have much usefulness. Thought I would share my research, hope you find it useful.

u/Nicholas-DM · 1 point · r/worldnews

I watched this interview earlier today, so after reading this article, I'm a tad disappointed. Artificial intelligence and a brain-machine interface are two things I'm super interested in, and this particular technology editor wrote one of the crappiest articles I've read on the subject.

So here is the article, points, counterpoints, the whole shebang.

---

Article


> Elon Musk smoked pot and drank whiskey on the Joe Rogan podcast..."

He did indeed smoke pot and drink whiskey on the podcast. He had one puff of the pot, and drank one glass of the whiskey. And the pot was near the end. Nothing really serious about this, insofar as I am aware.


> "... and said he's going to soon announce a new "Neuralink" product that can make anyone superhuman."

Outright fabrication. Elon did not remotely say that he's going to soon announce a new Neuralink product that can make anyone superhuman, or suggest that anyone will have anything like that soon.


> "'I think we'll have something interesting to announce in a few months ... that's better than anyone thinks is possible,' the Tesla CEO said on 'Joe Rogan Experience.' 'Best case scenario, we effectively merge with AI.'"

Alright. Those are two actual quotes!

The first quote-- yes, Elon said that he'll have something interesting, possibly, in a few months. Specifically, he says that it is about an order of magnitude better than anyone thinks is possible.

The second sentence is a mostly unrelated part of the conversation about different ways to counter Artificial General Intelligence, which may be an existential threat to humanity and is a possibility. More on this at the end.


> Musk, whose enterprises include a company called Neuralink, says his new technology will be able to seamlessly combine humans with computers, giving us a shot at becoming "symbiotic" with artificial intelligence.

He does not say this at all in the interview. He suggests that becoming symbiotic with an interface that is like an AI is likely the best way forward for mankind, out of the different options. He goes on to explain, though he doesn't use the term, of how an emergent consciousness would work.


> Musk argued that since we're already practically attached to our phones, we're already cyborgs. We're just not as smart as we could be because the data link between the information we can get from our phones to our brains isn't as fast as it could be.

Accurate reporting here, and in the spirit of the actual interview. It doesn't really explain what he means by this, but that'd be a bit much for an article, wouldn't it?


ARTICLE BREAK FOR A QUICK PICTURE IN THE ARTICLE!

> Picture of Elon hitting a blunt

I think it's a blunt, not a spliff. Perfectly alright explaining my thought process if asked.


> "It will enable anyone who wants to have superhuman cognition," Musk said. "Anyone who wants."

I'll have to rewatch the interview to get the exact wording, but I watched it earlier today. I'm pretty confident Elon said 'would', not 'will'. Which doesn't seem like much, but makes a world of difference.

At this point, he is describing what it would be like to have an interface that you could control by thought.


> "Rogan asked how much different these cyborg humans would be than regular humans, and how radically improved they might be."

> "'How much smarter are you with a phone or computer or without? You're vastly smarter, actually,' Musk said. 'You can answer any question pretty much instantly. You can remember flawlessly. Your phone can remember videos [and] pictures perfectly. Your phone is already an extension of you. You're already a cyborg. Most people don't realize you're already a cyborg. It's just that the data rate ... it's slow, very slow. It's like a tiny straw of information flow between your biological self and your digital self. We need to make that tiny straw like a giant river, a huge, high-bandwidth interface.'"

At this point, the cyborg thing is explained a little bit better. The article times it and changes the order of the interview a bit to make him look like a crackpot idiot, but this part is pretty true to form. It doesn't really give much context around the rest of the conversation in the interview, that led up to that, explained ideas before, that sort of thing. But a good paragraph for the article.


> "Musk, who spoke about Neuralink before he smoked pot on the podcast..."

We know he smoked pot.


> "...said this sort of technology could eventually allow humans to create a snapshot of themselves that can live on if our bodies die."

> "'If your biological self dies, you can upload into a new unit. Literally,' Musk said."

This was definitely mentioned as an aside, and as a possibility, by Elon. He did actually explain how it would work. Also, it wasn't a snapshot: people who study this know there is a big difference between a transition and a snapshot, and Elon did not at all imply it was a snapshot; it was spoken of as a transition, which is key. But that's not really something the average person studies, so of course the article doesn't explain it.


> "Musk said he thinks this will give humans a better chance against artificial intelligence."

> "'The merge scenario with AI is the one that seems like probably the best. If you can't beat it, join it,' Musk said."

The article manages to make this, which is perhaps the most important section of the interview and a terribly important part of humanity, two short lines with no explanation in such a way that makes the person look like an idiot, ignoring everything he otherwise explained.


> "Tesla's stock took a hit after the bizarre appearance and revelations Friday that two Tesla executives are leaving."

Tesla's stock did indeed take a hit. It's an extremely volatile stock with good and bad news constantly. I personally fail to see how it relates to this article, though-- much like a hit of pot and a glass of whiskey.

---

An actual explanation


Elon Musk started a company called Neuralink somewhat recently. It brought together a board which consists of doctors, engineers, scientists, surgeons-- and in particular, people who were studied in multiples of those fields.

The end goal of Neuralink is to create a low-cost non-invasive brain machine interface (BMI), which would allow you to basically access the internet by thought. Notable is that you would both send and receive messages that your brain could then directly interpret.

With your phone, you can access most of the world's knowledge at your fingertips. The catch with that is that it is a tad slow. You have to pull your phone out, type out words with two thumbs, have pages load slowly, that sort of thing. In this way, you can think of your phone as an extension of yourself, and yourself as a sort of clumsy cyborg.

The company isn't far. I believe I read somewhere that its current goals range on medical uses. Elon mentioned in the interview that they might have something to announce (not even necessarily a product) in a few months. He also uses one of his favorite terms-- it will be an order of magnitude better than anything currently thought possible (by the general public). It will likely be medical in nature and impressive, but not revolutionary.

Actual success is a long, long way off, and nothing Elon said in the interview suggests otherwise.

So that's the gist of the article. As for the actual interview.

Joe Rogan interviewed Elon Musk on his podcast recently, where they discussed lots of things (The Boring Machine, AI, Neuralink, Tesla, SpaceX-- those sorts of things.)

They spent about three hours talking about things, Elon and Joe had a cup of whiskey, Elon had a hit from a blunt, Joe a few hits-- the entire interview was a pretty casual thing. Not a product announcement, nothing like that.

Not at all like this particular technology editor made it out to be.

And that's about it. I have some links on actually interesting reading for this down below.

---

Some resources!


http://podcastnotes.org/2018/09/07/elon/ - Some notes about the interview, and good summary.

https://www.youtube.com/watch?v=ycPr5-27vSI - The actual interview, tad long. AI stuff is the first topic and ends at roughly 33 minute mark.

https://waitbutwhy.com/2017/04/neuralink.html - Article over Neuralink, explaining the company and goal from pretty simple beginnings. Easy to read, wonderfully explanatory.

https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/1501227742 - Superintelligence: Paths, Dangers, and Strategies. Covers artificial general intelligence, why it is a threat, and ways to handle it. Pretty much the entire goal of Neuralink is based on this book, and it's a very reasonable and quality book.

u/Horizivertigraph · 16 points · r/QuantumComputing

Don't get discouraged, it's possible to get to a reasonable understanding with some sustained effort. However, you need to get the following into your head as quickly as possible:

Popular level explanations of anything quantum are a waste of your time.

Go back and read that again. You will never get close to understanding the field if you rely on someone else managing to "find the right metaphors" for you. Quantum computing is a mathematical field, and if you want to understand a mathematical field, you need to do mathematics. This sounds super scary, but it's actually no problem! Math is not what you think it is, and is actually a lot of fun to learn. You just need to put some work in. This just means maybe doing an hour or so of learning every day before you go to work, or afterwards.

Let's look at a little bit of a roadmap that you can follow to get to a reasonable understanding of quantum computing / quantum information. This is pretty much the path I followed, and now I am just about to submit my PhD thesis on quantum computational complexity. So I guess it worked out OK.

  1. You can get really far in quantum computing with some basic understanding of linear algebra. Go to Khan Academy and watch their fantastic introduction.

    If Sal asks you to do an exercise, do the exercise.

  2. Now that you know what a vector is, can kind of grasp what a vector space is, and have some good intuition for how matrix-vector and matrix-matrix multiplication work, you can probably make a reasonable start on this great intro book: https://www.amazon.co.uk/Quantum-Computing-Computer-Scientists-Yanofsky/dp/0521879965

    Start from the start, take it slowly, and do all of the exercises. Not some of the exercises, do all of the exercises. If you don't know a term, then look it up on wikipedia. If you can't do an exercise, look up similar ideas on Google and see if you can muddle your way through. You need to get good at not being scared of mathematics, and just pushing through and getting to an answer. If there is an explanation that you don't understand, look up that concept and see if you can find somebody else's explanation that does it better. Do the first few intro chapters, then dip in to some of the other chapters to see how far you get. You want to get a pretty good coverage of the topics in the book, so you know that the topics exist and can increase your exposure to the math involved.

  3. If you manage to get through a reasonable chunk of the book from point 2), then you can make a start on the bible: Quantum Computation and Quantum Information by Nielsen and Chuang (https://www.amazon.co.uk/Quantum-Computation-Information-10th-Anniversary/dp/1107002176/ref=pd_lpo_sbs_14_img_1?_encoding=UTF8&psc=1&refRID=S2F1RQKXKN2268JJF3M2). Start from the start, take it slowly, and do all of the exercises.

    Nielsen and Chuang is not easy, but it's doable if you utilise some of the techniques I mention in point 2): Google for alternative explanations of concepts that the book explains in a way that confuses you, do all of the exercises, and try to get good coverage throughout the whole book. Make sure you spend time on the early linear algebra and basic quantum chapters, because if you get good at that stuff then the world is your oyster.

    Edit:

    Just remembered two more excellent resources that really helped me along the way:

    A) Quantum mechanics and quantum computation, a video lecture course by Umesh Vazirani (YouTube playlist here) is fantastic. Prof. Vazirani is one of the fathers of the field of quantum computing, with a bunch of great results. His lecture course is very clear, and definitely worth devoting serious attention to. Also, he has a wonderful speaking voice that is very pleasant to listen to...

    B) Another lecture course called "Quantum Computing for the determined", this time given by Michael Nielsen (YouTube playlist here). In my opinion Nielsen is one of the best scientific communicators alive today (see also his unrelated discourse on neural networks and machine learning, really great stuff), and this series of videos is really great. Communicating this sort of stuff well to non-practitioners is pretty much Nielsen's whole jam (he quit academia to go on and write about science communication), so it's definitely worth looking at.
u/Orthak · 3 points · r/mylittleandysonic1

Unity is the bee's knees.
I've been messing with it casually for several years, and got serious in the last 2-ish years. I like it because I get to use C#, and that's the language I know best. The only problem is it's using some weird limbo version of .NET 2 that's not actually 2.0 but is also 3.0 in some places? I think it's because it's using Mono 2.0, which is some subset of .NET. It's weird. They're moving to 4.5 soon anyway, so I'm hyped for that. It's been a lot of fun regardless; I get to apply a different knowledge and tool set from my day job. Not to mention it feels great when you actually get something to build and actually work.

So anyways here's a list of resources I've found over the years to be super helpful:

Things on Reddit

u/shred45 · 6 points · r/gatech

So, when I was younger, I did attend one computer science related camp,

https://www.idtech.com

They have a location at Emory (which I believe I did one year) that was ok (not nearly as "nerdy"), and one at Boston which I really enjoyed (perhaps because I had to sleep on site). That being said, the stuff I learned there was more in the areas of graphic design and/or system administration, and not computer science. They are also quite expensive for only 1-2 weeks of exposure.

I felt it was a good opportunity to meet some very smart kids though, and it definitely lead me to push myself. Knowing and talking to people that are purely interested in CS, and are your age, is quite rare in high school. I think that kind of perspective can make your interests and hobbies seem more normal and set a much higher bar for what you expect for yourself.

On the other side of things, I believe that one of the biggest skills in any college program is an openness to just figure something out yourself if it interests you, without someone sitting there with you. This can be very helpful in life in general, and I think was one of the biggest skills I was missing in high school. I remember tackling some tricky stuff when I was younger, but I definitely passed over stuff I was interested in just because I figured "that's for someone with a college degree". The fact is that experience will make certain tasks easier, but you CAN learn anything you want. You just may have to learn more of the fundamentals behind it than someone with more experience.

With that in mind, I would personally suggest a couple of things which I think would be really useful to someone his age, give him a massive leg up over the average freshman when he does get to college, and be a lot more productive than a summer camp.

One would be to pick a code-golf site (I like http://www.codewars.com) and simply try to work through the challenges. Another, much more math-heavy, option is https://projecteuler.net. This, IMO, is one of the best ways to learn a language, and I will often go there to get familiar with the syntax of a new language. I think he should pick Python and Clojure (or Haskell) and do challenges in both. Python is object-oriented, whilst Clojure (or Haskell) is functional. These are two very fundamental and interesting "schools of thought", and if he can wrap his head around both at this age, that would be very valuable.
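To give a feel for what those challenge sites ask of you, here is a sketch (mine, not the commenter's; the function names are made up for illustration) of Project Euler's famous first problem, written in Python in both of the styles mentioned above. The functional version hints at the declarative style that Clojure or Haskell would push much further.

```python
# Project Euler problem 1: sum the multiples of 3 or 5 below 1000.

def euler1_imperative(limit):
    # Imperative style: mutate an accumulator inside an explicit loop.
    total = 0
    for n in range(limit):
        if n % 3 == 0 or n % 5 == 0:
            total += n
    return total

def euler1_functional(limit):
    # Functional style: the same computation as one declarative expression,
    # with no mutation in sight.
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)

print(euler1_imperative(1000))  # 233168
print(euler1_functional(1000))  # 233168
```

Solving the same problem twice like this is a cheap way to feel the difference between the two schools of thought before committing to a second language.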

A second option, and how I really got into programming, is to do some sort of web application development. This is pretty light on the CS side of things, but it allows you to be creative and manage more complex projects. He could pick a web framework in Python (flask), Ruby (rails), or NodeJS. There are numerous tutorials on getting started with this stuff. For Flask: http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world. For Rails: https://www.railstutorial.org. This type of project could take a while, there are a lot of technologies which interact to make a web application, but the ability to be creative when designing the web pages can be a lot of fun.

A third, more systems level, option (which is probably a bit more opinionated on my part) is that he learn to use Linux. I would suggest that he install VirtualBox on his computer, https://www.virtualbox.org/wiki/Downloads. He can then install Linux in a virtual machine without messing up the existing OS (also works with Mac). He COULD install Ubuntu, but this is extremely easy and doesn't really teach much about the inner workings. I think he could install Arch. https://wiki.archlinux.org. This is a much more involved distribution to install, but their documentation is notoriously good, and it exposes you to a lot of command line (Ubuntu attempts to be almost exclusively graphical). From here, he should just try to use it as much as possible for his daily computing. He can learn general system management and Bash scripting. There should be tutorials for how to do just about anything he may want. Some more advanced stuff would be to configure a desktop environment, he could install Gnome by default, it is pretty easy, but a lot of people really get into this with more configurable ones ( https://www.reddit.com/r/unixporn ). He could also learn to code and compile in C.

Fourth, if he likes C, he may like seeing some of the ways in which programs which are poorly written can be broken. A really fun "game" is https://io.smashthestack.org. He can log into a server and basically "hack" his way to different levels. This can also really expose you to how Linux maintains security (user permissions, etc.). I think this would be a much more involved approach, but if he is really curious about this stuff, I think this could be the way to go. In a similar vein, he could watch talks from Defcon and the Chaos Computer Club. They both have a lot of interesting stuff on YouTube (it can get a little racy though).

Finally, there are textbooks. These can be really long, and kinda boring. But I think they are much more approachable than one might think. These will expose you much more to the "Science" part of computer science. A large portions of the classes he will take in college look into this sort of stuff. Additionally, if he covers some of this stuff, he could look into messing around with AI (Neural Networks, etc.) and Machine Learning (I would check out Scikit-learn for Python). Here I will list different broad topics, and some of the really good books in each. (Almost all can be found for free.......)

General CS:

  • Algorithms and Data Structures: https://mitpress.mit.edu/books/introduction-algorithms
  • Theory of Computation: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X
  • Operating Systems: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/0470128720

Some Math:

  • Linear Algebra: http://math.mit.edu/~gs/linearalgebra/
  • Probability and Stats: http://ocw.mit.edu/courses/mathematics/18-05-introduction-to-probability-and-statistics-spring-2014/readings/

I hope that stuff helps, I know you were asking about camps, and I think the one I suggested would be good, but this is stuff that he can do year round. Also, he should keep his GPA up and destroy the ACT.

u/dotslashzero · 2 points · r/Cplusplus

> Thanks for the comprehensive reply.

No problem.

> I'm already familiar with the basics of computer architecture, although I could certainly stand to know more. If you know of any good medium-level textbooks on that, I'd appreciate it.

I learned computer architectures using this book (but with an earlier edition): http://www.amazon.com/Computer-Architecture-Fifth-Edition-Quantitative/dp/012383872X. I think this is the same book they use at MIT.

> I realize that this is the case notionally, but I've come across posts in various places on the internet that claim compiler optimizers will sometimes inline recursive functions up to some arbitrary compiler-dependent depth so that a single stack creation can handle a recursion depth up to that arbitrary count, making the performance difference between recursion and iteration much less significant for relatively small scales. Then there's also tail recursion optimization where the recursion can go arbitrarily deep without needing to increase the stack.

You said it yourself twice:

  • up to that arbitrary count
  • can go arbitrarily deep

That is the thing: because it is arbitrary, the only way you will be able to tell is to look at the compiled code and check whether the recursion/iterative loop was optimized as you expect. Compilers have many levels of optimization. There is compile-time optimization, which we are all familiar with. There is also link-time optimization, where optimization happens during the link stage. There is also a technique performed by LLVM where the compiler creates an intermediate-language bitcode, then performs optimization at link time based on the generated bitcode. I am sure other optimization stages/techniques exist in other compilers. These stages use varying criteria to judge how the code will be optimized. You will need to check the resulting binary, built with the optimization switches of your choice, to see whether the compiler applied the technique you expect.

> The problem is that I don't have any authoritative sources on any of these optimizer techniques, so there might be lots of useful tricks that I'm not exploiting.

To be honest, many of these things are learned through experience. You are on the right track in being curious about what the compiler does (or specifically, how it optimizes code), but in most cases you will learn them through your own experiments (e.g. getting curious about whether solution y works better than solution x, trying it, then profiling and verifying through the internet forums).

> So I guess compiler documentation is where I need to look to find these things out?

Cannot say that it isn't, but I doubt compiler manuals/documentation will talk about these things in detail. It is your best bet, but most likely you will have to go through the source code of the compiler (if it is open source), or find white papers describing how such optimizations are achieved by the compiler in question.
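To make the "check, don't assume" point concrete: the thread is about C++, but the same hazard is easy to demonstrate in Python, which never performs tail-call elimination. This sketch (mine, for illustration; the function names are made up) shows a function that is tail-recursive in form yet still burns one stack frame per call, while the mechanical rewrite as a loop runs in constant stack.

```python
import sys

def sum_rec(n, acc=0):
    # Tail-recursive in form, but CPython does no tail-call elimination,
    # so each call still consumes a stack frame.
    if n == 0:
        return acc
    return sum_rec(n - 1, acc + n)

def sum_iter(n):
    # The mechanical rewrite of the tail call as a loop: O(1) stack.
    acc = 0
    while n > 0:
        acc += n
        n -= 1
    return acc

limit = sys.getrecursionlimit()
try:
    sum_rec(limit + 100)  # exceeds the interpreter's frame budget
    print("recursive version survived")
except RecursionError:
    print("no tail-call optimization here")

print(sum_iter(limit + 100))  # the loop handles any depth
```

In C or C++ the analogous experiment is to compile both versions and inspect the generated assembly; whether the recursive one becomes a loop depends on the compiler and the optimization flags, which is exactly why you have to look.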
u/hell_onn_wheel · 13 points · r/Python

Good on you for looking to grow yourself as a professional! The best folks I've worked with are still working on professional development, even 10-20 years in to their profession.

Programming languages can be thought of as tools. Python, say, is a screwdriver. You can learn everything there is about screwdrivers, but this only gets you so far.

To build something you need a good blueprint. For this you can study objected oriented design (OOD) and programming (OOP). Once you have the basics, take a look at design patterns like the Gang of Four. This book is a good resource to learn about much of the above

What parts do you specify for your blueprint? How do they go together? Study up on abstract data types (ADTs) and algorithms that manipulate those data types. This is the definitive book on algorithms, it does take some work to get through it, but it is worth the work. (Side note, this is the book Google expects you to master before interviewing)

How do you run your code? You may want to study general operating system concepts if you want to know how your code interacts with the system on which it is running. Want to go even deeper with code performance? Take a look at computer architecture. Another topic that should be covered is computer networking, as many applications these days don't work without a network.

What are some good practices to follow while writing your code? Two books that are widely recommended are Code Complete and Pragmatic Programmer. Though they cover a very wide range of topics (everything from organizational hacks to unit testing to user design), it wouldn't hurt to check out Code Complete at the least, as it gives great tips on organizing functions and classes, modules and programs.

All these techniques and technologies are just bits and pieces you put together with your programming language. You'll likely need to learn about other tools as well: other languages, debuggers, linters, optimizers; the list is endless. What helps light the path ahead is finding a mentor, someone who is well steeped in the craft and is willing to show you how they work. This is best done in person, watching someone design and code. Also spend some time reading the code of others (GitHub is a great place for this) and interacting with them on public mailing lists and IRC channels. I hang out on Hacker News to hear about the latest tools and technologies (many posts to /r/programming come from Hacker News). See if there are any local programming clubs or talks that you can join; it'd be a great forum to find yourself a mentor.

Lots of stuff here, happy to answer questions, but hope it's enough to get you started. Oh, yeah, the books, they're expensive but hopefully you can get your boss to buy them for you. It's in his/her best interest, as well as yours!

u/DiggyDog · 9 points · r/gamedev

Hey there, I'm a game designer working in AAA and I agree with /u/SuaveZombie that you'll probably be better off with a degree in CS. BUT... don't give up on wanting to be a designer!

 

You should realize that it's not giving up on your dream at all, in fact, it's great advice for how to reach that dream. A designer with an engineering background is going to have a lot more tools at their disposal than one who doesn't.

 

Design is way more than just coming up with a bunch of cool, big ideas. You need to be able to figure out all the details, communicate them clearly to your teammates, and evaluate how well they're working so you can figure out how to make something people will enjoy. In fact, working on a big game often feels like working on a bunch of small games that all connect.

Take your big game idea and start breaking it down into all the pieces that it will need to be complete. For example, GTA has systems for driving and shooting (among many other things). Look at each of those things as its own, smaller game. Even these "small" parts of GTA are actually pretty huge, so try to come up with something as small as possible. Like, super small. Smaller than you think it needs to be. Seriously! You'll eventually be able to make big stuff, but it's not the place to start. Oh, and don't worry if your first game(s) suck. They probably will, and that's fine! The good stuff you make later will be built on the corpses of the small, crappy games you made while you were learning.

 

If you're truly interested in design, you can learn a lot about usability, player psychology, and communication methods without having to shell out $17k for a degree. Same goes for coding (there are tons of free online resources), though a degree will help you get in the door at companies you might be interested in and help provide the structure to keep you going.

 

Here's some books I recommend. Some are specific to games and some aren't, but are relevant for anything where you're designing for someone besides yourself.

 

Universal Principles of Design

The Design of Everyday Things

Rules of Play

The Art of Game Design This and the one below are great books to start with.

A Theory of Fun This is a great one to start with.

Game Feel

• Depending on the type of game you're making, some info on level design would be useful too, but I don't have a specific book to recommend (I've found pieces of many books and articles to be useful). Go play through the developer commentary on Half-Life 2 or Portal for a fun way to get started.

 

Sounds like you're having a tough time, so do your best to keep a positive attitude and keep pushing yourself toward your goals. There's nothing to stop you from learning to make games and starting to make them on your own if that's what you really want to do.

Good luck, work hard!

u/raydenuni · 2 points · r/boardgames

> I also like these discussions! This is actually a subject of some interest to me, because people have been complaining about a lack of board game content as true critique rather than just more "consumer oriented" reviews.

If you're interested in the theory of games (not to be confused with game theory, which is an interesting type of math), then look into "ludology".

> Have you played My First Orchard?

I've never heard of it. But it sounds like you make choices and can get better at the game, much like tic-tac-toe, so I would call it a game. Interestingly enough, from a mathematical, complexity tree point of view, tic-tac-toe, checkers, and chess are also essentially equivalent. Some are more complex than others, but at the end of the day there's a branching tree of moves, you take turns moving through this tree, and at the end of some branches, a player wins. We consider tic-tac-toe to be trivial and not worth our time because our brains are able to solve it. But checkers and chess are just as theoretically solvable, we're just not smart enough.
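The "branching tree of moves" framing above is literal enough to run. A minimax sketch (my own illustration, not from the comment) can walk tic-tac-toe's entire game tree and confirm what our brains already solved: with perfect play from both sides, the game is a draw.

```python
from functools import lru_cache

# Boards are 9-character strings, '.' for empty, read row by row.
WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in WIN_LINES:
        if board[a] != '.' and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def solve(board, player):
    # Game value for X under perfect play: +1 X wins, 0 draw, -1 O wins.
    w = winner(board)
    if w:
        return 1 if w == 'X' else -1
    if '.' not in board:
        return 0  # board full, no winner: draw
    values = []
    for i, cell in enumerate(board):
        if cell == '.':
            child = board[:i] + player + board[i + 1:]
            values.append(solve(child, 'O' if player == 'X' else 'X'))
    # X picks the best value for X, O picks the worst.
    return max(values) if player == 'X' else min(values)

print(solve('.' * 9, 'X'))  # 0: perfect play from both sides is a draw
```

Checkers and chess are "just" this same tree, which is the sense in which they are theoretically equivalent; the trees are simply too large for us (and, for chess, for our computers) to exhaust.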

> What about a game where you roll 100d6 and your opponent rolls 100d6.

I guess. I'm a fan of extreme examples proving stuff. It's an easy way to see if your theory holds up or not. I'd argue that's a pretty poor game, but it's not technically any different from any number of games. Take MTG, for example: given a specific shuffle of each player's deck, you could say one person has a 100% chance to win. For most shuffles, though, it's probably a lot closer to 50%. In those cases, your choices matter.
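The 100d6-vs-100d6 example is easy to simulate, which makes the "no meaningful choices" property visible: in this sketch (mine, purely illustrative) neither player makes a single decision, so the outcome distribution is fixed by the dice alone, hovering near 50/50 with occasional ties.

```python
import random

def play_100d6(rng):
    # Each player rolls 100 six-sided dice; higher total wins. Note that
    # no decision is ever made by either player.
    a = sum(rng.randint(1, 6) for _ in range(100))
    b = sum(rng.randint(1, 6) for _ in range(100))
    return 'A' if a > b else 'B' if b > a else 'tie'

rng = random.Random(0)
results = [play_100d6(rng) for _ in range(10_000)]
for outcome in ('A', 'B', 'tie'):
    print(outcome, results.count(outcome) / len(results))
```

Contrast this with MTG: there, the shuffle fixes the hidden information, but the win probability still moves with every choice the players make.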

> Also, as for a game needing "goals," doesn't this eliminate many "sandbox" style games (whether video games or sandboxy narrative games like RPGs)?

It does. Sandbox style games are often considered toys and not games. Sim City has been famously described by Will Wright as a toy and not a game. If you're looking at the dressing instead of the content, a lot of not-games become games. Toys are fine. Toys are good. Nothing wrong with toys. There are a lot of cool toys where you learn a lot of really useful stuff and given self-imposed goals, you can learn stuff about them and reality. But they're not games.

> Also, I'm not sure if I agree that activities that aren't about learning are not fun? Can't something be fun because it's physical (e.g., thrill rides)? Because it's nostalgic?

This is potentially a weakness of the argument and might be enough to prove my radical stance false. But the idea is that all of those things involve learning of some sort. It starts to blur the lines between learning and experiencing new things though for sure.

> Finally, is there nothing to say about the fact that S&L is routinely referred to as a game?

There is. And words are used differently in different contexts, with different people, to mean different things. If we're just speaking colloquially, then yeah, anything on the same shelf at Target can be considered a game. But if we're using these terms to actually mean something so we can have an intellectual, academic discussion about games, it's useful to differentiate between toys and competitions and games. If our terminology can't distinguish between Chess and Lego in some definitive manner, we're going to have trouble coming up with any interesting conclusions. I don't mean to assign any quality to the term game, there are a lot of really great non-games out there. We already differentiate between different types of board games. People will often refer to something as multi-player solitaire. Could we not refer to these as competitions? And of course when speaking colloquially, it doesn't really do to categorize Dominion as a card based competition instead of a card-game.

Raph Koster's blog has a bunch of good content: https://www.raphkoster.com/ But I would start with his book A Theory of Fun. Apparently there are some PDFs here and here, 10 years later. It's a super easy-to-read book, but really insightful. I highly recommend it. It looks like maybe the PDFs are a subset of the book. Let me know what you think.

u/RoguelikeDevDude · 4 points · r/gamedev

Book suggestions? Now that's my jam.

Out of all the books i've read, here are my recommendations regarding game programming:

Eric Lengyel's books (only one out so far). This is aimed at game engine development, but if the 2nd onward are as in-depth as the first, they will be amazing fundamental knowledge. Also, they're not thick, and jam-packed with information.

Game Programming Patterns. The only book that comes more recommended than this is the one right below it by Jesse Schell. This book is fantastic, but you should write one or two small games to really get the most out of this book. You can also read it online on his website free, but then you don't get a pic of him and his dog on the back cover.

Book of Lenses. This is your intro/intermediate dive into game design. There are a lot of game design books, if you only read one, it should be this one.

Game AI By Example. This book is a hodgepodge of fantastic techniques and patterns by those in AAA. There are other books in the series (like Game AI Pro) which are similar, but in my opinion (at least when I read AI Pro 3), they're not as good. But more knowledge is never bad.

Truthfully, as I sit here looking over all my books, those are the only ones I'd consider mandatory for any seasoned developer. Of course plenty of developers get by without reading these books, but they likely pick up all the principles listed herein elsewhere, in bits and pieces, and would likely have benefited from reading them early on.

Here are a few others that I do recommend but do NOT consider mandatory. Sorry, no links.

Unity in Action. Personally, I recommend this or a more interactive online course version (udemy.com/unitycourse) if you want to learn Unity while having a resource hold your hand. Having read the book, taken the course, AND taken Unity's own tutorials on the matter, I'd rank the course best, the book second, and the videos from Unity third. But none of them are bad.

Game Engine Architecture. This is the king for those who want a very broad introduction to making a game engine. It comes highly recommended by nearly anyone who reads it, just so long as you understand it's from a AAA point of view. Game Coding Complete is out of print and unlikely to be revisited, but it is similar. These are behemoths of books.

Real-Time Rendering. This is one I haven't read, but it comes very highly recommended. It is not an intro book, and is also over 1000 pages, so you want this alongside a more introductory book like Fundamentals of Computer Graphics. Truth be told, both books are used in courses at the third- and fourth-year university levels, so keep that in mind before diving in.

Clean Code. Yeah yeah, it has a Java expectation, but I love it. It's small. Read it if you understand Java and want to listen to one of the biggest preachers on how not to write spaghetti code.

Rimworld guy, Tynan Sylvester I believe, wrote a book called Designing Games. I enjoyed it, but IMO it doesn't hold a candle to Jesse Schell's book. Either way, the guy did write that book after working in AAA for many years, then went on to create one of the most successful sim games in years. But yeah, I enjoyed it.

Last but not least, here are some almost ENTIRELY USELESS but interesting diagrams of what some people think you should read or learn in our field:

https://github.com/miloyip/game-programmer

https://github.com/utilForever/game-developer-roadmap

https://github.com/P1xt/p1xt-guides/blob/master/game-programming.md

u/coned88 · 1 point · r/linux

While being a self-taught sys admin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares. It's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sys admin, but it may quench your thirst. I'm both a programmer and a unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.

OS Internals

While you obviously are successful at running and maintaining unix-like systems, how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System

Linux Kernel Development
Advanced Programming in the UNIX Environment

Networking

Learning the actual function of networking at the code level is really interesting. There's a whole other world below implementation. You likely know a lot of this.
Computer Networks

TCP/IP Illustrated, Vol. 1: The Protocols

Unix Network Programming, Volume 1: The Sockets Networking API

Compilers/Low Level computer Function

Knowing how a computer actually works, from electricity, to EE principles, through assembly to compilers, may also interest you.
Code: The Hidden Language of Computer Hardware and Software

Computer Systems: A Programmer's Perspective

Compilers: Principles, Techniques, and Tools

u/fajitaman · 4 points · r/learnprogramming

The usual advice is "get out and program!" and that works, but it can be very tricky coming up with something to write that's also satisfying. The idea is that you learn best by doing, and that many topics in programming can't really be learned without doing. All that stuff is true and I'm not denying that at all, but some of us need more. We need something juicier than spending hours configuring a UI for a project we couldn't care less about. It shouldn't be an exercise in masochism.

I guess what I'm saying is that there are a lot of ways to learn to write code, and books are great if you can really sink your teeth into them (a lot of people can't). Code Complete is a great book on the practice of programming. You also say that you "get" OO pretty well, but it might open your eyes to read up on design patterns (e.g., Head First Design Patterns). You have a long way to go before you really get it.

In addition to those, you could delve deeper into your languages of choice. There's no way around JavaScript if you're a web programmer, and a book like JavaScript: The Good Parts is pretty enlightening if you've got some experience in JavaScript already. It's a pretty interesting and unusual language.

But sometimes programming is about building gumption, so instead of just being practical, try to figure out what you like about computers and keep going deeper into it. If you have an interest in computer science and not in just building apps, then something like Structure and Interpretation of Computer Programs could instill in you an enthusiasm for computers that trickles down to everything else you do. If you're more interested in web design, there are probably similarly interesting books on artistic design principles.

I think what I'm ultimately saying is that you should find what you enjoy doing and just go deeper down the rabbit hole, getting your hands dirty when it's appropriate and interesting.

u/sarpunk · 3 points · r/learnprogramming
  • I second the other comments about practice & sticking with projects. Perfectionism can be a great thing, but if it keeps you from finishing a project, let it go. The first iterations of your projects don't have to be perfect - just getting through them will help you grow.

  • Procrastinating on homework assignments will also tank your grade (been there, done that), even if the material seems easy - some programming assignments just take loads of time.

  • It sounds like you're still in school, so you'll probably be exposed to lots of different languages and paradigms, and that's a good thing. If you're going to insist on perfection in personal projects, though, it might be easiest to focus on one area, like halfercode suggested.

  • Finally, for reading material: It sounds like you don't need any basic intros, so look for advanced tutorials to new languages you want to learn, or just read the language documentation. This is a pretty good competency matrix to rate yourself against - if something looks unfamiliar, browse through the wiki page. Other great books: Computer Systems: A Programmer's Perspective - doesn't assume a ton of prior knowledge, but gets to a fair amount of depth pretty quickly. There are also really cool systems programming labs. Matt Might's list of everything a CS major should know is really comprehensive, with lots of reading material referenced. If I were you, I would focus specifically on the Data Structures & Algorithms and Theory sections, supplementing with practical projects.

  • As for projects: Start small, no matter the final size of the project. Focus on getting out a minimal example of what you want to do before you worry about what the UI looks like or perfect functioning.

    tl;dr: Practice & perseverance are the main points. No one is really any good at programming until they've got a few years of churning out code, so don't get discouraged. Finally: don't let the breadth of the computer science/software world overwhelm you. Focus on small pieces, and in a few years you'll have learned more than you would have expected.
u/xzieus · 1 point · r/uvic

At UVic, I think there are security specializations for degrees such as the MTIS or the Computer Science Options (such as Network Security -- although I did the Software Engineering option for C.Sc. in my undergrad)

I focused on taking classes, but I did a LOT of my own (legal) research/projects. That "legal" caveat is IMPORTANT. Don't get arrested for a hobby, it doesn't achieve your goal, and it's not worth it. Do things the right way, don't trespass or break the law.

Most of the government cyber defense jobs are in Ontario -- so expect to have to move there if you want to work with them. I hear there are ... "sites" ... elsewhere, but realistically you would have to "do your time" there before anything like that became available.

Business and Finance classes are always a good idea -- not just for business but personal benefit. My wife is an accountant and those skills are really helpful to have for our daily/monthly/etc finances.

Advice

  • You have to "shoot straight" when it comes to security. Gone are the days when someone hacks the FBI and gets offered a job; now they just arrest you, and you stay arrested. It makes sense: why incentivise it? Don't do anything that might even be construed as illegal. (That said, there's an argument that security education can become too "academic", forgetting that people actually have to work on practical aspects -- but that's outside the scope of this conversation.)
  • There are plenty of projects such as OWASP Broken Web App, classes like Elec 567 at UVic, or just learn how to make your own VMs and attack them locally (the best route -- then you can control what's installed, with a fine-tooth comb) -- this also helps test new patches, etc to see if the software is vulnerable.
  • Read. Lots. Subscribe to blogs, order books (I am partial to Hacking: The Art of Exploitation (pretty low level, but helps you understand what is going on under the hood) and Violent Python (more of a cookbook/handbook)), and read up on security news. Rule of thumb: read at least 2 new security books every year. It gets easy when you have a dedicated app for security podcasts, an RSS feed, and a book or two with you all the time.
  • When interviewing for government security jobs, don't lie to them. If they ask whether you have smoked pot, tell them if you did. They are looking for truthfulness.
  • Look at open source projects where you can contribute (general coding advice, but it helps). It doesn't have to be the Linux kernel, just work on something that isn't an assignment/project from school.
  • Learn who the big players are in security -- Like everything on the internet, there is lots of talk. Find the people who actually know what they are talking about and listen to them. Take EVERYTHING (including this post) with a grain of salt! The classic motto is "Trust but verify". This applies to everything. The security industry is ... interesting ... Think of it as a cross between the mafia (Pay us for protection ... or else), "tinfoil hattiness" (comes with the territory -- you see a lot more than the average person, so it skews your view on certain subjects... not all of which you can even talk about), and the classic balance between privacy and security (ranging from surveillance state to anarchy) ... Politics play a HUGE part.
  • Always be learning. Show this to prospective employers. Don't just talk, do.


    Sorry, this turned into a bit of an essay. I'm just one opinion out there, but hopefully you get something out of this. As always, "trust but verify".

    [edit: a word]
u/Goliathvv · 3 pointsr/DestinyTheGame

From A Theory of Fun for Game Design by Raph Koster:

> Human beings are all about progress. We like life to be easier. We’re lazy that way. We like to find ways to avoid work. We like to find ways to keep from doing something over and over. We dislike tedium, sure, but the fact is that we crave predictability. Our whole life is built on it. Unpredictable things are stuff like drive-by shootings, lightning bolts that fry us, smallpox, food poisoning—unpredictable things can kill us! We tend to avoid them. We instead prefer sensible shoes, pasteurized milk, vaccines, lightning rods, and laws. These things aren’t perfect, but they do significantly reduce the odds of unpredictable things happening to us.
>
> And since we dislike tedium, we’ll allow unpredictability, but only inside the confines of predictable boxes, like games or TV shows. Unpredictability means new patterns to learn, therefore unpredictability is fun. So we like it, for enjoyment (and therefore, for learning). But the stakes are too high for us to want that sort of unpredictability under normal circumstances. That’s what games are for in the first place—to package up the unpredictable and the learning experience into a space and time where there is no risk.
>
> The natural instinct of a game player is to make the game more predictable because then they are more likely to win.
>
> This leads to behaviors like “bottom-feeding,” where a player will intentionally take on weaker opponents under the sensible logic that a bunch of sure wins is a better strategy than gambling it all on an iffy winner-take-all battle. Players running an easy level two hundred times to build up enough lives so that they can cruise through the rest of the game with little risk is the equivalent of stockpiling food for winter: it’s just the smart thing to do.
>
> This is what games are for. They teach us things so that we can minimize risk and know what choices to make. Phrased another way, the destiny of games is to become boring, not to be fun. Those of us who want games to be fun are fighting a losing battle against the human brain because fun is a process and routine is its destination.
>
> So players often intentionally suck the fun out of a game in hopes they can learn something new (in other words, find something fun) once they complete the task. They’ll do it because they perceive it (correctly) as the optimal strategy for getting ahead. They’ll do it because they see others doing it, and it’s outright unnatural for a human being to see another human being succeeding at something and not want to compete.
>
> All of this happens because the human mind is goal driven. We make pious statements like “it’s the journey, not the destination,” but that’s mostly wishful thinking. The rainbow is pretty and all, and we may well enjoy gazing at it, but while you were gazing, lost in a reverie, someone else went and dug up the pot of gold at the end of it.
>
> Rewards are one of the key components of a successful game activity; if there isn’t a quantifiable advantage to doing something, the brain will often discard it out of hand.(...)

u/invictus08 · 2 pointsr/flask

First of all, applause for the great start.

Here are some criticisms/suggestions I would like to offer. Keep in mind, I am not assuming your level/experience as a software developer:

  1. Keep functions small. Most of the functions you have written are lengthy because of the SQL statements, which brings me to my second point.

  2. Separate business logic, application code, data storage related stuff, etc. Keep things modular. That separation is important because you want things to be maintainable and reusable. Your code should be open for extension, but closed for modification. If that does not make sense to you, that's perfectly fine, just start from this

  3. On that note, since you are using flask, might I suggest using flask-sqlalchemy instead of sqlalchemy? You may like it better. I know you have mentioned

    > I force myself to write raw SQL Request to get better with SQL

    while that is commendable, it is not really a good idea to write raw SQL in production code when ORM library alternatives are available. Remember, it's not always you who will read/modify the code. ORM syntax is fairly universal, but your style of writing SQL may differ starkly from other people's -- which is what creates confusion and lets errors sneak in. Even if you do want raw SQL, keep it in separate modules (point 2).

  4. Instead of computing everything and then sending the result along with the page, maybe create api endpoints for specific sections; render page with bare minimum info and from the webpage make multiple calls to update the page sections when required. This way, it will be far more responsive, user will not be waiting for you to finish all the computation and if you detect any change in any section of the page, you can just update that particular section with an appropriate api call, thereby avoiding a whole page reload. Design choices.

  5. PEP8. You don't have to blindly follow every rule - just make sure you understand why those rules are there, and that if you are breaking any, you know that it is absolutely necessary for accomplishing what you want. Again, what you want may not always be what you actually need - so be really careful.

  6. This is something I wish I knew earlier - Design Patterns. Without going into much details, I would recommend reading these books to start with and really understand instead of memorizing:
  7. Documentation is also important. Follow the good practices there. A remarkable reference is Kenneth Reitz's Requests library.

    Finally, remember that all these are just suggestions, and you may already know them. You will decide which ones to take and which ones to leave behind based on your situation.

    Again, great job (I also learnt something from this). Just make sure you keep running.
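To make points 1-3 above concrete, here's a minimal sketch of separating the data-access layer from the business logic, using stdlib sqlite3 (an ORM like flask-sqlalchemy takes this further, but the structure is the point; all names here are hypothetical):

```python
import sqlite3

# --- data access layer: the only module that touches SQL ---
class UserStore:
    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)"
        )

    def add(self, name):
        cur = self.conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
        self.conn.commit()
        return cur.lastrowid

    def find_by_name(self, name):
        rows = self.conn.execute(
            "SELECT id, name FROM users WHERE name = ?", (name,)
        ).fetchall()
        return [{"id": r[0], "name": r[1]} for r in rows]

# --- business logic: no SQL in sight, and small functions ---
def register_user(store, name):
    if store.find_by_name(name):
        raise ValueError("name taken")
    return store.add(name)

store = UserStore(sqlite3.connect(":memory:"))
uid = register_user(store, "alice")
```

The business logic only talks to the store's interface, so swapping raw SQL for an ORM later means touching one module, not every route.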
u/CrimsonCuntCloth · 4 pointsr/learnpython

Depending on what you want to learn:

PYTHON SPECIFIC

You mentioned building websites, so check out the flask mega tutorial. It might be a bit early to take on a project like this after only a month, but you've got time and learning-by-doing is good. This'll teach you to build a twitter clone using python, so you'll see databases, project structure, user logons etc. Plus he's got a book version, which contains much of the same info, but is good for when you can't be at a computer.

The python cookbook is fantastic for getting things done; gives short solutions to common problems / tasks. (How do I read lines from a csv file? How do I parse a file that's too big to fit in memory? How do I create a simple TCP server?). Solutions are concise and readable so you don't have to wade through loads of irrelevant stuff.
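For flavor, here are two of those cookbook-style tasks sketched out (the data is inlined via io.StringIO just so the snippet is self-contained; normally you'd use open(path)):

```python
import csv
import io

# Reading rows from a CSV file as dictionaries keyed by the header row.
data = io.StringIO("name,score\nalice,10\nbob,7\n")
rows = list(csv.DictReader(data))

# Parsing a file too big to fit in memory: iterate lazily, line by line,
# instead of calling read() on the whole thing.
def count_matching(lines, needle):
    return sum(1 for line in lines if needle in line)

big_file = io.StringIO("error: x\nok\nerror: y\n")
errors = count_matching(big_file, "error")
```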

A little while down the road, if you feel like going deep, fluent python will give you a deeper understanding of Python than many of the people you'll encounter, at uni and after you're out.

WEB DEV

If you want to go more into web dev, you'll also need to know some HTML, CSS and Javascript. Duckett's books don't go too in depth, but they're beautiful, a nice introduction, and a handy reference. Once you've got some JS, Secrets of the javascript ninja will give you a real appreciation of the deeper aspects of JS.

MACHINE LEARNING
In one of your comments you mentioned machine learning.

These aren't language specific programming books, and this isn't my specialty, but:

Fundamentals of Machine Learning for Predictive data analytics is a great introduction to the entire process, based upon CRISP-DM. Not much of a maths background required. This was the textbook used for my uni's first data analytics module. Highly recommended.

If you like you some maths, Flach will give you a stronger theoretical understanding, but personally I'd leave that until later.

Good luck and keep busy; you've got plenty to learn!

u/autisticpig · 1 pointr/Python

If you were serious about wanting some deep as-you-go knowledge of software development but from a Pythonic point of view, you cannot go wrong with following a setup such as this:

  • learning python by mark lutz
  • programming python by mark lutz
  • fluent python by luciano ramalho

    Mark Lutz writes books about how and why Python does what it does. He goes into amazing detail about the nuts and bolts all while teaching you how to leverage all of this. It is not light reading and most of the complaints you will find about his books are valid if what you are after is not an intimate understanding of the language.

    Fluent Python is just a great read and will teach you some wonderful things. It is also a great follow-up once you have finally made it through Lutz's attempt at out-doing Ayn Rand :P

    My recommendation is to find some mini projecting sites that focus on what you are reading about in the books above.

  • coding bat: a great place to work out the basics and play with small problems that increase in difficulty
  • code eval: set up as challenges, starting with the classic fizzbuzz
  • codewars: single problems to solve that start basic and increase in difficulty. There is a fun community here, and you have to pass a simple series of questions to sign up (knowledge baseline)
  • new coder: walkthroughs on building some fun stuff with a very gentle and friendly learning curve. Some real-world projects are tackled.

    Of course this does not answer your question about generic books. But you are in /r/Python and I figured I would offer up a very rough but very rewarding learning approach if Python is something you enjoy working with.

    Here are three more worth adding to your ever-increasing library :)

  • the pragmatic programmer
  • design patterns
  • clean code
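Since fizzbuzz comes up above as the classic starter challenge, here it is as a quick Python warm-up:

```python
def fizzbuzz(n):
    # Multiples of 3 -> "Fizz", of 5 -> "Buzz", of both -> "FizzBuzz".
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

print(" ".join(fizzbuzz(i) for i in range(1, 16)))
```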

u/Shadow-Master · 1 pointr/gamedev

Don't be suckered by a "Game Design" program. There are VERY few good ones. Most of them....as in, 99% of them...are rip-offs.

Learn programming, 3D-modeling, or animation. Pick one that you're more interested in and then full-speed ahead. These will make you useful in more than just game development roles, thus helping you in the future when you have trouble landing a game dev job. At least you'll still be doing something you like in the meantime, and still building your skill in that area. Many really popular game designers have specialties outside of just "Design". Some are excellent programmers, some are artists, some have excellent business skills (really good at project management), and some are brilliant story-writers. Most game design positions are not entry-level, because you REALLY have to know what you are doing, before someone will trust you enough to let you touch the design. The only real way to prove that you are actually a good game designer is by having games to show off. That proves that you have some idea of the design process and know how to maintain a game from start to finish. This is HARD.

Some like to say that these degree programs for game design help them by giving them the incentive to push through and finish their stuff, otherwise, they might not have the motivation. Well, that's very problematic, because that means that you will not be the type of person who can finish a game. Game development requires you to be highly self-driven.

Most of what "Game Design" programs teach you can be learned by picking up a few game design books and making your own games (alot of them, too). Game design is learned by making games, not by having a professor tell you about it. You have enough mentors in the game development community already. They will always be there to critique what you do and give you tips on how to improve your work. Pick up a couple of books like The Art of Game Design and Designing Games. You can look at other books in whatever other area you want to master and just get started on making games. Turn off your console and just get started. Start small. Make very simple, basic games to start off with (B.A.S.I.C.). It's about learning the process first. Do that while reading a ton of highly-detailed game postmortems online. Just learn the process. THAT will be your real education.

And finally, start working your way up to putting together a portfolio. Portfolios speak much louder than a resume (although, a resume is still important). And that doesn't mean having a bunch of "Game Design docs". Games. Not docs. Games. Then build up your confidence and hook up with a team, so you can fight your way together to the end of making a complete game. (this may be one of the only valuable things that a game design program can provide you out of the box, i.e., a team that you are forced to work on a game with)

The single most important tool you will ever have is discipline. No degree will be able to top that. Give up the idea of being a hardcore gamer, because you are now going to need to become a VERY disciplined person. You're going to need it.

Finally: Don't forget to have fun. Good luck! :)

u/RhoTheory · 33 pointsr/MachineLearning

Grad school for machine learning is pretty vague, so here's some general resources I think would be good for an incoming CS grad student or undergraduate CS researcher with a focus on deep learning. In my opinion, the courses you mentioned you've done should be a sufficient foundation to dive into deep learning, but these resources cover some foundational stuff as well.

  • Kaggle is for machine learning in general. It provides datasets and hardware. It has some nice tutorials and you can look at what other people did.
  • Google has an online crash course on Machine Learning.
  • Hands-On Machine Learning with Scikit-learn and Tensorflow is a great book for diving into machine learning with little background. The O'Reilly books tend to be pretty good.
  • MIT Intro to Deep Learning provides a good theoretical basis for deep learning specifically.
  • MIT Intro to AI. This is my favorite online lecture series of all time. It provides a solid foundation in all the common methods for AI, from neural nets to support vector machines and the like.
  • Tensorflow is a common framework for deep learning and provides good tutorials.
  • Scikit-learn is a framework for machine learning in python. It'd be a good idea to familiarize yourself with it and the algorithms it provides. The link is to a bunch of examples.
  • Stanford's deep learning tutorial provides a more mathematical approach to deep learning than the others I've mentioned--which basic vector calc, linear algebra, and stats should be able to handle.
  • 3Blue1Brown is a math youtuber that animates visual intuitions behind many rather high-level concepts. He has a short series on the math of neural networks.
  • If you are going to be dealing with hardware for machine learning at all, this paper is the gold standard for everything you'd need to know. Actually, even if you aren't dealing with the hardware, I'd recommend you look at the sections on software. It is fairly high level, however, so don't be discouraged if you don't get some of it.
  • Chris Olah's Blog is amazing. His posts vary from explanations of complex topics very intuitively to actual research papers. I recommend "Neural Networks, Manifolds, and Topology".
u/jchiu003 · 1 pointr/OkCupid

Depends on how old you are.

  • Middle school: I really enjoyed this, this, and this, but I don't think I can read those books now (29) without cringing a little bit. Especially Getting Things Done, because I already know how to make to-do lists, but I still flip through all 3 books occasionally.

  • High school: I really enjoyed this, this, and this, but if you're a well adjusted human and responsible adult, then I don't think you'll find a lot of helpful advice from these 6 books so far because it'll be pretty basic information.

  • College: I really enjoyed this, this, and started doing Malcolm Gladwell books. The checklist book helped me get more organized and So Good They Can't Ignore You was helpful starting my career path.
  • Graduate School: I really enjoyed this, this, and this. I already stopped with most "self help" books and reading more about how to manage my money or books that looked interesting like Stiff.

  • Currently: I'm working on this, this, and this. Now I'm reading mostly for fun, but all three of these books are way out of my league and I have no idea what they're talking about -- they're just areas of interest for me: history and AI.
u/reddilada · 1 pointr/learnprogramming

Being fond of problem solving is a good indicator. Problem solving and executing a solution is essentially what programming is all about in the end. Pretty much any engineering degree for that matter. The good news is most STEM courseware is pretty much the same the first couple of years of college so you won't really have to commit straight away. Your classes will apply to multiple degree paths and having a few intro compsci courses under your belt will help in literally any major.

A computer science degree is (should be) geared to problem solving more than learning to write code. Writing code is the easy bit and the tech changes so quickly it is something best learned on the fly. You will be taking tons of math, studying algorithms, data structures, learning to play well with others -- that sort of thing.

Being fond of computers alone can lead one astray. The classic example is that liking listening to music doesn't necessarily lead to liking making music.

The Harvard cs50x extension course will give you a straight up taste of what an intro to CS class will be like in university. The pace is fast so fair warning.

A good armchair book is CODE. Nice overview of how computers compute.

It's a great career choice IMO. I've been at it for a long long (long) time with zero regrets. Alongside getting to play with all the shiny bits, you get a constant supply of feel-good moments when you see your work actually doing something in the wild and impacting people's lives in a positive way.

u/KenFlorentino · 3 pointsr/gamedev

Fellow enterprise developer turned manager here. Me and my cohort are about to release our first title. It was developed using .NET/C#.

AMA. :)

I'll start with the questions you have above.

Assuming you already have a solid foundation in OOP, Design Patterns, some basic RDBMS, etc, you actually already have 60% of what you need. Code is code.

The other 40% depends on the type of game you are making. 2D? Basic algebra. 3D? Now it gets tougher on the math (though thankfully today's engines do most of the heavy lifting for you, but you still need to understand what is used for what).

Doing multi-player? Now networking is the tricky part because you are likely to use some sort of UDP communication layer and all the REST/SOAP you learned at work, while still useful for managing latency-agnostic stuff like player lists, matchmaking requests and such, won't cut it for real-time multi-player games. Writing solid "netcode" that delivers a great experience at 60+ FPS requires some creativity in managing perception (extrapolation and interpolation when latency is present) and fault-tolerant algorithms. It is no fun when you get a headshot in an FPS, see it happen, but your opponent runs away, apparently unscathed.
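A toy sketch of the interpolation idea mentioned above: the client renders remote players slightly in the past, blending between the last two snapshots it received from the server (the timings and structure are illustrative, not from any particular engine):

```python
def interpolate_position(older, newer, render_time):
    """Blend two (timestamp, x) snapshots to estimate x at render_time."""
    t0, x0 = older
    t1, x1 = newer
    if t1 == t0:
        return x1
    # Clamp alpha to [0, 1] so we never extrapolate past the newest snapshot.
    alpha = max(0.0, min(1.0, (render_time - t0) / (t1 - t0)))
    return x0 + (x1 - x0) * alpha

# Server snapshots arrive at ~20 Hz; the client renders at 60 FPS,
# a little in the past, to hide network jitter between snapshots.
snap_a = (0.00, 10.0)   # at t=0.00s, x=10
snap_b = (0.05, 12.0)   # at t=0.05s, x=12
pos = interpolate_position(snap_a, snap_b, render_time=0.025)
```

Extrapolation (guessing ahead when a snapshot is late) uses the same two snapshots but projects forward, which is where the "headshot that didn't count" artifacts come from.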

As far as graphics, I solved that one easily... I had a friend join my project who was the graphics guy. I provided the framework for doing the graphics and turned that area over to him. He went above and beyond though and learned shaders and added all sorts of special effects.

Meanwhile, I focused my energy on the game engine, networking layers, AWS cloud stuff, matchmaking and lots of behind the scenes stuff.

The other thing I did was read as much as possible about Game Design. I ordered a dozen books from Amazon, including my absolute favorite Designing Games by Tynan Sylvester, the developer of RimWorld (link: https://www.amazon.com/Designing-Games-Guide-Engineering-Experiences/dp/1449337937).

Hope that helps!



u/cybrbeast · 19 pointsr/Futurology

This was originally posted as an image but got deleted because picture posts are not allowed -- an irrelevant reason in this case, IMO, since it was all about the text. We had an interesting discussion going: http://www.reddit.com/r/Futurology/comments/2mh0y1/elon_musks_deleted_edge_comment_from_yesterday_on/

I'll just post my relevant contributions to the original to maybe get things started.



---------------------------

And it's not like he's saying this based on his opinion after a thorough study online like you or I could do. No, he has access to the real state of the art:

> Musk was an early investor in AI firm DeepMind, which was later acquired by Google, and in March made an investment in San Francisco-based Vicarious, another company working to improve machine intelligence.

> Speaking to US news channel CNBC, Musk explained that his investments were, "not from the standpoint of actually trying to make any investment return… I like to just keep an eye on what's going on with artificial intelligence. I think there is potentially a dangerous outcome there."

Also, I love that Elon isn't afraid to speak his mind like this. I think it may well have been PR or the boards of his companies that reined him in here. In television interviews he is so open and honest, too -- too bad he didn't speak those words there.

----------------------------

I'm currently reading Superintelligence which is mentioned in the article and by Musk. One of the ways he describes an unstoppable scenario is that the AI seems to function perfectly and is super friendly and helpful.

However, on the side it's developing micro-factories which can self-assemble from a specifically coded string of DNA (this is already possible to a limited extent). These factories then use their coded instructions to multiply and spread, and then start building enormous amounts of nanobots.

Once critical mass and spread are reached, they could near-instantly wipe out humanity with some kind of poison/infection. The AI isn't physical, but the only thing it needs in this case is to place an order with a DNA printing service (they exist) and have it mailed to someone it has manipulated into adding water and nutrients and releasing the DNA nanofactory.

If the AI explodes in intelligence as predicted in some scenarios, this could be set up within weeks/months of it becoming aware. We would have nearly no chance of catching it in time. Bostrom gives the caveat that this is only a viable scenario he could dream up; a superintelligence should by definition be able to devise far more ingenious methods.

u/FearMonstro · 3 pointsr/compsci

Nand to Tetris (coursera)

the first half of the book is free. You read a chapter, then you write programs that simulate hardware modules (like memory, the ALU, registers, etc). It's pretty insightful, giving you a richer understanding of how computers work. You could benefit from just the first half of the book. The second half focuses on building assemblers, compilers, and then a Java-like programming language. From there, it has you build a small operating system that can run programs like Tetris.
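To give a flavor of those exercises: the book has you describe chips like the ALU in its own HDL; here's the same idea as a rough software model in Python (a sketch of the book's control-bit scheme, not its HDL):

```python
MASK = 0xFFFF  # model 16-bit buses

def alu(x, y, zx, nx, zy, ny, f, no):
    """Software model of a Nand2Tetris-style ALU:
    six control bits select which function of x and y comes out."""
    if zx: x = 0              # zero the x input
    if nx: x = ~x & MASK      # bitwise-negate the x input
    if zy: y = 0              # zero the y input
    if ny: y = ~y & MASK      # bitwise-negate the y input
    out = (x + y) & MASK if f else (x & y)   # f=1: add, f=0: and
    if no: out = ~out & MASK  # negate the output
    return out

add = alu(7, 2, 0, 0, 0, 0, 1, 0)  # x + y
sub = alu(7, 2, 0, 1, 0, 0, 1, 1)  # x - y, via negate/add/negate
```

In the course you build the same thing out of NAND gates; seeing that subtraction falls out of negate/add/negate is one of the fun moments.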

Code: The Hidden Language of Hardware and Software

This book is incredibly well written. It's intended for a casual audience and will guide the reader to understanding how a microcontroller works, from the ground up. It's not a textbook, which makes it even more impressive.

Computer Networking Top Down Approach

one of the best-written textbooks I've read. Very clear and concise language. This will give you a pretty good understanding of modern-day networking. I appreciated that the book is filled to the brim with references to other books and academic papers for a more detailed look at subtopics.

Operating System Design

A great OS book. It actually shows you the C code used to design and implement the Xinu operating system. It's written by a Purdue professor. It offers a top-down look but backs everything up with C code, which really solidifies understanding. The Xinu source code can be run on emulators or real hardware for you to tweak (and the book encourages that!)

Digital Design Computer Architecture

another good "build a computer from the ground up" book. The strength of this book is that it gives you more background on how real-life circuits are built (it uses VHDL and Verilog) and provides a nice overview chapter on transistor design. A lot less casual than the Code book, but easily digestible for someone who appreciates this stuff. It culminates in designing and describing a microarchitecture that implements a MIPS processor. The diagrams in this book are really nice.

u/adventuringraw · 9 pointsr/MachineLearning

dude, ten hours of intro that can help you intuitively navigate relevant research questions when jumping into the actual research is completely fine and appropriate. You're welcome to your opinion, but a roadmap is all the more helpful when the challenge of Arxiv for a beginner is the double whammy of finding 'worthwhile papers' to read in the first place (citation count? Topic? Survey papers? Which papers are most important to start with?) along with the timesink of parsing even a single individual paper. Concept learning in deep RL is also an incredibly active area of research (one I'm just wading into), but if I could have a really engaging, intuitive, hands-on 5 hour whirlwind tour through different established results, theories, contrasting approaches and so on, then sign me up, that sounds great to me. You'll still need to roll up your sleeves and get into some gnarly concepts and really intense math if you want to actually implement one of the cutting edge approaches, but starting with this kind of high level eli5 overview can be immensely helpful when deciding how to use your precious time. Even in a 100 lifetimes I don't know I could do all the things I want to do, so any time savings are more than welcome.

Granted, this particular course might not function well as a road map, but that would be a specific critique on this course in particular. I call bullshit that a course of this kind is useless in general in an emergent field. Perhaps it is for you, but not everyone learns like you, let others have their road if it suits them. We're all adults here, and I hope we can judge for ourselves where our time is most wisely spent.

Shitty courses slapped together to exploit novices and pop-science hype are a real, related problem, but if that's the chip on your shoulder, I'd counter that a perverse incentive structure producing a lot of worthless courses doesn't mean the 'ideal' intro course couldn't exist and be valuable.

also, for what it's worth... I'm dabbling in this book, and it's doing a great job of laying out a framework. There might be divergent ideas and theories, but they'll all share a unified framework... why not start by exploring there? Even the bleeding edge isn't just disconnected ideas.

u/apocalypsemachine · 5 pointsr/Futurology

Most of my stuff is going to focus around consciousness and AI.

BOOKS

Ray Kurzweil - How to Create a Mind - Ray gives an intro to neuroscience and suggests ways we might build intelligent machines. This is a fun and easy book to read.

Ray Kurzweil - TRANSCEND - Ray and Dr. Terry Grossman tell you how to live long enough to live forever. This is a very inspirational book.

I'd skip Kurzweil's older books. The newer ones largely cover the stuff in the older ones anyhow.

Jeff Hawkins - On Intelligence - Engineer and Neuroscientist, Jeff Hawkins, presents a comprehensive theory of intelligence in the neocortex. He goes on to explain how we can build intelligent machines and how they might change the world. He takes a more grounded, but equally interesting, approach to AI than Kurzweil.

Stanislas Dehaene - Consciousness and the Brain - Someone just recommended this book to me so I have not had a chance to read the whole thing. It explains new methods researchers are using to understand what consciousness is.

ONLINE ARTICLES

George Dvorsky - Animal Uplift - We can do more than improve our own minds and create intelligent machines. We can improve the minds of animals! But should we?

David Shultz - Least Conscious Unit - A short story that explores several philosophical ideas about consciousness. The ending may make you question what is real.

Stanford Encyclopedia of Philosophy - Consciousness - The most well known philosophical ideas about consciousness.

VIDEOS

Socrates - Singularity Weblog - This guy interviews the people who are making the technology of tomorrow, today. He's interviewed the CEO of D-Wave, Ray Kurzweil, Michio Kaku, and tons of less well known but equally interesting people.

David Chalmers - Simulation and the Singularity at The Singularity Summit 2009 - Respected Philosopher, David Chalmers, talks about different approaches to AI and a little about what might be on the other side of the singularity.

Ben Goertzel - Singularity or Bust - Mathematician and computer scientist Ben Goertzel goes to China to create Artificial General Intelligence funded by the Chinese government. Unfortunately, they cut the program.



PROGRAMMING

Daniel Shiffman - The Nature of Code - After reading How to Create a Mind you will probably want to get started with a neural network (or Hidden Markov model) of your own. This is your hello world. If you get past this and the math is too hard, use this

Encog - A neural network API written in your favorite language

OpenCV - Face and object recognition made easy(ish).
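To make that neural-network "hello world" concrete, here is a rough sketch of my own (not from the comment above): a single perceptron learning the OR function in plain Python. The learning rate and epoch count are arbitrary choices for illustration.

```python
# A single perceptron trained on the OR function -- about the smallest
# possible "hello world" for neural networks, in pure Python.

def predict(weights, bias, x):
    """Fire (1) if the weighted sum plus bias crosses the threshold."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

def train_or(epochs=20, lr=0.1):
    """Train on the four OR truth-table rows with the perceptron rule."""
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

weights, bias = train_or()
```

Since OR is linearly separable, the perceptron convergence theorem guarantees this training loop settles on weights that classify all four inputs correctly.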

u/maholeycow · 1 pointr/SoftwareEngineering

Alright man, let's do this. Sorry, had a bit of a distraction last night so didn't get around to this. By the way, if you look hard enough, you can find PDF versions of a lot of these books for free.

Classic computer science principle books that are actually fun and a great read (This is the kind of fundamental teachings you would learn in school, but I think these books teach it better):

  1. https://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2 - this one will teach you at a low level about 1's and 0's and logic and all sorts of good stuff. The interoperation of hardware and software. This was a fun book to read.
  2. https://www.nand2tetris.org/book - This book is a must in my opinion. It touches on so many things such as boolean logic, Machine language, architecture, compiling code, etc. And it is f*cking fun to work through.

    Then, if you want to get into frontend web development for example, I would suggest the following two books for the fundamentals of HTML, CSS, and JavaScript. What I like about these books is they have little challenges in them:

  3. https://www.amazon.com/Murachs-HTML5-CSS3-Boehm-Ruvalcaba/dp/1943872260/ref=sr_1_2_sspa?keywords=murach%27s+html5+and+css3&qid=1557323871&s=books&sr=1-2-spons&psc=1
  4. https://www.amazon.com/Murachs-JavaScript-jQuery-3rd-Ruvalcaba/dp/1943872058/ref=sr_1_1_sspa?keywords=murach%27s+javascript&qid=1557323886&s=books&sr=1-1-spons&psc=1

    Another great book that teaches the fundamentals of coding, and how to think like a programmer, using an extremely flexible programming language, Python, is this one (disclaimer: I haven't read this one, but have read other Head First books, and they rock. My roommate read this one and loved it though):

  5. https://www.amazon.com/Head-First-Learn-Code-Computational/dp/1491958863

    Let me know if you want any other recommendations when it comes to books on certain areas of software development. I do full stack web app development using .NET technology on the backend (C# and T-SQL) and React in the frontend. For my personal blog, I use vanilla HTML, CSS, and Javascript in the frontend and power backend content management with Piranha CMS (.NET Core based). I often times do things like pick up a shorter course or book on mobile development, IoT, etc. (Basically other areas from what I get paid to do at work that interest me).

    If I recommended the very first book to read on this list, it would be the Head First book. Then I would move over to the first book listed in the classic computer science section if you wanted to go towards understanding low-level details; but if that's not the case, move towards implementing something with Python, or taking a Python web dev course on Udemy.

    Other really cool languages IMO: Go, C#, Ruby, Javascript, amongst many more

    P.S. Another book from someone that was in a similar situation to you: https://www.amazon.com/Self-Taught-Programmer-Definitive-Programming-Professionally-ebook/dp/B01M01YDQA/ref=sr_1_2?keywords=self+taught+programmer&qid=1557324500&s=books&sr=1-2
u/aherpiesderpies · 2 pointsr/compsci

I'm more vocational than academic - with the experience you have you can probably jump straight into work if that's what you want to do. I planned to work for a while and then go onto a masters but a few years later it became clear that employers do not look past your vocational experience once you have a couple of years. Part of the reason I wanted to go back and do a masters after working was that I found during my undergrad that we were taught a lot of concepts but that there was nowhere to tie them without the real world experience - this leads me to downplay the value of academic qualifications beyond somebody demonstrating they can get shit done.

That said, you almost certainly can, looks like GU would take you. If you just want to get a better understanding of software development you'd be better joining in some open source projects, if you want to get a better fundamental understanding of computers then get this book. My copy is > 5 years old and computers still work p.much the same way so don't bother splashing out :)

I do apologise for answering a different question from the one you asked but from your question it looks like you are self motivated and do a lot of learning on your own, if this is true it's likely you can achieve more outwith an academic context than in it, and save a pile of cash along the way.

All the best :)

u/xPolydeuces · 2 pointsr/learnpython

My friend who was getting into Python recently asked me about the same thing, I've made some research and this was what I came with. Just for the record, I'm personally a book dude - can't really focus much when learning from Udemy courses and stuff like that, so I will only cover books:


First book:


Python Crash Course by Eric Matthes
Very solid pick for beginners. The first part of the book covers Python's basics - data types, lists, functions, classes, pretty much everything you need to get a good grasp of Python itself. The second part of the book includes three practical projects, a mini-game, data visualization and an introduction to making web apps with Django. From what I saw, it's a pretty unusual approach for beginner-friendly books, since most of them avoid using additional libraries. On the other hand, it's harder to get bored with this book, it really makes you want to learn more and more once you can actually see the effects of all your work.


Automate the Boring Stuff with Python by Al Sweigart
Best alternative if you want to spend 0 bucks or want to dive right into projects. Even though it covers the basics as well, I still recommend reading it, even if you have done Python Crash Course before, if only for the sake of all those projects you can make to practice your Python. He also has a YouTube channel where he has a loooot of Python content and sometimes does cool things like streaming and helping people make their code better. Really cool guy, be sure to check his channel!


Second book:


Writing Idiomatic Python by Jeff Knupp

Very solid book, filled with examples what you, as a Python developer should do, and what you shouldn't (and why not). Sounds like not much, but it is actually a lot of useful knowledge that will make your code shorter, cleaner and better.


Effective Python by Brett Slatkin

A bit easier to understand and easier to approach than the previous book, but still has a load of knowledge to share.


Third book:


Fluent Python by Luciano Ramalho

One of the best Python books overall, covers all of the things that previous books could have missed or didn't have time to introduce. My personal favorite when it comes to books for advanced Python developers.


All of those recommendations are my personal opinion, so if anyone has anything to add, I will gladly listen to any comments!

u/joatmon-snoo · 6 pointsr/explainlikeimfive

Disclaimer: I don't know the EE stuff very well, but I do know enough to explain everything that comes after.

Here are two explanations of how you build logic gates from transistors: a simple one and, courtesy of the EE StackExchange, a more technical one. (The value of an input and output is taken relative to V-/GND.)

Before you can build a CPU with logic gates, there are two concepts you need: (1) Boolean algebra and (2) memory cells.

----

If you look up Boolean algebra, you're going to get a lot of results that only math majors really understand (e.g. the Wikipedia page). To simplify it all, Boolean algebra is essentially the field of study that asks "if I only have two values to work with, TRUE and FALSE, what kind of math can I do?" Notice that TRUE and FALSE map neatly to 1 and 0 (hello, binary math!) as well as HIGH and LOW (V+ and 0V).

This means that you can make all sorts of circuits, like binary adders, multipliers, dividers, and so on. (Subtraction involves some extra logical tricks.)

At this point, what you essentially have is the ability to create any function.
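As an illustration of that point (my own sketch, not part of the original comment), here is binary addition built from nothing but Boolean gate primitives in Python - a ripple-carry adder, the same construction a hardware adder uses:

```python
# A full adder built only from Boolean gate primitives, showing how
# logic gates compose into binary arithmetic.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    s1 = XOR(a, b)
    sum_bit = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return sum_bit, carry_out

def add_4bit(x, y):
    """Ripple-carry addition of two 4-bit numbers as bit lists (LSB first)."""
    carry = 0
    out = []
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry

# 0b0101 (5) + 0b0011 (3) = 0b1000 (8), bits listed least-significant first
bits, carry = add_4bit([1, 0, 1, 0], [1, 1, 0, 0])
```

Chaining more full adders widens the number; the same gate-composition trick scales up to multipliers and the rest.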

----

Now what we need is some way to remember data: that's where memory cells come into play. (This is basically your RAM.)

The primitive form that gets taught in introductory EE courses is the flip-flop circuit: a circuit with two stable states. The stable part here is important: it means that if such a circuit enters this state, it will not leave this state until an input changes. (Similarly, if such a circuit enters an unstable state, generally, it will eventually transition into a stable state.) There are a lot more ways to construct memory cells, of course, but flip-flops are a simple way to see how you can store and manipulate data in a circuit.

With memory cells and Boolean algebra, you can now build state machines. Again, if you google this you're going to end up finding a lot of academic technical nitty-gritty, but at its most basic, a state machine has a finite number of states (A, B, C, ...) and each state corresponds to some function of its inputs.

----

The canonical example is a vending machine (keep in mind, all the electromechanical stuff is abstracted away here - we're only thinking about the control logic).

Let's start with a really simple vending machine. It only accepts $1 coins, it only dispenses one type of soda, and all sodas are $1 each. It's not our job to worry about restocking or counterfeit money or whatnot: our job is just the dispensing logic circuit. We know we're going to have one input and one output: an input for "is there a dollar coin being inserted" and an output for "dispense one can of soda". And if we think about it, the circuit should only have two states: dispensing a soda and not dispensing a soda.

That's pretty simple, then: we use one memory cell, to distinguish between the dispensing and not-dispensing state. The output will always reflect our internal state (i.e. output goes HIGH when dispensing, LOW when not dispensing); and if our input goes HIGH when we're not dispensing, we transition to dispensing, and no matter what our input is when we're dispensing, we transition to not dispensing.

Now we can start adding some complexity to our vending machine: let's accept pennies, nickels, dimes, and quarters too. How about dollar bills? To deal with this, clearly our state machine is going to need some kind of internal counter for how much money has been inserted. We're also going to need logic to compare how much money has been inserted to how much soda costs right now ($1), and also logic to dispense change.

But not everyone's a fan of Generic Soda™ so we're going to need some variety. Now we need a way for people to choose a soda. And since some people are snobs and want pricey stuff - they're willing to pay $2 for their canned beverage of choice (gasp! shock! horror!) - we need to add logic to handle different prices.
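The simple $1 machine from the start of this example can be sketched as a two-state machine in Python (my own hedged illustration; the state names and function are invented for clarity):

```python
# A minimal model of the two-state dispensing logic described above:
# NOT_DISPENSING -> DISPENSING when a coin arrives; DISPENSING always
# returns to NOT_DISPENSING on the next step.

NOT_DISPENSING, DISPENSING = 0, 1

def step(state, coin_inserted):
    """Advance one clock tick; return (next_state, dispense_output)."""
    if state == NOT_DISPENSING:
        next_state = DISPENSING if coin_inserted else NOT_DISPENSING
    else:  # DISPENSING: always transition back, regardless of input
        next_state = NOT_DISPENSING
    output = (state == DISPENSING)  # output reflects the internal state
    return next_state, output

# Feed a coin, then nothing: the machine dispenses exactly one soda.
state = NOT_DISPENSING
state, out1 = step(state, True)   # coin inserted -> transition begins
state, out2 = step(state, False)  # now dispensing -> output HIGH, reset
```

The fancier machines just add more states and inputs (a running coin total, a product selector) to this same transition-function shape.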

----

CPUs are built up in much the same way as the hypothetical vending machine above. A program is supplied as input, in the form of a list of instructions, and the CPU is basically a really big state machine that goes through the program line by line.

Explaining the details of how a basic CPU is designed is a full undergraduate course (Computer Organization/Architecture, usually), and seeing as how I've already outlined its prerequisite (Digital Logic) above, I'm going to stop here. The text I learned from was Patterson and Hennessy's Computer Organization and Design (you can find free PDFs of older versions floating around if you just google it).

----

Aside: if you have Steam and are interested in assembly-level programming, I've heard great things about Shenzhen I/O.

u/Bdee · 1 pointr/Unity3D

I had the same problem. I ride the subway every day and have a ton of time to read, so I've been trying to collect similar resources.

Here are some resources I found really helpful:

  1. Beginners book on Unity - http://www.amazon.com/Development-Essentials-Community-Experience-Distilled/dp/1849691444

    This is a VERY basic (think: learn how to code!) introduction to Unity. I personally found it too elementary, having coded in a few different languages, but it might be a good place to start as it explains basic Unity design concepts like Components, Materials, Colliders, etc.

  2. Your first Unity project (helps to have Unity accessible to follow alone) - Building a 2D RPG in Unity: http://pixelnest.io/tutorials/2d-game-unity/

    This is by far the best 'getting started' tutorial I've found. It walks you through creating a really basic project from scratch using Unity basics and scripts. This is what I based most of my code off of when I first started my project.

  3. REALLY great book on game design/physics and AI - http://www.amazon.com/Programming-Example-Wordware-Developers-Library/dp/1556220782

    This has been the most helpful resource for me. It's not Unity specific but will teach you A TON of great fundamentals for things like how to move a character, common patterns like StateMachines, how to handle AI, etc. All of these concepts will be relevant and many are already in place in Unity so you'll recognize them right away.

    Advanced: Game Programming Patterns - http://gameprogrammingpatterns.com/

    This is a book (online/pdf/epub) that teaches the more advanced patterns you'll be applying in your code. I'd suggest this once you finish with the above resources and have been working through your game for a bit.
u/phao · 8 pointsr/cscareerquestions

The best way I know how is by solving problems yourself and looking at good solutions of others.

You could consider going back to "fundamentals".

Most programming courses, IMO, don't have nearly as many exercises as I think they should. Some books are particularly good on their exercise lists, for example K&R2, SICP, and TC++PL. Deitel's has long exercise lists, but I don't think they're particularly challenging.

There are some algorithms/DS books which focus on the sort of problem solving which is about finding solutions to problems in context (not always a "realistic" one). Like the "Programming Challenges" book. In a book like that, a problem won't be presented in a simple abstract form, like "write an algorithm to sort numbers". It'll be inside some context, like a word problem. And to solve that "word problem", you'll have to find out which traditional CS problems you could solve/combine to get the solution. Sometimes, you'll just have to roll something on your own. Like a new algorithm for the problem at hand. In general, this helps you work out your reduction skills, for one. It also helps you spot applications of those classical CS problems, like graph traversal, finding the shortest path, and so forth.

Most algorithms/DS books though will present problems in a pretty abstract context. Like Cormen's.

I think, however, people don't give enough credit to the potential of doing the exercises on the books I've mentioned in the beginning.

Some books I think are worth reading which also have good exercises:

u/DucBlangis · 20 pointsr/netsecstudents

Here is a "curriculum" of sorts I would suggest, as it's fairly close to how I learned:

  1. Programming. Definitely learn "C" first as all of the Exploitation and Assembly courses below assume you know C: The bible is pretty much Dennis Ritchie and Kernighan's "The C Programming Language", and here is the .pdf (this book is from 1988, I don't think anyone would mind). I actually prefer Kochan's book "Programming in C" which is very beginner friendly and was written in 2004 rather than 1988, making the language a little more "up to date" and accessible. There are plenty of "C Programming" tutorials on YouTube that you can use in conjunction with either of the aforementioned books as well. After learning C, you can try out some other languages. I personally suggest Python as it is very beginner friendly and well documented. Ruby isn't a bad choice either.

  2. Architecture and Computer basics:
    Generally you'll probably want to look into IA-32 and the best starting point is the Intel Architecture manual itself, the .pdf can be found here (pdf link).
    Because of the depth of that .pdf I would suggest using it mainly as a reference guide while studying "Computer Systems: A Programmer's Perspective" and "Secrets of Reverse Engineering".

  3. Operating Systems: Choose which you want to dig into: Linux or Windows, and put the effort into one of them; you can come back to the other later. I would probably suggest Linux unless you are planning on specializing in Malware Analysis, in which case I would suggest Windows. Linux: No Starch's "How Linux Works" is a great beginner resource, as is their "Linux Command Line" book. I would also check out "Understanding the Linux Kernel" (that's a .pdf link). For Windows you can follow the Windows Programming wiki here or you can buy the book "Windows System Programming". The Windows Internals books are generally highly regarded; I didn't learn from them, I use them more as a reference, so I can't really speak to how well they would teach a beginner.

  4. Assembly: You can't do much better than OpenSecurityTraining's "Introductory Intel x86: Architecture, Assembly, Applications, & Alliteration" class lectures from Xeno Kovah, found here. The book "Secrets of Reverse Engineering" has a very beginner friendly introduction to Assembly as does "Hacking: The Art of Exploitation".

  5. Exploitation: OpenSecurityTraining also has a great video series for Introduction to Exploits. "Hacking: The Art of Exploitation" is a really, really good book that is completely self-contained and will walk you through the basics of assembly. The author does introduce you to C and some basic principles of Linux but I would definitely suggest learning the basics of C and Linux command line first as his teaching style is pretty "hard and fast".

  6. Specialized fields such as Cryptology and Malware Analysis.


    Of course if you just want to do "pentesting/vuln assessment", in which you rely more on toolsets (for example, Nmap>Nessus>Metasploit) structured around a methodology/framework, then you may want to look into one of the PACKT books on Kali or Backtrack, get familiar with the tools you will use such as Nmap and Wireshark, and learn basic Networking (a simple CompTIA Network+ book will be a good enough start). I personally did not go this route nor would I recommend it as it generally shies away from the foundations and seems to me to be settling for becoming comfortable with tools that abstract you from the real "meat" of exploitation and all the things that make NetSec great, fun and challenging in the first place. But everyone is different and it's really more of a personal choice. (By the way, I'm not suggesting this is "lame" or anything, it was just not for me.)

    *edited a name out





u/ItsAConspiracy · 2 pointsr/Futurology

My suggestion is to opensource it under the GPL. That would mean people can use your GPL code in commercial enterprises, but they can't resell it as commercial software without paying for a license.

By opensourcing it, people can verify your claims and help you improve the software. You don't have to worry about languishing as an unknown, or taking venture capital and perhaps ultimately losing control of your invention in a sale or IPO. Scientists can use it to help advance knowledge, without paying the large license fees that a commercial owner might charge. People will find all sorts of uses for it that you never imagined. Some of them will pay you substantial money to let them turn it into specialized commercial products, others will pay you large consulting fees to help them apply the GPL version to their own problems.

You could also write a book on how it all works, how you figured it out, the history of your company, etc. If you're not a writer you could team up with one. Kurzweil and Jeff Hawkins have both published some pretty popular books like this, and there are others about non-AGI software projects (eg. Linux, Doom). If the system is successful enough to really make an impact, I bet you could get a bestseller.

Regarding friendliness, it's a hard problem that you're probably not going to solve on your own. Nor is any large commercial firm likely to solve it on their own; in fact they'll probably ignore the whole problem and just pursue quarterly profits. So it's best to get it out in the open, so people can work on making it friendly while the hardware is still weak enough to limit the AGI's capabilities.

This would probably be the ideal situation from a human survival point of view. If someone were to figure out AGI after the hardware is more powerful than the human brain, we'd face a hard takeoff scenario with one unstoppable AGI that's not necessarily friendly. With the software in a lot of hands while we're still waiting for Moore's Law to catch up to the brain, we get a much more gradual approach: we can work together on getting there safely, and when AGI does get smarter than us there will be lots of them with lots of different motivations. None of them will be able to turn us all into paperclips, because doing that would interfere with the others and they won't allow it.

u/infinitelyExplosive · 1 pointr/pcmasterrace

Here are some different sources for different aspects of computers.

The book Code: The Hidden Language of Computer Hardware and Software is an excellent introduction into the low-level concepts which modern CPUs are built on.

Link hopping on Wikipedia is a totally viable method to learn many aspects of computers. Start at some page you know about, like Graphics Cards or Internet, and just keep reading and clicking links.

Hacking challenges are a great way to learn about how computers work since they require you to have enough knowledge to be able to deliberately break programs. https://picoctf.com/ is an excellent choice for beginner- to intermediate-level challenges. https://overthewire.org/wargames/ also has some good challenges, but they start off harder and progress quickly. Note that these challenges will often require some programming, so learning a powerful language like Python will be very helpful.

This site is not very active anymore, but the old posts are excellent. It's very complex and advanced though, so it's not a good place to start. https://www.realworldtech.com/

In general, google will be your best friend. If you run into a word or program or concept you don't know, google it. If the explanations have more words you don't know, google them. It takes time, but it's the best way to learn on your own.

u/[deleted] · 1 pointr/statistics
  1. Data Analyst may or may not use sophisticated statistical methods. If they do a lot they are probably underemployed. Most Data Analysts use SQL, Python or R, and various business intelligence software (Tableau) to do data manipulation and summarization. Statisticians are typically designing studies, analyzing the data, summarizing and presenting results. They will have a much deeper understanding of theory.

  2. SQL is important, but R or Python is much more so. Your data may be in files, relational databases, or even NoSQL databases, so being able to pick those things up as needed is more important than one specific language.

  3. Starting out in a business intelligence position or a data analyst position will be easiest and would give you good experience but could get boring quickly. With a masters you would technically qualify for a statistician or even data scientist position which is what you’re going to eventually want. As long as you feel confident about the job qualifications go for those instead.

    4 and 5. This really depends on the industry or domain, but good free books are: ISLR and Forecasting: Principles and Practice. The first is more general; I would focus there first. An inexpensive, excellent Python book: Hands-On Machine Learning with Scikit-Learn and TensorFlow

  4. Depends on job, but I would say stats is more important as long as you can pick up programming as needed. I was given some very good advice that knowing how to analyze the data is more important than what technology you know.

  5. Classification or Regression. Kaggle has lots of competitions and data.

  6. There are jobs but lots of unqualified applicants due to hype. I would say there is lots of opportunities for qualified individuals.
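As a taste of the regression practice suggested above, here is a from-scratch ordinary least squares fit in pure Python (my own sketch, not from the comment; the data is made up so the fit is exact):

```python
# Simple linear regression (ordinary least squares) in pure Python --
# the kind of from-scratch exercise that builds the intuition the
# books above assume before you reach for scikit-learn.

def ols_fit(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Data generated from y = 2x + 1, so the fit should recover those values.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
slope, intercept = ols_fit(xs, ys)
```

Redoing the same fit on a noisy Kaggle dataset, then comparing against a library implementation, is a good first project in this direction.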
u/zakraye · 1 pointr/buildapc
  • What exact processor do you have?

  • Have you enabled virtualization functions in your UEFI/BIOS?

    I'm not too great at this stuff but I have dabbled in programming/sysadmin.

    It may be best to hackintosh before you do a VM. I could get a hackintosh up and running relatively easily, while I never successfully got a VM to function seamlessly without a noticeable performance hit.

    >Hackintosh is essentially a PC made into a mac, right?

    Well actually the hardware is (for the most part) identical. Some of the higher end Mac Pros use server parts and ECC RAM, but AFAIK MacBook Pros and iMacs use regular parts without ECC.

    That response might be overly complex. Short answer: for the sake of a general overall understanding, the physical components that make up Macs and PCs are functionally identical. The software that runs on them is the only difference. So really, a hackintosh is an "unauthorized" Mac! If you buy the correct hardware it's (AFAIK) functionally identical to a Macintosh. Apple writes drivers for specific hardware and if you get compatible stuff it just works. With a bit of tweaking. OS X is operating system (software) that runs on the hardware. Hypothetically you could run Windows XP on an iPhone if you wrote the code for it. Who knows? Maybe someone already has! I'm fairly certain someone's run Linux on an iOS device.

    There's some kind of DRM scheme (key-and-lock software or something like it) that prevents you from doing this with a vanilla installation procedure on non-Apple mobos. From what I understand, if Apple didn't enforce/create this barrier you wouldn't even need to use UniBeast/MultiBeast.

    May I ask why you're thinking of using Xcode?

    Either way if you're curious about hackintoshing I would check out www.tonymacx86.com. They know a hell of a lot more about customac (hackintosh) than I do. They prefer to call it customac probably for legal reasons. And honestly it's not too bad of an idea.

    Alternative idea: if you're interested in programming check out Python Programming: An Introduction to Computer Science 2nd Edition. I thought it was a great book as a beginner myself, and you can use any platform that supports python, which is pretty much most modern operating systems.
u/rhiever · 1 pointr/artificial

Programming Game AI by Example has a great, easy-to-understand explanation and walkthrough for learning ANNs: http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782

Once you've learned at least ANNs, you can delve into the popular approaches to GAI:

u/di0spyr0s · 1 pointr/resumes

Thanks so much!

Where do hobbies and interests go? Below Education somewhere? Sample stuff I could add:

  • I started sewing this year and have achieved my goal to knit and sew all my own clothes for 2015.
  • I play guitar, drums, and piano, and I'm learning to play bass. A friend and I started a band called OCDC, because we're n00bs and play the same thing over and over a lot.
  • I read insatiably. Most recently Code: The Hidden Language of Computer Hardware And Software and A Guide to the Good Life: The Ancient Art of Stoic Joy, but also the backs of cereal packages and the "In case of fire" escape instructions on doors if there's nothing else.
  • I'm from New Zealand and can, if necessary, butcher a sheep/pig/deer/rabbit, build a fence, milk a cow by hand (or milk several hundred, given a decent sized milking shed), TB test deer, fell trees, and use the word "munted" in a sentence.
  • I've ridden horses all my life and still volunteer occasionally as an equine masseuse for some of the carriage horses in Central Park.
  • I love automating stuff and am working on fully automating my home aquaponics set up: a combination of an aquarium and a grow bed which currently produces great quantities of grass for my cats to puke up.

    I had sort of planned to put all this stuff in my personal website - write ups of personal projects, a good reads feed, an "About me" section, and maybe a page of my sewing/knitting creations.

    I'll certainly look into adding some more personality into the resume design, it is currently the result of a google template, which is pretty blah.

    Again, Thanks so much for your feedback! It's been really helpful!
u/linuxlass · 2 pointsr/learnprogramming

I started by showing my son Scratch when he was 9.5yo and helping him make a couple of arcade games with it. He was never all that interested in Logo, but got really turned on by Scratch. After a couple of months he was frustrated by Scratch's limitations, and so I installed Ubuntu on an old computer, showed him pygame/python and worked through a couple of online tutorials with him, and let him loose.

He learned to use Audacity to edit files from Newgrounds, and Gimp to edit downloaded graphics as well as create his own. He made a walk around, rpg-like adventure game, a 2D platformer, and then decided he wanted to learn pyggel and has been working on a 3D fps since last summer.

Soon, I'm going to get him started on C++ so we can work through a book on game AI (which uses C++ for all its examples). He's 13.5 now, and thinks programming is great and wants to grow up to be a programmer like his mom :)

I highly recommend a simple language like python for a beginner, but Scratch is wonderful for learning all the basic concepts like flow control, variables, objects, events, etc, in a very fun and easy way. The Scratch web site also makes it simple to share or show off because when you upload your program it gets turned into a Java applet, so anyone with a browser can see what you've done.

u/pjsdev · 1 pointr/gamedesign

Okay, here are 4 suggestions about theory. There are plenty more, but these are a few of my favourites.

Rules of Play: Game Design Fundamentals

  • Chunky theory book and one of my favourites. Also has a companion book of essays

    Characteristics of Games

  • Really nice combination of chapters from various designers (including Richard Garfield of MtG) looking into different aspects of design.

    Game Mechanics: Advanced Game Design

  • All about systems and how resources move through them in games and the affect that has.

    Theory of Fun for Game Design

  • Easy to read, nicely illustrated and conveys a powerful fundamental idea for game design.

    Good luck and happy reading.
u/enteleform · 1 pointr/Python

Part of thinking logically is knowing what tools are available to you, so I'd recommend reading some books to gain an overhead view (data structures, standard libraries, design patterns, etc) of Python.  I put together a list that I'm working through @ this Gist.  Based on your question, I'd recommend reading Fluent Python to start, and then check out Problem Solving with Algorithms and Data Structures (along with any others that interest you).
 
In addition to 3burk's math-based suggestions, Project Euler is another good option.
 
Some interactive coding-challenge options:

  • CoderByte
    This is the more beginner-friendly of the two. The questions are well-balanced for all skill levels, and they have well-explained official solutions.
  • LeetCode
    This one is a bit more advanced, but definitely worth checking out once you're ready since the solutions are ranked by efficiency (in milliseconds & Big-O complexity) instead of arbitrary points.  You can learn a lot by figuring out your own solution, and then checking out more efficient solutions to learn what you could have done better. There's also a forum where you can discuss how/why certain methods are more efficient.

     
    Also, check out this post I wrote in response to a similar question for a bunch more coding resources (challenges, books, projects, etc).
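To make the Big-O point above concrete, here's a quick sketch of the classic two-sum problem (a common LeetCode-style exercise, not any site's official solution), showing how a hash map turns an O(n²) pairwise search into a single O(n) pass:

```python
def two_sum_naive(nums, target):
    # O(n^2): check every pair of indices
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return [i, j]

def two_sum_fast(nums, target):
    # O(n): one pass, remembering each value's index in a hash map
    seen = {}
    for i, x in enumerate(nums):
        if target - x in seen:
            return [seen[target - x], i]
        seen[x] = i

print(two_sum_naive([2, 7, 11, 15], 9))  # [0, 1]
print(two_sum_fast([2, 7, 11, 15], 9))   # [0, 1]
```

Both return the same answer, but on large inputs the single-pass version is dramatically faster, which is exactly the kind of gap those ranked solutions make visible.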
u/csp256 · 1 pointr/cscareerquestions

my bachelors is in physics from a no name university in my home state of alabama. im mostly self taught when it comes to cs but i made a serious effort to learn the formalism school covers too. i was a nontraditional student (graduated age 29) and serial drop out. had kind of a hard start.

i did a bit of graduate study before i dropped out again for my current job. i did this mostly abroad at the university of oslo in the computational physics program there. they're good people, but it was just an exchange. my physics and cs interests didnt overlap with theirs sadly so the program wasnt a good fit but i felt compelled to take it because money. always been poor until recently.

i learned a lot by keeping my ear to the ground. i read a lot. things like slashdot back in the day, or hackernews, and when i found something interesting i would pull on that string. i spend several hours a day reading about peoples reactions to things i find interesting. i get bored easily and throwing a wide net is the best way to fix that. i tend to change disciplines every 6 months. recently ive just been bouncing around inside of computer vision, sometimes straying into computer graphics but not really any farther.

the thing that academia (for graduate studies) gives you is freedom to do (kinda) whatever the fuck you want with your research: for better or worse. in industry most every other aspect is better, except you probably do what your boss wants... there are some places where this is less true, but getting in to these places is not easy. (im still working on it!)

some fun links i thought of

https://gist.github.com/jboner/2841832

https://www.akkadia.org/drepper/cpumemory.pdf

there was a coursera graduate class on computer architecture from princeton. something like that. this is one of two books my university uses for undergrads:

https://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X

of course programming is not computer architecture. but it is still good to know.

i mean, do you think Euler or the other greats would have never used computers if they had access?

if you want to get a grad degree do it. i recommend computational physics, computer vision, or computer graphics. im biased, but they are all math heavy, cs heavy, and frequently deal with tricky things, like shape and probability and approximately tackling else-wise impossible tasks. these are pretty core skills. just be aware that academics tend to be pretty bad programmers and are out of touch with industry.

you dont need it but a graduate degree is probably the easiest path to the types of things i was talking about.

u/YuvalRishu · 4 pointsr/QuantumComputing

Hi, I work on programming quantum computers. I studied in Canada (PhD from the Institute for Quantum Computing at the University of Waterloo) and I now live and work in Sydney, Australia. Your TL;DR is actually a bit different from the rest of your post, so I'll answer the questions in the TL;DR first.

I started getting interested in quantum computing when I was an undergraduate in Physics. I began with an interest in quantum entanglement and did a couple of summer research projects in the subject. I did my Master's degree with my supervisor for the last of those projects, and even wrote my first paper based on that work.

Quantum entanglement is of course very important in quantum computing but the study of the subject is more under the heading of quantum information theory. I switched over to quantum computing when I was deciding where to go for my PhD, and decided that I wanted to do the PhD to answer one simple question to myself: how far away are we, really, from a quantum computer? While I was finishing my PhD, the opportunity in Sydney came up and I decided that I liked the work happening here. I was (and am) interested in simulating quantum fields on a quantum computer, and have gotten interested in simulating physics in general (doesn't have to be quantum) as well as solving problems on a quantum computer in general (doesn't have to be physics).

We're talking about close to half my life at this point, so it's hard to summarise that story in any reasonable way. But if I had to try, I'd say that I followed my nose. I was interested in stuff, so I found ways to learn as much as I could about that stuff from the best people I could find who would give me the time of day or, better yet, a pay check. One of the nice things about doing science as a student is that there are plenty of people willing to pay you to study science if you know how to ask nicely.

Training a scientist is a long and arduous process, from the perspective of the student, the teacher, and the society as a whole. Take your time to learn properly. Don't let the bumps in the road stop you!

With the motivational stuff out of the way, my best advice is to learn everything. I mean everything. Physics, maths, computer science, engineering, chemistry, philosophy, sociology, history, everything. I know you can't possibly become an expert in all of that, but get at least a passing knowledge in whatever strikes your interest. When you hit on the thing that you simply can't stop thinking about, the thing that you literally lose sleep over, then you've found the topic for your PhD thesis. Find a supervisor and work on that as hard as you can for as long as you can until they tell you to get out and get a real job.

If that's not the advice you're looking for, then I'll try another piece. Go study functional analysis. You can't possibly understand quantum physics without knowing some functional analysis. If you're serious about quantum physics, this is now your bible. And when you give up on that book (and you will give up on that book), read this. When you're done, read this.

u/YuleTideCamel · 3 pointsr/learnprogramming

Sure I really enjoy these podcasts.

u/joeswindell · 5 pointsr/gamedev

I'll start off with some titles that might not be so apparent:

Unexpected Fundamentals

These 2 books provide much needed information about making reusable patterns and objects. These are life saving things! They are not language dependent. You need to know how to do these patterns, and it shouldn't be too hard to figure out how to implement them in your chosen language.

u/doddyk96 · 1 pointr/datascience

Thank you so much for your reply. I actually do plan on taking Andrew Ng's course just cause the book I am talking about is very limited to Python but I've heard great things about it. However, the Stanford course I was referring to was the Statistical Learning course based on the ISL book.

Yes I plan on doing some kaggle challenges once I feel comfortable with my skills to build up my portfolio or see if I can find some other novel projects to work on.


Ideally I'd like to be in a data science consultancy type role where I get to work on different kinds of projects and don't necessarily need very specialized domain knowledge. But at this point I think more direction as to what kinds of roles exist would also be helpful. I just don't know what the field is actually like, and I've never really met anyone doing data science for a living.

Thank you again for your reply. It was very helpful.

u/naranjas · 2 pointsr/funny

> Can you give me any more info on what types of things you simulate

There are so many different things. One example that involves physical simulation is rendering. Rendering, turning a 3D description of a scene into a 2D image, is all about simulating the physics of light transport. Given a set of lights and surfaces, you simulate how light bounces around and what a virtual observer placed somewhere in the scene would see. Another example is explosions. Cool/realistic-looking explosions for movies involve simulating burning materials, fluid/gas movement, sound propagation, fracture, plastic/non-plastic deformation, the list goes on and on.
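As a taste of what "simulating light transport" means at the very bottom, here's a minimal ray-sphere intersection test, the first building block of any ray tracer. This is illustrative only (real renderers do this in optimized C++ over millions of rays), sketched in Python for readability:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for t; a hit is a real, positive root.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest intersection distance
    return t if t > 0 else None

# A ray fired down the z-axis hits a unit sphere centred 5 units away at t = 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A renderer repeats this test (plus shading and recursive bounces) for every pixel, which is where the real math in those books comes in.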

Here are some books that might get you started in the right direction

  • Fundamentals of Computer Graphics: This is an entry level book that surveys a number of different areas of computer graphics. It covers a lot of different topics but it doesn't really treat anything in depth. It's good to look through to get a hold of the basics.

  • Mathematics for 3D Game Programming and Computer Graphics: Pretty decent book that surveys a lot of the different math topics you'll need.

  • Fluid Simulation for Computer Graphics: Really, really awesome book on fluid simulation.

  • Do a google/youtube search for Siggraph. You'll find a lot of really awesome demonstration videos, technical papers, and introductory courses.

    As for programming languages, you're definitely going to need to learn C/C++. Graphics applications are very resource intensive, so it's important to use a fast language. You'll probably also want to learn a couple of scripting languages like Python or Perl. You'll also need to learn some graphics APIs like OpenGL, or DirectX if you're on Windows.

    I hope this helped!
u/latetodata · 15 pointsr/learnmachinelearning

I personally really benefitted from Jose Portilla's udemy class on python for Data Science: https://www.udemy.com/python-for-data-science-and-machine-learning-bootcamp. It deals with the machine learning algorithms at a pretty basic level but he does a good job overviewing things and this course personally gave me more confidence. He also wrote a helpful overview for how to become a data scientist: https://medium.com/@josemarcialportilla/how-to-become-a-data-scientist-2d829fa33aba

Additionally, I found this podcast episode from Chris Albon helpful: http://partiallyderivative.com/podcast/2017/03/28/learning-machine-learning

Finally, I have just started going through Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems and I love it. It's very easy to read and applicable: https://www.amazon.com/dp/1491962291/_encoding=UTF8?coliid=I1VIM81L3W5JUY&colid=2MMQRCAEOFBAX

Hope this helps.

u/fishoutofshui · 4 pointsr/QuantumComputing

I feel like I gained traction coming from statistics by ping-ponging between these three books. Nielsen and Chuang is a great place to start, especially the first two chapters. There’s a lot that will go over your head but you will pick up enough. Then Aaronson like you have been doing for a different perspective. Then McMahon holds your hand a bit on the computations, which will help if you aren’t familiar with quantum mechanics, as I was not. When you get stuck, switch books. I feel like once I bought all three books and started going back and forth and reading previous chapters again that is when things started to click and I gained some maturity. I have a long way to go but this has been the greatest self-learning journey I’ve been on in the past year. I hope you get as much as I have. Good luck.

https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176/ref=nodl_

https://www.amazon.com/Quantum-Computing-Explained-David-Mcmahon/dp/8126564377/ref=mp_s_a_1_fkmrnull_1?crid=382OF32JOGTRH&keywords=quantum+computing+explained+mcmahon&qid=1551223235&s=gateway&sprefix=quantum+computing+explained&sr=8-1-fkmrnull

u/Earhacker · 4 pointsr/learnprogramming

Websites are built with HTML, CSS, JavaScript, and that's it. You can have a back end (the bit that handles and serves up the data) written in any language, but every website you see is just HTML, CSS and JavaScript.

If you're underwhelmed by Python, I don't think you'll like JavaScript. It's like Python, only weird. But if you really want to get into web development, I'd recommend Eloquent JavaScript, which is free online (the paper book is a slightly older edition until the new one lands later this year, hopefully). It's an excellent book on JavaScript, but a little too terse for total newbies; it's great for someone learning JavaScript as a second language. If it's too much for you, the best choice for newer readers is The Modern JavaScript Tutorial.

But I suspect that you haven't taken Python as far as you think you have. It's a great language for beginners because it offers quick wins, and you can build cool little apps very quickly. But that belies its depth. You can build website back ends with the Flask framework for Python. Miguel Grinberg has written the gold standard of Flask learning in the Mega Tutorial, and has expanded it into a paper book in Flask Web Development on O'Reilly. Or, if you want to explore the nerdy depths of Python and know it inside out, get Fluent Python.

If you really weren't impressed with GCSE Python, the next level up is probably learning Java or C#, maybe even Go or Rust if you fancy something a little more cutting edge. I'm not an expert in these languages and I haven't read much on them, so I'll defer you to other answers.

u/gavinb · 1 pointr/opengl

Well if you want to be the next Carmack, get cracking! :) You have a lot of ground to cover, such as: mathematics (matrices, linear algebra, etc), physics, artificial intelligence, real-time processing, multithreading, architecture, networking and protocols, rendering, sound, and much more!
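As a tiny illustration of the matrix/linear-algebra side of that list, here's a sketch of rotating a point about the z-axis, the kind of transform an engine applies constantly (Python for readability; a real engine would do this with 4x4 matrices in SIMD-friendly C++):

```python
import math

def rotate_z(point, angle):
    # Apply the standard z-axis rotation matrix to a 3D point:
    # [cos -sin 0; sin cos 0; 0 0 1]
    c, s = math.cos(angle), math.sin(angle)
    x, y, z = point
    return (c * x - s * y, s * x + c * y, z)

# Rotating (1, 0, 0) by 90 degrees about z lands on (0, 1, 0).
x, y, z = rotate_z((1.0, 0.0, 0.0), math.pi / 2)
print(round(x, 6), round(y, 6), round(z, 6))  # 0.0 1.0 0.0
```

Chain a few of these (plus translations and projections) and you have the core of a transform pipeline.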

It is certainly possible with enough time and dedication to develop your own engine. It's just that there are so many excellent engines already out there, that you would be competing with projects that have already invested many thousands of hours and have loads of titles already developed for them. Why not get involved with an existing project to start?

BTW I really like your idea of creating a FPS with one room and focusing on making that environment the richest possible, exploiting a wide variety of techniques. Do it!!

Is your ultimate goal to create an engine? Or to create a game? Remember, the engine is in many ways a means to an end - it's not much use without a game that uses it!

Either way, I think you would be well advised to get involved with one of the open source game engine projects, and start contributing. Once you've learned how they work, you will be in a much better position to design your own. And realistically, you can't really just design an engine without a game - you need to know how games work in the first place, and what features and architectural decisions and designs make for a good engine.

Consider joining:

u/TotalPerspective · 5 pointsr/bioinformatics

Here are some books that I feel have made me better professionally. They tend toward the comp sci side, some are more useful than others.

  • Bioinformatics: An Active Learning Approach: Excellent exercises and references. I think most chapters evolved out of blog posts if you don't want to buy the book.
  • Higher Order Perl: I like perl to start with, so your mileage may vary. But learning how to implement an iterator in a language that doesn't have that concept was enlightening. There is a similar book for Python but I don't remember what it's called. Also, you are likely to run into some Perl at some point.
  • SICP: Power through it, it's worth it. I did not do all the exercises, but do at least some of the first ones to get the ideas behind Scheme. Free PDFs exist, also free youtube vids.
  • The C Programming Language: Everyone should know at least a little C. Plus so much has evolved from it that it helps to understand your foundations. Free PDFs exist
  • The Rust Programming Language: Read this after the C book and after SICP. It explains a lot of complex topics very well, even if you don't use Rust. And by the end, you will want to use Rust! :) It's free!

    Lastly, find some open source projects and read their papers, then read their code (and then the paper again, then the code...etc)! Then find their blogs and read those too. Then find them on Twitter and follow them. As others have said, the field is evolving very quickly, so half the battle is information sourcing.
u/zrbecker · 5 pointsr/learnprogramming

Depends on what you are interested in.

If you are interested in games, pick a game and do it. Most board games are not that hard to do as a command-line version. A game with graphics, input, and sound isn't too bad either if you use something like Allegro or SDL. Also XNA if you are on Windows. A lot of neat tutorials have been posted about that recently.

If you are more interested in little utilities that do things, you'll want to look at a GUI library like wxWidgets, Qt, and the sort. Both Windows and Mac have their own GUI libraries; I'm not sure what Windows' is called, but I think you have to write it with C++/CLI or C#, and Mac's is Cocoa, which uses Objective-C. So if you want to stick to basic C++ you'll want to stick to the first two.

Sometimes I just pick up a book and start reading to get ideas.

This is a really simple Game AI book that is pretty geared towards beginners. http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782/

I enjoyed this book on AI, but it is much more advanced and might be kind of hard for a beginner. Although, when I was first starting, I liked getting in over my head once in a while. http://www.amazon.com/Artificial-Intelligence-Modern-Approach-2nd/dp/0137903952/
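For a flavor of what Buckland's book covers, here's a toy finite state machine, one of its core patterns, sketched in Python rather than the book's C++. The miner agent and its states are a made-up example for illustration, loosely in the spirit of the book:

```python
class Miner:
    """A hypothetical agent that digs until tired, then rests until recovered."""

    def __init__(self):
        self.state = "dig"   # current state name
        self.fatigue = 0

    def update(self):
        # Each tick: act for the current state, then check transition rules.
        if self.state == "dig":
            self.fatigue += 1
            if self.fatigue >= 3:
                self.state = "rest"
        elif self.state == "rest":
            self.fatigue -= 1
            if self.fatigue <= 0:
                self.state = "dig"

m = Miner()
states = []
for _ in range(5):
    m.update()
    states.append(m.state)
print(states)  # ['dig', 'dig', 'rest', 'rest', 'rest']
```

Real game agents just have more states and richer transition conditions; the update loop structure stays the same.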

Interesting topics to look up.

Data Structures

Algorithms

Artificial Intelligence

Computer Vision

Computer Graphics

If you look at even simple books in these subjects, you will usually find tons of small manageable programs that are fun to write.

EDIT: Almost forgot, I think a lot of these are Java based, but you can usually find a way to do it in C++. http://nifty.stanford.edu/ I think I write Breakout whenever I am playing with a new language. heh

u/EricTboneJackson · 1 pointr/oculus

> Why?

First, let's define what we're talking about. I haven't seen this Anime, but it shows someone jacking into VR via a plug in the back of their neck. So for the purposes of discussion, let's assume this is Matrix-level VR. That means a virtual reality that is literally indistinguishable from actual reality, via a plug into the back of the head.

In this exact form, this is impossible (see below). More extreme mechanisms (brain in a jar) might be possible, but that's currently a total unknown. And we won't have the tech for hundreds of years.

Why?

For starters, our command of biology is currently profoundly limited and progress is slow. We're justifiably proud of how far we've come, but we're still crude butchers. In the last few hundred years we discovered anesthesia and antibiotics, but we still fix people by cutting them with knives and literally sewing them back together. We don't understand how the brain works, much less have the ability to send it accurate information or read information back out.

So what do we need for Matrix-level VR? We need to:

Intercept and replace all information going into and out of the brain non-destructively.


This in itself is probably impossible. It's likely that Matrix-level VR will require removing the brain from a body, or severing the spinal column and doing all manner of damage to the face. Note that in the Matrix (and the OP's video), we bypass the senses via a plug in the back of the neck, with the idea being that you intercept all data going to and from the body via the spinal cord. But most of the data going into and out of your brain doesn't travel via the spinal cord. The eyes, ears, and nose have direct links to your brain. For instance, a jack on the back of your neck can't intercept and replace signals coming from your optic nerve. So that's just a bit of science-fiction fantasy that makes for convenient storytelling, like faster-than-light travel.

Perfectly replicate the entire nervous system and musculature of the human body in a computer and flawlessly simulate bidirectional nerve impulses to the brain.


In the Matrix, you can feel every muscle in your body. You can feel that hot sauce you just ate, or the need to take a shit or a piss. Your entire body is simulated and the nerve impulses going to the brain are indistinguishable from those of a real body. Moreover, the entire network of nerve firings required to, say, walk is flawlessly interpreted by the virtual body -- contracting all the correct muscle fibers and resulting in you having the grace to dance, or do Kung Fu.

We're probably 50 years from even having the compute power to model that, much less the technology to perfectly interface it with a human brain, assuming we had the brain in a vat and didn't have to figure out how to intercept and replace those nerve signals without harming the person.

Perfectly replicate the entire world in a machine.


We've been working on computer graphics for over 50 years, and we still haven't achieved real-time photorealism, especially in stereo, at retinal resolution, at human max FOV. Let's say we get there in the next 20 years or so (highly optimistic); even then, we only have the surface of things. We can render an apple that looks 100% convincing. The next 100 years or so will be spent doing everything else.

What's inside the apple? In order to fully simulate what can happen to an apple -- how it responds to a knife, or a tooth, the skin of it, the juice inside, the way it will bruise or rot, what a slice of it looks like under a microscope, so on and so forth -- you have to simulate it from first principles. Now imagine that you also have to perfectly replicate the way it feels in a virtual mouth, and the way it tastes, the way it smells. Again, you have to simulate it from first principles. You basically have to build a model of the entire chemistry of an apple (not to mention perfectly simulate bacteria) to cover all possible cases.

And that's just an apple. What about everything else? We basically need to be able to simulate a universe from first principles. We don't even know if that's possible. Clock speeds for our current technology stagnated a decade ago. We're about to run into a quantum mechanical limitation for transistor size. We assume we'll find a way around it, but that's currently unknown.

We know that computing power has been rising exponentially, and we expect it to continue to do so for a while, but there's no guarantee that it will do so forever. Bacteria in a petri dish multiply exponentially, too. If some early generation noticed this trend, they might be tempted to imagine that the bacteria will eventually take over the entire universe. But their exponential growth hits a hard limit (when they run out of space/food). It could be there's a similar limit on computing power. We don't know. In any case, the kind of power we need for the Matrix is at best centuries away, if it will ever exist at all. That's not even counting the biological engineering involved.

There's only one way I could see it coming any sooner (again, assuming it's even possible): we develop a superhuman AI which can do our research for us at vastly accelerated subjective timeframes. But then we have much bigger problems.

u/miriberkeley · 1 pointr/writing

The Machine Intelligence Research Institute is putting out a call for intelligent stories illustrating concepts related to (artificial or natural) intelligence. Guidelines are quite specific; read below.


  • -Pay Rate: 8c/word, up to 5000 words.

  • -Multiple Submissions ok

  • -Simultaneous Submissions ok

  • -Submissions window: Until July 15

     

    This call is intended to reward people who write thoughtful and compelling stories about artificial general intelligence, intelligence amplification, or the AI alignment problem. We're looking to appreciate and publicize authors who help readers understand intelligence in the sense of general problem-solving ability, as opposed to thinking of intelligence as a parlor trick for memorizing digits of pi, and who help readers intuit that non-human minds can have all sorts of different non-human preferences while still possessing instrumental intelligence.

    The winning stories are intended to show (rather than tell) these ideas to an intellectually curious audience. Conscious attempts to signal that the ideas are weird, wonky, exotic, or of merely academic interest are minuses. We're looking for stories that just take these ideas as reality in the setting of the story and run with them. In all cases, the most important evaluation criterion will just be submissions’ quality as works of fiction; accurately conveying important ideas is no excuse for bad art!

    -

    To get a good sense of what we're looking for—and how not to waste your time!—we strongly recommend you read some or all of the following.


  • Superintelligence

  • Smarter Than Us

  • Waitbutwhy post 1, Waitbutwhy post 2 (with caveats)

     

    Withdrawal policy:

    After you submit a story, we prefer you don't withdraw it. If you withdraw a story, we won't consider any version of that story in the future. However, if you do need to withdraw a story (because, for example, you have sold exclusive rights elsewhere), please send an e-mail telling us that you need to withdraw ASAP.

     

    Important Notes:

    MIRI is neither a publishing house nor a science fiction magazine and cannot directly publish you. However, MIRI will help link a large number of readers to your story.

    We frankly do not know whether being selected by MIRI will qualify as a Professional Sale for purposes of membership in the SFWA. We suspect, through readership numbers and payscale, that it will, but we have not spoken to the SFWA to clarify this.

    If you have a work of hypertext fiction you think might be a good fit for this call, please query us to discuss how to submit it.

     

    How to Contact Us:

    To contact us for any reason, write to intelligenceprize@gmail.com with the word QUERY: at the beginning of your subject line. Add a few words to the subject line to indicate what you're querying about.

     

    (We've discontinued the previous, smaller monthly prize in favor of this more standard 'Publishing House Call' model.)
u/FertileLionfish · 2 pointsr/learnprogramming

I personally love Python and try to get a lot of my college classmates to try it. Python is very simple, but powerful and, in my opinion, intuitive. While it is dynamically typed (some view this as a plus, others as a negative), I don't really care either way. The biggest reason I'll recommend Python to somebody new and interested in programming is how it encourages good style, so later on down the road, coding in other languages just feels natural and your code will generally make more sense. If you're also interested in security/pentesting, look into Violent Python. I wish you the best of luck getting into programming; it's frustrating at times, but very rewarding in the long run.

u/d0cc0m · 1 pointr/cybersecurity

It's never too late. I didn't get into the field until my mid 20s. It really just takes an interest and a desire to learn. Cyber security is a pretty large field so play around in the different sub-fields and find the one(s) that interest you.

Here are some resources to get you started:

Books:

u/TurkishSquirrel · 3 pointsr/learnprogramming

It depends a bit on what areas you're interested in. For interactive graphics you'll likely do OpenGL or DirectX or such.
Non real-time graphics usually means ray tracing or some variant like photon mapping where you want to produce physically correct images, with flexibility depending on your art direction e.g. Big Hero 6. With ray tracing you're essentially simulating how light interacts in the scene.

Here's some useful books/links for real time graphics:

  • Real-Time Rendering this is a great book covering a lot of theory/math topics behind real time graphics techniques, so it's agnostic to whatever rendering API you use. The book's website lists more graphics related resources and is quite good.
  • OpenGL Superbible good book focusing on OpenGL, written for beginners with the API.
  • open.gl very good introductory tutorials for OpenGL, I just wish it covered some more content. Should give you a solid start though.

    Here's some for ray tracing:

  • Physically Based Rendering this is basically the book for ray tracing; the 3rd edition should be coming out this spring, though, so if you want to save some money you could wait a bit. There's also a website for this book.

    For general math topics I also recently picked up Mathematics for 3D Game Programming and Computer Graphics which looks very good, though I haven't gone through it as thoroughly.

    As mentioned already /r/GraphicsProgramming is a good subreddit, there's also /r/opengl for OpenGL questions.
u/YoloSwag9000 · 2 pointsr/computerarchitecture

Typically companies do not publish full details about their IP, because then it would be easy to copy them and they would lose any competitive advantage they have. However, there is a remarkable amount of detail out there about how processors work, as many of the old techniques for branch prediction, caching and so forth are still around. There is a good (and free!) Udacity course called "High-Performance Computer Architecture" where some of these things can be learned. I can also recommend the books "Advanced Computer Architecture: A Design Space Approach" (Sima) and "Computer Architecture: A Quantitative Approach" (Hennessy & Patterson). The website Real World Tech posts some very informative articles where they dive deep into the microarchitecture of Intel processors (such as this Haswell writeup) and others. Another port of call is the ecosystem of RISC-V, an open-source instruction set. They have a partial list of core and SoC implementations that you could pick through. If you fancy looking into GPUs, the book "Real-Time Rendering" (Akenine-Moller et al.) will start you off with the basics of the graphics pipeline. Both AMD and NVIDIA publish varying amounts of information about how their GPUs work. The Broadcom VideoCore-IV has had full microarchitecture specs published, which you can find easily with Google.

​

If you really want to learn this stuff in detail, I would highly recommend designing a CPU/GPU and writing a simulator of it. Start by designing an instruction set, then building a very simple scalar in-order processor. Then add features such as branch prediction, register renaming, out-of-order execution and so forth. At University I wrote a CPU simulator for my Advanced Architecture class, then a cut-down GPU simulator for my Master's Thesis project. From these I managed to land an awesome job writing GPU simulators, so if computer architecture is something you want to pursue as a career I can strongly recommend completing a project like this. You will learn plenty and have something to talk about with potential employers.
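To show how small such a simulator can start, here's a sketch of a toy accumulator machine in Python. The instruction set is invented purely for illustration, not taken from any real ISA; a serious project would grow this into a pipelined model with the features mentioned above:

```python
# A toy accumulator machine: each instruction is an (opcode, operand) pair.
def run(program):
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":       # acc = arg
            acc = arg
        elif op == "ADD":      # acc += arg
            acc += arg
        elif op == "JNZ":      # jump to instruction arg if acc != 0
            if acc != 0:
                pc = arg
                continue       # skip the default pc increment
        pc += 1
    return acc

# Count down from 3: loop over ADD -1 until the accumulator hits zero.
print(run([("LOAD", 3), ("ADD", -1), ("JNZ", 1)]))  # 0
```

Once the fetch-decode-execute loop works, adding registers, memory, and a branch predictor is incremental.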

Good luck!

u/ryanbuck_ · 2 pointsr/learnmachinelearning

It sounds like you have identified your weakness. Presently, that is programming in python, and using the sklearn library.

I would recommend taking a MOOC on Python first. Lynda.com has a free trial and Python videos. DataCamp is another good start; it has a free trial and maybe some Python basics, but definitely something on sklearn, and you can get some pandas training or R training there (the Kaggle libraries, most likely).

At that point, if you are going the TensorFlow route, Aurélien Géron has a great hands-on book called Hands-On Machine Learning with Scikit-Learn and TensorFlow.

If you’re going with pyTorch I dunno.

Your mileage is going to vary, you could always use a book to learn python, or whatever.

Just make sure you learn to program first, you’d be surprised how much 2 weeks of very hard work will earn you. Don’t expect it to be ‘easy’ ever tho.

Also, if you’re not formally educated in statistics, keep an eye out for statistics advice until you have the time to work on it (like in a MOOC, course, or blog). Learning some real analysis will make understanding the papers a real possibility (once again, it will probably never be easy).

It is truly stunning how many years of preparation it takes to become competent in this. It’s a lovely science, but the competent ones have generally been on a mathematical/science track since 5th grade. Doesn’t mean we can’t become competent but it takes time. Imagine the equivalent of an undergraduate degree just devoted to ML and you’re about there.

u/Statici · 4 pointsr/Physics

I got the most understanding out of reading Nielsen and Chuang's Quantum Computation and Quantum Information.

It delves into what happens and what can be done with quantum information - that is, how qubits are different from bits. Philosophically, I don't think there is anything more important than that; it's nice to see what particles make reality up, but you don't get much idea as to what those particles are actually doing. As a forewarning though: This book will probably push you towards a many-worlds interpretation. Not because they push it; it's just (kind of) necessary to think that way, when considering large sets of quantum information interacting.
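The bit/qubit difference can be made concrete in a few lines. A qubit is a pair of complex amplitudes whose squared magnitudes give measurement probabilities; this is standard intro-textbook math, with toy code of my own (not from the book):

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a, b)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Born rule: probability of each outcome is |amplitude|^2."""
    return tuple(abs(amp) ** 2 for amp in state)

zero = (1 + 0j, 0j)        # the |0> state: a classical bit can do this
plus = hadamard(zero)      # equal superposition: a classical bit cannot
print([round(p, 3) for p in probabilities(plus)])   # -> [0.5, 0.5]
```

A classical bit is always at probability 1 of being 0 or 1; the superposition above, and the interference you get when amplitudes cancel, is where the qubit story starts.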

In terms of physics, it has only a single chapter dedicated to the direct exploration of Schrodinger's equation. After that, it starts to dig into "what's it like when we have more than one quanta?" which is...well, I can't summarize it in a post. If you would like a PDF copy, I found one online a long time ago, I could PM it to you :)

In any sense: I've had this book for three years now and it is by far the best buy I have made in ever. QI is growing in importance (mostly with regards to the AdS/CFT correspondence in quantum gravity theories) and it is also always nice to know (ahead of time) how quantum computers are going to be working!

u/christianitie · 18 pointsr/math

Without knowing much about you, I can't tell how much you know about actual math, so apologies if it sounds like I'm talking down to you:

When you get further into mathematics, you'll find it's less and less about doing calculations and more about proving things, and you'll find that the two are actually quite different. One may enjoy both, neither, or one, but not the other. I'd say if you want to find out what higher level math is like, try finding a very basic book that involves a lot of writing proofs.

This one is aimed at high schoolers and I've heard good things about it, but never used it myself.

This one I have read (well, an earlier edition anyway) and think is a phenomenal way to get acquainted with higher math. You may protest that this is a computer science book, but I assure you, it has much more to do with higher math than any calculus text. Pure computer science essentially is mathematics.

Of course, you are free to dive into whatever subject interests you most. I picked these two because they're intended as introductions to higher math. Keep in mind though, most of us struggle at first with proofwriting, even with so-called "gentle" introductions.

One last thing: Don't think of your ability in terms of your age, it's great to learn young, but there's nothing wrong with people learning later on. Thinking of it as a race could lead to arrogance or, on the other side of the spectrum, unwarranted disappointment in yourself when life gets in the way. We want to enjoy the journey, not worry about if we're going fast enough.

Best of luck!

u/the_omega99 · 3 pointsr/learnprogramming

It's somewhat old and uses the MIPS architecture (still very applicable, although you probably wouldn't directly work with MIPS hardware), but I thought Computer Organization and Design was pretty good. MIPS is a very good ISA for learning purposes, due to its simplicity, and there are a number of simulators available for trying out programs.

The book will not only build up assembly language, but also the very design of the processor and then some. And of course, that does include digital logic. For example, there's a mention of the design of an ALU, as well as optimizations such as carry look-ahead. So you'd see things like one-bit adders as a set of basic logic gates (ANDs, ORs, etc.), while an ALU would be built with multiple one-bit adders, etc. At the CPU level, things will usually be a little higher level, since we end up with multiple ALUs and numerous multiplexers (not to mention complicated subsystems like RAM and the registers). Overall, it's pretty good at managing the abstractions and specifics.
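That layered construction is easy to sketch: a one-bit full adder from basic gates, then a multi-bit adder built from copies of it. This is the simple ripple-carry version, a simplification of the carry look-ahead designs the book covers:

```python
# Basic gates, modeled on bits 0/1.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, cin):
    """One-bit full adder: returns (sum_bit, carry_out)."""
    s1 = XOR(a, b)
    total = XOR(s1, cin)
    carry = OR(AND(a, b), AND(s1, cin))
    return total, carry

def ripple_add(x, y, width=4):
    """Chain `width` full adders, feeding each carry into the next."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(ripple_add(6, 7))   # -> 13
```

The carry chain is exactly why ripple-carry is slow and why the book spends time on carry look-ahead: each adder must wait for the previous carry.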

u/AlphaMotel · 1 pointr/compsci

Mathwise you could start with some basic number theory
I found Rosen's Discrete Mathematics textbook to be really helpful.


You could also start with boolean algebra (AND OR NOT XOR ) bit shifting and so on since it will be absolutely useful later on.


For computer hardware and assembly language, I used Art of Assembly Language by Randall Hyde and Computer Organization and Design by Patterson and Hennessy.

For cryptography you might learn all about prime numbers, algorithms to find really large prime numbers, random number generator algorithms, and why some are more random (cryptographically strong) than others.

Then using that you can apply it towards public/private key encryption, one-way hash functions, and the main hash algorithms in public use
(MD5, SHA-1, SHA-512) and how they compare against each other.
And how one-way hash functions are used to verify data integrity.
I found Gary Kessler's site to be really helpful


For password security you can then understand how to use a one-way hash function with a salt and a nonce to make a reasonably secure password storage system. You could learn how one could safely store password hashes in a database like MySQL (www.mysql.com).
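A minimal sketch of that salted-hash scheme using only the Python standard library (pbkdf2_hmac as the deliberately slow hash; production systems often prefer bcrypt, scrypt, or argon2):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest). A fresh random salt defeats rainbow tables."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, expected):
    """Re-hash with the stored salt and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)

salt, stored = hash_password("hunter2")   # store salt + digest, never the password
print(verify_password("hunter2", salt, stored))   # -> True
print(verify_password("wrong", salt, stored))     # -> False
```

The key points are all visible here: only the salt and digest are stored, the hash is one-way and slow, and verification re-derives the digest rather than ever recovering the password.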


And once you understand one-way hash functions and public and private keys, you are already 90% of the way to understanding how the Bitcoin protocol works, how CPUs mine bitcoins, and how the public blockchain ledger works.
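The mining part, in particular, reduces to a brute-force hash search. A toy proof-of-work sketch (difficulty chosen tiny so it runs instantly; real Bitcoin uses double SHA-256 and a vastly harder target):

```python
import hashlib

def mine(block_data, difficulty=2):
    """Find a nonce so the block's hash starts with `difficulty` zero hex digits."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("toy block")
print(digest.startswith("00"))   # -> True
```

Finding the nonce takes many attempts, but anyone can verify it with a single hash, which is the asymmetry the whole scheme relies on.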

For other languages, another language you could easily learn is Java using Processing. I really do enjoy using it and it was easy and fun to learn, and I use it a lot for rapid prototyping.

u/pianobutter · 2 pointsr/askscience

Oh, I have a bunch of recommendations.

First, I really think you should read Elkhonon Goldberg's The New Executive Brain. Goldberg was the student of neuropsychology legend Alexander Luria. He was also a good friend of Oliver Sacks, whose books are both informative and highly entertaining (try The Man who Mistook his Wife for a Hat).

I also think Jeff Hawkins' On Intelligence is a great read. This book focuses on the neocortex.

I think you'll also appreciate Sapolsky's Why Zebras Don't Get Ulcers. Sapolsky is a great storyteller. This book is a pretty good primer on stress physiology. Stress affects the brain in many ways and I'm sure this book will be very eye-opening to you!

More suggestions:

The Age of Insight and In Search of Memory by Eric Kandel are good. The Tell-Tale Brain and Phantoms of the Brain by Ramachandran are worth checking out. If you are interested in consciousness, you should check out Antonio Damasio and Michael Graziano. And Giulio Tononi and Gerald Edelman.

If you're up for a challenge I recommend Olaf Sporn's Networks of the Brain and Buzsáki's Rhythms of the Brain.

u/cunttard · 1 pointr/C_Programming

Start with getting the XCode developer tools and the command-line package.

C is an important language in Computer Science because it is pretty much the language for heavy duty Operating Systems, the type you see in Desktop OSes, Network OSes (the type that runs on a networking router/switch), Server OSes (Linux, BSD, Windows, etc.).

I think C is a hard language to learn, but it is a great first serious language while also simultaneously learning an easier language like shell or Python to make yourself more efficient/productive.

However, fundamental to CS is the theory of computation, not languages. Languages are just a way to express computation. Some languages are better than others at expressing computation to solve certain problems. I would highly encourage also looking into understanding computation from first principles; a great introduction is Introduction to the Theory of Computation (the 2nd edition is really cheap used). The only background knowledge you need is high-school mathematics.
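As a taste of that first-principles view: the first machine model in that book, the deterministic finite automaton, takes only a few lines to simulate (the example DFA below, which accepts binary strings with an even number of 1s, is invented here):

```python
def dfa_accepts(s, transitions, start, accepting):
    """Run a DFA: follow one transition per input symbol, then check the state."""
    state = start
    for ch in s:
        state = transitions[(state, ch)]
    return state in accepting

# DFA over {0,1} tracking the parity of 1s seen so far.
even_ones = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
print(dfa_accepts("1011", even_ones, "even", {"even"}))  # -> False (three 1s)
print(dfa_accepts("1001", even_ones, "even", {"even"}))  # -> True  (two 1s)
```

Everything the machine "knows" is its current state, which is exactly the limitation the theory then explores with pumping arguments and stronger models.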

u/ocusoa · 3 pointsr/Physics

Do you know which fields of physics are you interested in?

If Quantum Information/Quantum Computation sounds interesting, I would look at this book. I used it when I first learned about the topic. It doesn't assume much advanced math, just basic matrix/vector multiplications will suffice.
There's a reason the book doesn't assume much prior knowledge. It has two parts, Quantum Information and Quantum Computation. Roughly speaking the former is physics and the latter is computer science. And usually physicists don't know much about computer science and computer scientists don't know much about physics.


There's also another book, "Q is for Quantum", published very recently by Terry Rudolph. I haven't read the book myself (I plan to), but from what he described in an email it might be something you're looking for:


> I have finally finished a book, "Q is for Quantum", that teaches the fundamentals of quantum theory to people who start off knowing only basic arithmetic.

> I have successfully used this method in outreach with students as young as 12, but of course it is much easier when you can have a proper back-and-forth dialogue. In practice it is late-stage high school students I am most passionate about reaching with this book - I believe quantum theory can (and should) be taught quantitatively in high school, not 2 years into an undergraduate physics degree! In fact I would be delighted if the 3rd and 4th year students entering my undergraduate lecture courses already understood as much quantum theory as covered in the book.


Have fun!

u/EtherMan · 1 pointr/KotakuInAction

> For your reading pleasure: http://www.amazon.com/Where-Wizards-Stay-Up-Late/dp/0684832674 Knowledge is power.

Thanks. But I work in that field and am well versed in what the internet is, what it is not, and its history. ARPANET is not the internet. It never was, and never will be. As I said, it's a completely different network for a completely different purpose. The TECH used for the development of ARPANET was repurposed to create the internet. But saying that it therefore is the internet is like saying a car is a horse cart. It's simply not true.

> The OP did, when contending that the rough-and-tumble days of Usenet justified, even necessitated, a similar treatment of the ideology of "safe spaces" today.

No he didn't. You trying to cram that interpretation into it, does not make it one. Nor was that what he said. Why are you lying?

> Our legal system disagrees with you, vehemently. Here is achildren's primer on the subject, which seems about your speed:

No it does not. There are plenty of rulings from multiple courts at multiple levels regarding this. Free speech has nothing to do with what a company can and cannot do with their platforms. Free speech is about free speech, nothing else. That's not to say they're not allowed to control their platform; it simply has nothing to do with free speech. That a company has no obligation to let you use their platform for your free speech is COMPLETELY different from the company having free speech rights. Is the education on the difference between free speech and the first amendment REALLY this bad in the US, that even people in Europe know it better? You're just being silly now... Seriously...

u/ziptofaf · 2 pointsr/learnprogramming

>is book could have been useful also for C++ real-time programmers of course because i would include also HW information used in that field.. probably I'm asking too much now..

It wouldn't be. You misunderstand how that field of programming works. Differences can be HUGE and you would end up using at least something like https://www.amazon.com/Computer-Organization-Design-MIPS-Architecture/dp/0124077269.

Why? Because the hardware used there can be fundamentally different from your typical computer. How much? Well... some CPUs don't support recursion. No, really. Do more than 2-3 recursive calls and the CPU runs out of memory. You also end up using FPGAs and ASICs. To explain all that takes way more than a book.

You seem to want a hypothetical book on "current PC hardware and its performance". Frankly, that doesn't exist in book form; it comes from visiting places like Guru3D and AnandTech. The actual low-level differences that WILL matter for a programmer are hidden in CPU spec sheets, and to read those you need resources that target computer architectures and your problem domain specifically. Well, that and practice really - someone working in game engine development is likely to know the graphics pipeline like the back of their hand and can easily talk about performance on several GPUs and pinpoint what makes one better than the other. But that comes from experimenting and plenty of articles, not a singular resource. There are too many different requirements to create a single resource stating that "X is good for Y branch of programming but bad for Z".

I mean, even within desktop CPUs themselves: would you rather have a 14-core i9 9980XE or a 32-core Threadripper 2990WX? The answer is: it depends. One has far superior single-threaded performance due to higher clocks; the other will eat it alive in heavily multithreaded, independent processes (the 2990WX has 32 cores but only 16 are connected to the rest of your computer, which can cause very visible delays, so there are multithreaded scenarios where it will underperform). And in some cases you will find that an 8-core 9900K is #1 in the world. It ALL depends on a specific application and its profile.

u/c3261d3b8d1565dda639 · 1 pointr/programming

> I know basically nothing about x86 internals to make an accurate statement

If you're interested in learning about the internals, check out some real world technologies articles. For instance, Intel’s Haswell CPU Microarchitecture. On page 3, Haswell Out-of-Order Scheduling, it talks about the register renaming that goes on to support out-of-order execution.

It's more detail than most people really need to know, but it's interesting to understand what modern microprocessors are doing under the hood during program execution.

For anyone else reading, an even easier introduction to the topic is in the awesome Computer Systems: A Programmer's Perspective. It'll get you comfortable with the machine language representations of programs first, and then move on to basic architecture for sequential execution, and finally pipelined architecture. It's a solid base to move forward from to modern architecture articles like on real world technologies. There are more detailed treatments if you're really interested, e.g. Computer Architecture, A Quantitative Approach, but I have never read it so can't say much about it.

u/Fruitcakey · 10 pointsr/learnprogramming

Well, I'm entering the final year of my degree. I did it the hard way, with lots of fanny-arsing and making life difficult for myself. I strongly recommend you don't do it my way.
In my experience, in your first year you won't get exposed to a great deal of code, nothing a clever university student can't cope with.

If I were to re-do my degree, I would get a grasp on the more theoretical side early on. Over the next 4 years you'll be doing plenty of programming; what programming you can cram into the summer ultimately won't count for much.

Early on you will be exposed to logic gates: AND, NOR, XOR, NAND etc. Think of these as the smallest possible circuit-level building blocks that computers run on. You can construct all types of logic gates using the universal gates NOR and NAND. If you can teach yourself that (which I don't think would be too difficult) then you'll be ahead in at least one of your classes.
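That universality claim is easy to check by machine. Here NOT, AND, and OR are built purely from NAND and verified over every input (a toy sketch, not course material):

```python
def NAND(a, b):
    """The only primitive gate we allow ourselves."""
    return 1 - (a & b)

# Everything else built from NAND alone.
def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))   # De Morgan in gate form

# Exhaustively verify the constructions against the truth tables.
for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert OR(a, b) == (a | b)
    assert NOT(a) == 1 - a
print("all NAND constructions check out")
```

The same exercise works starting from NOR, which is why both are called universal gates.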

At some point you will have to learn a functional programming language, and if you're looking at it for the first time, it's a complete mind-fuck.
http://learnyouahaskell.com/
That's an excellent resource for learning haskell, arguably the most popular functional language. If you manage to work through some of that, you'll be miles ahead of your class mates still struggling with the concept.

You will likely have some sort of data structures and algorithms class, so if I were you, I'd familiarise myself with the main ones.
Learn the difference between an array and a linked list, how quick sort and merge sort work. Trees and Binary Trees, breadth first search vs. depth first search. Amongst others. Don't exhaust yourself over it, but at least get a flavour for it.
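Breadth-first vs depth-first is a good flavour of how small the difference can be: the same traversal with the frontier kept as a queue or as a stack (toy graph invented here):

```python
from collections import deque

# Small directed graph as an adjacency list.
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

def traverse(start, breadth_first=True):
    """Visit every reachable node; queue order = BFS, stack order = DFS."""
    frontier, seen, order = deque([start]), {start}, []
    while frontier:
        node = frontier.popleft() if breadth_first else frontier.pop()
        order.append(node)
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return order

print(traverse("A", breadth_first=True))   # -> ['A', 'B', 'C', 'D']
print(traverse("A", breadth_first=False))  # -> ['A', 'C', 'D', 'B']
```

One boolean flips the exploration strategy, which is a nice thing to have internalized before the data structures course formalizes it.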

Maybe in first year, maybe in second. You'll start learning about Instruction set architectures, cache, operating systems and some assembly. If you're keen check out this:
http://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503
I genuinely can't recommend it enough.

Learning about the Internet Protocol Suite is probably a good idea. I found it really interesting and not too complex.


These are just my suggestions. In my opinion, they are manageable, can be self-taught, and will provide you with a good head start and cover a few bases. Sure, at some point you will need to delve into number theory, graph theory, and proofs by induction, but don't jump into the deep end too soon. You'll end up overwhelmed with big gaps in your knowledge. Hope it helps.

u/TexturelessIdea · 2 pointsr/gamedev

If you want to know how to deal with all the negative comments people give you, the only real answer is to ignore them. If you want to know how to convince people that being a indie dev is a worthwhile pursuit, then you have to release a game that sells really well.

This may sound like very useless advice, but the truth is that getting into indie game development is not a good idea. Most likely you will never finish a game, or you will release a game that nobody cares about and that doesn't make you much (if any) money. Some people spend years making a game and still end up releasing a bad game. The simple truth is that no amount of hard work, dedication, or love of game development is going to guarantee your success.

Most aspiring gamedevs like to talk about Minecraft, Stardew Valley, or Dwarf Fortress, as if the existence of those games guarantees their success. Most people don't realize that for every Notch, there are 1,000 people who make games nobody even knows about. Most likely you, me, and most of the other people here will fail to make a game that earns enough money to live off.

If you can't afford to release a game or two (or 5) without turning a profit, then game development just isn't for you. If my post upsets you, keep in mind that you will hear much worse from loads of people no matter how good your game is. I would never recommend indie game development to anybody as a career choice; it is very hard work that will most likely earn you less money than working part time at minimum wage. You should think of game development as a fun hobby, because until you make a big hit, game development isn't a career any more than buying lottery tickets is.

If you've made it to the end of my post and you still want to be a game developer, well that's the kind of attitude you're going to need, so you have that going for you. I do also have some practical advice for improving your gamedev skills. When you're talking about your knowledge of programming, you seem hung up on the language itself. Knowing a programming language makes you about as much of a programmer as knowing a human language makes you a writer. I'm not saying this to be mean (you may find that hard to believe at this point); I'm just trying to point out that there are other aspects of programming for you to learn. Some good things to read up on are programming(or design) patterns, algorithm design, and general (language agnostic) programming topics. There are also game design topics that don't relate to the programming aspects. I'll leave a quick list of resources below.

Project Euler

Theory of Fun for Game Design

Game Programming Patterns

Coursera's Software Development Category

MIT Open CourseWare Computer Science Category

u/scarthearmada · 6 pointsr/explainlikeimfive

The internet isn't a specific 'thing'; there is no internet box that you can point to and say, "that's the internet!" The internet is an abstract term applied to a series of computer networks of an indeterminate number greater than one. This is important because prior to the networking of two distinct networks together, you only had two distinct, non-communicating networks.

There is a varying level of redundancy in the connections between the various networks, all with one specific thing in common these days: the TCP/IP internet protocol suite. It was the best way of allowing for common communication between distinct computer networks.

If you visualize a long line -- a wire -- and then envision computer networks connecting to it via servers and more wire, you're envisioning what the internet is at a basic, broad level. There is a great video on YouTube that explains the internet this way; I'm trying to locate it now. However, if you enjoy reading about such things, there are two fantastic books that I recommend on the subject:

  1. Where Wizards Stay Up Late: The Origins Of The Internet

  2. Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web

    The former explores the history of the internet, taken as a summation of its parts and their creation. The latter explores the origins (and potential futures) of the World Wide Web, a specific application of hosting and sharing documents (and other media) across the internet conveniently. It's written by Tim Berners-Lee, the number one scientist behind its creation. I include this link because it is a common misconception that "the internet" is "the world wide web."
u/MyJimmies · 2 pointsr/truegaming

It's been on my mind again, so I'm happy to see it here on Truegaming. But there's this video that might help out a bit, or at least be entertainingly interesting.

It might be a while until we are at the point where we can have entire schools based around this kind of discussion. But hopefully someday. There are plenty of interesting books out there that have already been suggested here. There are books on game design like Raph Koster's "A Theory of Fun". There are YouTubers like the aforementioned MrBTongue and Satchbag who talk fondly about games, or themes in games, and how they affect them and those around them. Then there's /r/truegaming that discusses these things as well, albeit a bit more fanatically.

But sadly I got nothing to fit exactly your category that you want to see, though I'd love to see it myself. Perhaps a start for finding some stories of interesting user interactions in MMOs can start with Eve Online. Check out The Mittani. Although I haven't read it in a long while I do remember its launch when I still flew with Goonswarm/Goonwaffe and the cool pilots and writers of the site. Some great stories and unintentionally interesting insight into the mindset of players interacting in an MMO space.

u/CastigatRidendoMores · 0 pointsr/singularity

Bostrom's Superintelligence covers gene editing very well, but let me summarize:

The singularity isn't likely to come through gene editing. The reason is that it's too difficult to improve on the brain. If you identify which genes are responsible for genius and activate them (which is difficult, to say the least), you could get everyone as intelligent as the smartest person yet. But where do you go from there? You'd have to understand the brain on a level far, far beyond what we do now.

Then, if you did that, chances are you'd run into diminishing returns. It would be a lot of work to increase everyone's IQ by 5 points once, but far more work to figure out how to do it the 10th time. Rather than exponentially increasing gains in intelligence, you get logarithmic increases.

Not to say I'm not a fan of gene editing. It's obviously fraught with controversy when used beyond curing disease, but compared to other forms of trans-humanistic techniques it would leave us with a lot more humanity intact.

u/Javbw · 2 pointsr/DoByFriday

Good ones!

I suggest trying to wear two “outer” shirts for one waking day - dress, polo, or any other type of collared shirt.

Find and buy one item solely for airplane miles arbitrage.

Watch an anime from John Siracusa and have him as a guest. I want to hear Max and Merlin pick on John a bit, though he is almost always “good cop”.

For a serious one (if they ever want to do “serious”) I would love for all of them to expound on their thinking of how the mind handles memory/ consciousness - though this might be a Rec/Diffs topic just for John and Merlin:

I read a fascinating book (On Intelligence) that not only explained in lay terms how your brain (logically) processes inputs, but had a good theory of how a single method of working explained learning, practice, memory, and actually moving your muscles to do something - most theories can’t explain them all in a single method.


Speaking of miles arbitrage, my brother-in-law is a frequent traveler using arbitraged miles. He routinely buys money orders that offer some kind of very large miles bonus, then deposits them in the bank to pay the bill; the small fee for the money order is offset by the mileage gain. He has travelled more in a couple of years than I have in my entire life - some of it paid, some on miles. Considering he is the one who handles money responsibly, and I am most certainly not, he must be onto something.

That might also be a good topic: revisit a lesson they learned about handling money.

u/kimchi_station · 2 pointsr/netsecstudents

So this is aimed at people in a cyber security degree? What kind of knowledge do they have?

> using all the tools of kali

Pleaseeee no. There are hundreds of programs and scripts in Kali, it would not be feasible to learn and remember them all. Off the top of my head what I would do is:

  • Have people do some of the starter wargames at overthewire so they are familiar with the linux command line. Maybe even make this a requirement to participate so you know that people are committed and have a base level of knowledge.

  • Read write-ups on attacks and attackers; here is a good one by Mandiant (PDF link)

  • Culture. I feel like this is one of the most neglected fields in cyber security. Read some phrack.

  • Split people into teams to work on projects so that they have experience working together.

  • Find some old CTFs or images on Vulnhub. See if you can register for some CTFs, looks great on a resume.

  • Learn about sql and sql injection.

  • Learn python, take a look at Violent Python or Gray Hat Python and Black Hat Python for more advanced stuff. There is also Hacking Secret Ciphers with Python for more of a crypto angle.

  • linux, linux, linux. feel at home in the terminal and be able to script bash.

  • Going over basic tools like nmap, aircrack-ng (airmon-ng, etc.), sqlmap, hydra, hashcat, metasploit, etc. Make whole day labs that use just one tool, You could maybe find an easy Vulnhub image or use Metasploitable to practice these.

  • Make sure everyone has a github and populates it with stuff they create in this class. Incorporate it into your class so you got people forking and contributing to other members/teams projects.

  • Look over books like The Hacker Playbook, Hacking, the Art of Exploitation, and so on for more ideas.

  • Maybe most importantly, have the students teach. I'm sure there are people in there who specialize in one tool or subject. Have them design and lead a lesson/lab/activity. The best way to solidify and expand on what you know is to teach it.
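
The SQL injection item above can be demonstrated end-to-end with the sqlite3 module from the standard library (the table and the attacker string are invented for the demo):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the input is spliced into the SQL text itself,
# so the OR clause becomes part of the query.
unsafe = db.execute(
    f"SELECT secret FROM users WHERE name = '{attacker_input}'").fetchall()
print(unsafe)   # -> [('hunter2',)]  the row leaked

# Safe: a parameterized query passes the input as data, never as SQL.
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)).fetchall()
print(safe)     # -> []
```

A lab built around exactly this contrast (break it, then fix it) makes the sqlmap sessions later on much more meaningful.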
u/samort7 · 257 pointsr/learnprogramming

Here's my list of the classics:

General Computing

u/compSecurity · 24 pointsr/netsecstudents

I'd recommend learning to use Linux well first, since that is what you will need to use a lot of the tools for Pen Testing, after that you can choose an area to start with, most go with web app sec or net sec, since those are most in use right now - after that you can move into areas like cloud security, forensics or some other specialty.

As far as resources go there are a lot out there, i'll link some good ones that I use:

https://github.com/wtsxDev/Penetration-Testing

https://github.com/jivoi/offsec_pdfs

Those two should keep you going for a while at least.

As for coding, I'd recommend learning Bash first, then Python. Bash is the Bourne Again SHell, a scripting language used on Linux that you will use a lot, and Python is a language that is used a lot in offsec.

Here is a place where you can learn some Bash:
https://www.tldp.org/LDP/Bash-Beginners-Guide/html/Bash-Beginners-Guide.html

There are two books i'd recommend for python, ill link them here:
https://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579

https://www.amazon.com/Black-Hat-Python-Programming-Pentesters/dp/1593275900

The book in the second link is a bit easier to approach in my opinion, but both require some basic knowledge of Python, so YouTube or Google some tutorials and I'm sure you'll do fine.

If you want to get into pen testing web apps, then you will want to learn some PHP and JavaScript, a lot of websites are written in PHP, and a lot of exploits are executed with JS: Cross site scripting in particular. You should also learn some SQL since that is another common one for manipulating databases, and can be attacked in a method known as SQL injection.

If you want a place to practice things you are learning then go here: http://overthewire.org/wargames/
They offer some pretty basic war games for things like linux commands and what not so you can really test your knowledge and learn a lot of the things you will have to do to progress through the games.

That's all I can think of atm, but I'm sure other people in here will be happy to give you some more suggestions

good luck!

u/TynanSylvester · 6 pointsr/gamedev

I wasn't very confident, I actually pushed out the Kickstarter kind of fast because I didn't want to come out just as or after Spacebase DF-9 was announced.

There was no initial following. First PR move I made was actually a test - I put the first look video (https://www.youtube.com/watch?v=vV6wyeZ7458) up on Reddit and elsewhere and tried to gauge the response. When the response was very strong, I moved forward with Kickstarter a few weeks later. For KS, I was confident enough to put the goal at 20k; that's about it. I was pretty sure I could hit that number. Of course the game did 13x that amount, but this was very much not anticipated :P

Basically, I think PR works the same as game design. Develop the pitch, test it on people, see what they respond to, emphasize that, replace the rest, repeat until every part of the pitch hits really well. I'd been refining the RimWorld pitch for about 5 months on friends, family, and acquaintances just by explaining it and watching their facial expressions as I mentioned different points. By the time of the KS I knew (not thought) that every point worked - DF-like, sci fi colony, western theme, Firefly/dune inspiration. I'd cut the parts that didn't work, like the good/evil colony idea.

RW also had a nearly unheard-of advantage on Kickstarter, which was that it worked and was already fun (which I was confident of, again, because I had run lots of controlled playtests and refined the game for about 5 months by then). Got lots of traffic from people who came from YouTube vids of people playing the journalist-only pre-pre-alpha.

&lt;ShamelessPlug&gt;I've written much more at length on game design in general in my book: http://www.amazon.com/gp/product/1449337937 &lt;/ShamelessPlug&gt;

u/Flofinator · 0 pointsr/learnprogramming

Yikes! Well, it's going to be pretty hard for you to really understand Python without actually coding in it.

The one thing you could do though is get a book with examples and write them down and try to modify the examples to do something a little extra while at work.

I find the Head First books (http://www.headfirstlabs.com/books/hfpython/) the absolute best for almost anything if you are just starting out. The Java book is especially fun!

I know this isn't exactly what you are asking but it might be a good resource for you to start using.

Another great book, which will teach you parts of the theory and has really good examples of how computers work, is http://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2.

That really helped me think about computers in a more intuitive way when I was first starting. It goes through the history of computing, down to what an adder is, and more. I highly recommend that book if you want to understand how computers work.

u/zxcdw · 4 pointsr/hardware
  • Learn how to program, in any language just to get the hang of the way to think algorithmically
  • Learn how your OS executes your programs on your CPU
  • Learn how to program in assembly language
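
One low-effort way to start peeking at how your programs actually execute, if you happen to know Python, is the standard library's `dis` module, which prints the bytecode the CPython VM runs:

```python
import dis

def add_one(x):
    return x + 1

# Show the VM instructions CPython generated for this function
# (you'll see loads, a binary add/op, and a return)
dis.dis(add_one)
```

Going from this to real assembly (e.g. `objdump -d` on a compiled C binary) is the natural next step.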

    That's just the beginning, but even from that you can infer so much when it comes to matters related to hardware. The low-level details of how AMD or Intel implement their FPU or ALU, or the communication protocol of their memory controllers, etc. are utterly irrelevant unless you need something really specific. Studying such matters in isolation leads you nowhere, but actually having some understanding of how a computer operates in general, and actual experience of making the computer operate by programming it, is a huge deal.

    Obviously this doesn't help you reason about individual products based on some specific microarchitecture compared to another, but it creates the foundation for you to actually dig deeper into the subject.

    Really, there's so much to it. There are many subjects in computer science, electrical engineering and even software development that come into play. It's not about individual facts and rote learning, but about being able to generalize and synthesize ideas from your knowledge, plus enough critical thinking skills to recognize bullshit.

    But if you just need one piece of advice, here it is: read Computer Organization and Design, Fourth Edition by Patterson &amp; Hennessy. Or really, any other book regarding the subject. ...and learn to program, for real.

    Wikipedia also has good articles on many subjects, and it is a great source for further sources of information. Some clever use of Google-fu can also get good course materials, research papers, or just about anything into your hands. For example, using site:*.edu is your friend.
u/noeda · 9 pointsr/roguelikedev

The Baconist

(there's no bacon in this game despite the name: I originally was going to make a roguelike about stretching a really long bacon across the dungeon but it's going in a different direction now)

It can be played here: https://submarination.space/baconist132/index.html

Screenshot of one of the unfinished puzzles: https://submarination.space/baconist/puzzle.png

(move with vikeys or numpad or WASD or mouse)

This is a puzzle game where you need to solve puzzles. Right now that means you push boulders like in a sokoban because I haven't got around to implementing actual interesting mechanics yet.

Over the past two weeks I've managed to lay down the most important technical foundations:

  • Performance has been slightly improved (still crappy on browser but it's playable; the game can be compiled to native version that runs in a terminal and it's way better in there).
  • Field of view now works more sensibly when you look through portals.
  • Your character can still remember parts of a level that have been seen before (they are shaded darker)
  • I now have a system that makes it fairly easy to design the entire dungeon (I essentially just have a giant text file that's interpreted and turned into the world).
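
A text-file-to-world loader like the one described can be surprisingly small. Here's a toy sketch (my own illustration, not noeda's actual format):

```python
# Hypothetical mini-level: walls, floor, water, and a player start marker
LEVEL = """\
#####
#@.,#
#####"""

TILES = {'#': 'wall', '.': 'floor', ',': 'water', '@': 'player_start'}

# Interpret the text file into a world: one tile per (x, y) coordinate
world = {}
for y, row in enumerate(LEVEL.splitlines()):
    for x, ch in enumerate(row):
        world[(x, y)] = TILES[ch]

print(world[(1, 1)])  # player_start
```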

    My roguelike can also display animated things. I made my water look all fancy and animated, a lot like in Brogue, but I soon realized this is probably going to be a problem if I use water in puzzles, where it has to stand out well from its surroundings and look consistent. Sometimes boring-looking things are better. Overall my game has lots of flat colors.

    At this point my concerns are about designing the game mechanics themselves (as opposed to technical challenges).

    Pushing boulders gets boring quickly. I have some unfinished code that would add chain-chomp-like enemies to the game, and the puzzles would be about how to go around them or neutralize their threat. And I have a sketchbook where I threw in a bunch more ideas. My plan is to implement wacky ideas and see what works. I also have a book on game design I'm going through, trying to educate myself on what kind of (puzzle) game would be fun to play.

    I guess this is not really a roguelike. It's a puzzle game and the entire world right now is hand-crafted. There are no random elements to this game whatsoever. But I think that's fine.
u/CSMastermind · 2 pointsr/AskComputerScience

Senior Level Software Engineer Reading List


Read This First


  1. Mastery: The Keys to Success and Long-Term Fulfillment

    Fundamentals


  2. Patterns of Enterprise Application Architecture
  3. Enterprise Integration Patterns: Designing, Building, and Deploying Messaging Solutions
  4. Enterprise Patterns and MDA: Building Better Software with Archetype Patterns and UML
  5. Systemantics: How Systems Work and Especially How They Fail
  6. Rework
  7. Writing Secure Code
  8. Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries

    Development Theory


  9. Growing Object-Oriented Software, Guided by Tests
  10. Object-Oriented Analysis and Design with Applications
  11. Introduction to Functional Programming
  12. Design Concepts in Programming Languages
  13. Code Reading: The Open Source Perspective
  14. Modern Operating Systems
  15. Extreme Programming Explained: Embrace Change
  16. The Elements of Computing Systems: Building a Modern Computer from First Principles
  17. Code: The Hidden Language of Computer Hardware and Software

    Philosophy of Programming


  18. Making Software: What Really Works, and Why We Believe It
  19. Beautiful Code: Leading Programmers Explain How They Think
  20. The Elements of Programming Style
  21. A Discipline of Programming
  22. The Practice of Programming
  23. Computer Systems: A Programmer's Perspective
  24. Object Thinking
  25. How to Solve It by Computer
  26. 97 Things Every Programmer Should Know: Collective Wisdom from the Experts

    Mentality


  27. Hackers and Painters: Big Ideas from the Computer Age
  28. The Intentional Stance
  29. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine
  30. The Back of the Napkin: Solving Problems and Selling Ideas with Pictures
  31. The Timeless Way of Building
  32. The Soul Of A New Machine
  33. WIZARDRY COMPILED
  34. YOUTH
  35. Understanding Comics: The Invisible Art

    Software Engineering Skill Sets


  36. Software Tools
  37. UML Distilled: A Brief Guide to the Standard Object Modeling Language
  38. Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and Iterative Development
  39. Practical Parallel Programming
  40. Past, Present, Parallel: A Survey of Available Parallel Computer Systems
  41. Mastering Regular Expressions
  42. Compilers: Principles, Techniques, and Tools
  43. Computer Graphics: Principles and Practice in C
  44. Michael Abrash's Graphics Programming Black Book
  45. The Art of Deception: Controlling the Human Element of Security
  46. SOA in Practice: The Art of Distributed System Design
  47. Data Mining: Practical Machine Learning Tools and Techniques
  48. Data Crunching: Solve Everyday Problems Using Java, Python, and more.

    Design


  49. The Psychology Of Everyday Things
  50. About Face 3: The Essentials of Interaction Design
  51. Design for Hackers: Reverse Engineering Beauty
  52. The Non-Designer's Design Book

    History


  53. Micro-ISV: From Vision to Reality
  54. Death March
  55. Showstopper! the Breakneck Race to Create Windows NT and the Next Generation at Microsoft
  56. The PayPal Wars: Battles with eBay, the Media, the Mafia, and the Rest of Planet Earth
  57. The Business of Software: What Every Manager, Programmer, and Entrepreneur Must Know to Thrive and Survive in Good Times and Bad
  58. In the Beginning...was the Command Line

    Specialist Skills


  59. The Art of UNIX Programming
  60. Advanced Programming in the UNIX Environment
  61. Programming Windows
  62. Cocoa Programming for Mac OS X
  63. Starting Forth: An Introduction to the Forth Language and Operating System for Beginners and Professionals
  64. lex &amp; yacc
  65. The TCP/IP Guide: A Comprehensive, Illustrated Internet Protocols Reference
  66. C Programming Language
  67. No Bugs!: Delivering Error Free Code in C and C++
  68. Modern C++ Design: Generic Programming and Design Patterns Applied
  69. Agile Principles, Patterns, and Practices in C#
  70. Pragmatic Unit Testing in C# with NUnit

    DevOps Reading List


  71. Time Management for System Administrators: Stop Working Late and Start Working Smart
  72. The Practice of Cloud System Administration: DevOps and SRE Practices for Web Services
  73. The Practice of System and Network Administration: DevOps and other Best Practices for Enterprise IT
  74. Effective DevOps: Building a Culture of Collaboration, Affinity, and Tooling at Scale
  75. DevOps: A Software Architect's Perspective
  76. The DevOps Handbook: How to Create World-Class Agility, Reliability, and Security in Technology Organizations
  77. Site Reliability Engineering: How Google Runs Production Systems
  78. Cloud Native Java: Designing Resilient Systems with Spring Boot, Spring Cloud, and Cloud Foundry
  79. Continuous Delivery: Reliable Software Releases through Build, Test, and Deployment Automation
  80. Migrating Large-Scale Services to the Cloud
u/ghostmrchicken · 3 pointsr/HaltAndCatchFire

You may like Where Wizards Stay Up Late as well:

https://www.amazon.com/Where-Wizards-Stay-Up-Late/dp/0684832674

Description from Amazon:

Twenty five years ago, it didn't exist. Today, twenty million people worldwide are surfing the Net. Where Wizards Stay Up Late is the exciting story of the pioneers responsible for creating the most talked about, most influential, and most far-reaching communications breakthrough since the invention of the telephone.

In the 1960's, when computers were regarded as mere giant calculators, J.C.R. Licklider at MIT saw them as the ultimate communications devices. With Defense Department funds, he and a band of visionary computer whizzes began work on a nationwide, interlocking network of computers. Taking readers behind the scenes, Where Wizards Stay Up Late captures the hard work, genius, and happy accidents of their daring, stunningly successful venture.

u/chakke_ooch · 4 pointsr/mbti

&gt; Would you say there's more opportunity working exclusively front end and design to exercise nfp creativity or novelty?

NFP creativity and novelty in the sense that Ne has free rein, period? Sure, you get more of that in web design, and even more as you step further and further away from the sciences. There is tons of creativity in real software engineering, where you can be creative solving actually challenging problems, not figuring out what color you'd like a button to be. To me, that's not creativity, or it's a lesser version. Creativity in problem solving is much more interesting. The way I see it is like when I was in music school and all the SFs were bitching about music theory and how they thought it limited their ability to "be creative". Such bullshit. It only exposes their lack of creativity. So you're saying that someone like Chopin, who wrote amazing pieces and abided by the rules of music theory, wasn't being creative? Hardly.

&gt; Are you a web dev?

No, I'm a software engineer at an astrodynamics company; I do a lot of orbital mechanics, back-end work with web services, high performance computing, etc.

&gt; By hardcore I meant requiring being meticulous, detail oriented.

I think that the lack of attention to detail is never permissible in either back-end software engineering or front-end web development, honestly.

&gt; One thing I've realized is how shit my high school was at explaining math conceptually, which I think led to misconceptions about its use in programming

Well, then read some books on computer science and/or mathematics like this.

u/tblaich · 3 pointsr/truegaming

Finally home and having a chance to reply. I pulled five books off of my shelf that I would recommend, but there are doubtless more that you should read.

Raph Koster's Theory of Fun for Game Design

Janet H. Murray's Hamlet on the Holodeck: The Future of Narrative in Cyberspace

Noah Wardrip-Fruin and Pat Harrigan's First Person: New Media as Story, Performance, and Game

Noah Wardrip-Fruin and Pat Harrigan's Second Person: Role-Playing and Story in Games and Playable Media

They wrote a Third Person as well, I just haven't gotten the chance to read it yet. You might be able to find PDF copies online somewhere, but if you have the money, you should try to support the writers by buying. Show them that people are interested in critical discourse about games.

Next week I think I'm going to order a few new texts (after payday), and I'd be happy to let you know what I think once I have them in hand.

u/KennedyRichard · 1 pointr/learnpython

Python Cookbook, 3rd ed., by David Beazley and Brian K. Jones, has a dedicated chapter on metaprogramming. The book is so good that the other chapters may also give you insight into metaprogramming or alternatives to it. I already read it, and it gave me insight into my own metaprogramming code and other topics, so it is pretty useful. You can also find a lecture by Beazley on the topic with a quick Google search for his name and the word "metaprogramming".

There's also Fluent Python by Luciano Ramalho, which has three dedicated chapters about metaprogramming. Didn't read the chapters myself but I'm half way into the book and it is awesome, so I'm having great expectations about those chapters.

Don't mind the metaprogramming "chapter count", it is just a piece of information. Quality is what you should be looking for. And I believe those books have it. Even though I admit an entire book about metaprogramming would be neat.
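
For a taste of what those chapters cover, here's a small metaclass example (my own illustration, not taken from either book): a base class whose metaclass auto-registers every subclass, a common plugin pattern.

```python
class PluginMeta(type):
    """Metaclass that records every concrete subclass in a registry."""
    registry = {}

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if bases:  # skip the abstract base class itself
            PluginMeta.registry[name] = cls
        return cls

class Plugin(metaclass=PluginMeta):
    pass

class JsonExporter(Plugin):
    pass

print(PluginMeta.registry)  # registry now maps 'JsonExporter' to the class
```

The payoff is that adding a new plugin is just defining a subclass; no manual registration call needed.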

u/psiph · 3 pointsr/learnjavascript

Hello Teyk5,

I'm the author of the post you referenced. I'm looking for suggestions on my next post. I'm curious if you'd want to see another start to finish tutorial or if you'd rather see a sequel to the original post, where I explain how to add on to and refactor an application. Also, are there any particular technologies you'd want to explore? Knockout.js, Angular.js, or Ember.js for example -- or do you just want to stick with the basics for now?

I know it might be weird to recommend a Python book here, but I found that I learned a lot from just the first few chapters of this book: http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/ I learned about how to organize my code and how to think through from the beginning to the end when creating a new program. I would highly recommend that book if you're at all interested in Python. All the knowledge you learn there you'll be able to transfer over to Javascript (besides some of the syntax of course).

For pure Javascript stuff, I'd strongly recommend just starting your own project. It can be incredibly difficult at first, but it's by far the best and fastest way to learn if you can force yourself to stick with it. I'd also recommend checking out these two tutorials:

u/slowfly1st · 1 pointr/learnprogramming

Here's a game developer road map:

https://github.com/utilForever/game-developer-roadmap


If you want to learn about game design, I recommend this book: Designing Games: A Guide to Engineering Experiences (it's by Tynan Sylvester, creator of RimWorld). I'm neither a game developer nor a game designer, but I really enjoyed the book, because it is somewhat the 'science' of game design: it's about mechanics, about emotion, about decisions and so on. These are things I knew existed, but I didn't really understand the impact they have on a game, or how they make a game "good" or "bad".


What you can do now:

  • Try to ship a game for android. All the tools are free, I think publishing in the play store is a one time payment of a few dollars. You will learn a lot of programming skills, but also you will understand what it means, to ship something. The awesome thing about the play store is: A lot of potential users.
  • Contribute to open source games ( https://github.com/leereilly/games).
u/rjt_gakusei · 2 pointsr/programming

This book has a pretty strong breakdown of how computers and processors work, and goes into more advanced things that modern day hacks are based off of, like address translation and virtualization with the recent Intel bugs:
https://www.amazon.com/Computer-Systems-Programmers-Perspective-2nd/dp/0136108040
The book can be found online for free. The author's website has practice challenges that you can download, one of them being a "binary bomb" you have to reverse engineer. I did a similar challenge, and it felt pretty awesome when I was able to get around safeguards by working with the binaries and causing buffer overflows.

u/z4srh · 1 pointr/gamedev

You know, a fantastic book is Programming Game AI By Example. It's fantastic for learning about AI, but the author also put a lot of effort into the code, so you can learn a lot about general game design from it as well. Well worth the price. http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782 . You can download the code samples from the author's website to see what I mean. It is only 2D, so it won't help you with collision detection and some of the more 3D specific topics, but the core of the engine can be applied to anything.
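
The book's signature topic is steering behaviors, and the simplest one, "seek", fits in a few lines. Here's a rough Python translation of the idea (the book's own code is C++):

```python
import math

def seek(pos, target, max_speed):
    """Steering 'seek': return a velocity pointing at the target at max_speed (2D)."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)          # already there, don't divide by zero
    # normalize the direction, then scale it to top speed
    return (dx * max_speed / dist, dy * max_speed / dist)

vel = seek((0, 0), (3, 4), max_speed=10)
print(vel)  # (6.0, 8.0) -- the unit vector (0.6, 0.8) scaled by 10
```

The full suite in the book (flee, arrive, wander, flocking) is mostly variations on this one vector recipe.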

One thing that is really important is to realize that there's no silver bullet. Every design decision has its benefits and its trade offs. It's easy to fall into the trap of overthinking your design, especially for personal projects - it is more beneficial for you to try to write it and have to rewrite it because of a bad design decision than to spend months trying to come up with the perfect architecture. That's not to say that you should ignore design, but rather that once you think you have a good idea, try it out, experiment, see what works and what doesn't. If you focus on having a modular design to your software, you'll find that rewrites get easier and easier.

u/lemontheme · 3 pointsr/datascience

Fellow NLP'er here! Some of my favorites so far:

u/timostrating · 3 pointsr/Unity3D

TL;DR

Take a look at spaced repetition. Study without any music and use the absence of music as a check to see if you are still motivated to do your studying.


I fucked up the first part of my education too. Luckily I realized that and got motivated again before I finished school.

I am currently 19 years old and I also always loved math (and some physics). I am from the Netherlands, so our education system does not really translate well to English, but I was basically in high school when I only did things that interested me. I got low grades on everything else.

One moment in high school really stayed with me, and I have only now realized what was happening. In high school I had a course on the German language. I already had a low grade for that class, so I said to myself I would study extremely hard for the next small exam. The exam was pretty simple: the task was to learn 200 to 250 German words. So I took a piece of paper and wrote down all 250 words 21 times. 1 or 2 days later I had the exam. But when I got my grade back, it said that I scored an F (3/10). I was totally confused, and it only destroyed my motivation more and more.
What I have now come to realize is that learning something is not just about smashing a book into your head as fast as possible.


So these are some tips I wished I could have give myself in the first year of highschool:

Go and sit in a quiet room or in the library. This room should be totally silent. Now start your studying. As soon as you feel the urge to put on some music, stop, reflect, and take a little break.

The default in nature is chaos. Learn to use this to your advantage. I sit in a bus for 2+ hours a day: 1 hour to school and 1 hour back. Nearly every student does nothing in this time. So I made a rule for myself to do something productive in that time, like reading a book. Normally I am just at my desk at home and before I know it, it is already midnight. So this is, for me at least, a really good way to force myself to start reading a book in those otherwise wasted 2 hours.

Get to know your body and brain. I personally made a bucket list of 100 items that includes 10 items about doing something for a week, like running at 6am for a week or being vegan for a week. Fasting is also really great. Just do it for 1 day: only drink water for one day and see how you feel. And try the same with coffee, sex, fapping and alcohol. Quit for 1 day and look at how you feel. Then aim to quit each one for a whole week straight.

Watch this video to get a new view of the difference between low and high energy. I never understood this, but I think everybody should know about the difference: https://youtu.be/G_Fy6ZJMsXs &lt;-- sorry, it is 1 hour long but you really should watch it.

Learn about how your brain stores information and how you can improve upon this. Spaced repetition is one of those things that really changed the way I now look at learning something. https://www.youtube.com/watch?v=cVf38y07cfk
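
The core idea of spaced repetition (each successful review stretches the next interval, a lapse resets it) can be sketched in a few lines. This is a toy version, not the real SM-2 algorithm:

```python
def next_interval(days, quality, ease=2.5):
    """Toy spaced-repetition scheduler: a pass stretches the gap, a lapse resets it."""
    if quality < 3:               # failed recall -> see the card again tomorrow
        return 1
    return round(days * ease)     # passed -> wait roughly ease-times longer

gaps, gap = [], 1
for quality in [5, 5, 5, 1, 5]:   # three passes, one lapse, one pass
    gap = next_interval(gap, quality)
    gaps.append(gap)

print(gaps)  # [2, 5, 12, 1, 2] -- intervals grow until the lapse resets them
```

Flashcard apps like Anki are essentially this loop plus per-card bookkeeping.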


I am currently doing my high school math again for fun. After I am done with that, I hope to start on these 3 books.

u/Rexutu · 27 pointsr/gamedesign

I recommend Tynan Sylvester's book "Designing Games" (you can get a 7-day free trial of the e-book on Amazon). A lot of people will recommend A Book of Lenses by Jesse Schell but I personally felt it lacked substance. For the more philosophical aspects of the craft, here are some talks that I think are valuable 1 2 3
4 5 (hopefully ordered in a somewhat logical progression).

Another thing -- find out what kind of games you want to make, find out who makes that kind of game (a few examples: Jonathan Blow for puzzle games, Raph Koster and Project Horseshoe for MMOs, Tom Francis for whatever the fuck he makes, etc. -- and "kind" does not necessarily mean genres), and study what those people have to say, figuring out what you agree with and disagree with. Standing on others' shoulders is the easiest way to get good and the best path toward making games of true quality.

u/Waitwhatwtf · 2 pointsr/learnprogramming

It's going to sound really far outside of the topic, but it really helped with my logical mathematical reasoning: Mathematics for 3d Game Programming and Computer Graphics.

I'll also preface this by saying you're probably going to need a primer to get into this book if you're not sure how to reason about a greatest common factor. But being able to tackle that book is a great goal, and it can help you step through mathematical logic.
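
If "reasoning about a greatest common factor" sounds intimidating, Euclid's algorithm, the classic warm-up, is only a couple of lines:

```python
def gcd(a, b):
    # Euclid's algorithm: gcd(a, b) == gcd(b, a mod b), until the remainder is 0
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 36))  # 12
```

Tracing why each loop iteration preserves the answer is exactly the kind of stepwise reasoning the book demands.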

Also, graphics is awesome.

u/Rhemm · 1 pointr/learnpython

Start with idiomatic Python. That's a very good, short book on important Pythonic features. Also make sure to learn the standard library. You can refer to the docs for exact arguments, but you should know what functions exist and what they do. After that you can read Fluent Python. It will show you how Python works, the Python data model, and I think overall it's the best Python book. Then practice, and read other people's code. When you use some library, don't be afraid to dive into its source code. Good luck and happy programming!
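
For a flavor of what "idiomatic" means in practice, here are a few of the constructs such books drill in (my illustrative picks, not any book's table of contents):

```python
squares = [n * n for n in range(5)]        # comprehension, not a for/append loop
first, *rest = squares                     # star-unpacking instead of slicing by hand
lookup = dict(zip("abc", (1, 2, 3)))       # build a dict from parallel sequences

for i, sq in enumerate(squares, start=1):  # enumerate, not a manual index counter
    print(i, sq)
```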

u/EricHerboso · 23 pointsr/westworld

Asimov's books went even farther than that. Don't read if you don't want to be spoiled on his most famous scifi series.

[Spoiler](#s "Because Law 1 had the robots take care of humans, the first AIs decided to go out and commit genocide on every alien species in the universe, just so they couldn't compete with humans in the far future.")

AI safety is hard. Thankfully, if you care about actually doing good in real life, there are organizations out there working on this kind of thing. The Machine Intelligence Research Institute does research on friendly AI problems; the Center for Applied Rationality works on raising the sanity waterline in order to increase awareness of the unfriendly AI problem; the Future of Humanity Institute works on several existential risks, including AI safety.

If you want to learn more about this topic in real life, not just in fiction, then I highly recommend Nick Bostrom's Superintelligence, a book that goes into detail on these issues while still remaining readable by laymen.

u/Ponzel · 3 pointsr/gamedev

Since you mentioned Rimworld: Tynan, the creator of Rimworld has a gamasutra post and a book about how he designs games. (Spoiler: It's all about the story experienced by the player).

I can tell you about the thought process for my colony simulator:

  1. I want to have a prototype as fast as possible, so the system should be as simple as possible.
  2. The focus of the game is the colonists: their personalities and their emotions when something good or bad happens.

    Therefore I only have a couple (~10) resources that are not even items on the map, but are simply counted in the UI, like in a strategy game. If you're looking for inspiration, you can download it for free on the website.
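
Counting resources purely in the UI, rather than tracking them as items on the map, keeps the code tiny too. A minimal sketch of that design (my own guess at the idea, not Ponzel's actual code):

```python
from collections import Counter

# The whole stockpile is just counts, no item entities on the map
resources = Counter(wood=50, food=20, stone=10)

def consume(kind, amount):
    """Spend a resource, refusing if the stockpile can't cover it."""
    if resources[kind] < amount:
        raise ValueError(f"not enough {kind}")
    resources[kind] -= amount

consume("food", 5)
print(resources["food"])  # 15
```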

    For your game, I think you could first think about what the focus is in your game. Do you want the player to spend more time managing resources, handling colonists, building stuff, or defending the colony? Then plan around your focus. Hope this helps you :)
u/raz_c · 30 pointsr/programming

First of all you don't need to write a billion lines of code to create an OS. The first version of Minix was 6000 lines long. The creator of Minix, Andrew Tanenbaum, has a great book called Modern Operating Systems, in which he explains the inner workings of several famous operating systems.

Considering the emphasis on "from scratch", you should also have a very good understanding of the underlying hardware. A pretty good starter book for that is Computer Organization and Design by David A. Patterson and John L. Hennessy. I suggest reading this one first.


I hope this can get you started.

u/caindela · 1 pointr/learnprogramming

I don't know much about Codecademy, but honestly I would start with something more general.

If you want to get a degree in CS, start with the popular books that they assign for first year CS students and go into it in as much detail as you can (and do all the problems). Python Programming: An Introduction to Computer Science is a very well-received book in a language that I would recommend.

It takes a lot of discipline for a high schooler to actually sit down and work through a book like this, but I think it's the way to go. You truly can't cut any corners if you really want to learn anything.

u/joshi18 · 3 pointsr/computerscience

You are welcome :).
This is one of the best books for learning programming: http://www.amazon.com/Structure-Interpretation-Computer-Programs-Engineering/dp/0262510871. It's freely available, and the MIT class that uses it is here: http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-001-structure-and-interpretation-of-computer-programs-spring-2005/
Peter Norvig's advice http://norvig.com/21-days.html
Programming Pearls : http://www.cs.bell-labs.com/cm/cs/pearls/
At https://www.codeeval.com you can solve questions and get interview calls.
You may also want to brush up your design skills, as some interviewers might ask those kinds of questions. http://www.amazon.com/Head-First-Design-Patterns-Freeman/dp/0596007124 might be a good place to start.
I think http://www.geeksforgeeks.org is a good place to look for nicely explained solutions and you can find almost all the questions ever asked in a software engineering interview at careercup.com

u/mcur · 14 pointsr/linux

You might have some better luck if you go top down. Start out with an abstracted view of reality as provided by the computer, and then peel off the layers of complexity like an onion.

I would recommend a "bare metal" approach to programming to start, so C is a logical choice. I would recommend Zed Shaw's intro to C: http://c.learncodethehardway.org/book/

I would proceed to learning about programming languages, to see how a compiler transforms code to machine instructions. For that, the classical text is the dragon book: http://www.amazon.com/Compilers-Principles-Techniques-Tools-Edition/dp/0321486811

After that, you can proceed to operating systems, to see how many programs and pieces of hardware are managed on a single computer. For that, the classical text is the dinosaur book: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/1118063333 Alternatively, Tanenbaum has a good one as well, which uses its own operating system (Minix) as a learning tool: http://www.amazon.com/Modern-Operating-Systems-Andrew-Tanenbaum/dp/0136006639

Beyond this, you get to go straight to the implementation details of architecture. Hennessy has one of the best books in this area: http://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X

Edit: Got the wrong Hennessy/Patterson book...

u/adrixshadow · 2 pointsr/gamedesign

You basically have no game design education and have absolutely no idea on anything.

Luckily for you, I have collected some resources that distill the essence of game design and can take a complete noob to pro game designer.

Read/watch this:
http://www.sirlin.net/article-archive/
http://www.gamasutra.com/view/news/231225/Video_Practical_Creativity__A_way_to_invent_new_kinds_of_video_games.php
http://www.gamasutra.com/view/news/233326/Video_Designing_your_game_to_offer_meaningful_choices.php
https://www.youtube.com/watch?v=AJdEqssNZ-U
http://www.shutupandsitdown.com/videos/
http://www.erasmatazz.com/library/game-design/paradigm-shift.html

Bookmark this: http://129.16.157.67:1337/mediawiki-1.22.0/index.php/Main_Page

Subscribe to this: https://www.youtube.com/channel/UCaTznQhurW5AaiYPbhEA-KA

and this:

https://www.youtube.com/channel/UCWqr2tH3dPshNhPjV5h1xRw
https://www.youtube.com/channel/UCI3GAJaOTL1BoipG41OmfyA
https://www.youtube.com/channel/UCm4JnxTxtvItQecKUc4zRhQ

Trawl through and read all the articles from this: http://www.gamasutra.com/category/design/

Good books on game design I recommend:
http://www.amazon.com/Game-Mechanics-Advanced-Design-Voices/dp/0321820274
http://www.amazon.com/The-Art-Game-Design-lenses/dp/0123694965
http://www.amazon.com/gp/product/1449363210/ref=as_li_ss_tl?ie=UTF8&amp;amp;camp=1789&amp;amp;creative=390957&amp;amp;creativeASIN=1449363210&amp;amp;linkCode=as2&amp;amp;tag=atheoroffunfo-20

Now my personal advice as a indie game designer.

Specialize!


You have to specialize in one domain and devour any information available.
Here are some domains I know about that you can take on as an indie on a budget.

CRPGs: Info on: http://www.rpgcodex.net/

JRPG: Also look at Japanese hentai RPGs, because there are some game design jewels there; RPG Maker community

Action Games and platformers : Understand game feel and juice/impact, somewhat saturated indie scene

Puzzles and board games: Do not buy into f2p as it kills your creativity

Roguelikes and survival: Saturated indie scene

Narrative: Visual Novel style games and adventure games, great if you have a good writer and artist, some overlap with JRPGs,
VN community: http://lemmasoft.renai.us/forums/index.php?sid=59e38b38f69f9e5b5c271de0843d2569 ,
http://www.renpy.org/

Server Multiplayer: Games like space station 13

A forum where you can find information on games of the above types: http://forums.tigsource.com/

Do not do 3D; without an astronomical budget you won't get far.

Congratulations! Now you are a better game designer than 90% of indie developers!



u/mflux · 8 pointsr/gamedesign

The citybound guy has been putting out daily blog posts of his city sim game programming. Wildly ambitious: http://blog.cityboundsim.com/

Not directly city game design, but I highly recommend the Rimworld creator's book Designing Games: A Guide to Engineering Experiences for game design. I've emailed him a few times and he's very responsive and forthcoming with his wisdom.

I'm designing a city game myself right now. My theory on these games is that while they are experience engines in the sense that, for example, Sim City triggers your emotions with poverty, wealth, crime, health -- SC tends to be more like gardening: you plant seeds, water them, and see what comes out and much of the enjoyment of playing the game comes from that.

As far as programming goes, I went with a custom entity component system and am using an off the shelf engine (Unreal) to avoid the hard work of optimizing drawing tons of stuff (and lights) on screen.
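To show what I mean by an entity component system, here's a bare-bones sketch in Python (illustrative only; my actual implementation lives in Unreal/C++ and looks nothing like this):

```python
# Toy entity component system: entities are just ids, components live in
# per-type tables, and systems iterate over entities that have all the
# components they care about.
from itertools import count

class World:
    def __init__(self):
        self._ids = count()
        self.components = {}  # component name -> {entity id: data}

    def spawn(self, **comps):
        eid = next(self._ids)
        for name, data in comps.items():
            self.components.setdefault(name, {})[eid] = data
        return eid

    def query(self, *names):
        tables = [self.components.get(n, {}) for n in names]
        shared = set(tables[0]).intersection(*tables[1:])
        for eid in sorted(shared):
            yield (eid,) + tuple(t[eid] for t in tables)

def movement_system(world, dt):
    # Only entities with BOTH a position and a velocity get moved.
    for eid, pos, vel in world.query("pos", "vel"):
        world.components["pos"][eid] = (pos[0] + vel[0]*dt, pos[1] + vel[1]*dt)

world = World()
world.spawn(pos=(0.0, 0.0), vel=(1.0, 2.0))
world.spawn(pos=(5.0, 5.0))  # static prop: no velocity component
movement_system(world, dt=1.0)
print(world.components["pos"])  # {0: (1.0, 2.0), 1: (5.0, 5.0)}
```

The win is that adding a new behavior is just a new component table plus a new system; no inheritance hierarchy to fight.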

u/EtherDynamics · 2 pointsr/skyrimmods

Thanks for the heads up -- I'll definitely look more into Hassabis, sounds like an incredibly interesting guy with his plunge into neuroscience.

Thx, I went to a few universities and picked up several graduate coursebooks on AI, and also went through some online and conventional book sources. On Intelligence really opened my eyes to the power of hierarchical learning, and the mechanics of cortical hierarchies. Absolutely fascinating stuff.

Hahaha and yeah, I agree that the point of games is not to just kill the player. Despite the "adversarial" nature of most AI enemies, they're actually teachers, gently guiding the player towards more nuanced strategies and better reactions.

u/yanalex981 · 4 pointsr/computerscience

I taught myself bits in high school with "C++ for Everyone". Despite its rating, I thought it was good 'cause it has exercises, and I did a lot of them. Works really well for laying foundations. I didn't go through the whole book though, and knowing the language is only part of the battle. You need to know about algorithms and data structures as well. For graphics, trees seem really useful (Binary space partitioning, quadtrees, octrees etc).
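To make the tree idea concrete, here's a minimal point quadtree sketch in Python (my own illustrative code, not from any of the books mentioned):

```python
class Quadtree:
    """Minimal point quadtree: a node stores points until it fills up,
    then subdivides its square region into four child quadrants."""
    CAPACITY = 4

    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size  # top-left corner, side length
        self.points = []
        self.children = None  # four sub-quadrants once subdivided

    def insert(self, px, py):
        if not (self.x <= px < self.x + self.size and
                self.y <= py < self.y + self.size):
            return False  # point lies outside this node's region
        if self.children is None:
            if len(self.points) < self.CAPACITY:
                self.points.append((px, py))
                return True
            self._subdivide()
        return any(child.insert(px, py) for child in self.children)

    def _subdivide(self):
        half = self.size / 2
        self.children = [Quadtree(self.x, self.y, half),
                         Quadtree(self.x + half, self.y, half),
                         Quadtree(self.x, self.y + half, half),
                         Quadtree(self.x + half, self.y + half, half)]
        for (px, py) in self.points:  # push existing points down
            any(child.insert(px, py) for child in self.children)
        self.points = []

tree = Quadtree(0, 0, 100)
for p in [(10, 10), (20, 20), (80, 80), (30, 70), (55, 55)]:
    tree.insert(*p)
print(tree.children is not None)  # True: the fifth insert forced a subdivision
```

The same idea (only search the quadrant a point falls in) is what makes spatial queries like collision checks cheap in 2D; octrees are the 3D version.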

After university started, I read parts of "C++ Primer", which was when the language really started making sense to me. You'll get more than enough time to learn the required amount of C++ by next fall, but CG is heavy in math and algorithms. If your CS minor didn't go over them (much), my old algorithms prof wrote a free book specifically for that course.

For using OpenGL, I skimmed the first parts of "OpenGL SuperBible". For general graphics, I've heard good things about "Mathematics for 3D Game Programming and Computer Graphics", and "Real-Time Rendering".

Careful with C++. It may deceptively look like Java, but honestly, trying to write good idiomatic C++ after years of Java took a major paradigm shift.

u/ActuarialAnalyst · 2 pointsr/actuary

Yeah. If you want to be good at like programming-programming I would read this book and do all of the projects: https://runestone.academy/runestone/books/published/fopp/index.html If you take an algorithms class you will probably get to use Python.

If you want to be good at data analytics I would read "R for Data Science" if you want to use R. If you learn Python, people like this book for data science learning: https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291/ref=pd_sbs_14_2/145-5658251-1609721?_encoding=UTF8&amp;pd_rd_i=1491962291&amp;pd_rd_r=4e33435c-cc98-4256-9c50-6050e79b7803&amp;pd_rd_w=ejSx8&amp;pd_rd_wg=Ter1m&amp;pf_rd_p=d66372fe-68a6-48a3-90ec-41d7f64212be&amp;pf_rd_r=3X23DYAJ2ZMCKP9AA1Z4&amp;psc=1&amp;refRID=3X23DYAJ2ZMCKP9AA1Z4 .

These books are kind of different, though. The Python book is much more focused on theory and will help you less in the workplace if you aren't actually building predictive models (at least I think, based on the table of contents).

u/Ignate · 2 pointsr/Futurology

Superintelligence

Good book.

I think of the human mind as a very specific intelligence designed to meet the demands of a natural life. A tailor made intelligence that is ultra specific seems like an incredibly difficult thing to recreate. I wouldn't be surprised if after AGI was created, it proved that our brains are both works of art, and only useful in specific areas.

They say a philosopher is comparable to a dog standing on its hind legs and trying to walk. Our brains are not set up to think about big problems and big solutions. Our brains are very specific. So, certainly, we shouldn't be using them as a model to build AGI.

As far as self awareness, I don't think we understand what that is. I think the seed AI's we have are already self-aware. They just have a very basic drive which is entirely reactionary. We input, it outputs.

It's not that if we connect enough dots it'll suddenly come alive like Pinocchio. Rather, it will gradually wake up as the overall program becomes more complex.

u/mysticreddit · 5 pointsr/gamedev

Every game programmer should have at least one of these books:

  • Mathematics for 3D Game Programming & Computer Graphics by Eric Lengyel
  • Game Physics by David Eberly
  • Real Time Collision Detection by Christer Ericson

    I own all 3. What I love about them is that they are some of the best ones around: written by programmers, for programmers, to explain the math in a clear and concise fashion; they are not written by some mathematician who loves theory and likes to hand-wave away the worries about "implementation details."

    To help provide direction, I would recommend these exercises to start; work on (re)deriving the formulas (from easiest to hardest):

  • Work out how to reflect a vector
  • Derive the formula for how to calculate a 2D perpendicular vector
  • Work out the formula for how to project a vector A onto B.
  • Study how the dot product is used in lighting.
  • Derive the translation, scaling, and rotation 3x3 and 4x4 matrices.
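A quick sketch of the first three exercises in plain Python (my own illustrative code, not from any of the books above), useful for checking your derivations against:

```python
def reflect(v, n):
    """Reflect vector v about unit normal n: r = v - 2*(v . n)*n."""
    d = v[0]*n[0] + v[1]*n[1]
    return (v[0] - 2*d*n[0], v[1] - 2*d*n[1])

def perp(v):
    """2D perpendicular: rotate v by 90 degrees counter-clockwise."""
    return (-v[1], v[0])

def project(a, b):
    """Project vector a onto b: ((a . b) / (b . b)) * b."""
    scale = (a[0]*b[0] + a[1]*b[1]) / (b[0]*b[0] + b[1]*b[1])
    return (scale*b[0], scale*b[1])

print(reflect((1, -1), (0, 1)))  # bounce off a floor: (1, 1)
print(perp((1, 0)))              # (0, 1)
print(project((2, 3), (1, 0)))   # (2.0, 0.0)
```

Once these feel obvious, the dot-product-in-lighting exercise is the same projection idea: N . L tells you how directly a surface faces the light.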
u/Kenark · 3 pointsr/gamedesign

I highly recommend Designing Games: A Guide to Engineering Experiences.

It's a hard book to describe, but it's worth a read. For one, he defines a video game as a set of mechanics that interact with one another to create an experience, something unique to our medium: storytelling through mechanics interacting with one another and creating a fiction within your own head.

The game he's creating right now, Rimworld, applies that concept and simulates a living, breathing colony with pawns that have likes and dislikes, strengths and weaknesses. They have jobs they want to do, jobs they'll do if they have to, and certain jobs they won't do at all. You set a list of priorities for your colony and let things play out with no (practical) way of controlling individual pawns directly.

They also simulate relationships within the game and the pawns will remember interactions with one another. They will dislike one another if they're insulted and they'll break if a loved one dies. They'll visit the graves of people who died years/seasons ago.

All these mechanics interact with each other to create a story in your head that's different with every colony you start. That kind of storytelling is unique to our medium, he says. So that's how I can best describe the first half of the book.

The second half of the book is more about the iterative process of creating the game itself: creating iterative loops where you add features, polish, and then loop again until release. That half is harder to describe briefly, but it's just as important as the design process itself.
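That priority-driven job system is easy to sketch (my own toy Python, not Rimworld's actual code): each pawn has a per-job priority, and a job with priority 0 is one the pawn refuses outright.

```python
def pick_job(pawn, available_jobs):
    """Give the pawn the highest-priority job it is willing to do.
    Lower number = higher priority; 0 = will never do it (Rimworld-style)."""
    candidates = [j for j in available_jobs if pawn["priorities"].get(j, 0) > 0]
    return min(candidates, key=lambda j: pawn["priorities"][j], default=None)

pawn = {"name": "Alex",
        "priorities": {"doctor": 1, "haul": 2, "cook": 3, "art": 0}}

print(pick_job(pawn, ["cook", "haul", "art"]))  # "haul" beats "cook"
print(pick_job(pawn, ["art"]))                  # None: refuses art entirely
```

Run that over every pawn each tick and the colony "plays itself", which is exactly what makes the emergent stories possible.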

u/michael0x2a · 3 pointsr/learnprogramming

I would argue that in the long run, you should probably have a good understanding of all of those languages, and more -- security is the sort of thing where having a comprehensive and deep understanding of the entire system, and how different pieces interact is valuable.

That said, as a beginner, your instinct is right: you should pick just a single language, and stick with it for a while (maybe a year or so?). Since you're going to end up learning all of those languages at one point or another, it doesn't matter so much which particular one you start with, since you'll need to continuously be learning throughout your career. If you decide not to learn something today, you'll probably end up learning it a few months from now.

I would personally recommend Python as a good starting point, mainly because I happen to know one or two security-oriented introductions to Python, and am less familiar with what tutorials are available in other languages. In particular, there's a book named Violent Python which introduces Python from a security context to beginners. It does skim over some of the intro material, so you may want to supplement this with a more in-depth introductory tutorial to Python.

I think C, then C++ would then make decent second and third languages to learn, respectively. You can probably fit in Ruby and Java anywhere in between or after, as necessary -- I suspect they'll be easier to pick up.

u/ramwolf · 1 pointr/learnprogramming

Python is the best place to start I think. Its syntax is super easy and it helps you think systematically and gives you a good introduction to how to code. This is the book that I read and it was fantastic.

http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418

After a little python intro then I'd move on to java

u/Idoiocracy · 2 pointsr/learnprogramming

I also recommend C Programming: A Modern Approach by K.N. King. It's considerably longer than Kernighan & Ritchie's book, but it provides more explanation, which can be helpful for a beginner. A Modern Approach is considered one of the best starting C books among those who have read it. Your local library might have a copy of both if you want to read them cheaply.

If you wish to start with Python instead, a good book is Python Programming: An Introduction to Computer Science by John Zelle.

u/saibog38 · 1 pointr/TrueReddit

Some reading I'd recommend.

Don't be scared off by his masters in theology - theology as an academic subject is a very relevant historical study into the psychology of man (and if it helps legitimize the author at all, the South Park guys are fans). The book is basically about psychoanalysis and the problem of identity. I'm a physics lover, engineer by trade, rationalist to the bone, and it gets my stamp of approval for making logical arguments. I've taken up an interest in neuroscience as well, to which I'd recommend this book. For me, those two books are approaching similar ideas from opposite directions.

Good luck broseph.

u/s-ro_mojosa · 4 pointsr/learnpython

Sure!

  1. Learning Python, 5th Edition. This book is huge and it's a fairly exhaustive look at Python and its standard library. I don't normally recommend people start here, but given your background, go ahead.
  2. Fluent Python: Clear, Concise, and Effective Programming. This is your next step up. This will introduce you to a lot of Python coding idioms and "soft expectations" that other coders will have of you when you contribute to projects with more than one person contributing code. Don't skip it.
  3. Data Structures and Algorithms in Python. I recommend everybody get familiar with common algorithms and their use in Python. If you ever wonder what guys with CS degrees (usually) have that self-taught programmers (often) don't: it's knowledge of algorithms and how to use them. You don't need a CS degree (or, frankly any degree) to prosper in your efforts, but this knowledge will help you be a better programmer.

    Also, two other recommendations: either drill or build pet projects out of personal curiosity. Try to write code as often as you can. I block out time three times a week to write code for small pet projects. Coding drills aren't really my thing, but they help a lot of other people.
u/abstractifier · 22 pointsr/learnprogramming

I'm sort of in the same boat as you, except with an aero and physics background rather than EE. My approach has been pretty similar to yours--I found the textbooks used by my alma mater, compared to texts recommended by MIT OCW and some other universities, looked at a few lists of recommended texts, and looked through similar questions on Reddit. I found most areas have multiple good texts, and also spent some time deciding which ones looked more applicable to me. That said, I'm admittedly someone who rather enjoys and learns well from textbooks compared to lectures, and that's not the case for everyone.

Here's what I gathered. If any more knowledgeable CS guys have suggestions/corrections, please let me know.

u/biochromatic · 1 pointr/CasualConversation

I highly recommend Theory of Fun by Raph Koster. It's worth getting in paperback rather than Kindle because it is full of comics that show his points.

Recently I've been working on a pong game for the Wii U. I've come to understand that I need to make games with a small scope in order to actually finish them. I used to spend hours working on RPG Maker games, only to find that all I had done was copy and paste a bunch of houses in a town and set up the events so that the hero could enter them. I would never finish those games because they were too big for me to finish.

Something small like Pong is easy enough to get a fun prototype up in a weekend, and since then I've spent months fine tuning it the way I want it to play. (Another thing I've learned is that making gameplay and making a game are two different things--making a game is much harder.)

What kinds of prototype games do you make?

u/Orphion · 3 pointsr/quantum

I would recommend The Feynman Lectures on Physics. They're expensive books, but the description of quantum mechanics is particularly good, albeit 50 years old. Moreover, the lectures cover all of the other things you'll need to know in physics as well.

The problem with the Feynman lectures being old is that in the 50 years since they were given, quantum information has emerged as a field entirely separate from quantum mechanics/physics. The Mike and Ike book is the best single introduction to the field, but it, too, is expensive.

Luckily, there is a huge number of articles published on the physics arxiv, some of which are quite approachable. This introduction to quantum information is written by many of the giants in the field.

u/caphector · 3 pointsr/sysadmin

I'm not aware of any books that just like this, but here are some recommendations:

  • The Soul of a New Machine - The company is gone. The machine forgotten. What remains, 30 years later, is the story of building and debugging a 32-bit computer. Spends time on hardware and software development and has some excellent descriptions of how the computer works.
  • Where the Wizards Stay Up Late - This is about the people who put the Internet together. Goes into the work that was needed to build the initial networks.
  • Hackers: Heroes of the Computer Revolution - A lovely history of hackers, in the original sense of the term. People who were enthralled by computers and wanted to do interesting things with them. Starts off with the MIT Tech Model Railroad Club and moves forward from there.
u/madwilliamflint · 1 pointr/Random_Acts_Of_Amazon

Good lord. That's me 30 years ago.

He doesn't go to the gym. He put that on there so he didn't seem like such an ubergamer. Same thing with skydiving and Hawaii. They're "supposed to say" answers.

Ask him what kind of programming he's done (if any) and does he play WoW.

If he's a warcraft player then you want this: http://www.amazon.com/Beginning-Lua-World-Warcraft-Add-ons/dp/1430223715/ or something like it. (Programming add-ons for games is a good way to get gamers to cross over into programming.)

If he's done neither, then a book on Python programming might be a good start. (http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/ is supposed to be pretty good.)

u/PostmodernistWoof · 3 pointsr/MachineLearning

+1 for top-down learning approaches. There's so much work going on to democratize the use of ML techniques in general software development that, depending on where you want to go, there's little need to start with the classic theory.

IMHO, the classic ML literature suffers a bit from decades of theorists who never had the computing resources (or the data) to make big practical advances, and it tends to be overly dense and mathematical because that's what they spent their time on.

But really it depends on your goals. Which category do you fall into?

  1. Get a PhD in math, study computer science, get a job as a data scientist at Google (or equivalent) and spend your days reading papers and doing cutting-edge research in the field.

  2. Learn classic and modern ML techniques to apply in your day to day software development work where you have a job title other than "data scientist".

  3. You've heard about Deep Learning and AlphaGo etc. and want to play around with these things and learn more about them without necessarily having a professional goal in mind.

    For #1 the Super Harsh Guide is, well, super harsh, but has good links to the bottom up mathematical approach to the whole thing.

    For #2 you should probably start looking at the classic ML techniques as well as the trendy Deep Learning stuff. You might enjoy:

    https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291

    as a place to start and immediately start playing around with stuff.

    For #3 any of the TensorFlow getting started tutorials are good, along with all of Martin Görner's machine learning/deep learning/TensorFlow "without a PhD" videos on YouTube. Here's one of the more recent ones:

    https://www.youtube.com/watch?v=vaL1I2BD_xY
u/samsmith453 · 1 pointr/computerscience

What interests you about CS? What would you like to build / know / work on?

I would always recommend starting at the very bottom when it comes to learning computer science. Build a knowledge of computing based on what is really happening under the hood, in the hardware. This is the approach I took and it gave me a great foundation, and accelerated my career!

This book is great: https://www.amazon.co.uk/Computer-Organization-Design-Interface-Architecture/dp/0123747503

I have just started a youtube series on understanding hardware aimed at beginners which you might find helpful:

https://www.youtube.com/playlist?list=PLH4a1-PgdkBTKkSSNx63uVkQG1Qs6GmYv

u/lbiewald · 2 pointsr/learnmachinelearning

I agree this is a missing area. I've been working on some materials like recent videos on Transfer Learning https://studio.youtube.com/video/vbhEnEbj3JM/edit and One Shot learning https://www.youtube.com/watch?v=H4MPIWX6ftE which might be interesting to you. I'd be interested in your feedback. I also think books like https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291/ref=pd_lpo_sbs_14_t_1?_encoding=UTF8&amp;psc=1&amp;refRID=3829RHN356ZXBEBP0KF3 do a good job of bridging some of this gap. Reading conference papers is a skill that takes practice and a strong math background.

u/bonesingyre · 3 pointsr/coursera

Just my 2 cents: The Stanford Algorithms class is more about designing algorithms. The Princeton Algorithms class is more about implementation and real world testing.

The FAQ at the bottom:

How does Algorithms: Design and Analysis differ from the Princeton University algorithms course?

The two courses are complementary. That one emphasizes implementation and testing; this one focuses on algorithm design paradigms and relevant mathematical models for analysis. In a typical computer science curriculum, a course like this one is taken by juniors and seniors, and a course like that one is taken by first- and second-year students.


As a computer science student, I would encourage you to pick up a book on Discrete Mathematics, and pick up Robert Sedgewick's Algorithms textbook. Sedgewick's Algorithms book is more about implementing algorithms, compared to CLRS, which is another algorithms textbook written by some very smart guys. CLRS is far more in depth.

I took a Data Structures and Algorithms class recently; we used Sedgewick's textbook. I will be taking another Algorithms & Design class later using CLRS.

Books:
http://www.amazon.com/Discrete-Mathematics-Applications-Susanna-Epp/dp/0495391328/ref=sr_1_1?s=books&amp;amp;ie=UTF8&amp;amp;qid=1372267786&amp;amp;sr=1-1&amp;amp;keywords=discrete+mathematics
http://www.amazon.com/Algorithms-4th-Robert-Sedgewick/dp/032157351X/ref=sr_1_1?s=books&amp;amp;ie=UTF8&amp;amp;qid=1372267775&amp;amp;sr=1-1&amp;amp;keywords=algorithms
http://www.amazon.com/Introduction-Algorithms-Thomas-H-Cormen/dp/0262033844/ref=sr_1_1?ie=UTF8&amp;amp;qid=1372267766&amp;amp;sr=8-1&amp;amp;keywords=clrs
http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X/ref=sr_1_1?s=books&amp;amp;ie=UTF8&amp;amp;qid=1372267798&amp;amp;sr=1-1&amp;amp;keywords=theory+of+computation

The last book is super important for CS students, I would read that front to back as well.

u/nmaxcom · 1 pointr/RimWorld

Yes, plenty of others are charging for DLCs and upgrades (erjemHOI4jrem). Here it's not exactly the same but... more or less (read on).

Still, something doesn't quite fit. Bear with me.

Back in 2013 they raised 200k$ on Kickstarter with something that worked already:
&gt; The game already exists, and the testers are already having good experiences with it. We've got a small crew of testers on the Ludeon forums sharing their experiences with the game. Take it from them, not from me.

Indeed, the game was already looking solid... even if, graphically speaking, we're looking at a Prison Architect copy-and-paste, which I don't think anyone minds, not even the guys from Prison Architect, which is pretty cool; but it's still worth mentioning that not much innovation went there. What I mean by that is that most of the hard work (genre, game mechanics, plot and so on) was long done.

So: that cash, an already big and thriving community, a Kickstarter success, and a Steam Greenlight... all of that before the end of 2013. In that situation you already know what you are facing, what you'll need to change, and so on. And since then, it's been selling at $30.

According to steamdb.info (I don't know how reliable it is, but the numbers don't seem crazy) they have sold about 700k copies.

They have developed 3 DLCs, for approximately $170, $15 and $370. The most expensive one says:

&gt; This DLC gives you the right to enter a name and character backstory into the game, with skills, appearance, and special work requirements. In addition, your character will appear as the leader of another faction!
&gt;
&gt; Write yourself as an interplanetary detective, an entrepreneur, an ex-artist, or anything else you can think of. Players will recruit, command, and fight you for all time!
&gt;
&gt; [...]

Actual game development has been very scarce over these years, considering everything I mentioned that this game has going for it.

Anyhow, I do like this game. I like it a lot. I'm making that clear because, after giving the facts, some people may get the wrong idea. It's not about trashing it, quite the contrary.

I think the guy nailed it in terms of the game itself (BTW he actually has a pretty good book on game design), but with all that money and all that time, maybe (and of course here I can only talk out of my ass because I can't know) he hasn't managed the growth well and/or hasn't allied with someone to do it.

So now, instead of medium-to-big upgrades every month or two (Prison Architect style, another Kickstarter success; or even Minecraft) we have medium-to-big upgrades twice a year.

I hope this can be seen as the constructive criticism from someone that wants this game to crush it big time. And sooner rather than later.

u/jasonwatkinspdx · 2 pointsr/AskComputerScience

It varies in industry. I think it's a great idea to have a general understanding of how processors execute out of order and speculate, how caches and the cache consistency protocols between cores work, and how the language implementation transforms and executes the source you write.

The Hennessy and Patterson book covers almost everything hardware-wise. Skim the areas that seem interesting to you. For language internals I like Programming Language Pragmatics. Compared to other "compiler course" textbooks like the famous dragon book, it's got a lot more of the real-world engineering details. It does cover quite a bit of theory as well, though, and is written in a really straightforward way.

Skimming these two books will give people a pretty accurate mental model of what's going on when code executes.

u/tylerjames · 7 pointsr/movies

It's even more interesting if you don't just think of him as the standard insane-genius trope, but realize that he is probably genuinely disturbed and conflicted about what he's created and what to do with it.

Trying not to be spoiler-y here for people who haven't seen the movie, but there are probably a lot of practical and metaphysical questions weighing on him. Is an AI truly a conscious creature? Does it have wants? If so, what would an AI want? Given that its social manipulation, long-game planning, and deception abilities are off the charts, how could we ever be sure that what it told us was the truth? Does it have any moral considerations toward humans? How would we ever be able to contain it if we needed to? And if it is a conscious creature worthy of moral consideration, then what are the moral ramifications of everything he's done with it so far?

Really interesting stuff. For those inclined I recommend checking out the book Superintelligence by Nick Bostrom as it explores these themes in depth.

u/CWRules · 2 pointsr/blackmirror

&gt; The truth is that the singularity could be reached but never realized as long as you don't connect that super-smart AI to anything.

A super-intelligent AI could probably convince a human to let it out of its confinement (Google "The AI-Box Experiment" for an exploration of this), but even failing that it might think of a way to break free that we can't even conceive of. And if we literally didn't connect it to anything, that would leave us with no way to interact with it, so what was the point of developing it?

The reason I say human-based AI is less risky is because it would implicitly have human values. It wouldn't kill all humans so that we can't stop it from turning the planet into paperclips. Designing a friendly AI from scratch basically requires us to express human ethics in a way a computer can understand, which is not even close to a solved problem.

Nick Bostrom's Superintelligence is a pretty good exploration of the dangers of AI if you're interested in the subject, but it's a fairly difficult read. Tim Urban's articles on the subject are simpler, if much less in-depth.

u/flaz · 17 pointsr/philosophy

You might be interested in a book called On Intelligence, by Jeff Hawkins. He describes something similar to your simulations idea, but he calls it a predictive hierarchical memory system (or something like that). It is a fascinating idea, actually, and makes a lot of sense.

I too suspect that speech is a central unifying aspect to what we call consciousness. A lot of AI guys seem to agree. There is a theory by Noam Chomsky (I think), called Universal Grammar. As I recall, he suspects that may be key to modern intelligence, and he suspects the genetic mutation for it happened about 70,000 years ago, which gave us the ability to communicate, and allowed Homo Sapiens to successfully move out of Africa. I've also read that mutation 70k years ago referred to as the cognitive revolution. But it seems everyone agrees that's when the move out of Africa began, and communication started; it's not just a Chomsky thing.

u/zem · 1 pointr/learnprogramming

non programming, but for history of computers, my two favourite books are where wizards stay up late and dealers of lightning. as a programmer, you will definitely appreciate and be inspired by both of them.

for programming, sandi metz's practical object-oriented design in ruby is a good read, and possibly worth it even if you're not a ruby programmer.

u/old_TA · 6 pointsr/berkeley

Former 61C ugrad TA here. 61C is broken into 6 main ideas, which you can find on the last slide of the first lecture: http://www-inst.eecs.berkeley.edu/~cs61c/sp13/lec/01/2013Sp-CS61C-L01-dg-intro.pdf

From personal experience, 61C seems to be more difficult for most people than 61A or 61B. On the other hand, if you've been struggling with 61A or 61B, then 61C provides a much more level playing field - the material is new for pretty much everyone, so someone who's been programming since the beginning of high school doesn't have as much of an advantage as they do in the earlier classes.

Also I realize that the advice I'm about to give is devalued since I'm a former staff member, but if you want any type of A, READ THE BOOK CAREFULLY (the book I'm referencing is http://www.amazon.com/Computer-Organization-Design-Fourth-Edition/dp/0123747503/ref=dp_ob_title_bk). There are tons of subtleties in the material that we simply don't have enough time to cover in lecture/discussion/lab but are essential to doing well on projects/exams. The book is meaty, but probably the best book in the world for this material.

Feel free to respond to this if you have more questions.

u/Watabou90 · 1 pointr/learnprogramming

If you want x86 assembly, this book is very good: http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040/ref=dp_ob_title_bk/180-6741587-3105245

I'm taking an assembly class this semester that involves writing assembly from scratch, and this book (which is required for the class) is a lifesaver because the professor isn't that great at summarizing the important points.

I think it's a good book. It starts easy and it has a lot of exercises that have answers on the back of the chapter so you can check your answers pretty easily.

u/quantifiableNonsense · 3 pointsr/AskEngineers

Self taught professional software engineer here.

Which language you learn is not as important as learning about data structures and complexity analysis. Code organization is also very important.

Pick one high level scripting language (like Python, Ruby, Perl, etc) and one low level systems language (C, C++, Rust, etc) and learn them both inside out.
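To see why the data structures and complexity point matters in practice, here's a tiny Python experiment (my own illustration, not from either book): membership testing in a list is O(n), while in a set it's O(1) on average.

```python
import timeit

n = 100_000
as_list = list(range(n))
as_set = set(as_list)
target = n - 1  # worst case for the linear scan through the list

# Time 100 membership tests against each structure.
list_time = timeit.timeit(lambda: target in as_list, number=100)
set_time = timeit.timeit(lambda: target in as_set, number=100)
print(f"list: {list_time:.4f}s, set: {set_time:.6f}s")  # set wins by orders of magnitude
```

Interviewers probe exactly this kind of thing: not "do you know Python", but "do you know which structure makes this operation cheap".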

A couple of books I recommend:

  • Code Complete
  • SICP

    As far as practical skills go, you need to learn how to use git (or whatever VC system the companies you are interested in use). You need to learn how to use Unix systems. A great introduction is The UNIX Programming Environment. You need to learn how to read other peoples' code, open source projects are great for that.

    When you are getting ready to interview, there is no better resource than Cracking the Coding Interview.
u/Aeiorg · 1 pointr/GraphicsProgramming

1 ) Real-Time Rendering is the go-to bible for every graphics programmer. It's starting to get pretty old tho but still relevant. Then you have a lot of different books for different topics, like real-time shadows, VR, etc. In fact, what you are looking for is a computer graphics course/book, not an OpenGL one (or you can try OpenGL SuperBible). OpenGL is a tool (you can learn this one, or DirectX, or Vulkan...), computer graphics is the science (you learn it once, and you use an API to apply it). You have tons of books for computer graphics; some are more into mathematics, some are more into specific techniques, etc.

2 ) OpenGL is just an API; it doesn't do anything by itself. Therefore, "the rest" is just a superset of functions that pass data to the GPU differently and give you more control over your app. If you want to begin to understand why/how OpenGL is evolving, you can have a look at this supa great talk

Have fun learning !

u/Gaff_Tape · 6 pointsr/ECE

Not sure about EE-related topics, but for CE you're almost guaranteed to use these textbooks:

u/Brianfellowes · 4 pointsr/askscience

The other answer in this thread is a bit simplistic, and not quite correct in some cases.

First, let's look at where differences in IPC (instructions per cycle) can arise. In essence, all Intel and AMD CPUs are Von Neumann machines, meaning they perform computations by reading data from memory, performing an operation on that data, and then writing the result back to memory. Each of those actions takes time (more specifically, cycles) to perform. Computers can read from memory using load instructions. Computers operate on data through logical instructions (add, subtract, multiply, divide, bit shift, etc.) or control instructions (conditional/unconditional branches, jumps, calls... basically instructions to control what code gets executed). Computers write data to memory through store instructions. All useful programs will use all 4 types of instructions to some degree. So in order to improve IPC, you can implement features which speed up the operation of those instructions, at least over the lifetime of the program (in other words, they improve the average performance of an instruction).

So how can you improve the operation of these instructions? Here's a crash course in (some) major features of computer architectures:

  1. Pipelining: Instead of doing an instruction in 1 cycle, you can do 1/Nth of an instruction in 1 cycle, and the instruction will take N cycles to complete. Why do this? Let's say you split an instruction's execution into 3 parts. Once the first 1/3 of instruction 0 is completed on cycle 0, you can execute the second 1/3 of instruction 0 in cycle 1 as well as the first 1/3 of instruction 1. The overall benefit is that if you can execute 1 instruction in t time, you can execute 1/n of an instruction in t/n time. So our 3-stage pipeline can still on average do 1 instruction per cycle, but it can run 3 times faster. Practical impact: the processor frequency can be greatly increased. In this case, by 3x.
  2. Caching: Believe it or not, loads and stores to memory take far far far longer than logical or control instructions. Well, at least without the caching optimization. The idea of caching is to keep a small, fast memory close to the processor and the larger, slower memory farther away. For example, if you sat down at your desk and wanted a pencil, where would you want to have it? On the desk? Inside the desk drawer? In your storage closet? Or down the street at the office supply store? You have a small number of things you can fit on top of your desk, but keeping your pencil there is the best if you use it frequently. Practical impact: the average time it takes to access RAM is somewhere between 50 and 120 cycles. The average time to access the L1 cache (the fastest and smallest cache) is 3-5 cycles.
  3. Superscalar processing: Let's say that you have the following code:

    a = b + c
    d = e + f
    This code will form two add instructions. One key thing to note is that these two instructions are completely independent, meaning that the instructions can be performed in any order, but the result will be the same. In fact, the two instructions can be executed at the same time. Therefore, a superscalar processor will detect independent instructions, and try to execute them simultaneously when possible. Practical impact: allows the processor to reach an IPC higher than 1. Code varies a lot, but the theoretical IPC maximum for most single-thread programs is somewhere between 2-5.
  4. Branch prediction: When we introduce pipelining, we run into a problem where we might not be able to execute the first 1/3 of an instruction because we don't know what it is yet. Specifically, if we have a control instruction, we need to complete the control instruction before we can figure out what the next instruction to execute is. So instead of waiting to finish the control instruction, we can predict what the next instruction will be and start executing it immediately. The processor will check its prediction when the control instruction finishes. If the prediction is correct, then the processor didn't lose any time at all! If it guesses incorrectly, it can get rid of the work it did and restart from where it guessed wrong. Practical impact: modern processors predict correctly 98+% of the time. This saves many, many cycles that would otherwise be spent waiting.
  5. Out of order / speculative processing: Building on superscalar processing, processors can try to guess on a lot of things in advance. Let's say there's a load instruction 10 instructions ahead of where the processor is currently executing. But, it doesn't look like it depends on any of the previous 9 instructions? Let's execute it now! Or, what if it depends on instruction 5? Let's guess at the result of instruction 5 and use it to execute instruction 10 anyways! If the processor guesses wrong, it can always dump the incorrect work and restart from where it guessed wrong. Practical impact: it can increase IPC significantly by allowing instructions to be executed early and simultaneously.
  6. Prefetching: A problem with caching is that the cache can't hold everything. So if the processor needs data that isn't in the cache, it has to go to the large, slow RAM to get it. Think of it like when you look in your refrigerator for milk, but you don't have any, so you have to spend time going to the store. Well, processors can try to guess about the data they will need soon, and fetch that data from RAM to put it in the cache before it's needed. Think of it like realizing your milk is low, so you stop by the store and pick some up on the way home from work. That way, when you actually need the milk it will already be there! Practical impact: prefetching can significantly reduce the average time it takes to get data from RAM, thus increasing IPC.

    The conclusion
    Knowing exactly why AMD's architecture doesn't have the same IPC as Intel's is difficult, because essentially no one has access to the internal design details of both Intel and AMD simultaneously. It would be like trying to tell how someone got sick - you can come up with a lot of educated guesses and theories, but there's not really a way to know for sure.

    Another reason is that many of the inventions that go into CPU microarchitectures are patentable. So it could easily be that Intel has certain patents that they are unwilling to license or AMD doesn't want to license.

    To put things in perspective, both Intel and AMD perform all of the above items I listed. The main difference between the two is how they are implemented. Their architectures will differ in how many pipeline stages they use, how many instructions they can execute at the same time, how far ahead they can look for independent instructions, how they decide which data to prefetch, etc. These will cause minor differences in IPC.

    The bottom line
    One of the larger differences between the two recently, in my opinion, has been small differences and techniques on how they implement speculation related to load instructions. Intel pulls more tricks related to trying to guess the value of a load before it is known for sure, and doing so quickly and correctly. It's hard for AMD to replicate these because the techniques are either trade secrets or patented.

    Edit: many of these techniques are described in the "bible" of computer architecture:
    J. Hennessy, and D. A. Patterson. Computer architecture: a quantitative approach. Elsevier, 2011.
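The arithmetic behind point 1 (pipelining) can be sketched as a toy Python model - this is just the cycle counting described above, not any real CPU:

```python
def unpipelined_cycles(n_instructions, n_stages):
    # Without pipelining, each instruction occupies the whole datapath
    # for n_stages cycles before the next one can start.
    return n_instructions * n_stages

def pipelined_cycles(n_instructions, n_stages):
    # With pipelining, the first instruction takes n_stages cycles to
    # fill the pipe; after that, one instruction finishes every cycle.
    return n_stages + (n_instructions - 1)

# 9 instructions through a 3-stage pipeline:
print(unpipelined_cycles(9, 3))  # 27 cycles
print(pipelined_cycles(9, 3))    # 11 cycles
```

As the instruction count grows, the pipelined count approaches 1 cycle per instruction while each cycle is 3x shorter, which is exactly the speedup claimed above.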
u/nimblerabit · 3 pointsr/compsci

I learned mostly through reading textbooks in University, but not many of the books we were assigned stood out as being particularly great. Here's a few that I did enjoy:

u/charles__l · 11 pointsr/lisp

Lisp is like magic - it's the programmable programming language - if you learn it, everything else kind of pales in comparison :P

One fascinating aspect of lisp is that it's based on lambda calculus, which is basically a cleaner alternative to Turing machines (Turing machines are basically a mathematical way to describe computable problems). After learning about lambda calculus, Turing machines looked like a hack to me. A decent non-mathematical guide I found introducing them was this: http://palmstroem.blogspot.com/2012/05/lambda-calculus-for-absolute-dummies.html
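As a small taste of the lambda-calculus idea mentioned above, here are the classic Church numerals - written in Python rather than a Lisp purely so the snippet is self-contained (my own illustration, not from the linked guide):

```python
# Church numerals: the number n is encoded as "apply f to x, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Interpret a Church numeral by counting function applications.
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))  # 4
```

Everything here is built from nothing but single-argument functions, which is the whole point: lambda calculus gets arithmetic (and, with more work, all computation) out of function application alone.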

Even though lisp allows for a lot of functional programming, it's not purely functional, and can be used to write object oriented code, or anything else really.

The books I'd recommend to learning it are:

  • The Little Schemer - a lovely, beginner friendly book that introduces Lisp and computation in a rather unique way.
  • Structure and Interpretation of Computer Programs - this is the book that was used to teach a bunch of programming classes at MIT, and is a classic text for computer science. Despite its advanced topics, it's still rather approachable, especially if you have a decent amount of programming background.
u/Kadoba · 2 pointsr/gamedev

I personally love Programming Game AI By Example. It gives lots of very usable examples in an entertaining and understandable way. It's pretty beginner-friendly and even offers a game-math primer at the start of the book. However, the examples still have a lot of meat to them and thoroughly explain some important AI concepts like state machines and pathfinding.

u/Denis_Vo · 3 pointsr/algotrading

I would highly recommend to read the following book

https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1491962291/ref=mp_s_a_1_2?keywords=machine+learning&qid=1566810016&s=gateway&sprefix=machi&sr=8-2

I think it is the best one about ml/dl. Not sure that they already updated the tensorflow examples to tf 2.0 and keras.

And as tensorflow includes keras now, and has perfect pipeline for deploying your model, i think it is the perfect choice. :)

u/InfinitysDice · 1 pointr/shittysuperpowers

Well, there are a lot of potential dangers to creating kittens with greater brainpower than we could imagine. It's essentially the superintelligent-AI problem: it's tricky to create something more powerful than ourselves without running into a large host of problems where the AI slips into a mode that isn't value-aligned with us. Maybe with the right types of check-boxes it could be done, though this runs into a second problem:


I'm not at all sure that you can create superintelligent kittens and be at all sure that you can still call them kittens. Any noun is an idea with other ideas attached to it, and if you change any of those defining ideas enough, language, or human convention, tends to call that original noun by a different name.


If the superintelligent kittens would rightly be called something other than kittens, I suspect there would be no checkboxes that would point to them, or allow them to be designed or created.


Further, there are always ethical dilemmas surrounding intelligent species and the willy-nilly creation of them, especially with the intent of placing them into service, and all the more so if doing so would cause them to suffer.


Anyhow, thanks for the submission, I enjoyed playing with it. :D

u/shinyhare · 1 pointr/ECE

After checking out some popular books besides the ones I learned from, for digital logic I found Schaum's Outline of Digital Principles is surprisingly good, and concise. You could definitely get by with that, googling anything that doesn't 'click' right away.

There are many books that go beyond basic digital logic to build things like microprocessors and embedded systems, so it's hard to give a solid recommendation (and in retrospect, all the ones I've read were way too verbose, imo). The one I'm most familiar with is this one. It's cool since it explains how programming languages are translated down to the hardware level, and different processor architectures.

In any case, doing projects as you go along is probably going to be more important, and will teach you more than the reading itself.

u/guiraldelli · 1 pointr/compsci

Excellent! I'm glad to know the concept is clear to you.

I would recommend using the Lewis & Papadimitriou book as well as the Sipser one: for me, the former is more formal, while the latter is more didactic (especially for undergraduate students); however, both use simple language and are very readable.

My advice is: take both books and keep studying from them. I learned theoretical computer science from the Lewis & Papadimitriou book, but whenever I couldn't get a concept, I went to Sipser. And vice-versa.

At last, the 2012 (3rd) edition of the Sipser is so beautiful, with good automata drawings to understand Pushdown Automata. :)

u/wizardU2032 · 4 pointsr/gamedesign

The best book by someone who's been commercially successful is Designing Games, by Tynan Sylvester of Rimworld: https://smile.amazon.com/Designing-Games-Guide-Engineering-Experiences/dp/1449337937/

It is the best at actually applying all of the navelgazing people tend to do when talking about game design and art and theory and so forth towards actually creating compelling structures and content for games.

u/shadowblade7536 · 15 pointsr/hacking

There are online forums that provide tutorials on how to hack certain things, so read those and try them on your own devices or on devices you have permission to attack.

Examples of those forums: [NullByte](https://null-byte.wonderhowto.com/) and [BlackMOREOps](https://www.blackmoreops.com/)

Download Kali, load it onto a USB and look at the tools, especially [Metasploit](https://www.metasploit.com/), and play with port scanners and such. I'd also recommend running vulnerable VMs such as Metasploitable and running vulnerable web apps such as [DVWA](http://www.dvwa.co.uk/).

When it comes to writing code, Python excels for writing hacking tools. There are books about that, such as [Violent Python](https://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579) and [Black Hat Python](https://www.nostarch.com/blackhatpython). I'm sure there are some about writing payloads and exploits in C, but I can't really remember the names.

If you have any questions, feel free to ask! And remember one thing: Be as creative as you can when experimenting. You'll learn a great deal that way.

u/binary_is_better · 3 pointsr/AskEngineers

> let's say I want to do 5*5. How does it take 5, and multiply the same thing in binary

How this is done varies from processor to processor. MIPS are usually the easiest processor to understand. When I learned this stuff in school we started with MIPS. I work on ARM processors mostly now (smartphones), but at a high enough level that I don't worry about the type of details that you're asking now.

000000 00100 00101 00010 00000 100000

In MIPS, that binary sequence means add two numbers together. So if the CPU saw that binary sequence, it would first look at the first six digits. This is called the op code. My memory of what these do exactly is fuzzy, so I'll leave it to someone else to answer. The next five digits tell the CPU to grab the binary digits it is storing in register 4, and the five after that tell the CPU to grab the binary digits it is storing in register 5. The next 5 digits tell the CPU that when it is done working with the numbers, it should store the result in register 2. The next 5 digits are ignored for this example. The last 6 digits tell the CPU that it should add these numbers together.

If you previously stored the numbers 3 and 17 in registers 4 and 5, register 2 should now hold 20. (It's a different MIPS instruction to store a number, and yet another instruction to retrieve the number.)
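As a rough sketch of the field layout just described (a toy decoder, not a real assembler or simulator), the bit string can be pulled apart in Python:

```python
def decode_rtype(word):
    """Split a 32-bit MIPS R-type instruction into its named fields."""
    bits = word.replace(" ", "")
    assert len(bits) == 32
    return {
        "opcode": int(bits[0:6], 2),   # 0 for R-type instructions
        "rs":     int(bits[6:11], 2),  # first source register
        "rt":     int(bits[11:16], 2), # second source register
        "rd":     int(bits[16:21], 2), # destination register
        "shamt":  int(bits[21:26], 2), # shift amount (ignored for add)
        "funct":  int(bits[26:32], 2), # 0b100000 (32) selects add
    }

fields = decode_rtype("000000 00100 00101 00010 00000 100000")
print(fields)  # rs=4, rt=5, rd=2, funct=32, i.e. add $2, $4, $5
```

This matches the walkthrough above: sources in registers 4 and 5, result into register 2.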

I should note that most computer scientists never work at this low level of detail. If we want to add two numbers together and store the result, we just type "a = b + c;". That would take the number stored in location b, add it to the number stored in location c, and then store it in location a. We wouldn't care if a, b, or c are in registers, cache, or RAM. Those details are handled by the computer (well, compiler), not us.

As to how the processor adds the numbers together, ask a hardware guy. I don't really remember, and to be honest I never really understood it well either.

If you want to delve deeper into this subject, this is a good book, but be warned it assumes you already have a decent grasp of computer science.

As for the second part of your question, it has to do with the number of cores and what they specialize in. CPUs generally have just a few cores, maybe 1 to 8. They are also general purpose, so they can do a lot of things and are very powerful. This monster video card from AMD has 2048 stream processing units on it. None of those processing units are very powerful, and they can really only do a few tasks (which just so happen to be the ones that graphics need). But it can do 2048 of them at a time versus 1 to 8 on a CPU. That's the difference between a CPU and a GPU.

Take the Mythbusters example. Their "GPU" can only paint the Mona Lisa, nothing else. But it can paint it very fast. The "CPU" could be programmed to paint anything. It just takes a lot longer to paint it. Actually, that's a bad example. A CPU will beat a GPU at almost everything. GPU's can only do a few tasks, but the tasks they can do they are much better at than the CPU.

u/kapelin · 1 pointr/Teachers

In college I took an intro course where we learned to code in Python. I liked the book a lot and felt it explained everything at a pretty basic level: http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/ref=sr_1_2?s=books&ie=UTF8&qid=1373804276&sr=1-2&keywords=python

I realize college is not the same as high school, but maybe you could read the book and adapt it to your course. Even if you don't use it for your course, I recommend it if you want to try Python! Good luck!

u/bluebathysphere · 2 pointsr/compsci

The two starting books that gave me a great deal of understanding on systems (which I think is one of the toughest things to grasp and CLRS and the Art of Programming have already been mentioned):

[Computer Systems: A Programmer's Perspective](http://www.amazon.com/Computer-Systems-Programmers-Perspective-Edition/dp/0136108040/ref=sr_1_2?ie=UTF8&qid=1407529949&sr=8-2&keywords=systems+computer)

This along with its labs served as a crash course in how the system works, particularly a lot about assembly and low-level networking.

The Elements of Computing Systems: Building a Modern Computer from First Principles

I've mostly only done the low-level stuff but it is the most fun way I have found to learn starting all the way at gate architecture. It pairs well if you have read Petzold's Code. A great introduction to the way computers work from the ground up.

u/ACoderGirl · 3 pointsr/IAmA

No idea about online resources, but this is the text my class on it used (for MIPS32 assembly -- should be pretty transferable to other RISC languages like ARM). The focus really is on the architecture of the CPU and how that relates to assembly. Which I feel is the important thing to learn, though. I feel like the text is enough. The lectures of that class didn't really do anything beyond what the text covered, anyway.

For exercises with assembly specifically, all the standard beginner programming problems can be used (like basics of looping and conditionals). Really, anything from a first-year textbook should be an interesting challenge in assembly simply because it's sooo much simpler. It's not like there's very much to learn, because assembly is pretty minimal with what it can do. Everything complex is just a shit ton of code (also why few would build anything large in pure assembly). You could have your compiler compile, say, a C program to assembly (gcc's -S flag) to practice reverse engineering. It'll let you see what assembly your compiler is generating and try to understand it (it'll be more complex and optimized than human-written assembly). Or you could even grab a disassembler, disassemble some existing binary, and try to understand a small portion of its assembly to see what it's doing.

u/nabnob · 3 pointsr/AskReddit

Are you in high school or college?

C# is very similar to Java - it's object oriented, has garbage collection (meaning you can get away with not learning about memory), and is strongly typed. I wouldn't really say it's that useful to learn if you already know Java, unless you end up working for a software company that does work in C#.

C doesn't have any of those nice features of Java and C# (strong typing, garbage collection), and all variables - pointers, integers, characters - are treated as bits stored somewhere in memory, either on the stack or the heap. Arrays and structs (similar to objects in Java, sort of) are longer blocks of memory. C++ is an object-oriented version of C, and if you already know C and Java you would be able to pick up C++ fairly quickly.

Learning C forces you to learn a lot of memory and system concepts. It's not really used in the software industry as much because, since it's missing all those nice Java and C# features, it can be difficult to write huge, complicated systems that are maintainable. If you want to be a serious developer, you DO need to learn these things before you graduate from college. Most major software companies ask systems/memory type questions in their interviews.

However, if you're in high school, I wouldn't say it's really necessary to try to learn C on your own unless you really want to. A good computer science program in college would require at least one class on C programming. If you are really interested, I would look at this to learn C, and later this for more information on how computers work.

TLDR; Learn C in college if want to be a software engineer and, if you're in high school, learn whatever you find interesting.

u/praxis22 · 1 pointr/skyrim

Ah, you mean TV & movie AI :) I'm not sure if we'll ever get there, but superintelligent AI is reckoned to be only a short hop away from general-purpose AI. There is a series of blog posts on waitbutwhy.com which are the most cogent I've ever seen or read on the subject. A long read, but a must-read if you're at all interested in the state of the art.

However, in one of the posts you'll find the results of a survey of domain experts about when AI will happen, probabilistically. It's from Nick Bostrom, an autodidact who wrote Superintelligence and a leading thinker about AI at Oxford University. The earliest estimate of true AI is 2025 (25%), then 2040 (50%), and 2060 (75%) - those percentages are from memory, but the years should be right. Go check the post. But that's allegedly what AI experts thought when asked at an AI conference.

Google's DeepMind are essentially running "an Apollo program for AI" (their words) and have about 600 academics on staff full time working on the issues. They already beat the best human player at Go, and until they did that it was a feat thought to be 10 years away. This is coming, it's just a matter of when.

u/vvillium · 14 pointsr/compsci

https://www.amazon.com/Quantum-Computation-Information-10th-Anniversary/dp/1107002176

Best book hands down. This will bring you to the frontier of quantum computing. The book is also very approachable and meant for people trying to learn. It covers some linear algebra as well as physics in order to bring you up to speed.



Michael Nielsen is an amazing educator and expert in the field. His YouTube lecture course https://www.youtube.com/playlist?list=PL1826E60FD05B44E4, Quantum Computing for the Determined, is a short version of that book. He also has a free online book on neural networks that is probably the most referenced source on the matter: http://neuralnetworksanddeeplearning.com/index.html

u/lotusstp · 3 pointsr/technology

Tip of the hat to the pioneers... Lawrence Roberts, Vint Cerf, Bob Taylor, Ivan Sutherland, Douglas Engelbart and J.C.R. Licklider, among many others. Well worth studying up on these dudes. Some excellent reads (available at your public library, natch): "Dealers of Lightning" an excellent book about Xerox PARC; "Where Wizards Stay Up Late" a fascinating book about MIT and DARPA; J.C.R. Licklider and the Revolution That Made Computing Personal a turgid yet compelling book about J.C.R. Licklider and his contemporaries.

u/grahamboree · 4 pointsr/gamedev

The Starcraft Broodwar API has source code for a bunch of bots from the annual competition at AIIDE. You can find them here. They use a variety of techniques that will help you set you in the right direction.

I'd recommend this book too if you're interested in AI. It's the most comprehensive survey of the most common techniques used in the industry today.

Good luck!

u/frizzil · 3 pointsr/VoxelGameDev

Agreed 100%. Additionally, if you're trying to learn basic OpenGL, Java combined with LWJGL is actually a great choice, since the language is generally quick to iterate with. And definitely go with the advanced pipeline, as learning immediate mode isn't going to help you much if advanced is your end goal.

Also, big piece of advice -- you're really going to want a solid understanding of 3D matrix math before diving in. Particularly, you're going to want to know the difference between row-major and column-major systems, and how to perform basic manipulations in both. To this end, I highly recommend the book Mathematics for 3D Game Programming and Computer Graphics.

u/slacker87 · 9 pointsr/networking

I LOVE following the history of networking, awesome find!

If you end up wanting more, where wizards stay up late and dealers of lightning are great reads about the people behind the early internet.

u/DMRv2 · 2 pointsr/emulation

I don't know of any resources on emulation in general, sorry. What aspect of emulation are you interested in? Dynamic recompilation? Binary translation? Cycle-accurate emulation?

If you're interested in cycle-accurate emulation, it helps to have a background in computer architecture. H&P is a great textbook:
https://www.amazon.com/Computer-Architecture-Fifth-Quantitative-Approach/dp/012383872X

u/arsenalbilbao · 9 pointsr/learnpython
  1. if you want to LEARN how to write programs - read "Structure and Interpretation of Computer Programs" (SICP) in Python (project: you will write an interpreter for the "scheme" programming language in Python)

  2. if you want to TRAIN your OOP skills - Building Skills in Object-Oriented Design (you will code 3 games - roulette, craps and blackjack)

  3. Helper resources on your way:
    3.1. Dive into python 3 (excellent python book)
    3.2. The Hitchhiker’s Guide to Python! (best practice handbook to the installation, configuration, and usage of Python on a daily basis.)
    3.3 Python Language Reference ||| python standard library ||| python peps

  4. if you want to read some good python code - look at flask web framework (if you are interested in web programming also look at fullstackpython

  5. good but non-free books
    5.1. David Beazley "Python cookbook" (read code snippets on python)
    5.2. Dusty Phillips "Python 3 Object Oriented Programming" (learn OOP)
    5.3. Luciano Ramalho "Fluent python" (Really advanced python book. But I haven't read it YET)

  6. daily challenges:
    6.1. r/dailyprogrammer (easy, intermediate and advanced challenges) (an easy challenge example)
    6.2. mega project list

  7. BONUS
    From NAND to Tetris (build a general-purpose computer system from the ground up) (part1 and part2 on coursera)
u/white_nerdy · 1 pointr/learnprogramming

> I want to be able to create functional programs with a bucket of transistors, a cup of magnets, pen, and paper

I've heard good things about nand2tetris which goes from logic gates to a complete system with simple assembler, compiler and OS.

One good exercise might be to create an emulator for a simple system, like CHIP-8 or DCPU-16.

If you want to go deeper:

  • If you want to build compilers, the dragon book is the go-to resource.

  • If you want to start learning about theory, I recommend Sipser.
u/ImNot_NSA · 2 pointsr/worldnews

Good point. If you want to read about superior alien intelligence, check out http://www.amazon.com/gp/aw/d/0199678111/ref=mp_s_a_1_1?qid=1420566893&sr=8-1 Our civilization is currently giving birth to an intelligent life form beyond our imagination.

u/mitchell271 · 1 pointr/Python

Been using python for 5 years, professionally for 1. I learn new stuff every day that makes my python code more pythonic, more readable, faster, and better designed.

Your first mistake is thinking that you know the "things that every python programmer should know." Everyone has different opinions of what you should know. For example, I work with million+ entry datasets. I need to know about generators, the multiprocessing library, and the fastest way to read a file if it's not stored in a database. A web dev might think that it's more important to know SQLAlchemy, the best way to write UI tests with selenium, and some sysadmin security stuff on the side.
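As an example of the generator point above, here's a minimal sketch for streaming a huge file without loading it all at once (the file name is made up for illustration):

```python
def read_records(path):
    # A generator: yields one line at a time, so even a multi-gigabyte
    # file never has to fit in memory all at once.
    with open(path) as f:
        for line in f:
            yield line.rstrip("\n")

# Usage sketch: stream through the dataset lazily.
# total = sum(1 for record in read_records("big_dataset.txt"))
```

Because the `for line in f` loop already reads lazily, memory use stays flat regardless of file size - which is exactly why generators matter for million+ entry datasets.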

If you're stuck with what to learn, I recommend Effective Python and Fluent Python, as well as some Raymond Hettinger talks. You'll want to go back through old code and make it more pythonic and efficient.

u/SUOfficial · 21 pointsr/Futurology

This is SO important. We should be doing this faster than China.

A branch of artificial intelligence research concerns breeding and gene editing. Selecting for genetic intelligence could lead to rapid advances in human intelligence. In 'Superintelligence: Paths, Dangers, Strategies', the most recent book by Oxford professor Nick Bostrom, as well as his paper 'Embryo Selection for Cognitive Enhancement', the case is made for very simple advances in IQ by selecting certain embryos for genetic attributes, or even, in this case, breeding for them, and the payoff in terms of raw intelligence could be staggering.

u/____candied_yams____ · 20 pointsr/learnpython

OOP in python is a bit different than in other languages because python doesn't necessarily advocate for private data members the way many statically typed languages like C++ and Java do. Additionally, the built-in property decorator eliminates the need for getters and setters up front in class design.
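A minimal sketch of that property-based style (the class and attribute names here are invented for the example):

```python
class Temperature:
    def __init__(self, celsius):
        self.celsius = celsius  # routed through the setter below

    @property
    def celsius(self):
        # Reads still look like plain attribute access: t.celsius
        return self._celsius

    @celsius.setter
    def celsius(self, value):
        # Validation added later, without breaking existing callers.
        if value < -273.15:
            raise ValueError("below absolute zero")
        self._celsius = value
```

Callers write `t.celsius = 25` as if it were a public field; if validation or computation is needed later, you bolt on a property instead of writing getX/setX boilerplate up front.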

I have lots of Python books, but the two that helped me write pythonic classes were Effective Python (I actually only have the 1st edition) and Fluent Python. The first book is probably a great crash course, so if you get just one book get that one, but the latter is good too and goes into much more detail without coming across like a dense reference text.

James Powell also has some great videos on pythonic programming, a lot of which involves OOP style.

u/Quinnjaminn · 3 pointsr/cscareerquestions

Copy pasting my response to a similar question:

Edited to have more resources and be easier to read.

It's hard to draw the line between "essential" and "recommended." That depends a lot on what you want to do. So, I will present a rough outline of core topics covered in the 4 year CS program at my university (UC Berkeley). This is not a strict order of topics, but prerequisites occur before topics that depend on them.

Intro CS

Topics include environments/scoping, abstraction, recursion, object-oriented vs. functional programming models, strings, dictionaries, and interpreters. Taught in Python.

The class is based on the classic MIT text, "Structure and Interpretation of Computer Programs." Of course, that book is from 1985 and uses Scheme, which many people don't want to learn due to its rarity in industry. We shifted recently to reading materials based on SICP, but presented in Python. I believe this is the reading used now. This course is almost entirely posted online. The course page is visible to the public, and has the readings, discussion slides / questions and solutions, project specs, review slides, etc. You can find it here.

Data Structures and basic algorithms

DS: Arrays, Linked Lists, Trees (Binary Search, B, Splay, Red-Black), Hash Tables, Stacks/Queues, Heaps, Graphs. Algorithms: Search (breadth-first vs. depth-first), Sorting (bubble, radix, bucket, merge, quick, selection, insertion, etc.), Dijkstra's and Kruskal's, Big-O analysis.

This class uses two books: "Head First Java" and "Data Structures and Algorithms in Java" (any edition except 2). The class doesn't presuppose knowledge of any language, so the first portion covers object-oriented principles and Java from a Java book (doesn't really matter which), then moves to the core topics of data structures and algorithms. The course page has some absolutely fantastic notes -- I skim through these before every interview to review. You can also check out the projects and homeworks if you want to follow along. The course page is available here (note that it gets updated with new semesters, and links will be removed -- download them soon if you want to use them).
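One of the algorithms named above, Dijkstra's, can be sketched briefly; the course teaches it in Java, but a Python version with a binary heap shows the same idea (the graph encoding here is an assumption, not from the course):

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source over non-negative edge weights.
    graph: {node: [(neighbor, weight), ...]} adjacency list.
    Runs in O((V + E) log V) with a binary heap."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

The "stale entry" check is the standard trick for heaps that lack a decrease-key operation, which is usually how the algorithm is presented in a data structures course.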

Machine Structures (Intro Architecture)

Warehouse-scale computing (Hadoop MapReduce). C language, basics of assemblers/compilers/linkers, bit manipulation, number representation. Assembly language (MIPS). CPU structure, pipelining, threading, virtual memory paging systems. Caching / memory hierarchy. Optimization / performance analysis, parallelism (OpenMP), SIMD (SSE intrinsics).

This class uses two books: "The C Programming Language" and "Computer Organization and Design". This class is taught primarily in C, so the first few weeks are spent as a crash course in C, along with a discussion/project using MapReduce. From there it jumps into Computer Organization and Design. I personally loved the projects I did in this class. As with above, the lecture slides, discussion notes, homeworks, labs, solutions, and projects are all available on an archived course page.
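The "number representation" topic above is about how machines encode signed integers; the course works in C and MIPS, but the idea can be sketched in Python (function names and the 8-bit width are illustrative):

```python
def to_twos_complement(value, bits=8):
    """Encode a signed integer as its two's-complement bit pattern."""
    return value & ((1 << bits) - 1)

def from_twos_complement(raw, bits=8):
    """Decode a two's-complement bit pattern back to a signed integer."""
    if raw & (1 << (bits - 1)):      # sign bit set -> negative value
        return raw - (1 << bits)
    return raw
```

For example, -1 in 8 bits is the pattern 0xFF, and 0x80 decodes to -128, the most negative 8-bit value; this is the representation C's fixed-width signed types use on essentially all modern hardware.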

Discrete Math / Probability Theory

Logic, Proofs, Induction, Modular Arithmetic (RSA / Euclid's Algorithm). Polynomials over finite fields. Probability (expectation / variance) and its applicability to hashing. Distributions, Probabilistic Inference. Graph Theory. Countability.

Time to step away from coding! This is a math class, plain and simple. As for book, well, we really didn't have one. The class is based on a series of "Notes" developed for the class. When taken as a whole, these notes serve as the official textbook. The notes, homeworks, etc are here.
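The RSA / Euclid's algorithm topic above rests on one computation: the modular inverse via the extended Euclidean algorithm, which is how an RSA private exponent is derived from the public one. A minimal sketch:

```python
def extended_gcd(a, b):
    """Return (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

def mod_inverse(a, m):
    """Return a^-1 mod m, i.e. the x with (a * x) % m == 1."""
    g, x, _ = extended_gcd(a, m)
    if g != 1:
        raise ValueError("no inverse: a and m are not coprime")
    return x % m
```

In RSA terms, the private key d is mod_inverse(e, phi) for public exponent e and totient phi; the same routine is what makes modular division possible in the finite-field polynomial material.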

Efficient Algorithms and Intractable Problems

Designing and analyzing algorithms. Lower bounds. Divide and Conquer problems. Search problems. Graph problems. Greedy algorithms. Linear and Dynamic programming. NP-Completeness. Parallel algorithms.

The Efficient Algorithms class stopped posting all of the resources online, but an archived version from 2009 has homeworks, reading lists, and solutions. This is the book used.
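Dynamic programming, one of the topics listed above, is easiest to see in a classic example; edit distance is a standard exercise in courses like this (the choice of example is mine, not the course's):

```python
def edit_distance(a, b):
    """Levenshtein distance via bottom-up dynamic programming.
    dp[i][j] = edits to turn a[:i] into b[:j]; O(len(a) * len(b))."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i                  # delete everything from a
    for j in range(n + 1):
        dp[0][j] = j                  # insert everything from b
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # delete
                           dp[i][j - 1] + 1,        # insert
                           dp[i - 1][j - 1] + cost) # substitute / match
    return dp[m][n]
```

The hallmark of DP is visible here: the problem is broken into overlapping subproblems (prefixes of the two strings) whose answers are stored in a table instead of recomputed.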

Operating Systems and System Programming

Concurrency and Synchronization. Memory and Caching. Scheduling and Queuing theory. Filesystems and databases. Security. Networking.

The Operating Systems class uses this book, and all of the lectures and materials are archived here (Spring 2013).

Math

Those are the core classes, not including about 4 (minimum) required technical upper division electives to graduate with a B.A. in CS. The math required is:

  • Calculus 1 and 2 (Calc AB/BC, most people test out, though I didn't)

  • Multivariable calculus (not strictly necessary, just recommended)

  • Linear Algebra and Differential Equations.

Those are the core classes you can expect any graduate from my university to have taken, plus 4 CS electives related to their interests. If you could tell me more about your goals, I might be able to refine it more.

u/cbarrick · 2 pointsr/computing

Sipser's Introduction to the Theory of Computation is the standard textbook. The book is fairly small and quite well written, though it can be pretty dense at times. (Sipser is Dean of Science at MIT.)

You may need an introduction to discrete math before you get started. In my undergrad, I used Rosen's Discrete Mathematics and Its Applications. That book is very comprehensive, but that also means it's quite big.

Rosen is a great reference, while Sipser is more focused.

u/blowaway420 · 1 pointr/RationalPsychonaut

Very interesting. You might be interested in

https://en.m.wikipedia.org/wiki/On_Intelligence

https://www.amazon.de/Intelligence-Jeff-Hawkins/dp/0805078533

It was pretty popular and was read a lot among AI researchers. It's easy to understand.

Consciousness, prepare to be understood!

u/browwiw · 2 pointsr/HaloStory

I'm currently listening to the audiobook of Nick Bostrom's Superintelligence: Paths, Dangers, Strategies, so I'm kind of hyped on AI and its possible existential threat right now. The Halo writers are greatly downplaying what a powerful superintelligence can do. Once in control of the Domain, and properly bootstrapped to godhood, Cortana wouldn't have need for the Guardians or any of the Promethean infrastructure. She could just start converting matter into computronium or something even more exotic. Of course, that's way too un-fun and not adventure sci-fi. If the Halo writers wanted to combine Halo lore with contemporary conjecture on AI doomsdays, Cortana should have started mass producing Composer platforms to convert all sentient life in the known galaxy into info-life and importing them all into the Domain, where they can live in a never-ending Utopia...on her terms, of course. Using ancient warships to enforce martial law is just too crude. The Guardians are a decisive strategic advantage, but just not nearly what a superintelligence can get away with.

Also, I'd like to note that according to real-world AI theory, the Smart AI of Halo are not "true" AI. They are emulated minds, i.e., their core architecture is based on high-resolution scanning of human brains that is emulated via powerful software. I know that this is common knowledge amongst us, but I find it interesting that RL researchers do make a distinction between artificial machine intelligence and theoretical full mind emulation.

u/ActiveCarpet · 2 pointsr/ludology

This video examines the history of creativity in game design, the evolution of genres, and how game designers can be creative in the future. It combines Raph Koster's GDC talk about practical creativity with insights from books as varied as Tynan Sylvester's Designing Games and Michael Sellers' Advanced Game Design, to suggest that the key to the future of creativity in video games is understanding our past.


Raph Koster's GDC talk: https://www.youtube.com/watch?v=zyVTxGpEO30

Tynan Sylvester's Designing games https://www.amazon.ca/Designing-Games-Guide-Engineering-Experiences/dp/1449337937

Erin Hoffman's GDC talk, Precision of Emotion: https://www.youtube.com/watch?v=FP-LNRtwpb8

GDC talk, Design in Detail: https://www.youtube.com/watch?v=hJhpMmVLMZQ


There are about 25 other links in the description of the video as well, all pertaining to the history and future of game design.