Reddit mentions: The best ai & machine learning books
We found 3,368 Reddit comments discussing the best ai & machine learning books. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 567 products and ranked them based on the amount of positive reactions they received. Here are the top 20.
1. Code: The Hidden Language of Computer Hardware and Software
- Microsoft Press
Features:
Specs:
Height | 8.9 Inches |
Length | 6 Inches |
Number of items | 1 |
Weight | 1.04 Pounds |
Width | 1 Inches |
2. Design Patterns: Elements of Reusable Object-Oriented Software
Features:
Specs:
Color | White |
Height | 1.04 Inches |
Length | 9.31 Inches |
Number of items | 1 |
Weight | 1.95 Pounds |
Width | 7.62 Inches |
3. Gödel, Escher, Bach: An Eternal Golden Braid
- Basic Books AZ
Features:
Specs:
Color | Black |
Height | 9.25 Inches |
Length | 6.5 Inches |
Number of items | 1 |
Release date | February 1999 |
Weight | 2.34 Pounds |
Width | 1.9 Inches |
4. Artificial Intelligence: A Modern Approach (3rd Edition)
Features:
Specs:
Height | 11.1 Inches |
Length | 9.2 Inches |
Number of items | 1 |
Weight | 4.4 Pounds |
Width | 2.05 Inches |
5. Concrete Mathematics: A Foundation for Computer Science (2nd Edition)
Features:
Specs:
Height | 1.44 Inches |
Length | 9.38 Inches |
Number of items | 1 |
Weight | 2.65 Pounds |
Width | 7.82 Inches |
6. Pattern Recognition and Machine Learning (Information Science and Statistics)
- Springer
Features:
Specs:
Height | 10.2 Inches |
Length | 7.7 Inches |
Number of items | 1 |
Release date | April 2011 |
Weight | 4.73 Pounds |
Width | 1.3 Inches |
7. Design Patterns: Elements of Reusable Object-Oriented Software
Features:
Specs:
Release date | October 1994 |
8. Grokking Algorithms: An illustrated guide for programmers and other curious people
- Manning Publications
Specs:
Height | 9.25 Inches |
Length | 7.38 Inches |
Number of items | 1 |
Release date | May 2016 |
Weight | 0.88 Pounds |
Width | 0.4 Inches |
9. Programming Game AI by Example (Wordware Game Developers Library)
Features:
Specs:
Height | 9.21 Inches |
Length | 6.09 Inches |
Number of items | 1 |
Release date | October 2004 |
Weight | 1.62 Pounds |
Width | 1 Inches |
10. Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems
- O'Reilly Media
Features:
Specs:
Height | 9.19 Inches |
Length | 7 Inches |
Number of items | 1 |
Release date | April 2017 |
Weight | 2.17 Pounds |
Width | 1.29 Inches |
11. Introduction to the Theory of Computation
Features:
Specs:
Height | 9.5 Inches |
Length | 6.5 Inches |
Number of items | 1 |
Weight | 1.65 Pounds |
Width | 1 Inches |
12. Mathematics for 3D Game Programming and Computer Graphics, Third Edition
Specs:
Height | 9 Inches |
Length | 7 Inches |
Number of items | 1 |
Weight | 4.06 Pounds |
Width | 1.5 Inches |
13. Code: The Hidden Language of Computer Hardware and Software (Developer Best Practices)
Specs:
Release date | October 2000 |
14. Superintelligence
Features:
Specs:
Height | 0.5 Inches |
Length | 6.75 Inches |
Number of items | 1 |
Release date | May 2015 |
Weight | 0.22 Pounds |
Width | 5.5 Inches |
15. Superintelligence: Paths, Dangers, Strategies
Features:
Specs:
Height | 6.2 Inches |
Length | 9.3 Inches |
Number of items | 1 |
Weight | 1.5 Pounds |
Width | 1 Inches |
16. On Intelligence: How a New Understanding of the Brain Will Lead to the Creation of Truly Intelligent Machines
- St. Martin's Griffin
Features:
Specs:
Height | 8.25 Inches |
Length | 5.4 Inches |
Number of items | 1 |
Release date | July 2005 |
Weight | 0.54 Pounds |
Width | 1.2 Inches |
17. The Annotated Turing: A Guided Tour Through Alan Turing's Historic Paper on Computability and the Turing Machine
- John Wiley & Sons
Features:
Specs:
Height | 8.8 Inches |
Length | 5.9 Inches |
Number of items | 1 |
Release date | June 2008 |
Weight | 1.13 Pounds |
Width | 0.9 Inches |
18. Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series)
- MIT Press
Specs:
Color | Multicolor |
Height | 9.27 Inches |
Length | 8.25 Inches |
Number of items | 1 |
Release date | August 2012 |
Weight | 4.2 Pounds |
Width | 1.79 Inches |
19. Programming Collective Intelligence: Building Smart Web 2.0 Applications
- O'Reilly Media
Features:
Specs:
Height | 9.19 Inches |
Length | 7 Inches |
Number of items | 1 |
Release date | August 2007 |
Weight | 1.27 Pounds |
Width | 0.9 Inches |
20. Introduction to the Theory of Computation
Features:
Specs:
Height | 9.21 Inches |
Length | 5.71 Inches |
Number of items | 1 |
Weight | 1.65 Pounds |
Width | 0.91 Inches |
🎓 Reddit experts on ai & machine learning books
The comments and opinions expressed on this page are written exclusively by redditors. To provide you with the most relevant data, we sourced opinions from the most knowledgeable Reddit users based on the total number of upvotes and downvotes received across comments on subreddits where ai & machine learning books are discussed. For your reference and for the sake of transparency, here are the specialists whose opinions mattered the most in our ranking.
Game Engine:
Game Engine Architecture by Jason Gregory, the best you can get.
Game Coding Complete by Mike McShaffry. The book goes over the whole of making a game from start to finish, so it's a great way to learn how the engine interacts with the gameplay code. I admit I'm not a particular fan of his coding style, but I have found ways around it. The Boost library adds some complexity that makes the code more terse. The 4th edition made a point of not using it after many readers met with difficulty in the 3rd edition. The book also uses DXUT to abstract the DirectX functionality needed to render things on screen. Although that is one approach, I found that getting DXUT set up properly can be somewhat of a pain, and the abstraction hides really interesting details about the whole task of 3D rendering. You have a strong background in graphics, so you will probably be better served by more direct access to the DirectX API calls. This leads into my suggestion for Introduction to 3D Game Programming with DirectX10 (or DirectX11).
C++:
C++ Pocket Reference by Kyle Loudon
I remember reading that it takes years if not decades to become a master at C++. You have a lot of C++ experience, so you might be better served by a small reference book than a large textbook. I like having this around to reference the features that I use less often. Example:
namespace
{
    // functions and variables declared here have internal
    // linkage, i.e. they are visible only within this file
}
is an unnamed namespace, which is a preferred method for declaring functions or variables with file scope. You don't see this too often in sample textbook code, but it will crop up from time to time in samples from other programmers on the web. It's $10 or so, and I find it faster and handier than standard online documentation.
Math:
You have a solid graphics background, but just in case you need good references for math:
3D Math Primer
Mathematics for 3D Game Programming
Also, really advanced lighting techniques stretch into the field of Multivariate Calculus. Calculus: Early Transcendentals Chapters >= 11 fall in that field.
Rendering:
Introduction to 3D Game Programming with DirectX10 by Frank D. Luna.
You should probably get the DirectX11 version when it is available, not because it's newer, and not because DirectX10 is obsolete (it isn't yet), but because the new DirectX11 book has a chapter on animation, which the DirectX10 book sorely lacks. But your solid graphics background may make this moot for you.
3D Game Engine Architecture (with Wild Magic) by David H. Eberly is a good book with a lot of parallels to Game Engine Architecture, but focuses much more on the 3D rendering portion of the engine, so you get a better depth of knowledge for rendering in the context of a game engine. I haven't had a chance to read much of this one, so I can't be sure of how useful it is just yet. I also haven't had the pleasure of obtaining its sister book 3D Game Engine Design.
Given your strong graphics background, you will probably want to go past the basics and get to the really nifty stuff. Real-Time Rendering, Third Edition by Tomas Akenine-Moller, Eric Haines, Naty Hoffman is a good book of the more advanced techniques, so you might look there for material to push your graphics knowledge boundaries.
Software Engineering:
I don't have a good book to suggest for this topic, so hopefully another redditor will follow up on this.
If you haven't already, be sure to read about software engineering. It teaches you how to design a process for development, the stages involved, effective methodologies for making and tracking progress, and all sorts of information on things that make programming and software development easier. Not all of it will be useful if you are a one man team, because software engineering is a discipline created around teams, but much of it still applies and will help you stay on track, know when you've been derailed, and help you make decisions that get you back on. Also, patterns. Patterns are great.
Note: I would not suggest Software Engineering for Game Developers. It's an ok book, but I've seen better, the structure doesn't seem to flow well (for me at least), and it seems to be missing some important topics, like user stories, Rational Unified Process, or Feature-Driven Development (I think Mojang does this, but I don't know for sure). Maybe those topics aren't very important for game development directly, but I've always found user stories to be useful.
Software Engineering in general will prove to be a useful field when you are developing your engine, and even more so if you have a team. Take a look at this article to get a small taste of what Software Engineering is about.
Why so many books?
Game Engines are a collection of different systems and subsystems used in making games. Each system has its own background, perspective, concepts, and can be referred to from multiple angles. I like Game Engine Architecture's structure for showing an engine as a whole. Luna's DirectX10 book has a better Timer class. The DirectX book also has better explanations of the low-level rendering processes than Coding Complete or Engine Architecture. Engine Architecture and Game Coding Complete touch on Software Engineering, but not in great depth, which is important for team development. So I find that Game Coding Complete and Game Engine Architecture are your go to books, but in some cases only provide a surface layer understanding of some system, which isn't enough to implement your own engine on. The other books are listed here because I feel they provide a valuable supplement and more in depth explanations that will be useful when developing your engine.
tldr: What Valken and SpooderW said.
On the topic of XNA, anyone know a good XNA book? I have XNA Unleashed 3.0, but it's somewhat out of date to the new XNA 4.0. The best looking up-to-date one seems to be Learning XNA 4.0: Game Development for the PC, Xbox 360, and Windows Phone 7 . I have the 3.0 version of this book, and it's well done.
*****
Source: Doing an Independent Study in Game Engine Development. I asked this same question months ago, did my research, got most of the books listed here, and omitted ones that didn't have much usefulness. Thought I would share my research, hope you find it useful.
[2 of 4]
I watched this interview earlier today, so after reading this article, I'm a tad disappointed. Artificial intelligence and a brain-machine interface are two things I'm super interested in, and this particular technology editor wrote one of the crappiest articles I've read about them.
So here is the article, points, counterpoints, the whole shebang.
---
Article
> "Elon Musk smoked pot and drank whiskey on the Joe Rogan podcast..."
He did indeed smoke pot and drink whiskey on the podcast. He had one puff of the pot, and drank one glass of the whiskey. And the pot was near the end. Nothing really serious about this, insofar as I am aware.
> "... and said he's going to soon announce a new "Neuralink" product that can make anyone superhuman."
Outright fabrication. Elon did not remotely say that he's going to soon announce a new Neuralink product that can make anyone superhuman, or suggest that anyone will have anything like that soon.
> "'I think we'll have something interesting to announce in a few months ... that's better than anyone thinks is possible,' the Tesla CEO said on 'Joe Rogan Experience.' 'Best case scenario, we effectively merge with AI.'"
Alright. Those are two actual quotes!
The first quote-- yes, Elon said that he'll have something interesting, possibly, in a few months. Specifically, he says that it is about an order of magnitude better than anyone thinks is possible.
The second quote comes from a mostly unrelated part of the conversation about different ways to counter Artificial General Intelligence, which he considers a real possibility and a potential existential threat to humanity. More on this at the end.
> Musk, whose enterprises include a company called Neuralink, says his new technology will be able to seamlessly combine humans with computers, giving us a shot at becoming "symbiotic" with artificial intelligence.
He does not say this at all in the interview. He suggests that becoming symbiotic with an interface that is like an AI is likely the best way forward for mankind, out of the different options. He goes on to explain, though he doesn't use the term, how an emergent consciousness would work.
> Musk argued that since we're already practically attached to our phones, we're already cyborgs. We're just not as smart as we could be because the data link between the information we can get from our phones to our brains isn't as fast as it could be.
Accurate reporting here, and in the spirit of the actual interview. It doesn't really explain what he means by this, but that'd be a bit much for an article, wouldn't it?
ARTICLE BREAK FOR A QUICK PICTURE IN THE ARTICLE!
> Picture of Elon hitting a blunt
I think it's a blunt, not a spliff. Perfectly alright explaining my thought process if asked.
> "It will enable anyone who wants to have superhuman cognition," Musk said. "Anyone who wants."
I'll have to rewatch the interview to get the exact wording, but I watched it earlier today. I'm pretty confident Elon said 'would', not 'will'. Which doesn't seem like much, but makes a world of difference.
At this point, he is describing what it would be like to have an interface that you could control by thought.
> "Rogan asked how much different these cyborg humans would be than regular humans, and how radically improved they might be."
> "'How much smarter are you with a phone or computer or without? You're vastly smarter, actually,' Musk said. 'You can answer any question pretty much instantly. You can remember flawlessly. Your phone can remember videos [and] pictures perfectly. Your phone is already an extension of you. You're already a cyborg. Most people don't realize you're already a cyborg. It's just that the data rate ... it's slow, very slow. It's like a tiny straw of information flow between your biological self and your digital self. We need to make that tiny straw like a giant river, a huge, high-bandwidth interface.'"
At this point, the cyborg thing is explained a little bit better. The article trims it and changes the order of the interview a bit to make him look like a crackpot idiot, but this part is pretty true to form. It doesn't really give much context around the rest of the conversation that led up to it, the ideas explained before, that sort of thing. But a good paragraph for the article.
> "Musk, who spoke about Neuralink before he smoked pot on the podcast..."
We know he smoked pot.
> "...said this sort of technology could eventually allow humans to create a snapshot of themselves that can live on if our bodies die."
> "'If your biological self dies, you can upload into a new unit. Literally,' Musk said."
This was definitely mentioned as an aside, and as a possibility, by Elon. He did actually explain how it would work. Also, it wasn't a snapshot: people who study this know there is a big difference between a transition and a snapshot, and Elon did not at all imply it was a snapshot; it was spoken of as a transition, which is key. But that distinction isn't really something the average person studies, so of course the article doesn't explain it.
> "Musk said he thinks this will give humans a better chance against artificial intelligence."
> "'The merge scenario with AI is the one that seems like probably the best. If you can't beat it, join it,' Musk said."
The article manages to make this, which is perhaps the most important section of the interview and a terribly important part of humanity, two short lines with no explanation in such a way that makes the person look like an idiot, ignoring everything he otherwise explained.
> "Tesla's stock took a hit after the bizarre appearance and revelations Friday that two Tesla executives are leaving."
Tesla's stock did indeed take a hit. It's an extremely volatile stock with good and bad news constantly. I personally fail to see how it relates to this article, though-- much like a hit of pot and a glass of whiskey.
---
An actual explanation
Elon Musk started a company called Neuralink somewhat recently. It brought together a board of doctors, engineers, scientists, and surgeons, and in particular, people versed in more than one of those fields.
The end goal of Neuralink is to create a low-cost non-invasive brain machine interface (BMI), which would allow you to basically access the internet by thought. Notable is that you would both send and receive messages that your brain could then directly interpret.
With your phone, you can access most of the world's knowledge at your fingertips. The catch with that is that it is a tad slow. You have to pull your phone out, type out words with two thumbs, have pages load slowly, that sort of thing. In this way, you can think of your phone as an extension of yourself, and yourself as a sort of clumsy cyborg.
The company isn't far along. I believe I read somewhere that its current goals center on medical uses. Elon mentioned in the interview that they might have something to announce (not even necessarily a product) in a few months. He also uses one of his favorite phrases: it will be an order of magnitude better than anything currently thought possible (by the general public). It will likely be medical in nature and impressive, but not revolutionary.
Actual success is a long, long way off, and nothing Elon said in the interview suggests otherwise.
So that's the gist of the article. As for the actual interview.
Joe Rogan interviewed Elon Musk on his podcast recently, where they discussed lots of things (The Boring Machine, AI, Neuralink, Tesla, SpaceX-- those sorts of things.)
They spent about three hours talking about things, Elon and Joe had a cup of whiskey, Elon had a hit from a blunt, Joe a few hits-- the entire interview was a pretty casual thing. Not a product announcement, nothing like that.
Not at all like this particular technology editor made it out to be.
And that's about it. I have some links on actually interesting reading for this down below.
---
Some resources!
http://podcastnotes.org/2018/09/07/elon/ - Some notes about the interview, and good summary.
https://www.youtube.com/watch?v=ycPr5-27vSI - The actual interview, a tad long. The AI stuff is the first topic and ends at roughly the 33-minute mark.
https://waitbutwhy.com/2017/04/neuralink.html - Article over Neuralink, explaining the company and goal from pretty simple beginnings. Easy to read, wonderfully explanatory.
https://www.amazon.com/Superintelligence-Dangers-Strategies-Nick-Bostrom/dp/1501227742 - Superintelligence: Paths, Dangers, Strategies. Covers artificial general intelligence, why it is a threat, and ways to handle it. Pretty much the entire goal of Neuralink is based on this book, and it's a very reasonable and quality book.
Unity is the bee's knees.
I've been messing with it casually for several years, and got serious in the last two-ish years. I like it because I get to use C#, and that's the language I know best. The only problem is that it uses some weird limbo version of .NET 2 that's not actually 2.0 but is also 3.0 in some places? I think it's because it uses Mono 2.0, which is some subset of .NET. It's weird. They're moving to 4.5 soon anyway, so I'm hyped for that. It's been a lot of fun regardless; I get to apply a different knowledge and tool set from my day job. Not to mention it feels great when you actually get something to build and actually work.
So anyways here's a list of resources I've found over the years to be super helpful:
Things on Reddit
Books
Videos
Other Things
That's all I can think of right now. I know it's a lot, but hopefully you can take away at least one of these resources.
Happy dev-ing!
You have a long journey ahead of you, but here goes :D
Beginner
C++ Primer: One of the better introductory books.
The C++ Standard Template Library: A Tutorial and Reference: Goes over the standard template library in fantastic detail, a must if you're going to be spending a lot of time writing C++.
The C++ Programming Language: Now that you have a good idea of how C++ is used, it's time to go over it again. TCPPL is written by the language's creator and is intended as an introductory book for experienced programmers. That said, I think it's best read once you're already comfortable with the language so that you can fully appreciate his nuggets of wisdom.
Intermediate
Modern C++ Design: Covers how to write reusable C++ code and common design patterns. You can definitely have started game programming by the time you read this book, but it's something you should have on your reading list.
C++ Templates: Touches on some similar material as Modern C++ Design, but will help you get to grips with C++ Template programming and how to write reusable code.
Effective C++: Practical advice about C++ dos and don'ts. Again, this isn't mandatory knowledge for gamedev, but its advice is definitely invaluable.
Design Patterns: Teaches you commonly used design patterns. Especially useful if you're working as part of a team as it gives you a common set of names for design patterns.
Advanced
C++ Concurrency in Action: Don't be put off by the fact I've put this as an "advanced" topic, it's more that you will get more benefit out of knowing the other subjects first. Concurrency in C++11 is pretty easy and this book is a fantastic guide for learning how its done.
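To give a flavor of that, here is a minimal sketch of C++11 threading (my own illustration, not from the book; the worker count and messages are made up): spawn a few threads, then join them all.

#include <iostream>
#include <thread>
#include <vector>

int main() {
    std::vector<std::thread> workers;
    for (int i = 0; i < 4; ++i) {
        // each lambda runs on its own OS thread; console output may interleave
        workers.emplace_back([i] { std::cout << "worker " << i << " running\n"; });
    }
    for (auto& t : workers) t.join();  // wait for every worker to finish
}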
Graphics Programming
OpenGL: A surprisingly well written specification in that it's pretty easy to understand! While it's probably not the best resource for learning OpenGL, it's definitely worth looking at. [edit: Mix it in with Open.gl and arcsynthesis's tutorials for practical examples and you're off to a good start!]
OpenGL Superbible: The OpenGL Superbible is one of the best ways to learn modern OpenGL. Sadly this isn't saying much; in fact, the only other book appears to be the "Orange Book", and my sources indicate that it is terrible. So you're just going to have to suck it up and learn from the OGL Superbible! [edit: in retrospect, just stick to the free tutorials I've linked above. You'll learn more from them, and be less confused by what is 3rd-party code supplied by the book. Substitute the "rendering" techniques you would learn from a 3D book with a good 3D math book and Real-Time Rendering (links below)]
Essential Mathematics for Game Programmers or 3D Math Primer for Graphics and Game Development: 3D programming involves a lot of math; these books cover topics that OpenGL/DirectX books tend to rush over.
Realtime Rendering: A graphics-library-independent explanation of a number of modern graphical techniques, very useful for teaching you inventive ways to use your newfound 3D graphics talents!
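As a taste of the math those books drill (a minimal sketch of my own; the Vec3 type is not from any of them), the dot and cross products come up constantly in 3D work:

#include <iostream>

struct Vec3 { float x, y, z; };

// dot: projection/angle test; zero means the vectors are perpendicular
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// cross: a vector perpendicular to both inputs
Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

int main() {
    Vec3 up{0, 1, 0}, forward{0, 0, 1};
    Vec3 right = cross(forward, up);  // (-1, 0, 0): perpendicular to both
    std::cout << right.x << " " << right.y << " " << right.z << "\n";
    std::cout << "dot(up, forward) = " << dot(up, forward) << "\n";  // 0: perpendicular
}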
So, when I was younger, I did attend one computer science related camp,
https://www.idtech.com
They have a location at Emory (which I believe I did one year) that was ok (not nearly as "nerdy"), and one at Boston which I really enjoyed (perhaps because I had to sleep on site). That being said, the stuff I learned there was more in the areas of graphic design and/or system administration, and not computer science. They are also quite expensive for only 1-2 weeks of exposure.
I felt it was a good opportunity to meet some very smart kids though, and it definitely led me to push myself. Knowing and talking to people who are purely interested in CS, and are your age, is quite rare in high school. I think that kind of perspective can make your interests and hobbies seem more normal and set a much higher bar for what you expect of yourself.
On the other side of things, I believe that one of the biggest skills in any college program is an openness to just figure something out yourself if it interests you, without someone sitting there with you. This can be very helpful in life in general, and I think was one of the biggest skills I was missing in high school. I remember tackling some tricky stuff when I was younger, but I definitely passed over stuff I was interested in just because I figured "thats for someone with a college degree". The fact is that experience will make certain tasks easier but you CAN learn anything you want. You just may have to learn more of the fundamentals behind it than someone with more experience.
With that in mind, I would personally suggest a couple of things which I think would be really useful to someone his age, give him a massive leg up over the average freshman when he does get to college, and be a lot more productive than a summer camp.
One would be to pick a code-golf site (I like http://www.codewars.com) and simply try to work through the challenges. Another, much more math heavy, option is https://projecteuler.net. This, IMO is one of the best ways to learn a language, and I will often go there to get familiar with the syntax of a new language. I think he should pick Python and Clojure (or Haskell) and do challenges in both. Python is Object Oriented, whilst Clojure (or Haskell) is Functional. These are two very fundamental and interesting "schools of thought" and if he can wrap his head around both at this age, that would be very valuable.
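As one concrete example of what those challenge sites ask, here is Project Euler's first problem (sum of the multiples of 3 or 5 below 1000) solved twice in one file, once imperatively and once as a functional fold. This is a sketch in C++; he'd be working in Python or Clojure, where the same exercise translates directly.

#include <iostream>
#include <numeric>
#include <vector>

int main() {
    // Imperative style: accumulate with an explicit loop.
    long loopSum = 0;
    for (int i = 1; i < 1000; ++i)
        if (i % 3 == 0 || i % 5 == 0) loopSum += i;

    // Functional style: express the same computation as a fold.
    std::vector<int> nums(999);
    std::iota(nums.begin(), nums.end(), 1);  // fill with 1..999
    long foldSum = std::accumulate(nums.begin(), nums.end(), 0L,
        [](long acc, int i) { return (i % 3 == 0 || i % 5 == 0) ? acc + i : acc; });

    std::cout << loopSum << " == " << foldSum << "\n";  // both print 233168
}

Seeing why both styles express the same computation is exactly the "two schools of thought" payoff.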
A second option, and how I really got into programming, is to do some sort of web application development. This is pretty light on the CS side of things, but it allows you to be creative and manage more complex projects. He could pick a web framework in Python (Flask), Ruby (Rails), or NodeJS. There are numerous tutorials on getting started with this stuff. For Flask: http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world. For Rails: https://www.railstutorial.org. This type of project could take a while; there are a lot of technologies that interact to make a web application, but the ability to be creative when designing the web pages can be a lot of fun.
A third, more systems level, option (which is probably a bit more opinionated on my part) is that he learn to use Linux. I would suggest that he install VirtualBox on his computer, https://www.virtualbox.org/wiki/Downloads. He can then install Linux in a virtual machine without messing up the existing OS (also works with Mac). He COULD install Ubuntu, but this is extremely easy and doesn't really teach much about the inner workings. I think he could install Arch. https://wiki.archlinux.org. This is a much more involved distribution to install, but their documentation is notoriously good, and it exposes you to a lot of command line (Ubuntu attempts to be almost exclusively graphical). From here, he should just try to use it as much as possible for his daily computing. He can learn general system management and Bash scripting. There should be tutorials for how to do just about anything he may want. Some more advanced stuff would be to configure a desktop environment, he could install Gnome by default, it is pretty easy, but a lot of people really get into this with more configurable ones ( https://www.reddit.com/r/unixporn ). He could also learn to code and compile in C.
Fourth, if he likes C, he may like seeing some of the ways in which poorly written programs can be broken. A really fun "game" is https://io.smashthestack.org. He can log into a server and basically "hack" his way to different levels. This can also really expose you to how Linux maintains security (user permissions, etc.). I think this would be a much more involved approach, but if he is really curious about this stuff, I think this could be the way to go. In a similar vein, he could watch talks from Defcon and Chaos Computer Club. They both have a lot of interesting stuff on YouTube (it can get a little racy though).
Finally, there are textbooks. These can be really long, and kinda boring. But I think they are much more approachable than one might think. These will expose you much more to the "Science" part of computer science. A large portion of the classes he will take in college looks into this sort of stuff. Additionally, if he covers some of this stuff, he could look into messing around with AI (neural networks, etc.) and machine learning (I would check out scikit-learn for Python). Here I will list different broad topics, and some of the really good books in each. (Almost all can be found for free...)
General CS:
Algorithms and Data Structures: https://mitpress.mit.edu/books/introduction-algorithms
Theory of Computation: http://www.amazon.com/Introduction-Theory-Computation-Michael-Sipser/dp/113318779X
Operating Systems: http://www.amazon.com/Operating-System-Concepts-Abraham-Silberschatz/dp/0470128720
Some Math:
Linear Algebra: http://math.mit.edu/~gs/linearalgebra/
Probability and Stats: http://ocw.mit.edu/courses/mathematics/18-05-introduction-to-probability-and-statistics-spring-2014/readings/
I hope that stuff helps, I know you were asking about camps, and I think the one I suggested would be good, but this is stuff that he can do year round. Also, he should keep his GPA up and destroy the ACT.
The resource seems very extensive, such that it should be plenty to make you a good software engineer. I hope you don't get exhausted by it. I understand that some people can "hack" the technical interview process by memorizing a plethora of computer science and software engineering knowledge, but I hope you pay great attention to the important theoretical topics.
If you want a list of books to read over the summer to build a strong computer science and software engineering foundation, then I recommend to read the following:
The general theme of this list of books is to teach a hierarchy of abstract solutions, techniques, patterns, heuristics, and advice which can be applied to all fields in software engineering to solve a wide variety of problems. I believe a great software engineer should never be blocked by the availability of tools. Tools come and go, so I hope software engineers have strong problem solving skills, trained in computer science theory, to be the person who can create the next big tools to solve their problems. Nonetheless, a software engineer should not reinvent the wheel by recreating solutions to well-solved problems, but I think a great software engineer can be the person to invent the wheel when problems are not well-solved by the industry.
P.S. It's also a lot of fun being able to create the tools everyone uses; I had a lot of fun by implementing Promises and Futures for a programming language or writing my own implementation of Cassandra, a distributed database.
I've posted this before but I'll repost it here:
Now in terms of the question that you ask in the title - this is what I recommend:
Job Interview Prep
Junior Software Engineer Reading List
Read This First
Fundamentals
Understanding Professional Software Environments
Mentality
History
Mid Level Software Engineer Reading List
Read This First
Fundamentals
Software Design
Software Engineering Skill Sets
Databases
User Experience
Mentality
History
Specialist Skills
In spite of the fact that many of these won't apply to your specific job, I still recommend reading them for the insight they'll give you into programming language and technology design.
Your mileage with certifications may vary depending on your geographical area and type of IT work you want to get into. No idea about Phoenix specifically.
For programming work, generally certifications aren't looked at highly, and so you should think about how much actual programming you want to do vs. something else, before investing in training that employers may not give a shit about at all.
The more your goals align with programming, the more you'll want to acquire practical skills and be able to demonstrate them.
I'd suggest reading the FAQ first, and then doing some digging to figure out what's out there that interests you. Then, consider trying to get in touch with professionals in the specific domain you're interested in, and/or ask more specific questions on here or elsewhere that pertain to what you're interested in. Then figure out a plan of attack and get to it.
A lot of programming work boils down to:
As a basic primer, you might want to look at Code for a big picture view of what's going with computers.
For basic logic skills, the first two chapters of How to Prove It are great. Being able to think about conditional expressions symbolically (and not get confused by your own code) is a useful skill. Sometimes business requirements change and require you to modify conditional statements. With an understanding of Boolean Algebra, you will make fewer mistakes and get past this common hurdle sooner. Lots of beginners struggle with logic early on while also learning a language, framework, and whatever else. Luckily, Boolean Algebra is a tiny topic. Those first two chapters pretty much cover the core concepts of logic that I saw over and over again in various courses in college (programming courses, algorithms, digital circuits, etc.)
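As a tiny illustration of why this pays off (my own sketch; the flag names are hypothetical), De Morgan's laws let you rewrite a hard-to-read negated conditional into an equivalent, clearer one:

#include <iostream>

int main() {
    bool isLoggedIn = true;
    bool isBanned = false;

    // Original condition: "not (logged in and not banned)"
    bool deny1 = !(isLoggedIn && !isBanned);
    // De Morgan: !(A && B) == (!A || !B), so this is equivalent and easier to read:
    bool deny2 = !isLoggedIn || isBanned;

    std::cout << std::boolalpha << deny1 << " == " << deny2 << "\n";  // true == true... for any inputs
}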
Once you figure out a domain/industry you're interested in, I highly recommend focusing on one general purpose programming language that is popular in that domain. Learn about data structures and learn how to use the language to solve problems using data structures. Try not to spread yourself too thin with learning languages. It's more important to focus on learning how to get the computer to do your bidding via one set of tools - later on, once you have that context, you can experiment with other things. It's not a bad idea to learn multiple languages, since in some cases they push drastically different philosophies and practices, but give it time and stay focused early on.
As you gain confidence there, identify a simple project you can take on that uses that general purpose language, and perhaps a development framework that is popular in your target industry. Read up on best practices, and stick to a small set of features that helps you complete your mini project.
When learning, try to avoid haplessly jumping from tutorial to tutorial if it means that it's an opportunity to better understand something you really should understand from the ground up. Don't try to understand everything under the sun from the ground up, but don't shy away from 1st party sources of information when you need them. E.g. for iOS development, Apple has a lot of development guides that aren't too terrible. Sometimes these guides will clue you into patterns, best practices, pitfalls.
Imperfect solutions are fine while learning via small projects. Focus on completing tiny projects that are just barely outside your skill level. It can be hard to gauge this yourself, but if you ever went to college then you probably have an idea of what this means.
The feedback cycle in software development is long, so you want to be unafraid to make mistakes, and prioritize finishing stuff so that you can reflect on what to improve.
Good on you for looking to grow yourself as a professional! The best folks I've worked with are still working on professional development, even 10-20 years in to their profession.
Programming languages can be thought of as tools. Python, say, is a screwdriver. You can learn everything there is about screwdrivers, but this only gets you so far.
To build something you need a good blueprint. For this you can study objected oriented design (OOD) and programming (OOP). Once you have the basics, take a look at design patterns like the Gang of Four. This book is a good resource to learn about much of the above
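To make that concrete, here is a minimal sketch of one Gang of Four pattern, Strategy, in C++ (the Compressor example is mine, not from the book): the caller picks an algorithm at runtime through a common interface.

#include <iostream>
#include <memory>
#include <string>

// Strategy: interchangeable algorithms behind one interface.
struct Compressor {
    virtual ~Compressor() = default;
    virtual void compress(const std::string& file) const = 0;
};

struct ZipCompressor : Compressor {
    void compress(const std::string& file) const override {
        std::cout << "zip-compressing " << file << "\n";
    }
};

struct GzipCompressor : Compressor {
    void compress(const std::string& file) const override {
        std::cout << "gzip-compressing " << file << "\n";
    }
};

int main() {
    std::unique_ptr<Compressor> c = std::make_unique<GzipCompressor>();
    c->compress("report.txt");  // the caller doesn't know or care which strategy runs
}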
What parts do you specify for your blueprint? How do they go together? Study up on abstract data types (ADTs) and algorithms that manipulate those data types. This is the definitive book on algorithms, it does take some work to get through it, but it is worth the work. (Side note, this is the book Google expects you to master before interviewing)
How do you run your code? You may want to study general operating system concepts if you want to know how your code interacts with the system on which it is running. Want to go even deeper with code performance? Take a look at computer architecture. Another topic that should be covered is computer networking, as many applications these days don't work without a network.
What are some good practices to follow while writing your code? Two books that are widely recommended are Code Complete and Pragmatic Programmer. Though they cover a very wide range of topics (everything from organizational hacks to unit testing to user design), it wouldn't hurt to check out Code Complete at the least, as it gives great tips on organizing functions and classes, modules and programs.
All these techniques and technologies are just bits and pieces you put together with your programming language. You'll likely need to learn about other tools, other languages, debuggers and linters and optimizers, the list is endless. What helps light the path ahead is finding a mentor, someone that is well steeped in the craft, and is willing to show you how they work. This is best done in person, watching someone design and code. Also spend some time reading the code of others (GitHub is a great place for this) and interacting with them on public mailing lists and IRC channels. I hang out on Hacker News to hear about the latest tools and technologies (many posts to /r/programming come from Hacker News). See if there are any local programming clubs or talks that you can join, it'd be a great forum to find yourself a mentor.
Lots of stuff here, happy to answer questions, but hope it's enough to get you started. Oh, yeah, the books, they're expensive but hopefully you can get your boss to buy them for you. It's in his/her best interest, as well as yours!
Something like Code: The Hidden Language of Computer Hardware and Software may be up your alley.
So might From Nand to Tetris, a course where you build a computer (hardware architecture, assembler, OS, C-like compiler, and programs to run on the OS / written in the compiler) starting with just NAND gates.
At the end of the day though, the way things work is like this: Protocols and specifications.
Everything follows the same published IPO (input, processing, output) standards. Stuff is connected to and registers expected values on expected peripherals. The CPU, motherboard, graphics card, wireless modem, etc. all connect in the right, mostly pre-ordained places on the hardware.
In this vein, there's firmware level APIs for then communicating with all of these at the BIOS level. Although as far as I'm aware, "actual" "BIOS" is no longer used. UEFI is instead: https://en.wikipedia.org/wiki/Unified_Extensible_Firmware_Interface
This is what firmware is / is built on top of. Operating systems build on top of these. Then come system calls: operating systems communicate under the hood and expose some number of system calls that perform low-level actions like talking to devices for things like file access or network I/O. A lot of this stuff is asynchronous / non-blocking, so the OS or system will then have to respond to an interrupt, or continuously poll a register or some other means of getting a response from the device, to see whether an operation completed and what its result was.
Loading the OS is one thing the BIOS is responsible for. This is through the bootstrapping process. The OSs are located at very specific locations on the partitions. In the past, the only command you had enough room for within BIOS / pre-operating system execution was to load your OS, and then the OS's startup scripts had to do everything else from there.
Once you have an operating system, you can ask the OS to make system calls and invoke low-level API requests to get information about your computer and computer system, such as the file system, networks, connected drives and partitions, etc. These calls are usually exposed via OS-specific APIs (think the win32 API) as well as through a command-line interface the OS provides.
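A minimal sketch of what such a call chain looks like from user code, assuming a Unix-like system (the file path here is just an example): each POSIX call below drops into the kernel via a system call.

#include <fcntl.h>    // open
#include <unistd.h>   // read, close
#include <cstdio>

int main() {
    int fd = open("/etc/hostname", O_RDONLY);  // syscall: ask the kernel for a file handle
    if (fd < 0) { perror("open"); return 1; }

    char buf[256];
    ssize_t n = read(fd, buf, sizeof buf - 1);  // syscall: copy bytes from kernel space
    if (n >= 0) {
        buf[n] = '\0';
        printf("hostname: %s", buf);
    }
    close(fd);  // syscall: release the handle
}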
New devices and I/O from/to those devices communicate through firmware, and interrupts, and low-level system calls that are able to communicate with these firmware APIs and respond to them.
Just about anything you can think of - graphics, audio, networking, file systems, other i/o - have published standards and specifications. Some are OS-specific (X windowing system for Linux, DirectX win32 API or GDI on Windows, Quartz on Mac, etc.). Others are vendor-specific but don't seem to be going anywhere (OpenGL, then nVidia vs AMD driver support which varies across operating systems, etc.).
The biggest hardware vendors and specification stakeholders will work with the biggest operating system vendors on their APIs and specifications. It's usually up to device manufacturers to provide OS-compatible drivers along with their devices.
Drivers are again just another specification. Linux has one driver specification. Windows has another. Drivers are a way that the OS allows devices and users to communicate, with the OS as a middle-manager of sorts. Drivers are also often proprietary, allowing device manufacturers to protect their intellectual property while providing free access to use their devices on the OS of your choice.
I'm not an expert in how it all works under the hood, but I found comfort in knowing it's all the same IPO and protocol specifications as the rest of computing. No real hidden surprises, although a lot of deep knowledge and learning sometimes required.
When we get to actually executing programs, the OS doesn't have too much to work with, just the hardware... So the responsibility of slicing up program execution into processes and threads is up to the OS. How that's done depends on the OS, but pretty much every OS supports the concept in some sense.
As far as how programs are multi-tasked, both operating systems and CPUs are pretty smart. Instructions get sent to the chips, batched and divided by them, and the computational results placed into registers and RAM. Again, something I'm not a huge expert in, and it honestly surprised me to find out that the OS is responsible for threading etc. For some reason I always thought this was handled at the chip level.
When you include libraries (especially system / OS / driver libraries) in your code, you're including copies of or references to OS native functions and definitions to help you reference these underlying OS or system calls to do all the cool things you want to do, like display graphics on the screen, or play audio. This is all possible because of the relationship between OS's and device manufacturers and the common standards between them, as well as the known and standard architectures of programs designed for OS's and programs themselves.
Inter-program compatibility is where many things start to become high level, such as serialization standards like JSON or XML, but not always. There are some low-level things to care about for some programs, such as big- vs little-endian. Or the structure of ASM-level function calls.
And then you have things like bytecode, which languages like Java or JavaScript compile to: a system-independent representation of code that most often uses a simple heap or stack to describe things that might instead be register access or a low-level heap or stack had it been written in assembly. Again, just more standards, and programs are written according to specifications and know how to interface with these.
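To show what "a simple stack describing the computation" means, here is a toy bytecode interpreter (entirely my own sketch, far simpler than real JVM bytecode): the program is just numbers, and a loop executes them against a stack.

#include <iostream>
#include <vector>

enum Op { PUSH, ADD, MUL, PRINT, HALT };

int main() {
    // Bytecode computing (2 + 3) * 4 and printing the result.
    std::vector<int> code = { PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT };
    std::vector<int> stack;

    for (size_t pc = 0; pc < code.size(); ++pc) {
        switch (code[pc]) {
            case PUSH: stack.push_back(code[++pc]); break;            // next number is the operand
            case ADD:  { int b = stack.back(); stack.pop_back(); stack.back() += b; break; }
            case MUL:  { int b = stack.back(); stack.pop_back(); stack.back() *= b; break; }
            case PRINT: std::cout << stack.back() << "\n"; break;     // prints 20
            case HALT: return 0;
        }
    }
}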
The modularity of programming thanks to this IPO model and the fact that everything follows some standards / protocols was a real eye opener for me and made me feel like I understood a lot more about systems. What also helped was not only learning how to follow instructions when setting up things on my computer or in my programs, but learning how to verify that those instructions worked. This included a lot of 'ls' on the command-line and inspecting things in my debugger to ensure my program executed how I expected. These days, some might suggest instead using unit tests or integration tests to do the same.
Book suggestions? Now that's my jam.
Out of all the books i've read, here are my recommendations regarding game programming:
Eric Lengyel's books (only one out so far). This series is aimed at game engine development, and if the 2nd volume onward is as in-depth as the first, they will be amazing fundamental knowledge. Also, they're not thick, and they're jam-packed with information.
Game Programming Patterns. The only book that comes more recommended than this is the one right below it by Jesse Schell. This book is fantastic, but you should write one or two small games to really get the most out of this book. You can also read it online on his website free, but then you don't get a pic of him and his dog on the back cover.
Book of Lenses. This is your intro/intermediate dive into game design. There are a lot of game design books, if you only read one, it should be this one.
Game AI by Example. This book is a hodgepodge of fantastic techniques and patterns from those in AAA. There are other books in the space (like Game AI Pro) which are similar, but in my opinion (at least when I read AI Pro 3), they're not as good. But more knowledge is never bad.
Truthfully, as I sit here looking over all my books, those are the only ones I'd consider mandatory for any seasoned developer. Of course plenty of developers get by without reading these books, but they likely pick up all the principles listed herein elsewhere, in bits and pieces, and would likely have benefited from reading them early on.
Here are a few others that I do recommend but do NOT consider mandatory. Sorry, no links.
Unity in Action. Personally, I recommend this or a more interactive online course version (udemy.com/unitycourse) if you want to learn Unity while having a resource hold your hand. Having read the book, taken the course, AND taken Unity's own tutorials on the matter, I'd rank them: the course best, the book second, the videos from Unity third. But none of them are bad.
Game Engine Architecture. This is the king for those who want a very broad introduction to making a game engine. It comes highly recommended by nearly anyone who reads it, so long as you understand it's from a AAA point of view. Game Coding Complete is out of print and unlikely to be revisited, but it is similar. These are behemoths of books.
Realtime Rendering. This is one I haven't read, but it comes very highly recommended. It is not an intro book, and it's also over 1000 pages, so you want this alongside a more introductory book like Fundamentals of Computer Graphics. Truth be told, both books are used in third- and fourth-year university courses, so keep that in mind before diving in.
Clean Code. Yeah, yeah, it has a Java expectation, but I love it. It's small. Read it if you understand Java and want to listen to one of the biggest preachers on how not to write spaghetti code.
The RimWorld guy, Tynan Sylvester I believe, wrote a book called Designing Games. I enjoyed it, but IMO it doesn't hold a candle to Jesse Schell's book. Either way, the guy wrote that book after working in AAA for many years, then went on to create one of the most successful sim games in years. But yeah, I enjoyed it.
Last but not least, here are some almost ENTIRELY USELESS but interesting diagrams of what some people think you should read or learn in our field:
https://github.com/miloyip/game-programmer
https://github.com/utilForever/game-developer-roadmap
https://github.com/P1xt/p1xt-guides/blob/master/game-programming.md
I have some advice for you. I'm speaking more from the position of someone who didn't do a lot of these things and regrets it than someone who can say for sure what you need to do, but I still think I have some helpful advice:
First and foremost, take a little time to enjoy your last summer before going off to university. I'm not saying you shouldn't also learn some CS, but honestly if you're the type of person who knows how to study and is willing to go to class and put in the work you already have a "head-start" on a good percentage of your classmates. And frankly, life gets a lot more stressful after high school. Enjoy yourself. You're right that AP courses aren't representative of the college experience, but your first year's coursework isn't going to be too scary. If you can handle AP, you can handle your freshman level coursework. Go to class, take good notes, and be an active participant in your courses, and then study afterwards. That's 90% of college success in the classroom. Doing well in the classroom isn't the only thing you need to worry about though (more below).
If you'd like a little summer reading, I personally found this book really enjoyable. Picked it up during my freshman year IIRC: https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_1?ie=UTF8&qid=1541924346&sr=8-1&keywords=code
It is both very informative and a fun read.
If you've taken AP Comp. Sci. then I'm assuming you can code a bit. If you want to do some programming I'd recommend sticking to what you know and deepening your understanding of it or trying out the language that your university starts you off with. They might even be the same language. Don't worry about learning multiple languages yet. You will likely be introduced to several languages in undergrad. Focus on the fundamentals.
Get connected at your university. Universities have a lot of resources for people interested in research. For example, my university had an undergraduate research program. They also had a student success center with people equipped to help you develop a plan (which is what you're trying to do now) to get to where you want to be, including graduate school and beyond. Be an active participant in your CS classes. You're going to have some shitty professors but you'll likely meet some really cool people too, who are passionate about what they do and who have the connections to help you if you show some initiative. If it's a larger school there will also be plenty of student organizations for you to check out, both explicitly CS-related and basically anything you can imagine. Don't be afraid to check out non-CS stuff. See what your university has to offer. If you don't know where to start, check out the website and once you get on campus seek out the equivalent of what I called the student success center. They will be familiar with all of the programs and services that your university offers at a broad level. Should be pretty similarly named. Also, ask your professors! At my university student organizations had faculty advisors and even if your professor isn't that person they will probably know who is.
Final note: while you should be active, have a plan, and get involved, don't try to do too much at once. You will burn out. You need to be honest about your limits. It's healthy to push yourself, but only if you are mindful of what you can handle. And if you find yourself struggling, do not be afraid to seek out academic coaching. If you have trouble adjusting to the university experience (very common and nothing to be ashamed of) your university might also offer access to therapists. Finally, if you have a diagnosed disability, make sure to register with disability services to receive accommodations!
I'm not sure what your background is, but if you haven't had any formal programming education, I believe we learned out of a book like this: https://www.amazon.com/dp/0201633612/?tag=stackoverfl08-20
One of the keys to good software is good design. Making a plan of action before you even start coding cuts back on the 'quick fix' solutions that make your code harder to work with later.
I don't think this is C# specific, but design is really something that is abstract from language specifics. If you're looking for something specific to the environment you're working in, I think material on best practice for the game engine you're using would be better. I have done application development with C# and C# scripts in Unity, and there are definitely differences in how I am able to make things interact, which impacts how I design my code.
I have two recommendations if you're wanting to expand your abilities in design:
As far as the book goes, I can't condone it, but there's probably PDFs out there. Sometimes it helps motivate to actually read through it if you've invested some money into it, though. I would recommend finding an old/used copy. An old version of the book should work just as well as an 'updated' version.
You may also find it useful to look into Agile/Scrum. It's all about documenting your development, and helps to organize what's been done and what needs to be done, as well as give you an idea of how long things take, which helps later with estimations. All these things are skills that will come in handy later, if you decide to pursue software as a career. Plus, it's all good habits that help facilitate good, clean code.
While being a self-taught sysadmin is great, learning the internals of how things work can really extend your knowledge beyond what you may have considered possible. This starts to get more into the CS portion of things, but who cares. It's still great stuff to know, and if you know this you will really be set apart. I'm not sure if it will help you directly as a sysadmin, but it may quench your thirst. I'm both a programmer and Unix admin, so I tend to like both. I own or have owned most of these and enjoy them greatly. You may also consider renting them or just downloading them. I can say that knowing how things operate internally is great; it fills in a lot of holes.
OS Internals
While you are obviously successful at running and maintaining Unix-like systems, how much do you know about their internal functions? While reading source code is the best method, some great books will save you many hours of time and will be a bit more enjoyable. These books are amazing:
The Design and Implementation of the FreeBSD Operating System
Linux Kernel Development
Advanced Programming in the UNIX Environment
Networking
Learning how networking actually functions at the code level is really interesting. There's a whole other world below the implementation. You likely know a lot of this.
Computer Networks
TCP/IP Illustrated, Vol. 1: The Protocols
Unix Network Programming, Volume 1: The Sockets Networking API
Compilers/Low Level computer Function
Knowing how a computer actually works, from electricity to EE principles, through assembly to compilers, may also interest you.
Code: The Hidden Language of Computer Hardware and Software
Computer Systems: A Programmer's Perspective
Compilers: Principles, Techniques, and Tools
> Fast forward into the future... our system speaks very well. Fluent English. There is no self. There is no awareness. But it's an incredibly effective chatbot. The best ever created. It learned from scratch. Naturally. No programming involved, other than the basic conditions for the neural network to start developing.
You got your order backwards. That's impossible without context/experience to go with it. You can say, "Oh, well it's taking in a bunch of data, that's its context," but at that point it is already identifying self and other in order to build context.
Like a baby, it identifies self before it can speak well.
Have you read any of Douglas Hofstadter's work? You would really like his writing. It's all about this sort of stuff. His least popular book (that I wouldn't recommend as a starting read) is Le Ton beau de Marot which explains the difficulty of translating language without context, and is surprisingly relevant to the struggles of Google Translate and the like.
>Gradually it starts developing a self. How wouldn't this happen? Learning how to speak is a process tightly related to learning how to think. Can you imagine thinking without language? Can you imagine fully-fledged human-like communication without some basic underlying thinking? Can you imagine being in love without language? Can you imagine getting attached to your girlfriend, or worrying about death, without language? Take a moment to picture that. A language-less mind.
Yes, I do it all the time, though it depends on what you call language. Most of my thoughts are not linguistic, which imho is probably why I struggle with English so much.
I've also written AI that pattern matches visual information, like charts of data, mostly for the stock market. So actually, yes, I do know exactly what you mean.
>More complex hierarchies were built. That is how human thinking emerges. Slowly, over the years. Developing such a deep level of understanding of language that it can encode complex thoughts and emotions.
Yes abstractions and recursion. It's not that complicated.
>Back to our system. It's proficient in a language, not hard-coded; but self-taught. It developed not only great grammar and vocabulary, but great communication skills.
I have horrible communication skills. Can you teach me how to be a better communicator?
>You think you can encode emotions without language? Well, sure. But I bet you've never thought of the concept of "Doing your very best effort, to the point of challenging your very own mental and physical limits, going beyond what you thought you were capable of, to push yourself forward and improve" in one single word. You might have thought about this, but... in a single word? Well... you have, if you speak Finnish and know the word "Sisu". Non-Finnish-Speakers lack this. The same applies to tons of languages.
Ahh, but did you ever think about boiling that concept down into a single word itself? (The concept of boiling down concepts.)
There is this thing called Domain-Driven Design in the software architecture world. One of its key aspects is called "Ubiquitous Language". It's kind of the heart of the business's terminology (though technically incorrect from a historical perspective). The idea is that you make up a word that represents a concept and then casually use it in the workplace in such a way that the sales people, managers, and engineers all use it. This way everyone casually syncs up to the same terminology, bridging communication between individuals of different backgrounds.
Sadly, ubiquitous language is often used as an ego boost by senior engineers to maintain seniority. Please, if you ever use this technique, explain how it works to the juniors clearly, so they are kept in the loop. Too often seniors will use it as a way to choose who is in the loop, so please remember: with power comes responsibility. We could use more kindness in the engineering discipline, or at least out here in Silicon Valley.
>This system I've described in this whole post is actually a human being.
Pretty awesome write up.
You're pretty smart. Are you on Hacker News by any chance? Oh, and totally check out GEB, you'll like it.
Heh, sure.
A lot of people are fans of Code Complete. I tried reading it after being in industry for a decade, and I found it to be very dry and boring. The general consensus from people that I've talked to is that it's more useful when you're just starting out. Maybe I just came to it too late.
A better book (in my opinion) in that same vein is Clean Code. Clean code is shorter, more focused, and has better real-world examples. It feels less "complete" (hue hue) than Code Complete, but to me, that's a strength. As a quick point of comparison: Code Complete devotes 32 pages to the chapter on identifier naming; Clean Code devotes just 14.
I got a lot out of Design Patterns. I seem to recall that the pattern fad was in full swing back when I read this in 2005-ish. I think I had independently discovered some of the patterns already at that point, but this book helped me to codify those ideas and also showed me some new ones. Some of these patterns are now seen as antipatterns (I'm looking at you, Singleton!), and all of the patterns have an object-oriented bias. But there's still something useful in the pattern language, and this book is a reasonably comprehensive start. The book is somewhat dry, and some people report that Head First Design Patterns is a gentler and friendlier introduction. Head First Design Patterns hits the essential patterns, but misses a lot of the less popular ones.
Eventually, you'll need to work in a codebase with some technical debt. Maybe it's debt that somebody else put there, or maybe it's debt that you introduced. Working Effectively with Legacy Code is still my go-to recommendation. It defines technical debt as code that is not under test, it introduces the idea of "seams" that you can use to pry apart code that's too tightly coupled, and it then provides a cookbook of specific scenarios and reasonable approaches.
If you're looking for thought-provoking videos, I recommend anything by Rich Hickey. I don't know if I've watched all of those, but I remember good things about Hammock Driven Development and especially Simple Made Easy.
Get comfortable with a source control system. I didn't use source control in college, since it wasn't needed for any classes, and that was a missed opportunity. The whole world loves Git, so you'll probably want to learn it if you haven't already. But I'll also toss out a recommendation for Mercurial. I haven't used it in years, but I remember finding it to be quite good.
Good luck!
Physicist here, so don't pretend I don't know what science is. (Though, like with the ancient Pythagoreans, I'm sure that as soon as I discuss something proven that goes against a purely scientific worldview, out come the pitchforks.) And though I love science, unlike some people here I am willing to admit the limits of science. Science can lead to all truth in the same way that rational numbers define all numbers: it can't, and Gödel proved it.
The real problem with science is that Gödel mathematically proved there are more things that are true than are provable, and thus you can't ever have a scientific theory that can determine the truth or falsity of all things. As soon as you write down such a theory, assuming it allows for arithmetic, Gödel's incompleteness theorem immediately shows that if the theory is true there will be true statements about reality that are beyond provability. Read Gödel, Escher, Bach or Incompleteness, or work through it yourself in this textbook as I have.
So like I said above, science is great in its sphere (and in that sphere, let me emphasize, it is awesome!) but leads to all truth in the same way that rational numbers lead to all numbers. (And the analogy is precise, since Gödel used the famous diagonalization argument in his proof.) Russell and Whitehead set out in the early 1900s to show that if we could determine the axioms of reality, we could then work out through logic everything that was true; Gödel spoiled the party.
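(For anyone who hasn't seen the diagonalization being invoked here, a compact sketch of Cantor's construction; this is standard material, not anything specific to this thread. Suppose the reals in (0,1) could be listed as r_1, r_2, r_3, ... with decimal digits r_i = 0.d_{i1}d_{i2}d_{i3}... Then flip the diagonal:

```latex
x_i = \begin{cases} 5 & \text{if } d_{ii} \neq 5 \\ 6 & \text{if } d_{ii} = 5 \end{cases}
\qquad \text{where } r_i = 0.d_{i1}d_{i2}d_{i3}\ldots
```

The number x = 0.x_1x_2x_3... differs from every r_i at digit i, so no list of reals can be complete, while the rationals can be listed. Gödel's proof runs a similar self-reference trick on provable statements instead of digits.)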
It would be one thing if these truths were trivial, but they are not. Some examples of true-or-false statements that may fall into this category of being unprovable are:
Now, at this point critics almost always tell me: but Joe, Gödel's incompleteness theorem is only relative to your set of logic. (I.e., we can prove Goldbach by just adding the axioms needed to do so.) Fine. But two things: first, adding axioms willy-nilly to prove what you want is not good science. Second, you now have a new set of axioms, and by Gödel's theorem there is now a new uncountable set of things that are true (including non-trivial things like I listed) that are beyond proof.
Now usually comes the second critique: but Joe, this doesn't prove God exists. And this is true. But at least it has been proven that God gives you a chance: an oracle machine is free from the problems that hold science and logic back from proving the truth of all things. At least something like God gives you a chance, whereas science falls short.
Or, as Elder Maxwell says so well: it may only be by the "lens of faith" that we can ever know the truth of all things. He may be right, hence the importance of learning by study, and also by faith...
You have to be above the bar to begin with. If you can understand exactly what intelligence is then you can increase it.
Meditation can be used as a way to gain insight. Not all types of meditation, but there are definitely types with the goal of enlightenment in mind. Using the Buddhist definition of enlightenment, the overly simplified explanation is insight, specifically the lower-level type of insight that not everyone can get to and that for the most part needs to be unlocked. Once it is unlocked, how one utilizes it can be a large intelligence booster, but you have to be able to comprehend how your mind works. If you can't fully recognize a lot of advanced and abstract concepts, then knowledge gain is possible but hardly any intelligence gain.
Using the example you mention, math is utilized on the other part of the brain in such a way that you can multitask while solving advanced math problems. One way this can be figured out is solving math problems in your sleep. It is like a piece of your brain is a math coprocessor that can chug along while you are talking to someone, reading, writing, sleeping, or generally not paying attention to it, much like cooking something in the oven.
It depends what you want to learn. The most direct path is raw insight. For advanced logic, paradoxes, and other mathy nerdy stuff you might want to check out GEB. Meditation doesn't skip the learning step; you still have to learn things the same way everyone else does. Meditation just helps you realize you can utilize your brain to a fuller potential.
If you are really interested and think you can push forward, I highly recommend you try a 300µg+ dose of LSD. Tripping is the same thing as a deep meditation state, but it doesn't stay; it is like driving a car over the mountain instead of walking. In a deep state under the influence you can do all of the more insightful things one can do in a deep meditative headspace. However, figuring it out could take multiple trips, as sometimes insight will take 6 hours to come full circle. When meditating in a deep headspace the answer can come much quicker.
The idea is that if you can figure it out while tripping, then you can remember what you've learned and migrate it into meditative practice, as it can literally take a lifetime to reach through meditation the level that one night of dropping acid will bring you to.
It is definitely possible. But if you don't ask very specific, detailed questions about how your brain works, I will not be able to explain in detail, and without asking yourself you can't move toward figuring things out either.
An efficient way to get to a deep headspace from meditation is to have a map, so you have an idea of which direction to go in. This tends to be pretty good.
tl;dr version:
Quick background to validate the above/below: I was a 30y/o banquet manager when I decided to change careers. I had no prior experience [unless you want to count a single programming class I took in high school] but did get a job in tech support at a medium size startup while I was in school and wrote a couple apps for our department. Just before I graduated I started working at a primarily Google & Mozilla funded non-profit as their sole software engineer. I moved on after a little over two years and am now a software engineer at VMware.
Two books I'd suggest reading are The Pragmatic Programmer and Code: The Hidden Language of Computer Hardware and Software. Pragmatic Programmer is one of those classics that every good dev has read (and follows!). Code is great at giving you some insight into what's actually happening at a lower level - though it gets a bit repetitive/boring about halfway through so don't feel bad about putting it down once you reach that point.
The best thing you can do to help you land a job is have some open-source side-projects (ideally on GitHub). Doesn't have to be anything major or unique - but it will help a lot for potential employers to see what your code looks like.
A famous artefact of early computing is the boot-strapping process, where the goal is a self-hosting compiler, which lets you write the compiler for a new language in the new language itself. However, to get to that point a lot of earlier innovations were needed.
Take all of this with a pinch of salt; the order and the details may be wildly inaccurate, but the overall ideas, viewed from afar, give an idea of how we got to the point where we can choose our own language to write a compiler for another language.
To start with, raw binary values had to be set in order to define and run a program. Those raw binary values represent instructions that tell the hardware what to do and data that the program needs to operate on. This is now usually referred to as machine code.
At first you would enter values into computer storage using switches.
Since that's so tedious and error prone, punched cards were developed, along with the necessary hardware to read them, so you could represent lots of values that could be read together. They had their own problems, but it was a step forward from switches.
After some time, symbolic instructions were defined as shortcuts for machine code instructions, in what is now usually called assembly language. For example, "store the value 8 into memory location 58" could be written as ST 8, [58]. This might take 3 machine code words: one representing the store instruction, one the value 8, and one the location 58. Since assembly language could be written down, it was easier to understand what the computer was being instructed to do. Naturally, someone had the bright idea to make the conversion automatic, so that, for example, you could write the instructions by hand, create punched cards representing them, convert them to machine code, and then run the program. The conversion from symbolic instructions to machine code was handled by a program called an assembler; people still write programs in assembly code and use assemblers today.
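To make that concrete, here is a minimal sketch of what an assembler fundamentally does: map symbolic instructions onto raw machine words. (The mnemonics and opcode numbers below are invented for illustration, just like the ST 8, [58] example above; Python is used purely for brevity.)

```python
# A toy assembler: translate symbolic instructions into "machine code".
# The mnemonics and opcode numbers are made up for illustration.
OPCODES = {"LD": 0x01, "ST": 0x02, "ADD": 0x03}

def assemble(lines):
    """Turn lines like 'ST 8, [58]' into a flat list of machine words."""
    words = []
    for line in lines:
        mnemonic, _, operands = line.partition(" ")
        words.append(OPCODES[mnemonic])                 # one word for the instruction
        for op in operands.split(","):
            words.append(int(op.strip().strip("[]")))   # one word per operand
    return words

print(assemble(["ST 8, [58]"]))  # -> [2, 8, 58]: the three machine words described above
```

A real assembler also resolves labels, checks operand counts, and emits binary rather than a Python list, but the core translation step really is this mechanical.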
The next logical step was to make the symbolic instructions more useful: less aimed at the mundane, physical processes that tell the computer exactly how to operate, and more friendly for people to represent ideas. This is really the birth of programming languages. Since programming languages allowed you to do more abstract things symbolically, like saving the current instruction's location and branching off to another part of the same program to return later, the conversion to machine code became more complex. Those conversion programs are called compilers.
Compilers allow you to write more useful programs: for example, the first programs that let you connect a keyboard for entering numbers and characters, print them on an attached device, and later display them on another device like a screen. From there you are quite free to write other programs. More languages and their compilers developed that were more suitable for representing abstract ideas like variables, procedures, and functions.
During the whole process, both hardware (the physical electronic machines and devices) and software (the instructions to get the machines to do useful work) were developed side by side, and that process still continues.
There's a wonderful book called Code by Charles Petzold that details all of these developments, but properly researched and accurate.
...continued...
> Test plans - When you apply for QA roles, you'll almost certainly be asked "how would you test ____?". The correct answer is to be methodical. Don't just spew out a stream of test cases as you brainstorm them. Understand the different scopes (unit, functional, integration, maybe end-to-end), what the goals of each are, and how they differ. Understand that there are different areas of testing like boundary, happy path, special cases (null, " ", 0, -1), exceptions, localization, security, deployment/rollback, code coverage, user-acceptance, a/b, black box vs white box, load/performance/stress/scalability, resiliency, etc. Test various attributes at the intersection of a component and a capability (borrowed from the book How Google Tests Software), and I believe you can see a video that goes into this called The 10 Minute Test Plan. Understand how tests fit into your branching strategy - when to run bvts vs integration vs regression tests.
> Test methodologies - Understand the tools that make you an efficient tester. These include data-driven tests, oracles, all-pairs / equivalence classes, mocking & injection, profiling, debugging, logging, model-based testing, emulators, harnesses (like JUnit), fuzzing, dependency injection, etc. (See the pytest sketch just below for a small taste.)
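To make the "special cases" and data-driven ideas from the two items above concrete, here's a minimal sketch in Python with pytest. Note that normalize_name() is a hypothetical function under test, not something from the books mentioned:

```python
# Data-driven test sketch: one parametrized test sweeps the classic
# special cases (None, empty, whitespace) plus a messy happy path.
import pytest

def normalize_name(raw):
    """Hypothetical function under test: tidy up a user-entered name."""
    if raw is None or not raw.strip():
        return ""
    return " ".join(raw.split()).title()

@pytest.mark.parametrize("raw, expected", [
    (None, ""),                          # null input
    ("", ""),                            # empty string
    ("   ", ""),                         # whitespace only
    ("ada  lovelace", "Ada Lovelace"),   # happy path with messy spacing
])
def test_normalize_name(raw, expected):
    assert normalize_name(raw) == expected
```

The table of cases is the "data" in data-driven: adding a new boundary case is one line, not a new test function.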
> Test frameworks - Knowing all the tests you need to write is good, but then you have to write them. Don't do all of them from scratch. Think of it as a system that needs to be architected so that test cases are simple to write, and new functionality is easy to implement tests for. I can't recommend any books for this because it's something I learned from my peers.
> Test tools - Selenium / WebDriver for web ui, Fiddler for web services (or sites), JUnit/TestNG, JMeter (I have to admit, I don't know this one), integration tools like Jenkins, Github/Stash, git/svn.
> System design - As you're entry-level, this may not be a huge focus in an interview, but know how to sensibly design a system. Know which classes should be used and how they interact with each other. Keep in mind that the system may evolve in the future.
> Whiteboarding - Practice solving problems on a whiteboard. The process is more than just writing the solution, though. This is the process I follow (based loosely on the book Programming Interviews Exposed):
Resources:
> Learning to test:
> Learning to interview:
> Learning to program:
> Miscellaneous
> What sort of skills should I really hone? I realize I gave you a ton of stuff in this post, so here's a shorter list:
> Examples of projects that make you look valuable
Books on project management, software development lifecycle, history of computing/programming, and other books on management/theory. It's hard to read about actual programming if you can't practice it.
Some of my favorites:
You can't exactly learn to program without doing, but hopefully these books will give you good ideas on the theory and management side and the best understanding for when you get out. They should give you an approach many here don't have: realizing that programming is just a tool to get to the end, and that you can know, before you even touch any code, how to best organize things.
If you have access to a computer and the internet, look into taking courses on Udacity, Coursera, and edX. Don't go to or pay for any for-profit technical school, no matter how enticingly their marketing tells you you'll be a CEO out of their program.
Computer scientist here... I'm not a "real" mathematician, but I do have a good bit of education and practical experience with some specific fields like probability, information theory, statistics, logic, combinatorics, and set theory. The vast majority of mathematics, though, I'm only interested in as a hobby. I've never gone much beyond calculus in the standard track of math education, so I tend to enjoy reading "layman's terms" material about math. Here's some stuff I've enjoyed.
Fermat's Enigma. This book covers the history of a famous problem that looks very simple, yet took several hundred years to resolve. In so doing it gives layman's-terms overviews of many mathematical concepts, in a manner very similar to jfredett here. It's very readable, and for me at least, it also made the study of mathematics feel even more like an exciting search for beautiful, profound truth.
Logicomix: An Epic Search for Truth. I've been told this book contains some inaccuracies, but I'm including it because I think it's such a cool idea. It's a graphic novelization (seriously, a graphic novel about a logician) of the life of Bertrand Russell, who was deeply involved in some of the last great ideas before Gödel's Incompleteness Theorem came along and changed everything. This isn't as much about the math as it is about the people, but I still found it enjoyable when I read it a few years ago, and it helped spark my own interest in mathematics.
Lots of people also love Gödel, Escher, Bach. I haven't read it yet so I can't really comment on it, but it seems to be a common element of everybody's favorite books about math.
They seem like a reasonable starting point, I think. Repetition is the mother of mastery; the more books the better (in addition to applying what is learned).
Since Mosh calls out learning fundamentals as important to becoming a good C# developer, I would personally also recommend some general (non-C#-specific) books for those who are starting out in software development:
There's a ton more, but those are a few that stood out to me. Essentially the more the merrier in my opinion - books, courses, videos, tutorials, and so on. The books I'm recommending here focus on adopting the developer mindset and being successful at it. That's part of the puzzle.
The other part is understanding the technical details including the programming language and frameworks you intend to use.
And finally, for learning about C#, I do highly recommend Mosh's videos/courses (some are free on YouTube, others available on Udemy). He's got a unique ability to explain things clearly and simply in a way that beginners can pick up quickly.
What I'd do is check out his free content first, and if you agree his style is ideal for learning, an investment in one of his courses is well worth it since he'll cover a lot more breadth and depth on each of the topics and they're organized into a super consumable package rather than scouring the internet for various topics.
I am not totally sure what you are asking for actually exists in book form...which is odd, now that I think about it.
If it were me, I would think about magazines instead. And if you really want to push him, think about the following options:
If you insist on books...
I see you already mentioned A Brief History of the Universe, which is an excellent book. However, I am not sure you are going to get something that is more "in depth." Much of the "in depth" stuff is going to be pretty pop, without the rigorous foundations usually found in textbooks.
If I had to recommend some books, here is what I would say:
Hope that helps! OH AND GO WITH THE SUBSCRIPTION TO NATURE
edit: added the linksssss
I can't speak to Schildt's competence on Java. I do know about his C books, which are pretty much universally recommended to be avoided. Bruce Eckel, on the other hand: I've heard only good things about his materials (although I didn't really like his design patterns book very much). I've never read the two books you've mentioned, though.
Have you tried the official tutorials for learning Java? They're very good IMO. They're freely available too.
My first book on Java was The Java Programming Language (it teaches Java 5; the current version of Java is 8). Aside from the fact that you'd be learning Java 5, which is still fully applicable, this book is very good. One of the authors is the creator of Java; another is Guy Steele, a programming-languages expert whose books I believe are worth reading just because he wrote them. He's well above average, and also one of the creators of Scheme. Look him up on Wikipedia =D.
I've read Core Java too (it has pretty up to date editions). I found it good, which is a win on its own since most learning sources are terrible IMO, but I didn't find anything particularly interesting about it. It does cover a lot of ground, though. I surely recommend it.
A lot of writing good java code is about understanding the usual patterns of which people make use.
The author of Core Java has a book on this (http://www.amazon.com/Object-Oriented-Design-Patterns-Cay-Horstmann/dp/0471744875/). I've never read it, but I'd guess it's good. I don't know how advanced it is though.
You can, of course, always look up the Design Patterns book (http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/). I'd not recommend reading it before learning some Java, though. It's not a particularly challenging book, but I think it'll make a lot more sense once you can read its code examples (mostly in C++, some in Smalltalk) and understand what they do. You don't really need to know C++ for that, honestly; the code doesn't use any (IMO) advanced features of C++, and knowing Java plus common sense is enough to understand what's in there.
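If you want a quick taste of what the book's patterns feel like before committing, here is a tiny Strategy-pattern sketch. This is my own toy example in Python, not code from the book (whose examples are C++ and Smalltalk):

```python
# Strategy pattern in miniature: swap an algorithm at runtime by coding
# against a shared interface rather than a concrete implementation.
import zlib
from abc import ABC, abstractmethod

class Compression(ABC):
    @abstractmethod
    def compress(self, data: bytes) -> bytes: ...

class ZlibCompression(Compression):
    def compress(self, data: bytes) -> bytes:
        return zlib.compress(data)

class NoCompression(Compression):
    def compress(self, data: bytes) -> bytes:
        return data

class Archiver:
    def __init__(self, strategy: Compression):
        self.strategy = strategy            # the interchangeable "strategy"

    def archive(self, data: bytes) -> bytes:
        return self.strategy.compress(data)

print(len(Archiver(ZlibCompression()).archive(b"hello" * 100)))  # small
print(len(Archiver(NoCompression()).archive(b"hello" * 100)))    # 500
```

Archiver never changes when a new compression scheme shows up; that decoupling is the whole point of the pattern.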
There are many books on better using java. If you google for good java books, you'll find plenty of reviews, recommendations, and so forth. You can search amazon too.
By the way, a lot of the programming techniques for writing good code in one language can be learned by studying materials in other languages. For example, I owe much of my programming basis knowledge to K&R2 (The ANSI C Programming Language - 2e), SICP (Structure and Interpretation of Computer Programs), The Little Schemer, and The Seasoned Schemer. These books teach in C and Scheme (two languages that I probably will never use professionally), but a lot of what they teach I've been able to apply while coding in C++, Java, JavaScript, PHP, SML, Python, and also other languages I've used in the past.
Good luck.
As one of the people who commented on that thread, I feel the need to respond to this as rationally as humanly possible.
For starters, let's clear up the difference between fractal mathematics, fractal woo, and what Douglas Hofstadter might call fractal analogy.
>“Everything big is just like everything small!” they exclaim, “the universe is self-similar!”
...and then using such logic to justify whatever silly energy-Reiki-mystical-connectedness-telepathy-du-jour they want.
As well, Chaos Theory - the study of how immensely complex patterns emerge from seemingly simple preconditions - is full of fractal mathematics. Given that the universe is absolutely packed with iterated functions and self-similarity almost everywhere we look, I think you can absolutely take the point of view that the universe is fractal in nature, especially when you are in a self-induced state where your brain makes a lot of connections you might normally overlook or not even bother to think about.
My point is that discussing things in the universe as self-similar is useful to mathematicians and non-mathematicians alike; using the word "fractal" to describe natural systems that exhibit those familiar patterns might not be perfectly correct, but it's not itself offensive or an affront to reasonable discourse. I manage a business; so what's your problem if I visualize the structure of my company as a fern leaf with departments and employees as branches off the main stem? What would be the issues of discussing how incredible human cellular morphology really is with my biologist roommate, and citing some cool research someone decided to do about fractal geometry in the way our bodies build themselves?
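For anyone curious what "iterated functions" building a fern actually looks like, here is a minimal sketch of the classic Barnsley fern. The four affine maps and their probabilities are the standard published coefficients; the ASCII rendering is just a crude stand-in for a proper plot:

```python
# Barnsley fern: four affine maps applied at random trace out a
# self-similar fern. Coefficients are the standard published ones.
import random

def barnsley(n=20000):
    x, y = 0.0, 0.0
    pts = []
    for _ in range(n):
        r = random.random()
        if r < 0.01:
            x, y = 0.0, 0.16 * y                                    # stem
        elif r < 0.86:
            x, y = 0.85 * x + 0.04 * y, -0.04 * x + 0.85 * y + 1.6  # main frond
        elif r < 0.93:
            x, y = 0.20 * x - 0.26 * y, 0.23 * x + 0.22 * y + 1.6   # left leaflet
        else:
            x, y = -0.15 * x + 0.28 * y, 0.26 * x + 0.24 * y + 0.44 # right leaflet
        pts.append((x, y))
    return pts

# Crude ASCII render; the same self-similar shape appears at any scale.
W, H = 60, 30
grid = [[" "] * W for _ in range(H)]
for x, y in barnsley():
    col = int((x + 2.7) / 5.4 * (W - 1))   # x lands roughly in [-2.7, 2.7]
    row = int((1 - y / 10.0) * (H - 1))    # y lands roughly in [0, 10]
    if 0 <= row < H and 0 <= col < W:
        grid[row][col] = "*"
print("\n".join("".join(r) for r in grid))
```

Four simple rules, applied over and over, produce endless leaflet-within-leaflet structure: exactly the "complex patterns from simple preconditions" idea above.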
EDIT: OP's edit makes it more clear his statements were more about irrational folk seeing the universe as a single continuous fractal (that would be the "fractal woo"), and that he is not denying the existence of fractal-like patterns in nature, or that using fractal models can be useful in understanding phenomena. Sorry for any confusion and thanks for the discussion!
EDIT2: /u/ombortron commented pretty well in regards to the utility of the concept of fractals in scientific discourse and otherwise:
>The universe itself doesn't have to be a fractal for fractals to be important.
>Fractals are quite common in our reality, and as a result, that means they are an important facet of reality, and as such they are a legitimate and common topic of discussion amongst people, and this is particularly true of people who do psychedelics.
>Does this mean the universe is 100% fractal in nature? No.
I did this, but I came to data science in the final year of my PhD when I got a job at a startup. I started with R, then SQL, then Python. I currently work in data science, moving internal ML products into production settings. I also do research - and knowing how to conduct proper trials is great if the company you work for gives you freedom in how something you've built is rolled out. I can also blend my degree with ML, e.g. designing batteries of questions to identify 'good fit' candidates for a given role - I combine the battery results with future performance data and continually refine the question set / improve the model. As well, I'm a good fit for UX and dabble in that. The combo skillset will give you the ability to produce value in many different ways.
The things that helped me most were:
It can be overwhelming, but don't worry. Do one course to completion, with that as your only goal. Then do the next. Then work on a kaggle thing. Then work through a book. One thing at a time - you might get anxious or be uncertain or want to do multiple things at once, but just start with one thing and focus on that and that alone. You'll get where you want to go.
I also brushed up on my linear algebra / probability using MITs open courses and khanacademy.
Beyond all this, I found that learning a lot about ML/AI really expanded my thinking about how human beings work and gave me a new and better lens through which to view behaviour and psych research/theories. Very much recommend to all psychologists.
Good luck!
Some amazing books I would suggest to you are Code by Charles Petzold, Gödel, Escher, Bach, The Road to Reality, and Pi in the Sky.
All of these I would love to read again, if I had the time, but none more so than Gödel, Escher, Bach, which is one of the most beautiful books I have ever come across.
Road to Reality is the most technical of these books, but gives a really clear outline of how mathematics is used to describe reality (in the sense of physics).
Code, basically, teaches you how you could build a computer (minus, you know, all the engineering. But that's trivial surely? :) ). The last chapter on operating systems is pretty dated now but the rest of it is great.
Pi in the Sky is more of a casual read about the philosophy of mathematics. But it's very well written; good night-time reading!
You have a really good opportunity to get an intuitive understanding of the heart of mathematics, which even at a college level is somewhat glossed over, in my experience. Use it!
You're asking a very complex question that the best current minds in the fields of sociology, politics, psychology, technology, futurology, neuroscience, education, and many others, cannot answer.
We just don't know what it is that makes a good programmer different from a bad one. We all have theories and ideas and notes and observations and anecdotes, but compiling them together doesn't generate an actual understanding of the subject.
So I'm sorry but there's no real way for anyone to answer your question.
I would look for local resources to double check wtf you're doing.
Befriend a trusted professor and visit them during office hours, or a trusted student or advisor or professional or something. Explain your concern and ask them to walk you through how they approach problems. Not the solution; just watch how they approach the problem and pay very close attention to it.
That's quite invaluable.
For instance, take SQL. If I get really stuck on a SQL problem, I go back to my root approach to it, which is to ask this question: if I had several tables of data printed out on paper, what would I tell a monkey to do to collate them and generate the output I want? That's all that SQL usually does; it explains to the computer how to collate and present diverse data tables.
And that's easy to forget when you're trying to think about join syntax or something and you're grasping at straws and trying to construct a pattern in your mind and it keeps unraveling because you don't have a good sense at the root level of what it is that you're trying to do.
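To spell out that "monkey with printed tables" mental model, here's a rough Python sketch of what an inner join boils down to (the users/orders tables are hypothetical):

```python
# What a join "tells the monkey" to do: for every row in one table,
# find the matching rows in the other and staple them together.
users = [{"id": 1, "name": "Ana"}, {"id": 2, "name": "Ben"}]
orders = [{"user_id": 1, "item": "book"}, {"user_id": 1, "item": "pen"}]

# Same spirit as:
#   SELECT u.name, o.item FROM users u JOIN orders o ON u.id = o.user_id;
joined = [
    {"name": u["name"], "item": o["item"]}
    for u in users
    for o in orders
    if u["id"] == o["user_id"]
]
print(joined)  # [{'name': 'Ana', 'item': 'book'}, {'name': 'Ana', 'item': 'pen'}]
```

Once you hold onto that root picture, join syntax stops being an arbitrary incantation; it's just naming which columns have to match.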
Better programmers aren't defined by what they know, they're defined by how they think.
And I sense your problem is that you're trying to apply knowledge not without understanding it, but without understanding how to understand it.
Watch this, I'm about to do a cool programmer mental trick of segueing to what seems like a completely unrelated subject, but I'm actually following a mental thread that connects the two.
When linguists try to decipher ancient languages, they sometimes run into an interesting kind of trouble. Take Linear A as an example. They have the symbols. They have a pretty good idea of what some of them mean. But they have no idea how they go together or what most of them mean, because they have no context for deciphering the thing. It's completely inaccessible. For all scientists can tell, Linear A might as well be encoded symbols of sound waves which can only be translated intelligibly when played aloud. If they converted a Linear A script to MP3, it might come out a perfect rendition of All Along The Watchtower.
The problem isn't that they haven't unlocked the words, the problem is that they haven't figured out how the writers thought. And without knowing that, the words are probably unknowable. They could throw some together that have likely translations, but what sounds like "there is the sun" might actually mean "my dog ate my banana".
So the key to unlocking the language isn't to stare at the language and to try to wrestle words into place.
Instead, it's to research the culture that used the language, to try to learn more about the culture and how it functioned. Did it have a seat of government? Was it patriarchal or matriarchal? Was it militant or trade oriented or hunter/gatherer or what?
By understanding how the people lived, you get a sense of how they thought. And by understanding how they thought, you can start to figure out how they communicated. And more than one language over time has been deciphered in that way.
Or how about this; you don't speak french, but encountering a french person on the street, you're able to use hand gestures to ask them directions to something, and they're able to respond intelligently using hand gestures. How'd that happen? Because you both thought the same way.
This psychological question consumes exobiologists, because they're tasked with figuring out how to communicate with aliens who, by definition, don't think like us.
So what do they do? They return to the basic roots. And the simplest roots they can find are the hydrogen atom and the set of prime numbers. And maybe pi. Things like that. And people hear "math is the universal language" and sneer dismissively because math is boring, but it's actually true.
So I'm curious whether you're fluent in the universal language of computers, or whether you just think you are because you practiced writing linked lists 37 times.
Charles Petzold wrote a great book called "Code: The Hidden Language of Computer Hardware and Software". It doesn't teach you how to program, but it teaches you WHY you have to program. And by understanding the WHY, you get great inroads to the HOW.
I'd recommend taking a look at that if you can find it. "http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319". Notice that the second book in the "Customers Who Bought This Item Also Bought" section is titled "Think Like A Programmer". Code has better reviews so I'd start there.
But that might help.
I don't think you're stupid, I just think that you aren't thinking the right way about the problems.
Or maybe it's a problem solving thing altogether. How do you do with Sudoku or other games like that? Or chess puzzles?
I dunno. This is all food for thought, and maybe some of it will stick.
ETA: Damn, I wrote a lot. I'll save it and blog it someday, maybe. Anyway, I did a search on "how programmers think" and came up with this interesting article: "http://www.techrepublic.com/blog/programming-and-development/how-to-think-like-a-programmer-part-1/43". It's interesting not because I entirely agree with it (and I couldn't find the other two parts of it anyway), but because at the bottom of the first section, in bold, the author says that the big problem with some programmers is that they've been taught the HOW but not the WHY. So here again is someone supporting my point. Tricks and habits and stuff will only take you so far; at some point you have to learn the WHY in a way that lets you apply it and start coming up with your own HOWs as the need arises.
If you want to dig deep into the theoretical of programming, and help build a good foundation for OOP, patterns, and algorithm design, check out Concrete Mathematics: A Foundation for Computer Science. It is honestly the best textbook I have ever come across.
From there, if you're feeling really ambitious in studying algorithms, check out The Art of Computer Programming, but I should warn you, it is very dense and can be hard to understand even for accomplished developers.
Beyond that, I suggest checking out The Odin Project. It covers a variety of languages and frameworks including Ruby On Rails, which is pretty standard in app development these days. They have a lot of great references and side material. It's basically a "go at your own pace" open source coding boot-camp.
> Like I said, this is for me. I hate just being told "do this" and having no concept of why. I want to understand why I'm doing it, the implications for doing it "this way".
This... This is the mindset that will carry you and eventually make you stand out as an exceptional programmer. Learning how to do something might land you a job, but knowing how it works makes you an invaluable asset to any employer.
As long as you are passionate about learning the material, you will pick it up over time.
>This is where I realized that I was doing this wrong, at least for me. I'd be on codeabbey and know what I wanted to do, but not how. I realized that I needed to be building larger things to be working with oop concepts. I really felt I was missing a lot of "base" information.
Awesome observation. Studying and doing drills both have an important role in the learning process, but there are other forms of practice to include in order to reinforce the material in a meaningful way. The Ruby Rogues podcast has a great group discussion about how to learn that I highly suggest you give a listen.
Personally, I learn best by throwing myself into a project where I am in wayyy over my head. By struggling through problems, scrupulously tearing through documentation and examples, I learn a lot more of the why than the how at the end of the day.
I learned Javascript, jQuery, and AJAX by building a templating & ecommerce framework. I started out with little to no knowledge or understanding of how JS worked, and was forced to restart a number of times as I began to see what was good and what was not, but now I feel very comfortable working with it.
Find a problem, and solve it, because Computer Science is, really, just the art of problem solving.
Best of luck, and most importantly, have fun :D
If you're serious about wanting some deep as-you-go knowledge of software development from a Pythonic point of view, you cannot go wrong with following a setup such as this:
Mark Lutz writes books about how and why Python does what it does. He goes into amazing detail about the nuts and bolts all while teaching you how to leverage all of this. It is not light reading and most of the complaints you will find about his books are valid if what you are after is not an intimate understanding of the language.
Fluent Python is just a great read and will teach you some wonderful things. It is also a great follow-up once you have finally made it through Lutz's attempt at out-doing Ayn Rand :P
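As a taste of the "nuts and bolts" both authors dwell on, here's a tiny sketch in the spirit of the card-deck example Fluent Python opens with (abridged and reworded, so treat it as my paraphrase rather than the book's code):

```python
# Implementing __len__ and __getitem__ makes a class act like a
# built-in sequence: len(), indexing, slicing, iteration, and `in`
# all work without any further code.
class Deck:
    ranks = [str(n) for n in range(2, 11)] + list("JQKA")
    suits = "spades diamonds clubs hearts".split()

    def __init__(self):
        self._cards = [(r, s) for s in self.suits for r in self.ranks]

    def __len__(self):
        return len(self._cards)

    def __getitem__(self, position):
        return self._cards[position]

deck = Deck()
print(len(deck))                 # 52
print(deck[0])                   # ('2', 'spades')
print(deck[-3:])                 # slicing comes for free
print(("A", "hearts") in deck)   # True; iteration comes for free too
```

That leverage of the data model, rather than syntax trivia, is what makes both Lutz's and Ramalho's books worth the slog.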
My recommendation is to find some mini-project sites that focus on what you are reading about in the books above.
Of course this does not answer your question about generic books. But you are in /r/Python and I figured I would offer up a very rough but very rewarding learning approach if Python is something you enjoy working with.
Here are three more worth adding to your ever-increasing library :)
Escher's work with tessellation and other mathematical ideas is fairly well-known and documented, so I'll try to mention a few examples of things I learned in an art history course a while ago.
DaVinci's Vitruvian Man used Phi in the calculation of ratios. Example: the ratio of your arm to your height or your eyes to your face is nearly always Phi. I'm not sure if I'm correct in the body parts mentioned, my art history class was nearly 6 years ago so I'm a bit rusty. I'll try to think of some more examples and post.
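For reference, Phi is the golden ratio, (1 + √5)/2 ≈ 1.618. Here's a quick toy check of the classic fact that ratios of consecutive Fibonacci numbers converge to it (just a numerical illustration, nothing to do with DaVinci's actual method):

```python
# The golden ratio, and Fibonacci ratios converging to it.
phi = (1 + 5 ** 0.5) / 2
a, b = 1, 1
for _ in range(20):
    a, b = b, a + b
print(phi)    # 1.618033988749895
print(b / a)  # ~1.6180339887, already matching phi to roughly 9 digits
```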
EDIT: a few more examples have come back from memory. DaVinci was a master of perspective as well, using straight lines to draw attention to the subject of his works. In the case of The Last Supper, the lines from the structure of the building to the eyes and gestures of the disciples aim towards Jesus.
Botticelli's Birth of Venus uses a triangle to bring the subject into the viewer's mind. The two subjects on the left and right form the lines that meet at the middle of the top and close off a triangle with the bottom of the work. Venus herself is in the middle of the triangle which brings your attention to her immediately upon viewing the work.
Michelangelo's Pieta also uses a triangle to highlight its subjects. Mary's figure creates a triangle (which is considered to be quite intentional based upon her size, both in relation to Jesus, a full-grown man, and from her upper and obviously enlarged lower body). Her triangle makes the outline for the subject, Jesus. He is nearly in the center of both the horizontal and vertical axes. The way he is lying, from near the top of the left and then draping to the bottom of the right, depicts a very lifeless form. Moving the viewer's gaze from the top to the bottom of the triangle strengthens the emotion of the scene.
Moving on to architecture, vaulted ceilings also use triangles to draw your eyes down a line and make an awe-inspiring impression.
In contrast to the Europeans' love of straight lines and geometric figures, the traditional Japanese architectural style avoided straight lines; nearly every line in a traditional Japanese building is curved. The traditional belief was that straight lines were evil because evil spirits could only travel in straight lines. This design criterion made for very interesting forms and building methods, which I would encourage you to check out because of the sheer dedication to the matter.
The Duomo in Florence is a great example of Renaissance architecture and has a really cool octagonal shaped dome. I could go on and on about how awesome Brunelleschi's design was, but I'll just let you read about it here.
I could talk all day about this sort of stuff, just let me know if you want anything else or have any questions. Good luck with your class!
EDIT2: I've found some more links about the subject of mathematics in art and architecture. It looks like the University of Singapore actually has a class on the subject. There's a good Wikipedia page on it as well. This article is pretty lengthy and knowledgeable, but doesn't include pictures to illustrate the topics. Finally, as almost anybody in r/math will testify, Gödel, Escher, Bach by Douglas Hofstadter is a fantastic read for anybody interested in mathematics and cool shit in general.
EDIT3: LITERATURE: I know we've all heard what a badass Shakespeare was, but it really hits you like a bus when you find out how well the man (or, for you Shakespeare conspiracy theorists, men) could use words in rhyme and meter. Here's a Wikipedia article about his use of iambic pentameter and style. Nothing else really comes to mind at the moment as far as writers using math (other than using rhyme and meter like I mentioned Shakespeare doing); however, I can think of a few ways to incorporate math. If you would like to go into any sort of programming during the class, you could show how to make an array out of a word. Once that concept is understood, you could have them solve anagrams or palindromes with arrays... a favorite of mine has always been making [ L , I , N , U , X ] into [ U , N , I , X ] ( [ 3 , 2 , 1 , 4 ] for the non-array folks ).
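That LINUX-to-UNIX trick, spelled out in Python (the indices are 0-based, which is why the mapping reads [3, 2, 1, 4]):

```python
# Turn a word into an array of characters, then pick indices to
# rearrange it: LINUX -> UNIX via the 0-based index map [3, 2, 1, 4].
word = list("LINUX")                         # ['L', 'I', 'N', 'U', 'X']
index_map = [3, 2, 1, 4]
print("".join(word[i] for i in index_map))   # UNIX
```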
Bishop's book Pattern Recognition and Machine Learning is pretty great IMHO, and is considered something of a Bible in ML - although it is in competition with Murphy's book Machine Learning: A Probabilistic Perspective, which is also supposed to be a gentler intro. With an ECE background the math shouldn't be too difficult to get into in either of these books. Depending on your background (i.e. if you've done a bunch of information theory) you might also like MacKay's book Information Theory, Inference and Learning Algorithms. MacKay's book has a free digital version, and his 16-part lecture series based on the book is also available online.
While those books are great, I wouldn't actually recommend just reading through them, but rather using them as references when trying to understand something in particular. I think you're better off watching some lectures to get your toes wet before jumping in the deep end with the books. MacKay's lectures (linked with the book) are great, as are Andrew Ng's that @CatZach mentioned. As @CatZach also said, Deep Learning has had a big impact on CV, so if you find that you need to go that route then you might also want to do Ng's DL course, though unlike those courses this one isn't free :(.
Finally, all of the above recommendations (with the exception of Ng's ML course) are pretty theory-driven, so if you are more of a practical person, you might like Fast.AI's free deep learning courses, which have very little theory but still manage to give a pretty good intuition for why and how things work! You probably don't need to bother with part 2, since it is more advanced stuff (and will be updated soon anyway, so I would wait for that if you do want to do it :))
Good luck! I am also happy to help with more specific questions!
> Felt pretty good about myself.. until I got to the algorithm section.
This is VERY normal. These are hard math concepts that take everyone a little bit to get used to. The only way you will learn these concepts is by implementing them, over and over and over and over and over.
> I would say I was getting stuck probably about half the time and would turn to read-search-ask method.
If this were not the case, then there would be no need to learn. I am a web developer and I look up the most inane shit on a daily basis, because it is something that I either have never used/implemented or something I rarely use/implement (my big one here is PHP's array functions: I can never remember if the array comes before the callback or not with array_map(), but I remember that it is the exact opposite of array_filter() and array_reduce()). Embrace this and build your Google-fu, because you will always need it.

> A lot of times I was missing some small operator (code error) or somewhat minor step in the thought process, other times I would be lost entirely. Basically I wasn't thinking about how to implement my code well enough, imo.
This is 100% normal. Have you ever heard of a code review? This is where other developers review your code before it goes live. The point is that you cannot be 100% perfect with your code; maybe you forgot a semicolon, or maybe your code is tough to read. That is what the code review process is for. I write code in iterations to make sure that I never 'get in too deep', where the fear of removing code sets in, and in each of these phases I go through a mini code review to see what is working and what isn't. I ALWAYS find some half-baked logic in my first few iterations before I really get into it, and over the last couple years I find that I need fewer and fewer iterations and am able to get a better 'big picture.'
Don't be afraid to scrap some code and go back at it, this is your education and only you know when you understand the material. I have a bajillion abandoned side projects and so does every developer that I know.
Advice
Links
I did FCC up through the frontend section, I started my web dev career path in 2014 and picked up FCC in mid 2015 right before getting a job in web development. The most important part of FCC is that you are coding, getting practice and making mistakes, TONS of mistakes. Just keep it up, don't get burned out and remember that it is about your education, not how many challenges you complete. Code and read and read code.
Ok then, I'm going to assume that you're comfortable with linear algebra, basic probability/statistics and have some experience with optimization.
(written by the author of the caret package in R, and also supposed to be a great textbook for modeling in general). I'd start with one of those three books. If you're feeling really ambitious, pick up a copy of either:
Or get both of those books. They're both amazing, but they're not particularly easy reads.
If these book recommendations are a bit intense for you:
Additionally:
He sounds like a younger version of myself! Technical and adventurous in equal measure. My girlfriend and I tend to organise surprise activities or adventures we can do together as gifts, which I love; it doesn't have to be in any way extravagant, but having someone put time and thought into something like that is amazing.
You could get something to do with nature and organise a trip or local walk that would suit his nature photography hobby. I love to learn about new things and how stuff works, so if he's anything like me he'd appreciate something informative that fits his photography style, like a guide to local wildflowers or bugs. I don't know much about parkour, but I do rock climb, and a beginners' bouldering or climbing session might also be fun and something you can do together.
For a more traditional gift, Randall Munroe of the webcomic XKCD has a couple of cool books that might be of interest: Thing Explainer and What If. Also, the book CODE is pretty good for an inquisitive programmer, and it isn't tied to any particular language, skillset, or programming level.
> I have zero Linux experience. How should I correct this deficiency?
First, install a hypervisor (Oracle VirtualBox is free), download a Linux ISO, and boot a VM from it. Debian and Ubuntu are two of my favorites. Both are totally free (as are most Linux distros). Once installed, start reading some beginner Linux tutorials online (or get "Linux in a Nutshell" by O'Reilly).
Just fuck around with it... if you screw something up, blow it away and reinstall (or restore from a previous image)
> Is it necessary? Should I start trying to make Linux my primary OS instead of using windows, or should that come later?
It's not necessary, but will help you learn faster. A lot of security infrastructure runs on Linux and UNIX flavors. It's important to have at least a basic understanding of how a Linux POSIX system works.
> If you can, what are some good books to try to find used or on PDF to learn about cissp and cisa? Should I be going after both? Which should I seek first?
You don't need to worry about taking & passing them until you've been working in the field for at least 3-5 years, but if you can get some used review materials second-hand, it'll give you a rough idea what's out there in the security landscape and what a security professional is expected to know (generally)
CISSP - is more detailed and broader and is good if you're doing security work day-to-day (this is probably what you want)
CISA - is focused on auditing and IT governance and is good if you're an IT Auditor or working in compliance or something (probably not where you're headed)
> What are good books I can use to learn about networking? If you noticed I ask for books a lot its because the only internet I have is when I connect my android to my laptop by pdanet, and service is sketchy at my apartment.
O'Reilly is a reliable publisher of quality tech books. An Amazon search for "O'Reilly networking" pulls up a bunch. Also, their "in a nutshell" series of books are great references for Windows, Linux, Networking, etc. You can probably find older/used copies online for a decent price (check eBay and half.com too)
> How would you recommend learning about encryption? I just subscribed to /r/crypto so I can lurk there. Again, can you point me at some books?
Try "The Code Book" for a very accessible intro to crypto from ancient times thru today
http://www.amazon.com/The-Code-Book-Science-Cryptography/dp/0385495323
Also, for basics of computer architecture, read "CODE", which is absolutely excellent and shows how computers work from the ground up in VERY accessible writing.
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319
Grad school for machine learning is pretty vague, so here are some general resources I think would be good for an incoming CS grad student or an undergraduate CS researcher with a focus on deep learning. In my opinion, the courses you mentioned should be a sufficient foundation to dive into deep learning, but these resources cover some foundational material as well.
Depends on how old you are.
I'll be honest with you, I don't think Head First Java would be a good choice; however, DO READ Clean Code. I also suggest Design Patterns: Elements of Reusable Object-Oriented Software and Working Effectively with Legacy Code. The first is a classic MUST READ for anyone in software development: it presents numerous challenges that most of us will face when developing solutions, and gives you the design patterns you will need to solve them. The second is great for learning how to fix your predecessors' shitty code; you'll need this one. If you haven't already, look up Bob Buzzard and Andy Fawcett. These two guys are my favorite SFDC dev bloggers. I also suggest watching any Salesforce webinar that has anything to do with code, especially the security stuff.
Practice makes perfect, except for us there is no perfect, just better. Know your best practices and live by them. With everything you do ask how can I make it better? Faster? More efficient? Do I even need code, or will Workflow/Process Builder/Flow do? How can I write code, so that an Admin can customize it without any code?
> Based on code reviews--my code is pretty good, with good logic and pretty well laid out.
This is actually VERY important, having good logic is obviously crucial, but being well laid out is a kind of hidden requirement with code. You or somebody else will eventually need to maintain your code, if it's laid out well it should hopefully be easy to read and maintain.
When you write code, do your best to incorporate declarative features so that further customization can be done without code (I know I said this earlier, but I think it's important). Need to write some code that uses an arbitrary set of fields? Consider using Field Sets. An Admin can add/remove them without code. Maybe use a Custom Setting, or Custom Metadata, to map fields to values.
Learn how to use Describe calls for everything. Need to write some code that catches dupes and merges them? Don't hard-code the values; then nobody will be able to remove or add fields without updating code. Instead use Describe calls; now you get every field on the object forever. Need to remove a field from an object? No problem. Need to add a field to an object? No problem. Does your losing record have child records that need to be reparented? Don't hard-code; use Describe calls to get all sObjects with a Child Relationship. Use Describe to find out if it can be directly reparented or if it needs to be cloned (CampaignMembers can't reparent a LeadId to a new Lead. You MUST clone and add the new Lead Id).
How much do you know about HTML? CSS? JavaScript? jQuery? Visualforce? Learn 'em. Lightning is coming, and these are going to be more important than ever (except maybe jQuery).
Practice, practice, practice. One coding assignment per month isn't that bad, but if you get some work done early and you have an hour or two to spare, work on a side project. Can you think of something in your company that could be automated? Spin up a Dev Org and give it a shot. Maybe your Sales people could use a new VF page for entering information just a little quicker.
Always seek to improve your code. Always seek new ideas and better ways of doing things.
Trailhead is good, do all the coding ones you can find, it's more practice!
Being fond of problem solving is a good indicator. Problem solving and executing a solution is essentially what programming is all about in the end - and pretty much any engineering degree, for that matter. The good news is most STEM courseware is pretty much the same for the first couple of years of college, so you won't really have to commit straight away. Your classes will apply to multiple degree paths, and having a few intro compsci courses under your belt will help in literally any major.
A computer science degree is (should be) geared to problem solving more than learning to write code. Writing code is the easy bit and the tech changes so quickly it is something best learned on the fly. You will be taking tons of math, studying algorithms, data structures, learning to play well with others -- that sort of thing.
Being fond of computers alone can lead one astray. The classic example is that liking listening to music doesn't necessarily lead to liking making music.
The Harvard cs50x extension course will give you a straight up taste of what an intro to CS class will be like in university. The pace is fast so fair warning.
A good armchair book is CODE. Nice overview of how computers compute.
It's a great career choice IMO. I've been at it for a long long (long) time with zero regrets. Alongside getting to play with all the shiny bits, you get a constant supply of feel-good moments when you see your work actually doing something in the wild and impacting people's lives in a positive way.
Probably start with Artificial Intelligence: A Modern Approach. This is the state of the art in AI as of 2009; of course, in AI years that's ancient history, but it's background you must know if you're serious about AI.
Following on from that, you have the very popular statistical techniques, which you can read about in Pattern Recognition and Machine Learning. These are a wide range of statistical models and algorithms that allow machines to infer, classify and predict. Another very important concept is Chapter 14 on combining models. IBM's Watson, for example, uses a complex network of "simple" models that combine their answers to form the final responses.
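To make the "combining models" idea concrete, here's a minimal sketch of majority voting, one of the simplest combination schemes (Python, with toy stand-in classifiers I made up for illustration; not code from the book):

    from collections import Counter

    def majority_vote(models, x):
        # Each "simple" model votes on a label; the most common answer wins.
        votes = [model(x) for model in models]
        return Counter(votes).most_common(1)[0][0]

    # Three toy decision stumps standing in for real trained models:
    stump_a = lambda x: "spam" if x > 0.5 else "ham"
    stump_b = lambda x: "spam" if x > 0.3 else "ham"
    stump_c = lambda x: "spam" if x > 0.8 else "ham"

    print(majority_vote([stump_a, stump_b, stump_c], 0.6))  # -> "spam" (2 votes to 1)

Real systems weight the votes or train a combiner on top, but the principle is the same: several mediocre models, combined, often beat one strong one.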
Of all the techniques in the previous book, the neural networks of Chapter 5 have become the most popular and powerful. These are covered in Deep Learning, and are currently the cutting edge of machine learning. They are extremely general models that seem to be highly successful at a range of tasks. In particular, their popularity comes from their amazing accuracy in image recognition, which far outstripped previous algorithms.
Ultimately nothing you can learn from anyone is sure to bring you close to sci-fi AI. The techniques to produce such an AI elude even the foremost experts. You may also become disillusioned with your dream as you realise just how mechanical and constrained AI is. I personally think we'd have better luck genetically engineering intelligence in a random animal/insect than creating true intelligence in silicon and circuits.
CompSci covers a wide range of subjects, many of which won't be that relevant to you. When I was at Uni my classes covered:
This list isn't comprehensive, but covers most of the main points. For your job, you can happily ignore most of that core.
These are the ones I think you'll want to focus on with the highest priority:
Warning: Algorithms is a heavy and dry book. It might not be a good recommendation for a beginner, to be honest.
As you're interested in Data Science, you're already off to a good start with R and Matlab. Python is fine, but it has some issues that make it less than ideal for large-scale data processing. The good news is that once you've got the hang of Python, you can learn another language much more easily. It's worth noting that R is quite obtuse in my experience, so if you can get your head around that, you're doing quite well.
But I digress. You're also going to want to learn about data structures, networked systems and databases, especially if you want to work with "Big Data". I think, though, that your best starting place here is learning how to code. With that under your belt, along with your math degree, everything else should be easy enough to learn on a case-by-case basis.
Source: Masters & PhD in CompSci subjects. Good luck, and feel free to PM me if you're interested in a mentor for this. With a better understanding of your position, I could probably help pare down what you need to study to more specific subjects.
PS: Assuming you're at Uni, your library should have both books I have suggested.
> Recently released books? Udemy courses? Free stuff online like W3Schools or CodeAcademy?
Mozilla Developer Network has fantastic documentation for JS, HTML, CSS, and the browser APIs, and a section intended to guide you through them for the first time in a comfortable order.
If JS is your first language, I'd recommend checking out the book "Code" by Charles Petzold, a great and relatively short book that answers the fundamental beginner questions like "So what's a CPU actually doing in there?", "How does source code make stuff happen?" and "How can music be 1s and 0s?"
There's a great series of books, available for free online, called "You Don't Know JS", that teach you how the language operates. It might be too involved if it's the first thing you read, but definitely at least start it and bookmark the later volumes to come back to.
Marijn Haverbeke's book "Eloquent JavaScript" is available to read freely online. I'd really recommend that one, too. And when you want to stop reading theory and start working on actual projects, grab "JavaScript Cookbook" by Shelley Powers.
> Did I choose right languages for what I wish to create? Maybe I should also use something else, like Bootstrap, Node.js or Typescript?
Don't worry about those for now. Node will be useful if you decide you need a server component to your game. Bootstrap is nice, and it's helpful, but if you're still trying to learn HTML and CSS yourself, it will only get in the way and obscure things. TypeScript is something you'd look at only once you're comfortable and confident with regular old JS. Stick to plain old HTML/CSS/JS of your own to start.
Jaap Schroder wrote a book detailing his study of the Solo violin works, and he's recorded the concertos as well. That's a good place to begin. There are some really brilliant insights that most students would never consider.
Don't get caught up thinking you are handcuffed and can only imitate an anemic baroque style or a warbly, romantic style. This video is one sort of hybrid, where the soloist and conductor are very aware of performance practice, but modern instruments and techniques are relied upon heavily. Remember that no recordings exist from before 1900 or so. There's still a lot of personal judgment in a good historically informed performance.
There are many great Bach interpretations, and you should listen to many recordings (Grumiaux is often held in high esteem, and Schroder is another good model) to find out where your preferences lie. You should experiment with all sorts of expressive devices (non vib., lots of decay, faster bow, different bow strokes, bowing patterns, holding the bow higher, gut strings?, baroque bow) and find out what you have to say about Bach. I think any successful interpretation will have at least two major things: a tremendous sense of line (form, rhythm, a large-scale view) and an expressive use of tone color (bright, warm, deep, thick, feathery, etc.).
Leopold Mozart also wrote a treatise on violin playing. In terms of playing style, he was more familiar with the Baroque than with the music of W. A. Mozart. He wrote about a sense of "affect" in Baroque music: overall, there is one overriding feeling that should come across in Baroque works (especially dances and binary-form movements). In the E major Bach, I bet it would be helpful to decide what the "affect" is for each movement. Is there only one? Is the narrative single-minded? More simply, come up with something other than "happy" or "sad."
Don't let anyone tell you Bach was a stodgy, strict person. He was ridiculously smart, as shown by his ability to improvise multi-voice fugues. Hofstadter wrote eloquently about Bach's puzzles and intellectualism. He was a jokester - the crab canon and the Coffee Cantata are good examples. He was sometimes compensated for his work with large amounts of beer. Bach had somewhere around 20 children, about half of whom survived childhood. Bach was a very complex person, with lots of life experience. Don't let a careless caricature influence how you think about his music.
You probably already have, but if not, definitely read Design Patterns, which is old but a classic. I'd also highly recommend the Pragmatic Programmer.
EDIT: I just want to say, that I also fully support alienangel2's answer. I wanted to recommend a couple good books to get you on "the path", but ultimately, the best thing by far is to find a job that grows you. For some people, the best way to do that is to work at a super small startup, where everything you're building is from scratch. For others (like me), the best way is to work at a company with tons of really smart people who have already built great software, and learning from them and the choices they've made (and why). And if you still feel like you're regressing since school, maybe that's the answer: go back to school (i.e. get a Master's or PhD)!
So I don't think you should get too hung up on "enterprise architecture" at the moment, partially because you're still very early in your career, but also because enterprise architecture means a lot of things to a lot of different people. At this stage in your career, I really think you should focus mainly on SOLID code, core Object Oriented design concepts, and then understanding patterns. Good architectural strategies are built around all of those concepts, but they're also much much more than that.
For SOLID code, one of my favorite references is actually Dependency Injection in .NET by Mark Seemann. Although he does spend a good amount of time on DI, the recommendations Mark makes for how to properly structure your code in order to take advantage of DI are very useful for understanding SOLID design principles in general. The examples and code really work through the concepts well, so you get a great explanation followed by some well-thought-out code.
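The core move Seemann keeps coming back to - depend on an abstraction and inject it through the constructor - is language-agnostic. Here's a minimal sketch of constructor injection (Python for brevity, with hypothetical class names; not code from the book):

    class SmtpEmailSender:
        def send(self, to, body):
            print(f"SMTP -> {to}: {body}")  # real network code would go here

    class FakeEmailSender:
        # Test double: records messages instead of sending them.
        def __init__(self):
            self.sent = []
        def send(self, to, body):
            self.sent.append((to, body))

    class WelcomeService:
        # The dependency is injected, not constructed internally, so this
        # class never changes when the delivery mechanism does.
        def __init__(self, sender):
            self._sender = sender
        def welcome(self, user_email):
            self._sender.send(user_email, "Welcome aboard!")

    WelcomeService(SmtpEmailSender()).welcome("a@example.com")  # production wiring
    service = WelcomeService(FakeEmailSender())                 # test wiring

Because WelcomeService never constructs its own sender, you can swap implementations at the one place where the object graph is composed, which is the heart of SOLID's dependency inversion in practice.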
Clean Code by Uncle Bob is a great reference on how to structure well thought out code that touches on some architectural principles, but doesn't have that as the main focus of the book. Many of the topics in the book you'll find need to be addressed throughout a codebase.
As far as design patterns (which are different from architectural patterns), I don't think you can go wrong with the original Gang of Four book, although I personally have a C#-specific version, C# Design Pattern Essentials. I don't want to put too much emphasis on design patterns, because sometimes they get overused and applied too literally, but they are still very useful. I think the key to design patterns is not just knowing what they are, but determining where they might be applicable to your use case, and whether you should make any small adjustments or tweaks to them.
After you really have a rock-solid base of working with code, then you can shift your focus to more architectural concerns. For that, it really depends on what problem you're looking to solve, but Domain Driven Design (DDD) is a good way of understanding those problems and structuring the solutions in a well-thought-out, loosely coupled, and evolvable manner. That "central framework" you referenced in your post is the business logic, and it's the key focus of DDD.
This was originally posted as an image but got deleted because picture posts are not allowed - in my opinion an irrelevant reason in this case, since the post was all about the text. We had an interesting discussion going: http://www.reddit.com/r/Futurology/comments/2mh0y1/elon_musks_deleted_edge_comment_from_yesterday_on/
I'll just post my relevant contributions to the original to maybe get things started.
---------------------------
And it's not like he's saying this based on his opinion after a thorough study online like you or I could do. No, he has access to the real state of the art:
> Musk was an early investor in AI firm DeepMind, which was later acquired by Google, and in March made an investment in San Francisco-based Vicarious, another company working to improve machine intelligence.
> Speaking to US news channel CNBC, Musk explained that his investments were, "not from the standpoint of actually trying to make any investment return… I like to just keep an eye on what's going on with artificial intelligence. I think there is potentially a dangerous outcome there."
Also, I love that Elon isn't afraid to speak his mind like this. I think it might well be PR or the boards of his companies that reined him in here. He's so open and honest in television interviews too; too bad he didn't speak those words there.
----------------------------
I'm currently reading Superintelligence, which is mentioned in the article and by Musk. In one unstoppable scenario Bostrom describes, the AI seems to function perfectly and is super friendly and helpful.
However, on the side it's developing micro-factories which can self-assemble from a specifically coded string of DNA (this is already possible to a limited extent). These factories then use their coded instructions to multiply and spread, and then start building enormous amounts of nanobots.
Once critical mass and spread are reached, they could wipe out humanity almost instantly through some kind of poison/infection. The AI isn't physical, but in this case all it needs is to place an order with a DNA printing service (they exist) and have it mailed to someone it has manipulated into adding water and nutrients and releasing the DNA nanofactory.
If the AI explodes in intelligence, as predicted in some scenarios, this could be set up within weeks or months of it becoming aware. We would have nearly no chance of catching it in time. Bostrom gives the caveat that this is merely one viable scenario he could dream up; a superintelligence should, by definition, be able to come up with far more ingenious methods.
Beyond the obvious choices - Watts' The Book, Ram Dass' Be Here Now, Huxley's Doors of Perception, Leary's The Psychedelic Experience, and of course Fear and Loathing (all of these should be on the list without question; they're classics) - here are some others from a few different perspectives:
From a Secular Contemporary Perspective
Gödel, Escher, Bach by Douglas Hofstadter -- This is a classic for anyone, but man is it food for psychedelic thought. It's a giant book, but even just reading the dialogues in between chapters is worth it.
The Mind's I, edited by Douglas Hofstadter and Daniel Dennett – This is an anthology with a bunch of great essays and short fictional works on the self.
From an Eastern Religious Perspective
The Tao is Silent by Raymond Smullyan -- This is a very fun and amusing exploration of Taoist thought from one of the best living logicians (he's 94 and still writing logic books!).
Religion and Nothingness by Keiji Nishitani – This one is a bit dense, but it is full of some of the most exciting philosophical and theological thought I've ever come across. Nishitani, an Eastern Buddhist, brings together thought from Buddhist thinkers, Christian mystics, and existentialists like Nietzsche and Heidegger to try to bridge some of the philosophical gaps between East and West.
The Fundamental Wisdom of the Middle Way by Nagarjuna (and Garfield's translation/commentary is very good as well) -- This is the classic work from Nagarjuna, who lived around the turn of the millennium and is arguably the most important Buddhist thinker after the Buddha himself.
From a Western Religious Perspective
I and Thou by Martin Buber – Buber wouldn’t approve of this book being on this list, but it’s a profound book, and there’s not much quite like it. Buber is a mystical Jewish Philosopher who argues, in beautiful and poetic prose, that we get glimpses of the Divine from interpersonal moments with others which transcend what he calls “I-it” experience.
The Interior Castle by St. Teresa of Avila – this is an old book (from the 1500s) and it is very steeped in Christian language, so it might not be everyone’s favorite, but it is perhaps the seminal work of medieval Christian mysticism.
From an Existentialist Perspective
Nausea by Jean-Paul Sartre – Not for the faint of heart, this existential novel deals with existential nausea, a strange perception of the absurdity of existence.
The Myth of Sisyphus by Albert Camus – a classic essay that discusses the struggle one faces in a world inherently devoid of meaning.
----
I’ll add more if I think of anything else that needs to be thrown in there!
Nand to Tetris (coursera)
the first half of the book is free. You read a chapter, then you write programs that simulate hardware modules (like memory, the ALU, registers, etc.). It's pretty insightful, giving you a richer understanding of how computers work. You could benefit from just the first half of the book. The second half focuses more on building assemblers, compilers, and then a Java-like programming language. From there, it has you build a small operating system that can run programs like Tetris.
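To give a taste of what those chapter projects feel like (the course uses its own hardware description language; this is just a Python sketch of the same idea - every gate built from NAND):

    def nand(a, b):              # the single primitive everything is built from
        return 0 if (a and b) else 1

    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))
    def xor(a, b):  return and_(or_(a, b), nand(a, b))

    def half_adder(a, b):
        # A 1-bit adder, composed entirely of NANDs: returns (sum, carry).
        return xor(a, b), and_(a, b)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, half_adder(a, b))

Stack enough of these compositions on top of each other and you eventually arrive at an ALU, which is exactly the path the book walks you down.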
Code: The Hidden Language of Computer Hardware and Software
This book is incredibly well written. It's intended for a casual audience and will guide the reader to understanding how a microcontroller works, from the ground up. It's not a textbook, which makes it even more impressive.
Computer Networking Top Down Approach
one of the best-written textbooks I've read. Very clear and concise language. This will give you a pretty good understanding of modern-day networking. I appreciated that the book is filled to the brim with references to other books and academic papers for a more detailed look at subtopics.
Operating System Design
A great OS book. It actually shows you the C code used to design and implement the Xinu operating system. It's written by a Purdue professor. It offers a top-down look, but backs everything up with C code, which really solidifies understanding. The Xinu source code can be run on emulators or real hardware for you to tweak (and the book encourages that!).
Digital Design Computer Architecture
another good "build a computer from the ground up" book. The strength of this book is that it gives you more background on how real-life circuits are built (it uses VHDL and Verilog), and provides a nice overview chapter on transistor design. A lot less casual than the Code book, but easily digestible for someone who appreciates this stuff. It culminates in designing and describing a microarchitecture that implements a MIPS processor. The diagrams used in this book are really nice.
I think it depends on your location. If you live in, or a commutable distance from, a city with a strong technology sector, there will be quite a few companies willing to hire a high-school-level intern.
For example, a friend of a friend got a summer internship right after graduating high school in the city where he lives. I live outside the city and am taking a year off before university because I got sick in the summer. I'm better now, but all the companies near me are kind of old-fashioned; after a lot of cold calling, none would accept an intern - my calls and emails get sent to their HR manager, and they don't feel like giving me a chance. Another benefit of the city internship for him was that he got to use pretty modern web development stuff.
As a high school intern, unless you find a research group, you are highly unlikely to use R. You can probably do front-end web development, so learning HTML, CSS, JavaScript, jQuery, and the Bootstrap framework would be awesome. If you don't want to do front-end web development, you really have to market yourself, and make sure you are competent in Python if you want to use Python.
As a person who graduated high school last year and is taking a year before university to recover from an illness, I have to compete with university students of various years; even the freshmen have some sort of qualification - being a candidate for a bachelor of applied science or math degree - which tells a potential employer the applicant is knowledgeable. If you want to be competitive with the freshmen, or maybe the sophomores, you really need a good GitHub portfolio which shows you are as knowledgeable as they are.
For example, in my GitHub portfolio, I have an Android application (GitHub and Google Play). In this small-to-medium-sized application (35,000 lines of code), I show I can use a version control system and a bug tracker, by using Git and GitHub respectively. Furthermore, in the bug tracker, I show I can debug, with results from an allocation tracker, a heap dump analyzer, a GPU rendering profiler, and the like. In the actual source code, I show my experience with Java. More importantly, I show I can implement an architectural pattern like Model-View-Presenter (a deviation from Model-View-Controller) and design patterns like wrappers, singletons, mappers, adapters, presenters, contracts, providers, and factories, and that I can design an API which performs network requests, database queries, and file input and output. In the source code, I try to apply as much as I can from reading Effective Java (2nd Edition), Clean Code: A Handbook of Agile Software Craftsmanship, Design Patterns: Elements of Reusable Object-Oriented Software, and Introduction to Algorithms, while getting acquainted with Software Engineering: A Practitioner's Approach. I still need to try TDD and Agile practices; I've read about them, but I've never tried them out.
I think if you have a GitHub portfolio with project(s) of a good size that show a lot of computer science and software engineering concepts, you will be ahead of most freshman students, whose only project might be a small assignment from their Intro to Java class that all their peers did too.
Currently, the application has around 5,000 downloads and a 4.4 rating, which places it 5th among apps rated above 4 stars in its specific category on Google Play: https://play.google.com/store/search?q=Manga+Reader&c=apps&rating=1; it took about a month and a half to get there. However, every time I send my resume to a local company outside the city for an internship, I get no response. I'm going to a career fair at a friend's university in the city on Friday, so I'll test my luck there; they have quite a few recruiters for mobile application interns, and one company develops a full-stack product and service whose mobile applications kind of match mine. Overall, it's feasible if you are near a city, willing to commute, and can prove you know as much as a freshman student they could hire instead.
Alright man, let's do this. Sorry, had a bit of a distraction last night so didn't get around to this. By the way, if you look hard enough, you can find PDF versions of a lot of these books for free.
Classic computer science principle books that are actually fun and a great read (this is the kind of fundamental material you would learn in school, but I think these books teach it better):
Then, if you want to get into frontend web development, for example, I would suggest the following two books for the fundamentals of HTML, CSS, and JavaScript. What I like about these books is they have little challenges in them:
Another great book teaches the fundamentals of coding - and how to think like a programmer - using Python, an extremely flexible programming language (disclaimer: I haven't read this one, but I have read other Head First books, and they rock. My roommate read this one and loved it, though):
Let me know if you want any other recommendations when it comes to books on certain areas of software development. I do full-stack web app development using .NET technology on the backend (C# and T-SQL) and React on the frontend. For my personal blog, I use vanilla HTML, CSS, and JavaScript on the frontend and power backend content management with Piranha CMS (.NET Core based). I oftentimes pick up a shorter course or book on mobile development, IoT, etc. (basically areas outside what I get paid to do at work that interest me).
If I recommended the very first book to read on this list, it would be the Head First book. Then I would move over to the first book in the classic computer science list if you want to understand low-level details; if that's not the case, move towards implementing something with Python, or taking a Python web dev course on Udemy.
Other really cool languages IMO: Go, C#, Ruby, JavaScript, amongst many more
P.S. Another book from someone that was in a similar situation to you: https://www.amazon.com/Self-Taught-Programmer-Definitive-Programming-Professionally-ebook/dp/B01M01YDQA/ref=sr_1_2?keywords=self+taught+programmer&qid=1557324500&s=books&sr=1-2
Most of my stuff is going to focus around consciousness and AI.
BOOKS
Ray Kurzweil - How to Create a Mind - Ray gives an intro to neuroscience and suggests ways we might build intelligent machines. This is a fun and easy book to read.
Ray Kurzweil - TRANSCEND - Ray and Dr. Terry Grossman tell you how to live long enough to live forever. This is a very inspirational book.
I'd skip Kurzweil's older books. The newer ones largely cover the stuff in the older ones anyhow.
Jeff Hawkins - On Intelligence - Engineer and Neuroscientist, Jeff Hawkins, presents a comprehensive theory of intelligence in the neocortex. He goes on to explain how we can build intelligent machines and how they might change the world. He takes a more grounded, but equally interesting, approach to AI than Kurzweil.
Stanislas Dehaene - Consciousness and the Brain - Someone just recommended this book to me so I have not had a chance to read the whole thing. It explains new methods researchers are using to understand what consciousness is.
ONLINE ARTICLES
George Dvorsky - Animal Uplift - We can do more than improve our own minds and create intelligent machines. We can improve the minds of animals! But should we?
David Shultz - Least Conscious Unit - A short story that explores several philosophical ideas about consciousness. The ending may make you question what is real.
Stanford Encyclopedia of Philosophy - Consciousness - The most well known philosophical ideas about consciousness.
VIDEOS
Socrates - Singularity Weblog - This guy interviews the people who are making the technology of tomorrow, today. He's interviewed the CEO of D-Wave, Ray Kurzweil, Michio Kaku, and tons of less well known but equally interesting people.
David Chalmers - Simulation and the Singularity at The Singularity Summit 2009 - Respected Philosopher, David Chalmers, talks about different approaches to AI and a little about what might be on the other side of the singularity.
Ben Goertzel - Singularity or Bust - Mathematician and computer scientist Ben Goertzel goes to China to create Artificial General Intelligence, funded by the Chinese government. Unfortunately they cut the program.
PROGRAMMING
Daniel Shiffman - The Nature of Code - After reading How to Create a Mind you will probably want to get started with a neural network (or Hidden Markov model) of your own. This is your hello world (see the sketch after this list for the idea in miniature). If you get past this and the math is too hard, use this
Encog - A neural network API written in your favorite language
OpenCV - Face and object recognition made easy(ish).
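If you want the flavor of that "hello world" without any framework at all, here's a single artificial neuron learning the OR function in plain Python (a toy sketch of the classic perceptron rule, not taken from any of the resources above):

    import random

    # Training data for OR: ((inputs), target)
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

    w = [random.uniform(-1, 1), random.uniform(-1, 1)]
    bias = random.uniform(-1, 1)
    lr = 0.1  # learning rate

    def predict(x):
        return 1 if w[0] * x[0] + w[1] * x[1] + bias > 0 else 0

    # Perceptron rule: nudge each weight in proportion to the error.
    for _ in range(100):
        for x, target in data:
            error = target - predict(x)
            w[0] += lr * error * x[0]
            w[1] += lr * error * x[1]
            bias += lr * error

    print([predict(x) for x, _ in data])  # -> [0, 1, 1, 1]

Everything in a deep network is this idea scaled up: more neurons, more layers, and a smarter version of "nudge the weights by the error."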
Congratulations! That's a big step. Be proud that you were able to make the switch. Not many people manage to transform ideas into results.
I think there are four areas on which you need to focus, in order to go from mediocre to great. Those areas are:
Now, these areas don't include things like marketing yourself or building valuable relationships with coworkers or your local programming community. I see those as being separate from being great at what you do. However, they're at least as influential in creating a successful and long-lasting career.
Let's take a look at what you can do to improve yourself in those four areas. I'll also suggest some resources.
1. Theoretical foundation
Foundational computer science. Most developers without a formal degree have some knowledge gaps here. I suggest taking a MOOC to remediate this. After that, you could potentially take a look at improving your data structures and algorithms knowledge.
2. Working knowledge.
I'd suggest doing a JavaScript deep-dive before focusing on your stack. I prefer screencasts and video courses for this, but there are also plenty of books available. After that, focus on the specific frameworks that you're using. While you're doing front-end work, I also suggest you explore the back-end.
3. Software engineering practices.
Design patterns and development methodologies. Read up on testing, agile, XP, and other aspects of how good software is developed. You could do this by reading the 'Big Books' of software, like Code Complete 2 or the Pragmatic Programmer, in your downtime. Or, if you can't be bothered, just read different blog posts/Wikipedia articles.
4. Soft skills.
Some closing notes:
- For your 'how to get started with open source' question, see FirstTimersOnly.
- If you can't be bothered to read or do large online courses, or just want a structured path to follow, subscribe to FrontendMasters and go through their 'Learning Paths'.
- 4, combined with building relationships and marketing yourself, is what will truly differentiate you from a lot of other programmers.
Sorry for the long post, and good luck! :)
I would guess that career prospects are a little worse than CS for undergrad degrees, but since my main concern is where a PhD in math will take me, you should get a second opinion on that.
Something to keep in mind is that "higher" math (the kind most students first see around junior year) is in many ways very different from the stuff before it. I hated calculus and doing calculations in general, and was pursuing a math minor because I thought it might help with job prospects, but when I got to the more abstract stuff, I loved it. It's entirely possible that you'll enjoy both; I'm just pointing out that enjoying one doesn't necessarily imply enjoying the other. It's also worth noting that making the transition is not easy for most of us, so if you struggle when you first have to spend a lot of time proving things, don't take it as a signal to give up if you enjoy the material.
This wouldn't be necessary, but if you like, here are some books on abstract math topics, aimed at beginners, that you could look into to get a basic idea of what more abstract math is like:
Different mathematicians gravitate towards different subjects, so it's not easy to predict which you would enjoy more. I'm recommending these five because they were personally helpful to me a few years ago and I've read them in full, not because I don't think anyone can suggest better. And of course, you could just jump right into coursework, the way most of us start. Best of luck!
(edit: can't count and thought five was four)
You probably don't need extensive knowledge of data structures for mobile apps, but I ALWAYS encourage learning data structures! Knowing what structures are available and when to use them is a bit like being a programmer super-hero, and it's one of the things that really sets the self-taught hackers apart from the top-tier engineers and scientists.
It's not a course, but I always love to plug my professor's book, Open Data Structures. He's made it freely available, with editions and code samples in multiple languages, and it's a really good read.
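A trivial example of why knowing your structures matters (Python here, but the lesson is universal): a membership test that scans a list is linear time, while the same test on a set/hash table is effectively constant time.

    import time

    items = list(range(1_000_000))
    as_set = set(items)

    def timed(label, fn):
        start = time.perf_counter()
        fn()
        print(label, time.perf_counter() - start)

    # The list scans all million elements on a miss; the set does one hash lookup.
    timed("list:", lambda: -1 in items)
    timed("set: ", lambda: -1 in as_set)

Same data, same question, wildly different cost - and picking the right structure is often a one-line change once you know it exists.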
One thing I would highly recommend before getting into the mobile space, however, is looking into design patterns. The fundamental book on this is Elements of Reusable Object-Oriented Software by the Gang of Four, but there are some other ones I've found which are pretty handy. Game Programming Patterns is freely available; it's a bit domain-specific, but it details a lot of patterns really nicely. JavaScript Design Patterns covers them nicely as well and is also freely available.
Get an MS degree. I had a BA in Psych and went straight for an MS in Comp Sci. Not every school will allow it, and the ones that will will require you to take a couple of undergrad courses and pass them with a B or higher.
That being said - if you still prefer to go the no-degree route, you're going to have a somewhat tough time with interviews even though you may perform the job itself well. Most software engineering interviews revolve around things like how a hash map works and the properties of binary trees - information you aren't usually going to get from the "build your own iPhone app" type books. So I would recommend:
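To give a flavor of the "how does a hash map work" staple, here's a toy version - separate chaining over a fixed array of buckets (a Python sketch for illustration only; real implementations also resize as buckets fill, which is exactly the follow-up an interviewer will ask about):

    class ToyHashMap:
        # Hash the key to pick a bucket, then scan that (short) bucket.
        def __init__(self, n_buckets=8):
            self.buckets = [[] for _ in range(n_buckets)]

        def _bucket(self, key):
            return self.buckets[hash(key) % len(self.buckets)]

        def put(self, key, value):
            bucket = self._bucket(key)
            for i, (k, _) in enumerate(bucket):
                if k == key:                # key already present: overwrite
                    bucket[i] = (key, value)
                    return
            bucket.append((key, value))

        def get(self, key):
            for k, v in self._bucket(key):
                if k == key:
                    return v
            raise KeyError(key)

    m = ToyHashMap()
    m.put("answer", 42)
    print(m.get("answer"))  # -> 42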
Good luck!
I've taught a lot of people how computers work, or more precisely how to program them. I am sure you can learn too.
First, let's make it fun.
There is a lot of material for people who like the Raspberry Pi out there that is fun and simple. You don't even need to own a Raspberry Pi to understand what they're talking about.
It's fun and simple because it's designed for youngsters who find long/complex books a bit too boring. I think you might enjoy it, because you've said you've found the books you've tried too boring.
Here is a load of magazines about the Pi - on each issue you can click on "Get Issue" and then under the cover "download the PDF" and read it and see if you enjoy that.
Next, have a play with Scratch. It's designed for kids but the exact same concepts are in professional programming languages.
The reason I recommend it is not because I think you are a child, but because it's a lot of fun and makes a lot of the dull and boring bits of programming go away so you can focus on the fun bits.
You have to remember that all the things going on inside a computer are basically the same things going on in Scratch - just a lot more complex.
If you ever want to learn a programming language that professional developers use, I think you'll like Ruby.
It's very forgiving for new developers, but still lets you write what we would call "production grade" code. It's what I work in most days.
Also, Why's (Poignant) Guide to Ruby is quite funny, but you might find it a bit weird and confusing - I know I did the first time I read it. :-)
I also recommend this book to you: Code by Charles Petzold. The first few chapters don't seem like they're about computers, because they talk about flags and electrical circuits - that's because you need to understand those things first.
If you can read and understand the whole thing you will know more about how computers work than half of the professional software engineers out there. And they're normally quite a clever bunch.
If you find it too difficult, slow down and think. Each paragraph has something in it worth thinking about and letting it mull over in your mind.
IQ is not a measure of how much you can learn, but perhaps of how quickly you can see patterns and understand things.
You having a lower IQ than somebody else does not mean you can't see those patterns or understand things, it just means it might take you a little more thinking to get there. I'm sure you will.
If you ever have any questions about computers, I'd love to try and help answer them - feel free to ask me.
There was a book I read 40 years ago that covered basically everything from vacuum tubes and semiconductors up to chips. It was in the library, and it was like 800 pages long. I asked on Reddit if anyone knew what it was, and someone pointed me at the newest edition. I don't really have time to go through all my comment history looking for "electronics book" (or to write a program to do the same), but you should feel free to do so. :-) Then I got into assembly for the 8-bit CPUs, picked up the 16-bit and 32-bit CPUs of the day, and the mainframe stuff. Then I went back to school. :-)
However, all that said, this looks like what I read, and the intro sounds like he's describing the first edition I remember: https://smile.amazon.com/Electronic-Devices-Circuit-Theory-11e-ebook/dp/B01LY6238B/ref=mt_kindle
If you want more about assembler, just flipping through this, it seems to start with the very fundamentals and go through a fair amount. https://smile.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=sr_1_5 If you already know how to program, and you understand the basics of how (for example) basic assembler language works, how the chip accesses memory, what an interrupt does, and so on, then learning new assembler languages is pretty straightforward. Sort of like "I know Java, now I need to learn C#."
But honestly, at this point, I'd look online. When I learned all this stuff, textbooks were the way to go. Nowadays, everything moves so fast that you're probably better off finding a decent description online, or looking up an online class or something and seeing what texts they use.
If you don't want to learn assembler or hardware, but you still want to challenge yourself, the other thing to look into is unusual programming languages and operating systems. Things that are unlike what people now use for doing business programming. Languages like APL (or "J"), or Hermes, or Rust, or Erlang, or Smalltalk, or even Lisp or Forth if you've been steeped in OOP for too long. Operating systems like Eros or Amoeba or Singularity. Everything stretches your mind, everything gives you tools you can use in even the most mundane situations, and everything wonderful and wild helps you accept that what you're doing now is tedious and mundane but that's where you're at for the moment. :-) (Or, as I often exclaim at work, "My kingdom for a Java list comprehension!")
I'm assuming you're looking for things geared toward a layman audience, and not textbooks. Here's a few of my personal favorites:
Sagan
Cosmos: You probably know what this is. If not, it is at once a history of science, an overview of the major paradigms of scientific investigation (with some considerable detail), and a discussion of the role of science in the development of human society and the role of humanity in the larger cosmos.
Pale Blue Dot: Similar themes, but with a more specifically astronomical focus.
Dawkins
The Greatest Show on Earth: Dawkins steers (mostly) clear of religious talk here, and sticks to what he really does best: lays out the ideas behind evolution in a manner that is easily digestible, but also highly detailed with a plethora of real-world evidence, and convincing to anyone with even a modicum of willingness to listen.
Hofstadter
Gödel, Escher, Bach: An Eternal Golden Braid: It seems like I find myself recommending this book at least once a month, but it really does deserve it. It not only lays out an excruciatingly complex argument (Gödel's Incompleteness Theorem) in as accessible a way as can be imagined, and explores its consequences in mathematics, computer science, and neuroscience, but it is also probably the most entertainingly and clearly written work of non-fiction I've ever encountered.
Feynman
The Feynman Lectures on Physics: It's everything. Probably the most detailed discussion of physics concepts that you'll find on this list.
Burke
Connections: Not exactly what you were asking for, but I love it, so you might too. James Burke traces the history of a dozen or so modern inventions, from ancient times all the way up to the present. It focuses on the unpredictability of technological advancement, and how new developments in one area often unlock advancements in a seemingly separate discipline. There is also a documentary series that goes along with it, which I'd probably recommend over the book. James Burke is a tremendously charismatic narrator and it's one of the best documentary series I've ever watched. It's available semi-officially on YouTube.
I had the same problem. I ride the subway every day and have a ton of time to read, so I've been trying to collect similar resources.
Here are some resources I found really helpful:
This is a VERY basic (think: learn how to code!) introduction to Unity. I personally found it too elementary, having coded in a few different languages, but it might be a good place to start as it explains basic Unity design concepts like Components, Materials, Colliders, etc.
This is by far the best 'getting started' tutorial I've found. It walks you through creating a really basic project from scratch using Unity basics and scripts. This is what I based most of my code on when I first started my project.
This has been the most helpful resource for me. It's not Unity-specific, but it will teach you A TON of great fundamentals for things like how to move a character, common patterns like state machines, how to handle AI, etc. All of these concepts will be relevant, and many are already in place in Unity, so you'll recognize them right away.
Advanced: Game Programming Patterns - http://gameprogrammingpatterns.com/
This is a book (online/pdf/epub) that teaches the more advanced patterns you'll be applying in your code. I'd suggest it once you're finished with the above resources and have been working through your game for a bit.
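As a taste of that book's state machine chapter: the pattern is just "the entity delegates its behavior to a current-state object, and events swap that object out." Unity scripting is C#, but the shape is language-agnostic - here's a sketch in Python with made-up states:

    class Idle:
        def handle(self, entity, event):   # each handler returns the next state
            if event == "enemy_spotted":
                print(f"{entity}: attacking!")
                return Attacking()
            return self

    class Attacking:
        def handle(self, entity, event):
            if event == "enemy_dead":
                print(f"{entity}: back to idle.")
                return Idle()
            return self

    state = Idle()
    for event in ["enemy_spotted", "enemy_dead"]:
        state = state.handle("guard", event)

The win is that each state's logic lives in one small class instead of a tangle of if/else flags on the entity.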
My suggestion is to open-source it under the GPL. That would mean people can use your GPL code in commercial enterprises, but they can't resell it as commercial software without paying for a license.
By open-sourcing it, people can verify your claims and help you improve the software. You don't have to worry about languishing as an unknown, or taking venture capital and perhaps ultimately losing control of your invention in a sale or IPO. Scientists can use it to help advance knowledge, without paying the large license fees that a commercial owner might charge. People will find all sorts of uses for it that you never imagined. Some of them will pay you substantial money to let them turn it into specialized commercial products; others will pay you large consulting fees to help them apply the GPL version to their own problems.
You could also write a book on how it all works, how you figured it out, the history of your company, etc. If you're not a writer you could team up with one. Kurzweil and Jeff Hawkins have both published some pretty popular books like this, and there are others about non-AGI software projects (eg. Linux, Doom). If the system is successful enough to really make an impact, I bet you could get a bestseller.
Regarding friendliness, it's a hard problem that you're probably not going to solve on your own. Nor is any large commercial firm likely to solve it on their own; in fact they'll probably ignore the whole problem and just pursue quarterly profits. So it's best to get it out in the open, so people can work on making it friendly while the hardware is still weak enough to limit the AGI's capabilities.
This would probably be the ideal situation from a human-survival point of view. If someone were to figure out AGI after the hardware became more powerful than the human brain, we'd face a hard-takeoff scenario with one unstoppable AGI that's not necessarily friendly. If instead the software is in a lot of hands while we're still waiting for Moore's Law to catch up to the brain, we get a much more gradual approach, we can work together on getting there safely, and when AGI does get smarter than us there will be lots of them, with lots of different motivations. None of them will be able to turn us all into paperclips, because doing so would interfere with the others, and they won't allow it.
Here are some different sources for different aspects of computers.
The book Code: The Hidden Language of Computer Hardware and Software is an excellent introduction to the low-level concepts which modern CPUs are built on.
Link hopping on Wikipedia is a totally viable method to learn many aspects of computers. Start at some page you know about, like Graphics Cards or Internet, and just keep reading and clicking links.
Hacking challenges are a great way to learn about how computers work since they require you to have enough knowledge to be able to deliberately break programs. https://picoctf.com/ is an excellent choice for beginner- to intermediate-level challenges. https://overthewire.org/wargames/ also has some good challenges, but they start off harder and progress quickly. Note that these challenges will often require some programming, so learning a powerful language like Python will be very helpful.
This site is not very active anymore, but the old posts are excellent. It's very complex and advanced though, so it's not a good place to start. https://www.realworldtech.com/
In general, google will be your best friend. If you run into a word or program or concept you don't know, google it. If the explanations have more words you don't know, google them. It takes time, but it's the best way to learn on your own.
Hooray, a question I can answer!
One of the problems here is that the question is worded backwards. Binary doesn't combine to give us programming languages. So the answer to your question is somewhat to the contrary: programming languages were invented to ease the tedium of interfacing using binary codes. (Though it was still arguably tedious to work on e.g. punched cards.) Early interfaces to programming machines in binary took the form of "front panels" with switches, where a user would program one or several instructions at a time (depending on the complexity of the machine and the front panel interface), using the switches to signify the actual binary representation for the processor functions they desired to write.
Understanding how this works requires a deeper understanding of processors and computer design. I will only give a very high-level overview of this (and others have discussed it briefly), but you can find a much more layperson-accessible explanation in the wonderful book Code: The Hidden Language of Computer Hardware and Software. This book explains Boolean logic, logic gates, arithmetic logic units (ALUs) and more, in a very accessible way.
Basically, logic gates can be combined in a number of ways to create different "components" of a computer, but in the field of programming languages, we're really talking about the CPU, which allows us to run code to interface with the other components in the system. Each implementation of a processor has a different set of instructions, known as its machine code. This code, at its most basic level, is a series of "on" or "off" electrical events (in reality, it is not "on" and "off" but high and low voltages). Thus, different combinations of voltages instruct a CPU to do different things, depending on its implementation. This is why some of the earliest computers had switch-interfaces on the front panel: you were directly controlling the flow of electricity into memory, and then telling the processor to start executing those codes by "reading" from the memory.
It's not hard to see how programming like this would be tedious. The configuration for even a simple problem could easily fill a book, and someone entering the code from that book could easily input it improperly.
So eventually as interfacing with the machine became easier, we got other ways of programming them. What is commonly referred to as "assembly language" or "assembler" is a processor-specific language that contains mnemonics for every binary sequence the processor can execute. In an assembly language, there is a 1:1 correlation between what is coded, and what the processor actually executes. This was far easier than programming with flip-switches (or even by writing the binary code by hand), because it is much easier for a human to remember mnemonics and word-like constructs than it is to associate numbers with these concepts.
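To make that 1:1 idea concrete, here's a toy illustration (an invented three-instruction machine in Python - not any real CPU): the "assembler" is nothing more than a lookup table from mnemonics to numeric opcodes, and the "processor" just dispatches on those numbers.

    # Invented instruction set: one accumulator, three instructions.
    OPCODES = {"LOAD": 0b01, "ADD": 0b10, "HALT": 0b11}

    def assemble(source):
        # Translate mnemonics 1:1 into (opcode, operand) pairs.
        program = []
        for line in source:
            op, *arg = line.split()
            program.append((OPCODES[op], int(arg[0]) if arg else 0))
        return program

    def run(program):
        acc = 0
        for opcode, operand in program:
            if opcode == OPCODES["LOAD"]:
                acc = operand
            elif opcode == OPCODES["ADD"]:
                acc += operand
            elif opcode == OPCODES["HALT"]:
                break
        return acc

    print(run(assemble(["LOAD 2", "ADD 3", "HALT"])))  # -> 5

A real assembler also resolves labels and addressing modes, but the essence is the same: each mnemonic stands for exactly one bit pattern the processor understands.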
Still, programming in assembly languages can be difficult. You have to know a lot about the processor. You need to know what side-effects a particular instruction has. You don't have easy access to constructs like loops. You can't easily work with complex datatypes that are simple to express in other languages -- you are working directly with the processor and the attached memory. So other languages have been invented to make this easier. One of the most famous of these, a language called "C," presents a very small core language -- so it is relatively easy to learn -- but allows you to express concepts that are quite tedious to express in assembler. As time has gone on, computers have become much faster, and we've created and embraced many languages that further and further abstract away any knowledge of the hardware they are running on. Indeed, many modern languages are not compiled to machine code, but are instead interpreted by a compiled binary.
The trend here tends to be making it easier for people to come into the field and get things done fast. Early programming was hard, tedious. Programming today can be very simple, fun and rewarding. But these languages didn't spring out of binary code: they were developed specifically to avoid it.
TL;DR: People keep inventing programming languages because they think programming certain things in other ones is too hard.
Reading some books would be a good idea.
The following are textbooks:
General AI
Machine Learning
Statistics for Machine Learning
There are many other topics within AI which none of these books focus on, such as Natural Language Processing, Computer Vision, AI Alignment/Control/Ethics, and Philosophy of AI. libgen.io may be of great help to you.
Programming Game AI by Example has a great, easy-to-understand explanation and walkthrough for learning ANNs: http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782
Once you've learned at least ANNs, you can delve into the popular approaches to GAI:
My biggest recommendation: read about all of them! Try to understand why scientists pursue each approach. Having a solid understanding of the motivation behind each approach will make it much easier for you to decide which path to pursue. I recommend the following books:
It looks like the Deep Learning folks have already pitched in, so I'll trust their recommendations are good on that end. Read their stuff too, then decide for yourself. You'll find that there are fanatics from every branch, and they'll claim that their way is the only way. This is the only thing I can tell you for certain: no one has proven that any approach is better than the others just yet, and anyone who claims they have needs to remind themselves of the no-free-lunch theorem [http://en.wikipedia.org/wiki/No_free_lunch_in_search_and_optimization].
That's one of my favorite popular science books, so it's wonderful to hear you're getting so much out of it. It really is a fascinating topic, and it's sad that so many Christians close themselves off to it solely to protect their religious beliefs (though as you discovered, it's good for those religious beliefs that they do).
As a companion to the book you might enjoy the Stated Clearly series of videos, which break down evolution very simply (and they're made by an ex-Christian whose education about evolution was part of his reason for leaving the religion). You might also like Coyne's blog, though these days it's more about his personal views than it is about evolution (but some searching on the site will bring up interesting things he's written on a whole host of religious topics from Adam and Eve to "ground of being" theology). He does also have another book you might like (Faith Versus Fact: Why Science and Religion are Incompatible), though I only read part of it since I was familiar with much of it from his blog.
> If you guys have any other book recommendations along these lines, I'm all ears!
You should definitely read The Selfish Gene by Richard Dawkins, if only because it's a classic (and widely misrepresented/misunderstood). A little farther afield, one of my favorite popular science books of all time is The Language Instinct by Steven Pinker, which looks at human language as an evolved ability. Pinker's primary area of academic expertise is child language acquisition, so he's the most in his element in that book.
If you're interested in neuroscience and the brain you could read How the Mind Works (also by Pinker) or The Tell-Tale Brain by V. S. Ramachandran, both of which are wide-ranging and accessibly written. I'd also recommend Thinking, Fast and Slow by psychologist Daniel Kahneman. Evolution gets a lot of attention in ex-Christian circles, but books like these are highly underrated as antidotes to Christian indoctrination -- nothing cures magical thinking about the "soul", consciousness and so on as much as learning how the brain and the mind actually work.
If you're interested in more general/philosophical works that touch on similar themes, Douglas R. Hofstadter's Gödel, Escher, Bach made a huge impression on me (years ago). You might also like The Mind's I by Hofstadter and Daniel Dennett, which is a collection of philosophical essays along with commentaries. Books like these will get you thinking about the true mysteries of life, the universe and everything -- the kind of mysteries that have such sterile and unsatisfying "answers" within Christianity and other mythologies.
Don't worry about the past -- just be happy you're learning about all of this now. You've got plenty of life ahead of you to make up for any lost time. Have fun!
Thanks so much!
Where do hobbies and interests go? Below Education somewhere? Sample stuff I could add:
I had sort of planned to put all this stuff in my personal website - write ups of personal projects, a good reads feed, an "About me" section, and maybe a page of my sewing/knitting creations.
I'll certainly look into adding some more personality into the resume design, it is currently the result of a google template, which is pretty blah.
Again, thanks so much for your feedback! It's been really helpful!
Oi. Disclaimer: I haven't bought a book in the field in a while, so there might be some new greats that I'm not familiar with. Also, I'm old and have no memory, so I may very well have forgotten some greats. But here is what I can recommend.
I got my start with Koblitz's Course in Number Theory and Cryptography and Schneier's Applied Cryptography. Schneier's is a bit basic, outdated, and erroneous in spots, and the guy is annoying as fuck, but it's still a pretty darned good intro to the field.
If you're strong at math (and computation and complexity theory) then Oded Goldreich's Foundations of Cryptography Volume 1 and Volume 2 are outstanding. If you're not so strong in those areas, you may want to come up to speed with the help of Sipser and Moret first.
Also, if you need to shore up your number theory and algebra, Victor Shoup is the man.
At this point, you ought to have a pretty good base for building on by reading research papers.
One other note: two books that I've not looked at, but that are written by people I really respect, are Introduction to Modern Cryptography by Katz and Lindell and Computational Complexity: A Modern Approach by Arora and Barak.
Hope that helps.
There are a ton of books, but I guess the main question is: what are you interested in, concepts or examples? Many strong conceptual books use examples from Java, C++ and other languages; very few of them use PHP for examples. If you have the ability to comprehend other languages, then:
http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&qid=1322476598&sr=8-1 - definitely a must-read. Beware of trying to memorize it; it is more like a dictionary. It should be pretty easy to read, a little harder to comprehend, and you need to work with the patterns presented in the book.
http://www.amazon.com/PHP-5-Objects-Patterns-Practice/dp/1590593804 - has already been mentioned and is directly related to the one above, so it should be easier to grasp.
http://www.amazon.com/Patterns-Enterprise-Application-Architecture-Martin/dp/0321127420/ref=sr_1_1?ie=UTF8&qid=1322476712&sr=8-1 - one of the most amazing books I have read in some time. Needs a lot of time and good prior knowledge.
http://www.amazon.com/Refactoring-Improving-Design-Existing-Code/dp/0201485672/ref=sr_1_4?ie=UTF8&qid=1322476712&sr=8-4 - another interesting read; unfortunately I cannot give details because I haven't had the time to read it all.
Here's pretty much your most basic flow for problem 3:
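Assuming "problem 3" is the classic largest-prime-factor exercise (an assumption on my part), a minimal sketch of that flow - in Python for brevity, though the same loop maps straight onto Java:

```python
def largest_prime_factor(n):
    # Strip out each divisor as it's found; whatever remains once
    # divisor * divisor exceeds n is the largest prime factor.
    divisor = 2                        # the "loop counter" to watch
    while divisor * divisor <= n:      # the "loop end target"
        if n % divisor == 0:
            n //= divisor              # found a divisor; reduce n
        else:
            divisor += 1
    return n

# Project Euler #3's input, if that's the problem meant; prints 6857.
print(largest_prime_factor(600851475143))
```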
To troubleshoot, use a debugger (Eclipse's builtin is nice). If you feel it's taking too long, break the program's execution and check its state. Is the loop counter about where it should be? Are the found divisors plausible? Is the loop end target plausible? Set a breakpoint on the first line inside the loop and keep stepping through (either one line at a time if you like, or just hit 'resume' and it will break again at the top of the next loop iteration).
I learned Java throughout college, as it was the primary teaching language. Honestly, the best way to learn is just to WRITE CODE. Solve problems that you don't know how to solve. Invent random things that are useful to you. Your code doesn't have to be perfect when you're learning (and it definitely won't be!), and what is important is that you constantly look for ways to improve. I want you to look back on code you've written a year ago, and think that it's absolute crap - that will show that you are learning and improving.
Somewhat counter-intuitively, the best resources are books! I'll list some recommendations below.
Keep these principles in mind:
Once you start getting a feel for Java, which I think you might be already, Effective Java is the book. You probably won't need any other resource for Java itself.
If you need help with something, Reddit is a great place, but Stack Overflow is the programmer's mecca. The best resource on the web for just about everything software, and their sister sites cover other topics, from IT/sysadmin to gaming.
I haven't read it, but it looks pretty good. I can personally vouch for this book however:
https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319/ref=pd_sim_14_2?_encoding=UTF8&psc=1&refRID=C8KMCKES83EHGXS3VWSQ
It's truly amazing. I'm currently an EE PhD student but I had a pretty limited background in digital hardware and computer architecture, and I read most of this book just out of interest a little while ago and frankly learned quite a bit. It's written at a very readable level for anyone with almost no prior knowledge, yet gets technical when it needs to. It's very thorough, but approaches the topics at a wonderful and easy pace with very clear explanations. The author even says you can skip some of the more technical details if they're not of interest to you, and you'll still end up learning quite a lot. The book you posted looks pretty similar, so I'd say it's worth a shot.
The amount of planning you have to do scales with the complexity of the project.
Professors drill the importance of planning, documentation and unit testing into students because it is extremely important; once you start your career, being a poor planner is going to come back to haunt you.
However, when you're working on a simple project that's not intended for public release you don't have to go overboard with docs unless you just want to practice.
My own process usually starts with me jotting down an idea; I find that writing it out helps me to get a better grasp on the overall feasibility.
Once I'm satisfied that I actually have something I can implement I'll diagram the flow of the application, and maybe do some wire-frames.
I usually find that this is enough of a launching pad for a simple personal project.
Professional projects are a different ballgame, because as I said, the amount of planning you have to do scales with the complexity and size of the project. It's in the professional environment that all of the things your professors are teaching you will become really important.
So, to answer what I think was your question,
>So how does one end up with 20 classes connected with each other perfectly and a build file that set everything up working flawlessly with unit test methods that check every aspect of the application?
This comes about more in the implementation phase than the planning phase. I've heard it said that in war "no plan survives contact with the enemy" and you'll find this to be true in software development as well. Even when you plan really well you'll sometimes have to go back to the drawing board and come up with a new plan, but that's just part of the process.
Some books that I recommend on the topic are Hackers and Painters by Paul Graham, and I think every software dev should have a copy of Design Patterns.
The former is a collection of essays that might give you some useful perspective on the process of writing software.
The latter is more of a reference book, but it's helpful to become familiar with the patterns covered in the book so that you don't find yourself re-inventing the wheel every time you begin a new project.
As for the other part of your question (apologies for addressing them out of order)
>My new "bottleneck" when writing code is the structure. I end up having huge classes with way too many public methods. I might as well just write a script with everything in one file. Almost anyway... I try to write OO, but I often get lazy and just end up with not very elegant systems, I would say.
Don't be lazy, because as you're already seeing, it comes back to bite you in the ass.
As you're writing your code you have to be mindful of the complexity of the project as it grows around you, and you have to periodically take a step back and look at what you've created, and re-organize it. This kind of goes back to what I was saying earlier about no plan surviving enemy contact.
So when you find yourself creating a new class that you hadn't thought about, be mindful of where you put it.
Should you create a new file (yes, of course you should), new folder?
Do you have a bunch of similar classes doing the same thing? Should they inherit from one another?
Be especially mindful of copying and pasting from one area of your code to another; generally speaking, if you're doing this you should probably be writing a function or using inheritance.
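For instance, a toy sketch of that point (the names here are made up for illustration):

```python
def save_user(name):
    print("saved user:", name)    # stand-in for real persistence

def is_valid_name(name, max_len=40):
    # One place to change the validation rule, instead of the same
    # if-statement copy-pasted everywhere a name gets saved.
    return bool(name and name.strip()) and len(name) <= max_len

for candidate in ["alice", "   ", "x" * 50]:
    if is_valid_name(candidate):
        save_user(candidate)      # only "alice" passes
```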
It's up to you as the developer to make sure your project is organized, and nowadays it's really easy to learn how best to organize code by looking through other people's projects on GitHub, so there's really no excuse for it.
Hope that helps, good luck.
Sometimes depression isn't about circumstances or perspective, it's entirely chemical.
I've thought of the answers to these questions hundreds of times. But the thing about depression is that sometimes it makes it impossible to find a happy answer.
Here are my thoughts:
I'll try to keep the philosophical bit short, because I could really lose myself in a rant otherwise :p
I think that an existentialist philosophy can have a lot to offer on a human level. In a way, everything is functionally meaningless, in the sense that so much meaning is beyond our understanding. However, I think that rather than a complete lack of meaning in the universe, meaning is an inherent part of the universe.
To quote Hofstadter (taken from page P-6 of Gödel, Escher, Bach):
>Shouldn't meanings that one chooses to read into strings of meaningless symbols be totally without consequence?
>Something very strange thus emerges from the Gödelian loop: the revelation of the causal power of meaning in a rule-bound but meaning-free universe.
Basically, by the way that matter is related to other matter, meaning emerges from even the tiniest connection. And that meaning can push matter around in the same way that matter can cause meaning. Ideas are not only meaningful, they have causative power. I think that's pretty cool!
So basically, I agree that humans may have insignificance on some scale. But in the grand scheme of things, there is something so much more magnificent that we are a part of.
Anyway...while I appreciate your thoughts and respect your desire to help others, I think that you are a bit misinformed. That's okay! It's nearly impossible for someone who hasn't experienced depression to know what it's like. But there are ways to better understand and help. Here is a great resource from /r/SuicideWatch that shares some ways that you can connect with depressed or suicidal people. I think it may help a lot!
Oh, and sorry for what turned out to be a philosophical rant anyways; I just can't resist invoking Hofstadter and isomorphism in the face of existentialism :p
I started by showing my son Scratch when he was 9.5yo and helping him make a couple of arcade games with it. He was never all that interested in Logo, but got really turned on by Scratch. After a couple of months he was frustrated by Scratch's limitations, and so I installed Ubuntu on an old computer, showed him pygame/python and worked through a couple of online tutorials with him, and let him loose.
He learned to use Audacity to edit files from Newgrounds, and Gimp to edit downloaded graphics as well as create his own. He made a walk around, rpg-like adventure game, a 2D platformer, and then decided he wanted to learn pyggel and has been working on a 3D fps since last summer.
Soon, I'm going to get him started on C++ so we can work through a book on game AI (which uses C++ for all its examples). He's 13.5 now, and thinks programming is great and wants to grow up to be a programmer like his mom :)
I highly recommend a simple language like python for a beginner, but Scratch is wonderful for learning all the basic concepts like flow control, variables, objects, events, etc, in a very fun and easy way. The Scratch web site also makes it simple to share or show off because when you upload your program it gets turned into a Java applet, so anyone with a browser can see what you've done.
I recommend reading: The User Illusion by Tor Norretranders, Gödel, Escher, Bach by Douglas R. Hofstadter, and I Am a Strange Loop, also by Douglas R. Hofstadter, for some interesting reading on the subject (warning: Gödel, Escher, Bach isn't for everyone - it's a bit strange, but I love it). I read a lot of books on science in general and, based on that, it seems like many believe consciousness and free will are just illusions. In fact, just a few days ago, physicist Brian Greene sorta-kinda said as much in his AMA - granted, he's talking specifically about free will and not consciousness per se, but I think the two must be very related.
I, too, believe in God and also have a very strong belief in and enthusiasm for science, so this is an especially fascinating question for me.
BTW: if you're interested in the way the brain works in general, I highly recommend How the Mind Works by Steven Pinker.
I think this might be beyond what you're looking for, but I really enjoyed Pattern Recognition and Machine Learning. It's very heavy on statistics, and if you're looking into machine learning methods, it has a wonderful amount of mathematical information given in a fairly clear manner. It might be severe overkill if this isn't your field, but I thought I'd mention it since you said AI.
For AI in general, I see Artificial Intelligence: A Modern Approach used a lot. It gives some solid basic concepts, and will be helpful in getting you started writing basic AI in your applications.
I can't really recommend discrete math because, despite enjoying it quite a bit, I haven't found a textbook that I like enough to endorse. My textbook for it in college was by Rosen, and I despised it.
edit:
Just double checked it, and I would stay far away from the first recommendation unless you have a very extensive knowledge of sophisticated statistics. I like it because it gives the math that other books gloss over, but it's not good for an introduction to the subject. It's almost like going through a bunch of published papers on some new cutting edge methods. The ever popular Machine Learning by Thomas Mitchell is a much better introduction to machine learning. If you want to obtain the mathematical depth necessary for your own research into the field, go with the other book after you've gotten acquainted with the material. I'll leave my suggestion up anyway in case anyone here might find it interesting.
One book that I didn't see mentioned in a casual skim of the posts is Off to Be the Wizard:
https://www.amazon.co.uk/Off-Be-Wizard-Magic-2-0/dp/1612184715
A very silly series where a modern day guy ends up in an alternate dimension where he can do magic/control the world via programming. Super light reads, fun and funny, and pulls in your computer interest. If you enjoy the first one, you can pick up the others.
If you want something a bit meatier, check out some Douglas Hofstadter.
Le Ton Beau de Marot (it's in English) is about the process and problems of translating languages, and makes surprisingly good bathroom reading because the chapters are short. He starts the scope small, talking about whether to focus on literal meaning or the spirit of the words, and then brings in more concepts like artificial constraints (poetry, or even writing without certain letters, for one example). It is philosophical, informative, and amusing. https://www.amazon.co.uk/dp/B012HVQ1R0/ref=cm_sw_r_cp_awdb_L2sgAbDYFK1XK
He also wrote Gödel, Escher, Bach: An Eternal Golden Braid. https://www.amazon.co.uk/dp/0465026567/ref=cm_sw_r_cp_awdb_b3sgAbQ79TTGS Better writers than I have written reviews; this one is from Amazon:
>Twenty years after it topped the bestseller charts, Douglas R Hofstadter's Gödel, Escher, Bach: An Eternal Golden Braid is still something of a marvel. Besides being a profound and entertaining meditation on human thought and creativity, this book looks at the surprising points of contact between the music of Bach, the artwork of Escher, and the mathematics of Gödel. It also looks at the prospects for computers and artificial intelligence (AI) for mimicking human thought. For the general reader and the computer techie alike, this book still sets a standard for thinking about the future of computers and their relation to the way we think.
Sure, I really enjoy these podcasts.
As for books, here are some tech books I have read and enjoyed:
There's obviously a ton of other books, but those immediately come to mind.
These are books I actually own and would recommend. Of course there are other great/better books out there, but I'm going to stick with what I've actually bought and read or "read".
I say "read" because several books are NOT meant to be read cover-to-cover. These typically have about 1/3 that you should read like normal; you then skim the rest and know what's in it so that you can quickly reference it. These books are no less important, and often even more important. I've marked this kind of book as #ref, for "read for reference". Normal books that should be read cover-to-cover are marked #read.
For learning your first language: this is really the hardest part, and unfortunately I don't have any books here I can vouch for. I started with "C++ for Dummies" and am not including a link because it's bad. Your best bet is probably "Learning <language>" by O'Reilly. I also love the O'Reilly pocket books because you can carry them and skim while on the bus or the john, but you can just do the same with your smartphone. Pocket Python, Pocket Java, Pocket C++
Top Recommendations:
Accelerated C++ #read Made for people who already know another language and want to pickup C++. Also great for people who need a refresher on C++. I really like how it doesn't start with OOP but gets you familiar with the imperative parts of C++ before diving into OOP.
The Algorithm Design Manual #ref This is my new favorite book and the first I would send back in time to myself if I could. Each algorithm & data structure is given a mathematical breakdown, pseudocode, implementation in very readable C, a picture (very helpful), and an interesting war story of how it Saved The Day.
Cracking the Coding Interview #read I originally avoided this book like the plague because it represented everything I hate about coding interviews, but many interviewers pull questions straight from this book, so this book can equal getting a job. Put that way, its ROI is insane.
The Pragmatic Programmer #read Must-have for any professional software engineer; it covers best practices for your code and your growth. You can also find the raw tips list here
Head First Design Patterns #read Many prefer the "GoF/Gang of Four" Design Patterns, which is more iconic, but Head First is a modern version using Java to cover the design patterns actually used day-to-day by programmers.
For Intermediates:
Effective Java or Effective C++ and Effective Modern C++ #read When you're ready to go deep into one language, these books will give you a huge boost to writing good Java and C++.
Design Patterns #ref You'll want to get this at some point, but early on it's too much for a beginner and many of the patterns are obsolete.
The Art of Computer Programming #ref The programming "bible" but like Design Patterns you should hold off on this iconic book until you've got your basics covered. It would make for a great purchase with your first paycheck or first promotion :)
I'm a ~10 year sysadmin that has decided to rebuild my software dev skills that I haven't used since college. Here's what I did to reawaken that part of my brain:
3.5) After going through the last chapters of C Primer Plus, I realized that some of my math skills were not up to par, so I took this MOOC from MIT to supplement that. No idea if that's something you need.
I'll start off with some titles that might not be so apparent:
Unexpected Fundamentals
These 2 books provide much-needed information about making reusable patterns and objects. These are life-saving things! They are not language dependent. You need to know how to do these patterns, and it shouldn't be too hard to figure out how to implement them in your chosen language.
Good General book
This book is great if you're going to make a browser based game
General Knowledge books
Provides a working, movable 3D model with C++ and DirectX. Very cool.
More general game base building
Working product results books, little if any modification needed
Releasing in a couple of months (hopefully): two very good books using C++ to develop by.
Not presented in the best manner but still noteworthy:
I used to love XNA... but now it's not feasible for commercial development. If you're a beginner to game design, starting out with XNA might actually be useful. It's easy to pick up and put out a working product. XNA is C#.
Working product books, modification needed to make run on current systems
Provides a working FPS game in C++ on DirectX 9. Good for some starting out knowledge for an FPS
Good for 3D Terrain rendering in DX9...however much of this is outdated...some concepts still apply, and it's not the worst idea to see a working example.
TLDR: Click links starting at top, buy, read, profit
This is one of those questions that basically yields no useful answers.
Not sure if I answered your question. You can check out topics on design patterns; there's a good book on Amazon for that, I think just called Design Patterns. Or check out tutorialpoint.com for software architecture.
Design Patterns Book
Software Architecture Quick Overview
edit: added resources and changed format
Thank you so much for your reply. I actually do plan on taking Andrew Ng's course, just because the book I am talking about is very limited to Python, but I've heard great things about it. However, the Stanford course I was referring to was the Statistical Learning course based on the ISL book.
Yes I plan on doing some kaggle challenges once I feel comfortable with my skills to build up my portfolio or see if I can find some other novel projects to work on.
Ideally I'd like to be in a data science consultancy type role where I get to work on different kinds of projects and don't necessarily need very specialized domain knowledge. But at this point I think more direction as to what kinds of roles exist would also be helpful. I just don't know what the field is actually like, and I've never really met anyone doing data science for a living.
Thank you again for your reply. It was very helpful.
Here are my two big enthusiastic suggestions.
Sign up to Lynda. You get a 10-day free trial, and then it's $30 for a month. Watch Foundations of Programming: Fundamentals (4h 47m), Foundations of Programming: Data Structures (2h 29m), Programming Fundamentals in the Real World (3h 8m), and Python 3 Essential Training (6h 36m).
These are short crash courses, you obviously don't walk away a full-on programmer. But the main instructor Simon Allardice is excellent at explaining the basics and fundamentals in a very clear, natural way. I have taken university courses, I have watched MIT and Harvard courses, I have used a dozen tutorial sites and watched a bunch of lecturers and read three dozen books: the Lynda programs I linked are the best first-intro things I've seen. I strongly recommend that you watch at least the first two.
You might not understand it all, that's fine. Don't worry about what languages he uses for examples, 90% of stuff carries over between languages. If you can absorb a good chunk of that material it'll be a huge boost for you and a nice foundation to start on. You'll walk into your first real class in a better headspace, already knowing the gist of what you're going to flesh out and properly sink your teeth into. And if you find that the Lynda stuff really works well, look up their C and databases courses, since you'll wind up using that stuff too.
My second recommendation is that you buy and read Charles Petzold's wonderful book Code: The Hidden Language of Computer Hardware and Software. This book doesn't focus on a specific programming language, or how to implement programs, or the mathematics, or the theory. Instead it answers "So what is a computer actually doing under there when we program? What's a CPU, exactly, how does it understand words and numbers?" It does this in a very natural, accessible, for-the-layman way, starting with really simple analogies about signal flags and morse code, and before you know it, bam, you understand logic gates and binary arithmetic, and a lot of the mystery and magic of computers has dissolved for you.
> Can you give me any more info on what types of things you simulate
There are so many different things. One example that involves physical simulation is rendering. Rendering - turning a 3D description of a scene into a 2D image - is all about simulating the physics of light transport. Given a set of lights and surfaces, you simulate how light bounces around and what a virtual observer placed somewhere in the scene would see. Another example is explosions. Cool/realistic-looking explosions for movies involve simulating burning materials, fluid/gas movement, sound propagation, fracture, plastic/non-plastic deformation; the list goes on and on.
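Just to make "simulating light transport" concrete, here's a toy sketch of my own (not from the books below): a single diffuse sphere, ray-cast in plain Python for brevity and shaded by one light. Real renderers do the same basic thing with millions of rays, multiple bounces, and measured materials:

```python
import math

# One sphere at (0, 0, 3), radius 1, viewed with rays shot along +z.
SPHERE_C, SPHERE_R = (0.0, 0.0, 3.0), 1.0
TO_LIGHT = (-0.577, 0.577, -0.577)  # unit vector from surface toward the light

def hit_sphere(ox, oy):
    # Ray origin (ox, oy, 0), direction (0, 0, 1): solve |p - c| = r.
    dx, dy = ox - SPHERE_C[0], oy - SPHERE_C[1]
    disc = SPHERE_R**2 - (dx*dx + dy*dy)
    if disc < 0:
        return None                                  # ray misses the sphere
    return (ox, oy, SPHERE_C[2] - math.sqrt(disc))   # nearest hit point

for y in range(-10, 11):
    row = ""
    for x in range(-10, 11):
        p = hit_sphere(x / 10.0, y / 10.0)
        if p is None:
            row += " "
        else:
            # Lambertian shading: brightness ~ cos(angle to the light).
            n = [(pi - ci) / SPHERE_R for pi, ci in zip(p, SPHERE_C)]
            lam = max(0.0, sum(ni * li for ni, li in zip(n, TO_LIGHT)))
            row += ".:-=+*#%@"[min(8, int(lam * 9))]
    print(row)
```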
Here are some books that might get you started in the right direction
As for programming languages, you're definitely going to need to learn C/C++. Graphics applications are very resource-intensive, so it's important to use a fast language. You'll probably also want to learn a couple of scripting languages like Python or Perl. You'll also need to learn some graphics APIs like OpenGL, or DirectX if you're on Windows.
I hope this helped!
I personally really benefitted from Jose Portilla's udemy class on python for Data Science: https://www.udemy.com/python-for-data-science-and-machine-learning-bootcamp. It deals with the machine learning algorithms at a pretty basic level but he does a good job overviewing things and this course personally gave me more confidence. He also wrote a helpful overview for how to become a data scientist: https://medium.com/@josemarcialportilla/how-to-become-a-data-scientist-2d829fa33aba
Additionally, I found this podcast episode from Chris Albon helpful: http://partiallyderivative.com/podcast/2017/03/28/learning-machine-learning
Finally, I have just started going through Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems and I love it. It's very easy to read and applicable: https://www.amazon.com/dp/1491962291/_encoding=UTF8?coliid=I1VIM81L3W5JUY&colid=2MMQRCAEOFBAX
Hope this helps.
For programming, what kind of programming is he into? Here are some cool programming books and things:
Well if you want to be the next Carmack, get cracking! :) You have a lot of ground to cover, such as: mathematics (matrices, linear algebra, etc), physics, artificial intelligence, real-time processing, multithreading, architecture, networking and protocols, rendering, sound, and much more!
It is certainly possible with enough time and dedication to develop your own engine. It's just that there are so many excellent engines already out there, that you would be competing with projects that have already invested many thousands of hours and have loads of titles already developed for them. Why not get involved with an existing project to start?
BTW I really like your idea of creating an FPS with one room and focusing on making that environment the richest possible, exploiting a wide variety of techniques. Do it!!
Is your ultimate goal to create an engine? Or to create a game? Remember, the engine is in many ways a means to an end - it's not much use without a game that uses it!
Either way, I think you would be well advised to get involved with one of the open source game engine projects, and start contributing. Once you've learned how they work, you will be in a much better position to design your own. And realistically, you can't really just design an engine without a game - you need to know how games work in the first place, and what features and architectural decisions and designs make for a good engine.
Consider joining:
Here's a list of good books to get you started:
The Reddit /r/gamedev wiki has a great list of resources:
There are lots of great videos on YouTube featuring Carmack's gamedev talks, so I highly recommend watching those too.
You can start out with some of the New Boston Android Tutorials - http://www.youtube.com/playlist?list=PL3D7BFF1DDBDAAFE5
The best way to learn is to pick a project and see it through to fruition. Go with something simple but not too simple, I'd recommend trying to make your own clone of this tip calculator. Don't just make it kind-of work, get it to where you would be proud to release it.
If you are completely new to programming it will be slow going at first, but there is no better time to learn than now. With some Google searches you can find hundreds of free online programming courses (MIT OpenCourseWare and UC Berkeley should get you started). You can google just about any problem and find someone who has encountered it before and solved it; sites like Stack Overflow have hit the mainstream with programmers, and it has become far easier to disseminate and learn best practices.
Also, Code by Charles Petzold is by far the best introduction I have ever read on computing theory, it has very little to do with conventional programming though.
Shamelessly stealing from one of the Amazon reviews:
> The average person who uses a computer to surf the web or type letters has so little knowledge of the underlying technology he or she is using that it may as well be magic. Even programmers, who typically spend their days solving problems with the high-end abstractedness of object-orientation, may be more than a little unclear about what's actually going on inside the box when their compiled code is running.
> Petzold attempts, and largely succeeds at, writing a book that leaves the reasonably intelligent layperson with a thorough comprehension of each layer that comprises a modern electronic computer (binary coding -> electronic representation -> transistors -> logic gates -> integrated circuits -> microprocessors -> opcodes -> assembly language -> high-level language -> applications). At times, the reader must follow along carefully, but Petzold tries to avoid needless complication.
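None of this code is from the book, but here's a toy Python sketch of exactly that layering: a single NAND primitive, gates built from NAND, an adder built from gates, and binary addition built from adders:

```python
def NAND(a, b): return 1 - (a & b)   # the one primitive "switch"

# Layer 1: basic gates out of NAND.
def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

# Layer 2: a full adder out of gates.
def full_adder(a, b, carry_in):
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

# Layer 3: multi-bit binary addition out of full adders.
def add(x, y, bits=8):
    carry, total = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        total |= s << i
    return total

print(add(100, 55))  # 155, computed entirely from NAND gates
```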
Depends on what you are interested in.
If you are interested in games, pick a game and do it. Most board games are not that hard to do a command line version. A game with graphics, input, and sound isn't too bad either if you use something like Allegro or SDL. Also XNA if you are on windows. A lot of neat tutorials have been posted about that recently.
If you are more interested in little utilities that do things, you'll want to look at a GUI library, like wxWidgets, Qt and the sort. Both Windows and Mac have their own GUI libraries not sure what Windows' is called, but I think you have to write it with C++/CLI or C#, and Mac is Cocoa which uses Objective-C. So if you want to stick to basic C++ you'll want to stick to the first two.
Sometimes I just pick up a book and start reading to get ideas.
This is a really simple Game AI book that is pretty geared towards beginners. http://www.amazon.com/Programming-Game-Example-Mat-Buckland/dp/1556220782/
I enjoyed this book on AI, but it is much more advanced and might be kind of hard for a beginner. Although, when I was first starting, I liked getting in over my head once in a while. http://www.amazon.com/Artificial-Intelligence-Modern-Approach-2nd/dp/0137903952/
Interesting topics to look up.
Data Structures
Algorithms
Artificial Intelligence
Computer Vision
Computer Graphics
If you look at even simple books in these subjects, you will usually find tons of small manageable programs that are fun to write.
EDIT: Almost forgot, I think a lot of these are Java based, but you can usually find a way to do it in C++. http://nifty.stanford.edu/ I think I write Breakout whenever I am playing with a new language. heh
> Why?
First, let's define what we're talking about. I haven't seen this Anime, but it shows someone jacking into VR via a plug in the back of their neck. So for the purposes of discussion, let's assume this is Matrix-level VR. That means a virtual reality that is literally indistinguishable from actual reality, via a plug into the back of the head.
In this exact form, this is impossible (see below). More extreme mechanisms (brain in a jar) might be possible, but that's currently a total unknown. And we won't have the tech for hundreds of years.
Why?
For starters, our command of biology is currently profoundly limited and progress is slow. We're justifiably proud of how far we've come, but we're still crude butchers. In the last few hundred years we discovered anesthesia and antibiotics, but we still fix people by cutting them with knives and literally sewing them back together. We don't understand how the brain works, much less have the ability to send it accurate information or read information back out.
So what do we need for Matrix-level VR? We need to:
Intercept and replace all information going into and out of the brain non-destructively.
This in itself is probably impossible. It's likely that Matrix-level VR will require removing the brain from a body, or severing the spinal column and doing all manner of damage to the face. Note that in the Matrix (and the OP's video), we bypass the senses via a plug in the back of the neck, with the idea being that you intercept all data going to and from the body via the spinal cord. But most of the data going into and out of your brain doesn't travel via the spinal cord. The eyes, ears, and nose have direct links to your brain. For instance, a jack on the back of your neck can't intercept and replace signals coming from your optic nerve. So that's just a bit of science-fiction fantasy that makes for convenient storytelling, like faster-than-light travel.
Perfectly replicate the entire nervous system and musculature of the human body in a computer and flawlessly simulate bidirectional nerve impulses to the brain.
In the Matrix, you can feel every muscle in your body. You can feel that hot sauce you just ate, or the need to take a shit or a piss. Your entire body is simulated and the nerve impulses going to the brain are indistinguishable from those of a real body. Moreover, the entire network of nerve firings required to say, walk, is flawlessly interpreted by the virtual body -- contracting all the correct muscle fibers and resulting in you having the grace to dance, or do Kung Fu.
We're probably 50 years from even having the compute power to model that, much less the technology to perfectly interface it with a human brain - and that's assuming we had the brain in a vat and didn't have to figure out how to intercept and replace all those nerve signals without harming the person.
Perfectly replicate the entire world in a machine.
We've been working on computer graphics for over 50 years, and we still haven't achieved real time photorealism, especially in stereo, at retinal resolution, at human max FOV. Let's say we get there in the next 20 years or so (highly optimistic), now we have the surface of things. We can render an apple that looks 100% convincing. The next 100 years or so will be doing everything else.
What's inside the apple? In order to fully simulate what can happen to an apple - how it responds to a knife or a tooth, the skin of it, the juice inside, the way it will bruise or rot, what a slice of it looks like under a microscope, and so on - you have to simulate it from first principles. Now imagine that you also have to perfectly replicate the way it feels in a virtual mouth, the way it tastes, and the way it smells. Again, you have to simulate it from first principles. You basically have to build a model of the entire chemistry of an apple (not to mention perfectly simulating bacteria) to cover all possible cases.
And that's just an apple. What about everything else? We basically need to be able to simulate a universe from first principles. We don't even know if that's possible. Clock speeds for our current technology stagnated a decade ago. We're about to run into a quantum mechanical limitation for transistor size. We assume we'll find a way around it, but that's currently unknown.
We know that computing power has been rising exponentially, and we expect it to continue to do so for a while, but there's no guarantee that it will do so forever. Bacteria in a petri dish multiply exponentially, too. If some early generation noticed this trend, they might be tempted to imagine that the bacteria will eventually take over the entire universe. But their exponential growth hits a hard limit (when they run out of space/food). It could be there's a similar limit on computing power. We don't know. In any case, the kind of power we need for the Matrix is at best centuries away, if it will ever exist at all. That's not even counting the biological engineering involved.
There's only one way I could see it coming any sooner (again, assuming it's even possible): we develop a superhuman AI which can do our research for us at vastly accelerated subjective timeframes. But then we have much bigger problems.
>What brought you that position?
My path included convert parents, BIC, a very happy childhood in a huge loving family, RM, 30ish years TBM, 10 years agnostic (closeted 7 or 8 years), and going on I think 7ish years now as a Christian. I'm bookish - PhD in engineering. My agnostic period kind of grew out of the full-term surprise stillbirth of our second child. I was already starting to question BoM historicity, I had issues with the whole "I know the church is true" thing / epistemology, and it was a fairly quick worldview failure after that. Then came discovering church history. You get the idea. During my agnostic period, I held a position pretty much identical to what I hear you describing: I cared deeply about truth (still do, very much), I knew logic works (still a big fan, but more aware of its limits now), and eventually felt called to try to 'get off the fence' of agnosticism, if I could do it authentically. My approach was to start reading more. Things I read: Gödel, Escher, Bach, Tolstoy's War and Peace, Pilgrim's Progress, a bunch of C.S. Lewis, the Bible in modern English, and a bunch of other stuff I can't remember. Some things that most impressed me about the Bible: stories about what goes on in people's hearts that I could see in myself, in my loved ones, and around me in the world; the coherence of the entire narrative around the theme of redemption; and the concept of grace implied in God's relationship to His people and later extended to the individual by Paul.
> what makes you believe in the Christian God?
Here is one thing I wrote about that before.
The Machine Intelligence Research Institute is putting out a call for intelligent stories illustrating concepts related to (artificial or natural) intelligence. Guidelines are quite specific; read below.
-Pay Rate: 8c/word, up to 5000 words.
-Multiple Submissions ok
-Simultaneous Submissions ok
-Submissions window: Until July 15
This call is intended to reward people who write thoughtful and compelling stories about artificial general intelligence, intelligence amplification, or the AI alignment problem. We're looking to appreciate and publicize authors who help readers understand intelligence in the sense of general problem-solving ability, as opposed to thinking of intelligence as a parlor trick for memorizing digits of pi, and who help readers intuit that non-human minds can have all sorts of different non-human preferences while still possessing instrumental intelligence.
The winning stories are intended to show (rather than tell) these ideas to an intellectually curious audience. Conscious attempts to signal that the ideas are weird, wonky, exotic, or of merely academic interest are minuses. We're looking for stories that just take these ideas as reality in the setting of the story and run with them. In all cases, the most important evaluation criterion will just be submissions’ quality as works of fiction; accurately conveying important ideas is no excuse for bad art!
-
To get a good sense of what we're looking for—and how not to waste your time!—we strongly recommend you read some or all of the following.
Superintelligence
Smarter Than Us
Waitbutwhy post 1, Waitbutwhy post 2 (with caveats)
Withdrawal policy:
After you submit a story, we prefer you don't withdraw it. If you withdraw a story, we won't consider any version of that story in the future. However, if you do need to withdraw a story (because, for example, you have sold exclusive rights elsewhere), please send an e-mail telling us that you need to withdraw ASAP.
Important Notes:
MIRI is neither a publishing house nor a science fiction magazine and cannot directly publish you. However, MIRI will help link a large number of readers to your story.
We frankly do not know whether being selected by MIRI will qualify as a Professional Sale for purposes of membership in the SFWA. We suspect, through readership numbers and payscale, that it will, but we have not spoken to the SFWA to clarify this.
If you have a work of hypertext fiction you think might be a good fit for this call, please query us to discuss how to submit it.
How to Contact Us:
To contact us for any reason, write to intelligenceprize@gmail.com with the word QUERY: at the beginning of your subject line. Add a few words to the subject line to indicate what you're querying about.
(We've discontinued the previous, smaller monthly prize in favor of this more standard 'Publishing House Call' model.)
What about some classics, like Uncle Bob's talks?
A lot of good design focuses on decoupling and creating components which work together but also stand alone.
I'd first look into language-agnostic design principles such as SOLID:
https://youtu.be/TMuno5RZNeE
https://youtu.be/zzAdEt3xZ1M (golang)
A good book will help.
https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882
I still keep a copy of the Gang of Four book at arm's reach, even though its popularity is dwindling as OOP is not the topic of most convos today. However, when dealing with DI and sharable components in Golang, I find myself still falling back on abstract factory patterns. It's a good breadth of knowledge to at least glimpse the patterns here:
Design Patterns: Elements of Reusable Object-Oriented Software https://www.amazon.com/dp/0201633612/ref=cm_sw_r_cp_api_i_7BzvDbXJC5WS4
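For what it's worth, here's a minimal sketch of the abstract factory shape (in Python rather than Go, and with made-up class names) - the point being that call sites depend on an interface and never name a concrete class:

```python
from abc import ABC, abstractmethod

class Store(ABC):
    @abstractmethod
    def save(self, key, value): ...

class MemStore(Store):                     # e.g. for tests
    def __init__(self): self.data = {}
    def save(self, key, value): self.data[key] = value

class StdoutStore(Store):                  # stand-in for real persistence
    def save(self, key, value): print(f"persist {key}={value}")

# The abstract factory: one interface that hands back a family of
# compatible components, so callers can swap the whole family at once.
class AppFactory(ABC):
    @abstractmethod
    def make_store(self) -> Store: ...

class TestFactory(AppFactory):
    def make_store(self): return MemStore()

class ProdFactory(AppFactory):
    def make_store(self): return StdoutStore()

def run(factory: AppFactory):
    store = factory.make_store()           # injected, not constructed inline
    store.save("user:1", "alice")

run(ProdFactory())                         # pass TestFactory() in tests
```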
Once you get through these topics, you can start picking up what I consider the "hot" architectures of today:
Microservices, event-based systems, domain-driven design, event sourcing, and architectures lending themselves to functional programming.
I really enjoy reading Martin Fowler blog posts: https://martinfowler.com/tags/application%20architecture.html
He covers a lot of these topics.
PS: maybe a niche and a personal favorite of mine, but I've learned A LOT by researching the different types of kernel architectures. Nothing really geeks me out like those topics, but they're not for everyone.
A fabulous free course exists on these topics:
https://www.udacity.com/course/advanced-operating-systems--ud189
My advice is:
I was in the same boat as you 3 years ago. CS degree and thousands applying for the same position as me. Luckily I got a job in the end. Started applying 3 months before I finished my exams. Ended up getting a job straight after finishing my exams as a junior software developer. :)
I was hoping to get specifically into crypto/privacy. I've been learning from these books:
and supplementing that with the Coursera Cryptography I class
my eventual goal is to do either information security or penetration testing, but pen testing seems like one of those jobs that sounds great and seems so cool that everyone wants it. Like the job equivalent of planning on being a rock star.
I've got a working knowledge at least of Java, but no programs to show for it yet (which was the source of my wanting this advice here.)
Also, I have been doing this without a college, and don't really plan on going to college at any point soon.
I do want to look into certifications; they were something I've had an eye on, but the opinions on their usefulness are so varied that I just figured I'd wait to get them until after I had a working knowledge base, then just blow through them to have the piece of paper.
I've read around that the CISSP takes 5 years of experience to take full credit for, and the associate version is like 3 or so. While I do want the most laudable one (I've read the DoD/gov't cert requirements, and they care a LOT about the CISSP), that would mean 3-5 years of a catch-22: not having the job to get the CISSP experience with, because it would be my only cert so far and I can't take credit for it; therefore I have no certs and can't get experience.
I've messed around with BackTrack and Armitage, and got through enough of Hacking Exposed (6th edition) to know at least the process, but I haven't applied any of it, and it seemed like it might be better to learn how things work before subverting details and breaking protocols for fun and profit.
I do plan on getting the CISSP, but I'm not gonna start that process until I already have a job in the field i can use as experience to get more jobs, otherwise I'll just be sitting on my hands.
Does that all seem alright, or do you have any advice? Sorry for talking your ear off, if that's what i did just now.
I can't offer you a lot in the way of non-fiction. If you haven't read it, Gödel, Escher, Bach by Douglas Hofstadter is a good read. It is very dense and slow reading, but can be rewarding. If you like computer science, biology, math, or music in any combination, this could be a good book for you.
The secret to picking good non-fiction is to find something you're interested in or curious about and read a book about it. Things like neuro-linguistic programming, cryptography, riding horses, biking, running, cacti of the saguaro desert, Trees of the Eastern Forests, Scuba diving, Lockpicking, Prestidigitation (aka "magic tricks"), etc.
Of other books I've loved but could not mention in my top 3, I include:
That's all I can think of right now at work, but if you want more, PM me and I'll see what I can dig up.
> I learned in Android Studio and got to a level where I could create an app. Little did I know, it was just a giant main activity with 100s of methods. My friend looked at the code and told me I needed to learn polymorphism. Now I've redone the code so it's all in classes.
Yep. This is a really common stage in learning. It sounds like you maybe went overboard and created way more classes than you needed.
Next time I might suggest starting with your working program with just one big activity, then splitting things into separate classes one at a time.
> Well, I have to keep updating a bunch of list sizes to the main activity, but the problem is that the polymorphism has the list sizes passing through 4 classes to get to the main activity. So, I set up a bunch of interfaces that react when things are done within the classes, all encapsulated in one overarching class. I don't know.
I think you're blaming "polymorphism", but polymorphism is just a tool. You can use it to make good designs or bad designs. It's quite easy to use it to design something that's cumbersome and less effective than if you had no classes at all, and it sounds like that may have happened here. It doesn't mean there's something wrong with polymorphism.
I think you're conflating two separate things here: (1) what's a good design, and (2) how do you make your code work.
(2) is easy. If you post a small, complete program that doesn't work, we can help you understand why. We can't do that if you just post vague questions and snippets of code.
(1) is hard. This takes years to get the hang of, and a lifetime to master. At a good software company you'd learn this slowly by mentorship - you'd have a senior programmer reviewing every change you make and guiding you through the design one feature at a time. You'd get help organizing your code long before it got to hundreds of functions.
If you don't have that option, or even if you do, I'd recommend the book Design Patterns as a way to better understand how to use polymorphism effectively.
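To give just a flavor of what the book covers: one common fix for the "pass a value up through 4 classes" problem is the Observer pattern, where the interested party registers a callback directly with the thing that changes. A toy sketch (in Python for brevity, with made-up names; in Java this would be a listener interface):

```python
class TrackedList:
    def __init__(self):
        self._items, self._listeners = [], []

    def on_size_change(self, callback):
        self._listeners.append(callback)    # anyone interested subscribes

    def add(self, item):
        self._items.append(item)
        for cb in self._listeners:
            cb(len(self._items))            # notify subscribers directly

class MainScreen:
    def show_size(self, n):
        print(f"list now has {n} item(s)")  # stand-in for updating the UI

screen, items = MainScreen(), TrackedList()
items.on_size_change(screen.show_size)      # no intermediate classes needed
items.add("first")
items.add("second")
```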
You're also welcome to post such questions here, but they have to be very specific. You have to tell us what your app does, in a lot of detail, and how you've organized it into methods. I can't give you advice on something vague like "passing list sizes to the main activity" because I don't understand the purpose of the lists, the purpose of the sizes, or the purpose of passing them to the main activity.
You can always read books. Textbooks are much better to read when you're free to browse and pick out whichever ones you like. You can get a surprising amount of reading done just by reading on the bus, on the can, and whenever you've got nothing better to do.
A popular stack overflow answer has a pretty good list. You can preview the introduction of most books on amazon.
People like to champion the internet as "oh, you can learn anything on the internet!" Which is true. But you can learn it much faster and better from a book (generally speaking).
Books provide a long format which has a chance to build upon itself. Also, everything is collected in one place for easy access. More developers ought to sit down and read good books.
How about a new language? Or writing software rather than web stuff?
After doing webdev for a while I got in to offline Java (software) development. Aside from helping me tighten my grasp on real OOP, it also caused me to shift/alter some of my design patterns. Java itself can be extremely cumbersome to write, but the process of doing so definitely made me stop and re-think exactly HOW much I want to leverage the flexibility of other languages.
Writing Java also lead me to read books, from which I took lessons I could apply to other languages. These books in particular were helpful:
http://www.amazon.ca/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612
http://www.barnesandnoble.com/w/data-structures-and-algorithms-in-java-robert-lafore/1100840388?ean=9780672324536&itm=1&usri=9780672324536
Obviously I'm just using Java as an example. If you already write Java, or feel great about your understanding of OOP, perhaps try a functional programming language? Lisp(Clojure)? Scala? The performance of some of these languages is really wild, and I'd think it would be possible to leverage their speed when working on large SAAS projects.
Anyways, to each their own. Obviously the other comments here have plenty of good suggestions. I know I certainly didn't regret getting familiar with SASS. And I could probably stand to spend time forcing myself to get more proficient with my editor. Right now I use Sublime Text 2, and I could either make sure I'm using all the proper key binds for traversing/manipulating text OR just go boss mode and make the switch to vim. Or even bite the bullet and use an IDE (http://www.jetbrains.com/).
Best of luck!
It's so great you're being so proactive with your learning! It will definitely pay off for you.
I like others' suggestion of Clean Code, but I fear that as a first year it may have mostly flown over my head - not that it would hurt at all to read. For a first-year student specifically, I'd recommend either of two books.
Structure & Interpretation of Computer Programs, also known as The Wizard Book and free on the link I just sent you, is a famous textbook formerly used in MIT's Intro to Computer Science course. However, it's conceptually useful to programmers on any level. If you really, seriously read it and do the exercises, it's gonna give you a rock-solid foundation and shoot you ahead of your peers.
It uses Scheme, a quote-unquote "useless" programming language for any real-world purpose. That's arguable, but the important thing about the book is that it's really edifying for a programmer. The skill it helps you develop is not the kind that will directly show on your resume - it's nothing you can point to - but it's the kind of skill that will show in your code and in how you think and approach problems in general. That said, the book has exercises, and the MIT site I linked you to has labs that you could potentially show off on your GitHub.
Code: The Hidden Language of Computer Hardware and Software is much more approachable, is not marketed specifically at programmers, and does not contain any exercises. Read it, though, and you'll find you have a huge boost in understanding the low-level computing classes that your classmates will struggle with. What it basically does is show the reader how one can build a computer, step by step, from the very basics of logic and switches. It's readable and written for a casual audience, so you may find it easier to motivate yourself to finish it.
SICP and Code, despite both being extremely popular, can be a bit difficult conceptually. If you don't fully understand something, try reading it again, and if you still don't understand it, it's fine. Everyone experiences that sometimes. It's okay to move forward as long as you feel like you mostly get the topic. Don't let the perfect be the enemy of the good.
Best of luck to you, and be excited! It's thrilling stuff.
Hi PizzaPartify,
I believe that different companies/teams will place emphasis on different skills. When I was helping to hire software engineers for EA's motion capture studio, I liked to see candidates who showed a strong aptitude for engineering code to be maintainable. For me, this meant a familiarity with design patterns and software development processes (like Test Driven Development or Extreme Programming). In my department, much of our code was in C++ and Python. However, other departments would use languages like Java, C# or ActionScript - depending on the project.
It would be helpful to know what role you are applying to.
To answer your specific questions:
Regardless of the language you're working in, I would also recommend Design Patterns by the gang of four (http://www.amazon.ca/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612).
A game-specific recommendation is Game Engine Architecture by Jason Gregory (http://www.amazon.ca/Game-Engine-Architecture-Jason-Gregory/dp/1568814135). It doesn't matter if you intend to write an engine or not, it is immensely helpful to understand how they work.
I own all of the Game Programming Gems books but use them more as a reference library. The books above will be more helpful right now.
I hope that helps.
Objects are for grouping related data and methods together. It really is as simple as that.
Start off by writing applications where you're just creating and consuming objects, not writing your own classes. Java and .NET both have tons of libraries that contain a wide assortment of objects. You mentioned C#, so write a few .NET apps. Try to start identifying and understanding the way properties and methods are grouped into objects, and how the different objects relate to each other.
Once you're comfortable using objects, then you can start writing your own classes. A lot of universities try to teach this by having you write common data structures. This approach is worth considering, as it's important to be familiar with data structures, but this isn't the only way to learn object-oriented programming (nor the best, in my opinion). Another commenter recommended writing a video game, which sounds like it's worth a try. Ultimately, the right approach is the one that interests you the most.
Getting good at OOP will take some practice, but it is possible. Objects are like functions: they should do one thing well. Enforce separation of concerns. Learn the design patterns. Practice makes perfect(-ish).
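A toy sketch of that "one thing well" idea (names invented for illustration) - the report computes, the writer persists, and neither knows the other's job:

```python
class SalesReport:
    def __init__(self, amounts):
        self.amounts = amounts

    def total(self):
        return sum(self.amounts)            # computation only, no I/O

class ReportWriter:
    def write(self, report, path):
        with open(path, "w") as f:          # persistence only, no math
            f.write(f"total: {report.total()}\n")

report = SalesReport([10.0, 20.5, 3.25])
ReportWriter().write(report, "report.txt")  # each class has one concern
```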
Recommended Reading:
It depends a bit on what areas you're interested in. For interactive graphics you'll likely do OpenGL or DirectX or such.
Non real-time graphics usually means ray tracing or some variant like photon mapping where you want to produce physically correct images, with flexibility depending on your art direction e.g. Big Hero 6. With ray tracing you're essentially simulating how light interacts in the scene.
Here's some useful books/links for real time graphics:
Here's some for ray tracing:
For general math topics I also recently picked up Mathematics for 3D Game Programming and Computer Graphics which looks very good, though I haven't gone through it as thoroughly.
As mentioned already /r/GraphicsProgramming is a good subreddit, there's also /r/opengl for OpenGL questions.
I've seen a lot of people recommending Code: The Hidden Language of Computer Hardware and Software
It is about 15 years old though, so it might seem a little out-of-date in some places/might appear to omit some modern developments in technology and computer science, but the existing content should still be pretty solid.
That being said, I do agree that the best thing to do is to just jump straight in. The best way to gain a good mindset for programming is to just start programming. You'll run head-first into obstacles and bugs, and figuring out how to fix those bugs/avoid those bugs is pretty much how you acquire that sort of mindset.
By biology I don't mean what they teach you in college or med school; I mean understanding the basic processes (physiology-esque) that underlie living things, and understanding how those systems interact and build into more complex systems. Knowing the names of the organs or parts of a cat is completely worthless; understanding the process of gene activation, and how it enables living organisms to better adapt to their environments (for instance, stress factors activating responses to new stimuli), can be very valuable, especially as a function of applied neurology.
Also, what we call biology and medicine today will be so pathetically obsolete in 10 years as to be comical, similar to how most mechanics can rebuild a carburetor, but not design and build a hybrid drivetrain, complete with controller software.
Economics and politics are controversial, but what matters is seeing the underlying forces, similar to not understanding how gravity works but still knowing that a dropped lead ball will accelerate downwards at 9.78 m/s^2. This is a field that can wait till later though, and probably should.
For systems analysis, I'm sorry but I can't recommend anything. I tended to learn it by experience more than anything.
I think I understand what you are looking for better now though, and think you might be headed in the right direction as it is.
For CS I highly recommend the dragon book, and design patterns, and if you need ASM The worst designed website ever.
For the other fields I tend to wiki subjects then google for papers, so I can't help you there. :(
Best of luck in your travels however! :)
edit: For physics, if your math is bad, get both of his books. They break it down well. If your math is better, try one of Witten's books, but they are kinda tough; the guy is a fucking genius.
also, Feynman's QED is great, but his other book is awesome just as a happy intellectual read
also try to avoid both Kaku and Hawking for anything more complicated than primers.
edit no. 9: MIT's OCW is win itself.
edit no. 10: Differential equations (prolly take a class depending on your math; they are core to almost all of these fields)
It sounds like you have identified your weakness. Presently, that is programming in python, and using the sklearn library.
I would recommend taking a MOOC on Python first. Lynda.com has a free trial and Python videos. DataCamp is another good start: it has a free trial, maybe some Python basics, but definitely something on sklearn, and you can get some pandas or R training there (the Kaggle libraries, most likely).
At that point, if you are going the TensorFlow route, Aurélien Géron has a great hands-on book, Hands-On Machine Learning with Scikit-Learn and TensorFlow.
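To give you a taste of what sklearn code looks like once the Python basics click, here's a minimal sketch (the toy dataset and the choice of logistic regression are just illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a built-in toy dataset and hold out a test set.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a simple classifier and check how well it generalizes.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```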
If you're going with PyTorch, I dunno.
Your mileage is going to vary, you could always use a book to learn python, or whatever.
Just make sure you learn to program first, you’d be surprised how much 2 weeks of very hard work will earn you. Don’t expect it to be ‘easy’ ever tho.
Also, if you're not formally educated in statistics, keep an eye out for statistics advice until you have the time to work on it (like in a MOOC, course, or blog). Learning some real analysis will make understanding the papers a real possibility (once again, it will probably never be easy).
It is truly stunning how many years of preparation it takes to become competent in this. It’s a lovely science, but the competent ones have generally been on a mathematical/science track since 5th grade. Doesn’t mean we can’t become competent but it takes time. Imagine the equivalent of an undergraduate degree just devoted to ML and you’re about there.
Everybody's learning style is different. Here are some books I believe to be essential for any novice or pro.
Programming For Dummies. It has a stupid title, but it is well reviewed for good reasons. I read through this beast in three weeks. There is no coding involved, as it is mostly theory, but it covers most of the bases of computer science and programming logic. Looking back, much of it confused me at first read, but the big ideas are all presented here. Reading this during the summer before first semester was a huge boost for me. All of the major computer languages are discussed in the book.
Cracking the Coding Interview. A book meant for veterans trying to get into highly demanding top tech companies, the book is a great introduction to programming paradigms. There are numerous examples of problems in each chapter with answers at the back of the book. The whole thing is in Java, with a short chapter on C++.
Design Patterns. As you learn more about object oriented programming, the concept of design is introduced. This book is the holy grail of software architecture and recommended by many. I would hold off acquiring it until you are certain that CS is where you want to be, it is quite technical. This book follows C++, although a Java version of the patterns exists on Github.com
A non-technical book just for fun:
The Innovators is essentially the story of computer science and how it got to the present day. It follows the characters, human beings, who were involved at each step of the way, right up to modern times. Your professors will be impressed that you know who Alan Turing, Grace Hopper, and Charles Babbage were. If only I had been at THE MOTHER OF ALL DEMOS! It covers the actual stories of Microsoft, Apple, the internet, the PC, video games, the space program, etc. On QuizUp, a trivia app, every other question in the CS category involves names from this book. Read it just to be a real geek who knows where this stuff came from, and the drama/tension that led to innovation. The book is actually really funny at times.
>Oh well, can't work on the past
Everything you do today is tomorrow's past.
You need to do some light learning. I'm linking a few books that are fantastic for accomplishing these goals, and I think your English is pretty great, so I don't anticipate much trouble. I highly recommend doing everything in Java; not because I'm arguing that Java is what will help you the most, but because almost every code example from these classics refers to Java.
For design patterns, the Gang of Four book is a good introduction, though it's a little harder to understand: http://amzn.to/1giIrF6
I would recommend finding AND implementing these patterns.
Grab a copy of Code Complete by Steve McConnell and Clean Code by Robert Martin. Those will help you write nice code that you'd be proud to share with a company.
I just finished reading Code: The Hidden Language of Computer Hardware and Software and will state unequivocally that this book is the most satisfying read I've experienced. It starts with flashlights blinking through windows, moves to Morse code, introduces electrical relays and demonstrates how they can be connected to form logic gates, then uses those gates to construct an ALU/counter/RAM and multiplexors. It goes on to describe the development of an assembly language and the utilization of input and output devices.
This book is a knowledge hose, flooding the gaps in my understanding of computer hardware/software at an extremely enjoyable pace. It may help satisfy your interest in the concepts and technology that led to modern computers. Check out the reviews for more info.
If you haven't already studied logic gates in depth in your formal education, I would suggest using a logic simulator to actually build the combinational logic structures. I now feel very comfortable with logic gates and have a strong understanding of their application in computing from my time spent building the described logic.
I went through the book very slowly, rereading chapters and sections until I felt confident that I understood the content. I cannot recommend this book enough.
After reading CODE, I have been working through The Elements of Computing Systems: Building a Modern Computer from First Principles. If you are looking to gain a better understanding of the functions of hardware components, this is the book to read. The book's companion site http://www.nand2tetris.org has the first chapters free, along with the entire open source software suite that is used in the book's projects. Starting from Nand gates in a hardware description language, you build each logic gate and every part of a computing system, all the way up to a modern high-level language: you write custom software of your own design, compile it with a compiler you designed into an assembly language you specified, and that is turned into binary that runs on a processor you built from Nand gates and flip-flops. This book was very challenging before reading CODE; now I feel like I'm simply applying everything I learned in CODE in even more detail. For somebody who hasn't attended college for computing yet, this has been a life-changing experience.
http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319
http://www.amazon.com/The-Elements-Computing-Systems-Principles/dp/0262640686
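If you want a taste of the gates-from-Nand exercise before buying, here's the same idea as a quick Python sketch (the book itself uses its own hardware description language; this just mimics it in software):

```python
def nand(a, b):
    """The one primitive: outputs 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

# Every other gate is composed from Nand alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

# Truth table for XOR, built entirely out of Nand gates.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))
```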
I think your major problem here is that you want the "why not"s instead of the "why"s. A good programmer can look at a chunk of code and determine "why" the programmer is doing certain things. The pre-existing code blocks that people refer to are given because you should be able to read through them and interpret what's going on and why. The question you most likely ask at the "interpreting" stage isn't "why" but "why that way and not this way?"
Really, when it comes down to it, the answer as to that question for a lot of things in engine programming (or just programming in general) is that it's what the lead designer or lead programmer thought was the best idea.
For instance: How do you want to store your array of tiles? As integers representing tile indexes in a tile set? As separate Tile class instances in a vector array containing vector arrays of Tile instances? As a hashmap indexed using characters to grab a tile? etc. There's a million ways to handle each and every part of an engine, it all comes down to what design patterns and what theories you think are the best for what you need your engine to do.
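To make those options concrete, here's a quick Python sketch of two of them (the class and field names are made up for illustration):

```python
# Option 1: plain integers indexing into a tileset -- compact and simple.
tile_map = [
    [0, 0, 1],
    [0, 2, 1],
]

# Option 2: full Tile instances -- heavier, but each tile can carry state.
class Tile:
    def __init__(self, index, walkable=True):
        self.index = index
        self.walkable = walkable

rich_map = [[Tile(i) for i in row] for row in tile_map]

# Same question ("what's at row 1, column 2?"), two different trade-offs.
print(tile_map[1][2])           # 1
print(rich_map[1][2].walkable)  # True
```

Neither is "right"; it depends on whether your engine needs per-tile state or raw iteration speed.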
I suggest reading up on some of the design patterns in here (actual link in the sidebar) and here. They're a great way to start understanding the multitude of ways of handling different ideas in your engine! Reading up on pre-existing theory or pre-existing pseudo-code is fine and dandy, and sometimes you do have to reinvent the wheel, but for the most part you can follow design patterns that already exist.
P.S. For a great tutorial on loading tile maps and working with them in your game, lazyfoo's got you covered (it's in C++ but can easily be adapted for other languages) Here
This book reference gets overused, but Design Patterns was a change for me in the whole way I saw code. It's not easy to get through and some of the patterns take several reads for the lightbulb to go on, but once you get a design pattern, that knowledge stays with you through any OO language, not just C++. The patterns in the book are things that you run into all the time in programming, and that spidey sense of "I've seen this before" when you're working through the design of a problem is beyond valuable. Knowing what the patterns do, and more importantly, when it's appropriate to use them, is a key differentiator especially in C++ programmers.
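As a small example of how a pattern outlives any one language, here's the book's Strategy pattern sketched in Python rather than C++ (the compressor classes are my own toy illustration, not from the book):

```python
import zlib

# Strategy: encapsulate interchangeable algorithms behind one interface.
class Compressor:
    def compress(self, data: bytes) -> bytes:
        raise NotImplementedError

class ZlibCompressor(Compressor):
    def compress(self, data: bytes) -> bytes:
        return zlib.compress(data)

class NullCompressor(Compressor):
    def compress(self, data: bytes) -> bytes:
        return data  # no-op strategy, handy for testing

def archive(data: bytes, strategy: Compressor) -> bytes:
    # The caller chooses the algorithm; archive() doesn't care which.
    return strategy.compress(data)

print(len(archive(b"hello" * 100, ZlibCompressor())))  # small
print(len(archive(b"hello" * 100, NullCompressor())))  # 500
```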
Technically, for C++ specifically, knowing your stuff on memory management is highly valuable. Not just how memory management works, but what are the best practices? Who is responsible for freeing the memory if a function creates a new instance of an object and returns a pointer to it? How do you know that all references to an object are out of use once the object is released from memory? (I saw smart pointers referenced before.) When do you have to zero-init alloc'd memory and why? malloc, free, calloc, realloc, new, delete: learn all of that memory management stuff.
http://www-math.mit.edu/~rstan/ec/
I'll give you a brief overview of the book: it's really dense and will probably take you a while to get through just a couple of pages; however, it introduces a lot of interesting and difficult concepts that you'd definitely see if you pursue the field.
https://math.dartmouth.edu/news-resources/electronic/kpbogart/ComboNoteswHints11-06-04.pdf
This is a free book available online, aimed at real beginners, basically if you have little to no mathematical background. I will however say one thing: in Chapter 6, when he talks about group theory, he doesn't really explain it at all (at that point, it would be wise to branch into a good pure math text on group and ring theory).
https://www.amazon.ca/Combinatorics-Techniques-Algorithms-Peter-Cameron/dp/0521457610
This is a fantastic book for self-study; afaik, the first 12 chapters are a good base for combinatorics and counting in general.
https://www.amazon.ca/Concrete-Mathematics-Foundation-Computer-Science/dp/0201558025
I've heard fantastic reviews of the book and how its topics relate to Math 239/249. Although I've never actually used the book myself, from the table of contents it appears to be a basic introduction to counting (a lot lighter than the other books).
Regarding whether or not you can find them online: you certainly can for all of them; the question is whether legally or not. These are all fairly famous books and you shouldn't have trouble getting any one of them. I'm certain you can study combinatorics without statistics (at least at a basic level); however, I'm not sure you can study it without at least a little probability knowledge. I'd recommend going through at least the first couple of chapters of Feller's An Introduction to Probability Theory and Its Applications. He writes really well and it's fun to read his books.
This is a popular topic but I don't often see a comprehensive answer. I'm by no means an expert and currently learning myself.
There are two key stepping stones before jumping into AI: learning Python and learning data science. Python has wide support and a host of libraries reflecting the latest research in AI development.
There are also R, Octave, and Java, depending on the libraries you're looking to use, but they aren't nearly as popular as Python. Note that if you want to embed your AI scripts into web apps or mobile apps, you'll need to learn JavaScript or Java respectively.
The best resources for Python are
Great resources can be found here:
The next step is to get a brief grasp of data science. You can learn these from:
I wouldn't recommend Codecademy since it's dated: it's written for Python 2.x, whereas Python 3.6+ is now more widely used.
Then I would consider AI-specific courses found online. There are two routes again here: the heavily academic route that delves into the theory and mathematics, and the practical route. It depends on the speed and pace you want to learn at, because it's a massive field.
Theoretical
Practical:
First of all, congrats on the promotion and the learning spirit. I wish more managers had your attitude.
I had a similar situation where I went from in-house linguist to loc manager, and I wonder if my experiences might be of use to you. Like you, I definitely did not describe myself as "into programming." I'm still not into that sort of thing. But learning as much of it as I could had a direct benefit to a lot of my daily tasks, and I would recommend at least giving the more learner-friendly tutorial sites a try.
I finished a lot of modules on codecademy.com and genuinely enjoyed them because they were not particularly difficult and also allowed me to automate a lot of things and gain a deeper understanding of how things work. I went through Learn Python the Hard Way and gained a lot from that, especially since subsequent projects included quite a lot of assets in Python. I went so far as to plow through the first half of Code: The Hidden Language of Computer Hardware and Software (the latter half was too arcane for me) and found that quite useful as well, although in hindsight it was a bit overkill.
Even after my department was given an actual programmer to code up solutions for us, I at least was able to understand how a good amount of it worked. Coding aside, a localization manager is the person that the linguists and testers go to when things break, and man do they break a lot. That said, I would also recommend training of some sort in SDL and Kilgray's products if you use them. In my experience as manager, both broke often or were fussy at best.
A few years later, I haven't really read much about code, but I still try to ask developers as many questions as I can about the technical aspects of their products and find it really helpful to follow up on Stack Overflow or just Wikipedia.
Good luck with your new position!
Start with a good algorithms book like Introduction to algorithms. You'll also want a good discrete math text. Concrete Mathematics is one that I like, but there are several great alternatives. If you are learning new math, pick up The Princeton Companion To Mathematics, which is a great reference to have around if you find yourself with a gap in your knowledge. Not a seminal text in theoretical CS, but certain to expand your mind, is Purely functional data structures.
On the practice side, pick up a copy of The C programming language. Not only is K&R a classic text, and a great read, it really set the tone for the way that programming has been taught and learned ever since. I also highly recommend Elements of Programming.
Also, since you mention Papadimitriou, take a look at Logicomix.
> If it is about how we interpret our experiences, we can't but fall prey to confirmation bias.
When you're aware of confirmation bias, it is possible to overcome it. That is one of the most important parts of the scientific method: that it accounts for and eliminates the problem of confirmation bias. We can take lessons from the scientific method and eliminate our own confirmation bias in the same way. How we do this is by not trying to prove what we want to prove, but by trying our hardest to disprove it with objective tests, then have others check our work (like peer review).
> Assuming they existed, if you were to witness the sighting of a real ghost, you would likely interpret it as a wisp of fog.
Actually you're just projecting. I would not jump to any conclusions if I didn't know what it was. If it disappeared before I could investigate, the most I could conclude would be that it was something unexplained that disappeared before I could investigate it. I might say that it was probably a wisp of fog and I could back that up with evidence and reasoned arguments, but I wouldn't just jump to whatever conclusion happens to fit my beliefs best as you clearly would.
> It doesn't matter which view you favor, for confirmation bias to occur. It only matters that you favor one over the others and that this can influence your perception of the event.
Which is why, for important conclusions, we should make sure to eliminate confirmation bias using the methods I described above, or some other method that would also take account of it and correct for it. When we do this for any supernatural claims, they quickly fall apart or retreat to the untestable.
> You don't tell people who don't share your basic assumptions to look at things objectively, because it will mean different things to both of you.
I don't think you quite know what the word "objective" means. If it means different things to both of you, it's subjective, not objective.
> If you tell someone who believes in God to look at this objectively, it isn't an unfounded leap at all.
Yes it is. Asserting otherwise doesn't make it true. It does not logically follow from the fact that a dying man sees some light and feels calm that he is seeing an afterlife. There is no logical connection between the two. The only thing that connects them is pure speculation and wishful thinking.
> It is actually very reasonable: A belief predicts that after death my soul will go to heaven, a place filled with light, warmth, and joy. When I am close to death I experience a feeling of floating up, out of my body, and sensations just as predicted by my faith.
Actually, I would be willing to bet that near-death experiences far predate religion. But even if not, there is a rather large disconnect between "seeing some tunnel of light and feeling high" and "seeing the afterlife". You are making a completely unfounded assumption (actually quite a lot of them), but you are too steeped in your own web of beliefs and superstitions to see it.
The absolute most you can reasonably conclude is that there is an unknown reason why people close to death experience what you described above. To conclude anything above that without demonstrable evidence is unfounded.
> Where a leap of faith is necessary, is at the level of basic assumptions: Belief in God and truth of the Bible. And nobody even disputes that...
They do, actually. Plenty of people, especially on Reddit, claim that their beliefs in God and the truth of the Bible are supported by evidence and reasoning, just as you are doing right now. Like you, they are too tangled in their own web of beliefs and confirmation bias to see that they are just making unfounded leaps.
> Welcome to the mind-body problem, and the deep waters of philosophy.
You need not welcome me. My girlfriend is pursuing a PhD in this very problem, and I myself have been interested in it for a long time. I've already heard and read far more about it than I could possibly digest. The more I've researched it, by the way, the more it seems to be true that there is no such thing as a separate soul or some entity that can survive the death of our physical brains. All of the evidence points away from such a thing. As a starting point, may I recommend Consciousness Explained by the philosopher Daniel Dennett. It attempts to provide a possible explanation of how consciousness is a distributed process in the brain, with each part working together, rather than centered in some Cartesian theater. For a heavier read, I would recommend Gödel, Escher, Bach by the physicist Douglas Hofstadter. It explores the deep meaning that comes from recursion and self-reference in art, music, mathematics, logic and in the world itself, and explains how consciousness could be the result of many layers of self-referential recursive systems and structures.
Let me just say right off the bat that it sounds like you're well on your way to being a successful programmer.
One thing I can definitely suggest (which helped me a lot) is reading Code or some other book like it. It is effectively a guide from the ground up of how computers and programming work in general. I had the fortune of reading most of it before I started my CS degree, and it really helped me breeze through my hardware courses.
As well, any books on data structures would probably be helpful, as this is one of the early topics covered in many CS programs. I can't suggest any specific books, but I'm sure others can.
Most of all I have to suggest just getting very comfortable with programming and learning several different languages. It looks like you're already well on your way with this, but the goal here is to have a strong passion for programming before college. That way, when you're up at 3 AM the night before an assignment is due, it's not because you procrastinated and you waited until the last minute to start because you loathe the thought of programming, but because you're so excited about making your code perfect and adding in additional functionality because you absolutely love programming.
>There have been some excellent trading opportunities with returns as high as 30% to your overall portfolio! Crypto is providing big returns that are uncommon in traditional markets.
I guess you have good intentions, Mr. Hustle, but I'd hate to see the kind shibes here being taken advantage of again. You should be more objective and also warn people that they can just as easily lose that much money when trading, especially when they don't know what they are doing initially.
And the effectiveness of technical 'analysis' is a highly debatable issue. I'd just leave this quote from Wikipedia:
> Technical analysis is widely used among traders and financial professionals and is very often used by active day traders, market makers and pit traders. In the 1960s and 1970s it was widely dismissed by academics. In a recent review, Irwin and Park[13] reported that 56 of 95 modern studies found that it produces positive results but noted that many of the positive results were rendered dubious by issues such as data snooping, so that the evidence in support of technical analysis was inconclusive; it is still considered by many academics to be pseudoscience.[14] Academics such as Eugene Fama say the evidence for technical analysis is sparse and is inconsistent with the weak form of the efficient-market hypothesis.[15][16] Users hold that even if technical analysis cannot predict the future, it helps to identify trading opportunities.[17]
...
> Whether technical analysis actually works is a matter of controversy. Methods vary greatly, and different technical analysts can sometimes make contradictory predictions from the same data. Many investors claim that they experience positive returns, but academic appraisals often find that it has little predictive power.[51] Of 95 modern studies, 56 concluded that technical analysis had positive results, although data-snooping bias and other problems make the analysis difficult.[13] Nonlinear prediction using neural networks occasionally produces statistically significant prediction results.[52] A Federal Reserve working paper[21] regarding support and resistance levels in short-term foreign exchange rates "offers strong evidence that the levels help to predict intraday trend interruptions," although the "predictive power" of those levels was "found to vary across the exchange rates and firms examined".
I'm not saying not to take coaching from DogeHustle, just that if people want to do it, they should be aware of its 'limitations' too and have fun doing it with their disposable money only. As an alternative, I strongly suggest shibes who want to try predicting the future based on pattern analysis do it in a principled manner and learn math, stats and machine learning. It won't be easy, but it will have wide application beyond trading (so-called data 'science' is the hot job nowadays). It will also teach you the limitations of such methods, and when they might fail, especially in a manipulated market like crypto. This is a good book to start with:
http://www.amazon.co.uk/Machine-Learning-Probabilistic-Perspective-Computation/dp/0262018020
A good textbook will do you wonders. Get one that is fairly general and includes exercises. Do the exercises. This will be hard, but it'll make you learn an enormous amount faster.
My personal favourite book is Christopher Bishop's Pattern Recognition and Machine Learning. It's very comprehensive, has a decent amount of maths as well as good examples and illustrations. The exercises are difficult and numerous.
That being said, it is entirely machine learning. You mention wanting to learn about 'AI', so potentially you may want to look at a different book for some grounding in the wider, more classical field of AI beyond just machine learning. For this I'd recommend Russell and Norvig's [AI: A Modern Approach](https://smile.amazon.co.uk/Artificial-Intelligence-Modern-Approach-Global/dp/1292153962). It has a good intro which you can use to understand the structure and history of the field more generally, and following on from that it has a load of content in various areas such as search, logic, planning, probabilistic reasoning, machine learning, natural language processing, etc. It also has exercises, but I've never done them so I can't comment much on them.
These two books, if you were to study them deeply, would give you at least close to a graduate level of understanding. You may have to step back and drill down into mathematical foundations if you're serious about doing the exercises in Bishop's book.
On top of this, there are many really good video series on youtube for times when you want to do more passive learning. I must say though, that this should not be where most of your attention rests.
Here are some of my favourite relevant playlists on YouTube, ordered roughly by difficulty / relevance. Loosely start at the top, but don't be afraid to jump around. Some are only very tenuously related, but in my opinion they all have some value.
Gilbert Strang - Linear Algebra
Gilbert Strang - Calculus Overview
Andrew Ng - Machine Learning (Gentle coursera version)
Mathematical Monk - Machine Learning
Mathematical Monk - Probability
Mathematical Monk - Information Theory
Andrew Ng - Machine Learning (Full Stanford Course)
Ali Ghodsi - Data Visualisation (Unsupervised Learning)
Nando de Freitas - Deep Learning
The late great David MacKay - Information Theory
Berkeley Deep Unsupervised Learning
Geoff Hinton - Neural Networks for ML
Stephen Boyd - Convex Optimisation
Frederic Schuller - Winter School on Gravity and Light
Frederic Schuller - Geometrical Anatomy of Theoretical Physics
Yaser Abu-Mostafa - Machine Learning (statistical learning)
Daniel Cremers - Multiple View Geometry
First Advice! Don't listen to high school guidance counsellors.
You require certain high school courses to qualify for College or University programs. What you want (ideally) is a 3 Year University Bachelor's Degree in Computer Science with a Minor in Game Development. Figure out which schools you're interested in and what they require for admissions to their programs.
You're probably hoping to start making games now though, so I'll give you some advice on that.
Try to get a copy of this book: http://gameenginebook.com/
It's kind of a heavy book, both figuratively and literally, but it'll teach you everything about how a video game works. Even if you're not going to be building this or that part of a game, it's important to know how it all works and fits together.
This YouTube Channel Extra Credits has a lot of great video articles on game design and development, and links to other channels.
You can find a ton of guides on YouTube for computer programming, game design, 3D modelling, music, etc. The key word to search for is Tutorial.
Once you've learned how to program computers, you should read this book to learn how to program computers well: Design Patterns: Elements of Reusable Object-Oriented Software
It's a pretty old book, but that's because nobody has had reason to update it; it's just that good. That said, a smart fellow released a free web-based followup book for Game Developers here: http://gameprogrammingpatterns.com/
If you want to focus on Game Design I'm a big fan of Raph Koster so here's his book too: http://www.theoryoffun.com/
I guess the rest is up to you. Any specific questions?
I have no formal training in CompSci, but this book seems like a pretty standard 1st or 2nd year text. It's one of the best technical book purchases I ever made, imho: Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and the Unified Process by Craig Larman. I would recommend it to anyone who wants to learn programming. Goes great with the classic "GoF" book, Design Patterns. For any particular language's syntax and libraries, I just read the docs and check stackoverflow or IRC for any tricky idioms and for best practices.
The whole subject is a bit too complicated and a bit too deep for a short ELI5, but I'll give a stab at the gist of it.
The reason why computers work (at least in the vein of your question) is very similar to the reason why we have language -- written, spoken, etc.
What you're reading right at this very moment is a complex system (language) simplified to symbols on the screen. The very fact that you can read these words and attain meaning from them means that each sentence, each word, and each letter represent a sort of code that you can understand.
If we take an apple for example, there are many other ways to say that in different languages. Manzana. Pomme. Apfel. And so on. Codes -- some symbol maps to some concept.
In the context of computers, well, they can only "understand" binary. Ones and zeros. On and off. Well, that's okay, because we can map those ones and zeros to codes that we (humans) care about. Like 101010111 could represent "apple" if we wanted it to.
So we build these physical circuits that either have power or don't (on and off) and we can abstract that to 1's (power flowing through that circuit) and 0's (no power flowing through it). This way, we can build physical chips that give us basic building blocks (basic instructions it can do) that we can leverage in order to ultimately make programs, display stuff, play sounds, etc. And the way we communicate that to the computer is via the language it can understand, binary.
In other words, in a basic sense, we can pass the processor binary, and it should be able to interpret that as a command. The length of the binary and what it should contain can vary from chip to chip. But let's say our basic chip can do basic math. We might pass it a binary number, 0001001000110100, and it might slice it up as 0001 | 0010 | 0011 | 0100 -- the first four bits, 0001, might map to an "add" command. The next four, 0010, might map to a memory location that holds a number. The third group of four might be the number to add it to. The last group might be where to put the result. Using variables, it might look like:
c = a + b. Where "c" is 0100, "a" is 0010, "b" is 0011, and the "+" (addition operator) is 0001.
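Here's that slicing spelled out in runnable Python, using the exact bit pattern from the example above:

```python
word = 0b0001001000110100  # the 16-bit "instruction" from the example

opcode = (word >> 12) & 0xF  # first four bits:  0001 -> the "add" command
a_addr = (word >> 8)  & 0xF  # next four bits:   0010 -> where a lives
b_addr = (word >> 4)  & 0xF  # next four bits:   0011 -> where b lives
c_addr =  word        & 0xF  # last four bits:   0100 -> where to store c

print(opcode, a_addr, b_addr, c_addr)  # 1 2 3 4
```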
From there, those basic instructions, we can layer abstractions. If I tell you to take out the trash, that's a pretty basic statement. If I were to detail all the steps needed to do that, it would get a lot longer -- take the lid off the can, pull the bag up, tie the bag, go to the big garbage can, open the lid, put the trash in. Right? Well, if I tell you to take out the trash, it rolls up all those sub actions needed to do the task into one simple command.
In programming, it's not all that different. We layer abstractions to a point where we can call immense functionality with relatively little code. Some of that code might control the video signal being sent to the screen. Some of that code might control the logic behind an app or a game. All of the code though, is getting turned into 1's and 0's and processed by your cpu in order to make the computer do what is asked.
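In code, that "rolling up" of sub-steps is just ordinary function composition; here's the trash analogy as a toy Python sketch (the function names are invented for the analogy):

```python
def remove_lid():  print("take the lid off the can")
def tie_bag():     print("pull the bag up and tie it")
def dump_bag():    print("carry it out and dump it in the big can")

def take_out_the_trash():
    # One simple command that rolls up all the sub-steps.
    remove_lid()
    tie_bag()
    dump_bag()

take_out_the_trash()
```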
If you want to learn more, I highly recommend Code by Charles Petzold for a much more in depth but still layman friendly explanation of all this.
Oh, I have a bunch of recommendations.
First, I really think you should read Elkhonon Goldberg's The New Executive Brain. Goldberg was the student of neuropsychology legend Alexander Luria. He was also a good friend of Oliver Sacks, whose books are both informative and highly entertaining (try The Man who Mistook his Wife for a Hat).
I also think Jeff Hawkins' On Intelligence is a great read. This book focuses on the neocortex.
I think you'll also appreciate Sapolsky's Why Zebras Don't Get Ulcers. Sapolsky is a great storyteller. This book is a pretty good primer on stress physiology. Stress affects the brain in many ways and I'm sure this book will be very eye-opening to you!
More suggestions:
The Age of Insight and In Search of Memory by Eric Kandel are good. The Tell-Tale Brain and Phantoms of the Brain by Ramachandran are worth checking out. If you are interested in consciousness, you should check out Antonio Damasio and Michael Graziano. And Giulio Tononi and Gerald Edelman.
If you're up for a challenge I recommend Olaf Sporns's Networks of the Brain and Buzsáki's Rhythms of the Brain.
Start with getting the XCode developer tools and the command-line package.
C is an important language in computer science because it is pretty much the language for heavy-duty operating systems: the type you see in desktop OSes, network OSes (the type that runs on a networking router/switch), and server OSes (Linux, BSD, Windows, etc.).
I think C is a hard language to learn, but it is a great first serious language while also simultaneously learning an easier language like shell or Python to make yourself more efficient/productive.
However, fundamental to CS is the theory of computation, not languages as such. Languages are just a way to express computation, and some languages are better than others for expressing computation to solve certain problems. I would highly encourage also looking into understanding computation from first principles; a great introduction is Theory of Computation (the 2nd edition is really, really cheap used). The only background knowledge you need is high school mathematics.
In my experience, languages are pretty easy to pick up once you know one. I think you'd be better off sticking with Java and exploring concepts, algorithms, data structures and certain frameworks.
When I was starting I got a lot out of the GoF Book. It's a C++ book but they don't really use any C++ features that are hard to translate to Java. I've heard good things about Head First Design Patterns too but haven't read it.
As far as Java goes Spring and Hibernate are two great libraries to be familiar with, since you'll encounter them in the wild pretty regularly.
If I were to suggest something you might not have learned, consider installing VirtualBox and using it to run Ubuntu. Familiarity with Linux will give you a big leg up and Ubuntu is a pretty good way to ease into it. Plus it has packages for a ton of different programming languages so you can experiment with any that catch your fancy.
I have the first edition and yes, it's worth a read. Keep in mind that it explains how game engines work, not how to make a game engine.
After reading it you will not be a master with UE4, but you will understand why UE4 does things in a certain way.
Another book you have to read (and it is mentioned in your link) is the Game Programming Patterns book. I have the physical copy and it is awesome; read it after the GoF Design Patterns book, it's a masterpiece combo.
EDIT:
Also two sites i want to suggest:
Learning Modern 3D Graphics Programming is a great tutorial about OpenGL basics.
The Book of Shaders is great for learning how shaders work.
I was in your shoes not long ago, though with a much different background and job situation.
> I guess maybe my question boils down to do I need to at some point go to grad school?
Yes but don't worry about grad school right now. It's expensive and you'll do better with it once you've been working in the real world. Try and get work to pay for it too.
>I'm not against it, but would rather learn on my own and make it that way, is that feasible?
Yes, you can start using ML techniques at work without formal training. Don't let it stop you. Get a good book. I use Kevin Murphy's, and I also have a copy of The Elements of Statistical Learning on my desk from the work library (it's free online as a PDF, though).
ML is a somewhat broad and growing field. So if you have the mindset that you need to cover it all before you start using it you'll be sticking thumbs up your ass for a few years.
A better approach is to start from your specific data. Just as you're probably familiar with from using SQL, standard textbook techniques or something from a research paper rarely apply exactly to what you're working with. So it's almost better to approach your problem directly. Explore the data, look at the data, study the data (in a stats fashion), and then look into what an intelligent program could do to better analyze it. In the meantime you can study more general ML topics after work.
Ok here goes.
One of the challenges with answering your question directly is deciding at what scope to answer it. If we take your question at its broadest level of meaning, it almost becomes "how do computers work?" What I mean by this is when you said in another comment that you'd like to see how a program goes from outputting simple text into a command window, to a program with a GUI like Audacity, you should be asking yourself, "waitasecond...how does the program output to the console?" And for that matter, how do the characters get drawn on the screen? What is happening when I "compile" code into an executable, and what happens when I "run" an executable?
So you see how this can quickly get out of hand. One of the things that is frustrating with starting to learn how to program, at least it was for me, is you get sorta plopped right in the middle of the story. Typically, you don't get the beginning of the story until much later in your studies.
And that's because the beginning of the story, while very interesting and well worth knowing, is pretty damn complicated. It takes quite a bit of study and effort to wrap your head around.
But let me whet your appetite anyway. Since we have been talking about input/output (console output vs. a GUI), take a look at this page. What you have here is a very, very, very basic computer. It doesn't have a monitor, and it doesn't have a keyboard. It doesn't have a disk and it can't connect to the internet. But a computer it is, and its fundamental operation is exactly the same as the computer that is your laptop, or tablet, or smart phone. The way you input information into this computer is by flipping switches on the front. The output from the computer is displayed on the little LED lights on the front. But at a high level, it's not really that different from what happens on your computer: you input information into the computer that you want processed, the computer processes that information, and the output of the processing is displayed to you.
Watch the video on that page, and you'll see what I mean about even though this computer is primitive compared to your laptop, it is still quite complicated. I don't expect you to understand what he's talking about in this video (you will later as you progress), all you really need to take away is this idea that even with a primitive computer like this, explaining how you input data into it and get data back out of it, is fundamentally a complicated thing. If this sort of thing interests you and you would like to know more, I would recommend you pick up this book.
Here's the point I'm trying to make: you won't get a truly satisfying answer to your question until you understand how the toggle-switch-blinkenlight computer works, for starters. Obviously, that won't be for a while. And that's OK! Just understand that you're kind of starting somewhere in the middle of the whole "how computers really work" story, and know that eventually you will read the beginning.
Now to your specific question.
First I think reading this page will do more to put you on the right path than anything else I could say. Therein, the author will show you some very basic Java GUI programs that you can run yourself, even if you're not using an IDE. In one sense, programs like Audacity are just more complicated versions of these primitive GUI programs. So that is off to a good start.
But what this author doesn't really address is how, when you write JOptionPane.showMessageDialog, the computer takes that line of code and is able to go "Ok, um, I am now going to draw a box on the screen and make this thing called a button, etc etc".
So what's going on there? As some others have pointed out in various comments, there are things called APIs that Java can access; these are libraries of code including the code to draw those windows and buttons on the screen, made such that YOU can reuse that functionality in your own programs without having to write it yourself. Also others pointed out that you have something called an operating system or OS; similarly this provides a sort of "platform" for your code to run on; so for example you can ask the OS to load a file from disk, read from the keyboard, etc, without having to write that code yourself.
The major point here is that other industrious individuals have spent a lot of time writing tools that your little tiny program can leverage. Even the simplest "hello world" program is relying on all these tools -- otherwise the code wouldn't look so simple!
I hope this helps. Please let me know if any of this is unclear, I'll try to clarify as my schedule allows.
Edit: spelling, grammar, tightened up some sentences
First book I recommend to any programmer, no matter what they're working on, is The Pragmatic Programmer. Excellent stuff.
If you don't get a shot at low-level coding at work, get yourself an Arduino kit and just hack away. Apparently the language is similar to / based on the C programming language. I use C every day.
To do well with embedded systems, real-time, device driver, or kernel type stuff, you have to really, really, really, understand what the hardware is doing. I was able to learn gradually because I started programming when there was one CPU and no cache memory. Each hardware operation was straightforward. Now with multi-core CPUs, multi-level cache memory, multiple software threads, it becomes a bit more complex. But something like the Arduino will teach you the basics, and you can build on that.
Every day I have to think asynchronously - any operation can happen any time, and if they share memory or other resources, they can't step on each other. It can get hairy - but it's really fun to reason about and I have a blast.
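For a tiny taste of what "not stepping on each other" means in code, here's a minimal Python sketch (the shared counter stands in for any shared resource):

```python
import threading

counter = 0
lock = threading.Lock()

def worker():
    global counter
    for _ in range(100_000):
        with lock:  # without the lock, increments can interleave and get lost
            counter += 1

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # reliably 400000 with the lock; often less without it
```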
There's a lot more I'm sure, but get started with some low-level hacking and you can build from there.
If you want to get meta, many of the best programmers I know love Godel, Escher, Bach because it widens your mental horizons. It's not about programming per se, but I found that it helps my programming at a meta level. (and it'll give you a lot to meditate on when you're baked!)
You need to understand there are a couple of ways to do Java web development.
Definitely learn Hibernate. You can start with the JPA material in the Java EE tutorial.
As for design patterns, Design Patterns: Elements of Reusable Object-Oriented Software is a classic. I also like Patterns of Enterprise Application Architecture for more of an enterprise-system pattern view of things. Probably avoid most J2EE pattern books. Most of the Java EE patterns came about because of deficiencies in the J2EE/Java EE platform, and as each new version of Java EE comes out you see those patterns become part of the platform. For example, you don't create a lot of database DAOs because JPA/Hibernate handles your database integration layer, and you don't write a lot of service locators now because of CDI. So books like Core J2EE Patterns can be interesting, but if you are learning a modern Java web stack you'll be amazed at how archaic things used to be when you look at old J2EE pattern books.
p.s. Don't buy anything that says J2EE, it'll be seven years out of date.
Sadly, for a large part of the industry, Java and C# are the standard.
.Net still has a huge marketshare for developers, but Java is gaining quickly. Get proficient in either of these and some Agile development practices, as well as design patterns.
The Gang of Four book is the standard design pattern book. I don't know off the top of my head what the best books on Java and C# are, but there are huge threads on StackOverflow that will point you in the right direction.
Good luck, my friend!
I just began reading CODE and it talks about the lowest level of computing mechanisms. This could be something of interest, although it wont teach you how to program specifically.
For that, I propose to you -- as others have -- Learn Code the Hard Way. I would recommend the Python version, though a C version is also in the works. Another great contribution is How to Think Like a Computer Scientist, another book that focuses on Python.
I guess I could best help if I knew what your goals and intentions are. If you want to learn the basics, you can't go wrong with installing a virtual machine with some simple virtual hardware and code at the hardware level. You could even go so far as to build a computer from individual components connected in a specific circuit and hard-code the hardware itself. If you want to learn the more modern, abstract methods, I would strongly suggest Python, C#, or Java. There are many good books on each subject.
If you did the Princeton Algorithms course and have two years experience as a professional developer you're way too advanced for CS50 and other intro classes.
The classes I'd recommend for someone in your position are:
More than the above though I'd recommend learning the following concepts and subjects even though there aren't any good MOOCs on them (to my knowledge):
Despite their age, the MIT lectures were great. If you're good at math and enjoy proofs this is the class for you. Same thing with the CLRS book. One of the best books on DS & Algos out there, but it's so dense it'll make your eyes glaze over, unless you love proofs and highly technical reading.
To get your feet wet, Grokking Algorithms is a good book.
A lot of people recommend Princeton's Algorithm Course. I took Algorithms in school already, but I'm probably going to take this course to round out my knowledge.
EDIT: special shout out to geeks for geeks. Great Website
Gödel, Escher, Bach: An Eternal Golden Braid
It's a book you have to work through (or think through) but it's extremely rewarding and entertaining. It'll make you feel dumb and confused dozens of times, and then give you triumphant moments of discovery where everything you just read makes sense and you feel like a genius. He does this intentionally, and the effect is amazing.
This will make you a smarter person, and it'll make math and thought and life and biology and chemistry and physics and music and art and language more provocative. You'll feel like you're seeing the world in new colors.
Here are the review counts on Amazon:
- 5 star: 255
- 4 star: 38
- 3 star: 20
- 2 star: 16
- 1 star: 15
It's hard not to love this book.
You can get it at your library, they all have it. It's been a best seller since the 80s.
That is an incredibly broad question. Without knowing what you've already studied, it's hard to recommend things. Most of the aerospace and mechanical engineers I know use pre-packaged programs rather than writing their own scripts, etc.
Artificial intelligence might be the best one, though. Russell and Norvig is the standard textbook: https://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597
The plus side to learning about AI is that it is not really programming intensive - it's logic and statistics intensive.
If you want to go the programming route, it gets a little hairier. The reason is that advanced systems designs will take a lot of initial classes just to get you to a level where you are comfortable programming and can then think about design and program flow.
Take an intro course. I learned programming with C/C++ and Matlab. I recommend those because it's easier to blow your foot off when programming in them, which forces you to learn what's actually going on. Once you understand how to design programs, what functions are, and how program control is passed around, move over to Python (much easier to pick up and run with, and much better supported).
You might also benefit from a databases or Big Data class due to the amount of data generated from an aircraft.
Regular expressions and scripting is another option. But that's good for anyone.
In that case...
You may want to wait for the 5th edition of the UNIX and Linux System Administration Handbook, as it should release near the end of this year and they don't release new versions that often.
A good way to get started building a college library is to see what the curriculum for the school is and what books are required by professors. Often other colleges will list their book recommendations for the courses online to get an idea of where to start looking. (I know my school has an online bookstore that lists the books for each course and is open to the public)
At least one or two good books in each of those categories, to get a rough idea to start:
C, C++, Java, Python and Ruby are popular to start with
https://www.amazon.com/Advanced-Programming-UNIX-Environment-3rd/dp/0321637739
https://www.amazon.com/Programming-Paperback-Addison-Wesley-Microsoft-Technology/dp/0134382250
you already have a couple here
https://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597
https://www.amazon.com/Windows-Internals-Part-architecture-management/dp/0735684189 (coming soon)
Edit: On second thought, pretty much every college teaches system programming on linux - so perhaps just the linux texts would be adequate for that.
What a great question, and an interesting example. For those confused by OP's example, check out Gödel's Incompleteness Theorem on Wiki. Better yet, read the insightful and very trippy Pulitzer Prize winning book, Gödel, Escher, Bach. Gödel's theorem is a bit abstract but it was both a monumental and surprising discovery. It's not just mathematical -- it's meta-mathematical, in that it reveals the limitations inherent to any mathematical framework or system. From wiki:
>The first incompleteness theorem states that no consistent system of axioms...is capable of proving all truths about the relations of the natural numbers (arithmetic). For any such system, there will always be statements about the natural numbers that are true, but that are unprovable within the system. The second incompleteness theorem, an extension of the first, shows that such a system cannot demonstrate its own consistency.
I'll point out an obvious one, though it's more to do with the aesthetics of the psychedelic experience rather than insights or ideas. Psychedelic hallucinations tend to be geometric, with lattices, grids, spirals, and perhaps most intriguing of all, fractals. All these are geometric forms that can be rigorously defined and analyzed by math. Fractals are especially fascinating because they exhibit self-similarity at every scale, appear sometimes in nature (for example, coastlines), and look extremely trippy. (Seriously, just look at these zoom-ins of the Mandelbrot set, discovered in 1978.)
Small employment gaps are no big deal. Over six months people may ask, but it's all in how you answer. I'm not sure why you feel like you're unmarketable having worked in the industry for two years, but do know a lot of the postings - especially junior postings - are inflated. I've seen one that asked for three years of experience with Visual Studio 2019. If you're halfway there, shoot your shot.
As a junior dev, the expectations are low. All I'd expect you to know is how to get code up and running that I don't have to tear down for the good of the company. Be able to read your language and solve simple problems. The biggest thing I look for in a junior dev is whether I can give them some piece of the software to write while I'm not looking and feel that they're mostly there when I come back to check. Apply for appropriate positions and don't fudge your experience. Enthusiasm and eagerness to learn go a long way. Don't be a know-it-all from your position.
Decide what kind of role you'd prefer, and start the process of brushing up on that. Use the job postings that represent the jobs you want as direction on what you need to learn. If the role you really want is too far, get a job doing what you know to pay for your education in the role you want.
As a front-end developer, you're going to want to learn a Javascript toolchain and one modern framework to start. Npm and Node.js are the backbone of what you do. If you want to switch, learn what juniors do in that paradigm. Do know that the Javascript world is fast-paced and fad-based, so if you miss a wave, wait two years and the next one will be coming around for you to hop on.
Personal projects are a good idea; just make them meaningful by using a proper setup (not just some bullshit hack job) or by addressing an interesting problem. You're going to want to get each one up on a personal repository that you can link to right on your resume and your job-site profiles (Indeed, Dice, Glassdoor, LinkedIn). Be able to speak to every decision you made, even if it was a bad one. Your personal project doesn't have to be spotless or even completely done; it just has to be yours, it should be able to execute, and it should show some decent decision making. A mod for a game, a contribution to open source, a personal thing that has some use case, or whatever.
Get experience with related technologies. Start to learn one step before and beyond the one you're a specialist in. For example, you're a junior front-end dev. Learn a little about backend work, and learn about deployments. Learn about the experience of your fellow team members as they try to integrate your work with Git, build with Jenkins or AWS Code Build, and containerize with Docker. Think about the pain points you face in architecture, code, building, and deploying; think about how you'd solve them or if you can't, keep an eye on solutions as you go. Know the differences between elements of your chosen sphere.
Higher-level concepts like SOLID principles, Design Patterns, and Refactoring Patterns are more goals than expectations, but you should take a look at them now and at least be able to speak to one of them somewhat. With limited time, prefer Design Patterns. You don't want to walk into an interview where someone asks how you use design patterns and you've never heard of them. Even if they'll accept that, you still won't feel good about it.
Look up some materials on coding challenges, as some companies give coding quizzes. I just had an interview with a guy who touted 10+ years of experience but couldn't read from a file given an hour.
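For calibration, the kind of baseline task being described looks roughly like this in Java (the file name here is made up):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Baseline interview task: read a text file and do something with each line.
public class ReadFile {
    public static void main(String[] args) throws IOException {
        List<String> lines = Files.readAllLines(Path.of("input.txt"));
        for (String line : lines) {
            System.out.println(line.length() + ": " + line);
        }
    }
}
```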
If you feel like you're going to be let go due to performance, get ahead of that and ask your supervisor how you're doing or what you need to do to grow. If you feel like you're going to be let go due to a restructuring you can't affect, you have two options: get to know other teams so you can maybe hop on their project later, or just save your money and get to work on some of the stuff above each weekend until the axe falls. You're a junior dev. You're not expected to be perfect, but you should come in a teachable state - some foundation with programming, a willingness to learn, a willingness to figure things out, and the ability to take direction.
I first heard about these when reading Godel Escher Bach back in later high school. That book was a long, difficult read, but man did it blow my brain wide open. Quines are definitely the thing that I remember most vividly (probably because it was the easiest to understand), but that book was full of awesome stuff like this.
You should totally check it out! You can get it super cheap at used book stores since it was such a successful book.
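In case anyone reading hasn't run into the term: a quine is a program that prints its own source code exactly. A minimal Java sketch - note the file has to contain exactly this one line (no comments or extra whitespace), or the output will no longer match the source:

```java
public class Quine{public static void main(String[]a){String s="public class Quine{public static void main(String[]a){String s=%c%s%1$c;System.out.printf(s,34,s);}}";System.out.printf(s,34,s);}}
```

The trick is the format string: %c with argument 34 prints the double quote, %s re-inserts the string into itself, and %1$c prints the closing quote.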
Although I admire the optimism, I feel a lot of people here are not being very realistic. Take the suggestion of using sales experience to 'sell' yourself to employers: it's fine and dandy if you can bullshit your way into a job, but if you can't actually deliver, you'd be 'let go' within a month.
Learning a language is just one aspect. What's most important is actually doing a ton of programming. So make sure you have at least three moderately big projects, with good code quality and following best practices, that you can show to employers.
Feel free to hop over onto /r/javahelp to have us review your code and suggest improvements. Being a developer isn't really about languages: it's about turning a customer's problem into a working solution. That's the hard part.
One last tip: for someone without any CS education who is going into an area where OO skills are a must, this book is a must-read: http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented-ebook/dp/B000SEIBB8
Heck, showing an employer that you've read it, understand it, and apply it in your projects would give you a huge leg up.
MVC is just a paradigm. Implementations differ from language to language, and some languages have better support than others, but yeah, just a paradigm.
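To underline how framework-independent the paradigm is, here's a minimal MVC sketch in plain Java (all names invented; a real app would wire the controller to actual input events):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal MVC: the model holds state and notifies observers; the view renders;
// the controller translates "user input" into model updates. No framework needed.
class CounterModel {
    private int count = 0;
    private final List<Consumer<Integer>> listeners = new ArrayList<>();
    void addListener(Consumer<Integer> l) { listeners.add(l); }
    void increment() {
        count++;
        listeners.forEach(l -> l.accept(count));
    }
}

class CounterView {
    void render(int count) { System.out.println("count = " + count); }
}

class CounterController {
    private final CounterModel model;
    CounterController(CounterModel model) { this.model = model; }
    void onClick() { model.increment(); } // pretend this is wired to a button
}

public class MvcDemo {
    public static void main(String[] args) {
        CounterModel model = new CounterModel();
        CounterView view = new CounterView();
        model.addListener(view::render);
        CounterController controller = new CounterController(model);
        controller.onClick(); // count = 1
        controller.onClick(); // count = 2
    }
}
```

Swap the println view for WPF databinding or a browser DOM and the model/controller don't change - that's the "just a paradigm" part.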
Look into Design Patterns by the (in)famous "Gang of Four" for more information about this.
I will say this -- most experience I have with C# backends leads to great usage of databinding between model and view regardless of application platform (WPF, Windows Forms, even some ASP.NET). And I'm pretty impressed with the support of other design structures that C# and Visual Studio offer with the help of NuGet (looking at you, Angular).
Bostrom's Superintelligence covers gene editing very well, but let me summarize:
The singularity isn't likely to come through gene editing. The reason is that it's too difficult to improve on the brain. If you identify which genes are responsible for genius and activate them (which is difficult, to say the least), you could get everyone as intelligent as the smartest person yet. But where do you go from there? You'd have to understand the brain on a level far, far beyond what we do now.
Even if you did that, chances are you'd run into diminishing returns. It would be a lot of work to increase everyone's IQ by 5 points once, but far more work to figure out how to do it the 10th time. Rather than exponentially increasing gains in intelligence, you get logarithmic ones.
That's not to say I'm not a fan of gene editing. It's obviously fraught with controversy when used beyond curing disease, but compared to other transhumanist techniques it would leave us with a lot more of our humanity intact.
Good ones!
I suggest trying to wear two “outer” shirts for one waking day - dress, polo, or any other type of collared shirt.
Find and buy one item solely for airplane miles arbitrage.
Watch an anime from John Siracusa and have him as a guest. I want to hear Max and Merlin pick on John a bit, though he is almost always “good cop”.
For a serious one (if they ever want to do “serious”) I would love for all of them to expound on their thinking of how the mind handles memory/ consciousness - though this might be a Rec/Diffs topic just for John and Merlin:
I read a fascinating book (On Intelligence) that not only explained in lay terms how your brain (logically) processes inputs, but had a good theory of how a single method of working explained learning, practice, memory, and actually moving your muscles to do something - most theories can’t explain them all in a single method.
Find out what language the class is using. If it is Java, then I've heard that Thinking in Java is a good book (the 3rd edition is free while the 4th is not).
Here are some books you should read, the sooner the better:
Code: The Hidden Language of Computer Hardware and Software - This is a great book that is technical while being a fairly light read. It answers the question "how do computers physically work?" with just enough theory to understand, and it doesn't resort to, like, metaphors of trains carrying 1s and 0s around.
Don't Make Me Think and User Interface Design for Programmers are also light-but-still-technical books on user interface design, and they give a good idea of how to create software that people actually use.
At some point after you've done some coding, read these books:
Code Complete - A great tome of the non-programming aspects of being a good software engineer.
Pragmatic Programmer is a technical book, and probably a good one to read towards the end of your first year.
Also note: You don't need to know a lot of math to program. High school level algebra and basic arithmetic is fine for the most part.
Here's my list of the classics:
General Computing
Computer Science
Software Development
Case Studies
Employment
Language-Specific
C
Python
C#
C++
Java
Linux Shell Scripts
Web Development
Ruby and Rails
Assembly
Chapter 7 of Chris Bishop's book Pattern Recognition and Machine Learning has a nice intro to SVMs.
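If you want one formula to hold onto while reading, it's the prediction function the chapter builds up to - only the support vectors (the points with nonzero multipliers) contribute to it. A sketch in roughly Bishop's notation:

```latex
% SVM prediction for a new input x (notation roughly following Bishop, ch. 7):
% a_n = learned Lagrange multipliers (nonzero only for the support vectors),
% t_n in {-1, +1} = training labels, k(.,.) = kernel function, b = bias.
y(\mathbf{x}) = \operatorname{sign}\!\left( \sum_{n=1}^{N} a_n t_n \, k(\mathbf{x}, \mathbf{x}_n) + b \right)
```

The kernel k is where all the biology below comes in: swap in a string kernel, a spectrum kernel, or a Fisher kernel and the same machinery classifies sequences instead of vectors.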
Here is a list of papers where SVMs were used in computational biology:
> Gene function from microarray expression data
>
> Knowledge-based analysis of microarray gene expression data by using support vector machines, Michael P. S. Brown, William Noble Grundy, David Lin, Nello Cristianini, Charles Walsh Sugnet, Terence S. Furey, Manuel Ares, Jr., David Haussler, Proc. Natl. Acad. Sci. USA, vol. 97, pages 262-267. pdf: http://www.pnas.org/cgi/reprint/97/1/262.pdf
>
> Support Vector Machine Classification of Microarray Gene Expression Data, Michael P. S. Brown, William Noble Grundy, David Lin, Nello Cristianini, Charles Sugnet, Manuel Ares, Jr., David Haussler. ps.gz: http://www.cse.ucsc.edu/research/compbio/genex/genex.ps
>
> Gene functional classification from heterogeneous data, Paul Pavlidis, Jason Weston, Jinsong Cai and William Noble Grundy, Proceedings of RECOMB 2001. pdf: http://www.cs.columbia.edu/compbio/exp-phylo/exp-phylo.pdf
>
> Cancer tissue classification from microarray expression data, and gene selection
>
> Support vector machine classification of microarray data, S. Mukherjee, P. Tamayo, J. P. Mesirov, D. Slonim, A. Verri, and T. Poggio, Technical Report 182, AI Memo 1676, CBCL, 1999.
>
> Support Vector Machine Classification and Validation of Cancer Tissue Samples Using Microarray Expression Data, Terrence S. Furey, Nigel Duffy, Nello Cristianini, David Bednarski, Michel Schummer, and David Haussler, Bioinformatics, 2000, 16(10):906-914. pdf: http://bioinformatics.oupjournals.org/cgi/reprint/16/10/906.pdf
>
> Gene Selection for Cancer Classification using Support Vector Machines, I. Guyon, J. Weston, S. Barnhill and V. Vapnik, Machine Learning, 46(1/3):389-422, January 2002. pdf: http://homepages.nyu.edu/~jaw281/genesel.pdf
>
> Molecular classification of multiple tumor types, C. Yeang, S. Ramaswamy, P. Tamayo, Sayan Mukherjee, R. Rifkin, M. Angelo, M. Reich, E. Lander, J. Mesirov, and T. Golub, Intelligent Systems in Molecular Biology
>
> Combining HMM and SVM: the Fisher kernel
>
> Exploiting generative models in discriminative classifiers, T. Jaakkola and D. Haussler, Preprint, Dept. of Computer Science, Univ. of California, 1998. ps.gz: http://www.cse.ucsc.edu/research/ml/papers/Jaakola.ps
>
> A discriminative framework for detecting remote protein homologies, T. Jaakkola, M. Diekhans, and D. Haussler, Journal of Computational Biology, vol. 7, no. 1-2, pp. 95-114, 2000.
>
> Classifying G-Protein Coupled Receptors with Support Vector Machines, Rachel Karchin, Master's Thesis, June 2000.
>
> The Fisher kernel for classification of genes
>
> Promoter region-based classification of genes, Paul Pavlidis, Terrence S. Furey, Muriel Liberto, David Haussler and William Noble Grundy, Proceedings of the Pacific Symposium on Biocomputing, January 3-7, 2001, pp. 151-163. pdf: http://www.cs.columbia.edu/~bgrundy/papers/prom-svm.pdf
>
> String matching kernels
>
> Convolution kernels on discrete structures, David Haussler
>
> Dynamic alignment kernels, Chris Watkins
>
> Support vector machine prediction of signal peptide cleavage site using a new class of kernels for strings, J.-P. Vert
>
> Translation initiation site recognition in DNA
>
> Engineering support vector machine kernels that recognize translation initiation sites, A. Zien, G. Ratsch, S. Mika, B. Scholkopf, T. Lengauer, and K.-R. Muller, Bioinformatics, 16(9):799-807, 2000. pdf: http://bioinformatics.oupjournals.org/cgi/reprint/16/9/799.pdf
>
> Protein fold recognition
>
> Multi-class protein fold recognition using support vector machines and neural networks, Chris Ding and Inna Dubchak, Bioinformatics, 17:349-358, 2001. ps.gz: http://www.kernel-machines.org/papers/upload_4192_bioinfo.ps
>
> Support Vector Machines for predicting protein structural class, Yu-Dong Cai, Xiao-Jun Liu, Xue-biao Xu and Guo-Ping Zhou, BMC Bioinformatics (2001) 2:3. pdf: http://www.biomedcentral.com/content/pdf/1471-2105-2-3.pdf
>
> The spectrum kernel: A string kernel for SVM protein classification, Christina Leslie, Eleazar Eskin and William Stafford Noble, Proceedings of the Pacific Symposium on Biocomputing, 2002. http://www.cs.columbia.edu/~bgrundy/papers/spectrum.html
>
> Protein-protein interactions
>
> Predicting protein-protein interactions from primary structure, Joel R. Bock and David A. Gough, Bioinformatics, 2001, 17:455-460. pdf: http://bioinformatics.oupjournals.org/cgi/reprint/17/5/455.pdf
>
> Protein secondary structure prediction
>
> A Novel Method of Protein Secondary Structure Prediction with High Segment Overlap Measure: Support Vector Machine Approach, Sujun Hua and Zhirong Sun, Journal of Molecular Biology, vol. 308, no. 2, pp. 397-407, April 2001.
>
> Protein localization
>
> Support vector machine approach for protein subcellular localization prediction, Sujun Hua and Zhirong Sun, Bioinformatics, 2001, 17:721-728.
>
> Various
>
> Rapid discrimination among individual DNA hairpin molecules at single-nucleotide resolution using an ion channel, Wenonah Vercoutere, Stephen Winters-Hilt, Hugh Olsen, David Deamer, David Haussler, Mark Akeson, Nature Biotechnology, 19:248-252 (01 Mar 2001)
>
> Making the most of microarray data, Terry Gaasterland, Stefan Bekiranov, Nature Genetics, 24:204-206 (01 Mar 2000)
Yikes! Well, it's going to be pretty hard for you to really understand Python without actually coding in it.
One thing you could do, though, is get a book with examples, type them out, and try to modify them to do something a little extra while at work.
I find the Head First books (http://www.headfirstlabs.com/books/hfpython/) the absolute best for almost anything if you are just starting out. The Java book is especially fun!
I know this isn't exactly what you are asking but it might be a good resource for you to start using.
Another great book that will teach you parts of the theory, with really good examples of how computers work, is http://www.amazon.com/Code-Language-Computer-Developer-Practices-ebook/dp/B00JDMPOK2/
That really helped me think about computers in a more intuitive way when I was first starting. It goes through the history, builds up to what an adder is, and more. I highly recommend that book if you want to understand how computers work.
If you're serious about getting into software development, I'd recommend you start looking into data structures and algorithms as well. It's something I think a lot of people who were self-taught tend to miss because it's not required knowledge to program, but it will give you a huge competitive advantage.
While I haven't read it, this book seems like a good introduction to the concept: https://smile.amazon.com/dp/1617292230/
From there I'd recommend looking at MIT's Intro to Algorithms, 3rd Edition. A bit more advanced, but the topics in there will play a huge role in getting a job in software.
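To give a flavor of what those books cover, here's the usual first example - binary search, which finds an item in a sorted array in O(log n) steps instead of scanning all n:

```java
// Classic example from any algorithms text: binary search on a sorted array.
// Each comparison halves the remaining search range.
public class BinarySearch {
    static int indexOf(int[] sorted, int target) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;    // avoids overflow of (lo + hi) / 2
            if (sorted[mid] == target) return mid;
            if (sorted[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        int[] xs = {2, 3, 5, 7, 11, 13};
        System.out.println(indexOf(xs, 7));  // 3
        System.out.println(indexOf(xs, 4));  // -1
    }
}
```

The interview-relevant part isn't memorizing this; it's being able to say why it's O(log n) and when it breaks (unsorted input).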
You would love Godel Escher Bach by Douglas R. Hofstadter. It won the Pulitzer Prize and is basically just a really good popular math/computer science/art book, and an excellent jumping-off point. Yes, it lacks mathematical rigor (of course), but if you are a bright, clever person who likes these things, it's a must-read just for the exposure to the interconnectivity of all of these topics in a very artistic and philosophical way. But be prepared for computer code, musical staff notation, DNA sequences, paintings, and poetry (all themed around Godel, Escher, and Bach).
> Would you say there's more opportunity working exclusively front end and design to exercise nfp creativity or novelty?
NFP creativity and novelty in the sense that Ne has free range, period? Sure, you get more of that in web design, and even more as you step further and further away from the sciences. But there is tons of creativity in real software engineering, where you can be creative solving actually challenging problems, not figuring out what color you'd like a button to be. To me, that's not creativity – or it's a lesser version. Creativity in problem solving is much more interesting. The way I see it is like when I was in music school and all the SFs were bitching about music theory and how they thought it limited their ability to "be creative". Such bullshit. It only exposes their lack of creativity. So you're saying that someone like Chopin, who wrote amazing pieces and abided by the rules of music theory, wasn't being creative? Hardly.
> Are you a web dev?
No, I'm a software engineer at an astrodynamics company; I do a lot of orbital mechanics, back-end work with web services, high performance computing, etc.
> By hardcore I meant requiring being meticulous, detail oriented.
I think a lack of attention to detail is never permissible in either back-end software engineering or front-end web development, honestly.
> One thing I've realized is how shit my high school was at explaining math conceptually. Which I think lead to misconceptions about its use in programming
Well, then read some books on computer science and/or mathematics like this.
Well, I pretty much hate it, but if you really wanted to you could pre-plan your projects using UML. You can find some free UML editors online if you look. ArgoUML is okay, but definitely needs improvement (like an undo feature).
Also if you are interested in programming patterns, the canonical text is the GoF book. If you search I'm sure you will be able to find a PDF version of it. There is also a website dedicated to talking about the same book in terms of the application to games:
http://gameprogrammingpatterns.com/contents.html
C and C++ are pretty different nowadays depending on your standard. "Game engine" is a pretty generic descriptor, because you can build game engines in a lot of different ways depending on your needs for the genre and how all-encompassing your engine needs to be, so I'm going to ask you a few questions about specifics in regards to your experience which might help to flesh out where you can start your search.
Hope this helps.
Hmm, alright. Considering your background, I'd recommend giving Michael Sipser's Introduction to the Theory of Computation a read (I'm sure there are many electronic copies floating around on the Internet). It covers the prerequisite math concepts in a preliminary chapter before the main content, which I highly recommend you spend some time on. It works its way up by walking you through notions of computation in increments, starting with finite state automata and adding features until it reaches a Turing machine. You can skip most of the exercises, since those are mostly for graduate students who need practice before undertaking research. If you ever get confused about concepts along the way, just drop me a PM or a question in /r/askcomputerscience and I'm sure the community would be happy to help out.
Also, if you're interested, I could mail you my copy of The Annotated Turing (meaning a copy I bought some time ago, not that I wrote it). It does a great job of explaining the concept of a Turing machine even if you come from a non-mathematical, non-CS background. I'd be more than happy to share my books with people who are interested, plus there's no use in me keeping it around now that I'm done with it.
Just bear in mind that, unlike most of science, the concepts here are very abstract. There aren't many direct physical implications; this really is a pure study of the notions at play, i.e. how does one go about studying "how to do things" and its implications? Details like "how can such a machine exist with an infinite tape? what moves it? how does it implement its decision-making scheme?" are all unimportant and ultimately inconsequential to the study itself.
Instead, what we care about are things like "I have a problem; is it possible for me to come up with a solution (algorithm) for it, or is it logically impossible?", or "I have come up with a way to make a 'computer'; can it do the things other computers can? If I had to make it sort an arbitrary set of numbers into numerical order, could it?". Turing machines are a tool to help us reason formally about these sorts of arguments, and to give insight into what we can qualify as "computation". Further down the line we even ask questions like "are some problems inherently more 'difficult' than others?" and "if I can solve problem B, can I somehow use the solution for B to solve some other problem A?"
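One thing that can demystify all this: a Turing machine is nothing but a current state, a tape, and a transition table. Here's a toy Java simulator under those terms - the "flip every bit, then halt at the first blank" machine is an invented example:

```java
import java.util.HashMap;
import java.util.Map;

// A tiny Turing machine simulator. The machine itself is just a transition
// table: (state, symbol) -> (next state, symbol to write, head movement).
// This example machine inverts a binary string and halts at the first blank.
public class TuringMachine {
    record Action(String next, char write, int move) {}

    public static void main(String[] args) {
        Map<String, Action> delta = new HashMap<>();
        delta.put("scan,0", new Action("scan", '1', +1)); // flip 0 -> 1, move right
        delta.put("scan,1", new Action("scan", '0', +1)); // flip 1 -> 0, move right
        delta.put("scan,_", new Action("halt", '_', 0));  // blank: stop

        StringBuilder tape = new StringBuilder("10110___"); // '_' marks blank cells
        int head = 0;
        String state = "scan";
        while (!state.equals("halt")) {
            Action a = delta.get(state + "," + tape.charAt(head));
            tape.setCharAt(head, a.write());
            head += a.move();
            state = a.next();
        }
        System.out.println(tape); // 01001___
    }
}
```

Everything interesting in the theory lives in the transition table, not the hardware - which is exactly why the "what moves the tape?" questions don't matter.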
Perhaps this all sounds perplexing now, but go through some of the material, spend a little time reading, and it should start to make more sense. Good luck with your future endeavors on this journey!
What’s your background?
Going by some curriculum sounds like a sure way to learn it all properly and in depth - though also a sure way to get bored easily.
The beauty of being self-taught is that you can learn the areas that actually interest you. There are of course fundamentals that you need, for which I recommend the vastly popular and very high-quality free course from Harvard - CS50x.
Taking it will give you a solid foundation to learn whatever you want by yourself. Whether it’s backend, webdev, mobile apps, data science... maybe even games/graphics, but for those you need deep math knowledge.
Sure there will be gaps here and there but CS50 really does a great job at teaching you where to look. I myself took it 6+ years ago and it was the perfect gateway into this career.
On another note, get this book - Code. It takes you from Morse/Braille code through logic gates all the way up to understanding everything (from a hardware/logic point of view) about a basic computer.
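To give a taste of where the book ends up, here's a sketch in Java (software standing in for wires and relays, purely illustrative) of addition built from nothing but gate-level operations:

```java
// The kind of thing Code builds up to: an adder made of nothing but logic gates.
// Chaining full adders bit by bit gives multi-bit binary addition.
public class Adder {
    static boolean xor(boolean a, boolean b) { return a ^ b; }
    static boolean and(boolean a, boolean b) { return a && b; }
    static boolean or(boolean a, boolean b)  { return a || b; }

    // full adder: sum = a XOR b XOR carryIn; carryOut = majority(a, b, carryIn)
    static boolean[] fullAdd(boolean a, boolean b, boolean cin) {
        boolean sum = xor(xor(a, b), cin);
        boolean cout = or(and(a, b), and(cin, xor(a, b)));
        return new boolean[] { sum, cout };
    }

    public static void main(String[] args) {
        // add 0b101 (5) + 0b011 (3), least significant bit first
        boolean[] x = { true, false, true }, y = { true, true, false };
        boolean carry = false;
        for (int i = 0; i < 3; i++) {
            boolean[] r = fullAdd(x[i], y[i], carry);
            System.out.print(r[0] ? 1 : 0);
            carry = r[1];
        }
        System.out.println(" carry=" + (carry ? 1 : 0)); // 000 carry=1, i.e. 0b1000 = 8
    }
}
```

Once you see that three boolean operations are enough to add numbers, the rest of the book (registers, memory, a CPU) feels a lot less like magic.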