(Part 3) Best products from r/programming

We found 115 comments on r/programming discussing the most recommended products. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 1,628 products and ranked them based on the number of positive reactions they received. Here are the products ranked 41-60. You can also go back to the previous section.

Top comments mentioning products on r/programming:

u/ForeverAlot · 1 pointr/programming

I don't know of any one source that teaches "good testing principles". There are thousands of sources and Sturgeon's law is working against you. A few sources are predominantly good, most have bits (often the same bits) of genuinely good advice in-between chapters of bland, uninsightful repetition, many are appropriations of popular acronyms by closely or distantly related professions (no, you're not "testing" a requirement specification, you're just reviewing it), and some sources are just plain bad.

I had an opportunity to attend Dan North's Testing Faster course and would strongly recommend it. In my case it was more helpful for formalising my own experience than learning concrete new things but other attendees did absolutely "learn new things". He made a point that "TDD" and "BDD" are both inaccurate names and that something like "example-guided development" would have been far more honest; he recommended a book, I think Specification by Example, as a good resource to that end (and noted that that name, too, is technically inaccurate). He also confirmed that Cucumber is a solution looking for a problem.

Test Driven Development: By Example by Kent Beck is a classic, and as far as I can remember, decent. It's maybe a little old now, and it definitely misses some subtle points about maintainability of automated tests in general (or perhaps rather, doesn't really address that).

I've skimmed Code Complete 2. I don't remember it in detail but my overall impression of it was that the sooner it becomes irrelevant the better, because that would signify our profession maturing (if not quite reaching maturity). A lot of its contents would be considered basic by contemporary software development standards and that's a good thing. I don't remember what it says about testing. One thing in a very late chapter (33.8?) stuck with me, though: that seniority has little to do with age and your approach to software development will be formed early on.

Working Effectively with Legacy Code by Michael Feathers is excellent, perhaps the most practically applicable one here.

Sandi Metz is famous in the Ruby community for speaking on this topic and there are recordings on YouTube. From what I've seen her material also mainly addresses beginners but it's fast and easy to consume and her form doesn't bother me the way Martin's does.

One piece of advice I picked up from one of those mostly-mediocre sources had to do with naming in tests, trying to capture the essentials. If you're relying on a particular property of a piece of input to test behaviour, make sure this is evident. Conversely, if any input would satisfy, avoid drawing undue attention:

fn bees_can_fly() {
    let some_bee = ...
    let bumblebee = ...
    let dest = ...

    assert fly(some_bee, dest);
    assert fly(bumblebee, dest);
}

fn bees_can_pollinate() {
    let some_bee = ...
    let flower = ...

    assert pollinate(some_bee, flower);
}
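
For anyone who wants something runnable, a minimal JUnit 5 sketch of the same naming idea might look like this; the Bee record and its methods are hypothetical stand-ins invented for illustration, not anything from the course or books mentioned above:

    import static org.junit.jupiter.api.Assertions.assertTrue;
    import org.junit.jupiter.api.Test;

    class BeeNamingTest {
        // Hypothetical domain stub, just enough to make the naming point.
        record Bee(String kind) {
            boolean canFly() { return true; }
            boolean pollinate(String flower) { return flower != null; }
        }

        @Test
        void beesCanFly() {
            // The bumblebee is named explicitly because that particular input is
            // the point of the test; "someBee" signals an arbitrary input.
            Bee someBee = new Bee("worker");
            Bee bumblebee = new Bee("bumblebee");

            assertTrue(someBee.canFly());
            assertTrue(bumblebee.canFly());
        }

        @Test
        void beesCanPollinate() {
            // Any bee satisfies this test, so the name stays deliberately generic.
            Bee someBee = new Bee("worker");

            assertTrue(someBee.pollinate("clover"));
        }
    }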

Testing is about developing confidence. There are many kinds of testing and many things to develop confidence in. For automatic tests it's more about checking (arguably not "testing") that you retain correctness in the face of continuous change. Automatic tests that obstruct that change or compromise your confidence are not helping you and should be rewritten or removed. Reliability of tests is usually more valuable than coverage, for instance.

u/aboothe726 · 1 pointr/programming

I agree. Books and tinkering are the very best way to get introduced to programming.

I read my first programming book the summer between my sophomore and junior years in high school. (I don't recall how old that made me at the time. 16? Bah, I'm getting old.) Anyway, that book was Herbert Schildt's C++: The Complete Reference. I found it really interesting, but knowing what I know now, Python (The Quick Python Book and Learning Python are good) or Java (I learned on Thinking in Java, but Effective Java is supposed to be good, too) are probably better places to start.

Hopefully your parents support your desire to learn programming. $30-$50 for a programming book and access to a computer are a small price to pay for starting a child on a hobby that could turn into a good career!

Good luck, and keep us posted! :)

u/paultypes · 5 pointsr/programming

I think things would be much easier with a small team of average developers, primarily because converging on any given set of things you want to do/want to avoid in the language is much cheaper from a communication-cost perspective. If the group is small enough (say 5-10 people), you can probably get 50,000-foot agreement, at least, over a nice lunch, and hash out the details on a Wiki or something.

Scala's great strength remains that it goes out of its way to accommodate you whether you're treating it as "Java without semicolons" or "Haskell without pervasive laziness and with much poorer type inference." Given that, your biggest question, regardless of the size of your team, is: what kind of software development culture do we have/want to create?

Let me sketch a few example scenarios:

  1. You have a non-trivial existing codebase in Java and Spring. You're excited about Scala, maybe are reading Programming in Scala or taking a Coursera course, and this typifies your team. You've looked around for information about how to use Scala with Spring, and found that information is thin on the ground, at best, and when you ask in the #scala IRC channel or on mailing lists, the response is "Yeccch. Don't." Not very helpful with your existing codebase! For this, I would say: work at whatever comfort level with Scala you have, and take advantage of spring-scala (full disclosure: I took over maintenance of spring-scala from the folks at Pivotal Labs who did the actual heavy lifting).
  2. You have the opportunity to do a greenfield project, maybe because you're refactoring your organization's monolith to be microservice-based, and you get to implement one of your organization's first microservices. You've heard great things about using Scala for microservices, but you still fall into the "reading books and taking Coursera courses" category in 1), so important factors here are availability of semi-pre-packaged solutions, community, documentation, etc. Here I would suggest looking at Lightbend's Activator and its templates for opportunities to quick-start a microservice, like the Akka HTTP microservice. Then you can gradually add functionality as requirements and/or your learning curve permit.
  3. You've heard a lot about pure functional programming in Scala, maybe you're reading Functional Programming in Scala, maybe you have at least one team member who's an intermediate-to-advanced Scala developer and/or you have a clear direction in HR to find them. Your colleagues might look at you funny when you say that a REST microservice is obviously just an HttpRequest => HttpResponse that "does some stuff," so it's actually an HttpRequest => F[HttpResponse] where F is a Monad, but they don't run screaming into the night; they want to know more. You decide to do a Proof of Concept with http4s for the REST stuff and Doobie to talk to good ol' PostgreSQL (because not all of your tech choices can be totally nuts). You might have a look at this example to get you started, and one of the things you notice about it immediately is... how straightforward and unscary the code actually is, strongly resembling what you'd write in any other language. But when you change the code, the compiler, way more often than not, tells you how you screwed up, before your code can even run.

    All of these scenarios are, in my opinion, completely reasonable, even if I got a little tongue-in-cheek on the last one. :-)
u/[deleted] · 1 pointr/programming

No, I see the difference, but I attribute it to something else.

Haskell is a simple typed lambda calculus language with very simple, straightforward semantics. In actuality, there isn't really much you can do in Haskell that couldn't be done in Scheme just as easily; the basic constructs are the same.

The difference comes from the fact that Haskell has a powerful type system that ensures correctness, whereas Scheme has (almost) no type system. So, doing something complex in Scheme requires you to get it right on your own with no correctness checking, whereas in Haskell, if you don't get it right, it simply won't compile. This attracts developers that want to do some very complex things: because the type system provides such a good safety net, those things can actually be done correctly without the programmer having to manually do correctness proofs in their head. This leads to complex monads and arrows and abelian groupoids and all sorts of things that scare even seasoned programmers. This is not IMO a symptom of the complexity of the language, but rather a reflection of the community that the strong type system attracted.

Haskell as a language can be programmed in a very Scheme-like way with no knowledge of monads or arrows or any of that stuff... those are constructs that grew up around the language in the same way that compile-time metaprogramming grew up around C++, not because the language demands it, but because the language makes it possible. There is even an introductory text on functional programming using Haskell as a base that never needs to get into such things. I would no more call Haskell complex because of the availability of libraries for doing stacked monads and Clifford algebras than I would call C complex because of the availability of BLAS and GMP, even though those are highly complex libraries.

Haskell programmers are not non-mortals :) Anyone who has gone through the above referenced book is a competent Haskell programmer and can do anything they need to do in the language. The language just happens to attract those super-programmers like Oleg K. and Don Stewart who use it to do brain melting tasks. You can program Haskell very effectively without ever even knowing what a monad is the same way that you can program C++ effectively without ever realizing that you can create compile time prime number generators with recursive templates.

At its core, Haskell is just a normal-order lambda calculus evaluator with a System F type system and a syntactic construct to parameterize types. The standard prelude offers pretty much the same functions any Scheme prelude offers and not much else. Anyone with any Lisp or ML under their belt is a competent Haskell programmer in about an hour.

u/clinintern · 2 pointsr/programming

I've been pretty fortunate that I've been able to get interviews and job offers pretty easily within the game industry as a programmer.

I wouldn't consider myself a master programmer by any stretch and definitely don't spend as much time practicing code katas, etc., as the poster does. (The majority of my spare time is spent on unpaid overtime - if I can get that under control, then maybe I can apply some time to personal projects).

The 3 keys that I have found to make it easier to land the interview and the job are:

  1. Networking - this is a MUST. If you know someone at the company that can vouch for you, sometimes the technical phone screen can be bypassed completely and it's a lot easier to position yourself above others when you have a good reputation before the interview. Call friends/colleagues that are still in the industry and ask if they are hiring or know anyone who's working somewhere where they are hiring. If there are conferences related to your specific industry (specialized types of programming), try to get to one of those and do what you can to get on a guest list for some of the company-sponsored parties. Talking to people in a social setting goes a long way and is typically more effective than attending dedicated recruiting sessions.

  2. Be energetic and friendly. Skills are only half the requirement; cultural fit and an easy-going attitude go a long way toward determining whether or not you'll get the job. How your potential colleagues feel about you is almost as important, if not more important, than whether they think you are qualified (networking can help a lot to achieve this feeling amongst your potential colleagues).

  3. Pick up this book and read it cover-to-cover: Programming Interviews Exposed. Most technical interviews ask questions from this book or variations of questions found in this book. Rarely do they really distinguish a good programmer from an average or bad one; what they really separate is people who are good at interview questions and have seen them before from people who are not, or who get a question they haven't seen. Every time I'm starting a new job search, I re-read this book, go over old programming tests that I saved and run through some of the problems from college programming tests on the ACM online judge website to get ready for obscure brain teasers.

    I can't really say whether the hiring process is broken or not, much of what they do does weed out people that really don't belong. But if you don't know the game or don't play the game, you may get swept to the side along with them.

    I hope this helps - good luck!
u/c3261d3b8d1565dda639 · 7 pointsr/programming

I think posting material like this with no context is silly, but I upvoted this anyway because I recommend these books often. The Introduction is very short and explains better than I could here why the books were written. The quality of the chapters varies, but they are mostly all worth reading. I'm excited for the upcoming The Performance of Open Source Applications, although I haven't heard any news about its progress in a long while.

One of the editors, Greg Wilson, did some research into how we can be effective programmers. Basically, continuing the research that books like Code Complete were based on. He wrote an excellent book Making Software: What Really Works, and Why We Believe It. He is also involved with the community blog It Will Never Work in Theory, which is tag-lined as software development research that is relevant in practice.

u/andralex · 15 pointsr/programming

I don't have any experience with Clay and have a vested interest in D, so take the following with a grain of salt.

Here are my impressions on Clay vs D2 after having read the brief docs on the wiki:

  • Clay seems to have no take on concurrency whatsoever. In fact even googling for Clay concurrency is quite difficult. D2 tackles concurrency issues directly with a no-default-sharing model.
  • I like Clay's syntax desugaring (what I call "lowering"), but it assigns a name to each operator, which I believe is adverse to generic programming. D2 lowers operators by passing the operator as a compile-time string into the function, which enables further generic processing.
  • Clay's iterators and coordinates seem to be similar to D's ranges. I'm unclear from the documentation provided about the pervasiveness of related entities and idioms in Clay. D2's ranges turned out to be wildly successful. Both use compile-time introspection to assess membership of a type to a concept. For example, in Clay you assess the presence of a member "size" like this:

    [S] SizedSequence?(static S) = Sequence?(S) and CallDefined?(size, S);

    whereas you do the same thing in D like this:

    template hasLength(R) {
        enum hasLength = isInputRange!R && is(typeof(R.init.length) == size_t);
    }

    I don't know how to enforce in Clay that the type of "size" is size_t; I'm sure there must be a way. And indeed the Clay version is shorter.

  • Clay mentions multiple dispatch as a major feature. Based on extensive experience with the topic I believe that's a waste of time. Modern C++ Design has an extensive chapter on multiple dispatch, and I can vouch that next to nobody uses it in the real world. Sure, it's nice to have, but its actual applicability is limited to shape collision testing and a few toy examples.
  • The docs offer very little on Clay's module system (which is rock solid in D2). The use of an antipattern in Clay's docs worries me though:

    import myownlib.test.*;

  • Both languages seem to pay attention to linking with C and assembling applications out of separate modules.
  • D2 is aggressively geared towards allowing its users to write correct code. Clay seems to have nothing else to say beyond a run-of-the-mill exception model.
  • Clay lacks object-oriented programming support, although it does offer a number of palliatives (discriminated unions, for which D2 has powerful library offerings). This design choice seems odd to me because I believe straight OOP is much more important than supporting the obscure multiple dispatch.

    Overall Clay's current offering (judging only by the docs) is very scarce compared to D2's. I found nothing in Clay that's sorely missing in D2, and conversely there's plenty in D2 that I'd sorely miss in Clay.

    Clay looks a lot like the toy language I was working on before deciding to build on D: long on hope, short on bread and butter. In hindsight, I am happy with the decision to start with a language (D1) in which I could take for granted a lot of stuff.
u/rjt_gakusei · 2 pointsr/programming

This book has a pretty strong breakdown of how computers and processors work, and goes into more advanced things that modern-day hacks are based on, like the address translation and virtualization involved in the recent Intel bugs:
https://www.amazon.com/Computer-Systems-Programmers-Perspective-2nd/dp/0136108040
The book can be found online for free. The author's website has practice challenges that you can download, one of them being reverse-engineering a "binary bomb". I did a challenge similar to it, and it felt pretty awesome when I was able to get around safeguards by working with the binaries and causing buffer overflows.

u/tkellogg · 2 pointsr/programming

I think you're entirely correct about this. There is definitely a lot of value in being precise in our words. Paul is an exceptional writer - very experienced and accomplished in writing with great precision. His book about functional programming is one of the best books I've ever read, period. Believe me, he understands your point very well.

However, this kind of writing is not fun and we end up sacrificing true innovation for the sake of "correctness". I've spent a lot of time blogging. All of the high-traffic posts I've written received the same kinds of nitpicking that Paul is writing about. I've noticed that I now spend a lot more time revising my words - and with each revision the words become tamer and tamer until there's often no real substantial content left. At this point, the joy of writing is gone and it feels impossible to explore the interesting topics that make programming fun. Often I don't even post my writing because the painful thought of the unsubstantial nitpicking feels like a bigger burden than the joy that comes from expressing the ideas.

I wonder how many great ideas never made it into the public light due to fear of this type of nitpicking. I understand the quality impact of nitpicking, but I often wonder if the cost is greater than the value.

u/bbutton · 5 pointsr/programming

I'd focus on identifying where problems are in code, more than on trying to fix them. I really believe fixing them is the easy part, with learning how to fix them in really tiny, incremental steps slightly harder. The key is appreciating what good code looks like and why lesser code can and should be improved.

SOLID is a great set of rules to start with, if you haven't already incorporated that into your thought process. I'd also recommend a really old book by Kent Beck as a great resource about thinking at the level of coding idioms. It's called Smalltalk Best Practice Patterns (https://www.amazon.com/Smalltalk-Best-Practice-Patterns-Kent/dp/013476904X) and oops, I had no idea it was so expensive! I bought it years ago, and have read it over and over. Kent has a way of talking you through what good code should look like. It's in Smalltalk, but his ideas are still great.

u/martoo · 9 pointsr/programming

Smalltalk Best Practice Patterns by Kent Beck. In my opinion, it's his best book. It's a great book on the nitty gritty of coding.. great for all programmers. It's easy to read even if you're not a Smalltalker; all you have to do is google for a Smalltalk cheatsheet.

I also like Working Effectively with Legacy Code. It's about the sort of code that most of us confront daily: how to deal with its problems, and get it under test so that you can refactor it or add to it without gumming it up.

u/arcticfox · 1 pointr/programming

Are you confusing dynamic typing with weak typing?

> That being the case, I find defining the available types explicitly

You do that with dynamic typing. Dynamic typing is not the absence of type... it's late binding.

> But you refer to this as imposing unnecessary clutter, which seems odd to me.

It's the explicit formalizations that are required by static typing that impose clutter, not the typing itself.

> I hardly ever use reflection.

I use reflection all the time and it helps me reduce the amount of code by at least a factor of 2. In many cases, this factor is larger (3x to 5x). (Note: these factors apply to sections of code and not the whole application.)

Here's an example from about 12 years ago. I was head developer on an application written in Java (the first ever Java application deployed, according to Sun and IBM). We had to use AWT (which has its own problems) and I became increasingly frustrated with their use of the observer pattern (listeners). The problem was too much formalism. I had to define listeners which could only respond to one message (actionPerformed). Then, I had to map this message to the real message... the one relevant to the problem. (For example, when an "ok" button is pressed, the action relevant to the application logic is "ok" and not "action performed". When you start building complex interfaces, the need to map between the conceptual space that is implemented by the model and the interface can add considerable needless complexity).

My solution was to use reflection. Even an implementation of reflection as poor as the one in Java was enough to remove huge swaths of code from the application. I created GUI components that allowed the registration of listeners, but the registration call included not only the reference to the listener but also the name of the method that should be invoked against the listener. This method was invoked reflectively. Our application was about 125,000 lines of code and this change removed about 15,000 lines. (BTW... this design was not mine. I had previously worked with NeXTSTEP and Objective-C. I simply emulated the design I had seen employed there).
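
A rough sketch of that reflective registration idea, in modern Java for concreteness; the class and method names are illustrative guesses rather than the original application's code:

    import java.awt.Button;
    import java.lang.reflect.Method;

    // A button that takes a target object plus the name of the method to call,
    // instead of forcing every caller to implement ActionListener and re-dispatch
    // from a generic actionPerformed().
    final class ReflectiveButton extends Button {
        ReflectiveButton(String label) { super(label); }

        void onPress(Object target, String methodName) {
            addActionListener(event -> {
                try {
                    // Look up and invoke e.g. okPressed() by name at runtime.
                    Method handler = target.getClass().getMethod(methodName);
                    handler.invoke(target);
                } catch (ReflectiveOperationException ex) {
                    throw new RuntimeException("No handler named " + methodName, ex);
                }
            });
        }
    }

    // Usage: the dialog exposes a method named after the domain-level action ("ok"),
    // and no per-button listener class or actionPerformed switch is needed.
    class OkDialog {
        public void okPressed() { System.out.println("ok"); }

        void wire() {
            ReflectiveButton ok = new ReflectiveButton("OK");
            ok.onPress(this, "okPressed");
        }
    }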

The change was significant. Not only did it remove a lot of code, but it significantly reduced the complexity of building GUIs. The complexity imposed by the static typing added no benefit to the application... its use served only to increase the complexity of the solution which made the application harder to code, longer to code and more difficult to change and fix.

> Perhaps our mindsets are different here, and you prefer to work more intuitively for longer, hence preferring a tool that better supports that approach?

In the example I gave above, the use of reflection significantly reduced the amount of code that was required. To me, this is a win regardless of one's analytic preferences. I see this effect whenever I employ the use of dynamic typing to a project.

The one thing to remember is that dynamic typing is a tool. It can be used well or badly, just as static typing can be used well or badly. It's necessary to take a disciplined approach when it is applied.

If you're interested, I highly recommend The Art of the Metaobject Protocol

u/gorset · 10 pointsr/programming

Ok, let's see. A few quick points:

Rooted in standard, everyday mathematical notation? public static final volatile interface abstract class transient inner class inner anonymous class objects values boxed values..... etc...

Java Generics has drawn heavily from the functional programming camp which is based on lambda calculus. See google tech talk: Faith, Evolution, and Programming Languages

Extremely easy to read and understand? Go read Java Puzzlers and Effective Java to see how many easy mistakes you can make.

Static types enable blablabla...? Not possible without static typing...? Does he realize that tools like Eclipse grew out of a Smalltalk project and that Smalltalk pioneered automatic refactoring ages ago? Smalltalk is one of the most dynamic languages around, and it's more than 30 years old.

Most bugs are found at development time with Java? That's not my experience :-)

Simple puzzle: for which values of someInt does this code fail?
int myPositiveInt = Math.abs(someInt);
assert myPositiveInt >= 0;
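
(For anyone who doesn't want to puzzle it out: the snippet fails for exactly one input, Integer.MIN_VALUE, because -Integer.MIN_VALUE overflows. A quick way to check:)

    public class AbsPuzzle {
        public static void main(String[] args) {
            // -Integer.MIN_VALUE does not fit in an int, so Math.abs overflows and
            // returns Integer.MIN_VALUE itself, which is still negative.
            int someInt = Integer.MIN_VALUE;
            int myPositiveInt = Math.abs(someInt);
            System.out.println(myPositiveInt);       // -2147483648
            System.out.println(myPositiveInt >= 0);  // false
            assert myPositiveInt >= 0;               // fires only when run with -ea
        }
    }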

u/knaveofdiamonds · 4 pointsr/programming

It depends exactly what you're looking for, but I'd strongly recommend The Algorithm Design Manual by Skiena ( http://www.amazon.com/Algorithm-Design-Manual-Steven-Skiena/dp/1848000693/ref=sr_1_1?ie=UTF8&s=books&qid=1261657117&sr=1-1-spell ).

It focuses much more on intuitive explanations rather than spending time on proofs, is very readable and has a solid reference section covering over a hundred different algorithms with further references. See Steve Yegge's writeup of the book at http://steve.yegge.googlepages.com/ten-great-books (number 6).

Cormen et al. is good as well, but a bit dry compared to the above book, and with a much heavier focus on the proof side of things.

u/trashhalo · 2 pointsr/programming

Disclaimer: This may not be true for your job, but it has been for every job I have worked at.

That everything they are teaching you about algorithms will not be useful to you when you get into the field. Your education starts day one at your first job. Clients don't pay us to innovate in algorithms. They pay us to find and glue together other people's libraries and to use this to present them the requested information.

Code you will actually be writing:

  • Glue code. Integrate Library X with Library Y.
  • Unit tests. To make sure your glue code works as expected.
  • UI code.

    Things you will be doing that a CS degree does not prepare you for

  • Fleshing out incomplete requirements documents
  • Dealing with drama between teams
  • Estimation
  • Understanding that 80% is often good enough

    I would suggest reading books like Design Patterns, Mythical Man-Month and Code Complete
u/lifeson106 · 7 pointsr/programming

Interesting that I was never required to read any of these books while in college. Luckily, I have read 5 of the 12 on my own time and they have definitely helped me in my professional development - Refactoring, Clean Code and Design Patterns in particular. I also highly recommend Peopleware and reading other people's code on Github or elsewhere, particularly if you are learning a new language.

u/reventlov · 2 pointsr/programming

First, be prepared to write code that sucks and is unmaintainable for a while. (This could be months or years.)

If you only know Java, then you'll need some C concepts. In particular, you need to become familiar with pointer arithmetic, avoiding buffer overruns, and manual memory management. (C++ has ways (RAII) to make manual memory management less error-prone, but you still need to understand what's going on.)

To learn the basics of the language, read The C++ Programming Language, Special Edition, by Bjarne Stroustrup.

Read a lot of things by Herb Sutter and Andrei Alexandrescu. I particularly recommend Modern C++ Design and Exceptional C++; I'm sure others in this thread can give you some good suggestions.

Finally, start writing code, and get it critically reviewed by someone who really knows C++. (I've known 2 or 3 such people in my career; they exist but are the exception, not the rule. Look for people who can explain all the things I listed, and more.)

(Edited to add The C++ Programming Language.)

u/inopia · 1 pointr/programming

Personally I'm very partial to Design Patterns: Elements of Reusable Object-Oriented Software. I see Java more and more as a software engineering language, rather than a programming language. You can do programming more effectively in Python/Jython or (J)Ruby, but Java for me still is king for developing type-safe, robust libraries and unit testing.

You might also want to read up on Eclipse or another decent IDE. Eclipse dramatically reduces the amount of monkey typing that all Java developers must endure, with things like templates, getter/setter generation, delegate method generation, etc. Since the editor parses the code as you type and keeps an AST in memory, refactoring support is excellent and you'll spend less time worrying about minor design issues when starting a new project. Code is compiled on the fly so startup times are minimal. It's also able to produce very descriptive and useful information about any errors you might have in your code (unlike GCC, for instance).

If you want to know a bit more about how the JVM itself works, read The Java Virtual Machine Specification, Second Edition, which is online and free. It'll give you a bit more insight into why some crappy things are as crappy as they are (backwards compatibility with Java 1.1). Read books that are recent enough to include language features of 1.5 and 1.6, such as static imports, enums, generics, varargs and so on, and decompile some Java code to see how the compiler implements them.

u/TimMensch · 2 pointsr/programming
  1. If you think that C++ isn't harder to learn than C, then you don't really know C++. Read Modern C++ Design, and a few dozen pages of the C++ FAQ, and if your head doesn't explode, tell me again that C++ isn't harder to really learn than C.
  2. I know the Linus article. I know he's competent. But I still don't agree; I think he's so steeped in his kernel code world that he doesn't see the advantages, at least for application software.
  3. Anyone who writes code in C++ but who doesn't know pointers, can't concat strings, etc., is someone who also really doesn't know C++.

    Someone who doesn't really know C++ can do things in C++ and be productive, but keep them away from writing templates, macros, and/or doing anything with pointers. The right library can protect amateurs from shooting themselves in the foot, or at least from doing it too often; that's why people talk about using a "subset of the language."

    I'm a game developer. It's still the case that most serious game developers demand C++, and for good reason.

    For reference: I've interviewed dozens of developers applying for a C++ game development job. Most rate themselves 9 to 10/10 in C++, and yet most also choke when I start asking them even slightly interesting questions about the language. Not even the standard (or any other) libraries -- just about the core language. This is all stuff I know like the back of my hand -- and I know that there are things in C++ that I couldn't just get right the first time (see Modern C++ Design for a few examples).
u/kanak · 1 pointr/programming

I think SICP is one of the greatest books I've ever read and that anyone who is serious about programming should read it (or be aware of the ideas discussed there).

However, it is a daunting book especially for newcomers (doubly so if the newcomer wants to get the most out of the book and wants to do every exercise).

I would recommend a book such as Simply Scheme to build up some background knowledge before tackling SICP.

I also highly recommend the Schemer series: Little Schemer, Seasoned Schemer.

u/cronin1024 · 25 pointsr/programming

Thank you all for your responses! I have compiled a list of books mentioned by at least three different people below. Since some books have abbreviations (SICP) or colloquial names (Dragon Book), not to mention the occasional omission of a starting "a" or "the", this was done by hand and as a result it may contain errors.

edit: This list is now books mentioned by at least three people (was two) and contains posts up to icepack's.

edit: Updated with links to Amazon.com. These are not affiliate - Amazon was picked because they provide the most uniform way to compare books.

edit: Updated up to redline6561


u/eadmund · 1 pointr/programming

I'd add The Art of the Metaobject Protocol which in a few short pages creates an entire flexible class system and demonstrates how to extend it to do pretty much anything one needs. And it's easy and pleasant to read too.

After you've read it, you'll never again look at problems the same way; it demonstrates that for the sufficiently-clever there's always a way to make the problem solve itself.

u/kurogashi · 5 pointsr/programming
Have a look at Effective Java by Joshua Bloch - Each chapter in the book consists of several “items” presented in the form of a short, standalone essay that provides specific advice, insight into Java platform subtleties, and outstanding code examples. The comprehensive descriptions and explanations for each item illuminate what to do, what not to do, and why.

:edit:: wrote Essential instead of Effective
u/illegible · 7 pointsr/programming

I highly recommend "The Code Book" to any novices interested in this sort of thing, it's readable and entertaining without being insulting or excessively complex.

u/in0pinatus · 23 pointsr/programming

I admire your dogged adherence to being wrong in every particular. It takes a special brand of stubborn contrarianism to quote someone's badly edited notes as a primary source and then follow up with the claim that this is the best possible research.

However, outside in the real world, Alan Kay writes extensively and authoritatively here and in his numerous contributions on Hacker News quite aside from publications spanning decades.

And an awful lot of people agree with his definition. The introduction of the classic Design Patterns defines objects as an encapsulated package that only responds to messages. People who teach OO programming readily quote Mr Kay's definition. The Ruby programming language is fundamentally based upon it, and before you shout "but Ruby has classes" note that Ruby classes are actually themselves objects, for which the new message happens to do something particular by convention. And so on; the point being that Alan Kay's definition is super influential, which is why the idea that Erlang is the most object-oriented language is not a new proposition.

u/artsrc · 1 pointr/programming

> It can mean two things:

> Test Driven Development
> Test Driven Design

It could mean two things but I think we would live in a better world if we said TDD means this:

http://www.amazon.com/Test-Driven-Development-Kent-Beck/dp/0321146530/ref=sr_1_1?ie=UTF8&qid=1323053751&sr=8-1

This is the group that popularized the technique, so why not stick to that.

u/weemadarthur2 · 1 pointr/programming

For an excellent, easily accessible description of some of these codes and the work done to break them, I recommend The Code Book by Simon Singh. Amazon link

u/rferranti · 2 pointsr/programming

Not an opposite view, but IMHO definitely worth quoting on this matter:

> If more than 20-25% of a component has to be revised, it's better to rewrite it from scratch. (Thomas et al, 1997)

It's Greg Wilson, author of "Making Software: What Really Works, and Why We Believe It"

u/munificent · 2 pointsr/programming

> What would be cool for Magpie would be a Meta-Object Protocal.

I just finished AMOP so that's definitely on my mind. My current plan is to get a solid object/multimethod system working first. Once that's in a happy place, I'll try to start moving bits of it over to Magpie itself and make them more extensible.

At the very least, I want user-defined patterns. That covers a lot of the MOP space since it would let you define your own method matching logic.

u/gnuvince · 3 pointsr/programming

I like Skiena; it gives a short intro to the theory at the beginning and then delves into actual algorithms. CLRS is good if you need more theory (e.g. solving recurrences).

u/AlSweigart · 1 pointr/programming

I think Amazon should have better delivery time, if you don't mind paying the full price. http://www.amazon.com/Invent-Your-Computer-Games-Python/dp/0982106017/

You can also download it as a PDF for free from the site.

u/ItsNotMineISwear · 2 pointsr/programming

Every algorithms textbook I've read starts with a chapter that involves breaking down algorithm runtime into mathematical terms more specific than Big O (for instance, nested for loops turn into nested sums). Then the mathematical terms are simplified as much as possible and you get something like Runtime(n) = n(n-1)/2. Then you apply the definition of Big O to prove that that's O( n^2 ). The concepts and definitions of Big Omega and Big Theta are usually brought into the mix as well.
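
A small, concrete illustration of that bookkeeping, with the closed form checked in code (the loop and names are made up for the example, not taken from any particular textbook):

    public class CountSteps {
        // Nested loops become nested sums: the inner loop runs (n-1) + (n-2) + ... + 1
        // times in total, which simplifies to n(n-1)/2 and is therefore Theta(n^2).
        static long countComparisons(int n) {
            long steps = 0;
            for (int i = 0; i < n; i++) {
                for (int j = i + 1; j < n; j++) {
                    steps++;  // one "basic operation" per inner iteration
                }
            }
            return steps;
        }

        public static void main(String[] args) {
            int n = 1000;
            System.out.println(countComparisons(n));     // 499500
            System.out.println((long) n * (n - 1) / 2);  // 499500
        }
    }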

If you want a specific book recommendation, I thought Skiena's Algorithm Design Manual was a good read.

u/EughEugh · 6 pointsr/programming

There are several good books on designing good software:

Code Complete

Design Patterns

Refactoring

u/srnull · 2 pointsr/programming

Run and get a copy of Computer Systems: A Programmer's Perspective. It covers everything from assembly to semi-modern computer architecture, operating system abstractions, and concurrent/parallel programming.

It's pricey, but worth it. A used copy of the first edition would be plenty good enough if price is an issue.

u/jms_nh · 1 pointr/programming

The article doesn't tell you how; it's more a handful of thoughts on how to do so, without any real data (as the author admits).

>But I think you should take more away than a handful of application-wide metrics. You should take away a preference for statistical and empirical consideration. Figure out how to quantify trends you observe and form hypotheses about how they impact code quality. Then do your best to measure them in terms of outcomes with the application. We lack a laboratory and we lack non-proprietary data, but that doesn’t stop us from taking the lessons of the scientific method and applying them as best we can.

Someone has already done this. Read chapters 8, 9, and 23 of Oram and Wilson's Making Software

u/mickrobk · 10 pointsr/programming

Programming Interviews Exposed is an extremely good resource, Steve Yegge also has a good post on areas a programmer should be proficient in.

Nothing is really a substitute for being knowledgeable and enthusiastic.

u/lgstein · 49 pointsr/programming

This is a nightmare. After reading Peopleware (http://www.amazon.de/Peopleware-Productive-Projects-Tom-DeMarco/dp/0321934113/ref=sr_1_1?ie=UTF8&qid=1421350607&sr=8-1&keywords=peopleware) you'd expect major players like Facebook and the like to have learned by now. But nooooo, let's continue to pretend a software company is a huge factory where people sit in front of monitors instead of working an assembly line. What else could be different?

It is entirely possible that they just do this to show off to stakeholders, because those aren't impressed by a row of closed doors.

u/ytinas · -4 pointsr/programming

Actually C++ does this as well, it's just better hidden from the user.

This is the issue with perl5 "OO". It may have the potential to be the same as C++ or Python or whatever, but the packaging is non-existent. You must do everything yourself. Manually. I suppose that's ok if you want to learn how OO actually works (though personally I would recommend going with The Art of the Metaobject Protocol).

Python does expose the fact that the "this" argument is passed to the function which is, as you stated, a mistake. But at least everything else has pretty decent packaging.

u/pchiusano · 2 pointsr/programming

I'm not sure what you are asking for is possible. The difficulty is that you need to somehow express a type signature that is impossible to express in other languages, and there is no pseudocode convention for the concept! You could look at, say, the implementation of sequence, and know that it works for Maybe, Either, and lots of other data types that come equipped with a Monad, but that implementation and type signature is going to be greek to you if you don't know Haskell. So basically, you just have to "trust" that it is a similar sort of thing to other abstractions like IEnumerable or whatever--it saves you from having to duplicate code.

IMO the benefits of monads and other abstractions only become clear after you've used and written enough functional libraries to really feel the code duplication problem solved by monads and their ilk. Until that point, it's difficult to grok. I know this isn't all that satisfying. :(

I co-wrote a book on Scala and we decided to hold off on even mentioning monads until more than halfway through the book, instead focusing on getting the reader lots of practice with the functional style using concrete examples. By the time we finally do introduce abstractions like monads, the reader has lots of concrete examples and (we hoped) a gut feeling for the code duplication problem that needed solving. I think this worked reasonably well for people, but even then it is still a bit of a leap!

u/mikepurvis · 20 pointsr/programming

> creating a strong cryptography algorithm is primarily trial and error

I don't think it's trial and error, more like a series of steps in which each is designed to befuddle a particular class of attack that may have been successful on previous iterations. This sort of thing goes back to the Cold War and even WWII—a machine like Enigma combined the scramblers and the plugboard, where the plugboard was specifically added to foil a type of dictionary attack.

Anyhow, not that I really know too much about it, but this is a fantastic book on the topic.

u/RedSpikeyThing · 8 pointsr/programming

Concrete Mathematics by Graham, Knuth and Patashnik is a great start to almost all of those topics.

u/fragglet · 5 pointsr/programming

Peopleware has an entire chapter on this, as I recall. Great book.

u/unovasa · 12 pointsr/programming

I really enjoyed Concrete Mathematics by Graham, Knuth, & Patashnik

u/xoxer · 2 pointsr/programming

This pretty much defines the TDD workflow. It's exactly as mr_chromatic describes it.

u/gauchopuro · 3 pointsr/programming

The "Gang of Four" design patterns book http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612/ref=sr_1_1?ie=UTF8&s=books&qid=1251223171&sr=8-1 is the one that started it all. There are code samples in C++.

I've also heard good things about the "Head First Design Patterns" book.

u/oneandoneis2 · 13 pointsr/programming

The Art of the Metaobject Protocol is one of my favourite programming books of all time - the sudden "click" in my head that marked the transition from "What the hell is a MOP anyway?" to "That's such a simple and awesome idea, why doesn't everyone do that?" was the closest I've gotten yet to the fabled "Lisp enlightenment" :)

u/daddyc00l · 23 pointsr/programming

there is an excellent book called Peopleware that goes into lots of management fads; check it out, you might just like it.

u/opensourcedev · 3 pointsr/programming

If you are looking to go a little deeper, I can recommend this book as well:

"Computer Systems: A Programmer's Perspective"

http://www.amazon.com/Computer-Systems-Programmers-Perspective-2nd/dp/0136108040/ref=sr_1_1?ie=UTF8&qid=1292882697&sr=8-1

This book has a similar thread, but is much more in-depth and covers modern Intel and AMD processors, modern Operating Systems and more.

I learned all about linkers, compiler optimization, the crazy memory management structure of x86...

u/gvwilson · 35 pointsr/programming

A shorter (and better) version of the talk is online at http://software-carpentry.org/4_0/softeng/ebse/. If you want to keep up with the latest in empirical studies of programming, check out http://www.neverworkintheory.org/ (where we're blogging once or twice a week about recent findings), or have a look at our book "Making Software" (http://www.amazon.com/Making-Software-Really-Works-Believe/dp/0596808321) which collects the most important findings, and the evidence behind them.

u/joshrulzz · 5 pointsr/programming

> Computer science is about building things like engineering but without the luxury of a toolbox and components taken from the physical world. No one has worked out reliable and effective procedures for building large pieces of software as the engineers have done for physical projects.

At first, I started to take issue with this statement because of software patterns. But I think he means more than this. The author's points were thus:

  • Engineering components exist in the real world - No one expects a widget to break the laws of physics.
  • Engineering components interact in more predictable ways - In the real world, separate subsystems are really separate.
  • Engineering has fewer fundamental midcourse design changes - Waterfall process works because people don't demand mid-course changes as much.

    IMHO, the author's points are a human problem, not an engineering problem. In point 1, a project manager didn't set expectations for a client. In point 2, developers did not use tools that exist. Buffer-overrun protection DOES exist (his example), and other technologies help modularize software properly. Hell, cohesion and coupling are among the core software design principles. In point 3, again, a project manager did not properly set expectations for the client. PM technologies like agile methods have been developed to fight this. Further, because software is much newer to the human knowledge collective, it's less well understood. When it has been around for as long as architecture and machines, clients will have better expectations of the process.

    In all, it sounds like the problem is not that software development can't be true engineering, but that his experience hasn't included modern software development techniques.
u/nopointers · 2 pointsr/programming

Check this thread for a reading list. IMO, The Algorithm Design Manual should have made the top too.

u/leoc · 1 pointr/programming

All gone now. (05:30 UMT 10 August) LiSP and Probability Theory: The Logic of Science are still in the top two slots, but amazon.ca appears to have sold out of new copies.

u/robertcrowther · 5 pointsr/programming
  1. He's not the author
  2. The book was published in 1994
  3. Java was first released in 1995
u/Woolbrick · 1 pointr/programming

Gang of Four: an industry nickname for the four authors who wrote this book.

u/TheSuperficial · 6 pointsr/programming

Andrei Alexandrescu's "Modern C++ Design". It was so mind-expanding and well-written, not to mention useful, I was reading it for 2-3 hours a day until I finished it.

u/keithb · 2 pointsr/programming

Try Test-Driven Development. In TDD, programming proceeds by accumulating ever larger catalogues of concrete examples. Get hold of Beck's Test Driven Development: By Example. Get your buddies to set you up with a TDD environment in Ruby and off you go.

u/zenon · 2 pointsr/programming

Effective Java is necessary if you work with Java.

u/LordHumongous · 1 pointr/programming

There's also a relatively cheap book on programming interview questions.

u/chunky_bacon · 3 pointsr/programming

I thought so too, until I discovered The Algorithm Design Manual.

So, what about CLR(S) warrants a new edition?

u/IronTek · 0 pointsr/programming

This one is one of the most popular (if not the most popular) books on C++ design patterns.

u/PsychoI3oy · 3 pointsr/programming

The Code Book for a primer on the basics up through some simple explanations of modern computer cryptography and the Handbook of Applied Cryptography for serious math proofs of a lot of the modern crypto standards in use today.

u/Shmurk · 4 pointsr/programming

Add Concrete Mathematics if you're a maths retard like me.

u/teknobo · 22 pointsr/programming

Even though this seems to be just aggregating some Stack Overflow answers into blogspam, I'll bite.

> Two lines of code is too many

If you're seriously going to complain about one extra line of code in a method, I don't see this ending well.

> If it's not native, it's not really programming

Semantics. Even if you don't call it programming, you'd damn well better know those things if you want to use them. SQL, Java, and any other VM-based language may not qualify as "programming" by this definition, but they're still damn useful.

> The "while" construct should be removed from all programming languages. (In favor of "Repeat...Until")

Semantics again. There is no functional difference between the two, and I would argue that while is actually preferable since it puts the looping condition right there on the same line, instead of having to skip to the end of the block to find out if you even entered the block in the first place.
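
To make the equivalence concrete, here is the same traversal written both ways in Java, using do/while for the repeat...until shape; the little summing example is made up purely for illustration:

    import java.util.Iterator;
    import java.util.List;

    public class LoopEquivalence {
        // Pre-test loop: the condition is visible on the same line as the loop.
        static int sumWhile(List<Integer> xs) {
            int total = 0;
            Iterator<Integer> it = xs.iterator();
            while (it.hasNext()) {
                total += it.next();
            }
            return total;
        }

        // Post-test loop: the body always runs once, so an explicit guard is needed,
        // and the condition is only revealed at the end of the block.
        static int sumDoWhile(List<Integer> xs) {
            int total = 0;
            Iterator<Integer> it = xs.iterator();
            if (!it.hasNext()) {
                return total;
            }
            do {
                total += it.next();
            } while (it.hasNext());
            return total;
        }

        public static void main(String[] args) {
            List<Integer> xs = List.of(1, 2, 3, 4);
            System.out.println(sumWhile(xs) + " " + sumDoWhile(xs));  // 10 10
        }
    }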

> Copy/pasting is not an anti-pattern.

No, it's not, and it's been proven. I'm having a hard time finding the peer-reviewed study on copy/paste programming right now, but basically, it's been shown to save a lot of time as long as you're using it properly.

Where the hatred for it comes in is that, like GOTO, if you use it too often, you'll probably end up using it wrong.

> Developing on .NET is not programming, it's just stitching together other people's code

A reiteration of his 2nd point, but honestly, a huge amount of working as a professional programmer -- hell, almost the definition of working in a team -- is stitching together other people's code. There's nothing wrong with that, and it's hardly controversial.

> The use of try/catch exception handling is worse than the use of simple return codes and associated common messaging structures to ferry useful error messages.

This has been getting debated a lot in go-lang circles, but the general consensus seems to be that unless you're working in an embedded environment (or some other highly-constrained environment), you're probably better off with try/catch.

> Test constantly

Test-Driven Development is something that I personally agree with, and truthfully has become a very popular practice among Rails people. I don't see how that would qualify it as being controversial.

That said, certain studies have shown evidence that TDD is not as effective as many seem to believe.

> Object Oriented Programming is absolutely the worst thing that's ever happened to the field of software engineering.

I've heard this claim semi-often. It seems to mostly come from people having worked with languages that claim to be OO but constantly make exceptions to the rules, like Java, C++, or Python. In fact, the author specifically calls out Java.

Try Smalltalk or Ruby and you'll come to see that OOP done right is actually quite wonderful.

> C (or C++) should be the first programming language

Debatable, but certainly not controversial by any stretch of the imagination.

> Classes should fit on the screen.

How big is your screen? I can fit any class definition on a 64" monitor.

Some classes simply must be large. It is an unavoidable fact that certain things are simply more complex to model than others. This point isn't controversial, it's just asinine.

> Making invisible characters syntactically significant in python was a bad idea

This again? Is it really a controversial opinion if it's been something non-Python programmers have been whining about for decades? Because as far as I can tell, people whine about it for about the first five minutes of Python coding, and then give up because they would've been indenting anyway.

It can cause bugs when transferring code between computers, I'll give them that. Otherwise, it's Python demanding good formatting, something that you should be demanding from everyone on your team anyways.

My main regret with Python is that I haven't found a good tool that auto-formats everything (a la "gofmt").

But otherwise, Python's indentation requirements are so in line with common indentation in almost every programming language that proper indentation comes naturally to more or less everyone. In how many programming languages that you regularly use do you not format your conditional, looping, class/method, or exception blocks?

> Singletons are not evil

It's not controversial to agree with Design Patterns. That book is more or less the undisputed truth on the subject, and it thinks the Singleton pattern is fine and dandy.
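
For reference, the pattern under discussion is tiny; a minimal thread-safe Java rendering (class name and payload made up for illustration) looks something like this:

    public final class Configuration {
        // Eager initialization: the JVM's class-loading guarantees make this thread-safe.
        private static final Configuration INSTANCE = new Configuration();

        private final String environment = "production";

        private Configuration() { }                  // no outside instantiation

        public static Configuration getInstance() {  // the single global access point
            return INSTANCE;
        }

        public String environment() {
            return environment;
        }
    }

    // Usage: Configuration.getInstance().environment()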

u/yellowstuff · 3 pointsr/programming

There's a whole section on this problem in Concrete Mathematics. The first edition was published in 1989, and the problem was old then.

u/gte910h · 0 pointsr/programming

>Classes without template parameters are not in a different type system than those with them.

Many people very strongly metaprogram in C++, using templates to make abstract types that bear no resemblance to traditional C++ classes, doing polymorphism, data hiding and various other canonical aspects of classes without a class keyword in sight. It quite clearly is a different type system, much more esoterically different than the one in ObjC, being compile-time. We're not talking "classes with template parameters". It's a second, whole-cloth type system. Interfaces (as Java would call them) without the interface declaration.

>Also what the hell are "new style" vs "old style" template classes.

An old style template class is a class that works on a datatype or two which is templatized. A new style template class is something out of http://www.amazon.com/Modern-Design-Generic-Programming-Patterns/dp/0201704315 a very popular style in some shops.

>And what the hell are you even talking about with your "weird extra encapsulated ways"?

Class1 has a private member variable pointing to a second, completely different class called Class1_impl, which implements the internals of Class1. This is an attempt to enforce data hiding from friending.

The reason I listed more than 2 type systems, is because I doubt you can say ALL the listed type systems I have are the same one. If even 2 of them are, C++ has "at least as many" as objective C does (or Java for that matter). If you've NOT encountered these esoterica, I just think you need to work in 2-4 more C++ shops and you definitely will. They're pretty common in my experience.

u/mlester · 1 pointr/programming
u/_georgesim_ · 1 pointr/programming

Well, simply thinking that Knuth could make such a mistake strikes me as naive. After all, he wrote this and this.

u/StoneCypher · 0 pointsr/programming

No, it isn't. Wikipedia is wrong, as usual, because it's written by people who learned what MVC means from random frameworks that also get it wrong.

http://www.amazon.com/Design-Patterns-Elements-Reusable-Object-Oriented/dp/0201633612

u/legacyReasons · 3 pointsr/programming

I hate tech tests.

I worked at a company where we needed a C++ developer to work on some legacy C++ code we had. We had no C++ developers but my manager still had to scramble around for a tech test for a C++ developer.

I don't understand why there is no certification for developers. Everyone thinks their tech test is unique but they are not. I've been to many tech tests and they all cover the same crap that is not relevant to any job I've ever had. If tech tests were unique, books like this would not exist: http://www.amazon.com/Programming-Interviews-Exposed-Secrets-Programmer/dp/047012167X/

And if I need an accountant I don't put them through a tech tests to see if they are qualified.

u/nostrademons · 7 pointsr/programming

Largely because CL has had the benefit of several years and hundreds (thousands?) of users beating on it, while an ignorant reimplementation is likely to make all the same mistakes that Lisp did when that feature was first introduced.

For example - generic functions were first introduced in Flavors, the object system for the Lisp Machine. A bunch of issues cropped up concerning the interaction of multiple dispatch and multiple inheritance, issues that the Python community hasn't even considered. There's a whole book - The Art of the Metaobject Protocol - written about these, along with many pages in the CL HyperSpec. Even with standardization, CL made a mistake: the superclass precedence order should be monotonic. The paper A Monotonic Superclass Linearization for Dylan goes into this in more detail.

u/rockstar_artisan · 15 pointsr/programming

Can't speak for the whole subreddit, but my personal hatred towards him comes from Martin being a snake-oil salesman.

Take his 3 "laws of TDD". Advertised as a practice equivalent to doctors washing their hands, they result in horrible and unmaintainable test suites (I've seen plenty of tests whose idiocy couldn't be explained any other way than by those rules).

Take his "architecture" lectures. Advocating programming in ruby on rails while abstracting away everything about rails or the web or the database, or anything you didn't define yourself. In principle to enable testing and be able to run the code outside the web. In practice: https://youtu.be/tg5RFeSfBM4?t=299 The lecturer says that the architecture in practice is "horribly complicated" and that Bob had only a single implementation of the architecture, which he couldn't share. That didn't stop him from continuing with his lectures.

Take his blog posts (an example: http://blog.cleancoder.com/uncle-bob/2017/03/03/TDD-Harms-Architecture.html). Awful strawmen (this terrible architecture I drew is what you propose), cringeworthy writing style (lengthy discussions with made-up opposition instead of real ones). Unfalsifiable theories: "It is only programmers, not TDD, that can do harm to designs and architectures." That sentence is always true: either you succeeded and therefore TDD works, or you failed and TDD had nothing to do with it. No data to back up his claims that "TDD works" - well, there are plenty of publications that disagree (see chapter 12 of https://www.amazon.com/Making-Software-Really-Works-Believe/dp/0596808321)

Take his SOLID principles. While some of them are well defined and can be applied meaningfully (Liskov substitution principle), the Single Responsibility Principle has a definition so weak and useless that it's ridiculous.