#9 in Software testing books

Reddit mentions of Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing

Sentiment score: 2
Reddit mentions: 2

We found 2 Reddit mentions of Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing. Here are the top ones.

Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing
Features:
  • Pragmatic Bookshelf

Specs:
  Height: 9.25 inches
  Length: 7.5 inches
  Number of items: 1
  Weight: 0.8157103694 pounds
  Width: 0.51 inches


Found 2 comments on Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing:

u/tech_tuna · 2 points · r/QualityAssurance

My feeling is that the more you can condense and summarize your test plans in this context, the better. I'd even argue that most companies handle test plan management poorly.

I've read two testing books recently that discuss this in a way that I find palatable and sensible:

http://www.amazon.com/Explore-It-Increase-Confidence-Exploratory/dp/1937785025

http://www.amazon.com/Google-Tests-Software-James-Whittaker/dp/0321803027

In both books, the authors argue that it's better to create high-level test summaries/strategies than fully expanded, granular lists of EVERY SINGLE test case.

I completely agree with this sentiment; however, the problem I have with Explore It! is that the author proposes the term "Test Charter" for this type of test plan. Personally, I don't like how that sounds, and I'm both hesitant and skeptical about adding new QA-specific lingo into the mix, i.e. using new terms to describe testing to others (non-testers). I am not a fan of the terms Exploratory Testing and Context-Driven Testing, and I also have a problem with the whole testing vs. checking debate brought on by Michael Bolton and James Bach.

Overall, I like the Explore It! book much more than the Google one - I found the Google book lacking in details about the "magic" of Google's testing processes... also, it should be noted that James Whittaker left Google (for Microsoft!) a few months after that book was published...

Anyway, the core problem is that no one wants to sit through a review of a gigantic spreadsheet, or whatever tabular format you are almost guaranteed to be using. It's incredibly boring.

Furthermore, you will have a very difficult time getting a developer to review a test plan that looks like this. Visually scanning a long list of test cases is anathema to most developers; trust me, I've been there and done it. This isn't entirely bad either: any good developer naturally despises repetition and inefficiency, which is exactly the problem here - reviewing a long list of test cases isn't an efficient group activity.

Also, the tool/format matters a lot. I've used a bunch of different tools to manage test plans; my current favorite is TestRail. It's not perfect, but it's much more pleasant than anything else I've used in the past (NOTE: I have nothing to do with the company that makes TestRail, but I used it at my last job and we use it at my current company).

tl;dr Ask people to review a test plan summary. You may want to call out specific risks and challenges; just don't ask people to read through a list of 500 rows in a table somewhere. In the setting of a meeting, a high-level presentation (PowerPoint or whatever) might be a good starting point, followed by a Q&A and brainstorming session.

I could go on but I feel like test plan management is yet another aspect of testing software that everyone seems to disagree about. :)

u/Yogurt8 · 0 points · r/QualityAssurance

Most "schools" that offer QA programs or courses are usually a waste of money. This is due to the fact that there are not many regulations or standards that exist for education in this field. They can teach some extremely outdated syllabus and get away with it because their students and admins do not know any better (look at all the useless certifications out there). Testing is an extremely nuanced and complicated art, it's one of those things that is very easy to get started and do badly, and most people cannot tell the difference. This is an area where I'd like to make a difference later in my career. For now though, if you want to get into testing, I would suggest you to both learn the automation side (even though you didn't pass your java course, you are still probably technically savvy enough to learn the basics and go from there) and the theoretical testing concepts.

You get a lot of devs who do not have a testing mindset, or testers without enough technical skills / coding experience. If you can do both really well, you will be looked at like a unicorn and can make a good living (depending on your country/area).

The easiest way to get into automation is learning through a tool like Postman (back-end testing) or Selenium. There are tons of Udemy courses and YouTube content for these.
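To give a sense of what "the basics" look like, here is a minimal sketch of a first Selenium script using the Python bindings; the URL, locator, and assertion are placeholders chosen for illustration, not anything from the comment above:

```python
# Minimal Selenium sketch (Python bindings, Selenium 4+).
# Assumes the selenium package and a local Chrome install; the URL and
# locator below are illustrative placeholders only.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # Selenium Manager resolves the driver binary
try:
    driver.get("https://example.com")
    # A trivial "check": find the page heading and assert on its text.
    heading = driver.find_element(By.TAG_NAME, "h1")
    assert "Example" in heading.text, "unexpected page heading"
finally:
    driver.quit()
```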

Check out Valentin Despa's content for Postman, and John Sonmez's or Naveem's stuff for Selenium.

For testing concepts such as analysis, risk, quality criteria, communication, and test design and techniques, I would suggest reading the following books:

https://www.amazon.ca/Explore-Increase-Confidence-Exploratory-Testing/dp/1937785025

https://www.amazon.ca/Lessons-Learned-Software-Testing-Context-Driven/dp/0471081124

https://www.amazon.ca/Perfect-Software-Other-Illusions-Testing/dp/0932633692

and consider taking Rapid Software Testing classes from Michael Bolton or James Bach; they get pretty theoretical but are based on practical work that you will be asked to perform.

These videos can also give you a pretty good sense of the testing role:

https://www.youtube.com/watch?v=ILkT_HV9DVU&t=19s

https://www.youtube.com/watch?v=3FTwaojNkXw&t=2048s