Don’t Break Apps: Dispel Notions

You might have heard it from a tester or two, now and again: “I get paid to break apps.” It’s cute and funny, and who doesn’t love being either or both? But it’s wrong. Unless you are the one writing the application, you are not “breaking” it in your testing adventures.

I firmly believe in the spirit of “breaking an app”: attacking a program with tools and chaos, exploiting known vulnerabilities and the like. That’s the stuff we live on. I enjoy pummeling an app just as much as the next gal, but I take issue with the cavalier notions that may come from wearing the term as a badge of honor. Don’t get it twisted: we’re not just breaking things. You don’t test a laptop by running it through a dishwasher and drying it with a ball-peen hammer. That’s not what I do.

Our role is to expose the weaknesses and nonfunctioning areas you proclaim to have broken. To surmise and posit defects and report our findings. To illuminate the value status to our team and clients. To research everything even remotely related to the program and use our imagination to put the application through the rigors of human use and programmatic abuse, highlighting the areas where things just aren’t behaving as expected or where our expectations don’t fit. Our pride should come from effectively, expediently dispelling the notion that things are working. Refuting, in learned detail, the belief that the program is operating correctly and effectively. Hoisting a flag when it appears as though we have “broken” the app.

I dispel notions for a living.


It’s a Bug Report

And here we have an example of a bug report form and my intentions for each area. It looks like a lot of stuff, and maybe some of it is redundant, but you just clip what you don’t have and give ’er all you got.

Title of Your Bug Report Should Sum Up The Whole Thing


Description

  • This is where you expand your issue in more of a paragraph form.

Steps to reproduce

  1. you open the app
  2. you do a thing
  3. you log it
  4. you number these steps for clarity

Results expected

  • what you were hoping, wishing, dreaming and expecting to see as the result of your above numbered actions

Actual results

  • what you observed after the steps that was not in line with your expectations


Reproducibility

This list should cover what you found when attempting to reproduce:

  • Repeatable
  • Non-repeatable
  • Situational
  • Unknown
  • Always
  • Often
  • Sometimes
  • Once


Severity

This is the proposed severity based on your understanding of the issue; prepare to be overridden by your favorite Designer, Developer, Product Owner or Project Manager:

  • Crasher
  • Blocker
  • Critical
  • Major
  • Minor
  • Trivial
  • Enhancement

Workaround

  • If this is a customer-facing issue, ensure support has a way around things. Being able to see a path around may help the bug report reviewer figure out the source and severity.

Proposed Solution

  • Anything outside of the Expected section above that you feel would resolve the issue.

Documentation / External References

  • This is where things like the HIG, Wikipedia, design patterns, specs or the like can be noted if you think they might help move the case forward.

Existing Cases / Internal References

  • Tangentially related bugs you have already filed or documents and requirements from the client or whatever occurs to you.

Regression Status

  • In your repro you may have checked a prior build or release, branch or commit; this can be enormously helpful to your resolver.

Configuration Details

  • Hardware – be explicit
  • OS – be explicit, harder
  • Build Info – down to the SHA or Jenkins link or whathaveyou
  • Pertinent Settings – network issues, proxies, OS settings, etc


Attachments

  • Sample files, mock data, test accounts, the build, screenshots, videos or whatever source you have to reproduce the issue

Personal Notes

  • The personal appeal, confessions of known biases, love notes and curious musings.


Skeleton Test Manifest

This is one part your standard test plan, one part my weird, weird brain. Context is everything. Take what you want and leave the rest. The test approaches are by no means exhaustive but serve as a good jumping-off point to get you going.

App Test Manifest

What is this app? 30K ft view. Make it pithy.

Document Usage Guide


This is where the project mission statement goes. If you don’t have one from outside, write your own. Provide an overview of the plan in terms of what the particular project covers. You may briefly mention items such as limitations in resources and budgets, scope of testing and how other activities such as reviews are related. This is just a summary, so keep things short.


Responsibilities

Provide details of who has the responsibility of delivering different parts of the test plan. Name names. List specific parts of the testing.


Give the lowdown of who is doing what on the project outside the scope of testing. Name names. List titles and departments, maybe even contact data.


Approach

Your overarching testing paradigm, if that wasn’t covered elsewhere.


Assumptions

Testing assumptions we will be making.

Staffing and training needs

Identify the people and skills needed to deliver the plan.


Test Deliverables

Expected items from working through this guide. This may include bug reports and updates to the guide itself. Identify what should be delivered as part of this plan, including pass/fail reports, performance tracking, customer-ready documents, etc. Everything that will be turned over to another party.

Environmental needs

Are there any special requirements for these tests? You may want to consider things like special hardware, test data or restrictions to any system during testing.

Testing tasks

Outline what functional tasks are required, with the exception of the actual testing. You may want to consider equipment setup along with any administrative tasks.

Suspension criteria and resumption requirements

State in what circumstances to stop and restart the test. Define clearly.

Item pass/fail criteria

State the acceptable pass/fail criteria. These can be general criteria or set at the individual test case level.


Test Strategy

Outline the overall test strategy for this plan, identifying the test process and rules that will be followed. You may also want to include any information about tools, software, or hardware that will be used.

Test items

List the items that will be tested. This could be a software product, system or website. This is the overarching thing we are testing.

Features to be tested

List the features that will be tested and are within the scope of this test plan. This is broad; don’t worry about every tiny feature or sub-feature. Those will come later. Will they ever.

Features not to be tested

List the features / requirements that will not be tested and are outside of the scope of this test plan.


Project setup and hot links: external tools, test accounts, mock data, knowledge bases, etc. Know your roots.


Schedule

Provide a realistic estimate of the time required. Think about sections, tasks, subsections, regression testing and publish intervals.

Risks and contingencies

Identify any known areas that are high risk and may cause delays or disruptions.

Approvals & Completions

Identify who can approve the completion and what that even means.


Where is each approach outside of this document covered? Unit testing. Smoke testing.

Document Maintenance

Who how where and when regarding this document. Dropbox, DVCS, alongside the project source?


Plans, Areas, Methods, Features and Checklists

Testing the app from 10K ft to 0.01 mm away. This is where we start to get our hands dirty.

Smoke Tests

Identify and automate the core functions that need to pass in order for a build to be ready for testing. Perhaps passing unit tests first. Perhaps just confirming the build can open and close. Perhaps any automated tests you have lying around. Figure this out and attach it to the build process.
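That gate can be sketched as a tiny runner that fails the build on any non-zero exit. The check names and commands below are placeholders, not anything from this guide; swap in your project’s real launch/quit or unit-test commands.

```python
import subprocess
import sys

def run_smoke_suite(checks, timeout=30):
    """Run each named check command; a build is smoke-clean only if all pass.

    `checks` maps a check name to an argv list. A non-zero exit, a missing
    binary, or a timeout all mark the build as not ready for deeper testing.
    """
    results = {}
    for name, argv in checks.items():
        try:
            proc = subprocess.run(argv, capture_output=True, timeout=timeout)
            results[name] = proc.returncode == 0
        except (OSError, subprocess.TimeoutExpired):
            results[name] = False
    return results

# Placeholder checks: stand-ins for your build's actual open/close commands.
checks = {
    "interpreter_starts": [sys.executable, "-c", "print('ok')"],
    "exits_cleanly": [sys.executable, "-c", "raise SystemExit(0)"],
}
results = run_smoke_suite(checks)
build_ready = all(results.values())
```

Wired into the build process, a red result here blocks the build from ever reaching testers.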

Unit Testing

Describe any unit testing in the project and how it relates to the users of this document.

Automated Testing

Give a census of all automated tests and point to instructions for maintenance or additions.

Basic CRUD Testing

Create, Read, Update, Delete. Perhaps this is also automated and attached to the build process. Write a script that will do all four in one swoop and also check each action.
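One way to sketch that swoop, with an in-memory sqlite3 table standing in for whatever store your application actually writes to:

```python
import sqlite3

def crud_sweep(conn):
    """One pass through Create, Read, Update, Delete, checking each step."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")

    # Create: insert a row and confirm we got an id back.
    cur.execute("INSERT INTO notes (body) VALUES (?)", ("first draft",))
    note_id = cur.lastrowid
    assert note_id is not None

    # Read: the row comes back exactly as written.
    row = cur.execute("SELECT body FROM notes WHERE id = ?", (note_id,)).fetchone()
    assert row == ("first draft",)

    # Update: change it and confirm the change stuck.
    cur.execute("UPDATE notes SET body = ? WHERE id = ?", ("final copy", note_id))
    row = cur.execute("SELECT body FROM notes WHERE id = ?", (note_id,)).fetchone()
    assert row == ("final copy",)

    # Delete: remove it and confirm the table is empty again.
    cur.execute("DELETE FROM notes WHERE id = ?", (note_id,))
    assert cur.execute("SELECT COUNT(*) FROM notes").fetchone() == (0,)
    return True

ok = crud_sweep(sqlite3.connect(":memory:"))
```

The point is the shape: each action is immediately verified, not just fired and forgotten.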

Hardware Testing

Maximum and minimum supported hardware. Document and test against them. CPU, RAM, et al. Make sure to do a check on virtual machines (faked hardware) and graphics drivers/cards.

Input Method Testing

Keyboard, touchscreen, trackpad, voice, pen, foreign language devices, assistive devices, bluetooth enabled devices, other supported peripherals.

Bounds Testing

Wedging for fun and for profit.

Feature Testing

Map each application-specific feature hierarchically and verify each performs as expected. Go as deep as you are able. Functionality.

UI/UX Testing

Layouts. Sections. Buttons. etc. Define views and pages and their respective parts. Cover animations & transitions here. Get flows and storyboards so you have known outcomes.

Screens Testing

Resolution, screen size, pixel density, window size, responsiveness, orientations, external screens.

Search Testing

Searching, finding, highlighting and all things related to search within your application.

Preferences Testing

User configurable application preferences. In application and behind the scenes.

Communication & Dialogs Testing

Interactivity, clarity, spelling, context, behavior. Feedback.

Menus, Key Equivalent Testing

Checking each and every menu item, contextual menu item, each keyboard shortcut and all variations and states.

Configuration Testing

Test each and every internal setting and the features they touch.

Internationalization Testing

How does it handle another locale or language setting?

Localization Testing

Is the app localized?

Claims Testing

Review marketing materials and ensure that each thing is true. Review release notes and ensure each new entry is accurate.

Feasibility Testing

Is the product feasible on the marketplace? Are you solving a problem that exists? Did you begin the product starting at the customer and not the technology behind the app?

Beta Testing

Exploratory black box testing from non-stakeholders. Lay out your plans here.

Documentation Testing

Help guide, about page, getting started – any piece of written stuff that is available in-application.

Security Testing

Buffer overruns? SQL injection? Gatekeeper, signing, sandboxing, injecting, licensing, etc

Binaries Testing

Inspect the application bundle in the Finder as deep as you are able. Do the same in Xcode and with any known inspection tools you have. Run strings over it. Try to fiddle with the innards and see if there is anything inappropriate.
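If you want to script the strings pass, a rough Python stand-in is easy to write. The watch-list of markers below is a hypothetical starting point, not a complete audit; the blob is a toy, not a real binary.

```python
import re

def extract_strings(data, min_len=6):
    """Rough stand-in for the strings tool: pull printable ASCII runs."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data)]

def flag_inappropriate(found, patterns=("password", "TODO", "staging", "debug")):
    """Flag anything that smells like leftover secrets, debug text, or temp servers."""
    return [(s, p) for s in found for p in patterns if p.lower() in s.lower()]

# Toy binary blob with one embarrassing string buried in it.
blob = b"\x00\x7fELF\x00\x01user_password=hunter2\x00\x02Version 1.0.3 (stable)\x03"
hits = extract_strings(blob)
flags = flag_inappropriate(hits)
```

Point it at the real bundle's binaries and grow the pattern list from whatever your team has shipped by accident before.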

Risk Testing

All points where data is saved and changes are executed. Make a list and test cases for each and every area.

Upgrade & Installation Testing

Installs, updates and upgrades. Cover un-installations here.

Analytics Testing

Any analytics support and confirmation in test/prod.

Third Party Tools Testing

App Store, Sparkle, TestFlight, frameworks, ad libraries, etc.

External Obligations Testing

Are you contractually obligated to show a client logo in certain areas? Do you need to have license credits in the about page? Are you using the correct social media logos?

Debug Testing

Have you left any debug nonsense laying around? Do your logs get shuffled off to an email? Do you point to a temporary server? Is the console spewing things it should not? Have a developer help you audit these items.

First Run Experience Testing

Welcome screens, EULAs, data migrations, prefs, licensing, updates, system checks, how-to guides, use reporting, configuration, walkthroughs, anything and everything tied to the first run of a fresh install. The first run of an update.

Interoperability Testing

Interoperability between other apps or OS’s or services.

Modes and States Testing

Dirty environs, messed up settings, sleep mode, safe mode, restarts etc.

Friendly Neighbor Testing

Determine apps and tools that users may use with your application and make sure they play nice.

Network Testing

Configs, failures and events. Get busy with the Network Link Conditioner. Jam a proxy into the mix. Cover network on first launch and relaunch issues.

Project Legacy Testing

Review bug reports from previous versions and beta tests from early incarnations.

Accessibility Testing

Mouse-less, sight-less, sound-less, colorblind, enlarged text, inverted color, zoom view. Is everything a perceivable, operable, understandable, robust experience? Truly sightless VoiceOver testing, etc.

Stress & Performance Testing

Load, endurance, boundaries, interruptions, starvation etc. Establish numbers and then push them.
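Establishing numbers and then pushing them can start with a harness as small as this sketch; the operation, budget, and iteration count are placeholders for your app’s real actions and targets.

```python
import time

def endurance(op, budget_s, iterations):
    """Run `op` repeatedly, tracking worst and mean latency against a budget."""
    worst = total = 0.0
    for _ in range(iterations):
        start = time.perf_counter()
        op()
        elapsed = time.perf_counter() - start
        worst = max(worst, elapsed)
        total += elapsed
    return {
        "worst": worst,
        "mean": total / iterations,
        "within_budget": worst <= budget_s,
    }

# Placeholder op: stand in your app's save, open, or sync action here.
report = endurance(lambda: None, budget_s=0.5, iterations=100)
```

Once you have a baseline number, ratchet the budget down or the iterations up until something gives.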

Chaos Monkey Testing

Take your cue from the monkey and break some stuff. Look up the Chaos Monkey tool Netflix uses and see if you can replicate it somehow.
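A homemade monkey can be very small. This sketch fires randomly chosen actions at a toy list target (a stand-in for your app’s real surface) and reports the first uncaught exception as a replayable find:

```python
import random

def monkey_run(target, actions, iterations=1000, seed=0):
    """Fire randomly chosen actions at `target`.

    Returns (step, exception) for the first uncaught failure, or None if
    the target survives the run. Keep the seed so any find is replayable.
    """
    rng = random.Random(seed)
    for step in range(iterations):
        action = rng.choice(actions)
        try:
            action(target, rng)
        except Exception as exc:
            return (step, exc)
    return None

# Toy target: a plain list. Popping it while empty is the "crash"
# the monkey will likely stumble into sooner or later.
actions = [
    lambda t, r: t.append(r.randint(0, 9)),
    lambda t, r: t.pop(),
]
find = monkey_run([], actions, iterations=1000, seed=42)
```

Because the run is seeded, any find it turns up can be replayed exactly for the developer.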

Scenario Testing

Develop use cases and stories. Extract examples from the team, from the support queue and from potential customers. Your stakeholders from all points of entry. As a $USER I need to $ACTION so that I can $RESULT.

Internal Stakeholders Testing

Pick an area and ask an interested team member what they would like tested or if they have any particular concerns about it.

Support Advocacy Testing

Identify weak points in prior versions that have caused support load and tackle these here. Try to identify weak points that have carried over and any new ones. Beta & exploratory ad hoc testing are your friends here.

Mockups & Design Track Testing

Are we in line with published mock-ups? Has the design diverged? Create test cases that can flow with the constant change of direction. This is more back-end stuff that you should do with the aid of design & dev, not general UI/UX testing.

Competitor Testing

Identify existing competitors. Run actions in their software that we can accomplish in ours. Compare and contrast and report findings. Review their release notes and support FAQs for test ideas.

Core Values Testing

Refer to project mission statement. Refer to software values ethos of product owner. Take specific statements and create test cases against them.

Regression Testing

Performance vs last public release. Versus last beta, alpha, build, update, etc.

Personal Testing

Identify yourself as a stakeholder, what needs to be tested by you and for you? What are areas you think no one is paying attention to? Document and share and test those items. You have a passion for software quality and are a champion for the product customer, right? What are you doing above and beyond to fight for the end user?
