This is one part standard test plan, one part my weird, weird brain. Context is everything. Take what you want and leave the rest. The test approaches are by no means exhaustive, but they serve as a good jumping-off point to get you going.
App Test Manifest
What is this app? 30K ft view. Make it pithy.
Document Usage Guide
This is where the project mission statement goes. If you don’t have one from outside, write your own. Provide an overview of the plan in terms of what the particular project covers. You may briefly mention items such as limitations in resources and budgets, scope of testing and how other activities such as reviews are related. This is just a summary, so keep things short.
Provide details of who has the responsibility of delivering different parts of the test plan. Name names. List specific parts of the testing.
Give a low down of who is doing what with the project outside the scope of testing. Name names. List titles and departments, maybe even contact data.
Your overarching testing paradigm if that wasn’t covered elsewhere.
Testing assumptions we will be making.
Staffing and training needs
Identify the people and skills needed to deliver the plan.
Expected items from working through this guide. This may include bug reports and updates to the guide itself. Identify what should be delivered as part of this plan, including pass/fail reports, performance tracking, customer-ready documents, etc. Everything that will be turned over to another party.
Are there any special requirements for these tests? You may want to consider things like special hardware, test data or restrictions on any system during testing.
Outline what functional tasks are required aside from the actual testing. You may want to consider equipment setup along with any administrative tasks.
Suspension criteria and resumption requirements
State in what circumstances to stop and restart the test. Define clearly.
Item pass/fail criteria
State the acceptable pass/fail criteria. These can be general criteria or set at the individual test case level.
Outline the overall test strategy for this plan, identifying the test process and rules that will be followed. You may also want to include any information about tools, software, or hardware that will be used.
List the items that will be tested. This could be a software product, system or website. This is the overarching thing we are testing.
Features to be tested
List the features that will be tested and are within the scope of this test plan. This is broad; don't worry about every tiny feature or sub-feature. Those will come later. Will they ever.
Features not to be tested
List the features / requirements that will not be tested and are outside of the scope of this test plan.
Project setup and hot links. External tools, test accounts, mock data, knowledge bases, etc. Know your roots.
Provide a realistic estimate of the time required. Think about sections, tasks, subsections, regression testing and publish intervals.
Risks and contingencies
Identify any known areas that are high risk and may cause delays or disruptions.
Approvals & Completions
Identify who can approve the completion and what that even means.
Where is each approach outside of this document covered? Unit testing. Smoke testing.
Who, how, where and when regarding this document. Dropbox, DVCS, alongside the project source?
Plans, Areas, Methods, Features and Checklists
Testing the app from 10K ft to 0.01 mm away. This is where we start to get our hands dirty.
Identify and automate the core functions that need to pass in order for a build to be ready for testing. Perhaps passing unit tests first. Perhaps just confirming the build can open and close. Perhaps any automated tests you have lying around. Figure this out and attach it to the build process.
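A build-readiness gate can be as small as a script that runs each check and refuses the build if any fail. A minimal sketch, assuming a hypothetical app binary and self-test flag (the names here are placeholders, not your real build products):

```python
import subprocess


def smoke_check(checks):
    """Run each named readiness check; a build is testable only if all pass."""
    results = {}
    for name, fn in checks.items():
        try:
            results[name] = bool(fn())
        except Exception:
            results[name] = False  # a crashing check counts as a failure
    return results


def app_opens_and_closes():
    # Hypothetical: launch the build headlessly and confirm a clean exit.
    # "./MyApp" and "--self-test" stand in for your real build product.
    return subprocess.run(["./MyApp", "--self-test"], timeout=30).returncode == 0


# Attach to the build: fail the build if any value in the dict is False, e.g.
# smoke_check({"unit_tests": run_unit_tests, "opens": app_opens_and_closes})
```

However you wire it in, the point is that the gate runs on every build, not when someone remembers.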
Describe any unit testing in the project and how it relates, for the user of this document.
Give a census of all automated tests and point to instructions for maintenance or additions.
Basic CRUD Testing
Create, Read, Update, Delete. Perhaps this is also automated and attached to the build process. Write a script that will do it all in one swoop and also check each action.
Maximum and minimum supported hardware. Document and test against it. CPU, RAM, et al. Make sure to do a check on virtual machines (faked hardware) and graphics drivers/cards.
Input Method Testing
Keyboard, touchscreen, trackpad, voice, pen, foreign language devices, assistive devices, bluetooth enabled devices, other supported peripherals.
Wedging for fun and for profit.
Map each application-specific feature hierarchically and verify each performs as expected. Go as deep as you are able. Functionality.
Layouts. Sections. Buttons. etc. Define views and pages and their respective parts. Cover animations & transitions here. Get flows and storyboards so you have known outcomes.
Resolution, screen size, pixel density, window size, responsiveness, orientations, external screens.
Searching, finding, highlighting and all things related to search within your application.
User configurable application preferences. In application and behind the scenes.
Communication & Dialogs Testing
Interactivity, clarity, spelling, context, behavior. Feedback.
Menus, Key Equivalent Testing
Checking each and every menu item, contextual menu item, each keyboard shortcut and all variations and states.
Test each and every internal setting and the features they touch.
How does it handle another locale or language setting?
Is the app localized?
Review marketing materials and ensure that each thing is true. Review release notes and ensure each new entry is accurate.
Is the product feasible on the marketplace? Are you solving a problem that exists? Did you begin the product starting at the customer and not the technology behind the app?
Exploratory black box testing from non-stakeholders. Outlay your plans here.
Help guide, about page, getting started – any piece of written stuff that is available in-application.
Buffer overruns? SQL injection? Gatekeeper, signing, sandboxing, injecting, licensing, etc.
Inspect the application bundle in the Finder as deep as you are able. Do the same in Xcode and with any known inspection tools you have. Run strings over it. Try to fiddle with the innards and see if there is anything inappropriate.
All points where data is saved and changes are executed. Make a list and test cases for each and every area.
Upgrade & Installation Testing
Installs, updates and upgrades. Cover un-installations here.
Any analytics support and confirmation in test/prod.
Third Party Tools Testing
App store, Sparkle, TestFlight, frameworks, ad libraries, etc.
External Obligations Testing
Are you contractually obligated to show a client logo in certain areas? Do you need to have license credits in the about page? Are you using the correct social media logos?
Have you left any debug nonsense lying around? Do your logs get shuffled off to an email? Do you point to a temporary server? Is the console spewing things it should not? Have a developer help you audit these items.
First Run Experience Testing
Welcome screens, EULAs, data migrations, prefs, licensing, updates, system checks, how-to guides, usage reporting, configuration, walkthroughs, anything and everything tied to the first run of a fresh install. And the first run of an update.
Interoperability with other apps, OSes or services.
Modes and States Testing
Dirty environs, messed-up settings, sleep mode, safe mode, restarts, etc.
Friendly Neighbor Testing
Determine apps and tools that users may use with your application and make sure they play nice.
Configs, failures and events. Get busy with the Network Link Conditioner. Jam a proxy into the mix. Cover network issues on first launch and on relaunch.
Project Legacy Testing
Review bug reports from previous versions and beta tests from early incarnations.
Mouse-less, sight-less, sound-less, colorblind, enlarged text, inverted colors, zoom view. Is the experience perceivable, operable, understandable and robust? Truly sightless VoiceOver testing, etc.
Stress & Performance Testing
Load, endurance, boundaries, interruptions, starvation, etc. Establish numbers and then push them.
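Establishing and then pushing the numbers can be scripted: run the same action at growing sizes against a time budget and record where it breaks. A sketch; the sizes, the budget and the action are whatever your established numbers say they should be:

```python
import time


def find_breaking_point(action, sizes, budget_seconds):
    """Run action(n) at increasing sizes; return the first size that blows
    the time budget, or None if every size stays inside it."""
    for n in sorted(sizes):
        start = time.perf_counter()
        action(n)
        elapsed = time.perf_counter() - start
        if elapsed > budget_seconds:
            return n
    return None
```

The size that comes back is your current boundary; file it, then see if the next build moves it.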
Chaos Monkey Testing
Take your cue from the monkey and break some stuff. Look up Netflix's Chaos Monkey and see if you can replicate it somehow.
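You don't need Netflix's infrastructure to get the flavor of it. A tiny sketch: a monkey that fires random actions at the app and records whatever blows up. The actions here are stand-ins for real taps, resizes, network kills and so on:

```python
import random


def monkey(actions, steps, seed=None):
    """Fire `steps` randomly chosen actions and log any that raise.
    Each action is a zero-argument callable; pass a seed for repeatable chaos."""
    rng = random.Random(seed)
    failures = []
    for step in range(steps):
        action = rng.choice(actions)
        try:
            action()
        except Exception as exc:
            failures.append((step, action.__name__, repr(exc)))
    return failures
```

Seeding the randomness matters: when the monkey finds a crash, you want to replay the exact same sequence of abuse.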
Develop use cases and stories. Extract examples from the team, from the support queue and our potential customers. Your stakeholders from all points of entry. As a $USER I need to $ACTION so that I can $RESULT.
Internal Stakeholders Testing
Pick a team member or area and ask an interested team member what they would like tested or if they have any particular concerns about their areas.
Support Advocacy Testing
Identify weak points in prior versions that have caused support load and tackle these here. Try to identify weak points that have carried over and any new ones. Beta & exploratory ad hoc testing are your friends here.
Mockups & Design Track Testing
Are we in line with published mock-ups? Has the design diverged? Create test cases that can flow with the constant change of direction. This is more back-end stuff that you should do with the aid of design & dev, not general UI/UX testing.
Identify existing competitors. Run actions in their software that we can accomplish in ours. Compare and contrast and report findings. Review their release notes and support FAQs for test ideas.
Core Values Testing
Refer to project mission statement. Refer to software values ethos of product owner. Take specific statements and create test cases against them.
Performance vs last public release. Versus last beta, alpha, build, update, etc.
Identify yourself as a stakeholder, what needs to be tested by you and for you? What are areas you think no one is paying attention to? Document and share and test those items. You have a passion for software quality and are a champion for the product customer, right? What are you doing above and beyond to fight for the end user?