When I teach people how to do exploratory testing, a common point of confusion is what to put in your notes. While I often tell people it depends on you, what you're testing, and the company you're working for, they still want some concrete advice. So I often show some examples from past projects and provide the following template:
- Mission: list out what you're testing with this charter
- Environment: list out meta information related to your test environment (versions, configuration, location, etc...)
- Risk: as you test, list out what risks you're looking for while testing
- Coverage: as you test, list out what areas, features, users, data, or other meaningful dimensions you're covering while testing -- it's worth noting that I also instruct them to list out what they didn't have time to cover...
- Techniques: as you test, list out what you're doing... what techniques you're using, how you develop tests, etc... (in math class, this would be the "show your work" section of the document)
- Status: as you test, list out questions that occur to you that you need to get answered later, possible issues/problems/bugs you find while testing, notes about automation or future test sessions, etc....
- Obstacles: as you test, list out things that get in your way or ideas you have for things that would make your testing more effective -- this can be tools, hardware, information, training, etc...
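Put together, a filled-in session sheet might look something like this -- every detail below is invented purely for illustration:

```
Mission: Verify the password-reset flow in the new account service
Environment: build 2.4.1, staging server, Chrome 120, test account "qa_user"
Risk: reset links that never expire; reset emails sent to the wrong address
Coverage: happy path, expired link, malformed email address; did NOT cover mobile layout
Techniques: boundary testing on the email field; forcing error states by killing the network mid-request
Status: Q: should a successful reset invalidate active sessions? Possible bug: no rate limit on reset requests
Obstacles: no access to mail server logs; a mailbox-capture tool would speed this up
```

Your own sheets will be longer and messier than this, and that's fine -- the headings are just there to prompt you while you test.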
For those readers who do session-based testing on a regular basis, you'll notice I don't capture some of the classic items like setup time and time spent investigating issues. If you need to capture those metrics (or other metrics your team uses), simply add them in. Over time your session notes will morph to become your own, and you'll develop a format that works for you.
I'd be interested to see what other people capture.