Evaluating methods

PSYC 11: Laboratory in Psychological Science

Jeremy R. Manning
Dartmouth College
Spring 2026

What makes a "good" methods section?

  • It produced the intended outcome (the reproduced drawings matched)
  • People following the instructions understood what to do (few assumptions needed)
  • The procedure wasn't overly complex or time-consuming

Which of these three criteria matters most? Can they conflict? For example, could instructions be crystal clear but produce the wrong outcome? Could vague instructions still produce a good result?

How did your group's instructions hold up?

  • Did the reproduced drawings match the originals? Where specifically did they diverge?
  • What assumptions did people make when following your instructions? Were any assumptions correct despite not being stated?
  • If someone followed your instructions "perfectly" but the drawing still looked wrong, what does that tell you about the instructions vs. about the person?
  • What would you change if you could rewrite them knowing what you know now?

What data do we have?

  • Original drawings and instructions
  • Reproduced drawings (images)
  • Lists of assumptions made about each set of instructions
  • Ratings of how closely instructions were followed
  • Evaluations of instruction quality (appearance, meaning, clarity, efficiency)
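Before opening the notebook, it can help to picture how these measures fit together: each reproduction attempt pairs one set of instructions with one follower, so everything above can live in a single table with one row per attempt. A minimal sketch (the column names and numbers here are made up for illustration, not the actual class variables):

```python
from collections import defaultdict

# Hypothetical layout: one row per (instruction set, reproducer) pair.
# Column names are illustrative, not the real ones from the class dataset.
records = [
    {"group": "A", "reproducer": 1, "n_assumptions": 3,
     "followed_rating": 4, "clarity": 5, "match_rating": 4},
    {"group": "A", "reproducer": 2, "n_assumptions": 5,
     "followed_rating": 3, "clarity": 5, "match_rating": 3},
    {"group": "B", "reproducer": 1, "n_assumptions": 1,
     "followed_rating": 5, "clarity": 2, "match_rating": 5},
]

# First question from the criteria above: did the drawings match?
# Average the match ratings within each instruction-writing group.
by_group = defaultdict(list)
for r in records:
    by_group[r["group"]].append(r["match_rating"])
means = {g: sum(v) / len(v) for g, v in by_group.items()}
print(means)  # {'A': 3.5, 'B': 5.0}
```

Once the data are in this one-row-per-attempt shape, every question on these slides becomes a comparison across rows or groups.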

What do the assumptions data tell us?

  • More assumptions could mean more ambiguity
  • But more complex instructions might also require more assumptions while conveying more information
  • If the end product is still correct, those ambiguities might not matter much
  • The type of assumption may matter more than the number
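One way to probe the first two bullets is to check whether assumption counts actually track outcomes: if more assumptions meant more harmful ambiguity, counts should correlate negatively with match ratings. A sketch with made-up numbers (the real data and analyses are in the Colab notebook):

```python
import math

# Made-up values: assumption counts and 1-5 match ratings for eight
# hypothetical instruction sets (not real class data).
n_assumptions = [1, 2, 2, 3, 4, 5, 6, 8]
match_rating = [5, 5, 4, 4, 3, 3, 2, 2]

def pearson(x, y):
    """Plain Pearson correlation, computed without external libraries."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(n_assumptions, match_rating)
print(round(r, 2))  # strongly negative in this toy example
```

Even a strong correlation like this wouldn't settle the question, though: as the last bullet notes, a few load-bearing assumptions may hurt more than many harmless ones, and a count can't distinguish them.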

Evaluating your own vs. others' work

  • When you rated or evaluated, which criteria did you prioritize? Were they the same criteria the writer would have chosen?
  • Would you rate your own group's instructions differently than outsiders did? In which direction, and why?
  • Peer reviewers in science face the same challenge: they evaluate work without having done it. What biases might creep in? How does "insider knowledge" about your own instructions change your evaluation?

Framing for the lab report

  • Your lab report should connect these data to a broader question about communication and methods
  • Think about: What did this exercise teach you about writing for reproducibility?
  • Use data and examples to support your conclusions

Sample analyses

Open analysis notebook in Colab

Questions? Want to chat more?

📧 Email me
💬 Join our Slack
💁 Come to office hours

Next steps

  • Write your lab report using the data and analyses from this week; due Monday at 11:59pm!
  • Check Canvas for the lab report rubric
  • Next week: data sleuthing lab! Important: no class on Wednesday; we'll do part 2 of the lab during our Thursday X-hour instead.