Welcome to the Psychologist's Toolbox!

Hi there! Welcome to one of the most important parts of your Psychology course. Think of Research Methods as a "toolbox." Just like a builder needs different tools to build a house, a psychologist needs different methods to study human behavior. Whether we are watching people in a park or testing a new memory trick in a lab, we need to know exactly how to do it right.

Don't worry if some of this feels a bit "science-heavy" at first; we'll break it down piece by piece. By the end of these notes, you'll be able to spot the difference between a field experiment and a naturalistic observation in your sleep!

1. The Big Four: Core Research Methods

A. The Experiment

An experiment is the only method that lets us say one thing caused another. We do this by changing one thing (the Independent Variable) and seeing what happens to another thing (the Dependent Variable).

Types of Experiments:
1. Laboratory Experiment: Done in a controlled environment. Example: Testing memory by asking students to learn words in a quiet room.
2. Field Experiment: Done in a real-world setting. Participants often don't know they are in a study. Example: Dropping a wallet in a busy street to see if people help.
3. Quasi-Experiment: The researcher doesn't manipulate the main variable because it already exists (like age, gender, or a personality trait). Example: Comparing the memory of 10-year-olds vs. 50-year-olds.

Quick Review: The Variable Trick

Independent Variable (IV): The thing I change.
Dependent Variable (DV): The Data we collect.

B. Observation

Sometimes we just want to watch! But in Psychology, we have to be very specific about how we watch.

Ways to Observe:
- Naturalistic: Watching behavior where it normally happens (like a playground).
- Controlled: Setting up a specific situation (like a playroom with specific toys).
- Participant: The researcher joins the group they are watching (like an undercover boss).
- Non-participant: The researcher stays outside the group.
- Overt: Participants know they are being watched.
- Covert: Participants are being watched "undercover" (they don't know).
- Structured: Using a checklist of behaviors to tick off.
- Unstructured: Writing down everything interesting that happens.

C. Self-Report

This is simply asking people about themselves.
- Questionnaires: A set of written questions. Great for getting lots of data quickly!
- Interviews:
    - Structured: Every person gets the exact same questions in the same order.
    - Unstructured: More like a conversation; the researcher can follow up on interesting answers.
    - Semi-structured: A mix of both: some set questions, but room to chat.

D. Correlation

A correlation looks at the relationship between two things. It does not prove that one caused the other!
- Positive Correlation: As one thing goes up, the other goes up. Example: The more you revise, the higher your test score.
- Negative Correlation: As one thing goes up, the other goes down. Example: The more time you spend gaming, the less time you spend sleeping.
- No Correlation: There is no link at all. Example: Your shoe size and your IQ.
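To see how a correlation is actually measured, here is a minimal sketch computing Pearson's correlation coefficient r, which runs from +1 (perfect positive) through 0 (no link) to -1 (perfect negative). The data values are invented purely for illustration:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
    # Covariance numerator: how the two variables move together
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    # Spread of each variable on its own
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Invented data: hours of revision vs. test score
revision = [1, 2, 3, 4, 5]
score = [50, 55, 62, 68, 75]
print(pearson_r(revision, score))  # close to +1: a positive correlation
```

Remember: a value of r near +1 still only shows a link, never a cause. Revision and scores might both be driven by a third variable (like motivation).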

Key Takeaway

Experiments find causes; Observations describe behavior; Self-reports get thoughts and feelings; Correlations find links between variables.

2. Planning Your Research

Aims and Hypotheses

Before a psychologist starts, they need a plan.
- Research Aim: A general statement of what the study is about. "I want to see if coffee helps people study."
- Hypothesis: A clear, testable prediction.
    - Null Hypothesis: Predicts no difference. "Coffee will not affect study scores."
    - Alternative Hypothesis: Predicts there will be a difference. It comes in two flavours:
        - One-tailed (Directional): Predicts which way it will go. "Coffee will increase scores."
        - Two-tailed (Non-directional): Predicts a change but doesn't say which way. "Coffee will change scores."

Who are we studying? (Sampling)

We can't study everyone in the world (the Target Population), so we pick a Sample.
- Opportunity Sampling: Using whoever is available at the time. (Easy but biased!)
- Random Sampling: Everyone has an equal chance of being picked (like pulling names from a hat).
- Snowball Sampling: You find one person, and they find their friends, and so on.
- Self-selected (Volunteer): People sign up themselves after seeing an ad.
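The difference between opportunity and random sampling is easy to see in a few lines of code. Here is a quick sketch (the population of 100 students is invented):

```python
import random

# Target population: everyone we could study
target_population = [f"student_{i}" for i in range(1, 101)]

# Opportunity sampling: take whoever is available first.
# Easy, but biased: it over-represents whoever happens to be nearby.
opportunity_sample = target_population[:10]

# Random sampling: every student has an equal chance of being picked,
# like pulling 10 names out of a hat.
random_sample = random.sample(target_population, 10)

print(opportunity_sample)
print(random_sample)
```

Run it twice and the random sample changes each time, while the opportunity sample is always the same ten students who happened to be at the front of the list.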

Experimental Designs

How do we organize our participants?
1. Independent Measures: Different people in each group. (Group A drinks coffee, Group B drinks water).
2. Repeated Measures: The same people do both tasks. (Everyone drinks water on Monday and coffee on Tuesday).
3. Matched Participants: We find two people who are very similar (like twins, or two people with the same IQ) and put one in each group.

Memory Aid: The "Independent" Trick

Think of Independent as "Individuals." Every person is a different individual in a different group!

3. Designing Tools (Fine-Tuning)

Designing Observations

To make observations fair, we use:
- Behavioural Categories: Breaking behavior into specific "tally" boxes (e.g., "hitting," "shouting," "pushing").
- Time Sampling: Recording behavior at fixed intervals (e.g., every 30 seconds).
- Event Sampling: Recording every time a specific behavior (like a sneeze) happens.
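A structured observation checklist is really just a tally chart. Here is a minimal sketch of event sampling with behavioural categories (the observed playground events are invented):

```python
from collections import Counter

# Behavioural categories agreed on BEFORE the observation starts
categories = {"hitting", "shouting", "pushing"}

# Event sampling: a record is made every time a behavior occurs
observed = ["shouting", "hitting", "shouting", "pushing", "shouting", "laughing"]

# Tally only the behaviors on our checklist; anything else is ignored
tally = Counter(event for event in observed if event in categories)
print(tally)
```

Notice that "laughing" is dropped: it wasn't one of our agreed categories. That is the trade-off of structured observation, since it is easy to count but it can miss interesting behavior that wasn't on the checklist.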

Designing Self-Reports

When writing questions, we choose between:
- Open Questions: Participants use their own words. "How do you feel today?"
- Closed Questions: Fixed answers like Yes/No.
- Rating Scales:
    - Likert Scale: "Strongly Agree" to "Strongly Disagree."
    - Semantic Differential: Choosing a point between two opposites (e.g., Happy [---] Sad).
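Rating scales are useful because the answers can be turned into numbers. A quick sketch of scoring a hypothetical 5-point Likert item (the responses are invented):

```python
# Map each Likert answer to a number from 1 to 5
likert = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly Agree": 5,
}

# Four participants' answers to "I enjoy studying Psychology"
responses = ["Agree", "Strongly Agree", "Neutral", "Agree"]
scores = [likert[r] for r in responses]

print(sum(scores) / len(scores))  # mean attitude score: 4.0
```

This is exactly why closed questions and rating scales give quantitative data that is easy to compare, while open questions give rich qualitative data that is harder to score.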

4. Common Pitfalls (What can go wrong?)

Psychology is messy! Watch out for these:
- Extraneous Variables: "Extra" things that might mess up your results (like a loud noise outside the lab).
- Demand Characteristics: When participants figure out the aim of the study and change their behavior to "help" or "hinder" the researcher.
- Social Desirability: When people lie on a questionnaire to make themselves look better.
- Researcher Bias: When the researcher's own expectations accidentally influence the results.

Did you know?

The "Hawthorne Effect" is a famous case of participants changing their behavior simply because they know they are being observed, closely related to demand characteristics. Workers at the Hawthorne factory worked harder just because they were being watched, not because of the changes the researchers made!

Summary: The Quick Checklist

1. Pick a Method: Experiment, Observation, Self-report, or Correlation.
2. Write a Hypothesis: Null or Alternative (Directional or Non-directional).
3. Choose a Sample: Random, Opportunity, Snowball, or Volunteer.
4. Pick a Design: Independent, Repeated, or Matched.
5. Control Variables: Keep it fair by removing extraneous variables!


Don't worry if this seems like a lot of terms! The best way to learn these is to practice identifying them in the "Core Studies" you will read later. You've got this!