Evidence in Practice: Bridging Research and Real-Life Therapy
May 02, 2025
By Lizzy Dawson - The Paediatric Practitioner
If you’re a clinician, educator, or therapist, you’ve likely heard the phrase “evidence-based practice” more times than you can count. And while we all want to be evidence-informed, the reality is that integrating research into day-to-day practice can feel… well, clunky. Or confusing. Or just not relevant to what’s actually happening with the children and families we support.
Luckily, bridging that gap is pretty much what I’ve done with my courses and with my online business and mentoring at The Paediatric Practitioner. In a recent podcast episode with Em, we unpacked exactly how I think about using research in real life. We looked at how to spot high-quality studies, how to apply them practically, and what to do when research doesn’t quite match what you’re seeing on the floor.
Why Research Still Matters (Even When It Feels Out of Reach)
We know that research is meant to guide best practice, but sometimes the gap between “the study” and “the session” feels huge. That doesn’t mean we throw the research out. It means we use it as a compass, not a rulebook.
Here’s how I stay grounded in evidence without losing sight of real-world needs:
- Stay updated: I scan databases like PubMed, subscribe to clinical journals, and attend webinars when I can. (Even a 10-minute scroll while eating lunch can spark a new idea!)
- Critically appraise what I read: Not all studies are created equal. If it’s a treatment question, I’m looking for randomised controlled trials. If it’s a new trend or concept, I want to see a systematic review or meta-analysis, not just one person’s opinion.
- Consider context: Research gives us guidance, but our clients give us the context. What works in one study population might not be right for the child in front of me, and that’s okay.
What to Look For in Good Research
Let’s be honest: the internet is full of “studies.” But what actually makes one worth taking the time to read and understand?
Here’s my go-to filter:
- Peer-reviewed journal? If it hasn’t been through peer review, I take it with a grain of salt.
- Study design matches the question? RCTs for treatment efficacy. Observational studies for exploring patterns. Systematic reviews when I want the big picture.
- Sample size? It really depends on what you’re doing. I always thought bigger was better, and in some cases it is, but through my research with The University of Queensland we learned that gathering richer data from a smaller sample can sometimes be better.
- Transparent methodology? I want to know exactly how they did the study, who was included, and what they measured.
- Conflict of interest? Was it funded by a company that benefits from the outcome? That’s not a dealbreaker, but it matters.
Quick example: I once found a study on a great-sounding sensory intervention, but it had a tiny sample and was written by someone selling a product. That doesn’t mean it’s useless, but it definitely shapes how I interpret it.
Translating Research Into Real-Life Practice
Knowing what’s “evidence-based” is great, but how do we actually use that knowledge without overwhelming our workflow (or our clients)?
Here’s how I make it manageable:
- Start small. I pick one strategy from the research and trial it with 1–2 clients. I see what happens. No pressure to make it perfect.
- Reflect and tweak. I track what’s working (and what’s not), then adjust. I might use informal notes, observation, or specific outcome measures if I’m doing a deeper dive.
- Bring your team along. If I’m introducing a new protocol or tool, I walk the team through the research and the “why.” That shared understanding helps keep things consistent and avoid confusion.
- Educate the families. Parents don’t want a research paper; they want to understand how something will help their child. So I explain the strategy in real terms: “We’re trialling this approach because research shows it helps with regulation and attention, and I think it could really support what you’re seeing at home.”
When Research Doesn’t Match What You’re Seeing
This is such a common reality, and something we dove into in the podcast.
Sometimes, the research says one thing, but your clinical experience says another. So what do you do?
Here’s my approach:
- Look deeper into the study. Was the population different from yours? Were there limitations in the method?
- Cross-check with other research. Is this study an outlier? Or part of a bigger trend?
- Talk to your peers. Chances are, someone else has read the same study and had similar thoughts. Their experience might help shape your next step.
- Come back to the child. Always. Is this strategy right for this child, right now? If it doesn’t fit, don’t force it; adapt it.
💬 Final Thoughts: Be Curious, Not Rigid
At the end of the day, research isn’t meant to replace your experience — it’s meant to support it.
The best clinicians I know are the ones who are:
✔️ Curious
✔️ Open-minded
✔️ Confident in their clinical judgement
✔️ Willing to evolve
So keep reading. Keep questioning. Keep applying what fits and having honest conversations about what doesn’t.
And check out my courses, where I blend the research with my 15 years of experience working with kids as an exercise physiologist.
Remember, research isn’t just data; it’s a way to guide us toward better patient outcomes. Keep questioning, keep learning, and most importantly, keep doing what’s best for your patients.