“Worth More than Vibes” – Evidence-Based Practice in Leader Development

Aaron Pomerantz, PhD

Leadership development vibes

In today’s professional and educational landscapes, almost everyone claims to be “data driven” or committed to “evidence-based practice.” This makes sense, given how central science has become in our culture. Just as the ancient Greeks consulted the Oracle of Delphi, the modern world appeals to empiricism. However, just as the ancients often didn’t understand the seers and oracles to whom they appealed, many moderns face a similar lack of clarity around the meaning of evidence-based practice.

The fact is, evidence-based practice (EBP) is more than just a buzzword. It’s more than testimonials, engagement metrics, or retention rates. And it’s certainly more than following popular models just because they feel intuitive or seem to work.

Real, meaningful evidence-based practice is far more rigorous - but that also makes it more honest and useful.

Of course, EBP looks different across contexts. EBP in medicine will look vastly different from EBP in coaching or academic research. That’s especially true for leadership: evidence-based leader development can look wildly different from one context to another.

If we want our leader development practice to be truly evidence-based, meaning replicable, explainable, and effective, then there are three core principles we can’t ignore.

Use Theory, Not Vibes

Claims about developing leaders - whether it be identity work, KSA-oriented trainings, or more systemic concerns like communication structures and conflict management strategies - are all inherently empirical. They are assertions that, after a developmental process, meaningful, observable change will have occurred.

That means developmental goals, expectations, and practice must be grounded in theory. “Standing on the shoulders of giants” isn’t an abstract scientific truism - it’s the fundamental, non-negotiable core of scientific inquiry.

It’s why every good research paper starts with a literature review, and why graduate students must familiarize themselves with their discipline’s history and practice before they become independent researchers.

Now, that doesn’t mean practitioners need PhDs to meaningfully engage in leader development. However, it does require a certain degree of literacy in the area of practice. Appeals to “common sense” or to things “just making sense” are the beginning of bad science - and that makes them antithetical to evidence-based practice, which must be contextualized within the current state of the field just as a scholar’s research question must be.

That might mean more reading. It might mean reaching out to experts (who, it should be said, should themselves do a better job of ensuring their insights are accessible and understandable by practitioners). It will almost certainly mean slowing down. However, the payoff will be more than worth it, because this kind of evidence-based practice doesn’t just contextualize or temper expectations - it sharpens them, making work more specific, more testable, and, at the end of the process, more impactful.

That’s why frameworks and theories like managerial coaching and transformational leadership have endured so well. It’s not a matter of charisma, luck, or magic, but of being grounded in decades of scholarship that allowed new paths to be forged in leadership science (Deng et al., 2023; Dhar, 2022).

Measure What Matters

Assessment and evaluation aren’t optional for evidence-based practice - they’re essential to it. Where theory provides the foundation, measurement is the evidence that allows us to track if our practice is actually doing anything. Without measurement, we’re left in the dark, hoping for change rather than knowing it’s happening. And just as hope is not a strategy, it is also not evidence.

It is also important that measurement be valid, reliable, and objective. The fact is that humans, as decision-makers and observers, are biased. We interpret everything through personal filters, which is why social cognition terms like “confirmation bias,” “groupthink,” and “cognitive dissonance” have become so prevalent in popular discourse.

Empiricism’s rules and practices were intentionally designed to overcome biases in human thinking - to allow us to be more sure that our observations reflect reality, not our own flawed assumptions.

However, not just any metrics will do. Our measures must map directly onto our theory of change - meaning they must capture the specific shifts we’re aiming to produce. If we claim to be developing leaders who are self-aware, humble, and purposeful, we must measure self-awareness, humility, and purpose - preferably in a variety of ways using different types of data! However, while doing so might seem like a challenge, it will also allow our conclusions to be that much more nuanced and impactful.

This isn’t easy - proper measurement takes time, effort, and expertise. But it is integral to evidence-based practice, because it is, itself, the “evidence” in that term. It is the price of admission to EBP. Yes, not everyone needs to become a full-time data scientist, but we must still commit to intentional, meaningful measurement - and have the courage to act on what it shows us.
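One piece of “intentional, meaningful measurement” can be made concrete: reliability. The sketch below - purely illustrative, not a prescribed tool or part of any framework named in this article - computes Cronbach’s alpha, a standard internal-consistency estimate, for a hypothetical multi-item self-awareness survey. The item scores and the scale itself are invented for demonstration.

```python
# Illustrative sketch: estimating the internal-consistency reliability
# (Cronbach's alpha) of a hypothetical self-awareness scale.
import statistics

def cronbach_alpha(item_scores):
    """item_scores: one list per survey item, each containing every
    respondent's answer to that item (all lists the same length)."""
    k = len(item_scores)
    # Variance of each item's scores across respondents
    item_variances = [statistics.variance(item) for item in item_scores]
    # Each respondent's total score across all items
    totals = [sum(resp) for resp in zip(*item_scores)]
    total_variance = statistics.variance(totals)
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

# Five respondents' answers (1-5 Likert) to three invented items
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")  # → 0.86
```

By convention, alpha values around 0.7 or above are treated as acceptable for research use - a quick, honest check that a scale is measuring something consistently before any conclusions are drawn from it.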

Become Truly Data-Driven

Just like backseat drivers aren’t really driving a car, occasionally glancing at data isn’t evidence-based practice. If the data aren’t in the driver’s seat, something else is, and it’s usually bias, inertia, or wishful thinking. That kind of driving has consequences.

At best, it means our work may be less effective than it could be, or that we won’t understand why we get the results we do, leaving us unable to replicate, scale, or build on them. Even success becomes little more than a happy accident when not grounded in theory and evaluation.

At worst, the consequences are much broader: wasted resources, junk science, and real harm.

Consider the Myers-Briggs Type Indicator (MBTI), a “personality test” still wildly popular in developmental settings (with prices starting at $50 a pop!) despite being based on outdated Jungian theory and showing poor reliability, weak validity, and no predictive power (Pittenger, 2005; Koerth, 2018).

Similarly, take the enduring belief in learning styles - the idea that people learn better when taught in their “preferred modality” (visual, auditory, kinesthetic, etc.). Decades of research show no evidence that matching teaching to style improves outcomes (Pashler et al., 2008). It might even harm learners by creating fixed, limiting beliefs about their abilities (Willingham, 2021). These ideas feel intuitive and may even be bolstered by flashy (albeit meaningless) data. However, when real evidence is in the driver’s seat, it doesn’t take pseudoscientific detours.

With leader development representing a $370 billion industry (Westfall, 2019), we owe it to our clients (and ourselves!) to be truly data-driven, even when it’s hard. That means not cherry-picking metrics to confirm what we want to believe. It means not confusing feedback with impact. And, perhaps most importantly, it means having the discipline (and humility) to admit when something isn’t working and the courage to do something about it.

But this isn’t all bad news; in fact, it’s incredibly empowering. When data are in the driver’s seat, we’re not just identifying what’s working - we’re also unlocking where to invest our time, where to scale our efforts, and where we can make the biggest difference. Evidence-based practice isn’t just a noble ideal - it’s a meaningful advantage. It might be hard work, but it’s hard work with real, tangible, and lasting rewards.

Going Beyond Buzzwords

Although plenty of people claim to use “evidence-based practice,” such claims often fall apart under scrutiny. But this shouldn’t be cause for cynicism; it should be a call to action.

If anything, it’s an opportunity: a chance to hold ourselves accountable, raise our standards, and become the kind of practitioners, researchers, and organizations who don’t just say we’re data-driven, but who anchor ourselves in theory, measure what matters, and let the data guide our decisions.

Evidence-based practice is more than a label, a marketing hook, or a shibboleth. It’s a commitment to rigor, to clarity, and to meaningful impact. And that kind of commitment is what produces leader development that endures.

Yes, it’s hard. But it’s worth it. Because developing tomorrow’s leaders is too important a task to trust to vibes.