Getting to the Root of Engineering Improvement with DORA Core
DORA has distilled years of research to create a new, stable framework to help your team. Capabilities predict performance, which in turn predicts outcomes.
Suppose you wake up with a sharp pain in your back. Oof. That’s a rough start to the day, but thanks to the wonders of medical science, help is available! You head straight to the doctor and say: “Doc, it hurts! What should I do?”
Your doctor reaches into a drawer, pulls out a huge ream of academic articles, and drops them onto your lap with a thump. “Here, read these. They’re full of science.” You eagerly dive into the material, dense with jargon and data, and find that it tells you to do… what, exactly? One study recommends heat; another, ice. One is from ten years ago, with lots of supporting evidence. Another is from just last week, too fresh to be replicated. There are numbers and charts and formulas. Soon, your back pain is no better, and it’s been joined by a throbbing headache. “C’mon, Doc,” you say, exasperated. “Tell me: What should I do?”
Too often, R&D leaders experience this same dilemma when trying to apply best practices to their teams: a ton of research, not a lot of actionable takeaways. The DORA team wants to change that. Since 2014, we’ve surveyed tens of thousands of technology professionals and published a number of reports with deep analysis of the data collected. It’s a powerful body of work, but when you’re trying to drive change in your organization, it can be hard to apply in context.
To help translate research into action, in 2023 we introduced DORA Core, a concise framework of capabilities, metrics, and outcomes. Similar to the artifacts published by public health organizations that equip practicing doctors with guidelines for clinical treatment, Core can help you apply findings from DORA research as you navigate a turbulent technological landscape.
Everything is always changing. Or is it?
Working within the technology space—and animated by a passion for continuous improvement—DORA naturally updates and iterates on its methodology. Technologists constantly refine their ways of working through an evolutionary (that is to say, random but reinforced) process. And so, each year that DORA has conducted our research with practitioners, we’ve added, refined, and removed questions from our data collection survey. Each year may introduce entirely new areas of inquiry, or it may bring minor tweaks to question sets we’ve asked before. Hand in hand with ongoing changes to the questions we pose, we also continuously refine our data analysis methods to keep pace with technology’s rapid rate of change.
And yet. Looking back across the history of our field reveals a surprising level of familiarity… or perhaps circularity: cloud computing harkens back to time-sharing on mainframes. A preference for compiled languages gives way to interpreted, then back to compiled. And we continually rediscover the fundamentals of communication within our teams: “Blameless culture” as practiced by SREs is consistent with Ron Westrum's findings about using failures as opportunities to improve the system, which in turn echoes the philosophies of the Toyota Production System and other human-centric work cultures. Beneath a chaotic surface, there are durable foundations.
The search for those foundations motivates the DORA Core project, which complements our ongoing investigation into what’s happening today. In our research, we observe the present, and even make some speculative attempts to predict the future. Our flagship report is called The Accelerate State of DevOps—as in, the current state. But, from its inception, DORA’s mission has been to deliver research that’s applicable in practical contexts: advice that real people can use in their real jobs. To succeed in that goal, our advice can’t change from one moment to the next, because improving the culture, processes, and even the tools of an organization frequently requires hard, multi-year efforts and the coordination of many people. For such a project, references need to remain recognizably consistent, year over year. They should reflect the durable foundations of our knowledge, not the rapidly oscillating surface.
The DORA Core model
DORA Core offers a simplified view of DORA’s findings throughout the project’s history. It’s a natural entry point for anyone just starting to explore the research, or it can serve as a reference point for organizations engaged in ongoing continuous improvement projects. At the highest level, it summarizes the predictive relationships that we have documented year over year: Capabilities predict Performance, which in turn predicts Outcomes.
If you want to achieve outcomes like profitability and employee well-being, research suggests that improving software delivery (as measured by DORA’s “four key metrics”) will likely help. How can you improve software delivery? The model shows the capabilities—ranging from technical practices like continuous delivery to human elements like promoting a “generative” culture of shared learning—that we have found are likely to improve your software delivery metrics. A team can use Core as part of a reflection on its current state, and as a tool for identifying growth opportunities.
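To make the performance layer of the model concrete, here is a minimal sketch of how a team might compute the four key metrics from its own delivery data. The Deployment record, its field names, and the summary function are hypothetical assumptions for illustration—not an official DORA schema or tool.

```python
# Hypothetical sketch: computing the four key software delivery metrics
# from a team's own deployment records. The Deployment structure and field
# names below are illustrative assumptions, not an official DORA schema.
from dataclasses import dataclass
from datetime import datetime
from statistics import median
from typing import Optional

@dataclass
class Deployment:
    committed_at: datetime            # when the change was committed
    deployed_at: datetime             # when the change reached production
    failed: bool                      # did this change cause a failure in production?
    restored_at: Optional[datetime] = None  # when service was restored, if it failed

def four_key_metrics(deployments: list[Deployment], window_days: int) -> dict:
    """Summarize delivery performance over an observation window of `window_days`."""
    lead_times = [d.deployed_at - d.committed_at for d in deployments]
    failures = [d for d in deployments if d.failed]
    restore_times = [d.restored_at - d.deployed_at for d in failures if d.restored_at]
    return {
        "deployments_per_day": len(deployments) / window_days,
        "median_lead_time_hours": median(t.total_seconds() / 3600 for t in lead_times),
        "change_failure_rate": len(failures) / len(deployments),
        "median_time_to_restore_hours": (
            median(t.total_seconds() / 3600 for t in restore_times) if restore_times else None
        ),
    }
```

A team could feed a summary like this from its CI/CD and incident data, then revisit the numbers after investing in a capability to see whether delivery performance has actually moved.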
A process of distillation
The scientific method is not a matter of asserting what is true and what is false; it’s a process of framing hypotheses, and continuously adding support for some, while reducing support for others. DORA has amassed a formidable corpus of information: 10+ years of data and analysis. While each yearly study is conducted with methodological integrity, that methodology changes each year. We employ evolving approaches to an evolving problem space.
This makes the task of simplification far from simple: there will be some subjectivity. Of course, subjectivity is a feature of all science: measurement can be performed with rigor, but the choice of what to measure, and how data are interpreted, are human, social activities. Furthermore, the value that we ascribe to research is a product of its utility: good science isn’t merely well-conducted; it’s also well-consumed by its audience. With an eye toward balance, we developed a simple rubric to determine which of DORA’s many research findings are included in the Core model. To warrant inclusion, an item must have:
Reproducibility: It must have been researched at least twice, with consistent findings
Applicability: It must have proven valuable in practitioner contexts
For the model’s first iteration, a number of capabilities, metrics, and outcomes made the cut, and their relationships are shown in the visual model at dora.dev/research. Additionally, each capability is linked to an article in DORA’s capability catalog. These articles describe what each capability means in practice, how to start implementing it, and how to measure its effectiveness.
Make use of DORA Core
DORA Core can help you apply the research findings in your own context. If you’re part of a team that makes or delivers an application or service, our research points to the value of continuous improvement. This means collaborating to experiment with new practices, new tools, and new ways of working, all informed by data and the experiences of others.
The conversations you have are the essential work: they’re how you discover your current state and find opportunities for change. DORA Core is a visual aid that you can gather around and debate: where are your bottlenecks? Which capabilities need improvement? (Like to doodle? Print it out and mark up the parts you want to focus on!) Beyond your team, Core can help engage your stakeholders:
Point your leadership to the outcomes at the right, then trace the predictive arrows backwards to show how software delivery drives progress on the organizational goals that they’re measured on.
Talk about how you can partner to incentivize teams to strengthen their capabilities and improve performance, and how well-being is an important means to that end.
Continued exploration
DORA Core is, intentionally, a conservative artifact: it evolves slowly, so that it can effectively motivate change. But it does evolve. Capabilities and connections may be added, as the research adds validation and practitioners put them into practice. They may be removed if they become less useful over time. It's all subject to change, based on the data, and on feedback from practitioners like you. So we want to hear from you: contribute to Core’s evolution, and share your thoughts in the ongoing discussion. And join your fellow travelers to share stories and research: an essential part of DORA’s research program is socializing findings with the DORA Community; we’d love to see you there.
About the author: Dave Stanke is a Developer Advocate with DORA and Google Cloud, specializing in DevOps, Site Reliability Engineering (SRE), and other flavors of technical relationship therapy.