| | PhD Dissertation | EdD Dissertation in Practice |
|---|---|---|
| Purpose | Extend theory, discover something new | Impact a complex problem of practice and self as leader |
| Questions | Research questions that are theoretical/academic within one's field, or questions other researchers have not considered | Significant, high-leverage questions focused on complex problems of practice; problems are user-centered and compelling |
| Literature | Comprehensive literature review: an in-depth review of the historical, contextual, or social foundation of the study | Review of scholarly and professional knowledge: a concise review blending professional, practical knowledge with scholarly knowledge to understand the problem, find solutions, and develop measures that will provide evidence of change (or not); scholarly knowledge is deciphered, debated, and used for solutions |
| Methods | Quantitative, qualitative, or mixed; researcher is an outsider | Practical measures and processes aimed at uncovering whether the change is working; may be quantitative, qualitative, or mixed |
| Analysis | Completed by the researcher, with some member checks | Completed by the scholarly practitioner (possibly with participant input) |
| Spread | Published in peer-reviewed journals, presented at conferences | Disseminated in various ways: communicated to stakeholders, published in professional and scholarly journals, presented at conferences |
| Career | Basis of an academic career; start of a publication record | Advances professional knowledge and self as a leader |
CPED advocates for the teaching of applied research methods in EdD programs to (a) guide students in their DiP work and (b) give them a toolbox for investigating and improving future problems. Applied methods that students can employ when working on their DiP include Action Research, Improvement Science, Evaluation, and Design-Based Research. While some CPED member institutions may have a preferred methodological approach built into their EdD program, others allow students to choose the approach that best aligns with their chosen PoP. Learn more about these approaches, including their similarities and differences, in the sections below.
When students begin to work on their DiP, one of their most important considerations is selecting their methodological approach. A methodological approach is distinct from the research methods that students employ in their DiP. A methodological approach guides the nature, scope, and objectives of an EdD student’s DiP, while research methods are the concrete research tools and tactics that students use to investigate their problem of practice. This section outlines the methodological approaches that are central to the DiP, which include Action Research, Improvement Science, Evaluation, and Design-Based Research.
Action research is a methodological approach that involves examining and trying to solve problems of practice directly within one's work environment. The process of action research consists of an iterative cycle: observing a problem that affects student learning outcomes, designing one or more potential solutions, directly implementing these solutions, evaluating their efficacy, and modifying them accordingly to achieve an even better result. This cycle is called the Plan-Do-Study-Act (PDSA) method, and it is defined by a "learning through doing" philosophy; in other words, PDSA involves not only researching the causes and effects of a PoP, but also studying the process and outcomes of addressing that PoP in order to become more informed about, and adept at, solving it long-term. Ideally, action research occurs continually over a long period of time, until an educator finds an optimal, adaptable, and sustainable solution to their PoP.
Students who use action research generally employ it on a small scale to tackle a problem that they notice in their direct work environment—such as a classroom—or firsthand in one of their spheres of influence, such as an extracurricular reading or math program, or within a teacher training module. For example, a teacher who observes that their students are struggling with reading comprehension might incorporate a new initiative or modify their lesson plans to try to address this issue. Action research can also include small-scale group projects: for example, a group of elementary school teachers working together to design and implement a new lesson plan that they will each incorporate into their classrooms before meeting to discuss the outcomes, or instructors at a STEM summer program ideating, refining, and testing more engaging ways for students to interact with mathematical or scientific concepts.
In summary, action research typically involves an individual educator or a small group of educators trying to fix a problem of practice in their work environment by first understanding the problem, then designing and testing potential interventions, and finally assessing the outcomes of those interventions to continue improving upon their solutions. It is a highly applied methodological approach that focuses on quickly addressing a pressing educational problem.
Improvement science, like action research, focuses on developing solutions to directly improve educational outcomes for students and other stakeholders in an academic or organizational setting. However, improvement science considers a broader and more systems-oriented scale. In other words, students who use improvement science as their research approach not only identify a problem within their work setting or spheres of influence but also take a step back and ask, “How is the system in which this problem occurs actively causing, contributing to, and/or helping to perpetuate this problem?”
While action research is typically localized to one site, whether that is a K-12 classroom or a department or office within a larger university, improvement science takes into consideration the roles that other institutional parties play in the situation under investigation. For example, an EdD student who is a university guidance counselor might select improvement science for their DiP because they are invested in researching why graduation rates for transfer students at their university are so low. To investigate this problem, the student would employ a systems-oriented lens, examining how different departments or offices—such as admissions, student counseling, tutoring services, financial aid, and student advising—as well as factors like course availability, all contribute to the problem.
From there, the student might develop an intervention that impacts one or more of these parties in an effort to have multiple pieces working in concert to address their PoP. To continue with the example above, the university counselor who is investigating low transfer student graduation rates might design an intervention that involves admissions and financial aid working together to support transfer students better, or a specific advising program that connects transfer students with tutors and advisors who can help them plan a program of study and choose the correct courses for on-time graduation. Like action research, improvement science utilizes the PDSA method, and therefore EdD students and education practitioners should continually iterate on their interventions within their work setting, while taking into consideration how the system in which their PoP occurs also impacts the problem over time.
While improvement science is defined by a broader view of an educational problem and the factors contributing to it, that does not mean that students using improvement science for their DiP must create a systems-wide intervention or solution to the problem they have decided to focus on. Rather, the systems-wide view of the problem informs students’ work on an on-the-ground solution that they can more feasibly implement during their time in their doctoral program, and ideally beyond. DiPs are, by definition, designed to serve as a springboard for educators to become scholarly practitioners who use applied research to enact continual improvements in their work setting.
Evaluation as a methodological approach places the focus primarily on data collection and a broad inquiry to determine the efficacy and/or feasibility of an education system, initiative, practice, or policy. Unlike action research and improvement science, both of which focus on iteratively developing, implementing, and studying the effects of education interventions that target specific PoPs, evaluation is an approach that foregrounds the assessment piece to determine what is and what is not working in an existing education program, before making recommendations to improve the program in question.
For example, a student whose DiP uses this approach might investigate an educational program that was designed to increase the educational attainment of select students but has not been performing as expected. The student investigating this issue would conduct thorough research on the program, its impacts, and its stakeholders' needs in order to develop recommendations for the program to improve and better serve its students. In this case, while the EdD student conducting this research for their DiP has not implemented an intervention, their research still focuses on practical applications to improve learning outcomes.
Design-based research is similar to evaluation in that it does not include an “on-the-ground” implementation phase or a PDSA cycle. Instead, design-based research focuses on using design principles to create plans for innovative and engaging educational experiences. A scholarly practitioner who is enrolled in an EdD program and uses design-based research for their DiP might ask themselves, “What design principles, instructional technologies, and educational approaches can be combined to develop a successful learning experience for my target student population?”
To answer this question, the practitioner can incorporate research methods such as interviews, user experience testing, participatory and speculative design sessions, and observational studies to gather data on their students’ learning needs and preferences. After gathering and analyzing sufficient data, the practitioner can design a prototype of their envisioned learning experience for students, which they can show to colleagues and faculty members at their EdD program for feedback on how to improve it.
Examples of learning experience prototypes that scholarly practitioners could design include an interactive module for teachers’ professional development, an educational game that students could play on their tablets as part of class, an online resource and discussion hub for students, or a comprehensive plan for teaching students how to responsibly integrate artificial intelligence (AI) technologies into their learning and assignment work.
While this article focuses on how the methodological approaches discussed above are distinct from each other, it is worth noting that these approaches are united in their aim to directly improve learning outcomes through a combination of scholarship and practice. Given this mission, which is central to CPED’s scholarly practitioner paradigm, these methodological approaches have numerous overlaps in the research methods they utilize. Action research, improvement science, evaluation, and design-based research all use intensive scholarship to gain insight into and increased empathy for the experiences of individuals and communities that are affected by pressing PoPs. Whether they gain this insight through interviews, surveys, statistical analyses of school performance data, or implementing a project and observing its results, EdD students who uphold CPED’s values focus on applied scholarship that addresses complex problems of practice in educational settings.
For more information on the DiP and research approaches, check out the FAQ developed by CPED Partner, OnlineEdDPrograms.com: What is an EdD Dissertation in Practice (DiP) and what does it entail?